    March 8, 2018, 9 a.m.
    Audience & Social

    Living in a sea of false signals: Are we being pushed from “trust, but verify” to “verify, then trust”?

    BuzzFeed’s fake-news reporter outlines some of the dangers ahead: “We have a human problem on our hands. Our cognitive abilities are in some ways overmatched by what we have created.”

    Editor’s note: Craig Silverman — the indefatigable BuzzFeed journalist who specializes in misinformation, disinformation, and all things fake news — recently gave testimony before a commission charged with examining “the causes and consequences of a collapse in trust in democratic institutions, with a focus on trust in the media, journalism and the information ecosystem.” Here are his remarks, republished with his permission.

    I have read some of the other testimony before the commission and was pleased to see people speak about media literacy, the alarming ease with which technology will enable us to create compelling audio and video fakes, the disruptions brought on by massive changes in communications technology, and the shift in trust from institutions to “people like me.” These are important topics.

    I’m here to talk about Native American content on Facebook.

    Yes, really.

    I believe this case study encompasses many of the challenging and urgent issues related to trust, misinformation, and digital communities that this commission is grappling with.

    There is a massive network of Native American pages, groups, and user accounts on Facebook that collectively have millions of fans, members, and friends. They publish articles about Native issues and share photos and videos of events such as the protests at Standing Rock. There are also pages that exclusively publish photos of attractive Native American women and beg fans to comment on their beauty. Some of the pages and related websites publish articles about unrelated health topics such as fibromyalgia. Their articles about actual Native American issues are often plagiarized from genuine Native publishers based in the United States.

    Some of these sites and pages — which, by the way, are some of the biggest and most active Native American pages on all of Facebook — also publish completely false stories or trade in classic clickbait articles. One group of pages recently shared a fabricated story about a police officer arresting Malia Obama and later being found dead. It was plagiarized from a fake news site.

    If you’re a person with an interest in Native American topics and issues, these pages and groups will present themselves as some of the best places to get that content on Facebook. They have signals of authority, such as a high number of fans and a name that seems legitimate. Often the groups or pages are administered by profiles that profess to be Native American. The pages often falsely list a US address or organization that they are affiliated with. In some cases, the people who run these groups use a checkmark emoji in the group name to make it seem as if the group has been verified by Facebook.

    So, who is running these groups and pages?

    Young men in places such as Kosovo and Vietnam.

    I first reported on this network when the Standing Rock protests sparked interest in Native American pages and groups, and when these page operators were doing a booming business selling t-shirts with designs stolen from real Native American artists. I covered it again early last year in a story I’ll discuss in a moment. Media Matters also reported on the network, and as a result many pages were taken down.

    I must also mention a woman in Indiana named Sarah Thompson who, when not raising her kids and working her farm, tracks and exposes these fake Native American pages.

    I spoke to her last week and she made an important point: there are real communities of Native Americans on Facebook. But for many people on the platform, Native American culture means glamour shots of women in headdresses, clickbait content about unrelated topics, and fake news about Malia Obama. This sea of fake Native American content is drowning out the real voices; it is taking up space, attention, and revenue that belong to actual Native Americans.

    This is happening because of a combination of factors that play out time and again in a variety of niches and topics, and not just on Facebook. These fake Native publishers are winning because they are better at playing the game. Authenticity and accuracy do not determine success. At times, they hamper it. Attention often flows to those who best know how to exploit the systems that capture it.

    For me, this example encompasses so much about the current reality of media and online misinformation. For one thing, it shows that online misinformation is about more than just information itself — it also involves economic and algorithmic incentives on a global scale. We also need to think about human factors, such as cognition and belief.

    It’s a complex problem, and I’m sorry to say I don’t have the answers to it for you today. I do, however, have a list of things I think about as I do the work of detecting and revealing digital deception and misinformation in their many forms. I offer them to you in case you want to fold them into your work.

    • Our human faculties for sense-making, and evaluating and validating information, are being challenged and in some ways destroyed in this new information ecosystem. We are all getting false signals. This affects our ability to construct and apply trust.
    • It also creates an opportunity for bad actors who understand how to exploit this ecosystem. Whether it’s overseas spammers focused on Native American content or a well-funded Russian effort to mine the divisions in American society, in both cases they are exploiting openness, free speech, and the new media reality that is, on the one hand, more democratized than ever before — while simultaneously dominated by powerful platforms and mediated by algorithms.
    • One thing we must keep in mind is the technology and systems that enabled this reality are optimized to capture and surveil the activities of users. They therefore have the potential to become the world’s greatest censorship infrastructure. So we must be careful that attempts to root out misinformation do not have the unintended consequence of cracking down on free speech.
    • Just as many of us are now used to seeing a label on our clothing that lists a country of origin somewhere far away, the same is happening to information. The flattening effect of platforms — whereby content and actors largely look the same in a timeline or News Feed — has enabled this on a massive scale. Think about how globalization has impacted manufacturing and other industries. Now let’s think about how this will apply to information.
    • The cues that people have used to determine the authenticity of information are in many cases no longer sufficient. Just about everything can be fabricated. We must keep this in mind as we look to establish new signals of trust, because they too will inevitably be fabricated and gamed.
    • A class of these new signals must therefore be unimpeachable and difficult to fabricate. This, in general, could mean metadata, chains of custody, and perhaps open, blockchain-like ledgers that track the flow of information. I’m not an expert in this area, but suffice it to say we need to create incentives for experts to work on this.
    • Technology is not the only answer. We have a human problem on our hands. Our cognitive abilities are in some ways overmatched by what we have created. We must develop our capacity to navigate this complex and confusing information ecosystem. For now, a new orientation is to assume fakeness, assume deception. An old mantra was “trust, but verify.” But now I think this default assumption of trust is in some ways under attack, or perhaps self-defeating. I also think we see it receding with the ongoing decline in trust in institutions. And so perhaps we are entering an era of “verify, then trust.”
    • This raises serious questions about how we evolve society and democracy in a world where the assumption of trust can be exploited, and where it is no longer the default. What does “verify, then trust” do to our social fabric? I hope you will marshal your resources and brainpower to consider these questions and, hopefully, point a way forward.

    And I also hope that when you come across Native American content on Facebook or elsewhere you will pause and consider whether it is what it appears to be. Verify, then trust.

    A final editor’s note: Let this tweet serve as an addendum to Craig’s remarks.


    Craig Silverman is a media editor for BuzzFeed News based in Toronto.
