    March 8, 2019, 9:34 a.m.
    Audience & Social

    If Facebook goes private, where will the misinformation go?

    Plus: YouTube adds fact-checks (in India), and Facebook moves to combat anti-vaxxing after receiving loads of public pressure.

    The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

    What would Facebook’s turn to privacy mean for misinformation? This week, Mark Zuckerberg outlined his “privacy-focused vision” for Facebook, writing, “I expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network…I believe we should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by who they want to see it and won’t all stick around forever.”

    If such a shift really happens, what sort of impact would it have on misinformation on the platform?

    “It would push some of Facebook’s biggest PR problems under a rug, such as fake news, hate speech, election interference, and harassment, which would become much harder to police — or to hold Facebook accountable for,” Will Oremus wrote. “And it would open new ones, creating ‘dark social’ networks that could be havens for criminal or even terrorist activity, while giving equal shelter to everyone from dissidents to hate groups.”

    Wired’s Nicholas Thompson asked Mark Zuckerberg about some of this:

    If Russian intelligence operatives had just used private encrypted messaging to manipulate Americans, would they have been caught? As Facebook knows from running WhatsApp, which is already end-to-end encrypted, policing abuses gets ever harder as messages get more hidden.

    In our interview, Zuckerberg explained that this, not fears about the business model, is what keeps him up at night. “There is just a clear trade-off here when you’re building a messaging system between end-to-end encryption, which provides world-class privacy and the strongest security measures on the one hand, but removes some of the signal that you have to detect really terrible things some people try to do, whether it’s child exploitation or terrorism or extorting people.”

    Techdirt found a bright side: it’s possible that “if Facebook were to move to more of a ‘protocols’ approach to messaging, rather than controlling everything, they might then be able to open up the system so that end users themselves could make use of third party apps or filters to help them decide if messages were legit or not, rather than leaving it entirely up to Facebook.” This seems naive: The people who proactively go out and, say, install a fake-news-identifying browser extension are not the problem. At any rate, Facebook is already relying on third-party fact-checkers to help it police content, so I don’t see how this opens up new opportunities for outside fact-checkers.

    Also worth watching: how political discussion changes when it moves from social media platforms to closed messaging platforms.

    Under pressure, Facebook will block anti-vax content. In a blog post, Facebook outlined how it will — after weeks of public pressure — curb misinformation related to vaccines.

    — We will reduce the ranking of groups and Pages that spread misinformation about vaccinations in News Feed and Search. These groups and Pages will not be included in recommendations or in predictions when you type into Search.

    — When we find ads that include misinformation about vaccinations, we will reject them. We also removed related targeting options, like “vaccine controversies.” For ad accounts that continue to violate our policies, we may take further action, such as disabling the ad account.

    — We won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages.

    — We are exploring ways to share educational information about vaccines when people come across misinformation on this topic.

    Also, YouTube will be showing users fact-checks (which it’s calling “information panels”) on topics that are “prone to misinformation,” BuzzFeed reported, though the feature is only available to some users in India right now and YouTube hasn’t said when it will expand it globally.

    “Newspaper clippings and television news screen grabs (real or fake) were extensively shared.” The general election that India will hold this year is being described as its first “WhatsApp election”: Since 2014, when the last general election was held, WhatsApp usage has skyrocketed in the world’s largest democratic country. As of 2017, the app had more than 200 million monthly active users in India, a figure that has certainly only grown since then (the company hasn’t released an updated number).

    Fake news shared on WhatsApp is largely visual. When the BBC did an in-depth analysis of a group of Indian WhatsApp users in 2018, it found that the majority of messages shared within their private networks could be categorized either as “scares and scams” or “national myths.” The most common way that information is shared, the researchers found, was via images — “visual information, sometimes layered with a minimum amount of text.”

    This week, The Hindustan Times published an analysis of the images shared in more than 2,000 public, politics-focused Indian WhatsApp groups during the 2018 state elections. Here’s the reporter:

    Doctored screenshots and news clippings are used to make the content seem more reputable:

    Seven of the ten most shared misleading images in the pro-BJP WhatsApp groups were media clippings. The most shared image was a screengrab of a primetime segment of Times Now, an English TV news channel, claiming that the Congress party manifesto in Telangana was Muslim-centric. Seven “Muslim only” schemes were included in the manifesto, the image claimed, including a scholarship for Muslim students and free electricity to Mosques. Except that the information was misleading. Alt News, a left-leaning fact-checking news website, later debunked how the news channel had misreported the story, by selectively picking parts of the manifesto to create a false narrative.

    This message repeatedly appeared in various forms — eight of the top ten misleading images in the BJP groups were only about the manifesto — including screen grabs from CNBC-Awaaz, another news channel, and standalone graphics.

    The example illustrates a key point: “fake news” as commonly understood has various shades. Unlike the morphed ABP news screenshots (second most shared) that propagated outright lies, the Telangana manifesto story is based on partially-true information that was later found to be misleading. The intent in the latter case is not clear and often difficult to establish.

    Why are there so many media clippings? One possible explanation for this phenomenon is that WhatsApp-ers leverage mainstream media artefacts to compensate for the declining credibility of WhatsApp content.

    Illustration from L.M. Glackens’ The Yellow Press (1910).
