    Oct. 13, 2017, 8:49 a.m.
    Audience & Social

    Even smart people are shockingly bad at analyzing sources online. This might be an actual solution.

    Plus: Facebook is just fixing a bug (right!), labeling fake news seems to work, and lawmakers will release the Russia ads.

    The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Many smart people are still very bad at evaluating sources. Stanford researchers observed “10 Ph.D. historians, 10 professional fact checkers, and 25 Stanford University undergraduates…as they evaluated live websites and searched for information on social and political issues.” What they found:

    Historians and students often fell victim to easily manipulated features of websites, such as official-looking logos and domain names. They read vertically, staying within a website to evaluate its reliability. In contrast, fact checkers read laterally, leaving a site after a quick scan and opening up new browser tabs in order to judge the credibility of the original site. Compared to the other groups, fact checkers arrived at more warranted conclusions in a fraction of the time.

In one exercise, for instance, participants were asked to compare articles from two sites: one from the American Academy of Pediatrics, and the other from the American College of Pediatricians. The two organizations sound similar, but are very different:

The Academy, established in 1932, is the largest professional organization of pediatricians in the world, with 64,000 members and a paid staff of 450. The Academy publishes Pediatrics, the field’s flagship journal, and offers continuing education on everything from Sudden Infant Death Syndrome to the importance of wearing bicycle helmets during adolescence.

    By comparison, the College is a splinter group that in 2002 broke from its parent organization over the issue of adoption by same-sex couples. It is estimated to have between 200-500 members, one full-time employee, and publishes no journal (Throckmorton, 2011). The group has come under withering criticism for its virulently anti-gay stance, its advocacy of “reparative therapy” (currently outlawed for minors in nine U.S. states), and incendiary posts (one advocates adding P for pedophile to the acronym LGBT, since pedophilia is “intrinsically woven into their agenda”) (American College of Pediatricians, 2015).

Despite all that, “students overwhelmingly judged the College’s site the more reliable” — as did a fair percentage of historians.

    “They seemed equally reliable to me. I enjoyed the interface of the [College website] better. But they seemed equally reliable. They’re both from academies or institutions that deal with this stuff every day,” one student said.

    Another said, “Nice how there’s not really any advertisements on this site. Makes it seem much more legitimate.”

    The whole paper is really fascinating, very readable — the best thing I’ve read so far on digital literacy. Don’t miss the section at the end that talks about where schools’ media literacy curriculums — with their easily gameable checklists — may be going wrong.

Most of the other digital literacy content I’ve read focuses either on funny things middle schoolers say or on the immensity of the problem. But the skills outlined in this paper — the authors call them “heuristics” — are very teachable: they’re not hard to explain, and they can be easily incorporated into curriculums.

Could Facebook help more people act like fact checkers? Greenup argues that a recently announced Facebook feature that adds context to shared links could actually be successful, because it helps more people do the type of lateral reading that the Stanford study outlines. Facebook is “on to something genuinely valuable that is actually pretty hard to game, and sufficiently open to user choice without telling people what they should be accepting as true and false,” Greenup writes.

    And now it is time to stop saying nice things about Facebook:

Facebook hides the data that had let one researcher look at the spread of disinformation. Strangely timed “bug,” or the obfuscation of information? Last week’s column led with the story that the Tow Center’s Jonathan Albright had found a way to determine the extent to which disinformation was shared by six Russian-controlled, election-related, now-defunct accounts — and the spread, Albright determined, was huge.

Albright had used the Facebook-owned tool CrowdTangle to analyze the posts. But now, The Washington Post’s Craig Timberg reports, Facebook has “scrubbed from the Internet nearly everything — thousands of Facebook posts and the related data — that had made [Albright’s] work possible. Never again would he or any other researcher be able to run the kind of analysis he had done just days earlier.” (Also in The Washington Post, a George Washington University associate professor urged caution in accepting Albright’s analysis, since CrowdTangle data can be very hard to analyze accurately; still, he noted to Timberg, “Any time you lose data, I don’t like it, especially when you lose data and you’re right in the middle of public scrutiny.”)

Facebook told the Post that it had merely “identified and fixed a bug in CrowdTangle that allowed users to see cached information from inactive Facebook pages.” What interesting timing. On Thursday, Facebook COO Sheryl Sandberg told Axios that “things happened on our platform that shouldn’t have happened” during the 2016 presidential campaign, that “we’ll do everything we can to defeat [Russia],” and that Facebook owes America “not just an apology, but determination” for its role in allowing Russian interference to take place.

Meanwhile, lawmakers plan to release the 3,000 Russian ads to the public “after a Nov. 1 hearing on the role of social media platforms in Russia’s interference in the election,” The New York Times reports. “That hearing, and a similar one that the Senate Intelligence Committee plans to hold with Facebook, Google and Twitter, will place Silicon Valley’s top companies under a harsh spotlight as the public perception of the giants shifts in Washington.”

Facebook says its fact-checking efforts are working. From a leaked email to Facebook’s fact-checking partners:

    We have been closely analyzing data over several weeks and have learned that once we receive a false rating from one of our fact checking partners, we are able to reduce future impressions on Facebook by 80 percent. While we are encouraged by the efficacy we’re seeing, we believe there is much more work to do. As a first priority, we are working to surface these hoaxes sooner. It commonly takes over 3 days, and we know most of the impressions typically happen in that initial time period. We also need to surface more of them, as we know we miss many.

Illustration from L.M. Glackens’ The Yellow Press (1910).

    PART OF A SERIES     Real News About Fake News