    Feb. 8, 2019, 7:29 a.m.
    Audience & Social

    A little knowledge is a dangerous thing — no, seriously, it is, according to this new research

    If you thrive on emotion, read this.

    The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

    People who’ve scanned Facebook for news gain a little knowledge. Why do some of them think they’ve gained a lot? Consider statements like “I feel that I need to experience strong emotions regularly” and “I feel like I need a good cry every now and then.” How much do these statements apply to you?

    If you have a high “need for affect,” you’re driven to seek out strong emotions (whether positive or negative). If you have a low need for affect, you try to avoid feeling strong emotions. And many of us fall somewhere in between.

It turns out that where you are on the “need for affect” scale also influences how right you think you are about news on Facebook, according to a new paper published this week:

We found that Facebook’s News Feed, with its short article previews, provides enough information for learning to occur. This in itself is an important and normatively positive finding: in a relatively new way of acquiring information, Facebook users are learning by merely scrolling through their News Feed. However, this learning comes with an additional consequence: audiences who only read article previews are overly confident in their knowledge, especially individuals who are motivated to experience strong emotions and, thus, tend to form strong opinions. These individuals demonstrating a high “need for affect” are significantly more likely to overestimate their knowledge when encountering snippets of information in Facebook’s News Feed.

    The authors, Nicolas Anspach, Jay Jennings, and Kevin Arceneaux, hypothesized that “those who are high in need for affect [would] form relatively strong opinions based on the limited information gleaned from the Facebook News Feed and, therefore, [would] be more likely to come away with an illusion of confidence in their knowledge relative to those who score lower on need for affect.” This hypothesis was borne out.

    The study included 990 U.S. adults on Mechanical Turk. Their cognitive styles — particularly their need for affect — were measured; then they were assigned to one of three groups: one read a Washington Post article summarizing a Pew survey on the safety of genetically modified food, another looked at a faux News Feed that contained an article preview of the Pew research (as well as other article previews), and a third control group saw neither. They were then asked six factual questions about GMOs.

    They found that subjects did glean a little knowledge about GMOs from just the News Feed preview, but “the amount of knowledge acquired is modest.” However, individuals with a high need for affect — people who thrive on emotion — were much more certain that they’d learned a lot:

    Upon gaining a little bit of knowledge from article previews, high-NFA subjects fail to reflect upon the fact that such previews convey an incomplete picture of the issue at hand. These individuals tend to evaluate their knowledge with their gut reactions, and display a higher level of overconfidence than their less emotive counterparts.

    It is worth noting that this was only true for the people in the “preview” group. “This NFA only influences overconfidence in a ‘middle ground’ information environment,” the authors write. “NFA effects do not exist when no information is provided, nor in an information-heavy context.”

It seems to be that little bit of learning that can be dangerous. (If you want to gauge your own need for affect, the researchers measured it with a standard need-for-affect questionnaire.)

Oh, and another study recently found that the most extreme opponents of GMOs know the least about the science but think they know the most. The researchers surveyed adults in the U.S., Germany, and France, and found that

    [in] all countries, self-assessed knowledge increased significantly with extremity and the gap between self-assessed and objective knowledge grew. Objective knowledge decreased significantly with extremity in the United States. However, in the European countries, although the direction of this effect was the same, it was not statistically significant.

    The authors, Philip Fernbach, Nicholas Light, Sydney Scott, Yoel Inbar, and Paul Rozin, write:

A traditional view in the public understanding of science literature is that public attitudes that run counter to the scientific consensus reflect a knowledge deficit. Science communicators have made concerted efforts to educate the public with an eye to bringing their attitudes in line with the experts. These initiatives have met with limited success, which has led to calls to abandon this approach altogether. Our findings highlight a difficulty that is not generally appreciated. Those with the strongest anti-consensus views are the most in need of education, but also the least likely to be receptive to learning; overconfidence about one’s knowledge is associated with decreased openness to new information. This suggests that a prerequisite to changing people’s views through education may be getting them to first appreciate the gaps in their knowledge.

Just two misleading claims by politicians were tweeted 10 times more often than 3,200 Russian troll tweets. U.K. researchers found that misinformation from politicians was much more impactful than thousands of troll and bot tweets (working paper). They looked at claims and tweets during Brexit and found in part:

    In particular, just two of the many misleading claims made by politicians during the referendum were found to be cited in 4.6 times more tweets than the 7,103 tweets related to Russia Today and Sputnik and in 10.2 times more tweets than the 3,200 Brexit-related tweets by the Russian troll accounts.

This fits well with a recent Medium piece by the University of Michigan’s Brendan Nyhan, who has written about this before. He writes:

The most worrisome misinformation in U.S. politics remains the old-fashioned kind: false and misleading statements made by elected officials who dominate news coverage and wield the powers of government. As 2016 illustrated, the costs of making unsupported claims are low in highly partisan contexts, which limits the incentive for politicians to avoid them. Reading a fact-check of Trump’s convention speech, for instance, reduced false beliefs that crime was increasing in the long term but did not affect his support.

Trump has gone on to make thousands more false claims during his first two years in office, many of which are amplified in cable news chyrons or in credulous online news headlines. As a result, a sizable minority of Americans still believes some of his most frequently repeated false claims. These beliefs persist despite unprecedented fact-checking efforts, which struggle to overcome the effects of polarization in media trust. Even more corrosively, Trump’s supporters are increasingly rationalizing those falsehoods. Belief in the importance of presidential candidates being honest has fallen from 71 percent among Republicans in 2007 to just 49 percent today, threatening the previously uncontested norm that the president should be expected to say things that are true, or at least not obviously false.

Illustration from L.M. Glackens’ The Yellow Press (1910).

    PART OF A SERIES     Real News About Fake News