    Aug. 15, 2018, noon
    Reporting & Production

    Democracy is cracking and platforms are no help. What can we do about it? Some policy suggestions

    Here are a few from a new Canadian report: greater transparency requirements for digital news publishers, holding social media companies legally liable for the content on their platforms, and mandatory independent audits of platform algorithms.

    Platforms aren’t effectively self-regulating (or we know that all too well by now). The internet can be a cesspool of spiteful users and malicious bots, and yes, in some places, a home for digital communities and positive connections. But what can be done?

    How about requiring internet companies to be legally liable for the content appearing in their domains?

    Auditing algorithms regularly and making the results publicly available?

    Launching a large-scale civic literacy and critical thinking campaign?

    Giving individuals greater rights over the use, mobility, and monetization of their data?

    These are some of the suggestions floated in a new Canadian report by Edward Greenspon, Public Policy Forum CEO and former Globe and Mail journalist, and Taylor Owen, University of British Columbia assistant professor and Columbia Journalism School senior fellow. The ideas are bold, sure, and maybe a little far-fetched, especially when viewed from the very different regulatory context of the United States, but hey, bold thinking is at least somewhere to start.

    “We believe that certain behaviors need to be remedied; that digital attacks on democracy can no more be tolerated than physical ones; that one raises the likelihood of the other in any case; and that a lowering of standards simply serves to grant permission to those intent on doing harm,” they wrote.

    Greenspon also authored a report last year, The Shattered Mirror, on the state of Canadian news media, with 12 specific steps.

    In the new report, Greenspon and Owen start with assumptions like “there is a necessary role for policy; self-regulation is insufficient on its own” and “elected representatives have a responsibility to ensure the public sphere does not become polluted with disinformation and hate by setting rules, not by serving as regulators.”

    (Side note: In a survey with results out today, though from Canada’s southern neighbor, internet users narrowly preferred holding companies accountable for accurate and unbiased information over having the government get involved. A third, however, felt that users themselves should be the ones responsible.)

    The recommendations also push for more transparency and accountability from the platforms and companies that host the vast majority of public dialogue today.

    “The internet represents the greatest advance in communications since the printing press, but its consolidation by a handful of giant global companies and the exploitation of its vulnerabilities by individuals and organizations intent on destabilizing our democracy have reversed its early promise and challenged the public interest,” Greenspon and Owen wrote.
