Regulation or Research? The Search for Solutions to Reduce Truth Decay in the Media

What is social media’s role in the decline of trust in the media? Is government intervention needed to help stop the spread of misinformation on these platforms? These questions were the focus of a recent RAND Corporation event in Boston on the connection between the media and Truth Decay.

The consensus of a panel of researchers: to begin solving the problem, more data, greater insight into how social media platforms work, and more transparency are needed.

Jennifer Kavanagh, a political scientist at RAND, opened the talk by defining what RAND researchers call “Truth Decay”—the diminishing role of facts and analysis in American public life. The panelists addressed how changes in the information system, including the rise of social media and the use of algorithms for news gathering, are driving Truth Decay. Kavanagh was joined by David Lazer, a professor of political science at Northeastern University, and Claire Wardle, a research fellow at the Shorenstein Center at Harvard Kennedy School and executive director of the nonprofit First Draft.

Everything you see on social media platforms is “algorithmically mediated,” Lazer said, but there’s very little research on the role algorithms play.

A lot of disinformation is being shared globally through encrypted messaging apps and text messaging services, according to Wardle. Without access to the type and volume of content spread on these closed systems, she said, researchers are missing a huge part of the ecosystem. More understanding of how these platforms work is needed before society moves toward “regulation with a capital ‘R’,” she said.

Increasing transparency would be a step in the right direction, said Kavanagh. Social media platforms could provide clarity on where their advertising money comes from or open their application programming interfaces, and they could work to identify and monitor bots on their systems. But the companies need incentives and encouragement to make these types of changes, which run counter to their business model. “Whether that’s regulation or the threat of regulation remains to be seen,” she said.

“Transparency does create its own kind of incentives,” Lazer added.

Kavanagh advised that social media users and consumers of media need to be part of any solutions to address Truth Decay. “We can implement all the regulations that we want, but if people aren’t willing to look for facts and take the time to identify what is a fact, then I don’t think it makes a difference,” she said. “There has to be an understanding of why facts matter—and why it’s important to be an informed participant in democracy—if democracy is what you want.”

Laura Hazard Owen
