Pay packets rose by an average of 3.1% in the three months to August, the fastest pace for nearly a decade, according to UK official figures.

Nomura has agreed to pay $480m (£364m) to settle US claims relating to the mis-selling of mortgage-backed securities ahead of the financial crisis.

Audi has been fined €800m (£700m) for failings that enabled the firm to sell almost five million diesel cars with software designed to cheat emissions testing.

Social media-first publisher Ladbible has snapped up rival Unilad following its financial collapse earlier this month, saving 200 jobs.

Scottish Power and Drax, two of the UK’s biggest energy companies, have bought and sold assets from each other in a £702m deal that marks one of the sector’s biggest shake-ups in years.


Security threats from Chinese companies building 5G networks could end up “putting all of us at risk” if they are not tackled quickly, according to former security minister Admiral Lord West.

Paddy Power Betfair has been fined £2.2m over gambling check failings.


Regulation or Research? The Search for Solutions to Reduce Truth Decay in the Media

What is social media’s role in the decline of trust in the media? Is government intervention needed to help stop the spread of misinformation on these platforms? These questions were the focus of a recent RAND Corporation event in Boston on the connection between the media and Truth Decay.

The consensus of a panel of researchers: To begin solving the problem, more data, access to how social media platforms work, and transparency are needed.

Jennifer Kavanagh, a political scientist at RAND, opened the talk by defining what RAND researchers call “Truth Decay”—the diminishing role of facts and analysis in American public life. The panelists addressed how changes in the information system, including the rise of social media and the use of algorithms for news gathering, are driving Truth Decay. Kavanagh was joined by David Lazer, a professor of political science at Northeastern University, and Claire Wardle, a research fellow at the Shorenstein Center at Harvard Kennedy School and executive director of the nonprofit First Draft.

Everything you see on social media platforms is “algorithmically mediated,” Lazer said, but there’s very little research on the role algorithms play.

A lot of disinformation is being shared globally through encrypted messaging apps and text messaging services, according to Wardle. Without access to the type and volume of content spread on these closed systems, she said researchers are missing a huge part of the ecosystem. More understanding of how these platforms work is needed before society moves toward “regulation with a capital ‘R’,” she said.

Increasing transparency would be a step in the right direction, said Kavanagh. Social media platforms could provide clarity on where their advertising money comes from or open their application programming interfaces, and they could work to identify and monitor bots on their systems. But the companies need incentives and encouragement to make these types of changes, which run counter to their business model. “Whether that’s regulation or the threat of regulation remains to be seen,” she said.

“Transparency does create its own kind of incentives,” Lazer added.

Kavanagh advised that social media users and consumers of media need to be part of any solutions to address Truth Decay. “We can implement all the regulations that we want, but if people aren’t willing to look for facts and take the time to identify what is a fact, then I don’t think it makes a difference,” she said. “There has to be an understanding of why facts matter—and why it’s important to be an informed participant in democracy—if democracy is what you want.”

Laura Hazard Owen
