Stock markets have retreated again over worries of further US interest rate rises after the Federal Reserve defied Donald Trump to increase rates for the fourth time this year.

The EU has confirmed it is “actively investigating” a potential breach of its diplomatic communications network, following reports that secret cables had been stolen by hackers.

The Bank of England has welcomed a “crucial and positive” move by the EU to help keep a key part of the financial system functioning in the event of a “no-deal” Brexit.

A handful of banks will be forced to write multimillion-pound cheques to buy shares in the construction giant Kier Group after some of its biggest investors snubbed the chance to take part in a £250m fundraising.

GlaxoSmithKline (GSK) is to merge its consumer healthcare unit with that of rival Pfizer, to create a new market leader with almost £10bn in annual sales.
Santander has been fined more than £30m for “serious failings” in processing the accounts of dead customers, the Financial Conduct Authority (FCA) says.
Regulation or Research? The Search for Solutions to Reduce Truth Decay in the Media

What is social media’s role in the decline of trust in the media? Is government intervention needed to help stop the spread of misinformation on these platforms? These questions were the focus of a recent RAND Corporation event in Boston on the connection between the media and Truth Decay.

The consensus of a panel of researchers: To begin solving the problem, researchers need more data, greater access to how social media platforms work, and more transparency.

Jennifer Kavanagh, a political scientist at RAND, opened the talk by defining what RAND researchers call “Truth Decay”—the diminishing role of facts and analysis in American public life. The panelists addressed how changes in the information system, including the rise of social media and the use of algorithms for news gathering, are driving Truth Decay. Kavanagh was joined by David Lazer, a professor of political science at Northeastern University, and Claire Wardle, a research fellow at the Shorenstein Center at Harvard Kennedy School and executive director of the nonprofit First Draft.

Everything you see on social media platforms is “algorithmically mediated,” Lazer said, but there’s very little research on the role algorithms play.

A lot of disinformation is being shared globally through encrypted messaging apps and text messaging services, according to Wardle. Without access to the type and volume of content spread on these closed systems, she said researchers are missing a huge part of the ecosystem. More understanding of how these platforms work is needed before society moves toward “regulation with a capital ‘R’,” she said.

Increasing transparency would be a step in the right direction, said Kavanagh. Social media platforms could provide clarity on where their advertising money comes from or open their application programming interfaces, and they could work to identify and monitor bots on their systems. But the companies need incentives and encouragement to make these types of changes, which run counter to their business model. “Whether that’s regulation or the threat of regulation remains to be seen,” she said.

“Transparency does create its own kind of incentives,” Lazer added.

Kavanagh advised that social media users and consumers of media need to be part of any solution to Truth Decay. “We can implement all the regulations that we want, but if people aren’t willing to look for facts and take the time to identify what is a fact, then I don’t think it makes a difference,” she said. “There has to be an understanding of why facts matter—and why it’s important to be an informed participant in democracy—if democracy is what you want.”

Laura Hazard Owen
