
By Michael Blanding

An old adage holds that a lie can travel halfway around the world while the truth is still putting on its shoes. But none could have foreseen the power of social media to spread disinformation worldwide.

Sinan Aral PhD ’07, the David Austin Professor of Management at the MIT Sloan School of Management, where he holds a joint appointment in the IT and Marketing groups, recently put the adage to the test by examining all the tweets sent in Twitter’s first 10 years. Published in Science in 2018 with colleagues Soroush Vosoughi ’08, SM ’10, PhD ’15 (now an assistant professor at Dartmouth) and MIT professor of media arts and sciences Deb Roy SM ’95, PhD ’99, the study found that false news stories on Twitter spread six times faster than true ones, and that the most far-reaching false stories spread to as many as 100,000 people, while true stories rarely reached more than 1,000.

“False stories diffused further, faster, deeper, and more broadly than the truth, in every category of information that we studied,” says Aral. “Sometimes by an order of magnitude.” For two decades, Aral has studied “social contagion” among connected users online. His work will culminate in a new book, The Hype Machine, to be published by Crown this September, on the eve of the 2020 US election. The timing is fitting, given concerns over Russian interference in the last election, as well as the political disinformation that trolls continue to propagate. Not all social contagion is bad, however. “This technology has the potential for tremendous promise and tremendous peril,” says Aral. “It depends on how you use it.”

Aral began examining how information spreads online in 2001, as a PhD student in managerial economics at MIT. He has led one of the primary research groups within the MIT Initiative on the Digital Economy (IDE) at MIT Sloan since its inception six years ago, and in July he will become the IDE’s director. He examines how advertisers, governments, and nonprofits harness social media to influence online users. “Social media is really just a behavior change agent,” he says. “If we point it toward problems we want to solve, we can do a lot of good in the world.”

In an ongoing controlled study in South Africa, for example, Aral is examining the efficacy of a program that uses phone messages from loved ones to encourage HIV testing. In another study, he examined peer influence on exercise using a running app, finding that people were more apt to run in inclement weather if a friend had also run on the app that day.

Determining whether false news actually changed votes has been trickier. Even though 126 million Americans were exposed to Russian propaganda in 2016, researchers have been unable to tell how that exposure affected the election, largely because Facebook and other companies refuse to share individual-level data, a tension Aral calls the transparency paradox. “On one hand, they are facing tremendous pressure to show us how it all works,” he says. “On the other, they are facing tremendous pressure to lock it all down to not violate people’s privacy.”

Techniques do exist, however, to anonymize data and fulfill both needs. In another Science article last year, Aral and Dean Eckles, the KDD Career Development Professor in Communications and Technology at MIT Sloan, argue that employing them is essential, both to determine disinformation’s effects on the last election and to protect the next one. “Voting is a cornerstone of our democracy,” Aral says. “If we are going to harden our democracy against that threat, we have to understand how it works.”