Two Clemson researchers fighting online disinformation have a new hub for truth

The past few years have been a whirlwind for Darren Linvill, an associate professor in Clemson’s Department of Communication, and Patrick Warren, an associate professor in Clemson’s John E. Walker Department of Economics. From their initial project in 2017 to uncover and expose more than 3 million Russian Twitter troll tweets, the researchers have become international experts in identifying and exposing social media disinformation campaigns.

Hundreds of news articles and countless social media impressions later, a new venture has launched to broaden Clemson’s impact in this area. The University’s new Media Forensics Hub, part of the Watt Family Innovation Center (a flexible space on campus for collaboration, innovation and project development), will expand the ways Clemson can help journalists, policymakers and fellow academics around the world gain a more complete understanding of the complex social media landscape.

But the Media Forensics Hub goes beyond educating those who study media. It also is designed to empower all users of Facebook, Twitter, Instagram and other social media sites to better understand disinformation: how it works, how to spot it and how to stop it from spreading.

A hub for humans

Two years have passed since the day two major national publications released breaking news stories on Linvill and Warren’s research. It happened to be Linvill’s birthday, and, as it turned out, it also became a celebration of the stunning realization that they were truly onto something.

“When our work was highlighted in the Washington Post and Wired magazine on the same day, I remember thinking, ‘Woo, my next year is sure going to be different,’” Linvill said. “I completely underestimated just how different it would be.”

Now, the Media Forensics Hub will work to build society’s understanding of the context, origins and impact of modern media and, in the process, maintain Clemson University’s role as a leader in this new field of research. The center is a virtual space based in the Watt Family Innovation Center that will house disinformation data as well as resources and digital tools for spotting disinformation campaigns.

“We want the Hub to be a service to the state of South Carolina, to journalists, to the public, to the government and to Clemson,” Linvill said.

Funded by a four-year grant obtained by Watt Center Director Todd Marek through the South Carolina Research Authority, the Media Forensics Hub will be managed by David White, a research assistant professor in Clemson’s Department of Parks, Recreation and Tourism Management. Steven Sheffield, a Ph.D. student with two decades of media forensics background, also will be working with Linvill and Warren to manage the Hub.

In addition to disinformation data, the hub will house data and facilitate research on misinformation, which occurs when someone unintentionally shares untrue information. Disinformation, Warren said, occurs when someone intentionally spreads untrue information to cause chaos.

For researchers, it will be a virtual space to find collaborators and aggregate outside research and data related to disinformation. One of the goals of the hub is to foster a multidisciplinary community of students and researchers. To achieve this goal, the Hub has started a working group of scholars from around the world: Clemson, Duke University, the University of Texas at Austin, the University of Wisconsin, MIT Lincoln Laboratory and Wilfrid Laurier University in Canada, all working to understand the flow of misinformation. In the working group, researchers present their findings to one another, solicit feedback and line up additional collaborators.

The professors’ efforts with this working group and with each other help strengthen the influence of this disinformation research, said Wendy York, dean of the College of Business.

“Patrick and Darren’s work is a perfect example of the power cross-college research collaboration can have for the University and beyond,” York said. “Leveraging that collaborative power internally and working with external entities keeps the University’s research current and relevant.”

Training everyday troll spotters

Continued academic research will be central to the work of the Hub. But academic research alone won’t solve the disinformation problem, which is why the Hub has also been designed to provide resources and tools for the general public to help identify disinformation campaigns earlier and more definitively.

[Infographic: How to spot a fake account. Are the profile images of an everyday person? Is the account anonymous? Is there a personalized description? Is the account verified? Most accounts have some personal posts; does this one?]

One such tool will be “Spot the Troll,” an online learning quiz disseminated through the Hub. The quiz trains participants to more easily differentiate between a troll account and a real person’s account on social media. New methods of media literacy education are necessary for stopping the spread of disinformation, even as tactics for its dissemination continue to change and evolve.

“Information-spread has shifted from a centralized media system to one that is much more disaggregated and thrives on people’s reliance on news from social media,” Warren said of the past two decades. “Journalists have guidelines for how to present news within the proper context, but a random person on Twitter doesn’t have those guidelines.”

Just as important is extending that knowledge and awareness to the next generation, the professors explained. That’s why the Hub is incorporating the work of students and involving them in research about the value of understanding disinformation. This semester, a Clemson University Creative Inquiry undergraduate research class will investigate disinformation campaigns connected to the 2020 U.S. presidential election, among other hot-button topics. This work and student research are prime examples of Clemson’s land-grant mission to help the state and of the College of Behavioral, Social and Health Sciences’ (CBSHS) mission, said CBSHS Dean Leslie Hossfeld.

“As the country begins its final sprint toward the 2020 presidential election, professors Linvill and Warren’s involvement in the new Media Forensics Hub contributes to the College of Behavioral, Social and Health Sciences’ commitment to building people and communities in our ever-connected world,” Hossfeld said. “At a time of national importance, their activities extend the impactful research and teaching happening across the college, including work in our nationally recognized Social Media Listening Center.”

Linvill is also teaching an independent study with three Ph.D. students focused on researching disinformation. The communications expertise he brings into the classroom includes historical context about disinformation, which often employs principles that communication scholars mapped out a generation ago.

“The same strategies currently being employed by trolls have served advertising practitioners for years,” Linvill said. “The Russians are simply following in the footsteps of strong public relations campaigns.”

Because of this research, Warren has introduced an entirely new unit about propaganda for a graduate class he teaches for the College of Business about the intersection of politics and economics. He also has given community lectures about disinformation to Rotary clubs, public affairs groups and the League of Women Voters, among others.

Outing what’s fake

Close to home, this past June, Linvill and Warren worked with Clemson University leaders to identify a fake account that posed as a Clemson student and spread racist content. In 2019, they partnered with the Wall Street Journal to investigate how Chinese disinformation was impacting the National Basketball Association.

These are just two of the practical, real-world applications of their research at work, and they point to the direction the professors see the Hub’s impact going. Of late, the professors have mostly studied disinformation surrounding elections, but disinformation campaigns, a growing concern worldwide, target much more than politics, Linvill and Warren said. In other words, when November 3 passes, the problem won’t go away.

“Misinformation and disinformation affect our perceptions of a range of things outside of elections such as science, public health and social movements,” Linvill said. “It’s also of concern to businesses and brands, which can be negatively affected by these campaigns.”

And the reality is, the work of two University researchers could never expose every targeted disinformation campaign in the world. However, the Media Forensics Hub, through its efforts to expand, engage and educate the public at large, makes it more likely that disinformation will be easier to spot.

That means, one day soon, we may be able to read it, refute it and within moments, simply move along.