MIS Speaker's Series: Elina Hwang
1 p.m. to 2 p.m. March 25, 2022
Elina Hwang, Assistant Professor of Information Systems, University of Washington, Seattle.
Title: A Nudge to Credible Information as a Countermeasure to Misinformation: Evidence from Twitter
Abstract: Fueled by social media, health misinformation is spreading rapidly across online platforms. Myths, rumors, and false information about COVID-19 and vaccines are flourishing, and the aftermath can be disastrous. A more concerning trend is that people increasingly rely on social media to obtain healthcare information and tend to believe what they read there. Given the serious consequences of misinformation, this study aims to advance our understanding of a potential cure for the infodemic we face. Specifically, we focus on a countermeasure that Twitter currently employs: nudging users toward credible information when they search for topics on which erroneous information is rampant. Twitter's policy is unique in that the intervention is not censorship but rather redirecting users away from false information and toward facts. Our analysis utilizes 1,796 news articles that contain misinformation about health topics such as measles, vaccines, cancer, and COVID-19. The analysis reveals that Twitter's policy effectively reduces misinformation diffusion. After the policy's introduction, a news article containing misinformation is 17% less likely to start a diffusion process on Twitter. In addition, tweets that link to misinformation articles are less likely to be retweeted, quoted, or replied to, leading to a significant reduction in the aggregate number of tweets each misinformation article attracts. We further uncover that the observed reduction is driven by decreases both in original tweets, which first introduce misinformation news articles to the platform, and in resharing posts, although the reduction is larger among resharing posts. Lastly, we find that the effect is driven primarily by a decrease in human-like accounts sharing links to unverified claims, not by a decrease in activity by bot-like accounts.
Our findings indicate that a misinformation policy that relies on a nudge toward credible sources, rather than on censorship, can substantially contain misinformation.
Bio: Elina Hwang is an assistant professor of information systems at the Foster School of Business, University of Washington. She earned her Ph.D. from the Tepper School of Business at Carnegie Mellon University. Her research focuses on social technologies and their impact on business and society. She employs econometrics, network analysis, and machine learning to analyze large-scale field data. She has published her research in top-tier business journals, including Information Systems Research, Manufacturing and Service Operations Management, Organization Science, and Production and Operations Management.