Nitin Agarwal, Ph.D.,
Maulden-Entergy Chair and Distinguished Professor of Information Science,
Director, Collaboratorium for Social Media and Online Behavioral Studies (COSMOS),
University of Arkansas at Little Rock
IN THE MIDST of the recent turmoil at Twitter, our first instinct was to get in touch with Dr. Nitin Agarwal, a UA Little Rock professor who studies social media for the likes of the National Science Foundation, the Department of Defense, NATO, and the Department of Homeland Security. Predictably, Dr. Agarwal has some interesting and insightful things to say about Elon Musk and his recent acquisition, but we were really surprised at what he says about the current state of social media in general.
-------------------------------------------------
As someone who’s never been on any “social” media, I find it crazy to be starting this year asking you to talk about Twitter. But that important platform appears to be in disarray. What do you think about the changes there, and how are they affecting your team’s research?
First off, I would like to wish all your readers a happy, healthy, productive, and prosperous 2023.
Twitter is one of the most prominent social media platforms in the Western world and is of keen interest to journalists, celebrities, influencers, public figures, organizations, anyone sharing a stream of consciousness, and researchers like us, who use social media data to study human cyber-social behavior. The changes at Twitter are seismic and have happened fast. There is quite a lot to unpack.
So tell me—is Elon Musk a good thing or a bad thing?
That’s the 44-billion-dollar question. There’s been a lot of speculation about this issue—good and bad. But here’s my perspective. There are several challenges with Twitter, such as the infestation of bots, widespread misinformation, the prevalence of hate speech, disturbing and toxic content, and the inherent bias and lack of transparency in the algorithms that curate our feeds. These are all severe problems, and Musk has recognized them as such. If he’s able to address them, it will make a big difference in how people digest information from Twitter.
However, certain actions have also become a cause for concern, such as the bungled launch of the paid subscription service, the banning of several prominent journalist and influencer accounts (which has put a dent in Twitter’s ad revenue stream), and the evisceration of the content moderation team (also known as the trust and safety team). Most recently, Musk’s poll on whether he should continue as CEO drew an unfavorable response from the Twitterati, which has led to uncertainty about the company’s leadership and vision. I believe time will tell.
Aside from Twitter, how does social media in general fit into this moment?
There was a world before Twitter, and whatever happens, there will still be a world after it. In fact, many Twitter alternatives like Mastodon, Koo, and so on are already scooping up the Twitter diaspora. Having said that, Twitter has changed our communication culture. Terms like hash-tagging or retweeting didn’t exist before Twitter.
But more importantly, your question requires us to step back and think about why social media platforms, and more broadly, the Web 2.0 phenomenon, emerged in the first place. The goal was to democratize information production and consumption, meaning anyone should be able to produce, access, or consume information. Prior to social media, information production was controlled by a select group of publishers, which meant they controlled what information was available. Now anyone with a phone can tweet or live-post about things happening on the ground in that moment, turning us into citizen journalists. And that has connected us beyond all geographical barriers and across time zones. When you look at it from that angle, Twitter, and social media in general, has played a major role in accomplishing that goal of information democratization—whether it’s raising awareness about a social issue, mobilizing support for a campaign, or organizing protests and large-scale social movements.
However, there have also been cases where the adverse impacts of this technology have affected us as a society. Because of the openness of these platforms, it is hard to control who says what, giving rise to various aggressive, malicious, or deviant behaviors: spreading misinformation, wittingly or unwittingly, and thereby putting vulnerable populations at much greater risk; injecting toxicity into discourse; unleashing adversarial bots programmed to polarize a conversation; and threats posed by violent extremists (e.g., to the operational security of a mission or around military bases). Furthermore, the lack of transparency in the algorithms that curate our content feeds perpetuates implicit biases that affect our behaviors. These and other forms of deviant behavior are the focus of various studies at our COSMOS Research Center.
So how has your work changed since we talked last year? You’re focused on the entire world, and Twitter is just a little part of it.
Yes, you're absolutely right. When you look at global social media usage, Twitter is just one part of it. Much of the world’s information nowadays is consumed via platforms like YouTube, TikTok, Instagram Reels, and so on—in other words, user-generated multimedia content-sharing platforms. According to recent statistics, YouTube has the second highest share of global Internet traffic, recording 15 billion visits every day. More than 500 hours of video footage are uploaded every minute, and close to 700,000 hours of video are streamed on YouTube in that same minute. So there is a need to look beyond Twitter and, more generally, beyond text-based communication media. We need to study these fast-paced video-sharing platforms and examine the wide spectrum of emergent socio-digital behaviors and evolving communication culture, which is also the focus of some of the recent research projects at our COSMOS Research Center, with over $5 million in funding from the U.S. Department of Defense (DoD) and the National Science Foundation (NSF).
At a much broader level, we ought to be thinking about how we can leverage these platforms to bring our communities together. We need to develop ways to build resiliency in our communities, with equal stakes from scientific research, innovation, education, and policy. As we have seen, social media platforms have the capacity to lead large social movements and transform societies towards democratic values and principles. Demonstrably, these platforms have the potential to bring people together and, ultimately, to provide the means for civil discourse.
Recently, however, these principles seem to have taken a back seat. We need to make sure that civil discourse can be brought back to our communities, in order to heal them and make them more resilient. This is one of the primary thrusts of our recent research projects: to study how social networks and social media can be used in ways that help bridge the divide in our communities, heal fractured communities, reduce polarization, raise awareness about important issues like toxicity, hate speech, misinformation, and adversarial information actors (whether state-sponsored or not), and develop inoculation strategies to combat misinformation. In a competition organized by NATO’s Innovation Hub, in which 132 teams across 30 member nations participated, the solutions developed at our COSMOS Research Center were recognized by NATO as one of the top 10 solutions to counter the invisible threat of cognitive warfare. These efforts are funded by the U.S. DoD’s Minerva Research Initiative and NSF through grants totaling close to $3 million.
That brings to mind the midterm elections back in November: I frankly was surprised that the country didn't get burned down around them. Are people becoming more likely these days to have a civil discourse, or are they still just screaming at one another?
To a large extent, people are recognizing various malicious behaviors on social media platforms. For instance, we’re starting to see that the adversarial tactics that were initially quite successful in sowing discord or provoking hysteria in our communities aren’t as successful now. This could be an early indication of the increasing level of awareness about such adversarial efforts in our communities. We need to be cautiously optimistic, though, since the adversarial tactics will evolve just as we are evolving.
Moreover, social media platforms are ramping up efforts to combat malicious behaviors and stem the tide of fakery, although it is still a long and winding road ahead.
Further, social media netizens are becoming more tolerant of diverse opinions, meaning we’re seeing early signs of breaches in conventional echo chambers. It's just that we’re at a delicate phase in that process and need to keep nudging it in the right direction. With proper education, generational investments in behavioral-research-driven innovation, appropriate policy implementation, and visionary leadership at the helm of social media companies, this positive change can be amplified so that we can build resilient communities.
When did you start noticing this change?
Based on our research for various U.S. DoD- and NSF-funded projects, ranging from general discourse on social media platforms to security-related discourse in the Americas, Europe, and the Indo-Pacific region, we started seeing this change during the COVID-19 pandemic, when people began to realize the serious threats COVID-19 misinformation posed to their lives and the lives of their loved ones. I believe it was a wake-up call—that not everything being pushed on social media is true, and one has to be extremely cautious in believing what is posted on anyone's feed or Facebook page.
That's really a hopeful sign.
Indeed. History has taught us that adversity acts as a great binding force. It brings human beings together. We’ve all been through the pandemic together. We’ve felt the effects of COVID-19 in almost every part of our lives. This brought us closer, not just in fighting the pandemic, but also in fighting the “misinfodemic”—the misinformation mania that came along with it. Misinformation had always been there, but during COVID-19 it touched our lives at levels we hadn’t imagined before. COVID-19 shifted people's opinions about the information presented on social media, made us more aware, and pushed us to teach ourselves media literacy.
With seed funding from the Arkansas Research Alliance (ARA), our COSMOS Research Center launched a research-driven educational effort to raise awareness about COVID-19-related misinformation and scams in our great state of Arkansas. ARA helped establish a partnership between COSMOS and the Arkansas Office of the Attorney General, which was vital to the success of this effort. We put together a COVID-19 misinformation tracking website that shed light on misinformation, scams, and conspiracy theories running amok on the Internet. We were able to track the growing impact of our website, especially in remote parts of Arkansas that could be highly vulnerable to such misinformation campaigns. It was a highly rewarding experience to see that our work—a tiny effort in the larger scheme of things—was actually making a difference in people's lives. The World Health Organization (WHO) recognized our effort as one of the key technological innovations developed around the world to address the COVID-19 pandemic. Now, additional funding from the U.S. DoD and NSF is helping sustain this effort.
Well, is this in a way what’s happening with Facebook? They’re getting rid of a lot of people. Is this a societal correction that is affecting their business?
We witnessed seismic shifts in Big Tech in 2022. The mass layoffs were indeed shocking, whether you look at Twitter, Facebook, Google, Amazon, or other technology companies. News reports and internal company memos (now public) showed that the technology companies over-recruited during COVID-19 in anticipation of post-COVID-19 business gains or profits, which turned out to be a missed target. The tech industry refers to this recalibration as right-sizing as opposed to downsizing, which is an interesting way to look at it!
There are also cases like the bankruptcy of FTX (the third-largest cryptocurrency exchange) and the conviction of the founder of the failed startup Theranos, once the poster child of Silicon Valley. Such cases point to a bigger issue with Big Tech than simple recalibration or right-sizing. Some of the most wildly successful companies of our time lack vision beyond simply scaling up their operations. They need to develop a serious interest in the social and civic implications of their products and services. A more measured approach is needed—one that increases the currently scant oversight but doesn’t stifle the entrepreneurial spirit.
How do you view your work coming into 2023? Where will your focus be?
There are a lot of exciting things coming up for us in the next year, especially building on the lessons we’ve learned so far in the problem space of information campaigns and influence operations, which includes misinformation, disinformation, malinformation, propaganda campaigns, extremism, and other forms of deviant behavior. In particular, we’re expanding our focus to study multi-domain, multi-national, and multi-cultural settings that play out predominantly in multi-platform, multimedia information environments. The regions of interest include the Americas, Europe, the Indo-Pacific, and Africa. Our emphasis is on understanding the role and affordances of digital communication platforms in these societies and how they’re used by information actors who may have malicious intent. The year 2022 witnessed key changes to the technological landscape, which urge us to examine the co-evolution of communication technologies and human behavior in a variety of contexts. So we step into 2023 with several important research questions—social, behavioral, and computational—plus seismic shifts in the technological landscape and a set of complex real-world problems. These challenges create an infinite spectrum of opportunities where interdisciplinary and multi-sector contributions await.
-----------------------------
Funding statement: The findings shared in this interview have resulted from studies funded in part by the U.S. National Science Foundation, U.S. Office of the Under Secretary of Defense for Research and Engineering, U.S. Army Research Office, U.S. Office of Naval Research, U.S. Air Force Research Laboratory, U.S. Defense Advanced Research Projects Agency, Arkansas Research Alliance, the Jerry L. Maulden/Entergy Endowment at the University of Arkansas-Little Rock, and the Australian Department of Defense Strategic Policy Grants Program. The researchers gratefully acknowledge the support.