By Zeynep Kilik for The Opportunity Agenda
Millennials and Gen-Zs have been raised in both physical and digital spaces. The very existence of the term "IRL" (in real life) shows that our physical reality has become a second space. While the internet is a medium in which billions of individuals can share and consume all different forms of content, our online spaces have become extensions of our everyday physical reality. We can no longer treat the internet as a secondary space; instead, we must recognize it as a primary space, akin to the public square. Except this public square is inhabited by billions of people around the world, many of whom are fully anonymous.
In today’s world, conversations around internet regulation are more important than ever. In response to Google’s 2017 digital counter-terrorism efforts against ISIS, Jeffrey Chester, executive director of the Center for Digital Democracy, stated: “The danger here is that Google and Facebook are making decisions about how the future of the digital media system operates without public oversight and accountability.” Chester’s point underscores that a one-size-fits-all approach toward curtailing violent extremism online cannot guarantee the safety of all internet users. Does this mean those with dangerous views are entitled to an online platform? Of course not. Simply put, the solution is a complicated and delicate web that must balance the need to protect vulnerable groups from hate crimes with the foresight necessary to guarantee a safe exchange of dialogue online.
Millennials and Gen-Zs are no strangers to the underbelly of the internet. As we’ve grown up with the internet, we have been urged to think about our online usage from two perspectives: how we use it, and how we are used by it. In documentaries like “The Social Dilemma” and “The Great Hack,” this discussion of the user as a commodity has been dubbed “the attention economy.” The attention economy thrives on negative emotions, boosting creators and movements that feed into them. Unfortunately, this leads to a slippery slope of radicalization and, in many cases, real-world violence. The most prominent recent examples that come to mind are the mass shootings in Buffalo, NY; at Robb Elementary School in Uvalde, TX; and in Highland Park, IL. This shakes me to my core. I started to feel powerless against the constant online misinformation and extremism I would witness daily. But then I remembered the start of my own internet experience over a decade ago: witnessing an influx of distasteful, racist, ableist, sexist content, post after post, nearly nonstop.
The New York Times’ profile of the ex-YouTube radical Caleb Cain, described as a “college dropout looking for attention,” illuminates the dangerous intersection between the attention economy and far-right extremist ideology. In its report, the Times writes: “These [content creators] were entertainers, some of them were part of the alt-right, a loose cohort of pro-Trump activists who sandwiched white nationalism between layers of internet sarcasm.” Extremist content has a snowballing effect, rather than a direct hit. This dynamic is detailed in the video essay “The Alt-Right Playbook: How to Radicalize a Normie,” which traces how a fictional character, Gabe, a “normie” (also known as our everyday average person), can slowly descend into far-right extremist ideology.
Nevertheless, it is clear that the inconspicuous descent into far-right extremist ideology online can catch one by surprise. Its consequences, however, can be fatal and destructive to countless communities. The “replacement theory” cited in the Buffalo supermarket shooter’s manifesto is just one example of this. Sometimes, however, the motive is unclear. In the case of Robert E. Crimo III, the Highland Park parade shooter, many have referenced his activity in neo-fascist and far-right online communities, along with his QAnon-inspired music (which has since been erased from streaming platforms). It is important to remember that there is no standard profile of extremists, especially those who commit violence on behalf of their extremist beliefs. On the other side of this, far-right groups online regarded the Uvalde tragedy as a hoax, echoing Alex Jones’ rhetoric on Sandy Hook (for which he was found liable for defamation, with damages awarded in August 2022).
Regulatory agencies and governments have tried their hand at regulating harmful online content. In my early internet days, the controversial 2012 House bills SOPA and PIPA caused quite a stir online. My Tumblr dashboard was inundated with posts criticizing these bills, memes about these bills, and (most importantly) what to do about them. At the height of this, Tumblr itself hosted a meeting at its headquarters with several technology companies, politicians, and advocacy groups. The grassroots effort by Tumblr users was also quantified by Tumblr staff, who counted over 85,000 calls to representatives. This effort, which was led by Fight for the Future and supported by companies like Google, stopped these bills from becoming law. Of course, not every proposed internet bill met the same fate as SOPA/PIPA: the Trump-era SESTA-FOSTA and the 2017 repeal of Net Neutrality rules come to mind. YouTube, TikTok, and Twitter have also enacted their own internal regulations to prevent certain unlawful, distasteful, or downright dishonest content from being disseminated. As a result, users have had to self-censor to keep their content monetized, or even available on the site. This has been a major point of contention, raising arguments about whether one’s right to free speech, protected by the First Amendment, extends online.
At The Opportunity Agenda, we use a communications tool called VPSA: Value, Problem, Solution, Action. We stress that values-based communication can mobilize audiences and, most importantly, prevent the feeling of helplessness that comes with heavy topics like white supremacy. Our basic value here is this: free speech is a human right, and we are all entitled to it. The problem that intersects with this value is that many far-right, white supremacist, and militant groups have appropriated it to justify hate speech, online harassment, and violence. How can we help alleviate this issue? Determining a concise solution is crucial, albeit very nuanced. When it comes to protecting and ensuring the dignity of our right to free speech, we must recognize, name, and shame the arguments, talking points, and dog whistles used by hate groups to incite violence and prejudice against certain groups and ideologies.
Before we bring this all together in an actionable way, we need to recognize that no one person or group can do everything. Change is granular and is mobilized through individuals and communities working together toward a common goal. While there is a thin line between maintaining one’s right to free speech online and combating far-right and white supremacist ideology, we can begin the process of change by simply being more aware of our online surroundings and mindful of this elusive “attention economy.” The road toward a safe, equitable, and democratic online ecosystem will take time and cooperation, not only from governments, regulatory agencies, and tech companies but from everyday internet users as well.
Zeynep Kilik supports The Opportunity Agenda’s president and works closely with the organization’s board of directors, staff, and partners. She has worked in global advocacy and nonprofit administration.