Stanford faculty warn of democracy’s vulnerability to technology misuse


The Stanford Daily

At an Oct. 20 seminar, Stanford faculty expressed urgent concern about the potential for significant misuse and abuse of technology surrounding the upcoming presidential election.

Panelists warned that we are at what they described as a “tipping point,” where coordinated action is urgently needed to stanch further erosion of democratic systems by malign interference on digital platforms.

The “Technology and the 2020 Election” seminar was hosted by the Stanford Institute for Human-Centered Artificial Intelligence (HAI). 

Panelists included HAI associate director and political science professor Rob Reich, former member of the European Parliament and HAI international policy fellow Marietje Schaake and Lisa Einstein M.A. ’21 from the Stanford Internet Observatory.

The seminar evolved from POLISCI 52K: “Technology and the 2020 Election,” the popular one-unit Stanford course taught by Reich and Schaake. More than 400 students from around the world are enrolled in the joint class for Stanford students and Stanford’s Continuing Studies community.

Reich and Schaake said technology has had an unprecedented impact on the election, citing recent controversies in content regulation. In the lead-up to the Nov. 3 election, social media platforms have been inundated with a record amount of disinformation, according to Reich. Tech companies have been forced to rely on automated content moderation powered by artificial intelligence, which Reich said has been mired in controversy.

Platforms’ reliance on algorithmic curation, and the resulting lack of transparency and accountability, has made it difficult to track down the malign forces attacking social media, panelists said.

“We are at the tipping point where the laissez-faire, hands-off approach is changing into a momentum of realization … that there are a lot of risks for not governing for democratic values,” Schaake said.

Schaake said that online discourse is more commonly regulated by tech companies than democratic governments, adding that tech companies have ramped up their policies regarding the regulation of online misinformation and disinformation, especially in the lead-up to the elections.

“I imagine that if we look back from 2025, or 10 years from now, we will see that [this year] will start off a number of regulatory interventions,” she said. “And I would hope that those regulatory interventions would put democracy first and not trust technology first.”

Schaake called for democratic institutions to step up in setting boundaries for acceptable discourse on these social media platforms. She also called for more independent oversight to ensure the platforms’ accountability and transparency, highlighting the opportunity for universities like Stanford to provide checks and balances by conducting independent research.

Both panelists cautioned against a U.S.-centric view of misinformation and disinformation campaigns, saying that social media platforms’ responsibility to protect people’s democratic rights from malign interference extends globally.

Given that the majority of these platforms’ users are located outside the U.S., Schaake said, misinformation and disinformation campaigns have enormous potential to incite violence in places with volatile socio-political situations, such as Myanmar and Kenya.

In Myanmar, pervasive hate speech on Facebook has further escalated ethnic tensions and incited violence. In Kenya, private messaging apps like WhatsApp, now owned by Facebook, are another channel where hatred and rumors can spread, heightening violence and tension.

“I would really hope that however important these U.S. elections are, that as soon as possible, the view extends to a global one again and that we really think about the global implications of these American companies,” Schaake said.

“It is extremely important to know that democracy is fragile and needs to be protected, and [is] not assumed to be resilient enough in and of itself,” she added.

Reich also warned of the dangerous potential for social media platforms to enable the incitement of violence in the immediate aftermath of the U.S. elections.

“I think we in the United States are at a point of vulnerability to violence, in the wake of an election, that we haven’t seen in the past decades,” he said.

He predicted that the election results might be close, potentially allowing people to use social media to spread disinformation aimed at distorting the truth and inciting violence, such as prematurely claiming victory for a certain candidate.

Reich also cautioned against withdrawing from social media and technology usage as a response to the problem of disinformation on the internet. 

“The solution to this problem is not to grayscale our smartphones, or to delete Facebook,” Reich said. “The much better solution is to think not as a consumer, but as a citizen … and to make your voice heard in various ways, so we collectively shape our technological future, rather than just give it to the engineers.”

Reich and Schaake have also written issue briefs on topics related to technology and the elections. Both will speak at the Nov. 4 Election Debrief discussion on the role of digital technologies in the 2020 election.

Contact Penny Shi at pennyshi ‘at’ stanford.edu. 
