
Did social media companies live up to promises to combat election misinformation after 2020? What new technological threats could undermine the 2024 elections? How are companies addressing election integrity issues this time around?

Social media platforms face rising pressure to address new AI disinformation threats that could undermine the 2024 elections. Experts warn that companies’ failure to address sophisticated “deep fakes” threatens American democracy and could lead to violence.

Keep reading to learn how AI misinformation could affect the coming election.

AI-Generated Misinformation

As the 2024 elections rapidly approach, social media giants face rising pressure to combat new threats posed by AI-generated election misinformation. Their actions—or inaction—could shape the future of democracy.

Background

With billions of users, social media companies like Facebook, TikTok, YouTube, and X (formerly Twitter) have vast reach and influence. They can shape public opinion and impact US elections through the spread of information, both true and false. By 2020, 53% of voters relied on social media for news and information, underscoring the tech giants’ significant role in election outcomes.

Managing Election Mis- and Disinformation 

After the 2020 elections, major social media platforms pledged to combat disinformation more effectively. They began flagging or labeling posts with misleading content. However, in the face of pressure and backlash over allegations of anti-conservative bias, companies wavered on their content moderation decisions.

Current Election Integrity Risks

As 2024 nears, experts are sounding the alarm about the grave threat that powerful artificial intelligence (AI) technologies pose to the electoral process. Mainstream AI tools like ChatGPT can generate ultra-realistic fake text, while related tools produce fabricated audio and video known as "deep fakes," and all of these are ripe for abuse. Fraudsters can now fabricate false statements and misleading footage that show candidates behaving improperly, or create fake evidence of vote manipulation. Researchers say that society remains largely defenseless against these sophisticated fakes.

Experts caution that AI-driven mis- and disinformation circulating on social media platforms represents a severe threat to American democracy.

Election Integrity in 2024

Key initiatives that social media companies are implementing or already have in place to support the integrity of the 2024 elections include the following:

  • Meta says it will:
    • Block new political ads a week before the election.
    • Partner with fact-checking organizations to assess claims.
    • Remove vote-suppressing and false voting information.
    • Require political advertisers to disclose AI-generated content in ads about social issues or elections. Failure to do so will lead to the ad's removal.


Hannah Aster

Hannah graduated summa cum laude with a degree in English and double minors in Professional Writing and Creative Writing. She grew up reading books like Harry Potter and His Dark Materials and has always carried a passion for fiction. However, Hannah transitioned to non-fiction writing when she started her travel website in 2018 and now enjoys sharing travel guides and trying to inspire others to see the world.
