The Fort Morgan Times

2024 is the year of elections. Let’s act now to protect them at all costs

- By Paige Alexander and Kristin Lord, Chicago Tribune

As we look ahead this year, voters in more than 50 countries, including the United States, will go to the polls.

The elections will take place during a period of global democratic backsliding and in rapidly changing social media environments characterized by new threats from generative artificial intelligence and tech platforms’ reductions in trust and safety protections. The challenges to election integrity and public confidence are daunting and require all levels of society — individuals, communities and institutions — to act.

Three times as many countries are moving toward autocracy as toward democracy. The quality of elections has worsened in at least 30 countries over the past decade — a period marked by increasing attacks on the media and freedom of expression, as well as diminishing levels of citizen trust in democratic institutions. At the same time, some major social media companies are signaling that election integrity is not a priority, in effect opening the door wider to abuse by state and nonstate actors. In the past year, Big Tech layoffs have decimated trust and safety teams responsible for combating election misinformation and malign interference around the world.

Amid the content people are inundated with — text, audio, video and photos — comes a wide range of manipulative content, such as deliberately misleading information about vaccines or hateful language that could incite individuals to violence. The challenge is that our brains are wired to generate strong reactions to emotional content and to share it with others. Manipulative information also takes advantage of cognitive shortcuts our brains developed to make sense of the world quickly. Traits that were self-protective hundreds or thousands of years ago now make us more susceptible to manipulation in an era of social media.

Fortunately, people can learn skills to help sort fact from fiction and recognize and manage reactions to emotional content. By understanding how social media algorithms cater to our biases and drive emotional responses, we can exert greater control over our own reactions. Taking control of these very human vulnerabilities to manipulative information can help citizens vote based on reasoned decisions about what is best for our communities, not reactions to content generated by others trying to manipulate us for their own purposes.

Given the sheer number of elections around the world this year, stronger steps to protect citizens from manipulative information are needed. This would be a mammoth challenge for tech platforms even with fully staffed trust and safety teams, let alone their shrinking departments. The situation is especially concerning when factoring in the unknown threat posed by generative AI.

Social media platforms are now even more susceptible to weaponization by bad actors than in 2020, when misinformation peddlers coordinated to create and spread false narratives about election fraud faster than they could be moderated. Protections that were introduced after the spread of hateful narratives on social media led to the ethnic cleansing of Myanmar’s Rohingya population in 2017 have since been rolled back.

While it’s impossible to fully prevent the spread of election misinformation, there are actions we can take to mitigate adverse impacts.

First, we can build resilience to manipulative information. The public must be equipped with knowledge and skills to critically evaluate sources of information, discern what’s credible and what’s false or misleading, recognize manipulative content and refrain from unwittingly spreading misinformation. A 2021 United Nations report cited low levels of “digital and media literacy” worldwide and called on countries to support digital literacy.

We can support independent media reporting accurately and fairly on elections, candidates and their policies. As the counterpoint to manipulative information, such high-quality media content can help people make informed decisions and hold those in power accountable.

We can help educate the public on election technologies. Black box technology, a system producing information without revealing its sources, is ripe for false narratives. Educating the public on how voting and counting technologies work and the steps that impartial officials take to verify election results could address the information vacuum in which misleading or manipulative content thrives. This could include expanding voter education beyond when and how to vote and emphasizing information about vote counting, processes for verifying results and safeguards to fair elections.

And we need independent evaluations of elections. This means a commitment to supporting nonpartisan election observation to assess the quality of the election process and outcomes. Public reports issued by credible observation missions provide independent and systematic assessments that can counter misinformation narratives and help promote public confidence. As election processes are increasingly intermediated by technology, election observers must continuously adapt their methodologies to integrate assessments of the online information space and the technology used to record and tabulate votes.

With heightened threats to public confidence in democracy and elections, international donors should prioritize funding focused on information integrity, media literacy and new challenges in digital technology — in as many countries as possible.

With Big Tech abdicating its responsibility amid a spate of high-stakes elections in 2024, the time to act is now.

Paige Alexander is CEO of The Carter Center, a not-for-profit organization with a mission of advancing peace and health worldwide. Kristin Lord is president and CEO of IREX, a nonprofit dedicated to building a more just world by empowering youth, cultivating leaders, strengthening institutions, and extending access to quality education and information.

©2024 Chicago Tribune. Visit at chicagotribune.com. Distributed by Tribune Content Agency, LLC.
