Lawmakers propose regulation to limit COVID-19 misinformation online

In recent months, U.S. lawmakers have proposed a series of bills designed to rein in harmful political speech, conspiracy theories and other false news as misinformation about COVID-19 has spread online.

The Information Technology and Innovation Foundation (ITIF) will discuss the various proposals that have been floated, along with their possible implications, during a Nov. 9 webinar called "Protecting Political Speech While Reducing Harm on Social Media."

The topic takes on extra significance as lawmakers increasingly scrutinize companies including Meta Platforms Inc.'s Facebook platform, Google LLC and Twitter Inc. over misinformation spreading online. Misinformation is often financially or politically motivated, intended to push a particular view, and can end up harming users and misleading voters.

Several public- and private-sector efforts are underway to curb coronavirus misinformation and hold online platforms accountable for harmful content.

U.S. Rep. Jennifer Wexton, D-Va., and Sen. Mazie K. Hirono, D-Hawaii, in March introduced the COVID-19 Disinformation Research and Reporting Act of 2021, which would require the National Science Foundation to contract with the National Academies of Sciences, Engineering, and Medicine to study misinformation about COVID-19 on social media platforms.

In July, Sen. Amy Klobuchar, D-Minn., and Sen. Ben Ray Luján, D-N.M., introduced the Health Misinformation Act of 2021, which would hold online platforms liable for health misinformation posted by users during public health emergencies. If passed, the bill would create an exception to the Section 230 liability shield for platforms with algorithms that promote misinformation related to an existing public health emergency.

And in October, U.S. Rep. Frank Pallone Jr., D-N.J., and other lawmakers introduced the Justice Against Malicious Algorithms Act that would lift the Section 230 liability shield when an online platform knowingly uses an algorithm or other technology to recommend content that contributes to physical or severe emotional injury.

"Social media platforms like Facebook continue to actively amplify content that endangers our families, promotes conspiracy theories, and incites extremism to generate more clicks and ad dollars," Pallone said in a statement. "The time for self-regulation is over, and this bill holds them accountable."

Meanwhile, several online platforms have been proactive about curbing misinformation, taking down posts with false information and policing content to limit its spread.

Twitter is labeling and removing misinformation about the coronavirus, including false claims about how the virus spreads and misleading instructions on how to self-diagnose. Google's ranking algorithms look for signals that correlate with a page's trustworthiness, and the company prohibits sites that misrepresent their ownership or primary purpose. To prepare for elections, Facebook's security teams take down inauthentic accounts, groups and pages that seek to manipulate users.

Ashley Johnson, policy analyst for ITIF, said the Nov. 9 discussion will also explore alternatives to the currently proposed legislation, including the creation of company best practices that would reduce harmful speech and unwanted content, as well as regulations that would hold the firms accountable to content moderation standards.

She said regulation that directly dictates what people can say, and what companies can allow users to say on their platforms, could conflict with the spirit of the First Amendment, which protects free speech.

"A lot of these proposals would probably be struck down in court," Johnson said. "There are more things that we can do rather than the knee-jerk reaction of just telling social media platforms what they should and should not do."

Government

Nov. 10 The Federal Communications Commission will host its Precision Ag Connectivity Task Force meeting at 10 a.m.
Industry, legal and think tank events
Nov. 8-9 The National Association for Business Economics will hold its fifth annual Tech Economics conference where economists, data scientists and business leaders will discuss and demonstrate artificial intelligence-based computing, machine learning and behavioral economics.
Nov. 8-11 NVIDIA's GTC, a conference and training event focused on AI, graphics, accelerated computing, data centers and more, takes place.
Nov. 9 Silicon Flatirons will host a webinar titled "The State of the Art of Artificial Intelligence in the Practice of Law" that will feature a discussion on the use of AI in law and its benefits and pitfalls.
Nov. 9 ITIF will host a webinar titled "Protecting Political Speech While Reducing Harm on Social Media."
Nov. 9-10 The New York Times' DealBook Online Summit is taking place. Speakers will include Apple CEO Tim Cook, General Motors CEO Mary Barra and Pfizer CEO Albert Bourla.

Stories of note

Cloud industry continues to soar as demand set to extend past pandemic

Amazon, large retailers well positioned to weather holiday supply chain pain

Qualcomm leads other major US chipmakers in setting net-zero goal

Meta challenges ahead after Facebook rebrand

Facebook's facial recognition about-face sparks renewed calls for US federal law

Some external links may require a subscription. Links are current as of publication time, and we are not responsible if those links are unavailable later.