With COVID cases rising in 42 states and 29 percent of U.S. adults hesitant about receiving a COVID-19 vaccine, the fight against dangerous COVID-related misinformation has become even more urgent. Much of the political debate and media coverage of the issue has focused on social media’s role in spreading misinformation, and legislation has been introduced to make platforms liable for this content.
But the problem of COVID-related misinformation is bigger than social media, and forcing platforms to remove misinformation would not make the problem go away. It would, however, raise serious questions of constitutionality.
Sen. Amy Klobuchar’s (D-MN) proposed Health Misinformation Act attempts to sidestep these questions by amending Section 230 of the Communications Decency Act to make platforms liable for health misinformation posted by users during a public health emergency. In its current form, Section 230 protects online services from liability for third-party content.
However, unless Congress also makes publishing health misinformation illegal, which would violate the First Amendment, lawsuits against social media companies for failing to remove health misinformation during a public health emergency would have no legal standing. In this sense, Klobuchar’s bill functions mainly as a warning for Facebook and Twitter to crack down on health misinformation.
Another issue with forcing social media platforms to remove COVID-related misinformation is the question of who should decide what qualifies as misinformation. The Health Misinformation Act would allow the Secretary of Health and Human Services to define misinformation, which brings up more First Amendment issues by giving the government power to decide between fact and fiction.
As the COVID-19 pandemic has demonstrated, this distinction is not always easy to make during an ongoing public health crisis. For instance, during COVID’s early months, U.S. officials recommended masks only for healthcare workers; as new evidence about the spread of the disease emerged, they encouraged everyone to wear masks. More recently, research from the University of Waterloo has suggested that cloth masks are not very effective in limiting the spread.
Even if COVID-related misinformation were easy to identify and regulating it raised no constitutional issues, removing misinformation from social media would solve only part of the problem. COVID-related misinformation has appeared not only on social media feeds but also on prominent news networks and even on the House and Senate floor.
Removing it from social media would also do nothing to address the root causes of many Americans’ distrust of science, medicine, and experts. Indeed, the decline in faith in experts, driven in part by the failures of many experts themselves, began long before social media became popular. Anti-vaxxers have been around since vaccines were invented. The politicization of the COVID-19 pandemic has also played a role in sowing doubt: as of July 2021, 33 percent of U.S. adults who voted for Trump did not plan to get vaccinated, compared to 3 percent of those who voted for Biden.
Addressing COVID-related misinformation on social media is a complex problem with a host of First Amendment concerns. It is only a part of the larger problem of misinformation and medical mistrust in America. An effective, multi-pronged solution to COVID-related misinformation should balance the competing interests of preventing harm and preserving free speech and target all avenues through which misinformation spreads.
Ideally, social media, traditional media, and societal leaders (e.g., elected officials, clergy, community leaders, and others) will work to identify and limit the spread of misinformation while also maintaining flexible policies that are not overly restrictive and can change as new information emerges.
If the government wants to address the crisis involving Americans ignoring medical science, it can do so not by threatening social media companies, but through dedicated efforts to boost Americans’ rationality and understanding of science while ensuring that experts live up to their responsibilities. It can start with fixing the abysmal education system. According to the National Science Foundation, 27 percent of Americans think the sun revolves around the Earth, 52 percent think electrons are larger than atoms, and 49 percent think antibiotics kill viruses. Thinking that horse medicine treats COVID is not a far step from this kind of ignorance.
At the same time, we need to hold experts to a higher standard. As The Atlantic has pointed out, and as Michael Lewis has shown in his new book, The Premonition, experts made many preventable mistakes in handling the pandemic. Graduate schools need to focus less on indoctrination and more on teaching critical thinking. Major professions need to reject fashionable groupthink and call out peers who fail in their duties. We saw such a failure during the pandemic, when several experts, including epidemiologists and public health officials, publicly stated last summer that it was acceptable for people to gather en masse to protest for Black Lives Matter, just weeks after declaring other protests, including those by conservatives, unacceptable. This dereliction of duty eroded many Americans’ faith in experts. Of course, anyone can and should be able to say whether they think people should protest in the middle of the COVID-19 pandemic, but that should be offered as a personal opinion, not couched in expertise.
And finally, both sides of the aisle need to reduce the political tribalism that has turned a deadly global pandemic into a political rather than scientific debate. But that is all a tall order—likely too tall given the state of America today. Let’s instead blame social media.