It’s time to close the loophole that lets tech run wild


A quiet, agreeable and incredibly important hearing was held a few days ago by the House committee that oversees the tech industry. While too many congressional hearings quickly devolve into showmanship, this one remained thoughtful and bipartisan. What topic generated such agreement? Changing a fundamental law that governs the way the internet — particularly social media companies — operates.

Section 230 of the Communications Decency Act of 1996 has remained untouched since it was crafted nearly 30 years ago. This provision has long been hailed as a cornerstone of the internet, granting platforms immunity from liability for content posted by users.  

Although there has been plenty of empty talk on the Hill in previous years about amending Section 230, something different is in the air. Members of Congress in both parties seem to get it: Tech companies shouldn’t have blanket immunity any longer — they’re hurting our kids, our mental health and our democracy. As one of the witnesses, Professor Mary Anne Franks of George Washington University, testified: “There is no justification for exempting the tech industry from the liability that virtually all individuals and industries face when they contribute to harm.”

Social media companies are now the largest and most powerful media companies on Earth, yet they don’t have to play by the same rules as other companies, including traditional media companies, even though they look more like traditional media companies every day. They are audio and video broadcasters, they are publishers, they are online amusement parks and gaming venues, and much more. They can aggregate audiences in the tens of millions instantaneously. The main difference between them and traditional media companies is that most of their content is produced by freelancers, for free — or what they like to call “users” and “creators.”

When I was a magazine publisher, probably 95 percent of our content was produced by freelancers (or creators) who weren’t on our payroll, but we paid most of them for it, and we edited them and vetted them — including letters to the editor — because we could be held liable for what we published.  

Consider a recent example: Fox News agreed to a massive $787.5 million settlement of a defamation lawsuit filed by Dominion Voting Systems, stemming from false claims made about Dominion’s machines in the 2020 election. While Fox News faced legal repercussions for spreading those falsehoods, platforms like Twitter, where similar falsehoods about Dominion circulated even more widely and were monetized, have completely evaded consequences. It’s remarkable that they’ve been allowed for so long to hide behind the veil of being “neutral platforms.”

As Rep. Frank Pallone (D-N.J.) said at the hearing: Their “get-out-of-jail-free card has to end.” And all of his colleagues — Democrats and Republicans — agreed with his sentiment.  

But how, specifically, would that happen? One way is simply to repeal the first part of Section 230 — subsection (c)(1) — which shields tech companies from responsibility for what’s posted on their sites. But a more nuanced approach might be necessary. Specifically, social media companies (and other sites) should be exposed to liability when they exhibit deliberate indifference to harm caused by speech or have agency in promoting content — from signing revenue-sharing deals with content creators to making curatorial choices to intentional machine (algorithmic) boosting.

What will be necessary to make this work for both big and small companies is a legal mechanism for people to document harm and request takedowns of content. For that, we can draw inspiration from the Digital Millennium Copyright Act (DMCA), which was created to address copyright infringement online and holds platforms legally accountable if they fail to remove, or pay for, copyrighted content.

Free speech thrived in America long before the advent of the internet and social media. Our nation has the best speech laws in the world, with a rich jurisprudence supporting the First Amendment. To be clear: None of that will change. Any updates to Section 230 will be grounded in our country’s proud tradition of safeguarding free expression while also protecting those being spoken to, or about. Radicals, geniuses, revolutionaries, conspiracy theorists and heretics will be just as free as they were before the internet.


Beyond its legal implications, altering Section 230 would have profoundly positive cultural and moral effects. Laws do more than dictate what is legal or illegal; they shape the norms and values of society. By absolving platforms of responsibility, Section 230 has, for nearly 30 years, fostered a mentality that anything goes online, regardless of its impact on individuals or society as a whole. We’ve all seen the disastrous consequences of that mentality on ourselves, our kids, our personal and familial relationships, and on the tone and tenor of our conversations and debates about America. So changing Section 230 will not just change laws; it will also help dilute the toxicity in the collective bloodstream.

In the middle of the House hearing, Rep. Buddy Carter (R-Ga.) said: “My daddy used to tell me, ‘When you don’t do somethin’, you’re doin’ somethin’.’ So we’ve gotta address this.” Indeed, it’s time for Congress to do something, and the bipartisan path and sense of urgency seem increasingly clear.

Nick Penniman is founder and CEO of Issue One.
