This Bill to Reform Section 230 Is Bad News for Big Tech. It's Even Worse for the Little Guy

For most people, the details of the "26 words that created the internet" are probably a little fuzzy. Known colloquially as "Section 230," the law is a provision of the Communications Decency Act that gives internet companies immunity from liability for the content their users publish on their platforms. It also protects those companies from being sued over the content moderation decisions they make on those platforms.

As important as the law is to making the open internet possible, most people had probably never heard of Section 230 until politicians started talking about reforming or repealing it in the last year or so. That's not a terrible thing on its face--there's room to argue that the law couldn't have foreseen every possible scenario of online content and that some amount of change is warranted.

The latest effort comes in a bill from Senator Mark Warner (D-Va.) called the Safe Tech Act. While Warner and his co-sponsors' intentions may be good, the bill is very bad. And, the thing is, it's not just bad for big tech companies like Facebook or Twitter. It's especially bad for the startup economy.

Under the bill, the law would no longer grant protection if "the provider or user has accepted payment to make the speech available or, in whole or in part, created or funded the creation of the speech." While the authors say that's primarily targeted at ads, it won't end there.

YouTubers, for example, get paid for posting content on YouTube through ads and sponsorships. Under the proposed changes, YouTube could be liable for the content posted by every creator, even if a creator isn't paid directly by YouTube. But YouTube isn't even the biggest problem.

More concerning is that in the event of a lawsuit, the bill would eliminate what is currently an "immunity," and instead provide for an "affirmative defense." That would require a company to prove it shouldn't be held liable.

Currently, if you were to sue Facebook over a review someone left of your business, the social media company would ask the judge to dismiss the case on the grounds that it is protected by Section 230, and that would be the end of it. Under the bill, that changes whenever money has changed hands involving either the platform or the user. In that case, the platform would have to prove that it should be protected and that your lawsuit doesn't fit one of the carve-outs provided under the new law.

Even for a company like Facebook, which has more than enough resources to hire lawyers and fight lawsuits, that burden could quickly become untenable. For small companies, however, it would be devastating.

There are hundreds of small web-hosting companies, for example. The proposed changes would make these companies liable for the content on every website or blog hosted on their platforms. There is simply no way to monitor or moderate every website on the internet.

Or, imagine you're a food blogger with a few thousand readers. You allow your readers to pay for a membership, and in return they get access to a members-only forum or the ability to leave comments on the blog. Suppose one of those members posts a negative comment about a meal they had at a restaurant in town, saying no one should ever eat there.

The restaurant owner finds out and is angry enough about it that they want you to take the comment down. If you don't, they threaten to sue you and your web host.

In the past, that threat would have been hollow. As long as the comment isn't otherwise illegal, Section 230 protects you from liability for what a member said. Under this proposal, that immunity is gone because money changed hands.

Finally, it's worth talking about intentions. To that end, I'm willing to give the authors the benefit of the doubt that their intentions are good. It would be great for everyone if there was less online harassment or discrimination.

The problem is that making providers liable for all of the content published on their platforms won't result in less harassment or discrimination. It will simply lead many of them to decide it isn't worth the potential risk. There will just be fewer platforms.

When I reached out to Warner's office, an aide referred me to a FAQ that says the new law would have no such effect on startups since they are too small to sue--a sentiment completely divorced from reality. That's part of the problem, really--while the authors seem to have good intentions, they appear not to grasp how the internet actually works.

For that matter, the bill fails to grasp how people actually work. Angry people sue all the time. It doesn't matter if the target is small.

Collecting damages often isn't even the point. The goal is to force you to take down a review, a comment, or a statement they don't like. Small businesses don't have the resources or the time to fight lawsuits, and they can end up bankrupt if they have to defend themselves.

People tend to measure the impact of what they do by their own best intentions, and not by the way it will be used by others who may not share those intentions. If your goal is to do something you consider just, it's easy to dismiss concerns that you might cause something terrible.

Yet, unfortunately, that's exactly what this bill will do.

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
