The 19th Century Idea that Could Help Fix Big Tech

Should tech companies protect users, including children, from harm? Or be neutral and let users have free rein? There’s a way to do both.

Apr 7, 2024 - 14:59

When it comes to tech companies, lawmakers often seem to be talking about different things at once. Sometimes they are angry that tech companies don’t take enough action to protect their users, saying the companies need to be held liable for harms, particularly harms to children. At other times, they complain that tech companies take too much action by unfairly excluding some users and content from social media.

These two issues — liability and deplatforming — are sometimes treated as separate, but they are closely connected.

In January, social media executives appeared before the Senate Judiciary Committee in hearings focused on the harms that the platforms have done to children. Tech companies currently enjoy broad immunity from liability for content posted on their platforms thanks to a provision of federal law known as Section 230. Senators from both parties have said they want to repeal Section 230, and at the hearing they excoriated the executives, asking why the companies should not be held financially liable for the injuries caused and demanding further protections for children from harmful posts and algorithmically curated content.

If Section 230 were repealed and platforms were sued, they would invariably take down content that harms kids in order to prevent future lawsuits. In other words, they would need to deplatform harmful content and potentially individuals who post it.

But while many elected officials might welcome such actions, others are pushing in the opposite direction. For instance, state legislators in Texas and Florida have passed laws that would severely restrict the ability of social media platforms to deplatform individuals and content. The platforms have challenged the laws in the Supreme Court, and at oral argument in February, the justices seemed to have a hard time grappling with the question of whether the states can prevent deplatforming.


This tension between content moderation and deplatforming also affects the people who run social media sites. While Elon Musk was skeptical of deplatforming and content moderation before buying Twitter (now X), his platform has since done both: suspending Ye, aka Kanye West, for antisemitic posts, blocking accounts that tweeted the location of Musk’s private jet, and deplatforming links to a competitor app, Mastodon.

These various actions raise a big question, perhaps one of the central conundrums in the age of social media: How should we think about deplatforming? Should tech platforms allow all content to be posted and all users to post, in line with principles of equal access? Or should they restrict content and users that harm children, other users, or society?

This question might seem novel, but it is actually very old. For hundreds of years, American law has been grappling with the problem of whether, when, and how to deplatform individuals and content. Not on social media platforms, which obviously didn’t exist, but in public utilities and other infrastructural businesses.

Laws in these areas offer important insights into how to regulate social media and other tech platforms. The first step is to start thinking of technology a bit differently, less as a product or service and more as a utility.

Enterprises that provide infrastructure or utility-like services have long been categorized differently from those that sell ordinary goods. In the 18th and 19th centuries, ferries, mills, blacksmiths, inns and wharves, among other things, were included in this category. With industrialization, successive generations of Americans added the telegraph, telephone, railroads, pipelines, airlines and other businesses, usually in the transportation, communications and energy sectors. In a new book, my co-authors Morgan Ricks, Shelley Welton and Lev Menand and I call this category of businesses NPUs — networks, platforms, and utilities.

NPU enterprises form the foundations for modern commerce, communications and civic connections — just like contemporary tech platforms. They also tend to end up wielding monopoly or oligopoly power, again like our modern tech giants. As a result, throughout history, courts, state legislatures and Congress imposed a distinctive set of regulations on these businesses. Specifically, to ensure access to these important services, NPUs were usually required to “accept all comers” in a nondiscriminatory fashion, including even their competitors.

Old cases provide instructive examples. In 1881, only five years after Alexander Graham Bell invented the telephone, a carriage company in Louisville, Kentucky, wanted telephone service so customers could contact it. But the telephone company refused to serve it. Why? Because the telephone company owned a competing carriage company. The carriage company sued, and a Kentucky court held that the telephone company was “bound to serve the general public … on reasonable terms, with impartiality.” Over time, policymakers went further and banned many NPUs from owning other businesses. This ensured they could not leverage power from one sector into other areas of commerce.

But that duty to serve all users was never absolute. NPUs were also allowed to exclude some customers, so long as the exclusion was reasonable. For example, in 1839, the New Hampshire Supreme Court declared that innkeepers and common carriers “are not bound to receive all comers. The character of the applicant, or his condition at the time, may furnish just grounds for his exclusion.” A passenger trying to “commit an assault” or “injure the business of the proprietors” could be excluded. In another case, a court allowed a telephone company to deny service to a user who repeatedly used a shared phone line to disrupt other people’s calls.


These and other examples of deplatforming across a wide range of networks, platforms, and utilities throughout American history have lessons for thinking about today’s social media platforms. Reasonable exclusions from NPUs generally fit into three categories. First was ensuring service quality. A railroad didn’t need to take a passenger who wouldn’t pay for a ticket or one who wanted to stop the train from running. Second, the American tradition generally allowed exclusion to protect other users or society itself from harms. A steamboat didn’t need to let a known thief get on board. A streetcar had an obligation to exclude a rider if it was reasonably foreseeable he would injure another. Current law prohibits taking weapons on airplanes, and airlines can and do ban passengers who have been violent in the past. Finally and most controversially, the American tradition permitted a limited amount of exclusion related to social norms. In the Jim Crow era, this meant unacceptable and discriminatory exclusions on the basis of race. But exclusions based on social norms sometimes have more reasonable justifications: For example, broadcasters are subject to regulations on airing indecent programming.

What history shows is that deplatforming is an endemic issue for any network, platform, or utility — it is not a challenge unique to social media or even tech platforms. The reason is simple: For any network, platform, or utility enterprise, it is unworkable to serve literally everyone without exception because there will always be some bad actors.

At the same time, it is problematic to give owners of essential, utility-like services the ability to exclude whomever or whatever they want. Should a single person have the power to exclude people from services that are essential to modern commerce and social life? Should a corporation whose legal duties are to its shareholders, rather than the public welfare, have free rein even if padding its profits injures people?

Historically, Americans did not leave such decisions exclusively to self-regulation by those who controlled platforms. In the 19th century, judges imposed a duty to serve on NPU businesses and also allowed for reasonable exceptions. If a platform violated the duty to serve — or if a user was injured or harmed — the user could sue, leading to liability for the platforms. This liability, of course, created an incentive for the platforms to exclude those who were likely to injure others.

Liability and deplatforming, in other words, can work together to balance the duty to serve and the protection of users from harm.


This line of reasoning suggests that it would make sense to remove social media platforms’ Section 230 liability shield and allow individuals to sue them for all manner of harms and injuries. Lawsuits after the fact, if history is a guide, would likely get the platforms to change their behavior in advance. Eventually, a case-by-case process would lead to a set of stable rules on what kinds of behavior are permitted.

At the same time, lawsuits have some serious downsides. Unlike telephones or train cars, where deplatforming happened on a small scale, tech platforms have millions of users and millions more posts. The 19th-century approach also places courts and judges at the center of the action, which could result in a patchwork of potentially idiosyncratic rules across the country.


Another alternative — one better suited to scale — would be for our elected representatives in Congress to establish rules that govern tech platforms as utilities, including rules on deplatforming and content moderation. These might involve procedural rights and responsibilities and could potentially include some substantive rules governing extreme cases, like inciting violence and indecency.

This approach also has challenges. Debating and passing such rules would be difficult. In our polarized environment, people have different views about what should be deplatformed. For social media platforms, the First Amendment will also, appropriately, constrain what is possible. But the First Amendment is not absolute even today, and it is unclear how the Supreme Court will apply it to the cases on social media regulations that are on its docket this term.

Whatever path lies ahead, history shows that reasonable deplatforming is not only common but inevitable. Without it, platforms can become socially destructive, rather than beneficial. Accepting that reality is the first step in figuring out how to balance the desire for platforms to serve the public — and the need for platforms to protect the public.

