Section 230
www.eff.org
47 U.S.C. § 230

The Internet allows people everywhere to connect, share ideas, and advocate for change without needing immense resources or technical expertise. Our unprecedented ability to communicate online—on blogs, social media platforms, and educational and cultural platforms like Wikipedia and the Internet Archive—is not an accident. Congress recognized that for user speech to thrive on the Internet, it had to protect the services that power users’ speech.

That’s why the U.S. Congress passed a law, Section 230 (originally part of the Communications Decency Act), that protects Americans’ freedom of expression online by protecting the intermediaries we all rely on. It states:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (47 U.S.C. § 230(c)(1))

Section 230 embodies the principle that we should all be responsible for our own actions and statements online, but generally not those of others. The law prevents most civil suits against users or services that are based on what others say.

Congress passed this bipartisan legislation because it recognized that promoting more user speech online outweighed potential harms. When harmful speech takes place, it’s the speaker who should be held responsible, not the service that hosts the speech.

Section 230’s protections are not absolute. It does not protect companies that violate federal criminal law. It does not protect companies that create illegal or harmful content. Nor does Section 230 protect companies from intellectual property claims.

Section 230 Protects Us All

For more than 25 years, Section 230 has protected us all: small blogs and websites, big platforms, and individual users.

The free and open internet as we know it couldn’t exist without Section 230. Important court rulings on Section 230 have held that users and services cannot be sued for forwarding email, hosting online reviews, or sharing photos or videos that others find objectionable. It also helps to quickly resolve lawsuits that have no legal basis.

Congress knew that the sheer volume of the growing Internet would make it impossible for services to review every user’s speech. When Section 230 was passed in 1996, about 40 million people used the Internet worldwide. By 2019, more than 4 billion people were online, with 3.5 billion of them using social media platforms. In 1996, there were fewer than 300,000 websites; by 2017, there were more than 1.7 billion.

Without Section 230’s protections, many online intermediaries would intensively filter and censor user speech, while others might simply not host user content at all. This legal and policy framework allows countless niche websites, as well as big platforms like Amazon and Yelp, to host user reviews. It allows users to share photos and videos on big platforms like Facebook and on the smallest blogs. It allows users to share speech and opinions everywhere, from vast conversational forums like Twitter and Discord to the comment sections of the smallest newspapers and blogs.

Content Moderation For All Tastes

Congress wanted to encourage internet users and services to create and find communities. Section 230’s text explains how Congress wanted to protect the internet’s unique ability to provide “true diversity of political discourse” and “opportunities for cultural development, and… intellectual activity.”

Diverse communities have flourished online, providing us with “political, educational, cultural, and entertainment services.” Users, meanwhile, have new ways to control the content they see.

Section 230 allows web operators, large and small, to moderate user speech and content as they see fit. This reinforces the First Amendment’s protections for publishers to decide what content they will distribute. Different approaches to moderating users’ speech allow users to find the places online that they like and avoid the places they don’t.

Without Section 230, the Internet is different. In Canada and Australia, courts have allowed operators of online discussion groups to be punished for things their users have said. That has reduced the amount of user speech online, particularly on controversial subjects. In non-democratic countries, governments can directly censor the internet, controlling the speech of platforms and users.

If the law made us liable for the speech of others, the biggest platforms would likely become locked-down and heavily censored. The next great websites and apps would never get started, because they’d face overwhelming legal risk to host users’ speech.

Learn More About Section 230
• Most Important Section 230 Legal Cases
• Section 230 is Good, Actually
• How Congress Censored the Internet With SESTA/FOSTA
• Here's an infographic we made in 2012 about the importance of Section 230.

cross-posted from: https://literature.cafe/post/1133610

I am seeing a lot of fearmongering and misinformation regarding recent events (CSAM being posted in now-closed large lemmy.world communities). I say this as someone who, along with other admins, helped bring attention to this as I noticed the content federating out.

Yes, this is an issue, and what has happened with regard to CSAM is deeply troubling, but there are solutions and ideas being discussed and worked on as we speak. This is not just a Lemmy issue but an internet-wide issue that affects all forms of social media. There is no clear-cut solution, but most jurisdictions have some form of safe harbor policy for server operators acting in good faith.

A good analogy to think of here is someone dropping something illegal into your yard that is open to the public. If someone stumbled upon the items, you aren’t going to be hunted down for them unless there is evidence showing you knew about the items and left them there without reporting them, or that you sold or traded them. If someone comes up to you and says, “hey, there’s this illegal thing on your property,” and you report it, hand it over to the relevant authorities, and check any security cameras you have and share the footage with the authorities, then you’d be fine.

A similar principle exists online, specifically on platforms such as this one. Obviously the FBI will raid whomever they want and will find reasons to if they need to, but I can tell you with near certainty that they aren’t especially concerned with a bunch of nerds hosting a (currently) niche piece of software created by two communists as a pet project, one that gained popularity over the summer because an internet business decided to shoot itself in the foot. They are specifically out to find people who are selling, trading, and making CSAM. Those who knowingly and intentionally distribute and host such content are the ones they are out for blood for.

I get it. This is anxiety-inducing, especially as an admin, but so long as you are preserving and reporting any content that is brought to your attention in a timely manner, and are following development and active mitigation efforts, you should be fine. If you want to know more detail, click the link above.

I am not a lawyer, and of course things vary from country to country, so it’s a good idea to check reputable sources on this matter as well.

As well, this is a topic that is distressing for most normal, well-adjusted people for pretty obvious reasons. I get the anxiety over this, I really do. It’s been a rough few days for many of us. But playing into other people’s anxiety over this is not helping anyone. What is helping is following and contributing to the discussion of potential fixes and mitigation efforts, and taking the time to calmly understand what you as an operator are responsible for within your jurisdiction.

Also, if you witnessed the content being discussed here, no one will fault you for taking a step away from Lemmy. Don’t sacrifice your mental health over a volunteer project; it’s seriously not worth it. Even more so if this has made you question self-hosting Lemmy or any other platform like it: that is valid as well, and it should be made clearer that this is a risk you take on when running any kind of website connected to the open internet.

I would also suggest reading User Generated Content and the Fediverse: A Legal Primer – https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer

Make sure to follow its advice - the safe harbor isn’t automatic, and you will need to take affirmative steps.

The safe harbor doesn’t apply automatically. First, the safe harbor is subject to two disqualifiers: (1) actual or “red flag” knowledge of specific infringement; and (2) profiting from infringing activity if you have the right and ability to control it. The standards for these categories are contested; if you are concerned about them, you may wish to consult a lawyer.

Second, a provider must take some affirmative steps to qualify:

Designate a DMCA agent with the Copyright Office.

This may be the best $6 you ever spend. A DMCA agent serves as an official contact for receiving copyright complaints, following the process discussed below. Note that your registration must be renewed every three years and if you fail to register an agent you may lose the safe harbor protections. You must also make the agent’s contact information available on your website, such as a link to a publicly-viewable page that describes your instance and policies.

Have a clear DMCA policy, including a repeat infringer policy, and follow it.

To qualify for the safe harbors, all service providers must “adopt and reasonably implement, and inform subscribers and account holders of . . . a policy that provides for the termination in appropriate circumstances of . . . repeat infringers.” There’s no standard definition for “repeat infringer” but some services have adopted a “three strikes” policy, meaning they will terminate an account after three unchallenged claims of infringement. Given that copyright is often abused to take down lawful speech, you may want to consider a more flexible approach that gives users ample opportunity to appeal prior to termination. Courts that have examined what constitutes “reasonable implementation” of a termination process have stressed that service providers need not shoulder the burden of policing infringement.
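To make the “three strikes” idea concrete, here is a minimal sketch of how an instance might track unchallenged infringement claims. Everything in it (the class name, the threshold, the in-memory store) is a hypothetical illustration, not something the primer prescribes:

```python
from collections import defaultdict

# Hypothetical sketch of a repeat-infringer policy tracker. The "three
# strikes" threshold mirrors the example in the primer excerpt above; a
# real instance might choose a more flexible, appeal-friendly policy.
STRIKE_LIMIT = 3

class RepeatInfringerPolicy:
    def __init__(self, strike_limit: int = STRIKE_LIMIT):
        self.strike_limit = strike_limit
        self.strikes: dict[str, int] = defaultdict(int)

    def record_unchallenged_claim(self, user_id: str) -> bool:
        """Record one unchallenged infringement claim.

        Returns True if the account has reached the termination
        threshold. Counting only *unchallenged* claims leaves room for
        the appeal process the primer recommends.
        """
        self.strikes[user_id] += 1
        return self.strikes[user_id] >= self.strike_limit

    def clear_strike(self, user_id: str) -> None:
        """Remove a strike after a successful counter-notice or appeal."""
        if self.strikes[user_id] > 0:
            self.strikes[user_id] -= 1
```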

And further down:

Service providers are required to report any CSAM on their servers to the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC), a private, nonprofit organization established by the U.S. Congress, and can be criminally prosecuted for knowingly facilitating its distribution. NCMEC shares those reports with law enforcement. However, you are not required to affirmatively monitor your instance for CSAM.
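For illustration only, here is roughly what an automated reporting step could look like. The endpoint, key, and payload fields below are hypothetical placeholders; NCMEC’s actual CyberTipline reporting service requires registering as a provider and has its own API, so treat this as a sketch of the shape, not a recipe:

```python
import requests  # third-party: pip install requests

# All identifiers below are hypothetical placeholders. Consult NCMEC's
# own documentation for the real endpoints and payload formats.
CYBERTIPLINE_URL = "https://example.invalid/cybertipline/report"  # placeholder
API_KEY = "YOUR-PROVIDER-KEY"  # issued to registered providers (placeholder)

def report_to_cybertipline(reported_url: str, reporter_notes: str) -> str:
    """Submit a report and return a report ID (sketch only).

    Per 18 U.S.C. § 2258A, a completed submission is treated as a
    request to preserve the reported contents for 90 days.
    """
    resp = requests.post(
        CYBERTIPLINE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"url": reported_url, "notes": reporter_notes},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["reportId"]  # placeholder response field
```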

Ada

but so long as you are preserving and reporting any content that is brought to your attention

Say what? Preserving it?

gabe [he/him] (creator)

Some countries (mainly the US; I don’t know about elsewhere) require preservation of the files for at least 90 days in some sort of secured, access-restricted storage (usually a tiny VPS) that is essentially treated like toxic waste.

https://uscode.house.gov/view.xhtml?req=granuleid:USC-prelim-title18-section2258A&num=0&edition=prelim

(h) Preservation.-
(1) In general.-For the purposes of this section, a completed submission by a provider of a report to the CyberTipline under subsection (a)(1) shall be treated as a request to preserve the contents provided in the report for 90 days after the submission to the CyberTipline.
(2) Preservation of commingled content.-Pursuant to paragraph (1), a provider shall preserve any visual depictions, data, or other digital files that are reasonably accessible and may provide context or additional information about the reported material or person.
(3) Protection of preserved materials.-A provider preserving materials under this section shall maintain the materials in a secure location and take appropriate steps to limit access by agents or employees of the service to the materials to that access necessary to comply with the requirements of this subsection.
(4) Authorities and duties not affected.-Nothing in this section shall be construed as replacing, amending, or otherwise interfering with the authorities and duties under section 2703.
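As a rough sketch of the mechanics (not legal advice), a preservation step might move reported material into a location only the service account can read and record the 90-day deadline from § 2258A(h). The paths and file layout here are hypothetical; an isolated host or encrypted volume would be more appropriate in practice:

```python
import json
import os
import shutil
from datetime import datetime, timedelta, timezone

# Hypothetical quarantine location; restrict access as tightly as possible.
QUARANTINE_DIR = "/var/lib/instance/preservation"

def preserve_reported_material(filepath: str, report_id: str) -> None:
    """Move reported material into restricted storage for 90 days.

    Sketches § 2258A(h): preserve for 90 days after submission, in a
    secure location, with access limited to what compliance requires.
    """
    os.makedirs(QUARANTINE_DIR, mode=0o700, exist_ok=True)
    dest = os.path.join(QUARANTINE_DIR, report_id)
    os.makedirs(dest, mode=0o700, exist_ok=True)
    shutil.move(filepath, dest)

    # Record when the material may be destroyed (90 days after submission).
    deadline = datetime.now(timezone.utc) + timedelta(days=90)
    meta_path = os.path.join(dest, "retention.json")
    with open(meta_path, "w") as f:
        json.dump({"report_id": report_id,
                   "destroy_after": deadline.isoformat()}, f)
    os.chmod(meta_path, 0o600)
```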
