Yote.zip

Hey this is your friendly reminder to spread out in the Fediverse. Stop making communities on the big servers. Now all those users just lost a big chunk of content and they’re likely to leave Lemmy and spread the word about how the Fediverse will never work because of trigger-happy admins.

On the other hand, you still want an instance that will last long enough.

We lost lemmy.film a few weeks back, that was a loss for everyone invested in the main community (about movies and films).

But I agree with you that we should spread out communities across servers. I like lemm.ee and sh.itjust.works, the admins seem pretty chill and really use defederation as a last resort.

ram

Personally I prefer sticking with smaller instances of maybe a few hundred or a thousand users. The more evenly spread out we are across instances, the more democratized the federation is.

I have multiple accounts across multiple instances just to make it easier to shit post

Username checks out; the message content, however, is highly informative…

The problem with that is that you aren’t going to comment the same thing across different instances of what’s basically the same community, you generally want to engage with the most users. I recently got banned under false accusations, and it’s pretty easy to see when it happened because the amount of upvotes and engagement dropped drastically between comments to the same instance - only the people from my instance were seeing it now.

The next step for lemmy might be the concept of mirrored communities where comments are automatically propagated across instances if they belong to the same owners and they have it enabled. Admins would control access/visibility for users of their instance, and the community owners would control the access/visibility of all instances they’ve reserved the community under. Admins could just decide to remove all the moderators/de-federate the community in the instance they control to sever the mirroring and create their own, but it might still help the smaller instances to get going.
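The mirroring idea above could be modeled very simply. The following is a hypothetical sketch only: none of these class or method names come from Lemmy's actual codebase or API, and it just illustrates the two rules proposed here, namely that comments propagate to every instance with mirroring enabled, and that each instance's admins can hide the community locally without affecting the other mirrors.

```python
from dataclasses import dataclass, field

# Hypothetical model of a "mirrored community" (not real Lemmy code).
@dataclass
class MirroredCommunity:
    owner: str                                    # owner who reserved the community name
    mirrors: dict = field(default_factory=dict)   # instance -> mirroring enabled?
    hidden_on: set = field(default_factory=set)   # instances whose admins hid it
    comments: dict = field(default_factory=dict)  # instance -> list of comments

    def post_comment(self, origin: str, text: str) -> None:
        """Store a comment on its origin instance and propagate it to every
        mirror that has mirroring enabled, as the proposal describes."""
        for instance, enabled in self.mirrors.items():
            if enabled or instance == origin:
                self.comments.setdefault(instance, []).append(text)

    def visible_comments(self, instance: str) -> list:
        """Admins control visibility for their own instance: hiding the
        community severs the mirror locally without touching other mirrors."""
        if instance in self.hidden_on:
            return []
        return self.comments.get(instance, [])

community = MirroredCommunity(owner="alice")
community.mirrors = {"small.example": True, "big.example": True, "opted-out.example": False}
community.post_comment("small.example", "hello from a small instance")
```

Under this sketch a comment posted on a small instance shows up on every opted-in mirror, which is the part that might help smaller instances get going.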

I’m not sure how admins, specially the ones who are ok with lying to their users, would be ok with it, and it’s meaningless if they just wield their charisma and taint those communities as well. So far, they are pretty blatant, yet admins either aren’t bothering to check the evidence or they simply don’t want to de-federate, which is just another way of condoning their behavior to avoid risking user engagement. Adding mirrored communities into the mix may just not solve it because the problem is still there: a divided user base who’s getting treated like cattle without them knowing.

@jsdz@lemmy.ml

It was added to the “exclude” list in an apparently unrelated commit three days ago with absolutely no explanation. Glancing at its front page I see nothing objectionable, just a lot of anime stuff. When challenged u/dessalines had nothing to say other than “no, that is full of CSAM” and just closed the discussion without further comment.

Unless some more info comes to light it does not look good. Probably as good a time as any to depart from lemmy.ml.

density

No stake in any of this but curious, what info would you find convincing?

cacheson

Literally any evidence at all beyond “dessalines said so” would be a good start. Hell, even dessalines specifically describing what he saw would be great.

density

So you want a link to CSAM, or an admission from someone that they searched for and viewed CSAM?

cacheson

What are you on about? Dessalines said “No, that is full of CSAM.” I would like to know how they came to that conclusion.

@jsdz@lemmy.ml

I wouldn’t need to be wholly convinced that there’s anything heinous going on over there, just that the person accusing them of it had good reason to think so. So pretty much anything more than no info at all would probably have done the trick. Anyway, thanks for putting up with me for a little while and good luck to everyone at lemmy.ml, but I’m outta here. I’ll probably go try kbin or something.

Kbin is actually pretty great.

@IzzyData@lemmy.ml

Probably as good a time as any to depart from lemmy.ml.

If the devs / admins of lemmy.ml can’t be trusted and the admins of lemmy.world are abusive, then it is safe to say the experiment called Lemmy has failed. There is no recovery when the top 2 instances, which make up most of the “content”, are not worth supporting. I could go to another instance and block lemmy.world and lemmy.ml once the BE 0.19.0 update rolls out, but then the site is just dead. It’s already pretty much like talking to the wind, but the site would be truly empty at that point.

I started noticing the trend of instances defederating into little islands months ago, but it seems obvious at this point that the concept of federation isn’t going to work out well. The easily self-hostable part is still nice even if it eventually ends up as singular instances with maybe 1 or 2 federated connections that actually post things. There will be a lot of instances that have nothing, but I don’t think those really count.

You could go to another instance and remain federated with both of those and ani.social. Whether they defederate with ani.social or not doesn’t stop you from engaging with them.

I understand that this is possible. If it were some bad community moderators I would just avoid those communities. If the entire instance is tainted then I wouldn’t want to engage with it even if the instance is federated.

What is with this awful title? There’s no evidence “found” that there was any CSAM.

Uranium3006

loli content (CSAM)

that term lasted about a week before being watered down just as bad as CP was

AlexisFR

That’s because it is CSAM.

Uranium3006

you’ve proved my point

Might as well disallow all NSFW content if naked anime girls is going to be considered CSAM. Relating these two things is making light of a real problem.

ram

IDK where the lemmy admins are based out of, but many countries consider hentai depicting underage characters to be illegal; my country, Canada, is one of them.

Yote.zip

For the record Lemmy.ml does actually disallow NSFW, and they defederated from Yiffit.net (a general furry instance) because it has NSFW communities. Tread carefully around them or they’ll remove you from the Lemmyverse, is the apparent message.

How would federation work in that case? Are they going to defederate any instance that has NSFW content? By their own definition I’ve found CSAM on lemmy.world and every other instance that has NSFW communities.

Yote.zip

That seems to be their goal, though they are probably targeting specific instances that they notice most often. I think Yiffit tried to convince them to just block NSFW content or just specific communities instead of defederating entirely but apparently that didn’t work - I’m not in the loop on how the conversation went.

It’s interesting that they even built the ability to flag communities and posts as NSFW, and to hide them in user settings, if they didn’t want any NSFW content federated to them.

Veraxus

Actual CSAM, depicting an actual crime against an actual child… or “someone drew dirty cartoons and I’m a moron who thinks dirty drawings are the same as one of the most heinous crimes imaginable - harming a vulnerable child”?

I think drawn porn of kids also shouldn’t exist, but it’s definitely different.

Veraxus

This is a very simple calculation for me. I follow the “golden rule of liberty”, which can also be called the “harm principle.”

That is, “your rights end where mine begin. my rights end where yours begin”. Or, it is unethical to restrict anyone’s freedoms/liberties (especially expression) if they are not inflicting harm on others (i.e. infringing on their rights).

Furthermore, I object to any level of subjective analysis of the “legality” of art. Ergo, the mindset of “I think this looks childish, therefore it is a child, therefore it is CP” is exceptionally unethical and should not be tolerated.

Moreover, all of this only muddies and minimizes the ACTUAL crime of abusing children and diverts resources away from protecting real, actual children, all because of some inane moralizing over someone’s artwork.

You are too hung up on “punishing the immoral” to realize that the ACTUAL need is “protecting the vulnerable” - and those two things are NOT the same.

This has nothing to do with legality or restricting freedoms. It’s about the admins building a forum to the form they’d like to see.

Plus the harm principle is really fuzzy. What level of interaction is the cutoff for harm vs inadvertent impact?

I also think drawn kiddie porn hurts the people who view it inadvertently. I don’t know if there have been studies on a causal link between viewing CP content and sexual preferences towards minors, or a causal relationship between that and abuse/grooming. But that’s another possible harm connection.

Honestly this argument reminds me of the age-old claim that “video games cause violence” because people thought glorifying violence in video games would get you to shoot people. In reality, there is still no link between gaming and violence. Sick people hurt other people, and blaming art for their lack of responsibility is sad.

That’s why I said there’d need to be studies on a causal link for this specifically. I know video games have had those studies done and no link was found. So you’d want a similar study for this. But there’s still the accidentally-stumbling-across-it issue too.

Butt Pirate

deleted by creator

Uranium3006

loli content (CSAM)

in this case, the latter.

people get more offended by cartoons than actual child abuse I swear

Does anyone know if ani does have a lot of csam? Their rule against it seems pretty robust. Was there stuff getting by the rule, or do the .ml admins have a more broad definition of csam?

  1. Do not submit content depicting a child (both real and virtual) engaged or involved in explicit sexual activities. A child is defined as a person who is under 18 years old; or a person, regardless of age, who is presented, depicted or portrayed as under 18 years old.
WadamT

I am also on ani.social but have never seen any CSAM content there. Note that there are safe-to-view anime loli character meme posts there.

Maybe the .ml admins classified sfw loli as csam? I don’t think I’ve seen any on ani, but I did see some sfw loli foot stuff somewhere that was disturbing. Maybe it was something like that?

Metal Zealot

Wikipedia:

In Japanese popular culture, lolicon (ロリコン, also romanized as rorikon or lolicom) is a genre of fictional media in which young (or young-looking) girl characters appear in romantic or sexual contexts.

You are seriously not trying to say that there is such a thing as safe-for-work underaged porn, are you?

I’m saying the stuff I saw was fully clothed.

cacheson

or do the .ml admins have a more broad definition of csam?

Their definition seems to be “I don’t like anime”.

Veraxus

I’m guessing this is yet another tiring instance of some idiot thinking “manga + nudity = CP”

cacheson

OP is lying through their teeth, nothing was found.

A community dedicated to fediverse news and discussion.
