• 0 Posts
  • 6 Comments
Cake day: Jun 01, 2023


I can think of some things I could implement on the Lemmy server side that could help with this. I’m fairly sure the IWF maintains a list of file hashes for known CSAM, and there are probably a few other hash sources you could draw from too.

So the process would be something like the following (rough code sketch after the list):

  • create a local DB of the CSAM hash list and update it periodically (say, once a day)
  • compare each upload’s hash against that list of known harmful material (I’d be very surprised if hashes aren’t already computed for uploads)
  • if a match is found, reject the upload and automatically permaban the user, then, if feasible, automatically report as much information as possible about the user to law enforcement
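
A minimal sketch of the known-hash check, assuming a plain SHA-256 file hash via the sha2 crate; `HashDb` and `UploadDecision` are made-up names, and note that lists like the IWF’s also key on perceptual hashes (e.g. PhotoDNA), which a byte-level hash won’t catch once a file is re-encoded:

```rust
use std::collections::HashSet;
use sha2::{Digest, Sha256};

/// Hypothetical local store of known-bad hashes,
/// refreshed (say) once a day from sources like the IWF list.
struct HashDb {
    known_csam: HashSet<String>,
}

enum UploadDecision {
    Accept,
    /// Reject, permaban the uploader, queue a law-enforcement report.
    RejectAndBan,
}

fn check_upload(db: &HashDb, file_bytes: &[u8]) -> UploadDecision {
    // Hash the upload exactly as it arrived.
    let digest = Sha256::digest(file_bytes);
    let hex: String = digest.iter().map(|b| format!("{:02x}", b)).collect();

    if db.known_csam.contains(&hex) {
        // Known CSAM: reject before any mod or user ever sees it.
        UploadDecision::RejectAndBan
    } else {
        UploadDecision::Accept
    }
}
```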

So for known CSAM, you don’t have to subject mods or users to it before it gets pulled.

For new or edited media with unrecognised hashes that does contain CSAM, a mod/admin would have to review and flag it, at which point the same permaban and law-enforcement report could be triggered automatically.
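
Roughly, that flag path under the same assumptions; `ban_user` and `report_to_law_enforcement` here are stand-ins for whatever hooks the server actually exposes:

```rust
use std::collections::HashSet;

/// Same hypothetical store as in the sketch above.
struct HashDb {
    known_csam: HashSet<String>,
}

// Stand-ins for the server's real ban/report machinery.
fn ban_user(user_id: i64) {
    println!("permabanned user {user_id}");
}
fn report_to_law_enforcement(user_id: i64, sha256_hex: &str) {
    println!("queued LE report: user {user_id}, hash {sha256_hex}");
}

/// A mod/admin has reviewed unrecognised media and confirmed it is CSAM.
fn handle_mod_flag(db: &mut HashDb, uploader_id: i64, sha256_hex: String) {
    // Remember the hash locally so any re-upload is caught without review.
    db.known_csam.insert(sha256_hex.clone());
    ban_user(uploader_id);
    report_to_law_enforcement(uploader_id, &sha256_hex);
}
```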

The federation aspect could be trickier, though, which is why this would probably be better as an embedded Lemmy feature rather than a third-party add-on.

I’m guessing it would be possible to create an automoderator that does all this at the community level and only approves a post to go live once it has passed the checks.
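
Shape-wise I’d imagine something like this, where a post only flips to live after every check passes (all names invented):

```rust
/// Hypothetical post states for a community-level automoderator.
enum PostState {
    PendingChecks,
    Live,
    Rejected,
}

struct Post {
    id: i64,
    state: PostState,
}

/// Each check returns true if the post is clean.
type Check = fn(&Post) -> bool;

fn run_automod(post: &mut Post, checks: &[Check]) {
    let clean = checks.iter().all(|check| check(&*post));
    post.state = if clean { PostState::Live } else { PostState::Rejected };
}

fn main() {
    let mut post = Post { id: 1, state: PostState::PendingChecks };
    // One stand-in check: pretend the hash lookup came back clean.
    let checks: [Check; 1] = [|_post| true];
    run_automod(&mut post, &checks);
    assert!(matches!(post.state, PostState::Live));
}
```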


These could be a little more difficult; they seem to be instance-level features.

I might be able to build a tool for the first one using filters, if there’s a way to insert keywords into a report, e.g. “To Mods” or “To Admins”.
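
Something like this for the keyword routing, if reports expose their text to a filter; the tag strings and the `ReportAudience` split are just my guess at a shape:

```rust
/// Who a report should be surfaced to, based on tags in its text.
enum ReportAudience {
    Mods,
    Admins,
    Both,
}

fn route_report(report_text: &str) -> ReportAudience {
    let text = report_text.to_lowercase();
    match (text.contains("to mods"), text.contains("to admins")) {
        (true, false) => ReportAudience::Mods,
        (false, true) => ReportAudience::Admins,
        // No tag (or both tags): show it to everyone rather than lose it.
        _ => ReportAudience::Both,
    }
}

fn main() {
    // "To Admins" in the report body routes it past the mods.
    let audience = route_report("To Admins: the mod team here is the problem");
    assert!(matches!(audience, ReportAudience::Admins));
}
```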



I’ve said it before and I’ll say it again: give me a spec and I’ll (try to) write you a tool.

I’m a competent coder, but I have no idea what kind of mod tools are needed.


It was a bit of an adjustment when I went to the US and was told off for doing it myself.

We haven’t had full-service petrol stations in the UK for decades.