We have been informed of another potential CSAM attack on the federated instance lemmy.ml.
After the events of last time, I have preemptively and temporarily defederated us from lemmy.ml until the situation can be assessed with more clarity.
I have already deleted the suspicious posts (without looking at them myself, all from the database’s command line) and banned the author. To the best of our knowledge, at no point was any CSAM content saved on our server.
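For the curious, the cleanup was roughly of this shape. This is a minimal sketch, not the exact statements I ran, and it assumes Lemmy’s Postgres schema (table and column names like post.removed and person.banned come from recent Lemmy versions and may differ on yours):

```python
# Sketch of a blind cleanup against Lemmy's Postgres schema: remove
# every post by the suspect account without rendering or viewing any
# of them. Table/column names are assumptions from recent Lemmy
# versions and may differ between releases.
import psycopg2

SUSPECT = "offending_username"  # hypothetical placeholder

conn = psycopg2.connect(dbname="lemmy", user="lemmy")
with conn, conn.cursor() as cur:
    # Mark every post by the suspect account as removed, sight unseen.
    cur.execute(
        """
        UPDATE post SET removed = true
        WHERE creator_id = (SELECT id FROM person WHERE name = %s)
        """,
        (SUSPECT,),
    )
    # Ban the account itself.
    cur.execute(
        "UPDATE person SET banned = true WHERE name = %s",
        (SUSPECT,),
    )
conn.close()
```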
EDIT: 2023-09-03 8:40 UTC
There have been no further reports of similar problems arising from lemmy.ml or other instances, so I am re-enabling federation. Thank you for your patience.
Thank you for being on it.
Yeah, it was hidden behind a tiktok video. Talk about bait 'n switch, huh
I’ve submitted another report via NCMEC’s CyberTipline. Next steps are to apply for an account with NCMEC to get access to tools like Cloudflare’s CSAM scanning tool.
If Cloudflare gets introduced to Lemmy, I’ll migrate to something else. Fuck crimeflare, that MITM’ing piece of garbage.
I hate centralization.
What’s wrong with Cloudflare? This one in particular would just be their caching service; our host remains our own, and the primary reason for tunneling through it would be leveraging their anti-CSAM tool. As it stands, that’s our only reasonable (and free) option to detect that content and automatically block it. Don’t wanna end up behind bars just because you unknowingly had it on your server.
The privacy crowd doesn’t like Cloudflare brokering their SSL connection. If you’re going to use CF you might as well use their WAF to stop XSS attacks. The pictures portion of Lemmy was vulnerable to that recently.
Yeah, I remember that. We lost our first instance to that XSS attack (the one we’re writing on now is the second).
And I get why some people might not like Cloudflare, but to my knowledge that’s quite literally the only tool at our disposal. These constant attacks can be stressful for admins; it’s illegal stuff, after all. Even if we are doing everything right and reporting it to the authorities, as soon as I got notice of this I had to drop everything I was doing, jump on SSH, and start fixing stuff. That isn’t really sustainable in the long run.
I understand. You could roll your own HAProxy, but it would be more expensive and wouldn’t be able to provide the inappropriate-content inspection CF provides.
If someone is really concerned about privacy they shouldn’t be using Lemmy to begin with.
I don’t really care about caching or load balancing, the only reason I’m considering Cloudflare is that CSAM filter.
If someone is really concerned about privacy they shouldn’t be using Lemmy to begin with.
That’s correct, actually. On one hand, the devs seem so focused on user privacy that they often prioritize it over improving the safety of the software (for instance, the Lemmy server keeps next to no logs, apparently for that reason). On the other hand, it’s crazy how much data is transferred over federation. For instance, I have already developed a script that lets me view EVERY post or comment someone has upvoted. The data is all there; it wouldn’t take much for someone to harvest it en masse and start profiling users.
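Not my actual script, but a minimal sketch of the idea, assuming Lemmy’s Postgres schema (votes landing in a post_like table with a score column; names may vary by version):

```python
# Sketch: list every post a given user has upvoted, straight from a
# Lemmy instance's own database. The post_like table and its score
# column are assumptions based on recent Lemmy schemas.
import psycopg2

USERNAME = "some_user"  # hypothetical target account

conn = psycopg2.connect(dbname="lemmy", user="lemmy")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT p.name
        FROM post_like pl
        JOIN post p ON p.id = pl.post_id
        JOIN person u ON u.id = pl.person_id
        WHERE u.name = %s AND pl.score = 1
        """,
        (USERNAME,),
    )
    for (title,) in cur.fetchall():
        print(title)
conn.close()
```

The same query against a comment_like table would give the comment side, and any federated instance that receives those vote activities could build the same picture.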
Us. We hoover it up en masse and profile users.
I’m with you. I don’t like Cloudflare either. Not only for privacy reasons, but I’ve just had a number of generally bad customer interactions with them on other projects. Unfortunately, it seems to be the only solution for this issue. We’ve been lucky so far with these incidents. If somebody ever uploaded that stuff here it’d be an incredible pain in the ass.
If only there were a way to have MS, AWS, or Google’s vision AI scan all the images and automatically remove them when they’re determined to be inappropriate.
Those tools are targeted towards large customers and you need a special relationship to get access.
They have a free tier that does 5000 images a month.
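For what it’s worth, a minimal sketch of that idea using AWS Rekognition’s image moderation API (assuming boto3 with AWS credentials configured; note this is ML classification of general explicit content, not a known-CSAM hash match like Cloudflare’s tool or PhotoDNA):

```python
# Sketch: flag an uploaded image via AWS Rekognition's moderation API.
# Assumes boto3 is installed and AWS credentials are configured; the
# confidence threshold and file name are illustrative only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def is_flagged(image_path: str, min_confidence: float = 80.0) -> bool:
    """Return True if Rekognition assigns any moderation label."""
    with open(image_path, "rb") as f:
        resp = rekognition.detect_moderation_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    for label in resp["ModerationLabels"]:
        print(f"{label['Name']} ({label['Confidence']:.1f}%)")
    return bool(resp["ModerationLabels"])

if is_flagged("upload.jpg"):  # hypothetical upload path
    print("Blocked pending manual review.")
```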
As Atalocke said, there’s a Cloudflare tool that automatically scans every image on your site and blocks anything it flags. That would solve most of our issues. Access to it requires approval from the US agency that handles this type of problem (NCMEC), though, so it might take a while.