Child Sexual Abuse Material (CSAM)
There are multiple reasons for this: not (yet) knowing how to swim, heat stroke, heart attacks, exhaustion, other medical issues, and so forth.
I don’t know; it doesn’t look like it. But it sounds like a good idea, and I’m sure you could send a request to the devs.
I already posted some info on the derailment here: feddit post
More information from srf.ch
Thanks for letting me know and for taking the time to do the write-up.
Most of the best practices are based on US rules, which don’t really apply to Switzerland, but I see that there might be some useful information in there. Thanks.
I agree. The problematic thing here is: they have it, so they’re going to use it, even if it is not related to their case.
Which suggestions are you speaking of?
Thank you, I appreciate it. I try to keep the maintenance effort as low as possible while keeping everything up, running, and stable.
You’re very welcome. With every update, many issues get fixed and performance gradually improves, which should result in a smoother experience. Which is nice.
Great summary of this interesting topic, thank you very much.
It affects the commands in the Dockerfile for the container creation, because that build process is built on top of OverlayFS: each instruction adds a new filesystem layer.
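To illustrate the idea, here is a toy Python model of OverlayFS-style layering (purely illustrative, not how Docker is actually implemented; all names are made up). Each Dockerfile instruction roughly corresponds to a new read-only layer, a file lookup resolves layers top-down, and a deletion in an upper layer "whites out" the file below it:

```python
WHITEOUT = object()  # marker: file deleted in an upper layer

class LayeredImage:
    def __init__(self):
        self.layers = []  # index 0 = lowest (base) layer

    def add_layer(self, changes):
        # One Dockerfile instruction == one new read-only layer of changes.
        self.layers.append(dict(changes))

    def read(self, path):
        # Resolve a path the way the merged overlay view would: top layer wins.
        for layer in reversed(self.layers):
            if path in layer:
                if layer[path] is WHITEOUT:
                    raise FileNotFoundError(path)
                return layer[path]
        raise FileNotFoundError(path)

img = LayeredImage()
img.add_layer({"/etc/os-release": "debian"})      # FROM debian
img.add_layer({"/app/run.sh": "#!/bin/sh ..."})   # COPY run.sh /app/
img.add_layer({"/app/run.sh": WHITEOUT})          # RUN rm /app/run.sh

print(img.read("/etc/os-release"))   # -> "debian", still visible from the base layer
# img.read("/app/run.sh") would now raise FileNotFoundError
```

This is also why a file deleted in a later RUN step still contributes to the image size: the bytes remain in the lower layer and are merely hidden by the whiteout.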
Good article. But the police only seized their Mastodon server in a “take it all” approach because of another, unrelated issue.
Last May, Mastodon server Kolektiva.social was compromised when one of the server’s admins had their home raided by the FBI for unrelated charges. All of their electronics, including a backup of the instance database, were seized.
Good point. If the main function were implemented on all nodes and everyone exchanged information (hashes) about which elements are harmful, we might have a chance to take central nodes out of the equation. The information could be distributed across multiple, maybe random, nodes, but this would pose new problems, such as where to find that information.
I believe that if users were empowered to participate in combating the problem, almost all of them would. The question is: how can the nodes harness this? What can ordinary users do to help?
I agree. Maybe the nodes could create “unsafe” hashes and share them with each other, kind of like how an antivirus creates hashes for known malware. Users can always report content to the admins, and with more instances there are automatically more admins to handle those reports themselves or with bots. A rough sketch of the idea follows below.
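A minimal Python sketch of what that could look like (hypothetical, not an existing Fediverse API; class and method names are invented): each instance keeps a local set of hashes of confirmed-harmful media, merges hash lists from peer instances, and checks incoming uploads against the combined set:

```python
import hashlib

class HashBlocklist:
    """Per-instance blocklist of hashes of confirmed-harmful media."""

    def __init__(self):
        self.known_bad = set()

    def report(self, media_bytes):
        # An admin confirms a user report; the hash joins the local blocklist.
        self.known_bad.add(hashlib.sha256(media_bytes).hexdigest())

    def merge_from_peer(self, peer_hashes):
        # Merge a hash list received from a federated peer instance.
        self.known_bad.update(peer_hashes)

    def is_blocked(self, media_bytes):
        # Check an incoming upload before it is stored or federated onward.
        return hashlib.sha256(media_bytes).hexdigest() in self.known_bad

instance_a = HashBlocklist()
instance_b = HashBlocklist()

instance_a.report(b"<harmful media bytes>")
instance_b.merge_from_peer(instance_a.known_bad)  # only hashes travel, never content
assert instance_b.is_blocked(b"<harmful media bytes>")
```

One caveat: real-world systems in this space (e.g. PhotoDNA or PDQ) use perceptual hashes rather than cryptographic ones, because a simple re-encode or crop would change a SHA-256 digest and defeat the match.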
No, you can register for it at the end of the year, like you would for a new paper vignette.
News is almost always about sensationalism: they want people to spend time on their platform and see some ads.
I find the conclusion interesting. Quote:
Investment in one or more centralized clearinghouses for performing content scanning (as well as investment in moderation tooling) would be beneficial to the Fediverse as a whole.
This somewhat contradicts the idea of a federated network, but something like this would also bring major benefits.
Another info post from swissinfo.ch
https://www.swissinfo.ch/eng/tornado-like-superstorm-batters-swiss-town/48686428
Nice one, good find.
For your info: I removed your deleted comments.
Seems like it, yes. He also presented a short preview of what it could look like: https://twitter.com/elonmusk/status/1682978324375543808
Bye Bye Tweeting
It seems like it, yes. In this particular case, it also seems like they wanted to bring down the community, which they achieved. The whole discussion raises the issue of such content being federated to other instances, which is really problematic.