• SorteKanin@feddit.dkOP

    Such is her disillusionment with the company and its apparent lack of desire to change that the Danish psychologist has resigned from the group, claiming Meta does not care about its users’ wellbeing and safety. In reality, she said, the company is using harmful content to keep vulnerable young people hooked to their screens in the interest of company profit.

    This really should not be surprising to anyone at this point. We need politicians to realise how bad this is and regulate it into the ground.

  • AutoTL;DR@lemmings.worldB

    This is the best summary I could come up with:


    A leading psychologist who advises Meta on suicide prevention and self-harm has quit her role, accusing the tech giant of “turning a blind eye” to harmful content on Instagram, repeatedly ignoring expert advice and prioritising profit over lives.

    Lotte Rubæk, who has been on Meta’s global expert group for more than three years, told the Observer that the tech giant’s ongoing failure to remove images of self-harm from its platforms is “triggering” vulnerable young women and girls to further harm themselves and contributing to rising suicide figures.

    Such is her disillusionment with the company and its apparent lack of desire to change that the Danish psychologist has resigned from the group, claiming Meta does not care about its users’ wellbeing and safety.

    We’ve consulted with safety experts, including those in our suicide and self-harm advisory group, for many years and their feedback has helped us continue to make significant progress in this space.

    Rubæk, who leads the self-injury team in child and adolescent psychiatry in the Capital Region of Denmark, was first approached about joining the select group of experts – which has 24 publicly listed members – in December 2020.

    The invite came after she publicly criticised Meta, then known as Facebook, over an Instagram network linked to suicides of young women in Norway and Denmark following a documentary by Danish broadcaster DR.


    The original article contains 771 words, the summary contains 223 words. Saved 71%. I’m a bot and I’m open source!

    • catloaf@lemm.ee

      Lemmy is an implementation of the ActivityPub protocol, so there’s no expectation that it would do anything here. Moderation is up to each instance’s admins and each community’s moderators.
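
      For context on how moderation can still travel between instances: the ActivityStreams vocabulary that ActivityPub builds on defines a `Flag` activity for reporting content, which federated software like Lemmy can deliver to the instance hosting the post. A minimal sketch of such an activity as a Python dict — all URLs and IDs below are hypothetical examples, not real endpoints:

      ```python
      import json

      # A minimal ActivityPub "Flag" activity, roughly what one instance might
      # send to another to report a post. Every URL/ID here is made up for
      # illustration only.
      flag_activity = {
          "@context": "https://www.w3.org/ns/activitystreams",
          "type": "Flag",                                  # report/flag activity type
          "id": "https://feddit.dk/activities/flag/123",   # hypothetical activity ID
          "actor": "https://feddit.dk/u/reporter",         # who is filing the report
          "object": "https://lemm.ee/post/456",            # the post being reported
          "summary": "Self-harm content",                  # human-readable reason
      }

      print(json.dumps(flag_activity, indent=2))
      ```

      The protocol only carries the report; what happens next is still up to the admins and moderators of the instance that receives it.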

    • SorteKanin@feddit.dkOP

      As the other commenter notes, the responsibility lies with each instance. If posts like this get posted, the instance should probably recruit more moderators to ensure they can be removed.

      Unlike Meta, a Lemmy instance definitely has a clear motivation to remove this stuff.