Genetic testing company 23andMe said Monday that hackers were able to access the data of about 6.9 million people, far more than the company previously acknowledged.

The finding is the result of an investigation 23andMe launched in October, after at least one list of people whom the site identified as having Ashkenazi Jewish ancestry was posted online.

  • Th4tGuyII@kbin.social · 11 months ago

    If I needed any more convincing that you shouldn’t just give your genomic data away forever to shady companies for “FuN tRiViA” about your ancestry, this is certainly it.

    • BruceTwarzen@kbin.social · 11 months ago

      No one could ever explain to me what the point of these is, except that now they can say they’re 11% Italian. It’s like an online IQ test with spit.

      • deft@ttrpg.network · 11 months ago

        I dunno, a lot of history is about human diaspora for different reasons. People are allowed to be interested in their history and shouldn’t have to worry about this kind of bullshit.

        • BruceTwarzen@kbin.social · 11 months ago

          You can be interested in your IQ; that doesn’t mean taking a Facebook IQ test is the right play.

      • Soap10116@lemm.ee · 11 months ago

        I think 23andMe looks into genome-related predisposition to health issues. Like “this sequence here is related to a high probability of pancreatic cancer” or some bullshit like that.

        In the hands of hackers, I guess it could be used to target individuals for highly specific scams or something like that. That’s all I can think of, though. Who knows what they’ll use it for, and whether it even has identifiable personal info beyond just ancestry.

        The tinfoil-hat theory is that this info could be used to charge higher insurance premiums or deny coverage…

      • girlfreddy@lemmy.ca · 11 months ago

        In Canada it’s more like 11% First Nations, Inuit, or Métis… so white people can play pretendian.

      • littlecolt@lemm.ee · 11 months ago

        After my grandma and grandpa had Alzheimer’s, we (Mom, me, my brother) did it to see if we had the genetic markers for increased risk of the disease.

        Bonus: we also found out we have a half-sister on Dad’s side that we didn’t know about, born the year after Mom and Dad divorced. Dad also didn’t know about her, or so he says. So that was informative.

          • littlecolt@lemm.ee · 11 months ago

            Hope for medical advancement, mostly. However, there is positivity and purpose in knowing. I now know that I have one of two markers and my risk is elevated compared to the average person. Working to keep my mind sharp and healthy should be a higher priority for me now. We are learning more about this stuff all the time. Knowing is always better than ignorance.

  • DrunkenPirate@feddit.de · 11 months ago

    Captain Obvious was hiding for a while. Now he runs off with your most personal data. Even your kids will thank you in the coming years for whatever behaviors, diseases, IQ, or political preferences turn out to be rooted in genomic data. The world will know theirs.

    Edit: Oh, and you’ll help get your family members jailed if they’re somehow involved in criminal acts.

    • Mamertine@lemmy.world · 11 months ago

      Oh, and you’ll help get your family members jailed if they’re somehow involved in criminal acts.

      The police have been using that data to solve crimes for years. That’s how they found the Golden State Killer and others.

      https://www.seattletimes.com/seattle-news/law-justice/dna-family-tree-help-solve-52-year-old-seattle-slaying/

      Public genealogy databases, which contain information from people who have obtained their DNA profiles from companies like 23andMe and Ancestry.com, have become a powerful police tool…

        • Mamertine@lemmy.world · 11 months ago

          OMG, that was terrible. I usually proofread before posting. I’ll get that cleaned up. Thanks for pointing that out.

          • DrunkenPirate@feddit.de · 11 months ago

            Indeed. Now it’s publicly available to every government’s police, not just the US. Even the Iranian or Saudi police. Hey wait, don’t they punish family members of political enemies?

            Seriously, once there were police officers at my door asking me to volunteer for a DNA test. There had been a rape in my county and they asked every male. I wondered who would be stupid enough to hand the police their DNA. Once it’s stored in a database…

  • afraid_of_zombies@lemmy.world · 11 months ago

    Who could possibly have foreseen this happening repeatedly?

    If you have a pile of gold, it can only be stolen once; after that you no longer have it. If you have data, it can be “stolen” an infinite number of times, since each time you still have it.

  • Danny M@lemmy.escapebigtech.info · 11 months ago

    It’s truly a shame that in this advanced age of technology, encryption remains a distant, unattainable dream! In this archaic age of ours, safeguarding customer data is just not possible, because nobody has ever invented the concept of public/private key pairs, and hackers are having a field day with our data. Clearly, we’re still stuck in the digital dark ages where protecting sensitive information is just a pipe dream. 🙄

    Seriously, how is it possible that they’re still not using key pairs to encrypt this data? It would be so simple: include a flash drive or a QR code with the key in the box, and require that key to access the website and view the data. How is that still not something they’re doing?

    #EncryptionPlease

      • Danny M@lemmy.escapebigtech.info · 11 months ago

        That has nothing to do with it. You just give each user a private key, and in order to view the data you need the key. Simple as that; in fact it’s even simpler for a user: no passwords to remember.

        • atzanteol@sh.itjust.works · 11 months ago

          that has nothing to do with it

          Wut? You mentioned “encryption” over and over (#encryptionplease) and now it “has nothing to do with that?”

          you just give each user a private key, and in order to view the data you need the key, simple as that

          You’re expecting people to download a GPG encrypted file, handle key maintenance, and offline decrypt it for viewing? And not store the decrypted data on their drives? Almost nobody would be willing to do that. And it’s not necessary.

          WebAuthn or MFA would have helped in this situation and are far easier for end-users to use properly.

          • Danny M@lemmy.escapebigtech.info · 11 months ago

            Wut? You mentioned “encryption” over and over (#encryptionplease) and now it “has nothing to do with that?”

            I’m saying that the fact that they accessed the data through compromised accounts is irrelevant. If the data was properly encrypted and only decrypted on the user’s machine with the key, a hijacked account wouldn’t have exposed anything readable.

            You’re expecting people to download a GPG encrypted file, handle key maintenance, and offline decrypt it for viewing? And not store the decrypted data on their drives? Almost nobody would be willing to do that. And it’s not necessary.

            You made a lot of assumptions there.

            I’m not really familiar with 23andMe’s entire process, but based on what the service is, I assume they ship you a box with a vial or maybe a swab to collect saliva, plus a code, maybe a QR code or a redemption ID, that you can use to register on their website.

            The packaging could very easily include a QR code with an RSA or ECDSA key, and the website could then ask you to scan that QR code to log in.

            The website would then derive the public key from that private key and request the data associated with that public key from the server; the user’s browser would decrypt that data and display it.

            What exactly is the problem here? If anything it’s simpler than username/password from a user standpoint.

            And as for WebAuthn, yeah that would work and it’s definitely better than a password, and would perhaps solve part of the issue, but as a user I would feel much safer with my implementation.
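
            To make it concrete, here’s a rough browser-side sketch of the flow I’m describing (untested, and the /api/reports endpoint and QR payload format are just made up for illustration; the Web Crypto calls themselves are real):

            ```typescript
            // Hypothetical sketch: the QR code in the kit holds an RSA-OAEP private
            // key as a JWK. The browser never sends the private key anywhere; it only
            // derives a public identifier from it and decrypts the report locally.
            async function loadMyReport(qrPayload: string): Promise<object> {
              const jwk: JsonWebKey = JSON.parse(qrPayload);

              // Import the private key for decryption only (not extractable).
              const privateKey = await crypto.subtle.importKey(
                "jwk", jwk, { name: "RSA-OAEP", hash: "SHA-256" }, false, ["decrypt"],
              );

              // Derive a public identifier from the public parts of the JWK
              // (an RFC 7638-style thumbprint over e, kty, n).
              const digest = await crypto.subtle.digest(
                "SHA-256",
                new TextEncoder().encode(JSON.stringify({ e: jwk.e, kty: jwk.kty, n: jwk.n })),
              );
              const keyId = btoa(String.fromCharCode(...Array.from(new Uint8Array(digest))));

              // Hypothetical endpoint: returns the report encrypted to this public key
              // (hybrid scheme: AES-GCM for the data, RSA-OAEP wrapping the AES key).
              const res = await fetch(`/api/reports/${encodeURIComponent(keyId)}`);
              const { wrappedKey, iv, ciphertext } = await res.json(); // base64 fields

              const b64ToBytes = (b64: string) =>
                Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));

              // Unwrap the per-report AES key with the private key from the QR code.
              const rawAesKey = await crypto.subtle.decrypt(
                { name: "RSA-OAEP" }, privateKey, b64ToBytes(wrappedKey),
              );
              const aesKey = await crypto.subtle.importKey(
                "raw", rawAesKey, "AES-GCM", false, ["decrypt"],
              );

              // Decrypt locally; the server only ever stores and serves ciphertext.
              const plaintext = await crypto.subtle.decrypt(
                { name: "AES-GCM", iv: b64ToBytes(iv) }, aesKey, b64ToBytes(ciphertext),
              );
              return JSON.parse(new TextDecoder().decode(plaintext));
            }
            ```

            The point being that the account login never protects the plaintext: without the key from the box, a hijacked account only hands the attacker ciphertext.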

            • atzanteol@sh.itjust.works · 11 months ago

              I’m saying that the fact that they accessed the data through compromised accounts is irrelevant. If the data was properly encrypted and only decrypted on the user’s machine with the key.

              It was encrypted and only decrypted on the user’s machine. It’s called TLS.

              You made a lot of assumptions there.

              Well of course I did - all you said was “use encryption” like that meant anything specific.

              Your proposed solution has lots of problems and isn’t materially more secure than properly implementing WebAuthn or even requiring MFA. All you’ve added is public/private key authentication, which is what WebAuthn does.

              You also need a solution for lost keys, using multiple browsers, etc., or nobody will use it. This is also provided by WebAuthn.

              The problem here is not “encryption” it’s “properly identifying who should have access to the data.” It’s an authentication problem. The data was encrypted from the servers to the browser in a secure manner. At issue here is account security and authentication.

              To be clear - your solution would “work”, but without the above issues addressed nobody will use it because it will be a giant pain in the ass. It’s not like we don’t know how to share data securely; it’s that doing so is complicated and often relies on users being sophisticated about security. They’re your weakest link, so you need a system that:

              1. Is secure
              2. Users will use
              3. Can easily be used correctly by unsophisticated users and
              4. Is easy to automate (not manually generating and handing out keys, etc.)

              And this is what WebAuthn promises.

              And as for WebAuthn, yeah that would work and it’s definitely better than a password, and would perhaps solve part of the issue,

              It only has to solve part of the issue. The end-to-end encryption is covered with TLS.

              I don’t see what your solution would do that isn’t covered by existing tech. And it would require a lot of new protocols to be put in place, support from major browsers, good user interfaces, etc.
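
              For reference, the browser side of WebAuthn is already just a couple of standardized calls. Rough sketch below (the /webauthn/* endpoints are hypothetical; navigator.credentials.get() is the real API):

              ```typescript
              // Decode base64url strings (as typically sent by a WebAuthn server) to bytes.
              const b64urlToBytes = (s: string) =>
                Uint8Array.from(atob(s.replace(/-/g, "+").replace(/_/g, "/")), (c) =>
                  c.charCodeAt(0));

              // Encode raw bytes back to base64 for the verify request.
              const bytesToB64 = (buf: ArrayBuffer) =>
                btoa(String.fromCharCode(...Array.from(new Uint8Array(buf))));

              async function login(username: string): Promise<boolean> {
                // 1. Ask the server for a one-time challenge and the user's credential IDs.
                const opts = await (await fetch(`/webauthn/challenge?user=${username}`)).json();

                // 2. The authenticator (security key, phone, platform) signs the challenge
                //    with a private key that never leaves the device.
                const cred = await navigator.credentials.get({
                  publicKey: {
                    challenge: b64urlToBytes(opts.challenge),
                    allowCredentials: opts.credentialIds.map((id: string) => ({
                      type: "public-key" as const,
                      id: b64urlToBytes(id),
                    })),
                    userVerification: "preferred",
                  },
                });
                if (!cred) return false;
                const assertion = cred as PublicKeyCredential;
                const resp = assertion.response as AuthenticatorAssertionResponse;

                // 3. Send the signed assertion back; the server verifies it against the
                //    public key stored at registration. Nothing reusable is exposed, so
                //    credential stuffing with passwords from other breaches gets nothing.
                const verify = await fetch("/webauthn/verify", {
                  method: "POST",
                  headers: { "Content-Type": "application/json" },
                  body: JSON.stringify({
                    username,
                    id: assertion.id,
                    clientDataJSON: bytesToB64(resp.clientDataJSON),
                    authenticatorData: bytesToB64(resp.authenticatorData),
                    signature: bytesToB64(resp.signature),
                  }),
                });
                return verify.ok;
              }
              ```

              No shipped QR codes, no key handling by the user, and nothing for an attacker to replay from some other site’s breach.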

              • Danny M@lemmy.escapebigtech.info · 11 months ago

                It was encrypted and only decrypted on the user’s machine. It’s called TLS.

                How is TLS relevant in this discussion? In this specific case TLS only solves MiTM, that’s it.

                Well of course I did - all you said was “use encryption” like that meant anything specific.

                It was an offhand comment in a Lemmy post; of course I’m not gonna go into details… but fair enough.

                […] isn’t materially more secure than properly implementing WebAuthn or even requiring MFA.

                This is a bit disingenuous, don’t you think? To be clear, I like WebAuthn; I think it’s a great technology that I’ve been evangelizing to coworkers and friends for years, and it’s definitely the future of authentication. But that’s only marginally relevant: in the case of sensitive data like this you want the data to be both encrypted at rest and during transmission, with a unique pre-generated key, otherwise a rogue employee or in general someone with access to the database can see everything, regardless of anything else.
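
                To spell out what I mean by “encrypted at rest with a unique pre-generated key”, here’s a hypothetical sketch of the storage side in Node (not 23andMe’s actual pipeline; the names are made up):

                ```typescript
                import { constants, createCipheriv, publicEncrypt, randomBytes } from "node:crypto";

                // Hypothetical row shape for a stored, already-encrypted report.
                interface EncryptedReportRow {
                  userPublicKeyId: string; // which user's key this ciphertext belongs to
                  wrappedKey: Buffer;      // AES key, encrypted to the user's RSA public key
                  iv: Buffer;
                  ciphertext: Buffer;
                  authTag: Buffer;
                }

                // Encrypt a report to one user's public key before it is ever stored.
                // Only the matching private key (shipped in the user's kit, never kept
                // server-side) can unwrap it, so a rogue employee or a dumped database
                // yields nothing readable.
                function encryptReportForUser(
                  reportJson: string,
                  userPublicKeyPem: string,
                  userPublicKeyId: string,
                ): EncryptedReportRow {
                  // Fresh symmetric key per report (hybrid encryption: AES for the bulk data).
                  const aesKey = randomBytes(32);
                  const iv = randomBytes(12);

                  const cipher = createCipheriv("aes-256-gcm", aesKey, iv);
                  const ciphertext = Buffer.concat([cipher.update(reportJson, "utf8"), cipher.final()]);
                  const authTag = cipher.getAuthTag();

                  // Wrap the AES key with the user's RSA public key (OAEP padding).
                  const wrappedKey = publicEncrypt(
                    { key: userPublicKeyPem, padding: constants.RSA_PKCS1_OAEP_PADDING },
                    aesKey,
                  );

                  return { userPublicKeyId, wrappedKey, iv, ciphertext, authTag };
                }
                ```

                The decryption counterpart would run entirely in the user’s browser with the key from the kit, like the sketch I posted further up the thread.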

                • atzanteol@sh.itjust.works · 11 months ago

                  How is TLS relevant in this discussion? In this specific case TLS only solves MiTM, that’s it.

                  “If the data was properly encrypted and only decrypted on the user’s machine with the key.”

                  TLS is “encryption” that is “only decrypted on the user’s machine with [a] key”. I feel like I’m completely misunderstanding what sort of protocol you have in mind though.

                  [re: WebAuthn] but that’s only marginally relevant,

                  This breach was a failure of authentication via reused passwords. It’s the exact scenario WebAuthn is designed to mitigate. I have no idea how you could consider it to be “marginally relevant”.

                  in the case of sensitive data like this you want the data to be both encrypted at rest and during transmission, with a unique pre-generated key, otherwise a rogue employee or in general someone with access to the database can see everything, regardless of anything else.

                  I don’t know what you mean. Do you just mean that the database should be encrypted (I have no reason to believe it wasn’t), or that the company should encrypt the data such that the only person who can view it at all is the person whose account it is for?

    • voxel@sopuli.xyz · 11 months ago

      Because I’m pretty sure they need some of that data to be unencrypted; records of related customers can improve accuracy drastically, and they’re probably also selling it.

      Also, this “hack” was done by just abusing built-in features (the “DNA Relatives” system), not by actually breaking any security.

      • Danny M@lemmy.escapebigtech.info · 11 months ago

        Because I’m pretty sure they need some of that data to be unencrypted; records of related customers can improve accuracy drastically

        I don’t even think this should be a feature, but if it has to be, they could have two versions of it: one that they use for training and improving the results, and one that the user can only access from a frontend by decrypting it (locally) with their key.

        Also, this “hack” was done by just abusing built-in features (the “DNA Relatives” system), not by actually breaking any security.

        Irrelevant. If you had a key pair, no amount of password guessing would get them in.

    • 👍Maximum Derek👍@discuss.tchncs.de · 11 months ago

      A strange number of engineers think DARE (data-at-rest encryption) and transmission encryption are sufficient for sensitive data, as though that encryption isn’t effectively transparent once the tools designed to access it are compromised.

      • Danny M@lemmy.escapebigtech.info · 11 months ago

        Fair point!

        To be clear, I wasn’t arguing that DARE is enough; you’re absolutely correct that depending on the situation it isn’t. But in my opinion, in this specific case, if the data was DAREd, sent to the user in its encrypted state, and only decrypted on the user’s machine with a key that isn’t stored on any server, it would have completely fixed this specific issue. Naturally, however, to your point, with encryption there is no one-size-fits-all answer!

  • sexy_peach@feddit.de · 11 months ago

    In a just world they would be fined out of existence. The owners, CEO, etc. should spend time in prison.

  • HubertManne@kbin.social · 11 months ago

    I really wish I lived in a society where this would not be a big deal, and would actually be less likely to happen because there was no financial incentive for it.

      • HubertManne@kbin.social · 11 months ago

        Nope. A post-scarcity world where the only people who would want the information are those with specific purposes, like a doctor, or just a nosey snoop. Basically, a world where anyone who would want to know would likely not be inclined to put in the work needed to bypass token safeguards.

        • cucumber_sandwich@lemmy.world · 11 months ago

          I like that perspective. But I’m not sure we’ll ever reach a state of post-scarcity where people don’t cheat on their partners and produce offspring that way, without that causing drama.

          • HubertManne@kbin.social · 11 months ago

            Yeah, that is what I meant, though. Those kinds of things would either be legitimate (a court order), or it would be a rare individual who had both the motive and the skill to get past token safeguards. I mean, these breaches are usually done by teams looking to make money (or sow discord, although in that case it’s both money and discord they want).

  • Illuminostro@lemmy.world · 11 months ago

    Is anyone here not aware they’ve already been selling that data to insurance companies and who knows who else?

  • anewbeginning@lemmy.world · 11 months ago

    One of those events you could see coming from afar. It’s almost non-news. You create a treasure trove of data, and you can be sure it will be targeted and, eventually, stolen.

    • rebul@kbin.social · 11 months ago

      You are correct. Wait until one of the really large data collectors gets hacked, e.g. Google, Meta, etc. It’s not if, but when.