End to end is end to end. It's either "the devices encrypt the messages with keys that never leave the device so no third party can ever compromise them" or it's not.
Signal is a more trustworthy org, but Google isn't going to fuck around with this service to make money. They make their money off you by keeping you in the Google ecosystem and harvesting data elsewhere.
That's a different tech. End to end is cut and dried in how it works. If you do anything to data-mine it, it's not end to end anymore.
Only the users involved in the end-to-end chat can access the data in that chat. Everyone else sees encrypted data, i.e. noise. If there are any backdoors or any methods to pull data out, you can't bill it as end to end.
You are suggesting that “end-to-end” is some kind of legally codified phrase. It just isn’t. If Google were to steal data from a system claiming to be end-to-end encrypted, no one would be surprised.
I think your point is: if that were the case, the messages wouldn't have been end-to-end encrypted, by definition. Which is fine. I'm saying we shouldn't trust a giant corporation that makes money off of selling personal data when it claims something actually is end-to-end encrypted.
By the same token, don’t trust Microsoft when they say Windows is secure.
When you use the Google Messages app to send end-to-end encrypted messages, all chats, including their text and any files or media, are encrypted as the data travels between devices. Encryption converts data into scrambled text. The unreadable text can only be decoded with a secret key.
The secret key is a number that’s:
Created on your device and the device you message. It exists only on these two devices.
Not shared with Google, anyone else, or other devices.
Generated again for each message.
Deleted from the sender’s device when the encrypted message is created, and deleted from the receiver’s device when the message is decrypted.
Neither Google nor other third parties can read end-to-end encrypted messages because they don't have the key.
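The per-message key scheme described above can be sketched as a toy model. This is purely an illustration, not Google's actual implementation (which is reportedly based on the Signal protocol); the keystream construction here is deliberately simplified and not secure:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Expand a per-message key into a keystream (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream: without the key, the ciphertext is just noise.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# A fresh key is generated for each message and exists only on the two
# devices involved; per the description above, it is deleted after use.
message_key = secrets.token_bytes(32)
ciphertext = encrypt(message_key, b"hello mum")

assert decrypt(message_key, ciphertext) == b"hello mum"
# Anyone without the key (Google included) recovers only noise:
assert decrypt(secrets.token_bytes(32), ciphertext) != b"hello mum"
```

The point the quote makes falls out of the model: whoever lacks `message_key` can store, archive, or "data mine" the ciphertext all they want and still learn nothing about the plaintext.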
Not that I can find. Can you post Signal's most recent independent audit?
Many of these orgs don't post public audits like this. It's not common, even for open-source players like Signal.
What we do have is a megacorp stating its technical implementation extremely explicitly for a well defined security protocol, for a service meant to directly compete with iMessage. If they are violating that, it opens them up to huge legal liability and reputational harm. Neither of these is worth data mining this specific service.
I do consider Signal to be a clearly more trustworthy org than Google, but I find this quibbling about them "maybe putting a super secret backdoor in the E2EE they use to compete with iMessage" to be pretty clear FUD.
Even if we assume they don’t have a backdoor (which is probably accurate), they can still exfiltrate any data they want through Google Play services after it’s decrypted.
They’re an ad company, so they have a vested interest in doing that. So I don’t trust them. If they make it FOSS and not rely on Google Play services, I might trust them, but I’d probably use a fork instead.
They can just claim archived or deleted messages don't qualify for end-to-end encryption in their privacy policy, or something equally vague. If they invent their own program, they can invent the loophole for how the data is processed.
This part is likely, but not what we are talking about. Who you know and how you interact with them is separate from the fact that the content of the messages is not decryptable by anyone but the participants, by design. There is no "quasi" end to end. It's an either/or situation.
It doesn't matter if the content is encrypted in transit if Google can access the content in the app after decryption. That doesn't violate E2EE, and they could easily exfiltrate the data through Google Play Services, which is a hard requirement.
I don’t trust them until the app is FOSS, doesn’t rely on Google Play Services, and is independently verified to not send data or metadata to their servers. Until then, I won’t use it.
Provided they have an open API and don’t ban alternative clients, one can make something kinda similar to TOR in this system, taking from the service provider the identities and channels between them.
Meaning messages routed through a few hops over different users.
Sadly for all these services to have open APIs, there needs to be force applied. And you can’t force someone far stronger than you and with the state on their side.
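The multi-hop idea above is essentially onion routing: the sender wraps the message in one layer per relay, and each relay peels exactly one layer, learning only the next hop rather than the full sender-to-recipient path. A toy sketch, with layering modeled as a reversible encoding rather than real encryption, and made-up relay names:

```python
import base64
import json

def wrap(next_hop: str, payload: str) -> str:
    """Add one 'encryption' layer (toy: base64-encoded JSON, not real crypto)."""
    return base64.b64encode(
        json.dumps({"next": next_hop, "data": payload}).encode()
    ).decode()

def peel(layer: str) -> tuple[str, str]:
    """A relay removes exactly one layer and learns only the next hop."""
    obj = json.loads(base64.b64decode(layer))
    return obj["next"], obj["data"]

# Sender builds the onion innermost layer first: each layer names
# the hop that should receive what is inside it.
message = "hi"
route = ["relay_a", "relay_b", "recipient"]
onion = message
for next_hop in reversed(route[1:]):
    onion = wrap(next_hop, onion)

# Each relay peels its layer and forwards the remainder onward.
holder, packet = route[0], onion
seen = [holder]
while holder != "recipient":
    holder, packet = peel(packet)
    seen.append(holder)

assert packet == "hi"
assert seen == ["relay_a", "relay_b", "recipient"]
```

In a real system each layer would be encrypted to that relay's public key, so no single relay can link the original sender to the final recipient, which is the metadata-hiding property the comment is after.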
They can't fuck with it, at all, by design. That's the whole point. Even if they kept "archived" messages to data-mine, all they would have is noise.
Exactly. We know corporations regularly use marketing and doublespeak to obscure the fact that they operate for their interests and their interests alone. Again, the interests of corporations are not altruistic, regardless of the image they may want to project.
Why should we trust them to “innovate” without independent audit?
It could be end to end encrypted and safe on the network, but if Google is in charge of the device, what’s to say they’re not reading the message after it’s unencrypted? To be fair this would compromise signal or any other app on Android as well
That's a different threat model, one that verges on the "most astonishing corporate espionage in human history and greatest threat to corporate personhood" possible for Google. It would require thousands if not tens of thousands of Google employees coordinating in utter secrecy to commit an unheard-of crime that would be punishable by death in many jurisdictions.
If they have backdoored all Android phones and are actively exploiting them in nefarious ways not explained in their various TOS, then they are exposing themselves to ungodly amounts of legal and regulatory risk.
I expect no board of directors wants a trillion dollars of company worth to evaporate overnight, and would likely not be okay backdooring literally billions of phones from just a fiduciary standpoint.
It would require thousands if not tens of thousands of Google employees coordinating in utter secrecy
This argument is usually used for things like the Moon landing, where so many folks worked at NASA that it would have been entirely impossible to fake the landing.
But it doesn’t really apply here. We know for example that NSA backdoors exist in Windows. Were those a concerted effort by MS employees? Does everyone working on the project have access to every part of the code?
It just isn’t how development works at this scale.
Ok but no one is arguing Windows is encrypted. Google is specifically stating, in a way that could get them sued for shitloads of money, that their messaging protocol is E2EE. They have explicitly described how it is E2EE. Google can be a bad company while still doing this thing within the bounds we all understand. For example, just because the chat can’t be backdoored doesn’t mean the device can’t be.
How do spyware services used by nation-state customers, like Pegasus, work?
They use backdoors in commonly used platforms on an industrial scale.
Maybe some of them are vulnerabilities due to honest mistakes. The problem is that the majority of honest-mistake vulnerabilities also carry denial-of-service risks in widespread usage, which means they get found quickly enough.
So your stance is that Google is applying self-designed malware to its own services, to violate its own policies, to harvest data that could bring intense legal, financial, and reputational harm to it as an org if it were ever discovered?
You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?
That is wildly conspiratorial thinking, and honestly plain FUD. It undermines serious, actual privacy issues the company has when you make up wild cabals that are running double secret malware attacks against themselves inside Google.
You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?
You think you are being the smart one here?
No, that’s not what I said. Also cypherpunks and other hobbyists are not that much smarter than corporations and nation-states, to be the only ones to think about plausible deniability.
For example, the full Windows source has been officially given to various three-letter agencies of various countries (Russia included) to study, and of course a codebase of that size has vulnerabilities. MS might not have left obvious backdoors and informed the FSB of them, but it has given interested parties the ability to find those themselves, which is only a matter of work, and perhaps made it easier to produce tampered versions of DLLs and the like.
Also they are legally obligated to silently comply with a lot of things.
That is wildly conspiratorial thinking, and honestly plain FUD.
WhatsApp and Facebook (before it bought WhatsApp) have both done this, Telegram has done this, MS has done this, even Apple has done this.
when you make up wild cabals that are running double secret malware attacks against themselves inside Google.
You made that up, not me. Should have tried to read what you are being told first.
Signal doesn't harvest, use, or sell metadata; Google may do that.
E2E encryption doesn’t protect from that.
Signal is orders of magnitude more trustworthy than Google in that regard.
There's also Session, a fork of Signal, which claims that its decentralised protocol makes it impossible, or at least very difficult, for them to harvest metadata even if they wanted to. Though I personally can't vouch for how accurate their claims are.
Agreed. That still doesn't mean Google is not doing E2EE for its RCS service.
I'm not arguing Google is trustworthy or better than Signal. I'm arguing that E2EE has a specific meaning that most people in this thread do not appear to understand.
Sure!
I was merely trying to raise awareness for the need to bring privacy protection to a level beyond E2EE, although E2EE is a very important and useful step.
End to end could still - especially with a company like Google - include data collection on the device. They could even "end-to-end" encrypt what they send to Google over a side channel. If you want to be generous, they would perform the aggregation on-device and not track the content verbatim, but the point stands: E2EE is no guarantee of privacy. You also have to trust that the app itself isn't recording metrics, and I absolutely do not trust Google not to do this.
They make so much of their money from profiling and ads. No way they're not going to collect analytics. Heck, if you use the stock keyboard, that's collecting analytics about the texts you're typing into Signal, let alone Google's RCS.
Note that it doesn’t mean metadata is encrypted. They may not know what you sent, but they may very well know you message your mum twice a day and who your close friends are that you message often, that kinda stuff. There’s a good bit you can do with metadata about messages combined with the data they gather through other services.
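The kind of inference described above needs no decryption at all. A sketch of what a server-side metadata log could look like (the fields and names here are hypothetical, purely for illustration):

```python
from collections import Counter

# Hypothetical metadata log: what a provider could retain even when
# the message *content* is end-to-end encrypted.
log = [
    {"from": "alice", "to": "mum", "at": "2024-01-01T08:00", "bytes": 120},
    {"from": "alice", "to": "mum", "at": "2024-01-01T20:00", "bytes": 95},
    {"from": "alice", "to": "bob", "at": "2024-01-01T12:30", "bytes": 40},
    {"from": "alice", "to": "mum", "at": "2024-01-02T08:05", "bytes": 110},
]

# Who does alice message most, and how often? No key required.
contacts = Counter(m["to"] for m in log if m["from"] == "alice")
print(contacts.most_common(1))  # [('mum', 3)]
```

Timestamps, participants, frequency, and message sizes alone are enough to reconstruct a social graph, which is exactly the data that E2EE, by itself, does not protect.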
Yup, they can read anything you can, and send whatever part they want through Google Play services. I don’t trust them, so I don’t use Messenger or Play services on my GrapheneOS device.
You have the key, not the provider. They are explicit about this in the implementation.
They can only read the messages before encryption if they are backdooring all Android phones in an act of global sabotage. Pretty high consequences for some low-stakes data.
I'm pretty sure the key is stored on the device, which is backed up to Google. I can't say for sure whether they back up your keyring, but I feel better not using it.
End to end is end to end. It's either "the devices encrypt the messages with keys that never leave the device so no third party can ever compromise them" or it's not.
Signal is a more trustworthy org, but Google isn't going to fuck around with this service to make money. They make their money off you by keeping you in the Google ecosystem and harvesting data elsewhere.
Your honor, I would like to submit Exhibit A, Google Chrome “Enhanced Privacy”.
https://www.eff.org/deeplinks/2023/09/how-turn-googles-privacy-sandbox-ad-tracking-and-why-you-should
Google will absolutely fuck with anything that makes them money.
This. Distrust in corporations is healthy regardless of what they claim.
That's a different tech. End to end is cut and dried in how it works. If you do anything to data-mine it, it's not end to end anymore.
Only the users involved in the end-to-end chat can access the data in that chat. Everyone else sees encrypted data, i.e. noise. If there are any backdoors or any methods to pull data out, you can't bill it as end to end.
You are suggesting that “end-to-end” is some kind of legally codified phrase. It just isn’t. If Google were to steal data from a system claiming to be end-to-end encrypted, no one would be surprised.
I think your point is: if that were the case, the messages wouldn't have been end-to-end encrypted, by definition. Which is fine. I'm saying we shouldn't trust a giant corporation that makes money off of selling personal data when it claims something actually is end-to-end encrypted.
By the same token, don’t trust Microsoft when they say Windows is secure.
It's a specific, technical phrase that means one thing only, and yes, Google's RCS meets that standard:
https://support.google.com/messages/answer/10262381?hl=en
They have more technical information here if you want to deep dive about the literal implementation.
You shouldn’t trust any corporation, but needless FUD detracts from their actual issues.
You are missing my point.
I don’t deny the definition of E2EE. What I question is whether or not RCS does in fact meet the standard.
You provided a link from Google itself as verification. That is… not useful.
Has there been an independent audit on RCS? Why or why not?
Not that I can find. Can you post Signal's most recent independent audit?
Many of these orgs don't post public audits like this. It's not common, even for open-source players like Signal.
What we do have is a megacorp stating its technical implementation extremely explicitly for a well defined security protocol, for a service meant to directly compete with iMessage. If they are violating that, it opens them up to huge legal liability and reputational harm. Neither of these is worth data mining this specific service.
I’m not suggesting that Signal is any better. I’m supporting absolute distrust until such information is available.
Here’s all their independent audits:
https://community.signalusers.org/t/overview-of-third-party-security-audits/13243
Thank you. I had trouble running down a list.
I do consider Signal to be a clearly more trustworthy org than Google, but I find this quibbling about them "maybe putting a super secret backdoor in the E2EE they use to compete with iMessage" to be pretty clear FUD.
Even if we assume they don’t have a backdoor (which is probably accurate), they can still exfiltrate any data they want through Google Play services after it’s decrypted.
They’re an ad company, so they have a vested interest in doing that. So I don’t trust them. If they make it FOSS and not rely on Google Play services, I might trust them, but I’d probably use a fork instead.
They can just claim archived or deleted messages don't qualify for end-to-end encryption in their privacy policy, or something equally vague. If they invent their own program, they can invent the loophole for how the data is processed.
Or the content is encrypted, but the metadata isn’t, so they can market to you based on who you talk to and what they buy, etc.
This part is likely, but not what we are talking about. Who you know and how you interact with them is separate from the fact that the content of the messages is not decryptable by anyone but the participants, by design. There is no "quasi" end to end. It's an either/or situation.
It doesn't matter if the content is encrypted in transit if Google can access the content in the app after decryption. That doesn't violate E2EE, and they could easily exfiltrate the data through Google Play Services, which is a hard requirement.
I don’t trust them until the app is FOSS, doesn’t rely on Google Play Services, and is independently verified to not send data or metadata to their servers. Until then, I won’t use it.
Provided they have an open API and don’t ban alternative clients, one can make something kinda similar to TOR in this system, taking from the service provider the identities and channels between them.
Meaning messages routed through a few hops over different users.
Sadly for all these services to have open APIs, there needs to be force applied. And you can’t force someone far stronger than you and with the state on their side.
The messages are encrypted with cryptographic keys on the users' phones that never leave the device. They are not decryptable in any way by Google or anyone else. That's the very nature of E2EE.
They can't fuck with it, at all, by design. That's the whole point. Even if they kept "archived" messages to data-mine, all they would have is noise.
Exactly. We know corporations regularly use marketing and doublespeak to obscure the fact that they operate for their interests and their interests alone. Again, the interests of corporations are not altruistic, regardless of the image they may want to project.
Why should we trust them to “innovate” without independent audit?
End to end doesn't say anything about where keys are stored; it can be end-to-end encrypted while someone else has access to the keys.
It could be end to end encrypted and safe on the network, but if Google is in charge of the device, what’s to say they’re not reading the message after it’s unencrypted? To be fair this would compromise signal or any other app on Android as well
That's a different threat model, one that verges on the "most astonishing corporate espionage in human history and greatest threat to corporate personhood" possible for Google. It would require thousands if not tens of thousands of Google employees coordinating in utter secrecy to commit an unheard-of crime that would be punishable by death in many jurisdictions.
If they have backdoored all Android phones and are actively exploiting them in nefarious ways not explained in their various TOS, then they are exposing themselves to ungodly amounts of legal and regulatory risk.
I expect no board of directors wants a trillion dollars of company worth to evaporate overnight, and would likely not be okay backdooring literally billions of phones from just a fiduciary standpoint.
This argument is usually used for things like the Moon landing, where so many folks worked at NASA that it would have been entirely impossible to fake the landing.
But it doesn’t really apply here. We know for example that NSA backdoors exist in Windows. Were those a concerted effort by MS employees? Does everyone working on the project have access to every part of the code?
It just isn’t how development works at this scale.
Ok but no one is arguing Windows is encrypted. Google is specifically stating, in a way that could get them sued for shitloads of money, that their messaging protocol is E2EE. They have explicitly described how it is E2EE. Google can be a bad company while still doing this thing within the bounds we all understand. For example, just because the chat can’t be backdoored doesn’t mean the device can’t be.
Telegram has its supposedly E2EE protocol, which isn't used by most Telegram users, and a few questionable traits have also been found in it.
Google is trusted a bit more than Pavel Durov, but it can well do a similar thing.
And yes, Android is a much larger heap of hay where they can hide a needle.
I think it's also confirmed by radio transmissions from the Moon, received in real time by the USSR and other countries.
How do spyware services used by nation-state customers, like Pegasus, work?
They use backdoors in commonly used platforms on an industrial scale.
Maybe some of them are vulnerabilities due to honest mistakes. The problem is that the majority of honest-mistake vulnerabilities also carry denial-of-service risks in widespread usage, which means they get found quickly enough.
So your stance is that Google is applying self-designed malware to its own services, to violate its own policies, to harvest data that could bring intense legal, financial, and reputational harm to it as an org if it were ever discovered?
Seems far fetched.
Legal and financial - doubt it. Reputational - counter-propaganda is a thing.
I think your worldview lags behind our current reality. I mean, even in 30-years old reality it would seem a bit naive.
Also you’ve ignored me mentioning things like Pegasus, from our current, not hypothetical, reality.
So yes.
You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?
That is wildly conspiratorial thinking, and honestly plain FUD. It undermines serious, actual privacy issues the company has when you make up wild cabals that are running double secret malware attacks against themselves inside Google.
You think you are being the smart one here?
No, that’s not what I said. Also cypherpunks and other hobbyists are not that much smarter than corporations and nation-states, to be the only ones to think about plausible deniability.
For example, the full Windows source has been officially given to various three-letter agencies of various countries (Russia included) to study, and of course a codebase of that size has vulnerabilities. MS might not have left obvious backdoors and informed the FSB of them, but it has given interested parties the ability to find those themselves, which is only a matter of work, and perhaps made it easier to produce tampered versions of DLLs and the like.
Also they are legally obligated to silently comply with a lot of things.
WhatsApp and Facebook (before it bought WhatsApp) have both done this, Telegram has done this, MS has done this, even Apple has done this.
You made that up, not me. Should have tried to read what you are being told first.
Signal doesn't harvest, use, or sell metadata; Google may do that.
E2E encryption doesn’t protect from that.
Signal is orders of magnitude more trustworthy than Google in that regard.
There's also Session, a fork of Signal, which claims that its decentralised protocol makes it impossible, or at least very difficult, for them to harvest metadata even if they wanted to. Though I personally can't vouch for how accurate their claims are.
Agreed. That still doesn't mean Google is not doing E2EE for its RCS service.
I'm not arguing Google is trustworthy or better than Signal. I'm arguing that E2EE has a specific meaning that most people in this thread do not appear to understand.
Sure!
I was merely trying to raise awareness for the need to bring privacy protection to a level beyond E2EE, although E2EE is a very important and useful step.
End to end could still - especially with a company like Google - include data collection on the device. They could even "end-to-end" encrypt what they send to Google over a side channel. If you want to be generous, they would perform the aggregation on-device and not track the content verbatim, but the point stands: E2EE is no guarantee of privacy. You also have to trust that the app itself isn't recording metrics, and I absolutely do not trust Google not to do this.
They make so much of their money from profiling and ads. No way they're not going to collect analytics. Heck, if you use the stock keyboard, that's collecting analytics about the texts you're typing into Signal, let alone Google's RCS.
End to end is meaningless when the app scans your content and does whatever it wants with it.
For example, WhatsApp and their almost-mandatory “backup” feature.
Note that it doesn’t mean metadata is encrypted. They may not know what you sent, but they may very well know you message your mum twice a day and who your close friends are that you message often, that kinda stuff. There’s a good bit you can do with metadata about messages combined with the data they gather through other services.
With end to end, what matters is who has the key: you or the provider. And Google could still read your messages before they are encrypted.
Yup, they can read anything you can, and send whatever part they want through Google Play services. I don’t trust them, so I don’t use Messenger or Play services on my GrapheneOS device.
You have the key, not the provider. They are explicit about this in the implementation.
They can only read the messages before encryption if they are backdooring all Android phones in an act of global sabotage. Pretty high consequences for some low-stakes data.
I'm pretty sure the key is stored on the device, which is backed up to Google. I can't say for sure whether they back up your keyring, but I feel better not using it.
I mean, Google does, with Play Services.
You may be right for that particular instance, but I’d still argue caution is safer.
Of course our app is end-to-end encrypted! The ends being your device and our server, that is.
It’s end to end to end encrypted!
Unless you’re Zoom and just blatantly lie lol
https://arstechnica.com/tech-policy/2021/08/zoom-to-pay-85m-for-lying-about-encryption-and-sending-data-to-facebook-and-google/