Thank you
> The American Civil Liberties Union announced Tuesday that Open Whisper Systems (OWS), the company behind popular encrypted messaging app Signal, was subpoenaed earlier this year by a federal grand jury in the Eastern District of Virginia to hand over a slew of information—"subscriber name, addresses, telephone numbers, email addresses, method of payment"—on two of its users.
> ... “The only information responsive to the subpoena held by OWS is the time of account creation and the date of the last connection to Signal servers,” Kaufman continued, also pointing out that the company did in fact hand over this data.
https://arstechnica.com/tech-policy/2016/10/fbi-demands-sign...
Telegram, for all intents and purposes, is about as secure as using Facebook. The best you can do with Telegram is hope they don't sell out or get compromised at some point in the future, because all your private communications are stored on their servers forever. Telegram does have "secret chats", which, from what I can gather, don't even work for group chats, only for one-to-one messages.
My general advice is to treat Telegram like a new Facebook if you have to use it: assume everything may be read by everyone, and don't treat it like it's private and secure.
For "text messaging" friends and family use Signal. Everything is end-to-end encrypted by default, so you know nobody is collecting your data.
What we do know is that programs like Telegram have to store data about users on their servers, by design. A big difference between the two projects is that Signal is carefully designed to minimize the amount of data the service needs to operate; it's why identifiers are phone numbers: so it can piggyback on your already-existing contact lists, which are kept on your phone.
By contrast, other services store, in effect, a durable list of every person you communicate with, usually indexed in a plaintext database.
As of mid-2016, and trusted as much as you feel like trusting something attested in a court of law, Signal stores: a bool (is this phone number a user) and two ints (epoch of signup, epoch of last transmission).
https://www.aclu.org/open-whisper-systems-subpoena-documents
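To make concrete how little that is, here's a toy sketch of the entire per-number record the subpoena response implies. The field names are my own invention for illustration, not Signal's actual schema:

```python
from dataclasses import dataclass

# Hypothetical illustration of the per-number record implied by the
# ACLU subpoena response -- field names invented for this sketch.
@dataclass
class SignalRecord:
    is_registered: bool   # does this phone number have an account?
    created_epoch: int    # Unix time of account creation
    last_seen_epoch: int  # Unix time of last connection to the servers

record = SignalRecord(is_registered=True,
                      created_epoch=1454284800,
                      last_seen_epoch=1465948800)
print(record)
```

That's it: no messages, no contact graph, no names or addresses.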
See the full explanation at https://signal.org/blog/private-contact-discovery/ (starts part way down, with "trust but verify"). Or check the client source code yourself!
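A rough sketch of the older, hash-based contact discovery approach discussed in that post (truncated SHA-256 of each phone number; the truncation length here is illustrative, and the real service has since moved to an enclave-based design):

```python
import hashlib

def discovery_token(phone_number: str) -> str:
    """Truncated SHA-256 of a normalized phone number -- the client
    sends tokens like this instead of raw contacts."""
    digest = hashlib.sha256(phone_number.encode()).digest()
    return digest[:10].hex()  # truncation length is illustrative

# Client side: hash the address book before uploading.
contacts = ["+14155550123", "+442071838750"]
tokens = [discovery_token(n) for n in contacts]

# Server side: intersect with tokens of registered users, return matches.
registered = {discovery_token("+14155550123")}
matches = [t for t in tokens if t in registered]
print(matches)
```

As the blog post itself points out, hashing phone numbers gives only limited protection, because the number space is small enough to brute-force; that weakness is exactly what motivated the SGX-based private contact discovery design.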
Telegram: I dunno, their server is closed source, they don't encrypt end-to-end by default, and they have shady ownership. I don't trust them at all, personally.
> do we really trust signal? cause i see zero reason to.
Here's a reason: I use it every day and I'm not dead yet.
[1] https://twitter.com/Snowden/status/1347217810368442368?s=20
And it is convenient, because you can just switch smartphones and still access all your chat history, without having to manually back up and restore.
But Telegram in general does not have a business model yet, so just assume that one day they will want (or have) to cash out.
Signal, on the other hand, is a non-profit foundation and pretty open about what they are doing. That creates trust for me.
Telegram's "secret chats" and Signal chats are end-to-end encrypted. The servers may still store metadata, and there is no way to tell whether they do other than joining them or letting a trusted third party verify it.
To check that e2e-encrypted message content cannot be decrypted via backdoors on their servers, you need to ensure they use proven encryption schemes and that the client implementation actually corresponds to those algorithms.
These are three very different companies with very different security processes and trust profiles.
In the case of Signal: if you trust that the source code they distribute is the same as the app available in the Play Store, then it's pretty easy to verify that the messaging data is end-to-end encrypted in a way that prevents Signal from having much metadata that they even could store. With "sealed sender", they don't even know who's talking to whom: https://signal.org/blog/sealed-sender/
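A toy sketch of the "sealed sender" idea (structure only; the field names are invented and the real envelope format differs): the server-visible part of a message names only the recipient, while the sender's identity travels inside the end-to-end encrypted payload.

```python
# Hypothetical "sealed sender" envelope -- structure only, for illustration.
# The server must see routing info (who to deliver to), but the sender's
# identity lives inside the end-to-end encrypted ciphertext.
envelope = {
    "to": "+14155550123",  # routing info the server needs for delivery
    "ciphertext": "<e2e-encrypted bytes, sender certificate included>",
}

# Nothing outside "ciphertext" identifies who sent the message.
assert "from" not in envelope
```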
There's the possibility that Signal could ship a different app in the Play store, but that would require active malice to do in a way that would not be trivial to discover[0], and at some point you do have to trust someone. It's not impossible, but it's hard to imagine a world in which Signal is compromised but other links in the chain aren't; quite frankly, there are far more easily corruptible or hackable links in the hardware/software stack that you use, so Signal would make a pretty inefficient target for someone who wants monetizable data.
[0] i.e., an accidental divergence between the two would be more conspicuous
1. the protocol between client and server is set up in such a way that even if Signal wanted to store interesting information (for example, messages), they could not access it, so they don't store it, since it would be useless to them
2. the app implements the protocol faithfully and this has been checked by people perusing the source code
3. the binary downloaded from the app/play store phone is compiled from the sources listed on github
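Point 1 can be illustrated with a toy end-to-end scheme (a one-time-pad stand-in for the actual Signal Protocol, which derives keys via X3DH and the Double Ratchet): the relay only ever sees opaque ciphertext, so there is nothing useful for it to store.

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with a key byte (toy one-time pad)."""
    return bytes(a ^ b for a, b in zip(data, key))

# Alice and Bob share a key out of band; the server never sees it.
# (Real Signal derives keys via X3DH + Double Ratchet, not a shared pad.)
shared_key = os.urandom(32)

plaintext = b"meet at noon"
ciphertext = xor(plaintext, shared_key)

# All the relay can log is opaque bytes -- no content, no key.
server_log = {"blob": ciphertext.hex()}

# Bob decrypts on his own device.
assert xor(ciphertext, shared_key) == plaintext
```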
My concern is not about my data being stored on their servers. My concern is about marketing data being sold to third parties in order to target advertising at me, just as when you leave "third-party cookies" active in your browser. That is creepy and invasive. Would Zuckerberg ever do such a thing?
Just try the following trick:
- in a private Browser window open web.telegram.org
- enter your phone number, receive the code
- turn on flight mode on your phone
- now enter the identification code in your browser
- access your whole chat history, even though your smartphone definitely cannot be acting as the source
Even WhatsApp is better in this regard.
Article in German: https://www.heise.de/hintergrund/Telegram-Chat-der-sichere-D...
As for promising not to store your metadata, or not to deliberately give the gif service information about your account because they hate you, or not to store your contacts when you search for friends on Signal: yeah, you just have to take their word for it. Though they may, over time, find ways to move some of those guarantees to the client side with some clever engineering, so you could prove it.
Which does not prove that, if the app sends your phone number or location (for example), they don't save it in a database.
Indeed, interesting question
CLI prototype. Can be generalized into a nice phone app.
https://github.com/adsharma/zre_raft https://twitter.com/arundsharma/status/1348718596415918080
After that, what data do you care about? Neither Signal nor Telegram is intended to provide complete anonymity. That is a much harder problem. For Mozilla that would involve Tor. I don't think that Mozilla really has "servers" in the sense you mean.
What about Mozilla? What could they store?
(1) All Signal messaging is E2EE; (2) they don't store messages on their servers; (3) the client code is open source, and it seems like a good portion of the server code is open source.
Where I think Signal could go further on being the most secure, useful, and privacy-conscious messaging app/company in the world:
1. Open source ALL of the server code. They have something called Signal-Server (https://github.com/signalapp/Signal-Server) on their Github, but it's unclear if this is the server they use, or simply a server one could theoretically use to run a private Signal server.
2. Open source all server-side services/infrastructure code that doesn't compromise security in some way.
3. Better features. Signal is currently the most secure and privacy-conscious of the messaging apps, but solidly the worst overall user experience. It's not that it's bad, it's just that the other apps are much better. People like gifs and giphy and emojis and a fast-feeling interface. This is important, because it's hard to be a privacy-conscious individual when all your friends want to text on other apps. At least in my social circle, Signal is still the thing people jump over to when they want to be extra super sure they're not leaving a paper trail, not the default messaging app they use.
4. Introduce a user-supported business model. This probably makes a lot of people uneasy, and while I appreciate the current grant- and donation-based business model (the Wikipedia model), that model comes at a great cost in efficiency. By operating effectively as a non-profit, you are inherently in a less competitive position relative to your competitors (the best product and engineering people are more likely to go to competitors who can pay more), and you're persistently in fund-raising mode (again, see: Wikipedia). There are lots of ways to skin this cat; maybe the easiest is to ask power users to pay something like $5/mo. Or just give people the option to pay with absolutely zero obligation. Some non-zero cohort would inevitably take them up on this.
Most of these suggestions, of course, especially 1-3, are very hard and come at an enormous cost. Building in public as an open source business seems to massively slow things down and introduces a huge amount of community-management overhead. That said, I'm sure there are ways to manage/mitigate those costs.