Signal says it won’t compromise on encryption (theverge.com)
422 points by TobTobXX on Oct 27, 2022 | 316 comments


Requiring handing over encryption keys as a requirement to do business there sounds like a good way to sanction yourself from the modern world.


As much as I agree with you according to my own principles, I would not underestimate the sacrifice-anything-for-profits side of the cost-benefit analysis that many businesses will perform.


Signal is a nonprofit though. They shouldn't be under pressure to create business.


Non profit but still against federation or anyone running an alternative client. Why?


The good answer: They can't verify the integrity of alternative clients and that they don't leak info

The other answer: They've got somewhat of a "we know best" vibe going for them which also comes in play when you see their response to feature requests - e.g. for usernames instead of phone numbers or for "edit message" functionality like Telegram has.


I don't know what response to username feature requests you're talking about, but AFAIK they've been saying for a long time already that they're working on that — but it's a significant rework of their architecture, so it understandably is taking a long time, especially to also do so in a privacy preserving way.


>The good answer: They can't verify the integrity of alternative clients and that they don't leak info

The good answer isn't even a good argument.

The client is open source. People have forked it and used the Signal network. Signal asked them to stop and they did, but there is nothing stopping people from ignoring Signal's request in the future.

This has nothing to do with federation. Signal could federate their network and still request everybody use the official client.


What exactly is the problem anyone would be trying to solve by adding third party clients?


Your point in no way explains why the argument from Signal is bad, and arguably completely misses the point.


My point is client integrity and federation of servers are not related.


It's a non profit so it can receive donations, but the developer is an LLC that's run for profit. It's a similar story in almost all software companies that market themselves as non-profit foundations (Mozilla too, btw).

https://en.m.wikipedia.org/wiki/Signal_Foundation#Signal_Mes...


The effect is the same as it is for "regular" non-profits: there are no shareholders (other than the non-profit), and so no incentives to maximise profits.

(Of course employees and board members can still receive handsome compensation, but the same holds true for regular non-profits.)


There are some things non-profits are not allowed to do. But owning a for-profit isn't one of them, and the for-profit is allowed to do those things. Hence this is a common strategy.

Example: Suppose I bulk buy T-shirts printed with my cool logo for $15 each and I sell them to consumers for $50 each. That's a for-profit activity. If Walmart were allowed to have a "non-profit" arm which did this, I'm sure they would; the tax saving would be considerable.


That's decidedly not what "nonprofit" means. A nonprofit can absolutely fundraise like that, and pay the employees above-market salaries, or they can use the funds for other things; they just can't return them to shareholders (of which there aren't any) or the board.


That's described here: https://signal.org/blog/the-ecosystem-is-moving/

(And also extensively discussed elsewhere on HN already, if you want to dive into it some more.)


Because federation is seriously difficult — look at all the effort the Matrix team has put in, and it’s still not quite 100%.

Plus yes Signal has a bit of an NIH complex.


Matrix federation is hard because they want to support large IRC-style rooms with consistent history and strong controls to prevent room takeovers.

Signal is more focused on 1:1 or small group chats. Federation there can be much simpler, more like email/xmpp than IRC.

It does still add friction and slows down your development, as upgrading is difficult.


Signal also has a grand total of 40 employees. Keeping the application running on the various platforms takes a decent chunk of their time. Difficult development to add something like federation cannot be expected to happen quickly.


You know perfectly why, as your comment history shows.


Many countries tried this, even the ones we consider the "modern world" (like France, more than once). Some even prohibited the export of encryption, so they could more easily decrypt foreign traffic (the USA). Some protocols were even designed with "alternative" encryption, intentionally weakened (GSM).


I remember an IETF meeting where this was discussed and turned down. This was 25+ years ago, and it seems to pop up again and again.

https://www.ietf.org/rfc/bcp/bcp200.html


Especially when companies are looking to move their manufacturing from China...

India just shot itself in the foot


China has been regulating encryption, requiring key escrow from businesses, and breaking encryption for years.


I mean, this is how the USA works. It is widely speculated that the USG will retaliate against Apple for shipping true end-to-end encryption software that does not enable surveillance unless they (Apple) do some sort of clientside scanning. This seems to indicate that Apple's 1A rights to publish software are being infringed.

Presently all large/webscale messengers in the USA (iMessage, WhatsApp, etc) all have backdoors in the e2ee that allow the FBI et al to access message content at will (often without a warrant or probable cause).


Unfortunately, there are things they could do without "turning over encryption keys".

Scanning for keywords, doing content ID on images, and all kinds of other stuff that would not "turn over encryption keys".

I suspect other services do these things for advertising keywords, and to prevent objectionable images from being sent.


Isn't this what Bill Barr and the Trump administration wanted, too?

https://www.justice.gov/opa/pr/statement-attorney-general-wi...


Every time a new Gov gets in it feels like they roll this crap out.

Those in the top of the power tree really seem to dislike the idea that anyone not them can keep secrets from them.

It's exhausting, as it never seems to be a well-thought-out, technically or philosophically sound argument.

Are we doomed to never have technically competent leadership?


> Are we doomed to never have technically competent leadership?

Arguably it has nothing to do with technology and everything to do with philosophy of government. It makes some sense that people in government optimize for the government's needs, and wouldn't it be great (from the government's perspective) if the government could have an eye on everything, so it could try to be everywhere at once?

Of course that's antithetical to a free country, but from a pragmatic rather than principled view from the top, it's not as apparent that the country _should_ be free, especially once the US changed from a federation (power largely in the hands of the states, with the federal government mostly focused on legitimately cross-state issues) to a nation (power largely in the hands of the federal government).


> Those in the top of the power tree really seem to dislike the idea that anyone not them can keep secrets from them.

Just so we're on the same page, the "top" are the people we elect and the others are bureaucrats, technocrats, and a "community" of spies. Is that correct?

-

This pattern can be understood from different povs. A point of view rarely discussed is this: democracies and secret services are incompatible when currents of power are maintained by secrets.

Why can't we do something along the line of zero knowledge protocols in e.g. open Senate hearings?

Is it truly impossible to have both full accountability and effective intelligence services?


What is it about that guy, that makes people unable to stop talking about him?

Was it not enough for him to be the top story in the news every day for 4 years?


He was our president, every former president is newsworthy. Even when they just speak out. January 6th and the theft of top secret documents including nuclear secrets guarantees that he's not leaving the news cycle anytime soon. If he's not charged with a crime that is disqualifying for president, most expect him to run and deny any loss, ensuring our republic is overthrown by a banana republic as well. Many folks are keeping eyes on him.

There's really no other political figure with as much unrestricted aggression as he has. I certainly don't have his nerve, I have far too much respect for the founders and institutions of our government to so willfully rage against it. I guess our greatest leaders, Washington, Lincoln, Roosevelt and everything we've built till now humbles me.

But unlike Mr. Trump, I also was working as a paperboy at 12 years old, worked night shifts during college, and never got a break being unemployed all the way into my 40s now. There's a bigger difference between someone like me and Trump, and people of other racial and ethnic backgrounds. That's why I never saw eye to eye with him. But he is pretty fascinating.

The audacity and outright disregard for our institutions is absolutely astonishing. I do hope the media spotlight remains on him for the rest of his life; I don't want that guy flying under the radar. He's a reminder of what America's worst inclinations can produce. Never worked a job (other than President of the United States of America), born at the finish line, with a soul full of man's worst inclinations. And believe me, I agree with at least 75%+ of "his" policies. I remain an independent. If I had to describe my leanings it would be a "pre-MAGA Republican, who supports unions". These parties have to work for my vote and cannot count on it, in case you think someone saying the things I have is somehow anti-Trump. If I am, it's only because it makes sense to be so.


The FBI's desires that you are referencing through Trump have been policy goals since the '80s, when source code had to be exported as pages in a book. So it rings false when you claim he's just another president. You do get into your real reasons at least, but the soapbox betrays the real nugget of the question. Why can't we have a policy discussion without someone derailing it with their absolute need to tell us how badly they disagree with him?


I said I agree with 75% of his policies or more. I'm not sure where you read that I "badly disagree with him". That's a disingenuous take on what I said. I disagree with his methods and disapprove of his lack of character and integrity. It's not worth the win.

That said, for an example of the 25%, while I agree with low taxes, I do not agree with tax cuts when already at historical lows as President Trump enacted. Deficit spending should not be supporting any tax cuts. To share just one example outside of that 75% of agreement I have with Trump policies. I would hardly characterize that disagreement as proof that I "badly disagree" with "his" policies.

Scare quotes around "his" because I'm not convinced many of his administration's actions were his ideas. I would characterize him as more of a puppet who dominated the room and was willing to disregard any form of humility or tradition, but did not dominate intellectual matters such as policy decisions. Any positive change that occurred could only accurately be ascribed to his administration. His criminal actions have nullified, or should nullify, all of his support. We're long past the stage of it qualifying as a cult.


China does it just fine


India wants to read everything? Why don't they just forbid the use of HTTPS, so that everyone can read everything on the internet?


Kazakhstan mandated government-issued man-in-the-middle TLS certificates:

https://www.zdnet.com/article/kazakhstan-government-is-inter...

The EU is following this government-friendly move:

https://www.bleepingcomputer.com/news/security/experts-urge-...

It would not be difficult for India either. I see it as likely that Modi and the BJP will make this move as part of some anti-terrorist legislation.


Personally I find it unbelievable that major governments are not already in possession of the private key for at least one of the 150+ root certificates pre-installed on my device.


Having the private key for a root CA does not allow them to decrypt your traffic. They'd have to sign an impostor certificate for the hostnames to which you connect, and actively tamper with (MITM) your traffic using that new key/certificate. This would be trivially detectable (indeed that's what certificate transparency does).


This would be a lot of effort for very little reward.

The problem is that those keys can't be used passively. Just knowing these keys achieves nothing (lay people often assume you could snoop TLS, but that's not how it works with a CA root, even in archaic SSL versions). The only useful thing you can do with those keys is make certificates (the thing the CA gets to do legitimately), but presumably you'd make bogus ones.

But in most of the world's web browsers those certificates don't work unless they come with SCTs, receipts from two or more public certificate transparency logs promising they logged these certificates.

So now as well as obtaining private keys to a trusted root CA, you need to break at least two of the CT logs.

This deliberately and unavoidably creates a paper trail showing what happened. All three entities (the root CA and two logs) have their reputations destroyed and if they're for-profits presumably go bankrupt (or the business unit fails).

And what did you get for this? A forged certificate? Maybe a few dozen if you targeted carefully. Maybe you were able to pull this off for a whole week before the alarm bells got too loud to ignore?
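
If you want to poke at that paper trail yourself, here's a minimal sketch (assuming the third-party "cryptography" package; the hostname is just a placeholder) that fetches a server's leaf certificate and lists any embedded SCTs:

  # Minimal sketch: fetch a server's certificate and list embedded SCTs, the
  # Certificate Transparency "receipts" discussed above. Assumes the
  # third-party "cryptography" package; example.com is a placeholder host.
  import ssl
  from cryptography import x509

  host = "example.com"

  pem = ssl.get_server_certificate((host, 443))
  cert = x509.load_pem_x509_certificate(pem.encode())

  try:
      ext = cert.extensions.get_extension_for_class(
          x509.PrecertificateSignedCertificateTimestamps
      )
      scts = list(ext.value)
      print(f"{host}: {len(scts)} embedded SCT(s)")
      for sct in scts:
          print("  log:", sct.log_id.hex(), "logged at:", sct.timestamp)
  except x509.ExtensionNotFound:
      print(f"{host}: no embedded SCTs (they may arrive via TLS extension or OCSP instead)")

That's essentially the property being relied on above: a certificate either carries log receipts (and is therefore publicly discoverable) or most major browsers won't accept it.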


All of this has happened already, see this: https://en.wikipedia.org/wiki/DigiNotar

"Cryptographer Bruce Schneier says the attack may have been "either the work of the NSA, or exploited by the NSA."[6] However, this has been disputed, with others saying the NSA had only detected a foreign intelligence service using the fake certificates.[7]"


Some governments officially operate their own root CAs: https://ccadb-public.secure.force.com/mozilla/IncludedCACert...


I'm trying to imagine a ban on HTTPS for better understanding, and I remember that Australia implemented anti-encryption laws some 3-4 years ago. Can someone comment on how the Australian anti-encryption laws work in the case of HTTPS? Is it illegal to use encryption (like HTTPS to send data to a server from a browser) without a backdoor now in Australia?


Australian law doesn't require you to do anything until explicitly asked by the intelligence agencies. The first stage is a gentle "Request for Technical Assistance" with no penalties for saying no. But they can then ask again and demand you provide assistance, with jail time/fines for non-compliance. The orders also come with secrecy notices, so you can't inform anyone (except your lawyers) that you've received the notice. The requests have to target specific users, so they can't be "collect messages on everyone with a Muslim name" or something.

The chilling effect of it is this: what if they demand you give them information you do not have a way of accessing (e.g. Signal)? How would you comply? Do you have to pre-empt whatever requests you MIGHT get and ensure you could backdoor a user if it were required? The law also seems to imply that ASIO could demand that a single employee at a company backdoors something and they wouldn't be able to tell their co-workers.


> The chilling effect of it is this: what if they demand you give them information you do not have a way of accessing (e.g. Signal)? How would you comply? Do you have to pre-empt whatever requests you MIGHT get and ensure you could backdoor a user if it were required?

That's what Technical Capability Notices are for. You don't have to implement a backdoor until they force you to. They have worded the legislation to make it sound as though this cannot be used to implement "systemic weaknesses" but this is bullshit (their definition of a "systemic weakness" would be something like getting a backdoor into OpenSSL, while a backdoor in Facebook Messenger is not a "systemic weakness" because it only affects one application).


Of course the government itself should still get to use HTTPS; it's too important, and even government officials' personal communication should be protected. All animals are equal, ...


I read the linked source. I could not find any reference to any Indian law or Govt. order. Can you please point me to any source with credible information on the supposed provisions which are causing Signal to "exit" India?


It was a proposed law (put out to solicit feedback). Signal is not exiting, but would exit if such a law comes to India.

https://www.hindustantimes.com/india-news/govt-proposes-law-...

But the Telecom Regulatory Authority of India rules out any immediate intervention.

https://tech.hindustantimes.com/tech/news/trai-rules-out-reg...

I couldn't quote from the article here, but it looks like they are going to wait till clarity emerges from international jurisdictions.


Proposed law draft (I hate that news outlets do not link to the law they discuss) https://dot.gov.in/sites/default/files/Draft%20Indian%20Tele...


Thanks. This bill seems to be about telecom infrastructure. It does not talk about metadata or encryption. As far as I can find, it does not seem to touch on messaging apps either. Was anyone able to find which provision in the bill would impact Signal?


It's explicit on the interception of any kind of data.

It defines message as:

  “message” means any sign, signal, writing, image, sound, video, data stream or intelligence or information intended for telecommunication;
And then it says that said "messages" can be "intercepted or detained or disclosed", for a really wide range of reasons, apparently without the intervention of a judge.

  24.4 On the occurrence of any public emergency or in the interest of the public safety, the Central Government or a State Government or any officer specially authorized in this behalf by the Central or a State Government, may, if satisfied that it is necessary or expedient to do so, in the interest of the sovereignty, integrity or security of India, friendly relations with foreign states, public order, or preventing incitement to an offence, for reasons to be recorded in writing, by order:

  (a) direct that any message or class of messages, to or from any person or class of persons, or relating to any particular subject, brought for transmission by, or transmitted or received by any telecommunication services or telecommunication network, shall not be transmitted, or shall be intercepted or detained or disclosed to the officer mentioned in such order;


Thanks. To me it looks like client-side encryption should be fine, or real P2P for that matter.

Officer: give me the messages

Service: here they are

Officer: decrypt for me

Service: Sorry, don't have the keys.


> “telecommunication services" means service of any description (including [..] electronic mail, voice mail, voice, video and data communication services, [..], internet based communication services,

I think Signal would fall under at least one of those services.

you can control+f your way through to see everything related to that phrase.


Thanks


The power of TRAI (Telecom Regulatory Authority of India) is also proposed to be curtailed in the same bill [0].

[0] https://www.outlookindia.com/business/proposal-to-curtail-tr...


I am looking for that proposed law to understand the provisions. Can you please give me a link to it if it is available?



Thanks. Again, this article does not link to the supposed draft bill or say what it actually contains. They jump directly to saying the government wants to "intercept". How does the Govt propose to intercept? What was actually proposed in the bill? They also say the Govt. is seeking public feedback on the draft. Ideally this news source should link to the draft in question so that people can read it and raise their objections.


Signal is too small a player and is therefore more likely to be bullied by governments. India is taking pot shots at it to see if it can get away with forcing them into intercepting communications. If they leave as a result, they'll simply shrug and move on.

Also, I'm convinced that if Signal were to become popular they'd probably sell it to some commercial provider, since maintaining a service used by hundreds of millions if not billions of people is extremely expensive.

WhatsApp is an entirely different matter. Hundreds of millions of Indians probably use it and if the company were to suspend service because the Indian government mandated a backdoor all hell would break loose!

For the same reason, both Australia and the UK haven't yet forced WhatsApp to install a backdoor, even though legislation enabling them to do so is already in place there.


How do you know that there is not a back door in WhatsApp?


Presumption of innocence. Or have you seen any evidence to prove one exists, besides Durov's shit from time to time?


In the context of privacy, you can pretty much assume every black box is compromised. With Telegram this black box is the server (the client is open source); with WhatsApp, it's the client. I suppose there's threat models where WA still wins, but knowing it's owned by Meta, I have a hard time imagining what such a threat model would look like.


Is the Whatsapp client really a black box? APKs are fairly straightforward to decompile back to Smali or a reasonable approximation of Java, or people on rooted devices can hook it with Frida. Of course source code would be better, but it would be pretty brazen to stick a backdoor in an app store release. App versions for popular apps get archived by numerous third-party sites, so even a temporary backdoor in one specific version would be archived forever. That would be putting their reputation and billions of dollars on the line.

Non-E2E with black box server code like Telegram is far more concerning, in my opinion. With a system like that, it would be trivial to backdoor and leave behind no evidence after the fact.
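
For what it's worth, a rough sketch of the kind of runtime poking described above, using Frida's Python bindings (a rooted device with frida-server is assumed; the package name and class filter are only illustrative, and this does nothing beyond listing loaded Java classes):

  # Rough sketch, not a turnkey tool: attach to a running app with Frida's
  # Python bindings and list loaded Java classes whose names mention "cipher".
  # Assumes a rooted Android device with frida-server running; the package
  # name "com.whatsapp" and the filter are illustrative assumptions.
  import sys
  import frida

  JS = """
  Java.perform(function () {
    Java.enumerateLoadedClasses({
      onMatch: function (name) {
        if (name.toLowerCase().indexOf("cipher") !== -1) {
          send(name);
        }
      },
      onComplete: function () { send("-- done --"); }
    });
  });
  """

  device = frida.get_usb_device(timeout=5)
  session = device.attach("com.whatsapp")   # attach to the running app
  script = session.create_script(JS)
  script.on("message", lambda msg, data: print(msg.get("payload")))
  script.load()
  sys.stdin.read()   # keep the process alive while messages arrive

None of this proves the absence of a backdoor, of course; it just illustrates the point that the client is inspectable in a way a closed server is not.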


It is very difficult to find a backdoor even in open source code; in megabytes of closed source code it is nearly impossible.

> That would be putting their reputation

Does Facebook have any reputation left?


> Does Facebook have any reputation left?

Most people still think of WhatsApp and Facebook as separate. For a while, WhatsApp displayed a Facebook logo for a second whenever it started (and given how Android works, opening WhatsApp does not necessarily mean starting it, most of the time it just switches to an already started process), but even that does not happen anymore, since it was replaced by a "Meta" logo (and most people do not associate "Meta" with Facebook).


Which reputation? The reputation of a company that doesn't care about privacy at all and went through dozens of privacy scandals?


Seems like someone doesn't like the truth


Telegram has more reputation than Meta.


Bullied how? It's easier to get bullied when you have office space: they harass your employees. It's easier to get bullied when you sell products/services: they stop your money flow. What/how do you envision them bullying Signal?


> ... what/how do you envision them bullying Signal?

I guess, by outlawing them. Think of the collateral damage. For example, unavailability of Signal means MobileCoins in the wallets of its Indian users become inaccessible (theoretically, since Signal is a non-custodial wallet) if they don't get them out in time before the ban, so there's something for Signal to think about.


They might not compromise on encryption, but they have no intentions of open sourcing their censorship module either: https://github.com/signalapp/Signal-Server/blob/90490c9c8485...


Can y'all chill with calling every instance of a message being deleted censorship? I've heard everything from minecraft chat profanity filters to copyright infringement detection called "censorship", meanwhile there are systems filtering and changing SMS messages based on political sentiment and people living in fear of being arrested, tortured and executed for saying bad things about the ruling class.

Call it what it is: Signal is keeping their spam filter private. And history has shown that this is usually the right move.


The difference between an 'abuse filter' and a censorship mechanism is semantics. By classifying political speech or image hashes you don't like as "spam", you've effectively implemented a political censorship mechanism and just gave it a different name. This is also why many privacy and security advocates are against on-device CSAM scanning. By classifying hashes of memes shared among political opposition parties as CSAM, you enable censorship, or even worse, targeted tracking of individuals over protected political speech. These systems need to be evaluated by what they can be made to do, not by what we're told they're for.

I have personally witnessed political messages being censored on Signal, server-side.

I recognize that keeping that private is what makes it effective. The argument being made isn't that the filter would be more effective if it were open source, the argument is that the filter is being abused to perform censorship, and that's the reason why it's not open source.

One of many downsides of trusting centralized platforms/services like Signal.


You seem to be under the impression that this filter is based on message content. It's not. All content is e2ee, and this filter is not downloaded to the client. The filter is to stop abusive network behavior, e.g., someone enumerating every phone number to get a list of all Signal users. If you'd like, I can link you to some of the research that has been done to determine some of what they're doing (or you can search google scholar).

edit to add: I guess I should clarify, the messages in Signal are properly e2ee with ephemeral keys, and there is no content hashing or the like to leak anything to the server. It's not public yet, but if you'd like, I can share some of my research into the structure of an actual Signal message, from the TCP layer down to the encrypted payload.


Evidence?


Be more specific. I already linked to the exact line number in Signal's code where they admit both that a server-side message filter exists and that the implementation is private, look at the top level comment.


There are legitimate reasons for not open sourcing it. It would become effortless to circumvent.


The argument isn't that it should be open sourced to improve spam detection or efficacy, the argument is that it's being misused as censorship mechanism without calling it that, and that's why it's not open source - it would become readily apparent that it's being used for political censorship if it was open source.

If it was just being used for spam filtering and not censorship, it would make much more sense to implement the blocking client-side (on the recipient's device) and provide a user-toggleable flag to disable that functionality. This would enable those who don't want spam and trust the filter to keep the exact same functionality, while allowing those who distrust it or suspect it of censorship to voluntarily opt-out for messages they receive. This would also offload processing costs from Signal's servers to user's devices, which would decrease the operational expenditure incurred by Signal.

One reason not to take advantage of that economic incentive is that you don't want your users to have the choice, and you'd not want them to have the choice if you're trying to censor them.


People act when on the precipice. Decentralized e2ee messengers and servers will thrive now in India: xmpp, matrix, session, tox, briar, Jami..


Or they’ll just install the .apk directly?

iPhone users are screwed until sideloading becomes a thing, but iPhones represent a tiny market share in India, AFAIK


Sideloading on iPhones is already possible, but requires a Mac/PC and is limited for people without a paid developer account (max. 3 apps, must reinstall every 7 days; there are Windows/Mac apps which do it for you automatically). With a paid developer account you can sideload unlimited apps and they last a year... If you're interested, check out AltStore or Sideloadly...


Maybe not; imagine possession of such a program is treated the same way as child pornography.


How popular would it be to treat people like pedos, just because of an app that they use?


No, what the government says is "use of X is sufficient evidence to support a warrant", and then the fact that everyone does it is irrelevant - you only use it against enemies, so that's the only time it comes up. Because this is likely a government-triggered invasion of privacy, it's easy to see how a government can then make it appear that a given [ab]use of the law is "legit".

In CA for example the standard highway speed limit is 65mph, but everyone drives well over 70 (this isn't a silly "everyone does it", it's a "if you're doing 65 you may well be rear-ended"). This means that the CHP can basically choose whoever they want to pull over, despite everyone breaking the same laws. Essentially they've created a system where everyone is breaking a law so that they can then selectively charge.


This rhetoric is already applied against onion websites and similar decentralized networks.


Easy. Make everyone think that you must have something to hide, or create fake stories that pedos use app X.

> if you have nothing to hide, why don’t you show it?

Is also “nice” rhetoric that laymen agree with.


I'd assume that India has its fair share of smart people who are able to spot fallacious rhetoric and want to see the bank accounts of officials in return.



Smart people don't dictate things, the majority does.


Who is "smarter", the people that manipulate both their philosophy and the majority's opinion to suit their goals, or the people whose smart philosophies are uncompromised but unimplemented?


You’ll have to consult the opinion of parent comment.

I read it as smart=intellectually smart and virtuous.


Probably not popular, but it would be really easy for the state to do it. All they need is their larger media outlets to make a few daytime TV story lines featuring these apps in their narratives, a couple big budget movies, some exaggerated "criminal activity" headlines in the major press. Next thing you know we get the next Satanic Scare.


The average person uses WhatsApp, has no idea about security, and has never heard of Signal. The media will tell them Signal is specifically set up to prevent law enforcement. They will give an example of a drug dealer that gets caught and has the app on their phone. Even if the dealer didn't use Signal for dealing, the connection is presented.

The average person will connect the dots. Signal is an underground app for illegal stuff. I have nothing to hide so I don't need Signal to exist. People must use Signal for the most illegal and disturbing thing I can think of. Signal users must be pedos. Signal must be banned.

This is how power uses media to control the masses. They don't need to make the claim; they simply present facts in a narrative and allow the average person to connect the dots, making them feel like they figured it all out themselves.


Proud of Signal’s stance on this!


Encryption is just the tip of the iceberg here.

There are several major problems with Signal:

- it is not that private after all since it requires a phone number. Yes, you can override this by using some virtual throwaway number if you are geeky enough but your account will be associated with this phone number anyways.

- as a consequence you _will_ receive spam from bots fanning out messages to phone numbers. You can't restrict who can message you to only chosen people, say, people from your contact list.

- Signal protocol is probably great from the e2ee perspective but it is not federated and unlike XMPP you cannot spin up your own server and have full control over it.

- Since the end product is not a protocol or a framework or a platform, it is a product that is run by other people who you can only trust, albeit you can verify and audit the source code yourself (or hire someone to do it on your behalf). And I am sure you cannot run an end-to-end audit of the whole Signal platform to verify that what they actually run has been built from the source code you have audited.

- Since it is centralized, Signal is prone to censorship in those countries that decide to fight it. This leads to introducing workarounds like this https://signal.org/blog/run-a-proxy/ to help people circumvent limitations in affected regions and territories.


> ...your account will be associated with this phone number anyways.

Messages exchanged in Signal are repudiable. That said, in a recent blog post Signal indicated that arbitrary usernames is something they're working on.

> ...unlike XMPP you cannot spin up your own server and have full control over it.

Signal is a better alternative to the likes of PGP because it is centralized [0]. Spam notwithstanding, Matrix has made good on the federation model, however. So that's there too.

> And I am sure you cannot run an end-to-end audit of the whole Signal platform to verify that what they actually run...

Valid but shallow. Case in point: public CAs, though held accountable by browsers, are nowhere near as transparent as Signal is, and yet they form the backbone of almost all sensitive communication on the Interwebs.

> Signal is prone to censorship in those countries that decide to fight it.

This is the only major concern, but glad that Matrix exists, and so do other solutions. Maybe it isn't Signal's battle to fight. Maybe it is, and they'll figure something else out aside from building naive TLS proxies.

[0] https://signal.org/blog/the-ecosystem-is-moving/


> Signal indicated that arbitrary usernames is something they're working on.

No offense, but they've been saying this for years; the feature may eventually come, but I'm not holding my breath.



I think SMS support was part of the reason they could not do it before, since usernames would break that feature. Removing SMS makes this a non-issue. The phone number is now just an arbitrary ID string which can be replaced by any other.


There's no reason they couldn't have done both.

Signal never used SMS as a transport. It only provided a UI to have unencrypted SMS messages alongside Signal chats. As far as Signal messages themselves go, phone numbers always were an arbitrary ID string.

Having a username that isn't a phone number would have only more clearly segregated the SMS feature from Signal chat; and the desire for that segregation is one of the key excuses they came up with for dropping SMS support from the Signal app.


> Signal never used SMS as a transport.

Signal is a merge of the TextSecure and RedPhone applications; back when it was still called TextSecure, it did use SMS as a transport for encrypted messages.


OK, that's historically interesting, though it really doesn't change my point.


Of course they could have, but it would have made for a more complex UI and added potential confusion for users. Nothing that would be a serious problem that could not be overcome, but all of these things are tradeoffs: cost, complexity, etc.

SMS wasn't removed to make way for arbitrary user IDs. That would have been surprising, even though I prefer IDs over SMS support. But if SMS was on the way out already, it made sense to wait on implementing arbitrary user ID support until then. And now they have one less excuse not to implement it.


In the most recent beta they have enabled the backend infrastructure for usernames


I hate having to come up with a username. A better (IMHO) solution would be GUIDs or similar, generated on-device.


Bitwarden will now generate usernames for you, like it can generate passwords.


steve_taylor is a perfectly nice username ;)


Also:

- There is no way to back up your message history (with photos, etc). This could be done using their (annoyingly pushed to users) "PIN", but isn't.

Few people realize that if your phone dies today, your history is GONE. From what I saw, once people do realize this, it's game over for Signal. WhatsApp is just easier, "everybody is there", and it does back up your history.

EDIT: Yes, it's on iOS. Yes, I realize this might not matter to you, but it matters to a lot of people. And if Signal tries to "bring privacy to the masses", this needs to be fixed. I've seen multiple people stop using Signal after losing all their data.


Signal has backup on Android, and fairly fast image management, but it's lacking on iOS. That points to a lack of developer resources.

What you call WhatsApp history is other people being able to read your messages.

WhatsApp is violently aggressive in demanding you use your phone and give access to your contacts. You can't use it without feeling violated.


Signal is similarly coercive about contacts - only "Yes" and "Not now", and once they get them, there's no way to delete the data they've got.

Desktop Signal still keeps showing me other people who also have Signal, but whom I've never even tried to contact via Signal, only because years ago I mistakenly let them see my contacts.


You can deny the Signal app the address book permission and it still works.

Signal contact discovery is also the most privacy preserving approach available anywhere, using only the partial hash of their phone number. You don't have to share your contacts -- but yes, your contacts may share their truncated hash list, and hence discover that your number is tied to a Signal identity, but it is on their phone that the number maps to a name, not in any network traffic or on a Signal server.

We're really not getting any better solutions until secure MPC becomes more practical.

https://www.howtogeek.com/708651/can-you-use-signal-without-...

https://signal.org/blog/private-contact-discovery/
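
To make the truncated-hash idea concrete, here is an illustrative sketch. This is not Signal's exact scheme (the linked blog post describes the production design, which adds further protections); the truncation length and numbers are arbitrary choices for the example:

  # Illustrative only: the general shape of hash-based contact discovery.
  # NOT Signal's exact scheme; the 10-byte truncation is an arbitrary choice.
  import hashlib

  def truncated_hash(phone_number: str, length: int = 10) -> str:
      digest = hashlib.sha256(phone_number.encode("utf-8")).digest()
      return digest[:length].hex()

  # The client uploads truncated hashes of its address book...
  my_contacts = ["+15551234567", "+15557654321"]
  uploaded = {truncated_hash(n) for n in my_contacts}

  # ...and the service intersects them with hashes of registered numbers,
  # returning only the matches. The name-to-number mapping never leaves the
  # phone, which is the point made above.
  registered = {truncated_hash(n): n for n in ["+15557654321", "+15550000000"]}
  print([registered[h] for h in uploaded if h in registered])  # ['+15557654321']

Phone numbers have little entropy, so bare hashes could be brute-forced; that is the gap the production system works around with additional protections, and why secure MPC is mentioned as the longer-term answer.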


> That points to a lack of developer resources.

No, that points to prioritization issues. Many other features were developed and added over the years, but backup was not. It's not an accident, it's a result of multiple decisions where backing up user data was not seen as important.


I see no reason to back-up my history ever. I don't use signal for anything that needs to be archived. Nor do any of my friends. One of the selling points of Signal is that it doesn't automatically save photos; this is a good feature. You manually save the photos you want to save.

And I thought people regularly backed up everything worth saving from their phones to a non-phone archive anyway.


> I see no reason to back-up my history ever. I don't use signal for anything that needs to be archived.

Every single time this comes up, someone chimes in saying they don't need it.

That does not invalidate the users who do need it. If I can't keep my messages from device to device, I'm not going to use Signal.

With far more manual effort than should be required, I can move my messages from Android device to Android device, if the old device is still functional. That's still not fully sufficient; I need a reliable automatic (encrypted) backup solution.


Signal on Android has backups and migrations. Maybe the iOS Signal doesn't?


iOS signal has no backups and migrations routinely fail. I've lost years of message history multiple times attempting iOS -> iOS migrations. It's a shame it's so botched and has made me stop recommending Signal to people entirely.


Some people want their complete messaging history, but encrypted so that only they can see it. And backed up so that... well... they don't lose it.


Not having the ability to backup and archive my messages means Signal controls my data, not me. It reminds me of my past fights with closed source, proprietary software and file formats.


I do off-device Signal backups all the time, and have successfully restored.


You have that on Android, but not on iOS.


The solutions for most of the issues you are describing come with great usability costs. It is already hard to make non-tech people switch from WhatsApp.


How come? Signal works exactly the same as WhatsApp. I forced my friends and (older) relatives to change to Signal and there have been no problems; everyone is happy.


Network effects and minor UX issues. “VERIFY PIN NOW” pop-up every week for example.


I love that feature


> How come? Signal works exactly the same as WhatsApp.

I think right there is a powerful impediment. It's not compelling to switch for most, especially when people they know are still on whatsapp.


> The solutions for most of the issues you are describing come with great usability costs.

What costs? Element (a popular Matrix client) has recently improved their onboarding greatly!

https://element.io/blog/all-aboard-better-ftue-for-less-wtf/

It really is just as easy to onboard to Element as Signal these days; the UX has come a long way. And you'll never have to move them again, because you can choose any client you like.


That's very good to hear. I use Element a lot, but I found it very difficult to get non-technical friends and family to use it. A better onboarding experience is a real step in the right direction.

Another constant stumbling block is this whole "verify session" business. No one understands what it means. I understand technically what it does but I can't explain why it is so important that it keeps popping up all the time. It creates a constant sense of "something isn't right here but I don't know what to do".

I hope this was fixed as well.


Yep, it's definitely been frustrating in the past. The number of iOS Element bugs was overwhelming at times too. It's a lot more stable now, but the bubble layout still isn't the default - I think that's what most people expect from a personal messenger. I'm looking forward to seeing what the Rust rewrite [1] brings for performance/stability.

FluffyChat also has quite nice UX and a bubble layout by default, but threads are still a while off [2]. On iOS it worked flawlessly through the iOS 16 betas while Element had some show stopping bugs, a couple of my friends moved over if they were on the beta.

I haven't had any friends ask me about the verify session buttons. I don't see any prompts on latest iOS Element but it's still too prominent on Element desktop for my liking.

SchildiChat [3] is my daily driver and feels more friendly than Element on desktop (unified DMs & group chats, no verify UX, chat bubbles), but it doesn't have any update mechanism built in, so I'm wary to recommend it to non-technical friends. It was also my goto recommendation on Android before the Element redesign.

I'm confident the ecosystem is moving in the right direction though, and so thankful for the amount of choice.

[1]: https://github.com/vector-im/element-x-ios
[2]: https://gitlab.com/famedly/fluffychat/-/issues/881
[3]: https://schildi.chat/


> What costs? Element (a popular Matrix client) has recently improved their onboarding greatly!

Matrix is a great example where you need that one tech guy to create a home server. Otherwise nobody in your friend group can use it as intended.


It's easy to use the free server at matrix.org. For $5/month you can get a hosted server from EMS with optional bridges to Signal/Telegram/WhatsApp.

Or another paid provider: https://matrix.org/hosting/


Try to convince non-technical users to pay for something they already use for free… They don't understand the benefits.


But why would you need to convince them? There's nothing wrong with using a free matrix.org account unless and until you do understand the benefits of using something else.

The good thing about having the federated protocol is that people can use different providers and change providers without destroying the network effects.

It's perfectly fine if most people end up using one of very few big providers as long as all providers support federation.

I think we have to make a distinction between personal communication and mass communication. The former is a question of privacy. The latter is a question of political freedoms.


>it is a product that is run by other people[...]Since it is centralized

So are federated instances, in reality, because the vast majority of people are going to be on some popular node, by virtue of how communication networks organize.

And I'd rather trust Signal than having to trust some server in a guy's basement who I can only pray has a reasonable security setup, and is going to fold ten times more quickly to a court order than a foundation with a 50 million dollar check and nice lawyers.


I've never had a spam message on Signal in all the years I've been using it.


Neither did I until a few months ago, had like 1 spam per week for about 2 months. Then it stopped ¯\_(ツ)_/¯


>> Since it is centralized, Signal is prone to censorship in those countries that decide to fight it.

Yes, this. It might be "fighting for privacy" now, but they can change their mind at any point. With new management, eventual profit seeking, government coercion, etc, this could change with a simple app update. It is more likely to me that it remains private because it isn't particularly successful. I find a lot of people have it but rarely use it.

It also isn't clear to me what is in it for donors such as EM. It is much easier to trust a system with correctly designed incentive structures.


> requires a phone number ... as a consequence you _will_ receive spam from bots

Weird take. Can you imagine the number of bots if it didn't require a phone number?


Your "major problems" are fair, but nothing's perfect, so unless you have a better solution with the same design constraints (e.g. https://github.com/signalapp/Signal-Android/blob/main/CONTRI...), they are rather moot, because they're not actionable. That being said:

> - it is not that private after all since it requires a phone number. Yes, you can override this by using some virtual throwaway number if you are geeky enough but your account will be associated with this phone number anyways.

You still leak very little because of their sealed sender feature. If you allow arbitrary usernames, you need to bootstrap your trusted channel from no assumption instead of a weak one (if I receive a message from @barack_obama for the first time, I have no clue if it's actually someone I know, while a message from @+1235312 indicates that the sender at least had access to this number at some point to register). All solutions have tradeoffs.

> - as a consequence you _will_ receive spam from bots fanning out messages to phone numbers.

There's very little spam now on Signal (I think I've had 2 spam messages in about 4 years). At one point there was a bit, but server-side measures are now preventing that.

You can't really have no spam at all as long as you have a common id that's shared among your contacts, which is the case most of the time for usability, as any of these contacts can leak this id somehow.

> - Signal protocol is probably great from the e2ee perspective but it is not federated and unlike XMPP you cannot spin up your own server and have full control over it.

Signal's approach is that you shouldn't care about the server. Why do you?

> And I am sure you cannot run an end-to-end audit of the whole Signal platform to verify that what they actually run has been built from the source code you have audited.

You can do that on the client code, which is reproducible, and again mostly what you should care about.

> - Since it is centralized, Signal is prone to censorship in those countries that decide to fight it.

I'd say that their reliance on phone numbers is the main issue with censorship: if new users can't receive their registration text message, they cannot register.

AFAICT people in censored countries still rely on WhatsApp, Signal and similar messengers instead of more niche but more decentralized alternatives like Tox. Again, however bad their current approach might be, it's a moot point until there's something that works better in practice.


If you care about privacy, stop using your phone and any app. There, problem solved.


Is there anything more secure than signal that is widely used? Maybe something that doesn’t leak metadata or require a phone number?


Uhh, Jitsi? Jitsi Meet? Free and open source of course.

Jitsi has ZRTP encryption. Jitsi Meet uses WebRTC and its encryption isn't the same, but from the academic research papers I've read such as "Stegozoa: Enhancing WebRTC Covert Channels with Video Steganography for Internet Censorship Circumvention" from June 2022, it's convincing enough for me to have set up my own Jitsi Meet server on a Debian Linux Virtualbox machine.

From the paper:

"Given that the peer-to-peer connections carrying the video streams are encrypted end-to-end, not even a state-level adversary with unrestricted access to the network infrastructure will be able to observe the raw video content of the WebRTC streams. "


> widely used


There's Briar, it's peer to peer and pretty great.

XMPP/Jabber with OMEMO encryption, which runs on federated servers.

Session is a fork of signal that doesn't require phone numbers, there's some cryptocurrency something or other in there I don't quite get, but I don't believe you need it to send messages.

There's Tox, another p2p sort of thing.

Then there's Threema, Wire, and a bunch of others I'm not all that familiar with.


> Session is a fork of signal that doesn't require phone numbers, there's some cryptocurrency something or other in there I don't quite get, but I don't believe you need it to send messages.

This is not a fork of Signal. It was originally designed to use the Signal protocol under the hood for encryption and key management, but does not anymore [1]. They appear to be going in a different direction now, with the service using cryptocurrency nodes to route messages [2].

[1] https://getsession.org/blog/session-protocol-technical-infor...

[2] https://oxen.io/session-lokinet


There are a few secure alternatives but as stated in the article, their goal is privacy as well as security. For instance they don’t even know who’s in which groups or that a group exists and they keep as little metadata as possible. The remaining metadata that is required for the service to work, they encrypt and don’t even have the keys.


Threema, but it's not free.


What metadata does Signal leak? They have had sealed sender since 2018, which means the most interesting metadata isn't shared. Just don't share contacts and register with a disposable number.




Tox doesn't leak anything, but it's not widely used.


Widely used? No. I've found Session to be a decent alternative; however, it's still in early development, which entails some scuff, and details as to how they intend for it to be financially supported long-term aren't clear.

https://getsession.org/

TLDR on Session is that it's a fork of Signal (effectively same front end, key scheme, encryption scheme, etc) with a modified transport/delivery and notification system and without the phone-number-as-an-identifier caveat that signal has.

Note: Sorry for the wall of text below.

As for what that modified transport layer is, it's routing all the messaging and data hosting over Oxen (https://oxen.io/) which is a cryptocurrency that serves as a decentralised short term / small size addressable data store and an onion router for those messages/data. As much as cryptocurrency=bad in a lot of cases, here it kinda makes sense as it's just an automated digital marketplace for data hosting and bandwidth with tooling wrapped around it to support privacy and anonymity preserving tools without relying on some hopefully benevolent dictator to run it.

As for who's backing it, it's the same group that develops Oxen, an Australian non-profit focused on privacy tech and bearing the same name (Oxen Privacy Tech Foundation). While Oxen is pay-for-use (messaging and all that has an on-chain cost), it looks like the foundation is covering the costs of running Session for the foreseeable future. Given the nature of the project, it should eventually be possible for users to pay their own infra costs; however, that doesn't seem to be implemented yet.

It's pretty easy to use.

1. Install via F-droid or download from the web.

2. Basic cryptocurrency-wallet-style setup where your account is based on a randomly generated "recovery seed" phrase (a string of words with as many bits of randomness as the private key, which can be used to rebuild the private key on a new device); see the sketch after this list.

3. Then you can share your "Session ID" which is basically just your public key or you can pay for a custom username which is addressed to your public key (you can set names for contacts after adding them so the username is mostly for ease of discoverability).

4. After that it's basically just Signal but where you can make and throw away accounts at the drop of a hat.
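
Rough illustration of what a recovery phrase is doing under the hood (a generic sketch, not Session's actual derivation; the word list, word count, and hash are arbitrary placeholders):

  # Generic illustration of a "recovery seed" phrase: random words encode
  # entropy, and the private key is derived deterministically from them, so
  # writing down the words is equivalent to backing up the key. NOT Session's
  # actual scheme; word list, word count, and KDF are placeholders.
  import hashlib
  import secrets

  WORDLIST = [f"word{i:04d}" for i in range(2048)]  # placeholder 2048-word list

  def new_seed_phrase(n_words: int = 12) -> str:
      # 2048 words = 11 bits per word, so 12 words is roughly 132 bits.
      return " ".join(secrets.choice(WORDLIST) for _ in range(n_words))

  def private_key_from_phrase(phrase: str) -> bytes:
      # Same phrase in, same 32-byte key out: that's what lets you rebuild
      # the account on a new device.
      return hashlib.sha256(phrase.encode("utf-8")).digest()

  phrase = new_seed_phrase()
  print(phrase)
  print(private_key_from_phrase(phrase).hex())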

My main complaints are

1. that it's a bit slow on delivery

2. The onion routing half of decentralised storage + routing is still being implemented for Session as the project is very much WIP at this stage.

----

My takeaway is that provided it can stick around, Session has potential to shore up where Signal falls short. Give it a year or two in the oven and I might recommend it as a daily driver for messaging.

Likewise for the OPTF and their goals in general. It looks like once Session is "fully implemented" they are looking at trying to expand the approach to a discord/slack/matrix competitor as well which could be interesting. As far as I can tell they are just a bunch of privacy nerds with a little bit of a cryptocurrency lean to them but they are doing good work.


matrix


I thought Matrix homeservers had access to way more metadata than Signal. Is it better than what I thought?


What does it matter to additionally leak relationships between messages, such as deletes or replies?

Most important is who, when, with whom, all of which leak to the server operators.


Not at all like Signal. They only have your account creation time and your last connection time.

https://signal.org/blog/looking-back-as-the-world-moves-forw...

(edit: answering your initial message, which said that Signal operators also have access to the same metadata, but I guess my comment still answers your sentence "Most important is who, when, with whom, all of which leak to the server operators" - Signal operators have shown they can't provide this information)


If you don't store that data or delete it, you can't provide it.

But Signal could store that data, because they have access to that data.


> If you don't store that data or delete it, you can't provide it.

Can I do this easily and still have my homeserver work correctly with Matrix?

Anyway, the law would probably require any operator to keep this data for a while, since they have access to it. And I can't rely on my contacts' homeservers to delete this data even if they can (technically and legally).


Signal certainly can store it, because they have access to it.

They have access to that data alone through the fact that they are in control of their servers and, thus, can see who sends and receives messages.


But they can't. That's the thing.

https://signal.org/blog/sealed-sender/


They can.

Even on a theoretical level their sealed sender technique doesn't work: https://www.ndss-symposium.org/ndss-paper/improving-signals-...

Now, include lower-level technical details such as the IP layer.

1. Imagine you connect to a server to send a message to someone else. The server can't see who you are, right? Because the letter is missing your name. Now imagine someone else sends a message to you.

2. Imagine you connect to the same server to receive the messages that the server stored for delivery with your name on them. The server gives you the messages.

Do you notice something?


Okay, maybe Signal has access to this information with sufficient investigation then.


No investigation necessary.

The information can be obtained in an automated way.


you can easily run your own.

Though you will need lots of RAM for federation if you use Synapse (mine keeps bogging down after a year of mild use; could also be a bug, idk, it allocates all the RAM and then nothing goes). Now I'm waiting for the new server software, Dendrite, to be able to migrate from Synapse... although I wouldn't lose much if I just nuked the Synapse instance.


> you can easily run your own.

Yes, and I have in the past. But that does not solve the issue completely. If the police ask me for information about one of my contacts, I'll have some metadata stored on my homeserver, and I'll need to hand it over. And vice versa.

It's better that the server does not have access to this information at all in the first place.



I have been wondering how censorship-resistant Matrix is. As far as I know, groups are not hosted on any particular server; it's some kind of shared consensus.

I suppose the moderators would still be at risk of hosting platforms and ISPs dropping their homeserver.


Matrix is great, but it leaks a ton of data.


To whom? AFAIK if you self-host, no data is leaked.


If you and everyone you talk to are hosted on servers you trust, down to the physical provider, yes.

EDIT: if the person you're talking to is hosted on Google, Google has all the metadata. Even if one person is hosted on Google and is part of a room you're in, Google has all the metadata in that room.

In 2022 e2ee is the standard for messages but not metadata, and unfortunately Matrix doesn't tackle it, so no, you can't expect 100% privacy.


Signal is hosted on Amazon


But Signal's backend doesn't store who talks to whom, so Amazon can't have that information. It also doesn't store what groups exist and who is part of them.


Nobody knows that except Amazon and Signal.

It would be interesting to know whether Amazon could reconstruct that information without Signal's knowledge.


So does Signal: who, when, with whom.

Matrix does leak more metadata, e.g. message relationships, but what does that matter?


That's false - Signal has had sealed sender on by default since 2018 (the setting you can turn on is just to see an icon in the UI).



>“A big part of our model is telling people not to take our word for it.”

That's rich, coming from a company that wasn't (and probably still isn't) running the code they made public.

>So if I want to fork Signal and make my own, I can just take the code and do it today?

>People do it. There are many of those. We don’t endorse them because we can’t guarantee or validate them — we don’t have the time or the resources for that. But yes, there are many out there.

Sure, but you can't use them with the same server cluster everyone else in the world is using, making it about as useful as a chocolate teapot.


> Signal makes its code open-source.

Is anyone aware of whether they are still obscuring spam related code?

For the downvoters:

- https://www.reddit.com/r/privacy/comments/qlw1ag/signal_is_a...

- https://signal.org/blog/keeping-spam-off-signal/


Okay, but so what? Why is this a problem? You can run a functional signal server without their spam filtering code.


It's a matter of trust. If I can't trust that it's truly open source, then it erodes some of the confidence I have in them. I haven't personally tried running a functional Signal server without the spam filtering code, so I don't know if it truly works without it.


I suppose people should decide for themselves if they take the word of a centralized service. Convenience is a factor after all.

For those who have small circles of friends they wish to chat with and want to minimize the number of ISPs their traffic traverses, I would suggest tinkering around with uMurmur [1], a lightweight version of Murmur. There are pre-built packages in several operating systems' package managers. The configuration is dirt simple [2] and the daemon is very lightweight, designed to run on home routers. Use Certbot to generate LE certs or just use self-signed ones. One TCP and one UDP port must be opened to the daemon, or forwarded if it is on a different device, with the default port being 64738. One can set a server-wide password to keep strangers off of it, or set passwords per channel. It took me about 2 minutes to install and configure uMurmur on Alpine Linux, and most of that time was spent deciding how I wanted to nest channels.

The mobile client is Mumla. Just put in the IP or hostname of the uMurmur instance. uMurmur is not E2EE, but if it is running on your own router or VM and you are talking with friends you know and trust, then maybe that is less of an issue, given the server is on your hardware.

The voice quality is incredible. It uses the OPUS codec and does not require much bandwidth. The client also supports text chat, but that can be disabled in the server if one so wishes.

[1] - https://github.com/umurmur/umurmur

[2] - https://github.com/umurmur/umurmur/wiki/Configuration
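If it helps, a minimal umurmur.conf along these lines might look roughly like the following. The paths, password and channel names are placeholders I made up; check the wiki [2] for the authoritative option names and the full list:

    max_bandwidth = 48000;
    welcometext = "Private server - friends only";
    certificate = "/etc/umurmur/cert.pem";    # from Certbot or self-signed
    private_key = "/etc/umurmur/key.pem";
    password = "changeme";                    # server-wide password
    max_users = 10;
    bindport = 64738;                         # open/forward TCP + UDP

    channels = ( { name = "Root";  parent = "";     description = "Root"; },
                 { name = "Lobby"; parent = "Root"; description = "General chat"; } );
    default_channel = "Lobby";

Point Mumla at the server's hostname and port and you should be in.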


Why doesn't the government improve its own machinery instead of trying to spy on its citizens? Do they think they'll improve society this way? Like China?


The first function of a state is to ensure the continued existence of the state. That is threatened a lot more by civil uprisings than by an external actor, especially in a time when international relations are relatively stable (current security events being the formerly-normal that is today's extreme). Another "storm on the Capitol" is far more damaging to the US as it exists today than a border skirmish with the Chinese over Taiwan.


The state wouldn't fear civil uprising if it made its citizens' lives half decent


Then it would take away from other powerful factions, which will then ... overthrow the state. The idea of a unified, altruistic populace which will put the average wellbeing over their own is utopian - in both senses of the word.

Reading suggestion: The Dictator's Handbook: Why Bad Behavior is Almost Always Good Politics, by Bruce Bueno de Mesquita and Alastair Smith


all about power


Tell me which country doesn't spy on its citizens.


I mean this is not a surprising announcement given that encryption is their only selling point over competitors.


Couldn’t one just do PGP over WhatsApp, over Telegram or whatever? Instead of sending a plaintext message you’d send the PGP message but using existing infrastructure. Then there’d only be metadata to harvest but content should definitely be secure.
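Concretely, the manual version would be something like this - a rough sketch driving GnuPG from Python, assuming gpg is installed and the recipient's public key (the placeholder friend@example.org here) is already imported and trusted:

    import subprocess

    plaintext = b"meet at the usual place, 18:00"

    # Encrypt to the recipient's public key; --armor gives printable ASCII output.
    armored = subprocess.run(
        ["gpg", "--armor", "--encrypt", "--recipient", "friend@example.org"],
        input=plaintext,
        capture_output=True,
        check=True,
    ).stdout.decode()

    # Paste this blob into WhatsApp/Telegram/whatever; only the holder of the
    # matching private key can read it. Metadata (who, when) still leaks.
    print(armored)

Only metadata would be left to harvest, though the usability cost of doing this by hand for every message is obvious.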


[OTR] would probably be the go-to for encrypting instant messages. But both that & PGP require a 3rd-party client (Pidgin/Adium/etc...), if the user wants to retain at least some usability.

And nowadays, every popular IM network wants 3rd-party clients off their net at all costs. Signal is on the better side of the spectrum here. While they explicitly say they don't support non-official clients connecting to the official network, this isn't rigorously enforced. WA, on the other hand, will permaban your whole account if you so much as try to use a 3rd-party client.

[OTR] https://en.wikipedia.org/wiki/Off-the-Record_Messaging


Android's Messages app does this. RCS isn't end-to-end encrypted, but Messages adds the Signal protocol on top.

https://www.gstatic.com/messages/papers/messages_e2ee.pdf


Big govt vs big tech, a battle that is being played out all over the world.

Any country that values sovereignty and sees itself as an independent actor rather than a vassal state must take the position India is taking, or else be susceptible to foreign-owned apps influencing its citizenry. The only alternative would be to insist Signal hire its security agents into product/moderator roles, so that oversight can be conducted in more implicit ways.

Not great for the users, but the world where globalised technology makes everyone friends is over


Two mathematicians could literally communicate encrypted on a piece of paper, and there would be no way to stop them other than scaring them out of doing it with threats of violence, jail or similar.

There is an aspect to this we have to acknowledge: we live in a world where everybody who knows how can create encrypted communications that are impossible, or at least very costly, to break. You cannot stop them from doing it.
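To make the pen-and-paper point concrete: a one-time pad needs nothing but a shared random key at least as long as the message, and, used once, it is unbreakable no matter how much compute the other side has. A minimal sketch with a toy message:

    import secrets

    message = b"meet at dawn"

    # The pad must be truly random, at least as long as the message, never reused.
    pad = secrets.token_bytes(len(message))

    ciphertext = bytes(m ^ k for m, k in zip(message, pad))
    recovered = bytes(c ^ k for c, k in zip(ciphertext, pad))

    print(ciphertext.hex())      # looks like noise to anyone without the pad
    assert recovered == message  # whoever holds the same pad gets the message back

You could do the same XOR (or modular addition) with pencil, paper and dice; no software, no ban enforceable.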

If we ban that kind of communication, the only persons making use of it will be people who really have something to hide. Criminals, drug cartels, financial fraudsters, terrorists and the like. Which also means they are the only ones who get to have safe communication channels.

Banning a safe and private way to communicate for billions of people because of some criminals that you cannot deal with is a very bad deal to strike. The way I see it, criminals most of the time have to interface with the real, physical world in some places, and that's where you would catch them if that was your goal: money transactions, buying and selling things, shipping, production, etc. Hell, if you check any police report ever, this is where they catch people.

That makes you wonder if banning encrypted communications for a whole population is not rather a thing you do for political (power) considerations.


I agree with you when it comes to regular crime, and I'm not finding it difficult to defend my right to privately communicate with other individuals and small groups. We have always been able to do that, one way or another.

Publishing is a more difficult matter though. Wouldn't you say that governments should be able to demand transparency when someone distributes information to entire populations?

If someone was calling on millions of followers to kill members of some minority, would you still take the position that the distribution channel must remain off limits to governments trying to enforce incitement laws?


I also agree that publishing is a different thing, it is very much the difference between private and public communication.

Where a private group becomes of a size at which posting there constitutes an act of publication is a matter of discussion, but I'd argue that below a certain size any group chat can still be seen as private communication.

This means Signal groups, with their lower size limit, are in less jeopardy than a one-to-many Telegram group.


>If we ban that kind of communications, the only persons making use of it will be people who really have something to hide. Criminals, drug cartels, financial fraudsters, terrorists and the likes. Which also means they are the only ones who got to get safe communication channels.

They wouldn't be safe, though. Merely using encrypted communication would be enough to warrant attention. Encryption only provides plausible deniability if everyone uses it for mundane stuff.


> They wouldn't be safe, though

Provided they are using the same channels as the rest of us. Which they will not do, unless they are stupid.


> be susceptible to foreign owned apps influencing its citizenry

Are they banning all of the internet, books, movies and everything else too? Your position is that a state needs to control what the citizens can consume otherwise it's a vassal state?

My country lived under a dictatorship with hard censorship before, and reading your comment, that was the vibe I got.


Nothing is getting banned; governments are asking for access, which Signal is signalling it will refuse, and it will withdraw service.

India is a democracy, like the US, which makes similar requests for access of its domestic tech/media firms, as well as demanding the same from the few foreign-owned ones which have had success in the country.


> Big govt vs big tech

Signal has 40 employees.


It does give the impression that the parent has no clue what they're talking about.


Signal does not have the keys either, so how could they accept moderators? Moderators of what, encrypted data streams?


I mean ok, but Signal is not big tech and provides orders of magnitude more positive value to society than say Facebook.


> Any country that value sovereignty and sees itself as an independent actor rather than a vassal state, must take the position India is taking, or else be susceptible to foreign owned apps influencing its citizenry. The only alternative would be to insist Signal hire its security agents into product / moderator roles, so that oversight can be conducted in more implicit ways.

Horrifying take: The implications from such a stance means that the State has ultimate power over the Individual, and that when in conflict, the State's desires for control & order override the civil liberties that the Individual should have, including the right to privacy.

When it comes to privacy, it should be difficult for the State to peek into the private lives of the Individual without explicit consent from said Individual. If not, the private rights of the Individual are as ephemeral as the State's rulings on said rights (e.g. LGBTQ+ rights, abortions, rights to firearms, civil asset forfeiture, eminent domain, government bailouts/loans, etc).


Would the US accept an Indian-owned app which was not subject to US legal jurisdiction?

The issue at stake is one of national security - we have seen that this trumps individual rights whenever the two are in conflict. Look at the creation of NSA post 9/11 - the people broadly accepted this in order to avoid another terrorist attack - and have continued to accept it even as the natsec apparatus continues to expand


> Look at the creation of NSA post 9/11

https://en.wikipedia.org/wiki/NSA was created during World War II, renamed to its current name in 01952, and played a key role in weakening DES in the 01970s.


Except Government of India seems to want more (unchecked) foreign influence.

>> Lok Sabha passes Bill to exempt political parties from scrutiny on foreign funds, without debate

https://www.thehindu.com/news/national/lok-sabha-passes-bill...


... so you're saying any country that is independent must violate the basic human right to privacy?


There is no such thing as a "human right to privacy", I am afraid. Rights need to be inviolable, otherwise they are temporary privileges which can be withdrawn at any time. The countries which talk most about human rights have a verifiable track record of transgressing them when convenient.


How can they exit India if they never entered it?

As long as they don't have a business entity there, they can keep running it, and it would be up to India to create a national firewall if they so wish; otherwise their citizens would still be able to access Signal without any issues.


I like that they are structured as a non-profit, makes me able to trust them a bit easier.


> If India passes a law or deems Signal to not be in compliance with whatever encryption regulation, will you walk?

Makes me think of my encounters with US encryption export regulations. I think we're all lucky those aren't enforced.


Reminder that Apple did this domestically for the FBI/iMessage.

They intentionally maintain a backdoor in the end to end crypto of iMessage:

https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...


This is incorrect.

iMessage is encrypted, the encryption is end-to-end, there are no backdoors.

The unencrypted backup is a clear and annoying hole in the apple privacy story, but the non-e2ee iCloud backup does not include any messages (SMS, MMS, or iMessage), Contacts, Calendars, Notes, iCloud Photos, or health data.

A local backup is local, and is protected by filesystem encryption on your local storage.

So no, iMessage has no backdoors. Basic research would have shown that this claim was bogus.


iCloud Backup is effectively unencrypted and on by default, and sends the end device keys to Apple. Apple knows this, and went about encrypting iCloud Backups (like Google does on Android) but then stopped to preserve this vulnerability to avoid antagonizing the FBI, leaving Apple in possession of cleartext endpoint keys of all of their customers. Approximately nobody uses local iOS backups because iCloud Backup is enabled by default (even if you don't want to use iCloud and only log in with your Apple ID to install apps in the App Store).

If the middle relay service has the keys, it's not end to end encrypted.

The non-e2ee iCloud Backup does include your photos, includes your messages (if you have Messages in iCloud disabled) or includes your message sync keys (which are equivalent to your messages) if you have Messages in iCloud enabled. Your photos, contacts, calendars, notes, etc are never end to end encrypted and don't even pretend to be.

Even if you disable iCloud Backup and do only local backups: everyone you iMessage with will have iCloud Backup still enabled, sending either the sync keys (Messages in iCloud on) or the messages themselves (Messages in iCloud off) to Apple in plaintext in the backup that is done automatically each night.


> iMessage is encrypted, the encryption is end-to-end, there are no backdoors.

How would/could we know? If the NSA doesn't have a backdoor (whether that's in the non-public iMessage code, the hardware RNG on iPhones, or somewhere else), what are they even doing all day?


We know authoritatively. Buy a new iPhone and create a new Apple ID. Install an app (which automatically logs you in to iCloud and automatically enables iCloud backup).

Wait 24 hours plugged in to AC power.

Remove the SIM card and throw the iPhone into the sea. Forget the password for the Apple ID.

Buy a new iPhone, and get Apple to reset the Apple ID password using PII/SMS/whatever. Log in and restore your iCloud backup.

Your messages are still right there, and you have provided no key material. Apple has them in plaintext or plaintext-equivalent.


Yes, this is the back door, if you want to call it that. iCloud backups are not encrypted and contain key material.


If my iPhone got run over by a truck, and I log in with my Apple ID on another iPhone, will I get my messages back?


Nope, unless you’re using iCloud backup, or your messages can be synced from other devices in the key ring.


No this is the whole point: if you are using iCloud backups, your messages are not backed up.

The _only_ online way to maintain your iMessages is (checks settings) "Messages in the Cloud", which is e2e encrypted - if you lose all your devices there is no recovery path except for the "iCloud Key Vault", but I honestly don't know if that can get you to a point where old messages can be recovered.


https://support.apple.com/en-us/HT202303

Please read this article. The Messages in iCloud sync feature is e2e encrypted - but the sync key itself for the e2e is included in the (non-e2e) iCloud Backup which is done every night.

From Apple themselves:

> For Messages in iCloud, if you have iCloud Backup turned on, your backup includes a copy of the key protecting your messages. This ensures you can recover your messages if you lose access to your Keychain and your trusted devices. When you turn off iCloud Backup, a new key is generated on your device to protect future messages and isn't stored by Apple.

This is actually worse. It means that Apple has the cross-device Messages in iCloud sync key in effectively unencrypted form from the backup the night before - which means that they can decrypt the iMessages as they sync between your devices in realtime. This enables realtime iMessage surveillance because of this backdoor in the end to end crypto (the effectively unencrypted iCloud Backup, which includes the e2e keys for the Messages in iCloud feature).

The technical term for this is "key escrow". When you escrow the key to the middle transit service that is supposedly end to end, you backdoor the end to end encryption.


Via that same logic Telegram has ironclad E2E encryption ‘because secret chats’.

99% of people have iCloud backups turned on.


Yes. You can also do this even if you lose your Apple ID password.


> iMessage is encrypted, the encryption is end-to-end, there are no backdoors.

In Germany, we have a saying: "Wer glaubt wird selig." which could be translated as something like: "Blessed are they who believe."


"Meredith Whittaker .. in 2018, she was an AI researcher at Google".

That is factually incorrect.


In reality, none of the popular apps that advertise themselves as a forefront of Privacy are safe.

Decentralized protocols are the only viable alternatives to 'normie' chat apps like Telegram & Signal.

Speek! is a new app that looks promising: https://speek.network/


How is Signal tied to the region? What does it mean to “exit” India, exactly?


It means they'll get banned from the app stores and have their server IPs blackholed.


This is one of the pros/cons of trusting a messaging service from one of the big internet players. Sure, Google/Facebook are profit-seeking enterprises, and won't put everything on the line for a privacy promise.

However neither of those companies is terribly easy to just blackhole. You can blackhole some IPs, but you'll also be causing your own business and government endless technical headaches since those IPs also serve local ads, shared folders and documents, government outreach channels, cloud services, etc.


App store, DNS, routable IP address.


"Says"


For now…


Here's one major problem with Signal - you cannot delete contacts.

Following scenario:

1) X communicates with Y using Signal trying to hide from Iranian police

2) Y is getting arrested and who ever is found to have his phone number is getting in to trouble as well

3) X deletes Y from its contacts

4) Y stays in X's contacts on Signal no matter what

Now what should X do? Delete Signal? Theoretically the police could reinstall it and see who X had in their contacts.

There have been several issues opened for this problem on GitHub over the years. They all get closed by their bot after a couple of weeks.

I have several ghost numbers and even ghost user names on my Signal clients. Super annoying and cluttering my list of contacts. For me Signal is just one option to avoid WhatsApp. But boy do I prefer Telegram ...


Usernames are under development right now so this will address this issue one way or another.


It's either one pile of poo or another, it seems... Hosting your own Matrix homeserver is probably not a thing you would like to do?


I have nothing against Signal, Moxie, etc — but it attracts high-value targets. As such, Signal is an extremely high-value target.

>> Signal knows nothing about who you are.

This is based on trust, not systematic proofs; Signal knows this, yet never tells its users. For example, Signal uses Intel's Software Guard Extensions (SGX), which is known to have multiple attacks, any of which Signal might be forced to run using a national security letter. They have also had multiple data breaches, which I am guessing the average user is not aware of. Moxie is still on the board, and he personally refused multiple times to remove the requirement for a phone number. Lastly, Signal's three board members have all been the leader of Signal in the last year, and I have been unable to find their most recent 990 non-profit disclosure.


> any of which Signal might be forced to run using national security letter.

NSLs can demand information but cannot compel action. The govt cannot use an NSL to force you into military service, for example, or force you to hack someone else’s computer (which is essentially what you are suggesting).


Regardless of whether they were "compelled" - national security letter-related activities have included actions.

For example, this included a room and a split:

- https://wikipedia.org/wiki/33_Thomas_Street

My understanding is that vendors that execute national security letters offer system integration technology to be locally installed to ingest the related data.

Signal has the data, it just requires them collecting it — hence my statement that Signal’s security is based on trust, not physics, math, etc.


Those actions were not the result of a national security letter. NSLs can only require the production of existing records. They cannot require the release of the content of communications nor the collection of records which don't already exist. They must also specify the time period the records request covers which can only include up to the date the NSL was received.


As I said, they're related. That is, the All Writs Act is a United States federal statute, codified at 28 U.S.C. § 1651, which authorizes the United States federal courts to issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law. A writ is a form of written command in the name of a court or other legal authority to act, or abstain from acting, in some way.

Also, my understanding is that the vendors executing National Security Letters can simply reissue them automatically to maintain persistent data feeds. Disclosing actions taken via the All Writs Act to enable national security letters would likely be a disclosure of them, similar to a warrant canary.


NSLs don't involve data feeds. The All Writs Act involves a judicial review and issuing of a judicial subpoena. A judicial subpoena may well involve ongoing data collection. NSLs are administrative subpoenas which aren't reviewed by the judiciary before being issued and limited in scope because of that. Actions taken "via All Writs Act" are not to enable NSLs.


This is correct, but the All Writs Act can compel third parties to help in a criminal investigation, like a phone company being asked to help wiretap a phone. Signal, by design, would at most have only the contacts that you shared and an IP address/access log. Apple was able to argue that they shouldn't be forced to push an update to unlock the phone of that terrorist in California, so there are limits to it, but they are not entirely clear.


The idea that Apple would not add backdoors privately, regardless of what actions they take publicly, is a stretch given that Apple has obviously been more than happy to tailor their systems to China's demands. The core difference between China and the US is that the US is fine with government spying as long as it's done in secrecy.


> Signal is rapid on its way to a billion users

The last number I could find is 40 million in 2021


You're correct; I removed that from the comment. Not sure where I got that from, but I was unable to quickly find anything that would be anywhere near a billion.


Signal does marketing by omission: "Intel’s SGX protects against introspection", "we depend on donations from our community", etc.


Would it be possible for you to expand on Signal not being dependent on donations?


"we depend on donations from our community" + a $105 million 0% interest loan (until 2068)¹.

¹ https://en.wikipedia.org/wiki/Brian_Acton#Signal


Are you suggesting Signal should not repay their debts, or that the best time to do so is not as soon as possible? Even if there's no interest, that loan should be repaid as soon as possible in my opinion. The 990s are public, that loan was publicly acknowledged, and I see no issue with a non-profit raising donations or repaying its debts.

All that said, if it was up to me, Signal would have their 990s on their site, annual publicly shared audits, etc. - which to my knowledge they do not.


> Are you suggesting Signal should not repay their debts or the best time to do that is not as soon as possible?

Cutting through the tug-at-the-heartstrings sales pitch: Signal does not currently "depend on donations", nor will they for at least 45 more years.

It is reasonable to expect the entire truth when asked for money, instead of getting Wikipedia'd.


They also received a very large amount of funding from some successful tech founders, Brian Acton and Pavel Durov


My understanding was that the funds from Brian Acton were a loan to be repaid, though I might be wrong. I was unable to quickly locate any information relating to Pavel Durov having any financial relationship with Signal.


That's not how NSLs work.


Whenever India and its authoritarian stances are mentioned, a number of folks (I presume Indians, both on HN and elsewhere) come out of the woodwork to sing praises of "national security" while saying nothing about how such power can be abused.

It is truly sad to see that an entire populace can't see the perils of a government with broad-reaching powers, when government institutions jailing the opposition, censoring the press, and suppressing minorities are rather commonplace there[1][2].

[1] https://en.wikipedia.org/wiki/The_Emergency_(India)

[2] https://en.wikipedia.org/wiki/Violence_against_Muslims_in_In...


Well, that's what we call the power of propaganda. I say that as an Indian (still living in India).

I honestly couldn't do anything, even after writing letters and talking to friends who can talk to the people who make these shitty decisions; because there is no larger sentiment against these decisions, things just go by.

And now I have to say I am not sure when the general public will understand this issue, or whether they ever will. Are we just too lazy to do anything about it, or too divided to put up a fight...

Truly sad to see. Taking away rights to privacy is not the way of fighting the wrongs and problematic elements, and considering how corrupt our system can be I have no hope of it not being abused wildly. :(


Btw, if anyone here is from Australia: you guys had a similar bill a few years ago, correct? What happened after that? I am curious whether there is any hope of overturning this stuff from the sheer corporate backlash.

Or, maybe I am too optimistic about people who sold their souls for money, making a stand.


Yeah, the Assistance and Access Act (2018) passed and is still the law of the land. It gave law enforcement, intelligence and the government the ability to compel Australian providers to backdoor their services. Can't tell you how often it's been invoked since it also allows for gag orders.

Probably only 1 out of 10 Aussies would have even heard of it. The mainstream media here are about half a dozen ideologically aligned corporations who are not the type to ask hard questions, and the average Australian is focused only on their wealth, their family or their recreation.


I guess now we share that plight.


Australian here. What the Assistance and Access bill did is different, and clever in some ways. TL;DR: it doesn't attack encryption directly - it doesn't give the government power to direct anyone to hand over keys, for instance. In fact the bill specifically prohibits the government from asking anyone to introduce a "systemic weakness". A systemic weakness is something that would allow the government to spy on everyone - which is what India seemingly wants to do.

But as I said, it's clever. It can specifically prohibit introducing new "systemic weaknesses" because they already have one that's more than good enough for their purposes: silent security updates. They have given themselves the power to compel a tech company (Apple/Google/Microsoft) to install a silent security update on a specific device. The "silent security update" would of course be a bug (spy intercept) of some sort. It doesn't bypass encryption because it doesn't need to - a human can only consume unencrypted data, so that's what the spy bug intercepts.

They've ensured that it will never be systemic to their own satisfaction by putting several hurdles in place, like independent judicial review of the bug and of which devices will be targeted. The fundamental principle is that the only acceptable reason for targeting someone is criminal activity. If those hurdles are respected (and it seems likely they would mostly be followed), it means the China-like surveillance society India seems to be trying to create would be very difficult in Australia, even with this law. Which I guess would make it a reasonable compromise between government privacy invasion and law-and-order concerns.

The flaw is that it's impossible to know if they are being respected. All companies and people forced to inject this spyware are automatically subject to a gag order. All the review I mentioned happens in secret, and they have specifically exempted themselves from publishing any meaningful information on who, what, why and how devices are targeted.

To finish the picture - if the Australian government was concerned about criminal behaviour happening over Signal, it's highly unlikely they would be approaching Signal as India has apparently done. (I can't for the life of me think of a reason why Signal would give a shit about what the India government thinks or wants. Ditto the Australian government.) Instead they would direct Google to inject keyboard and screen monitors into Android. Google makes a lot of money in Australia, so it's likely they would comply. Like I said - it's clever.

But not impossibly so. It only works if they can target a particular device. For a commercial product this is invariably easy - Apple, Google and Microsoft all want you to sign in with an identity so they can milk some profit out of it, either by charging you or at least by displaying advertising. But open source projects, like Fedora or Debian, go out of their way not to identify their users, and worse (for the spies), Debian creates audit trails like reproducible builds. So their users are largely immune to Australia's Assistance and Access Act (2018). But they aren't immune to India's rubber-hose approach.


Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data.

https://news.ycombinator.com/newsguidelines.html


It's propaganda. It's the same reason why a large portion of Indians (even with full internet access...) are fervently pro-Putin in the context of the Ukraine-Russia war.


Saying that people who disagree with you do so because of propaganda isn't a helpful way of looking at the world.

EDIT: Just saw the company I am keeping. This is not a pro-war comment.


It really feels like a pro-war comment. You seem to be missing a lot of context, at best.


We're going a bit off topic here but I'll expand a bit.

If we both (presumably) start from the premise of "this invasion is a bad thing", then following this up with "anyone who supports it is just a moron duped by propaganda" and/or "the Russian administration is simply crazy" isn't helpful. This did not happen in isolation; it happened against a backdrop of decades of Western failures in diplomacy & deterrence.

That's worth examining if we wish to prevent future wars.


I said neither of those things


Apologies for strawmanning you


I'm not an Indian, but I can understand why a country that was oppressed by Great Britain for such a long time, and endured such atrocities, can opt to support the party that stands on the opposite side from Britain.

That alone could explain why not only India, but big parts of the world, especially the ones that suffered from British and American imperialism, would rather see an arch-enemy of their usurper win against the oppressor's puppet state.


India was the chief non-aligned country of the Cold War, and for decades has had warmer relations with the Russian Federation (and its predecessor, the USSR) than with the West.

https://en.wikipedia.org/wiki/India%E2%80%93Russia_relations


I understand this. I'm simply asserting that if one believes in this political drivel, ignoring the onslaught happening in Ukraine, then you've not progressed as a human much beyond the middle ages.


Unfortunately, it's how humans work. As a species, we did not progress beyond the Middle Ages. Our tech did, but we didn't. Still the same tribalism as before.

When we plunder the other village, take their crop and their wives, it's to bring justice to the world. When they come to us, take our crop and our wives, they are evil aggressors.

Nothing has changed, and probably will not change anytime soon. Thousands of years of evolution are still stronger than whatever thin layer of civilization we try to put on top.


90% of the world (i.e. nearly everybody outside the West) has a similar position on the Russia-Ukraine issue as India, though. Is 90% of the world propagandized? Or can they have a legitimately different perspective on things?


This is inaccurate at best. Most Asian countries have shown support for Ukraine. Most African countries as well, with a few notable exceptions.

There's no "different perspective" when it comes to slaughtering innocent civilians and other war crimes.


The UN vote on not recognising the annexation was a crushing defeat for Moscow, they only got votes from Syria, NK, Nicaragua and Belarus.

https://news.un.org/en/story/2022/10/1129492


Countries can have more complex positions than the simple "Ukraine good, destroy Russia" (or its inverse, "Russia good, invade Ukraine") dichotomy that many in the West go for. Many countries don't approve of the invasion of Ukraine, but also don't like NATO very much and show some understanding for Russia's position.

Most countries wish to be neutral on this matter, continuing to trade with both Russia and the west, i.e. similar to India's position. Look at how many countries chose to sanction Russia. Of those who voted against Russia, most who chose to sanction are western countries. Very few Asian and African countries imposed sanctions. 87% of countries chose not to impose sanctions.

Re Africa, see Why African Countries Had Different Views on the UNGA Ukraine Resolution, and Why This Matters — Center for Strategic and International Studies: https://www.csis.org/analysis/why-african-countries-had-diff...


> WhatsApp does not protect metadata the way that Signal does. Signal knows nothing about who you are.

This isn't strictly true though, no? Signal knows your mobile phone number.

I appreciate that this is for facilitating usability, but it's still a piece of metadata that can specifically be used to identify you - and Signal knows it. You can't even use Signal without verifying a mobile phone number AFAIK.

They really shouldn't be making false statements like this. It undermines their (otherwise admirable) position.


I think this means, assuming it's not a burner SIM bought with cash, that someone could "only" identify that you were a Signal user. They wouldn't be able to see your social graph, who you contacted and when, from where, what was said, or what groups you were in.

I also thought I’d read on HN recently that they were looking at making this optional in the future but don’t quote me on that !


Tech companies will make all sorts of noise when India, China, or Turkey ask to have access and control over the data of their own people. However, when the NSA comes around with its secret laws and courts, they all bend the knee ...


That's because the NSA doesn't make big public statements. They just install a little black box in your servers and you don't say anything.


That's because when the NSA comes knocking at your door, you are also prevented from disclosing the fact that the NSA came to visit. Unless you like the hospitality of prison, that is.



> Some things are a bit Boolean, as Whittaker states above, but some things are simply binary.

Eh, what's the difference?

It's better to read the original article: https://www.theverge.com/23409716/signal-encryption-messagin...



Is it just me, or is India becoming the next China?


People really have warped perceptions of different Asian nations. As someone who has been a resident of most of them: they are mostly all like China... which shouldn't be surprising, as there is a lot of shared cultural history in the region and by and large they are more collectivist societies.

Some are obviously worse (looking at you Myanmar) but most are various shades of authoritarian (Singapore, Thailand, Vietnam, etc) and most have heavy censorship (yeah the same culprits again).

This shouldn't surprise anyone and yet it always seems to.


This talk by professor Jeffrey Sachs at the Athens Democracy Forum couldn't be more timely. https://youtu.be/Ec2E4k1K52E

Different countries have long, deep-rooted political cultures that go back centuries. "Democracy vs authoritarianism" is the wrong lens: different systems are complex and cannot be ranked on an easy scale like we want them to.


It's not the wrong lens, it's a lens that authoritarian governments and their supporters (https://en.wikipedia.org/wiki/Jeffrey_Sachs#China) don't want people to use.


Interestingly, those who push this sentiment the strongest are also often those who deny others any sort of legitimate perspective, using methods that boil down to character assassination rather than argumentative refutation. It seems that such people want to monopolize what constitutes the truth. That seems a bit... authoritarian to me?


I'm not calling to put him in the reeducation camp or a prison. Reputation is a mechanism of self-regulation in free societies, so it's the opposite of authoritarian.

There is nothing to refute, he's just being manipulative. Politics is "culture" + all "culture" is good and should be accepted without questioning => authoritarian governments are just as good as democratic ones and should not be judged. That's plain nonsense.


> Reputation is a mechanism of self-regulation in free societies, so it's the opposite of authoritarian.

This "reputation" you referred to is nothing more than showing that there are people who have yelled "look, he sides with the enemy!" Nothing in that article you linked is a substantial refutation of his points. Everything is based on attacking his character merely for associating with countries that certain people don't like.

You may recall that "guilt by association" is certainly not a feature that free societies ought to have.

> Politics is "culture" + all "culture" is good and should be accepted without questioning

That is not the point at all. The point is that political systems cannot be easily changed because culture is enduring, meaning we shouldn't be lured by the fantasy that we can overthrow some dictators and voila we have democracy and everyone lives happily ever after. This should be super-obvious if you look at attempts to drop western-style democracy in the middle east: places are now worse off than before.

Another point is that political systems are complex, partly due to tradition and culture, and they cannot be force-fitted to be ranked on an easy, feel-good "authoritarian vs democracy" scale. This means it's a better idea to deal with those systems by talking and engaging with them, and understanding their nuances and merits, rather than taking a forceful "only my perspective is right and everyone else is wrong" approach.

This message is far more nuanced and constructive than you give him credit for. Your approach, "destroy all whom we deem authoritarian; don't listen to them in any way", has empirically resulted in an immense amount of suffering over the past few decades, and has created a large amount of resentment in the global south.

It also doesn't help that many who insist on the "authoritarian vs democracy" frame are also those who align with, or at least fail to distance from, US regime change foreign policy, which is known to overthrow democracies and install dictators as long as those dictators favor US interests, while at the same time spinning such efforts as "spreading democracy and fighting for freedom". At their best, such figures often hyper-focus on "democracy", but turn a blind eye towards countries' inability to develop and inability to solve poverty due to how weak they've become after democracy was installed. All this too has generated a large amount of resentment in the global south.


And I will still likely never use it unless it brings back SMS support in the app.


Fucking lying is what they do! How come, if they do not share data with third parties, a "break-in" at Twilio would have any consequence for Signal?

The app store claims data is not shared with 3rd parties, yet they use Twilio to "verify phone numbers".

How about verifying them yourself? Too expensive...? So Twilio does it for free for you? What the fishy fuck is going on here?

Sure, they do mention verifying the numbers with a third party in their terms & privacy policy, but the rest of their communication/advertising is just very misleading and fishy, claiming they don't keep any data on you.

I never liked or trusted Signal, especially not with Marlon "no federation for u" Spike, and not with auntie "I used to work for the FTC" either.



And? Is it a free service? ... No, it's not.

What is your point?


Dear messaging services (Signal, Whatsapp, etc.)

You don't get to say "hey our messaging service is more secure than x, y or z"

That kind of statement can only be made by external, independent researchers, not the marketing department.

This thread alone has already pointed to a number of safety/security issues that are neither obscure, unreasonable, nor able to be addressed by the user.

Additionally, I wouldn't even describe Signal as the safest option available in comparison to the many widely available E2EE chat programs.


A noob question: how does Signal know about/prevent use of its app for illegal/criminal activities?

The larger question here would be: do governments and security agencies need to keep tabs on social media to check for illegal activities?


Illegal under which laws. Qatari law, which outlaws homosexuality? Russian law, which prohibits referring to the "invasion" of Ukraine? Or do you assume everyone around the world is subject to US law, so it's fine for all our messages to be intercepted and checked for compliance with US law?


> A noob question, how does signal know/prevent use of its app for illegal/criminal activities?

If you allow for privacy you allow for illegal usage. There is no way around it.

> Larger question here would be - Do governments and security agencies need to keep a tab on social media to check for illegal activities

They can do it the old-fashioned way. Also, every compromise on security can be used by bad actors as well, and in some cases the government is that bad actor, even in 1st world countries.


The same way a spoken language is a communication tool and doesn't restrict what is discussed.


How do roads prevent speeding, drug trafficking, human trafficking, fraud, etc.?


What about phone companies?


This is answered in the article.


I do want to hear more about how signal and other companies are working to prevent their apps being used by bad actors, terrorists.


I do want to hear more about how toll road management companies are working to prevent their roads being used by bad actors, terrorists.

Food manufacturers should also ensure that they are not contributing to feeding bad actors...

People like you being so easily manipulated by governments are extremely dangerous to democracy.


Hey, hey, hey, don't give these companies any ideas. Chase bank already closes your account if you say something they don't like.


Next we would like to hear more about what the owners of open spaces are doing about bad humans using them to communicate spoken words over the air.


The greatest mystery to me is how anyone can think you can somehow stop 'bad actors' from communicating over the Internet other than banning it as a whole.


Exactly! "Oh no, now I will have to use an illegal app to commit my illegal activities".

Banning encryption doesn't stop bad actors from communicating over encryption, it just adds to the number of illegal things they did.


How do car manufacturers prevent their cars from being used by terrorists?


Meredith Whittaker was always an activist, and her being in charge of a big product doesn't change that; it just increases the risk. I don't believe for one second that, if push comes to shove and she thinks it's an important cause, she won't wield her power to cut through the supposed security of Signal and expose "undesirables" to the light of day and leak all their private info.

I would not trust this product anymore, especially if you are not a hard leftist like her.


Signal is already compromised. I don't understand why people still keep fooling themselves it's private and secure.

It requires a mobile number, and thus your identity is known and your device is uniquely identifiable anyway, and it's also developed in the US where three-letter agencies have infinite reach and control.


Yes, your signal identity is tied to a phone number. But attackers can't tell who you are talking to, or what groups you are in. So why does it matter?


Eh, good riddance. Unless they move away from the mobile number requirement, they are as bad as Telegram or WhatsApp, as the local police have been known to phish for phone numbers and physically harass people and whatnot. https://thenextweb.com/news/kashmirs-police-want-people-to-r... https://kashmirobserver.net/2021/12/23/police-warns-social-m...

https://timesofindia.indiatimes.com/city/agra/in-up-district...

https://theintercept.com/2020/12/06/kashmir-social-media-pol... this is for tweeting, but Twitter is a public platform, and the same goes for your "mobile number", because Twitter forces 2FA that makes you provide your mobile number for "security" reasons.

What I am trying to say is, this is not about police needing 100% access to find pedophiles. This is about crushing dissent, the same Big Brother treatment people in India are seeing unless they join the ruling party. This is out in the open now and has been for some time.

This is the way of the government to force big players to comply.

Let's say tomorrow Signal and WhatsApp BOTH remove the mobile number requirement. Then, unless the ISPs are made to identify their specific traffic and block it on demand, ALL the users would be independent of local Indian administrative action.


Neither Signal, WhatsApp, nor Telegram is particularly secure. On Signal, for example, your private key is protected only by a verification text message and a short PIN. That's very low entropy compared to a full Ed25519 keypair. It's better than SMS, but don't be fooled into thinking you have Snowden-level opsec because you use one of these apps.

Use PGP and email.


> your private key is protected only by a verification text message and short PIN. That's very low entropy compared to a full ED25519 keypair.

Can you explain what you mean by this? In practice, I can see the argument that the only verified identity (phone number) for most users is the one protected by a flimsy SMS verification message, but I don't believe that that implies your private key is terribly vulnerable to e.g. SIM swap attacks. Or perhaps I'm misunderstanding you.

Put another way: if you rigidly adhere to out-of-band-verified pubkeys for contacts, you should be fairly safe. True that Signal's UI makes this hard to do, but that's a different conversation than "your keys are only protected by SMS verification".


> True that Signal's UI makes this hard to do

Do you think so? The app provides a "Safety Number" and QR code with built-in scanner to verify that yup, your device and their device have nobody in the middle. It has a visible reminder that you checked this person's actual identity (if you did) and a message appears if the keys change. If you are "rigidly adhering" to protocol your response should be to arrange an in-person meet-up to reconfirm, not "New phone? Cool".


I agree that verification is easy. I think they have softened the warnings and behavior surrounding key changes to accommodate their users that have virtually no understanding of crypto (i.e. the majority of people). It's an understandable position for them to take, but yes, in my opinion the alarm bells when a key unexpectedly changes are more oriented towards casual users than a (to paraphrase c7DJTLrn) Snowden-level user. For the latter, I expect behavior more similar to an OpenSSH key mismatch (which is quite a bit more strongly worded than Signal's).


This is completely wrong? Do you really think the first text message they send is some crypto negotiation?

PGP and email are extremely error-prone and have zero forward secrecy, and the web of trust is completely broken as a practical matter.



