What the hell are you talking about? I'm not libertarian or ranting, I'm saying this is a complex issue and the "LETS REGULATE IT" feels premature given how complex and subtle the issue is, and how hard laws are to change once they're on the books.
Methinks you're projecting more than a little bit. Also your comment is some of the most toxic I've seen on HN in a little while, and I'm not exactly a ray of sunshine on here...
Definitely getting a flag from me, and I think that might be a first.
> I'm saying this is a complex issue and the "LETS REGULATE IT" feels premature given how complex and subtle the issue is, and how hard laws are to change once they're on the books.
There is a difference between "This specific regulation fails to take X into account," or even "This is a complex issue which requires carefully crafted regulation," and your stance, which seems to be that regulation is impossible or undesirable due to the complexity of the issue, or because it might be annoying to change later. That very much suggests your problem is with regulation itself.
Just out of curiosity, if you think regulating this is a bad idea, what do you propose to fix the problem? Appealing to the good will of companies who would gladly exploit us to make money?
I suppose it was somewhere between when you said that government shouldn't regulate or try to fix the problem until they have a better understanding of it ("this is a complex issue and the "LETS REGULATE IT" feels premature given how complex and subtle the issue is...We need to understand the full scope of the problem, rather than try to fix what isn't fully grasped") and that you don't think the government is capable of doing exactly that. ("I have very little faith in the US government to accurately grasp the subtleties and nuances of the advertising and privacy problems")
I acknowledge that you may have just worded things poorly, or intended your words to mean something else, which is why I prefaced my question with "If".
If you actually do think the US government should regulate this, how should that be done, given your concerns about their inability to understand the issue and taking into account the fact that any mistakes they make will take effort to correct once they've been put into law?
Are you open to the possibility that you treated me overly harshly, and that I've never actually taken the position you seem to badly want me to have taken? Or is that beyond the pale, and this is exclusively my fault?
To try and contribute to this conversation a small amount, it makes the most sense to me to establish a Professional Engineer-like system for the software industry. I don't know all of the details about how these systems work, but a friend of mine is a structural engineer, and as he's described it to me, it creates a personal ethical obligation that currently feels entirely absent from the software industry.
This ties regulation to the industry and lets the experts decide what is and isn't okay, while also involving personal ethics and keeping individuals accountable for their decisions.
Keep getting called by a spam bot? A licensed engineer can be tied to that system, and can be punished.
A self-driving car had a bug where it couldn't tell the difference between a plastic bag and a child, forcing the vehicle into a concrete barrier? Someone stamped that code, and is now accountable.
Is a company using your tracking data to raise your medical bills based on your online food order history? That company's engineers are liable, you can sue them personally.
Again, I'm not sure of the details of how this works in the construction industry, but when you put professionals at personal risk, they tend to care more about the outcomes, and it doesn't end up forcing the US government (specifically legislators) to understand all the details.
I'm fully open to the possibility that I've misinterpreted some of what you've said.
I wouldn't mind seeing that kind of system applied to some software companies. I'm not sure that fully solves the problem, but I really do like the idea of having someone willing to be held accountable for problems.
I think one issue with this would be that companies who keep their activity opaque enough are highly unlikely to be caught by users. A major data leak or a hacked system might be detectable, but a backroom deal to share data with a medical or insurance company would likely go undetected without a whistleblower to expose what was happening. If a bridge collapses, or a fire starts, or a floor sinks, it's a little easier to see when building/fire codes weren't followed.
I think a system like this would be best used when some rules are already in place for what a company can and cannot do. Then a company's certified security/privacy person would be responsible for making sure everything was done in full compliance with whatever guidelines were established. This would probably be needed anyway if you tried to sue the company's engineers later. If nothing they do with your data is actually illegal those lawsuits won't get very far.
I do agree that compliance checking and having that level of accountability could help developers rein in marketing teams, greedy shareholders, and stupid managers who put pressure on them to add tracking and anti-consumer code to their products.
I do worry about what it would mean for small startups, volunteer/personal projects, and single developers. It might be a more secure world if anyone who wants to write an app can't just slap a boilerplate notice that they aren't responsible for anything if you choose to use it, but my guess is that there might also be a lot fewer apps.
That's what regulation does, it restricts the players in a field to the ones who have the resources to follow those regulations. It is definitely a limiting factor on an industry, but I thought we were sort of assuming that the rampant "anyone can play" nature of software was the source of many of the industry's problems right now.
My point is that if someone has to do it, I would A) want the industry to police itself (let me finish) and B) would want individuals to be unable to hide behind corporations, to the extent that is reasonably possible. These together could, I believe, make more of a dent in the problems in the software industry today than any one specific topic of legislation (e.g. privacy laws or digital advertising laws) could.
Imagine if you could lose your software license for writing adware! Would it stop everyone/everything? No, definitely not. Would it give your average corporate software engineer a leg to stand on when they try to say no to their bosses when the business wants to start selling customer data unethically? Hell yes!
"I could lose my license if I add that feature." <- huge impact.
I mean, you saw "regulations are written in blood" (people died, and that's what got them passed) and thought a proper response was "let's not have any regulations at all, because they're hard to get rid of and slow business down," so you're at least taking the libertarian position, regardless of whether you identify as a libertarian.
And no, nobody else has ever interpreted "regulations are written in blood" in that fashion, nor is that a valid interpretation of that saying. That is in fact the opposite of what the saying means.
There are wrong answers to questions, I understand that it may be uncomfortable to be told that but it doesn't change the fact that nobody else ever interpreted that saying that way, nor does it really make any sense to interpret that way.
Your jumping into semantic arguments, along the lines of "well, I said it, therefore it's a viewpoint," is especially unproductive to the overall discussion. Like, sure, but (a) that's not relevant, and (b) not all viewpoints are good; a viewpoint isn't valid just because it's a viewpoint. It's also where you start seriously leaning into sealioning/tone arguments instead of, you know, discussing why regulations are written in blood.
Hope this helps. I'm trying to keep this as even-keeled and matter-of-fact as possible, which certainly poses a challenge once you've poisoned the well by throwing around accusations of toxicity. That's a burden on me: everything I argue now has to be sugarcoated lest it further reinforce you as the victim, which is of course the whole reason you dove into tone arguments.
That is part of the problem with "no offending anybody" type rules: it becomes very easy to cry "toxic" and play the victim, and that lowers the quality of discourse, because nobody's opinion can ever be wrong lest its holder get offended.