I'm fully open to the possibility that I've misinterpreted some of what you've said.
I wouldn't mind seeing that kind of system applied to some software companies. I'm not sure it fully solves the problem, but I really do like the idea of having someone who is willing to be held accountable when problems arise.
I think one issue with this would be that companies that keep their activities opaque enough are highly unlikely to be caught by users. A major data leak or a hacked system might be detectable, but a backroom deal to share data with a medical or insurance company would likely go undetected without a whistleblower to expose what was happening. If a bridge collapses, a fire starts, or a floor sinks, it's a little easier to see that building/fire codes weren't followed.
I think a system like this would work best when some rules are already in place for what a company can and cannot do. Then a company's certified security/privacy person would be responsible for making sure everything was done in full compliance with whatever guidelines were established. This would probably be needed anyway if you tried to sue the company's engineers later; if nothing they do with your data is actually illegal, those lawsuits won't get very far.
I do agree that compliance checking and that level of accountability could help developers rein in the marketing teams, greedy shareholders, and stupid managers who pressure them to add tracking and anti-consumer code to their products.
I do worry about what it would mean for small startups, volunteer/personal projects, and single developers. It might be a more secure world if anyone who wants to write an app couldn't just slap on a boilerplate notice disclaiming responsibility for anything that happens if you choose to use it, but my guess is that there would also be a lot fewer apps.
That's what regulation does: it restricts the players in a field to the ones who have the resources to comply. It is definitely a limiting factor on an industry, but I thought we were sort of assuming that the rampant "anyone can play" nature of software was the source of many of the industry's problems right now.
My point is that if someone has to do it, I would A) want the industry to police itself (let me finish) and B) want individuals to be unable to hide behind corporations, to the extent that is reasonably possible. Together, I believe these could make more of a dent in the software industry's problems today than any one specific piece of legislation (e.g. privacy laws or digital advertising laws) could.
Imagine if you could lose your software license for writing adware! Would it stop everyone/everything? No, definitely not. But would it give your average corporate software engineer a leg to stand on when saying no to bosses who want to start selling customer data unethically? Hell yes!
"I could lose my license if I add that feature." <- huge impact.