Hacker News

My first question is how much of a burden is it really? Are we hearing the squeaky wheels, or is it actually pretty bad?

My second question is how much does it help? It's fine to say that it codifies practices that companies mostly do anyway (and if so, how bad can it be?), but it was also a response to some troubling behavior in the market. How many problems does it prevent for the burden it exacts?



> My first question is how much of a burden is it really? Are we hearing the squeaky wheels, or is it actually pretty bad?

It's pretty bad. It was bad enough that we had to hire multiple full-time people on our side just to deal with the interactions: people with engineering backgrounds who basically just did paperwork, and who could have been doing much more useful things given their knowledge and experience.

> My second question is how much does it help? It's fine to say that it codifies practices that companies mostly do anyway (and if so, how bad can it be?), but it was also a response to some troubling behavior in the market. How many problems does it prevent for the burden it exacts?

It's important to remember that there are two aspects to SOX: operational and financial. I don't have a lot of experience with the financial side, other than to say it has just as much overhead; perhaps it prevented a lot of things.

But from the operational side, it made us do things in bad ways so that we could show the auditors, and also slowed us down. For example, production access to financial data must be limited so that it can't be modified in production after the transaction but before it gets to the financial systems. Sounds like a good idea, but then when you have an outage, you have to scramble to find multiple people to unlock the access keys and watch over your shoulder while you make fixes on production systems.

Or instead you rearchitect your entire system so that only a few machines actually handle financial transactions, keeping the rest out of scope.

Either way, it's a huge burden.
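The multi-person unlock described above is essentially a "break glass" gate: production access opens only after several distinct people sign off. A minimal sketch of that idea (the class and names here are hypothetical, not any real access-management tool):

```python
# Toy sketch of a two-person "break glass" gate for production access.
# Real systems would back this with an access-management service and
# audited key escrow; this just shows the approval logic.
class BreakGlass:
    def __init__(self, required_approvers=2):
        self.required = required_approvers
        self.approvals = set()   # distinct people who have signed off

    def approve(self, approver):
        self.approvals.add(approver)

    def unlocked(self):
        # Access opens only once enough *distinct* people have approved.
        return len(self.approvals) >= self.required
```

The set ensures one person approving twice doesn't count as two approvers, which is the whole point of the control, and also why it's such a pain during a 3 a.m. outage.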

Another great example is password rotation. The law demands you have a password rotation policy. It doesn't say what that policy should be. Most auditors have settled on 90 days. Most researchers have shown that forced password rotation is bad. Without SOX, I would just follow the recommendation of the people who actually used science to figure out that password managers are better than password rotation. But with SOX, I either just follow the auditor's rote checklist, or spend a whole bunch of time convincing them that my policy is better than rotation. Either way, a bunch of overhead, either for me or for all my coworkers.


> Sounds like a good idea, but then when you have an outage, you have to scramble to find multiple people to unlock the access keys and watch over your shoulder while you make fixes on production systems.

While I can understand that this seems like a huge pain for a legitimately acting company, it's exactly the type of thing I would want to see a law like this enact. Sure, it slows down fixes when there are problems, but it sounds like it might have helped with past stuff like the Crazy Eddie fraud[1]. There's likely a spectrum between what happened there and what a respectable company like yours does, and restricting who holds the keys to change this data, plus requiring oversight while doing so, likely helps quite a bit with operational incompetence as well.

> Another great example is password rotation. The law demands you have a password rotation policy. It doesn't say what that policy should be. Most auditors have settled on 90 days.

I've experienced this with PCI compliance. It's annoying, but you have to realize that there are a lot of people and companies out there that have no idea about proper security, or continuously deprioritize it in favor of some other thing they need to get done. It's never a big deal until it is, and then it's a huge deal. Making it mandatory, even if it overshoots a bit and is more cumbersome than it needs to be, is beneficial overall because there are a lot of people like I described.

And that's sort of how I see these laws overall. For the people that are already mostly compliant, it's burdensome. But if those people and companies are actually 10-20% of the market, and the other 80-90% aren't really following best practices and are ripe for problems, whether security, management, operational or criminal, then I don't really care if it's somewhat burdensome for those 10-20%. If the companies already using best practices are 80-90% or more of the market, then sure, the law might be too burdensome for the benefit it confers (but it might not, depending on how bad the problems it prevents are).

And that's really the crux of the issue. How burdensome the law is to responsible companies is irrelevant without the context of how often it is useful to force the hands of irresponsible companies. How big is the group your company, as a responsible actor, represents? That's the missing information here.

1: https://news.ycombinator.com/item?id=13844316


That's a fair point. I've only ever worked for responsible companies.

If the checklists are actual implementations at the other companies, then yes, maybe there is some value there. But then the law could still be improved to allow a little more leeway for companies that are responsible. I don't know how that would work though.


> But then the law could still be improved to allow a little more leeway for companies that are responsible.

Bad companies already try their darndest to present as responsible, and if they never succeeded there wouldn't have been call for these regs in the first place. I don't think making a determination as to who "is responsible" and loosening requirements makes any sense - Enron and WorldCom would probably have been "responsible".

Better would be to try and align the regulations so that they have minimal friction while people are acting responsibly. Maybe you don't need a key and active oversight to make a change - maybe the change can be logged non-destructively and audit can happen after the fact.
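The "log non-destructively, audit after the fact" idea above could be sketched as an append-only log where each entry is hash-chained to its predecessor, so an auditor can later detect any retroactive edit. This is a hypothetical illustration of the concept, not anything SOX or any auditor actually specifies:

```python
import hashlib
import json

def append_change(log, actor, change):
    """Append a production-change record, chained to the previous entry's
    hash so after-the-fact tampering is detectable."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"actor": actor, "change": change, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Auditor re-walks the chain; any edited entry breaks every
    hash from that point forward."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if expected != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

The point is that engineers keep the emergency access they need during an outage, while the control shifts from "prevent the change" to "prove the change wasn't quietly altered later."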


>Another great example is password rotation. The law demands you have a password rotation policy. It doesn't say what that policy should be. Most auditors have settled on 90 days.

This is false. SOX does not require that you have a password rotation policy. It requires that you have a password control policy, which may or may not include password rotation.

Likewise, I've worked at a company whose idea of SOX compliance was mandating a VP to sign off on every code commit, and I've also worked at a company whose idea of SOX compliance was mandating that someone sign off on every code commit. The former had a 90-day password rotation policy; the latter didn't have a rotation policy at all.


In my previous job I was involved with IT at a large corp, and here is the explanation I got:

Yes, it is false, but SOX does call for strict internal controls on financial information. That gets interpreted as: password rotation is necessary so that people who are no longer allowed to access information actually lose access to it. Thus password rotation is "required."

My IT friends and I could be 100% wrong here, but this is how these vague SOX requirements get translated for IT admins.


I'm a former Big Four auditor, on the financial side, but I worked pretty closely with our tech folks and I'm now in the tech industry. SOx really does have very strict internal control requirements on financial data and how and where it can pass between systems and people, whether technological systems or physical ones. I've worked with clients who used strict password rotation to fit the law, as well as with clients who didn't do this. As long as the policies are clearly spelled out, do not allow those who are no longer supposed to have access to have access, and are consistent across the organization, it's all good. For example, one company I worked with changed passwords to one system only when someone rolled off the team working on it. That happened infrequently, but more than once a year. The policy was clear, written, and consistent, so it fit the internal control criteria.

Really, we'd need to find what are called 'material weaknesses' or 'significant deficiencies' to make us really stop and consider writing up a finding that would be published. 'Material weaknesses' are considered worse, and would likely lead to the possibility of a material misstatement in the financial statements of the firm. Deficiencies are a step below that. If the firm corrects them, we're okay for the most part, unless the weakness was terrible.


> strict internal controls on financial information. Which means that password rotation is necessary so that people who are not anymore allowed to access information does not have access to these information any more

How does rotating a password prevent access? That's access control, not authentication.

Unless it's a shared password, which is a bigger problem.


Privileged accounts may become disabled by system policy if the password is not updated after >180 days of expiry.
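That kind of policy is just a function of the last password change date. A toy sketch, where the 90-day rotation and 180-day grace numbers are hypothetical policy values chosen for illustration (SOX itself mandates neither):

```python
from datetime import date, timedelta

# Hypothetical policy values; SOX does not mandate these numbers.
ROTATION_DAYS = 90   # password considered expired after this
GRACE_DAYS = 180     # account disabled this long after expiry

def account_status(last_password_change, today):
    """Return 'active', 'expired', or 'disabled' for a privileged account."""
    expiry = last_password_change + timedelta(days=ROTATION_DAYS)
    if today <= expiry:
        return "active"
    if today <= expiry + timedelta(days=GRACE_DAYS):
        return "expired"   # must change password at next login
    return "disabled"      # locked out until an admin re-enables
```

So an account whose holder left, and who therefore never rotates the password again, eventually disables itself, which is the access-revocation effect the parent comments describe.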


It is pretty bad. The desire to avoid having to deal with SOX compliance has pushed a lot of companies to sell themselves privately rather than IPO. A lot of the restrictions that are imposed make debugging and fixing operational problems a lot harder. Many of the policies that are imposed are actively harmful.

And, sadly, SOX compliance is easily bypassed by bad actors. I'm not convinced that Enron would have been stopped by the regulation. And even if it would have been, after several rounds of regulatory capture like the above, the regulation will be nothing more than another marketing channel for auditing companies.



