…but who is really accountable?
This June 14th CSO Online article says boards are ready to fire their security leaders over bad security reporting. That’s a strong indicator that board members have moved on from well-meaning soundbites to career-focused concern. Great, but landing real strategic and operational responsibility is far from a trivial task.
Taking that question of reporting: effective reporting relies on risk specialists, business contacts, control operators, and data feeds from far beyond the local security team (including many third parties), as well as on recipients making time to read and think about the reports. So if you haven’t pinned down the full spectrum of appropriate accountability for security, there isn’t much scope for blaming problems on someone else later, no matter how valid the budget, time, or political challenges faced. That suggests your CISO should be looking (harder than they already are?) for a new job.
The rest of this post looks at one key cause and potentially simple ways to combat this kind of blamestorming.
The Cybersecurity Risk Owner-Go-Round
Lack of formal, transparent, and realistic risk ownership is a key root cause of the creation and release of insecure technology solutions, and of inadequate overall security. That is a vehemently held belief of mine. So, a few weeks ago, I launched this poll to see where peers think the cybersecurity buck stops in their organisations:
- If a project has unresolved security vulnerabilities, and needs to go live, who is asked to sign that risk off?
- If a team is working fast to deliver a digital solution, and argues they have to use production data to test it, who is asked to formally approve that?
- Who do you go to for final sign off, if the business wants to sign a vendor contract before security due diligence is complete?
- When there are insecure websites/user access processes/data transfers, and the business argues there is no time or budget to fix them, who has the final say on whether or not that’s acceptable?
It’s a small poll, just 53 respondents, but in my follower world there is a concentration of people who live the impact of this every day. Even allowing for poll size, likely bias, and lack of respondent information, the results were striking:
Security is Everyone’s Responsibility… a piece of vapid, self-serving lip service that papers over the real underlying problem
A risk owner is defined in the ISO 31000 risk management standard as:

a person or entity that has been given the authority to manage a particular risk and is accountable for doing so
Dejan Kosutic (a leading expert on ISO information security management, writing about his book “Risk Owners Vs Asset Owners”) expands on that:
When choosing risk owners, you should aim for someone who is closely related to processes and operations where the risks have been identified – it must be someone who will feel the “pain” if the risks materialize – that is, someone who is very much interested in preventing such risks from happening. However, this person must be also positioned highly enough so that his or her voice would be heard among the decision makers, because without obtaining the resources this task would be impossible. So, it seems to me that mid-level managers are often the best candidates for risk owners.
Who is the most senior person who will answer to the most senior internal or external body if the assets at risk come to predicted harm? Or, more cynically: who will be first in line to ‘spend more time with their family’ when it hits the fan?
Why does this matter? If the majority view of the RACI (who is Responsible, Accountable, Consulted, and Informed) is wrong, we will continually fail to prevent incident-related harm to everyone’s economic and physical welfare. As I argued in a past article of mine, ‘There Is No Such Thing As Cybersecurity Risk’, a rock-solid risk RACI is a vital foundation for rationally tackling competing business motivations.
Operational business bod
“If we don’t get this solution in place, we are at risk of not meeting annual performance objectives”
“Every second we delay getting this to market risks loss of planned competitive advantage”
Project Management bod
“The late delivery of this IT project is risking an overrun in the allocated budget and disappearance of my hoped for bonus”
“Squeezing time and resource to fully pin down requirements, understand system interactions and robustly test means we could have significant functional issues with the implemented solution”
Security bod

“Not allowing time to assess the security of planned changes, pentest web elements, and fix problems found, risks serious vulnerabilities ending up in the live service, which could lead to financial fraud, data disclosure, or data theft”
What really matters? The bottom line. Why is security so often deprioritized in this context? Because all of the other impacts are easier to understand and feel more immediate.
Perhaps consider that accountability perspective in the context of a couple of incidents and newsworthy issues with broader compliance:
- Explaining Volkswagen’s Emissions Scandal – New York Times, Updated 1st June 2016
- Talk Talk breach could be good for firms in the long run – V3, 1st December 2015
- Why the “biggest government hack ever” got past the US government – Arstechnica, June 19th 2015
and there’s no shortage of commentary on board level security responsibility:
- Cybersecurity is the responsibility of the board – Information Security Buzz, 1st May 2016
- Cyber Security Executives Need To Up Their Game: Here’s Why – Forbes, June 14th 2016
- 82% of Boards Are Concerned About Cybersecurity – ISACA, February 29th 2016
- When Business Culture Eats Cybersecurity For Breakfast – Infospectives, August 2015
A more granular view of this challenge (and a potential step in the right direction):
1. Clarity about the many elements of cybersecurity related risk ownership
2. Group risk ownership
3. Risk ownership delegation
4. Risk ownership persistence
The need to scrupulously document risk and any related risk acceptance hopefully goes without saying.
1. Clarity about the many elements of cybersecurity related risk ownership

Cybersecurity risk is influenced by a range of responsible and accountable people, most of whom reside outside the security function: financial risk owners, regulatory risk owners, policy owners, business process owners, asset owners (applications, kit, and data), control owners, control operators, and IT users. Defining each contribution is vital, usually through scenarios that explore common usage and incident response.
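One way to make those contributions explicit is to record them in a simple RACI register, with exactly one named accountable party per risk. A minimal, illustrative sketch in Python (all role and risk names are hypothetical, not drawn from any standard):

```python
# Minimal, illustrative RACI register for cybersecurity risks.
# All role and risk names below are hypothetical examples.

RACI_CODES = ("R", "A", "C", "I")  # Responsible, Accountable, Consulted, Informed

def add_assignment(register, risk, role, code):
    """Record one role's RACI assignment for a named risk."""
    if code not in RACI_CODES:
        raise ValueError(f"Unknown RACI code: {code}")
    register.setdefault(risk, {})[role] = code

def accountable_for(register, risk):
    """Return the single role accountable ('A') for a risk, or None.

    A sound RACI has exactly one accountable individual per risk;
    anything else signals the group-ownership problem.
    """
    owners = [role for role, c in register.get(risk, {}).items() if c == "A"]
    return owners[0] if len(owners) == 1 else None

register = {}
add_assignment(register, "go-live with unresolved vulnerabilities", "Business Process Owner", "A")
add_assignment(register, "go-live with unresolved vulnerabilities", "CISO team", "C")
add_assignment(register, "go-live with unresolved vulnerabilities", "Control Operator", "R")

print(accountable_for(register, "go-live with unresolved vulnerabilities"))
# → Business Process Owner
```

The point of the `accountable_for` check is that it fails loudly (returns nothing) when zero people, or a whole committee, hold the ‘A’.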
The one time accountability is often defined with utter clarity is after things go wrong (see multiple recent breaches for details). That is closing the door after the profit and reputation horse has bolted.
The CISO’s team is often responsible for operating some key security controls, but otherwise, in risk terms, security is arguably the least operationally important stakeholder. The role of your security function, at base, is to arm the organisation with sufficient information to make good risk decisions to remain compliant and grow. That needs to be drawn out and properly articulated. What they can’t do is unilaterally define ‘good enough’ requirements, then plan, implement, and run all of the necessary tools and processes to keep the business safe…unless you hand them a significant portion of all board member’s budgets and grant them right of veto for any board decision.
That isn’t going to happen and it shouldn’t happen. So you need to enable the right accountable people to negotiate to do the right thing in the context of their budget, remit, organisational objectives and operational responsibilities…but only if they remain accountable for those decisions.
2. Group risk ownership
If your ‘board’ owns your cybersecurity risks, it is still not going to drive the changes required. Ownership by a committee, often one with rapidly changing membership and terrifyingly broad responsibilities, leads to NO-ONE taking personal accountability to sponsor cybersecurity changes.
For perspective on the effectiveness of management by committee, consider this, published in 2012 by the CIA as number 3 in a list of ‘Timeless Tips for Simple Sabotage’ from an agency field manual:

Organizations and Conferences: When possible, refer all matters to committees, for “further study and consideration.” Attempt to make the committees as large and bureaucratic as possible. Hold conferences when there is more critical work to be done.
To build familiarity with specialist risks, a disproportionate amount of time needs to be spent getting up to speed. Someone at the most senior level needs to be accountable for making that time, and for giving others involved in decision making the information to do their job. Thereafter, once awareness has been appropriately raised, the familiar bullet point summaries can take back over.
3. Risk ownership delegation
If you fail to limit the extent of permitted delegation, ownership will just devolve back down to the historical operational level, a level unable to resist calls to JFDI to meet more familiar objectives.
4. Risk ownership persistence

If we don’t put in place some kind of persistent ownership of the significant risks that businesses accept, there is no reason on this earth for a board member to fly counter to others and change the security status quo. This happened in the world of finance with the Sarbanes-Oxley legislation, after Enron collapsed under the weight of financial reporting irregularities. The CFO’s head is now squarely on the block, even if they jump ship before things hit the fan.
The essence of Section 302 of the Sarbanes-Oxley Act is that the CEO and CFO are directly responsible to the SEC for the accuracy, documentation, and submission of all financial reports, as well as for the internal control structure.
For cybersecurity, even when decisions are made that could potentially end lives or destabilise critical infrastructure, responsibility for signing off poor or insecure design often stops with a pink slip (or a giant golden parachute). Nothing prevents formal accountability being handed over to a successor, but that handover would be far more robust if it carried legal liability. So would the initial risk acceptance decision, if risk owners knew that kind of scrutiny was coming.
While the tone of the above could be interpreted as combative, it’s not intended to demonise the C-Suite. It’s intended to address the poor-relation status of security in business decision making. Below are proposals I would like to shape going forward and potentially see enshrined in law. First to address risks to life and critical infrastructure and then to more generally clarify the responsibility all businesses have to protect their customers, employees and investors:
- Formal definition and assignment of security accountability and responsibility and a living document with CEO sign off recording that cybersecurity RACI.
- A requirement to attach accountability and responsibility to individual roles not boards, forums, committees or other groups.
- A requirement to limit the allowed extent of ownership delegation, and document the precise nature of delegated accountability and responsibility and
- A proposal to attach accountability for a defined subset of accepted risks to the individual accepting those risks, beyond their time in the risk owner role and/or organisation.
Cumulatively, this just codifies things organisations should be doing, records they should be keeping, and means to enable improvement, requiring no significant investment… beyond, perhaps, the budget needed to raise board-level security awareness once it explicitly becomes a key part of personal contracts, and those accountable start to scrutinise the risk acceptance decisions made by their predecessors.
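The record-keeping those proposals imply could be as simple as a dated, attributable risk-acceptance entry that names an individual (never a committee), limits and records delegation, and survives changes of role. A hypothetical sketch in Python (all field names and values are illustrative, not drawn from any standard or regulation):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RiskAcceptance:
    """One formally accepted risk; an illustrative record, not a standard."""
    risk_id: str
    description: str
    accepted_by: str                       # a named individual, never a committee
    role_at_acceptance: str
    accepted_on: date
    review_by: date                        # forces later scrutiny of the decision
    delegated_from: Optional[str] = None   # records (and so limits) any delegation
    persists_after_departure: bool = True  # accountability outlives the role

# Hypothetical example entry.
entry = RiskAcceptance(
    risk_id="RISK-042",
    description="Go-live with unresolved pentest findings",
    accepted_by="J. Smith",
    role_at_acceptance="Business Process Owner",
    accepted_on=date(2016, 6, 1),
    review_by=date(2016, 12, 1),
)
```

The `persists_after_departure` flag is the point of proposal four: the record, and the accountability it documents, does not evaporate when the risk owner moves on.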
Picture Credit: iqoncept / 123RF Stock Photo