Boardrooms face seemingly unending governance, disclosure, regulatory and legal challenges related to digital systems risk (see, for example, the latest SEC requirements for board cybersecurity).
This is exacerbated by the rapid adoption of AI, a digital technology that society is just beginning to grapple with and understand. AI is a powerful capability being added to the digital tool arsenal that businesses must employ to compete and win. These tools have evolved rapidly from segmented IT functions into the central nervous systems controlling the most vital assets and systems in all sectors of the economy, both private and public. Highly sophisticated AI tools clearly magnify cyber-risk. They also introduce new, far more complicated risks that may prove more consequential than cyber-risk itself.
Among the many examples of governance issues around AI are the introduction of bias, unintentional violation of laws and regulations, data exfiltration, and erroneous decision making. The growing complexity and ever-changing, persistent nature of AI and cyber-risk are daunting, seemingly overwhelming, and hard to understand. Boards are on the defensive when it comes to digital systems oversight.
In addition, the rapid rise and technical complexity of risks associated with digital tools are widening governance gaps between the Board and risk managers. Digital risk transcends typical business risk. Defensive measures employed by risk experts, such as compliance programs, risk assessments, and enhanced disclosures, are all vitally important, but alone they do not constitute acceptable governance. Unfortunately, they are often viewed as “check-the-box” solutions to complex problems and communicated in the boardroom in technical language that lacks the business context boards need and should demand. Despite this deficiency, however, board members often derive false comfort from accepting these measures as meeting their governance obligation. Instead, boards need to develop a contextual understanding of digital risk. This requires understanding the systems being governed and establishing systems-based digital risk frameworks, policies, and procedures to govern them. Accomplishing this requires organizational, educational, and cultural changes to your enterprise.
This post provides context on Organizational, Educational and Cultural aspects of governance in this new operating environment.
Organizational: Reorganize your Enterprise Risk and Digital Systems Management and Governance Structure. Key considerations/recommendations:
- Stand up an enterprise risk management (ERM) and digital risk organization sized to fit your enterprise. One size does not fit all. Smaller companies may engage a CISO-as-a-Service, while large organizations may employ Chief Risk Officers (CROs), Chief Information Officers (CIOs), Chief Information Security Officers (CISOs), Business Information Security Officers (BISOs), and so on.
- Given the magnitude and growing complexity of digital systems risk, consider establishing a “Chief Systems Officer” (CSO), or equivalent position, with responsibility and authority over all digital systems. The complexity of digital tools requires careful delegation of responsibilities, authorities, and access controls. The CSO must have:
- Clear authority over IT, OT, legal, internal audit, compliance, finance, HR, etc. to the extent these functions impact enterprise-wide use of digital systems.
- Independent reporting channel to executive leadership.
- Role as peer to C-Suite executives.
- Establish an internal Digital Risk Committee (DRC) led by the CSO to include leaders of all functional areas of the enterprise. This committee will be tasked with managing digital risk and making recommendations to the board of directors.
- Establish a Chartered Risk Committee of the board with a mandate to oversee digital risk. Add digital systems expertise to the board. This committee would interact with the CSO and DRC on a periodic and “as needed” basis. Be mindful that a separate committee does not relieve the full board of its responsibility for risk oversight.
- Establish systems-based ERM and digital risk frameworks based upon DRC recommendations. See Appendix A. These frameworks will evolve as digital systems evolve and as the education process within the enterprise matures.
Educational: Learn to Contextualize Digital Risk as a Systemic Risk. Key considerations/recommendations:
- Digital risk is a form of systemic risk, which can only be dealt with through a contextual understanding of the underlying system and its sub-systems. Without this understanding, the application of risk protection and mitigation methods lacks context and can be both wasteful and suboptimal. All private and public enterprises can and should be defined within a systems context, i.e., as an “Enterprise-as-a-System” (EAS). The EAS is a regularly interacting and interdependent group of elements and subsystems which comprise the operation of the enterprise. EAS elements include assets, processes, and the people who interact with one another, both internally and externally. Some elements are more valuable than others. See Appendix B.
- Develop governance over the EAS through a four-phase process:
- Phase 1: Task the CSO and the DRC to produce a high-level business process map of the EAS for the board which identifies and describes system elements, their importance, and how they interact with one another. Describe the digital threat landscape of the EAS. This should be presented in plain English, not technical jargon. Use outside advisors as necessary. (A simple illustrative sketch of such a map appears after this list.)
- Phase 2: Conduct a more detailed Business Process Analysis, summarized for the board and detailed for the CSO team. This analysis breaks down the larger elements identified in Phase 1 into an array of smaller elements, thereby fostering a better understanding of the overall process defining the EAS. This leads to a better contextual understanding of the relative importance of your assets and enables better digital risk mitigation investment decisions.
- Phase 3: With the benefit of context established in Phase 1 and 2, conduct a Control/Framework analysis identifying, assessing, and determining the efficacy of digital risk mitigation tools and control activities. Redesign the EAS to reduce the threat landscape and improve control efficiency. Add or reduce the use of digital risk mitigation tools to produce optimal results. Develop a risk appetite defining the risks the enterprise is prepared to accept in pursuit of value.
- Phase 4: The board and CSO team now have a more complete picture of the digital risk posed to the EAS, expressed in language and terms understood by all. This picture should be reevaluated periodically, and whenever changes are introduced such as new digital systems, changes to the business, M&A events, etc.
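To make the Phase 1 business process map concrete, the following is a minimal sketch in code form. It assumes a hypothetical element inventory with an importance scale, owners, and interdependencies; the element names, scores, and scoring threshold are illustrative assumptions and not part of the framework described above.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Phase 1 EAS map: elements, their relative
# importance, and the interactions between them. Names, owners, and
# scores are illustrative assumptions, not prescribed by this post.

@dataclass
class Element:
    name: str
    kind: str          # e.g., "asset", "process", "people"
    importance: int    # 1 (low) to 5 (critical), as rated by the DRC
    owner: str         # accountable function or executive

@dataclass
class EASMap:
    elements: dict[str, Element] = field(default_factory=dict)
    interactions: list[tuple[str, str, str]] = field(default_factory=list)

    def add_element(self, e: Element) -> None:
        self.elements[e.name] = e

    def add_interaction(self, src: str, dst: str, description: str) -> None:
        # Record that one element depends on, feeds, or touches another.
        self.interactions.append((src, dst, description))

    def critical_elements(self, threshold: int = 4) -> list[Element]:
        # Elements whose loss or compromise would most affect the enterprise.
        return [e for e in self.elements.values() if e.importance >= threshold]

# Usage: a toy map a CSO team might draft for board discussion.
eas = EASMap()
eas.add_element(Element("Customer database", "asset", 5, "CIO"))
eas.add_element(Element("Order fulfillment", "process", 4, "COO"))
eas.add_element(Element("Call center staff", "people", 3, "HR"))
eas.add_interaction("Call center staff", "Customer database", "reads/updates records")
eas.add_interaction("Order fulfillment", "Customer database", "pulls shipping data")

for e in eas.critical_elements():
    print(f"{e.name} ({e.kind}, importance {e.importance}, owner {e.owner})")
```

Even a toy inventory like this forces the conversation the phases call for: which elements matter most, who owns them, and how they depend on one another.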
Cultural Aspects: Stress the Importance of Shared Responsibility for Managing Digital Risk. Key considerations/recommendations:
- People are the most important component of the EAS. Organizational and educational steps outlined above will signal the importance of digital risk to the entire enterprise. Elevate the mitigation and control of digital risk from an IT function to a responsibility shared by all constituents.
- Develop an enterprise-wide training program built on frequent, short training sessions that do not overburden employees.
- Communicate throughout your enterprise both emerging threats to digital systems and actual incidents experienced by the enterprise.
- Market within your enterprise the importance of controlling digital risk and reward good behavior.
Concluding Considerations
Effective digital risk governance requires Boards to demand organizational changes necessary to manage and control complex digital systems, educational changes to develop a common contextual “system” understanding amongst the board and risk experts, and cultural changes to imprint upon the organization the importance of a shared responsibility for managing digital risk. The alternative is to remain reactive with unknown consequences.
There are no “check-the-box” solutions for digital risk governance. Without a contextual understanding of the Enterprise-as-a-System, digital risk governance and mitigation is akin to throwing darts blindfolded.
Resource/Appendix A: Governance Frameworks
Enterprise Risk Management (ERM)
Risk governance begins with an overarching Enterprise Risk Management (ERM) framework designed to identify and evaluate enterprise threats and opportunities and to manage those risks according to the organization’s risk tolerance. One size does not fit all. A good starting point is the principles published by the Committee of Sponsoring Organizations of the Treadway Commission (COSO), whose five components of risk management include:
- Governance and culture
- Strategy and objective-setting
- Performance
- Review and revision
- Information, communication, and reporting
Cybersecurity Governance Framework
A subset of ERM is the familiar cybersecurity governance framework. Once again, one size does not fit all. However, a good starting point is the NIST framework, which includes the following elements:
- Identify: Develop an organizational understanding to manage cybersecurity risks to systems, assets, data and capabilities.
- Protect: Develop and implement the appropriate safeguards to ensure delivery of services.
- Detect: Develop and implement the appropriate activities to identify the occurrence of a cybersecurity event.
- Respond: Develop and implement the appropriate activities to take action regarding a detected cybersecurity event.
- Recover: Develop and implement the appropriate activities to maintain plans for resilience and to restore any capabilities or services that were impaired due to a cybersecurity event.
Detailed policies and procedures underlie each element. The NIST cybersecurity framework is currently under review to add a GOVERNANCE element.
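The five elements lend themselves to a simple coverage check. Below is a minimal sketch assuming a hypothetical control inventory keyed to those elements; the controls and statuses are illustrative assumptions, not drawn from NIST or from this post.

```python
# Hypothetical sketch of a control inventory organized by NIST CSF function.
# The function names come from the framework; the controls and statuses
# below are illustrative assumptions only.

NIST_FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

controls = [
    {"function": "Identify", "control": "Asset inventory refresh", "status": "in place"},
    {"function": "Protect",  "control": "MFA for remote access",   "status": "in place"},
    {"function": "Detect",   "control": "Log monitoring / SIEM",   "status": "partial"},
    {"function": "Respond",  "control": "Incident response plan",  "status": "in place"},
    # Note: no "Recover" control listed yet.
]

def coverage_gaps(inventory: list[dict]) -> list[str]:
    """Return the NIST functions with no control marked 'in place'."""
    covered = {c["function"] for c in inventory if c["status"] == "in place"}
    return [f for f in NIST_FUNCTIONS if f not in covered]

print("Functions lacking an in-place control:", coverage_gaps(controls))
# -> Functions lacking an in-place control: ['Detect', 'Recover']
```

A board does not need to see the inventory itself, but it should expect the CSO team to be able to answer the question this sketch asks: which functions are not yet covered, and why.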
Resource/Appendix B: AI Governance
As society grapples with AI, we can expect rapidly changing regulations and governmental requirements. To keep pace with these changes, enterprises must take immediate action to establish AI frameworks, policies, and procedures to control AI's use. Monitor and anticipate changes as AI evolves. Human control of these policies and procedures is key to optimizing the benefits of AI's use while minimizing its risks. Include the following categories in your AI Framework:
- Governance & Ethics: Require human oversight and usage authorization on a “need to use” basis only. Incorporate the organization’s values and ethics into the design and implementation of AI tools. Require external AI tools to meet these requirements.
- Privacy: Incorporate data privacy rules and regulations into the design and implementation of AI tools.
- Bias Control: Strive to ensure that data leveraged for AI tools and the tools themselves, both internal and external, are as unbiased as possible.
- Consistent Output: Train AI tools to produce consistent results.
- Explainable: Require users of AI tools to understand the inputs and be able to explain the outputs.
- Accountable: Clearly define ownership of, and access to, each AI tool, its inputs, and its outputs. Develop an inventory of approved AI tools and capture, maintain, and retain an audit trail of key inputs and updates. (A simple illustrative registry sketch appears after this list.)
- Secure: Protect the enterprise and AI tools from cyber and physical attacks.
- Regulatory Compliance: Factor applicable regulations into the design and implementation of AI tools. Routinely monitor and review regulatory changes.
- Education: Develop an enterprise-wide education program to instill awareness of the benefits and risks associated with AI. Openly reward good behavior and create a culture of shared responsibility throughout the enterprise.
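As one way to make the “Accountable” inventory and audit-trail recommendation concrete, here is a minimal sketch of a registry entry for an approved AI tool. The record fields, example tool, and events are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of an approved-AI-tool registry entry with an audit
# trail of key inputs and updates. Field names, the example tool, and the
# logged events are illustrative assumptions only.

@dataclass
class AIToolRecord:
    name: str
    owner: str                   # accountable person or function
    approved_uses: list[str]     # "need to use" authorizations
    external: bool               # internally built vs. third-party tool
    audit_trail: list[dict] = field(default_factory=list)

    def log_event(self, actor: str, action: str, detail: str) -> None:
        # Capture and retain who did what, and when, for later review.
        self.audit_trail.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
            "detail": detail,
        })

# Usage: registering a tool and recording an update to its inputs.
tool = AIToolRecord(
    name="Contract summarizer",
    owner="General Counsel",
    approved_uses=["summarize vendor contracts for legal review"],
    external=True,
)
tool.log_event("j.doe", "approved", "DRC approved for legal department use")
tool.log_event("j.doe", "input-update", "Added 2023 vendor contract templates")

for event in tool.audit_trail:
    print(event["timestamp"], event["actor"], event["action"], "-", event["detail"])
```

However the registry is implemented, the governance point is the same: every approved tool has a named owner, an authorized use, and a retained record of key inputs and changes.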
For more on AI governance see:
About the Author
Rod Hackman
Rod Hackman is certified by the National Association of Corporate Directors (NACD) in Cyber-Risk Oversight and certified as a Boardroom Certified Qualified Technical Expert ("QTE"). Mr. Hackman has extensive experience heading the cybersecurity oversight function for an NYSE company. His career has been dedicated to capital formation, M&A, corporate development, and the creation of shareholder value as an entrepreneur and advisor. He is a former member of several public and private Boards of Directors and has served as lead director and as the head or member of all chartered committees. As a former Naval nuclear engineer, Mr. Hackman understands the importance of protecting and building resilience into complex digital business ecosystems.