CISOs can only know the performance and maturity of their security program by actively measuring it themselves; after all, to measure is to know. With proactive measurement, CISOs will confirm how well their security program performs, better understand its preparedness against relevant threats, and highlight gaps that require improvement.
However, CISOs typically don’t measure their security program proactively or methodically. Instead, they rely on ad hoc inputs such as IT audits, pentest results, one-time security assessments, risk register analysis, and a general understanding of their program.
Why CISOs need better-measured insight
CISOs are under increasing pressure not only to secure the organization but to demonstrate that it is secure. This pressure comes from end customers demanding secure products and services, as well as from the business, the board, and regulators.
Among new regulations with higher security demands and an increased burden to demonstrate security are the following:
The U.S. SEC’s revised cybersecurity incident disclosure provision, which requires that material cybersecurity incidents or breaches be disclosed to the SEC within four business days after an organization determines the incident to be material
The European Union’s Digital Operational Resilience Act (DORA), which requires strengthening the security of financial entities in the EU
The EU’s NIS2 directive, which introduces stricter cybersecurity requirements for organizations that provide essential services to the EU’s energy, transport, banking, healthcare, and digital infrastructure sectors
This increased pressure means that CISOs need to be more self-aware of their current security state and that an assessment of the current security state must be based on measurements, metrics, and facts.
Defining and describing the security program
Before a security program can be measured, it must be defined and described. Most security programs are based on a standard framework such as the U.S. NIST Cybersecurity Framework (CSF) or ISO/IEC 27001, and customized to align with the business, the technical and IT landscape, and other organizational specifics.
[Figure source: IDC, 2024]
Maturity levels should be applied to the individual security program processes. This typically consists of defining four or five levels. It’s important to clearly define and describe each level of maturity to ensure clear and consistent meaning of the various maturity levels across the security program.
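As a minimal sketch of how such a scale might be encoded, the snippet below defines an illustrative five-level model. The level names and descriptions are assumptions (loosely CMMI-inspired), not a scale prescribed by IDC or any particular framework; each organization should define its own levels.

```python
from enum import IntEnum

class Maturity(IntEnum):
    """Illustrative five-level scale; labels are assumptions, not a standard."""
    INITIAL = 1      # ad hoc, undocumented activities
    REPEATABLE = 2   # documented but inconsistently applied
    DEFINED = 3      # standardized across the organization
    MANAGED = 4      # measured with metrics and target thresholds
    OPTIMIZING = 5   # continuously improved based on measurement

# Each security process carries its own assessed level (example values).
process_maturity = {
    "vulnerability management": Maturity.MANAGED,
    "incident response": Maturity.DEFINED,
}
```

Writing out a short description per level, as in the comments above, is what keeps the meaning of "level 3" consistent across different security processes.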
There’s no one-size-fits-all security program. It must adapt to the organization and its size, potential losses, acceptable risk levels, regulatory requirements, nature of the business, the specific IT landscape, level of in-house development versus off-the-shelf solutions, and more.
Defining and describing the security risk appetite
A security program does not have to achieve perfect security. In fact, achieving perfect security is impossible. Rather, a security program should aim to achieve sufficient security and reduce risk to acceptable levels to achieve the organization’s overall business goals.
To do so, the organization and the security program should define its security risk appetite. This typically occurs within a risk management program where the risk appetite for other disciplines is also included. If this is not the case, it can occur within the security program.
A risk appetite statement provides a high-level description of acceptable risk. It should be worded in such a way that technical and non-technical colleagues can understand it. Some organizations have a low risk appetite (they are very risk averse), whereas others have a more tolerant risk appetite.
The challenging part is translating the risk appetite into terms the security program can act on: the technical threats and risks, the security processes, and the target maturity levels.
Methods to measure performance and understand the current security state
The goal is to assign a maturity level to each security program process, as accurately as possible, using both quantitative and qualitative measurements. The methods described below support this.
Answering standardized questions with weighted scoring: Standardized questions can cover all the required topics in a security process (i.e., people, processes, and technology). The more questions answered positively, the higher the score, contributing to a higher overall maturity rating.
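A weighted-scoring scheme of this kind can be sketched in a few lines. The questions and weights below are hypothetical examples, and the 0-to-1 score is one possible convention; how a score maps onto a maturity level is left to the organization.

```python
def weighted_score(answers: dict, weights: dict) -> float:
    """Score one security process from yes/no answers.

    answers: question -> True/False (unanswered counts as False).
    weights: question -> importance weight.
    Returns a score between 0.0 and 1.0.
    """
    total = sum(weights.values())
    earned = sum(w for q, w in weights.items() if answers.get(q))
    return earned / total if total else 0.0

# Hypothetical questions for a single process:
answers = {"policy documented": True, "owner assigned": True, "tooling deployed": False}
weights = {"policy documented": 1, "owner assigned": 1, "tooling deployed": 2}
score = weighted_score(answers, weights)  # 2 of 4 weighted points -> 0.5
```

Weighting lets a critical question (here, "tooling deployed") pull the score down more than a lightweight one, so two processes with the same number of "yes" answers can still rate differently.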
Clearly defined metrics with target thresholds: Using performance metrics allows for an unbiased reading of the performance from a security program process. Metrics should be based on the key objectives of a security process. Ideally, these performance metrics already exist for security processes. However, if they do not exist, they can be introduced.
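To make the threshold idea concrete, the sketch below evaluates a few metrics against targets. The metric names, values, and targets are invented for illustration; note that for some metrics (such as remediation time) lower is better, so the comparison direction must be part of the definition.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    value: float
    target: float
    higher_is_better: bool = True  # e.g., remediation time sets this False

    def passes(self) -> bool:
        if self.higher_is_better:
            return self.value >= self.target
        return self.value <= self.target

# Hypothetical metrics for one security process:
metrics = [
    Metric("patch coverage (%)", 92.0, 95.0),
    Metric("mean time to remediate criticals (days)", 12.0, 14.0,
           higher_is_better=False),
]
failing = [m.name for m in metrics if not m.passes()]
# failing -> ["patch coverage (%)"]
```

The failing list feeds directly into the gap analysis: each metric below its threshold is a candidate road map item.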
Interviewing security program contributors: Interviews bring an improved insight into the quality aspects of a security program process. Security process owners can inform the interviewer (typically a security and risk professional) of the big picture within a security process, explain the challenges faced (which can be translated to gaps), and the ongoing activities to improve the process. Interviews provide context that standardized questions and performance metrics may not identify.
Interview approach: During the interview, walk through the security process end to end with the process owner. Discuss the standardized questions and performance metrics to get more context and understanding of the answers given. Ask the owner what their gaps are and how improvements can be made.
Use of existing information from other security-related activities: This can include activities such as penetration tests, IT audits from internal and external auditors, ad hoc security assessments, and responsible disclosures. The security organization is probably performing many activities already that can be used as data points for measuring security performance. It is wise to reuse as many activities and results as possible and to quantify the results where possible.
The results from operational security dashboards within the security program: Typically, security programs have dashboards with various operational metrics. The collection of security dashboards can be used as overall input to measure the performance of the security program. Conversely, the absence of operational dashboards is itself an indicator.
Comparisons with peer organizations: Active comparisons with peers are typically not made and are certainly not based on data and performance measurements. However, the security community is small enough that security professionals are often aware of practices used by peers. Therefore, this knowledge can be used to identify security processes lagging behind peers.
Developing a security road map
Once a current state has been established using maturity levels of the security processes, you can develop a security road map.
The starting point for the road map should be well understood, given the assessment of the current state. The road map should indicate how teams can build from that point and reach their target maturity (which does not necessarily mean achieving a state of perfection).
The standardized questions and the performance metrics that are failing should be addressed within the road map.
Priority should be given to road map activities that have the biggest impact on overall risk reduction for the organization as quantified by standardized questions and performance metrics. Typically, this equates to security processes with the lowest maturity and the biggest gaps. However, some security processes will have a bigger overall impact on risk reduction than others (for example, improving vulnerability management may have more impact than improving security policies and governance).
The central planning and road map team as well as the individual security process owners should be involved in developing a road map. The central planning team will have good knowledge of the bigger picture, and the individual security process owners will have the best understanding of their process.
The road map should be developed in cooperation with key stakeholders and delivery teams inside and outside of the security department. Many of the activities of the road map will rely heavily on those stakeholders and delivery teams. The CIO and IT department, for instance, will have a major part to play in road map activities.
Learn more about IDC’s research for technology leaders.
International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the technology markets. IDC is a wholly owned subsidiary of International Data Group (IDG Inc.), the world’s leading tech media, data, and marketing services company. Recently voted Analyst Firm of the Year for the third consecutive time, IDC’s Technology Leader Solutions provide you with expert guidance backed by our industry-leading research and advisory services, robust leadership and development programs, and best-in-class benchmarking and sourcing intelligence data from the industry’s most experienced advisors. Contact us today to learn more.
Nick Kirtley is an adjunct research advisor with IDC’s IT Executive Programs (IEP). He is a senior security professional with broad experience from many consultancy engagements and internally within various security (CISO) departments. Nick has expert knowledge in topics such as security program management (and measuring security maturity), cloud security, DevOps security, vulnerability management, and threat modeling.