The Metric Library: Practical Security Metrics for Real Dashboards¶
"Build my cyber security dashboard, and put metrics on the board" - A statement I have heard all too often, and when you go into the design phase to choose the metrics, there are blank stares all around the room. We want to measure something, but we don't know what.
One common response is, "Just use your framework!" I faced that exact scenario a few years ago, when my manager asked me to develop a number of cyber security metrics based on the ISO 27001 framework. Easy, right? Turns out, not really. Maybe I'm a bit more critical of ISO 27001 these days, but it is far too vague and broad to be used as a metrics library. Still, something had to be done. I helped the team develop a number of metrics, went through a couple of audits, got some feedback, and eventually figured out that there is more to metrics than meets the eye.
What makes a good metric?¶
You may have heard the terms KPI (key performance indicator) and KRI (key risk indicator). Having a number on a screen with some graphs doesn't mean anything if the person looking at it cannot determine what they're supposed to do with that data.
A metric is typically a score that indicates performance, like the `% of servers patched in the last 30 days`. What I like about this metric is that it is actionable: the technical team can see the list of servers, initiate a patching process, and by tomorrow that score will improve.
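To make that concrete, here is a minimal sketch of how such a score could be calculated from an asset inventory export. The data shape (a hostname mapped to its last successful patch date) and the 30-day window are assumptions for the example, not a prescribed schema; in practice the data would come from your patch management or CMDB tooling.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical asset inventory export: hostname -> date of last successful patch run.
inventory = {
    "web-01": datetime(2024, 5, 28, tzinfo=timezone.utc),
    "web-02": datetime(2024, 3, 12, tzinfo=timezone.utc),
    "db-01": datetime(2024, 6, 1, tzinfo=timezone.utc),
}

def percent_patched(inventory: dict, window_days: int = 30) -> float:
    """Return the percentage of servers patched within the last `window_days` days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=window_days)
    patched = sum(1 for last_patched in inventory.values() if last_patched >= cutoff)
    return 100.0 * patched / len(inventory) if inventory else 0.0

print(f"% of servers patched in the last 30 days: {percent_patched(inventory):.1f}%")
```

The useful part is not the number itself, but the list of servers that fall below the cutoff: that list is tomorrow's work queue.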
Other metrics are a bit more vague, like the `# of users that clicked on the phishing campaign link`. What I don't like about this metric is that it is not clear whether we are measuring the effectiveness of our training campaign, or flagging a potential risk because users are clicking on things they should not.
The same goes for the `# of security incidents raised in the last 30 days`. It can measure adoption of the process (a higher number is better), or it can indicate a risk that more issues are being uncovered (a higher number is bad).
Sometimes you do need this kind of indicator to demonstrate a specific risk or control that you are implementing. It does, however, come back to the story: why are you measuring it?
Tell the story¶
Every metric needs to tell a story. The team responsible for patching vulnerabilities (my favourite example) has a few stories they need to tell, such as:
- What do I need to patch today? Give me the list of work items I need to focus on today.
- How well are we performing? Are we patching quickly enough, within the company's acceptable risk tolerances?
- Where is the biggest risk? Which systems should be patched first?
I have seen teams simply measure the number of CVEs and plot it over time. I'll agree that seeing the number of open issues over time has a certain value to it. However, if your environment suddenly grows and you add a bunch more systems, say through an acquisition, that number will jump significantly.
While it is technically true that the number of vulnerabilities in your environment has increased, it distorts the story you are trying to tell your team and your management.
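To show how that distortion plays out, here is a small sketch comparing the raw open CVE count to a per-asset rate. The numbers are made up purely for illustration, with the asset count doubling in April as if through an acquisition.

```python
# Made-up monthly snapshots: (month, open CVEs, number of assets in scope).
snapshots = [
    ("Jan", 120, 400),
    ("Feb", 115, 400),
    ("Mar", 110, 410),
    ("Apr", 230, 820),  # raw count spikes, but only because the scope doubled
]

for month, open_cves, assets in snapshots:
    per_asset = open_cves / assets  # normalise by the size of the environment
    print(f"{month}: raw open CVEs = {open_cves:4d}, open CVEs per asset = {per_asset:.2f}")
```

The raw count roughly doubles in April while the per-asset rate barely moves, which is a far more honest signal to put in front of management.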
So now what?¶
I started collecting a number of metrics that I believe are useful within an organisation, and they can serve as a starting point as you embark on your continuous assurance journey. Head over to https://www.metricslibrary.net/ to take a look at the current library. The goal is to provide you with some inspiration, so you can pick a metric that you may find useful in your own journey.
You can contribute¶
This library is just the start. Since it is hosted on GitHub, you can submit a pull request to have your own metric added. Each metric is a simple `yaml` file, so it's easy to maintain, and easy to track.
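To give a feel for what that could look like, here is a hypothetical metric definition loaded with Python. The field names and layout are illustrative only; the actual schema used in the repository may well differ.

```python
import yaml  # pip install pyyaml

# Purely illustrative metric definition - not the repository's real schema.
metric_yaml = """
id: patching-compliance-30d
title: "% of servers patched in the last 30 days"
category: vulnerability-management
formula: patched_servers / total_servers * 100
audience:
  - technical team
  - management
"""

metric = yaml.safe_load(metric_yaml)
print(metric["title"], "->", metric["formula"])
```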
Not sure whether it should be a metric? Let's have a discussion about the proposed idea and work towards it together.
Conclusion¶
There are other catalogs out there. The Continuous Audit Metrics Catalog is also good, and I would encourage you to check it out. What I didn't like about it is the lack of updates. This needs to be a living, breathing document, updated frequently, with code examples to show you how to build each metric. (Yes, the GitHub repo does show you how to extract the data and run the metric, but that's another story.)
Do check out The Metrics Manifesto: Confronting Security with Data book on Amazon if you are interested in learning more about cyber metrics.