SANS 2023 SOC Survey


Written by Chris Crowley, Barbara Filkins, and John Pescatore
June 2023

©2023 SANS™ Institute


Executive Summary
Welcome to the 2023 SANS Institute SOC Survey. In this, our seventh annual survey, we
added many questions but didn’t really take any away. Our new areas of focus include
operational threat hunting, threat intelligence, data ingestion into the SIEM, and SOAR, as
well as more detailed questions relevant to staff hiring and retention. Thank you again to
the respondents who generously spent their time, a mean of 59 minutes (Q5, n = 641) based
on the Qualtrics reported duration, to answer our barrage of questions.

The lead author (Crowley) has heard many times that just taking the survey is good for SOC
staff and managers, because it is challenging and thought-provoking. If you’re reading this
and it provides value, please be sure to take the survey in 2024! We’re already planning
enhancements and updates. We’re also hoping to hear from you what you’d like to read
about in the future. If you have analysis that you’d like to perform, the deidentified raw
data set and a Jupyter notebook (Python) are available for download and analysis at
https://soc-survey.com. Among this year’s top findings:

• More than 75% of respondents detected incidents before external notification, 9% via
proactive threat hunting (Q3.31, n = 327).

• 84% of SOCs collect and expose metrics, including these top three:

- Quantity of incidents

- Time from detect to eradicate

- Ratio of incidents from known/unknown vulnerabilities

• Continual tuning of SOAR by skilled analysts is needed to obtain value—SOAR as a
work style, not throwing a switch.

• SOAR work style increases effectiveness more than it reduces staffing needs.

• Monthly review of SIEM data ingestion proves valuable to vet sources.

• SOC funding follows a traditional IT model: SOC budget requests go up, allocations
come back down.

• SOC outsourcing tends to be pen-testing (and variants) and forensics, whereas
in-house tends to be security system architecture, engineering, planning, and
administration.

Figure 1 provides a snapshot of the demographics for the respondents to the 2023 survey.

SANS 2023 SOC Survey 2


[Figure 1 infographic: Top 4 Industries Represented (Cybersecurity, Technology, Banking and finance, Government; each gear represents 10 respondents); Organizational Size (Small: up to 1,000; Small/Medium: 1,001–5,000; Medium: 5,001–15,000; Medium/Large: 15,001–50,000; Large: more than 50,000; each building represents 25 respondents); Operations and Headquarters locations by region; Top 4 Roles Represented (SOC analyst, Security administrator/Security analyst, SOC manager or director, Security manager or director; each person represents 10 respondents).]

Figure 1. SOC Survey Respondent Demographics
What We Learned in 2023
We included several questions this year that weren’t present in previous surveys.

We asked about visibility into data and the choices made about which data to ingest into the
SIEM. Ingesting everything (Q3.5, 171/600, 28.5%) and some selectiveness based on risk (Q3.5,
169/600, 28.2%) were the top explanations, with a monthly review (Q3.7, 105/239, 43.9%) being
the most common frequency among those who said they review ingestion (Q3.6, 256/597, 42.9%)
on a periodic basis.

Aligned to the VERIS structure of detection sources, we asked respondents to rank how
incidents are discovered. A little over two-thirds of respondents (Q3.31, 246/327, 68.3%)
indicated that monitoring/alerting was most frequently responsible for detection. See Figure 2.
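Percentages like the 68.3% above come from simple frequency counts over ranked responses. A minimal sketch of that tabulation, using made-up responses rather than the survey data (the real data set and notebook are at https://soc-survey.com):

```python
from collections import Counter

def share_ranked_first(responses, technique):
    """Fraction of respondents who ranked `technique` first."""
    firsts = Counter(r[0] for r in responses)  # r is an ordered ranking
    return firsts[technique] / len(responses)

# Toy rankings: each list orders techniques from most to least frequent.
responses = [
    ["monitoring/alerting", "hunting", "user reported"],
    ["monitoring/alerting", "user reported", "hunting"],
    ["hunting", "monitoring/alerting", "user reported"],
]

print(round(share_ranked_first(responses, "monitoring/alerting"), 3))  # 0.667
```

The same counting generalizes to any rank position by indexing `r[k]` instead of `r[0]`.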



[Chart: Rank the following techniques for incident discovery from most (1) to least (5) frequent: monitoring/alerting (ranked first by 68.3%), hunting, user reported, third party/external notification (e.g., law enforcement), and other.]

Figure 2. Ranked Incident Discovery (Q3.31, n = 327)

We asked about SOAR use, because it is now a technology fixture in SOCs. (Next year, we’ll
introduce a similar line of questioning for use of AI/ML.) It seems that people are taking
SOAR as an ongoing change and adjustment project (SOAR as a work style), and they most
commonly (Q3.33, 16/46, 34.8%) allow the users/analysts of the SOAR to tune as they go.
Important to note, the only respondents who answered the question on how they tune the
SOAR were those who indicated (Q3.32, 48/398, 12.1%) that SOAR was their primary method
for event data correlation and analysis. See Figure 3.

How do you approach your SOAR update and tuning needs? Select the best option.
- We set up the SOAR initially and have little change since initial deployment. 17.4%
- We infrequently change the SOAR workflows. 15.2%
- We frequently change the SOAR workflows; the analysts/SOAR users implement changes. 34.8%
- We frequently change the SOAR workflows; we have dedicated staff to implement changes who aren’t typically using the SOAR as an analyst. 26.1%
- Unknown 4.3%
- Other 2.2%

Figure 3. SOAR Updating and Tuning (Q3.33, n = 46)

We included a number of questions on what SOC managers focus on when hiring; look at the
SOC staff section for more details.

Key Findings
“Do more with less” is a familiar clarion call, trite but honest. There
are only limited resources in the organization, and SOC managers
who can show connections from increased investment in the SOC to
improvements in business-relevant metrics are in the best position to
benefit from that increased spending on cybersecurity. Nonetheless, in
the past 10 years, cybersecurity budgets have increased substantially.



There is still no universal equation for budgeting for an adequate SOC. Deriving estimates
from overall IT spend is a common practice in trade literature, but it carries the caveat
that it’s no way to set a budget.1 Figure 4 shows how respondents allocate budgets.

The most common (Q3.69, 126/300, 42%) answer was that SOC management prepares budget
input, and then higher-level decision making allocates funding. That seems rosy compared
to those (39/300, 13%) who stated that budget decision makers pay little attention to the
SOC management’s recommendations.

The budget shows minimal correlation to organization sector and size. We’ll explore this
more in a later section. Most people want to see the money, so see Figure 5 for the
reported annual budgets. Important to note, the most common answer was: Unknown! (Q3.68,
68/307, 22.1%).

How is funding allocated in your organization? Select the best option.
- Management and SOC leads/managers work together closely to decide how to allocate funds for cybersecurity. 24.3%
- Management takes recommendations from SOC leads/managers, but ultimately decides how to allocate funds, sometimes against SOC management’s recommendations. 42.0%
- Management takes recommendations from SOC leads/managers, but frequently goes against SOC management’s recommendations. 18.7%
- Management pays little heed to the recommendations of SOC managers and allocates the cybersecurity budget as they see fit. 13.0%
- Other 2.0%

Figure 4. How Funding Is Allocated (Q3.69, n = 300)

Metrics are regularly used in SOCs; only a small portion (Q3.47, 39/349, 11.2%) said “No,”
they don’t provide metrics. Of those 11%, we wonder how this is possible, and come to the
depressing conclusion that the audience who would be receiving the metrics likely doesn’t
care. Interesting to note, when industry vertical is cross-referenced, government (Q2.2,
n = 69/641) is the top industry that said “No” metrics (Q2.2 × Q3.47, n = 9/41).

The respondents who do use SOC metrics were generally satisfied with their effectiveness.
Only 23% (Q3.48, 28/124, 22.6%) expressed being “not satisfied” with current metrics. A
later section will discuss metrics in more detail.

To understand how SOCs can use metrics to move to better performance, we asked if they
have a method for calculating the value the SOC provides. This is a tricky calculation,
because it expresses the value of something not occurring. It’s no surprise that people
are trying, but there isn’t clarity or consistency in doing this, partially because it
isn’t easy. As a result, more than half (Q3.55, 184/327, 56.3%) of the responses indicated
people aren’t trying to calculate this.

What is your estimated annual budget for new hardware, software licensing and support,
human capital, and any additional costs?
- Unknown 22.1%
- Less than $100,000 USD 9.8%
- $100,001–$250,000 USD 11.1%
- $250,001–$500,000 USD 13.4%
- $500,001–$750,000 USD 11.1%
- $750,001–$1,000,000 USD 8.5%
- $1 million–$2 million USD 8.8%
- $2 million–$4 million USD 6.2%
- $4 million–$8 million USD 3.9%
- $8 million–$16 million USD 2.6%
- $16 million–$48 million USD 1.3%
- Greater than $48 million USD 1.3%

Figure 5. Estimated Annual Budget (Q3.68, n = 307)

1 https://www.gartner.com/newsroom/id/3539117



For those (Q3.55, 83/327, 25.4%) who said they are, we asked what the result was. Most
(Q3.56, 64/76 and 55/77) of the responses indicate there was a 50% or less reduction in
handling and incident impact cost. Reducing time to detect/resolve/restore is the
second-most-common metric in use by SOCs, and that correlates directly to reducing the
overall cost of an incident and demonstrating SOC value and business-relevant progress.
See Figure 6.

To facilitate the estimate of reduction we’re asking about in Q3.56, there’s typically a
value assigned to assets: when those assets are challenged by an actual attacker, the SOC
gets to claim a reduction due to its preparation (reduced handling costs) and ability to
intervene (reduced incident costs) to minimize damage.

For the estimated incident with SOC vs. incident without SOC, select the best match for
estimated relative handling and loss costs on a per-incident basis. (Handling Cost /
Incident Cost)
- N/A or unknown reduction: 5.3% / 6.5%
- 10% reduction: 35.5% / 35.1%
- 50% reduction: 43.4% / 29.9%
- 90% reduction: 10.5% / 18.2%
- Multi-fold (2x or more) reduction: 2.6% / 7.8%
- Actually, cost is higher with the SOC than without it: 2.6% / 2.6%

Figure 6. Estimated Handling and Incident Cost Reduction (Q3.56, n = 76, 77)

So, what’s the basis of the value claimed? We asked respondents if they have a cost per
record. Most (Q3.53, 160/323, 49.5%) said no, but there’s a high (Q3.53, 63/323, 19.5%)
percentage who don’t know whether they have a cost per record or not. See Figure 7. We’ll
delve deeper into these record types and costs later in this paper.

Have you calculated a “cost per record” from an actual incident?
- Yes 31.0%
- No 49.5%
- Unknown 19.5%

Figure 7. Cost per Record Based on Incident Data (Q3.53, n = 323)

It takes qualified people to run a SOC. This has been a consistently reported aspect for
the past six years of the survey. Again this year, we asked many survey questions related
to staff and appropriate qualifications. But the most common question encountered in the
authors’ experience related to SOC staff is “How many are required?” This is typically in
the form of something like, “If [company] in [industry] has [number of employees], then
how many people are needed to staff the SOC?” The cynical author has started simply
answering, “around 25,” because in the survey data, the most common (Q3.58, 83/335, 24.8%)
SOC size is between 11 and 25 staff. See Figure 8.

What is the total internal staffing level (i.e., all related positions) for your SOC,
expressed in terms of full-time equivalents (FTEs)? What is the number of FTEs specifically
assigned to the management of your SOC systems, not just to analysis of the data from your
SOC systems?

[Bar chart comparing Total staffing and staffing Specific to SOC Systems Management across
the bins <1 (part-time), 1, 2–10, 11–25, 26–100, 101–1000, >1000, and Unknown. The most
common Total answer is 11–25 (24.8%).]

Figure 8. Total Internal Staff Levels (Q3.58, n = 335)



If we only look at large (50,000 FTE or greater) organizations, the most commonly reported
size is 26–100 SOC staff (Q3.58 v Org Size, n = 8/26), followed closely by 11–25 (Q3.58 v
Org Size, n = 7/26) SOC staff. See Figure 9.

[Bar chart: Team Size vs. Organization Size, showing respondent counts for organizations
of 50,001–100,000 and more than 100,000 FTEs across the staff-size bins <1 (part-time), 1,
2–10, 11–25, 26–100, 101–1000, >1000, and Unknown.]

Figure 9. Number of SOC Staff by Organization Size

Key Challenges
Now that we’ve identified some key elements within the results, let’s address the key
challenge identified by respondents in 2023. “Lack of context related to what we are
seeing” was the most popular (Q3.79, 50/313, 16%) response. See Figure 10. It’s worth
noting that “Lack of context” was almost at the bottom of the list in last year’s survey.
(See Figure 4 in the 2022 survey if you want to compare.2)

There are other challenges at roughly the same numbers. Lack of automation and
orchestration (Q3.79, 40/313), lack of enterprise-wide visibility (Q3.79, 43/313), and
lack of skilled staff (Q3.79, 44/313) are also top challenges.

What is the greatest challenge (barrier) with regard to full utilization of your SOC
capabilities by the entire organization? Select the best option.
- Lack of context related to what we are seeing 16.0%
- Lack of skilled staff 14.1%
- Lack of enterprise-wide visibility 13.7%
- Lack of automation and orchestration 12.8%
- High staffing requirements 11.2%
- Lack of management support 8.3%
- Lack of processes or playbooks 7.3%
- Silo mentality between security, IR, and operations 5.4%
- Too many tools that are not integrated 4.8%
- Too many alerts that we can’t look into (lack of correlation between alerts) 2.9%
- Other 2.6%
- Regulatory or legal requirements 1.0%

Figure 10. Key Challenges (Q3.79, n = 313)

Skilled security staff is needed to solve that “lack of context” problem, as well as to
make the “SOAR work style” effective, but experienced security analysts are still hard to
find. Lack of management support has decreased as an obstacle, indicating funding is
available to make progress on increasing context/visibility and automation.

Key challenges are illustrated in a word cloud in Figure 11.

Figure 11. Key Challenges Word Cloud (Q3.80, n = 138)

2 www.sans.org/white-papers/sans-2022-soc-survey



Expanded Content
Things get curiouser and curiouser as we delve deeper. The full set of responses and an
accompanying Jupyter notebook in Python to assist with performing analysis is available
from https://soc-survey.com if you choose to do some of your own analysis.

Design/Development/Implementation
“Do you actually run a (cyber)security operations center (SOC)?” is a reasonable question
to start from as we look into design, development, and implementation topics. Or, put
another way, “How does the survey define a SOC?” The way this survey characterizes a SOC
is broad: it is a cybersecurity operations center (SOC) if an operational team has an
authorized and funded ongoing mission to centralize cybersecurity activity.

A more detailed approach to assessing “SOC or NOT?” is reviewing the capabilities the team
and outsourced partners perform. This reveals a continuum from basic SOC to learning SOC
with almost all capabilities. As has been consistent across years in the SOC Survey, the
respondents largely agree that the capabilities we inquire about are done. Figure 12 shows
a list of capabilities sorted on whether they’re done, regardless of whether they’re done
internally, outsourced, or both.

Capabilities (Outsourced / Both / In-house)
- Alerting (triage and escalation): 93 / 127 / 274
- Compliance support: 100 / 78 / 312
- Data protection: 85 / 78 / 326
- Digital forensics: 134 / 111 / 242
- Incident response: 95 / 127 / 271
- Pen-testing: 206 / 109 / 166
- Purple-teaming: 151 / 100 / 207
- Red-teaming: 194 / 85 / 189
- Remediation: 84 / 96 / 298
- Security administration: 86 / 59 / 339
- Security architecture and engineering (of systems in your environment): 86 / 72 / 327
- Security monitoring and detection: 98 / 131 / 259
- Security road map and planning: 63 / 90 / 328
- SOC architecture and engineering (specific to the systems running your SOC): 96 / 100 / 292
- SOC maturity self-assessment: 121 / 102 / 260
- Security tool configuration, integration, and deployment: 77 / 115 / 291
- Threat hunting: 110 / 117 / 256
- Threat research: 109 / 125 / 241
- Vulnerability assessments: 84 / 120 / 284
- Other: 67 / 47 / 54

Figure 12. Capabilities Performed Sorted by Total (Q3.13, n = 545)



The survey doesn’t explore why people outsource, although we’ll speculate on this in later
sections on staffing and funding. But what we do know is that SOCs outsource consistently.
Taking the same data from Figure 12, we see that the more commonly outsourced activities
are variations on forensics, threat intel, and penetration testing, which are less
commonly done activities for SOCs (see Figure 13), as the lower ranks of activity
performed are at the highest ranks of outsourcing.

Activities that are more likely to be outsourced are also slightly less likely to be done
overall. This could be due to lack of budget, meaning those specialized items are simply
not done at all. Or it could reflect the sentiment that forensics and pen testing are not
a requirement and, hence, not done.

The authors see the categories more likely to be outsourced (forensics, pen-testing, and
threat intelligence) as specializations that require substantial training and experience
but aren’t used consistently within most SOCs. Of course, looking at alerts, triaging
them, and performing handling requires knowledge, skills, and abilities (KSAs). However,
these categories are broader in scope of KSAs, so it is more difficult to identify an
outsource partner where a transactional basis assures a value proposition over leveraging
more flexible internal staff.

Hunting and threat intel are important capabilities of the SOC, in the opinion of the
authors. The SANS Institute has analyst papers on these areas, but we included similar
questions here to see what the SOC respondents had to say about their operational
performance of these activities.

Outsource Rank vs. Activity Rank (Outsource Rank / Total Rank)
- Pen-testing: 1 / 15
- Red-teaming: 2 / 18
- Purple-teaming: 3 / 19
- Digital forensics: 4 / 8
- Threat research: 5 / 17
- Security monitoring and detection: 6 / 7
- Threat hunting: 7 / 13
- SOC maturity self-assessment: 8 / 12
- Incident response: 9 / 2
- Alerting (triage and escalation): 10 / 1
- Vulnerability assessments: 11 / 6
- SOC architecture and engineering (specific to the systems running your SOC): 12 / 5
- Security tool configuration, integration, and deployment: 13 / 11
- Remediation: 14 / 16
- Compliance support: 15 / 3
- Data protection: 16 / 4
- Security architecture and engineering (of systems in your environment): 17 / 9
- Security road map and planning: 18 / 14
- Security administration: 19 / 10
- Other: 20 / 20

Figure 13. Capabilities Performed, Ranked and Sorted by Outsource Rank (Q3.13, n = 545)
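The outsource and total ranks in Figure 13 can be recomputed directly from the Figure 12 counts. A minimal sketch of that ranking, using a few of the rows above (stdlib only; the full data set is available at https://soc-survey.com):

```python
# Capability -> (outsourced, both, in-house) counts, taken from Figure 12.
counts = {
    "Pen-testing": (206, 109, 166),
    "Digital forensics": (134, 111, 242),
    "Alerting (triage and escalation)": (93, 127, 274),
    "Security administration": (86, 59, 339),
}

def rank_by(key):
    """Return {capability: rank}, where rank 1 has the largest key value."""
    ordered = sorted(counts, key=key, reverse=True)
    return {cap: i + 1 for i, cap in enumerate(ordered)}

# Outsource rank: sort by the outsourced-only count.
outsource_rank = rank_by(lambda c: counts[c][0])
# Total rank: sort by the sum of all three response columns.
total_rank = rank_by(lambda c: sum(counts[c]))

print(outsource_rank["Pen-testing"])                   # 1: most outsourced
print(total_rank["Alerting (triage and escalation)"])  # 1: highest total
```

Within this subset, pen-testing is the most outsourced while alerting has the highest total, mirroring the inverse relationship the chart describes.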



First, we consider threat hunting to be the investigation of available data, presuming that
other alerting-based mechanisms have failed. We do not consider looking in logs for
something specifically known to be malicious (such as a known malicious domain name) to be
hunting. Others might include a malicious domain search within the grouping of threat
hunting.3 Although valuable, we define this as historical/retroactive analysis. Threat
hunting entails a distinctive attribute: we don’t have a specific value to match. We’re
looking for outliers and new indicators by identifying objectionable behavior or patterns.
These refined semantic distinctions ultimately have limited value, because we need to
perform both of these approaches to identify problems. The distinction is drawn to implore
SOCs to do the more difficult form of hunting (looking for the unknown) in addition to
historical analysis.
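The distinction can be made concrete in code. A minimal sketch with toy log data and a deliberately naive rarity heuristic (not the survey's definition of either activity): a historical search matches a known-bad value, while hunting surfaces values that are merely unusual.

```python
from collections import Counter

logs = ["intranet.local", "intranet.local", "cdn.example.net",
        "cdn.example.net", "evil.example.org", "x7f2q.example.org"]

# Historical/retroactive analysis: match a specific known-bad value.
known_bad = {"evil.example.org"}
historical_hits = [d for d in logs if d in known_bad]

# Hunting: no value to match; surface outliers instead (here, domains
# seen only once -- a toy stand-in for real behavioral analysis).
freq = Counter(logs)
outliers = [d for d, n in freq.items() if n == 1]

print(historical_hits)  # ['evil.example.org']
print(outliers)         # ['evil.example.org', 'x7f2q.example.org']
```

Note that the hunt flags the known-bad domain too, but also the never-before-seen one that no indicator list would have caught; that gap is the point of hunting.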

Of course, we wondered how our respondents’ SOCs performed threat hunting, so we asked
them. Figure 14 depicts partial automation as the most common answer across all of the
categories we inquired about: historical, analyst-driven, and technology-driven hunting.
Manual isn’t very far behind in all categories.

Where responses don’t make sense in the survey, we think it is appropriate to address them
with some potential explanation. In this case, the 48 responses (Q3.15, 48/491) indicating
that analyst-driven threat hunting is fully automated are baffling to the authors. Bots;
human respondents without a clue, comprehension, care, or consideration; considering
“fully automated” to mean a query or script created by a person as analyst-driven threat
hunting; and a linguistic misunderstanding of the question options are the hypotheses
we’ve articulated but not assessed for this oddity. See Figure 14.

Is your threat hunting primarily historical, analyst-driven, or technology-driven?
(Manual / Partly Automated / Fully Automated)
- Historical (We search for new indicators in historical data only.): 35.4% / 44.2% / 10.0%
- Analyst-driven (The hunting analyst formulates hypothesis and determines most effective approach to find it in the data.): 39.5% / 41.3% / 9.8%
- Technology (Technologies specifically intended for hunting are deployed, no analyst interaction required.): 31.2% / 40.5% / 15.7%

Figure 14. Threat Hunting Categories (Q3.15, n = 491)

Threat intelligence is the study of adversaries with the intention of optimizing the use
of scarce resources. We use it to improve our defensive posture, identification
capability, and post-detection response readiness and capability.

There’s an abundance of threat intel data out there. So, we asked how people use it in an
actionable way in a SOC. We separated feed consumption, production of threat intelligence,
and attribution threat intel categories for the question (Q3.18, n = 309). Figure 15 shows
that internal-only (in-house: 127, 132, 113) activities outnumber either purely outsourced
or mixed internal/outsourced when taken distinctly. But adding outsourced and both results
in a higher aggregate number in all categories (out+both: 155, 141, 153). So, we assess
that more people outsource threat intel entirely or partially (out+both) than do it
entirely on their own (in). From the same numbers, it’s clear that more people do it
internally in some respect (in+both: 208, 201, 187) than do it externally in some respect
(out+both). See Figure 15.

How would you characterize your threat intelligence activities in your SOC? Select all
that apply. (In-house / Outsourced / Both)
- Feed consumption: 41.1% / 23.9% / 26.2%
- Production: 42.7% / 23.3% / 22.3%
- Attribution: 36.6% / 25.6% / 23.9%

Figure 15. Threat Intel Activities (Q3.18, n = 309)
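The in/out/both arithmetic above is easy to reproduce. A minimal sketch using the feed-consumption category from Figure 15 (the 74 and 81 are derived from 23.9% and 26.2% of n = 309; the 127 and the 155/208 aggregates are stated in the text):

```python
# Respondent counts for "feed consumption" (Q3.18, n = 309).
in_house, outsourced, both = 127, 74, 81

# "Select all that apply" aggregation: a respondent who chose "both"
# counts toward outsourcing AND toward internal activity.
out_any = outsourced + both  # outsourced in whole or in part
in_any = in_house + both     # internal in some respect

print(out_any)  # 155: more than the 127 who keep it entirely in-house
print(in_any)   # 208
```

The same two sums, applied to the production and attribution columns, yield the (155, 141, 153) and (208, 201, 187) aggregates quoted above.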

3 www.sans.org/security-resources/glossary-of-terms



SOC Architecture
We consider something a SOC based on mission and capabilities, not architecture. But the
architecture of SOCs is still worth exploration. In our survey questions, we consider the
physical locations, staffing arrangements, and what is protected to be part of the
architecture.

Centralized, all-in-one-physical-location SOCs might still allow work from home, for
example. But the physical work location where the staff “sit” is still one geographic
region. The other aspect of this centralized and distributed notion is where the data used
by analysts to view alerts resides. The centralization of all data into a SIEM from cloud
resources doesn’t always make sense from a value proposition. So, where the people are and
where the data is are not necessarily the same. Related, some jurisdictions and industry
verticals prefer (or are legally obligated) to keep data within the country or within
organizationally owned systems. This makes architecting the SOC systems complicated.
What’s more, SOC staff may have strong opinions on working together as a team, meaning
being together in one place. If scarcely available staff insist on a specific arrangement,
it is likely to manifest in the SOC architecture.

Select the option that best reflects the size and structure of your SOC environment.
- Single, central SOC 48.7%
- Multiple, hierarchical SOCs 19.9%
- Multiple, unorganized SOCs 13.6%
- Multiple, standalone/siloed SOCs 8.3%
- Multiple, parallel, redundant SOCs looking at same data 5.9%
- Other 3.6%

Figure 16. Structure of SOC (Q3.8, n = 557)
In Figure 16 we see the continuation (in this survey’s history) of the single, central SOC
dominating (Q3.8, 271/557, 48.7%) the responses.

In Figure 17 we see the same signaling we’ve seen for the past three years. “Cloud-based
services” is projected to be the architecture next year (Q3.9, 130/527). But, based on the
percentage of “current” in 2021 (12.9%), 2022 (15.2%), and now in 2023 (19.8%), we’re
seeing only a modest change represented in the responses in the survey. We don’t track
individual responses from year to year, so we don’t know if people are saying they will
change but not doing it, or if the respondent composition year after year has the same
forward-looking thought but doesn’t achieve the change.

How is your SOC infrastructure (i.e., your SOC architecture) deployed today, and how might
it change over the next 12 months? Select the best choice for each. If you select the same
answer for Present and Future, SANS will assume no change. (Current / Next 12 Months)
- Cloud-based SOC services: 19.8% / 24.7%
- Partial SOCs in regional locations: 9.2% / 5.9%
- Full SOCs distributed regionally: 11.0% / 12.9%
- Centralized and distributed regionally: 19.0% / 18.8%
- Centralized into a single SOC: 32.1% / 32.6%
- Informal SOC, no defined architecture: 7.0% / 3.2%
- Other: 2.0% / 1.9%

Figure 17. SOC Infrastructure (Q3.9, n = 546, 527)



Another architectural attribute is whether the SOC operates 24 hours a day, every day of
the year. Overwhelmingly, the answer is yes, with only 18% (Q3.23, 80/434, 18.4%)
indicating they do not run 24 hours a day, as shown in Figure 18. The architectural
decision of running non-stop drives quite a bit of outsourcing, with 49% of the overall
answers (Q3.23, 213/434, 49.0%) and 61% of the yes answers (Q3.23, 213/349, 61%)
indicating outsourcing was used in whole or in part to accomplish non-stop operations.

Does your SOC operate 24/7?
- Yes, in-house only 31.3%
- Yes, outsourced only 24.2%
- Yes, mixed internal/outsourced 24.9%
- No 18.4%
- Unknown 1.2%

Figure 18. SOC 24/7 Operations (Q3.23, n = 434)
We’ll describe composition of staff and staff roles in a moment. First, keeping with the
architectural focus, we consider remote work for SOC staff as an architectural attribute.
When we dive into the staff section, we will describe what factors enable individual
employees to work remotely. Almost three-quarters of respondents (73%) say staff are
allowed to work remotely (Q3.24, 318/435, 73.1%). See Figure 19.

Do you allow SOC staff analysts to work remotely?
- Yes 73.1%
- No 19.3%
- Unknown 7.6%

Figure 19. SOC Staff Remote Work (Q3.24, n = 435)

Necessarily, some of the respondents who said they work in a centralized SOC also
responded that the SOC allows remote work. So, we delved into this set. Of the respondents
(Q3.8, n = 271) who say they have a single central SOC, 58% (Q3.24, 157/271, 57.9%)
indicate that remote work is allowed. See Figure 20.

The structure (Q3.8) of the SOC doesn’t appear to have a substantial influence on whether
SOC staff can work from home. See Figure 20, which depicts SOC structure and whether staff
are allowed to work from home.
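Cross-tabulations like the structure-versus-remote-work breakdown in Figure 20 reduce to counting paired responses. A minimal sketch with toy respondent records, not the survey data (the raw data set at https://soc-survey.com uses its own column names):

```python
from collections import Counter

# Toy (structure, remote-work) response pairs.
responses = [
    ("Single, central SOC", "Yes"),
    ("Single, central SOC", "Yes"),
    ("Single, central SOC", "No"),
    ("Multiple, hierarchical SOCs", "Yes"),
    ("Multiple, hierarchical SOCs", "Unknown"),
]

# Cross-tabulate: count each (structure, remote) combination.
crosstab = Counter(responses)

central_yes = crosstab[("Single, central SOC", "Yes")]
central_total = sum(n for (s, _), n in crosstab.items()
                    if s == "Single, central SOC")
print(central_yes, central_total)  # 2 3
```

Dividing `central_yes` by `central_total` gives the within-structure share reported in the text (57.9% in the real data).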

SOC Staff
How many staff are there currently in the SOC, and is that the right number? This is an
important question with multiple attributes to explore.

Each SOC could probably do more or operate at higher quality with more qualified people.
The SOC is a space where adding people risks detracting from performance despite added
expense. Most SOCs struggle to effectively incorporate new or junior staff. They can only
tolerate onboarding a small volume of staff who need substantial on-the-job training to
develop the required knowledge, skills, and abilities. Why? It is the opinion of the
authors that:

1. SOCs aren’t designed, built, or operated to address the human capital cycles that
actually occur; and

2. SOCs are chronically understaffed to the degree that tasking those busy people to help
address the shortcoming is essentially loading one more new skillset (training others)
onto the SOC staff to try to master.

Do you allow SOC staff analysts to work remotely? (Yes / No / Unknown)
- Single, central SOC: 157 / 33 / 3
- Multiple, hierarchical SOCs: 81 / 21 / 6
- Multiple, unorganized SOCs: 54 / 17 / 14
- Multiple, standalone/siloed SOCs: 25 / 14 / 7
- Multiple, parallel, redundant SOCs looking at same data: 17 / 11 / 1
- Other: 8 / 0 / 2

Figure 20. Remote Work by SOC Environment Structure (Q3.8 vs Q3.24, n = 435)



In the key findings section, we described the typical size of a SOC. Let’s look in more
detail at the job roles of those staff. We asked about specific roles, such as monitoring
analysts, systems administrators, and incident handlers. Then, we asked how many of these
were on staff. Figure 21 doesn’t account for the overall size of the team, just the
reported numbers for each role. Some of these values are surprising; seeing someone report
there are more than 1,000 threat intel analysts, for example.

To your best estimate, how many of the following positions do you have on staff?
(<1 / 1 / 2–10 / 11–25 / 26–100 / 101–1000 / >1000 / Unknown)
- Junior analysts/interns: 9.8% / 11.9% / 36.5% / 12.5% / 8.9% / 2.7% / 2.7% / 3.6%
- General purpose analysts (variable responsibility, do monitoring, IR, forensics, threat intel, etc.): 5.3% / 12.2% / 41.2% / 14.8% / 7.7% / 1.8% / 2.1% / 3.6%
- Dedicated monitoring analysts: 11.9% / 10.4% / 31.8% / 15.1% / 8.3% / 3.0% / 0.9% / 4.7%
- Dedicated incident responders: 13.4% / 13.4% / 27.6% / 13.6% / 5.6% / 5.6% / 1.5% / 5.3%
- Dedicated threat intelligence analysts: 17.5% / 11.3% / 31.8% / 6.8% / 8.3% / 4.7% / 0.9% / 4.7%
- IT support staff (sysadmins, network techs, database admins): 5.6% / 10.4% / 24.9% / 16.6% / 14.5% / 7.7% / 2.7% / 5.0%

Figure 21. Staff Role Count (Q3.59, n = 337)



Of course, SOCs want to understand what the right number of analysts is. What appears to
be happening is that people are calculating workload based on actual time worked on
tickets. Responses indicate that the existing workload calculation is used for staff count
justification (see Figure 22). More maturity in SOAR and overcoming the cited obstacles in
visibility and context are needed to see a reduction in average analyst time per incident
and a reduction of headcount need.

Select the best description of how you calculate per-analyst workload.
- We base it on the ticketing data when an analyst starts and closes a ticket. 44.1%
- We use SIEM data to calculate how many alerts are present and indicate how much time an analyst has to work each ticket. 28.6%
- Our service level agreements dictate how quickly we must review content, and we allocate that much time per analyst per shift to make a decision. 19.6%
- Other 7.7%

Figure 22. Per-Analyst Workload (Q3.52, n = 311)
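The most common approach, deriving per-analyst workload from ticket open/close timestamps, can be sketched as follows. Toy tickets and a hypothetical helper, not any respondent's actual method:

```python
from datetime import datetime

# Toy tickets: (analyst, opened, closed) timestamps.
tickets = [
    ("ana", datetime(2023, 6, 1, 9, 0), datetime(2023, 6, 1, 10, 30)),
    ("ana", datetime(2023, 6, 1, 11, 0), datetime(2023, 6, 1, 11, 45)),
    ("bob", datetime(2023, 6, 1, 9, 15), datetime(2023, 6, 1, 12, 15)),
]

def hours_per_analyst(tickets):
    """Sum ticket handling time (in hours) per analyst."""
    totals = {}
    for analyst, opened, closed in tickets:
        hours = (closed - opened).total_seconds() / 3600
        totals[analyst] = totals.get(analyst, 0.0) + hours
    return totals

print(hours_per_analyst(tickets))  # {'ana': 2.25, 'bob': 3.0}
```

Dividing each total by shift length gives a utilization figure that can back a staffing justification, which is how respondents report using this calculation.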
We hear a lot about staff hiring issues and staff turnover. So, we asked questions around
what the average duration of employment is. As shown in Figure 23, three years or fewer is
most common. Further, fewer than or equal to five years (Q3.60, 227/311, 73.0%) is the
reported average tenure for about three-quarters of respondents. This is in line with
overall IT turnover and should be expected.4

What is the average employment duration for an employee in your SOC environment (how
quickly does staff turn over)?
- 1 year or less 5.5%
- 1–3 years 37.6%
- 3–5 years 29.9%
- 5–10 years 14.8%
- 10+ years 9.3%
- Unknown 2.9%

Figure 23. Average Employment Duration (Q3.60, n = 311)

4
www.inc.com/business-insider/tech-companies-employee-turnover-average-tenure-silicon-valley.html
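The bucket percentages read off Figure 23 can be cross-checked against the 227/311 = 73.0% figure cited in the text; the bucket-to-value mapping below is our reconstruction from the chart, and that consistency check is what supports it.

```python
# Percentages reconstructed from Figure 23 (Q3.60, n = 311).
tenure = {
    "1 year or less": 5.5,
    "1-3 years": 37.6,
    "3-5 years": 29.9,
    "5-10 years": 14.8,
    "10+ years": 9.3,
    "Unknown": 2.9,
}

# Share of respondents reporting five years or fewer, which should match
# the reported 227/311 = 73.0%.
five_or_less = (
    tenure["1 year or less"] + tenure["1-3 years"] + tenure["3-5 years"]
)
```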



We delved further into hiring this year than in previous surveys. We asked
what hiring managers are looking for across several attributes, including
technical skills (see Figure 24) and non-technical skills (see Figure 25 on
the next page). The top answers were "Information Systems and Network
Security" and "Risk Management," respectively. These responses can be
utilized by hiring managers and prospective candidates alike.

What is the one most important technical skill deficit when you hire staff
for technical roles?

Information Systems and Network Security  10.7%
Threat Analysis  7.5%
Data Analysis  6.5%
Intelligence Analysis  6.2%
Data Security  5.2%
Modeling and Simulation  5.2%
Technology Fluency  4.5%
Digital Forensics  4.2%
Information Technology Assessment  4.2%
Mathematical Reasoning  4.2%
Incident Management  3.9%
Computer Languages  3.2%
Network Management  3.2%
Software Testing and Evaluation  3.2%
Infrastructure Design  2.9%
System Administration  2.6%
Database Administration  2.3%
Operating Systems  2.3%
Enterprise Architecture  1.9%
Identity Management  1.9%
Operations Support  1.9%
Requirements Analysis  1.9%
Other  1.9%
Encryption  1.3%
Systems Integration  1.3%
Vulnerabilities Assessment  1.3%
Collection Operations  1.0%
Systems Testing and Evaluation  1.0%
Target Development  1.0%
Software Development  0.6%
Physical Device Security  0.3%
Telecommunications  0.3%

Figure 24. Technical Skill Focus for Hiring (Q3.64, n = 308)



The takeaway is that SOC managers are looking for "T-shaped" analysts with
deep skills in one or more technical areas augmented with broad
communications, risk management, and business knowledge. Because such
analysts are in high demand, job satisfaction is key to reducing turnover.

What is the single most important non-technical skill deficit you are trying
to address when you hire staff for organizational roles?

Risk Management  13.1%
Business Acumen  12.1%
Knowledge Management  8.8%
Project Management  7.5%
Process Control  6.9%
Business Continuity  6.5%
Contracting and Procurement  5.9%
Data Management  5.9%
Organizational Awareness  5.9%
Asset and Inventory Management  5.6%
Education and Training Delivery  5.6%
Education and Training Curriculum Development  4.6%
Strategic Relationship Management  3.3%
Other  3.3%
Law, Policy, and Ethics  2.6%
Data Privacy  2.0%
Supply Chain Management  0.7%

Figure 25. Non-Technical Skill Focus for Hiring (Q3.62, n = 306)

When the workload is too high (and probably also when it is too low),
analysts decide they don't want to be at the organization. There are plenty
of reasons a person leaves a job. So, we asked how to retain staff, which
might reduce hiring to only those new staff positions you're able to secure.
The most commonly cited retention method? "Career progression" (Q3.66,
93/314, 29.6%) topped the list. It seems people employed within the SOC see
this as a journey and are looking for personal development and meaning.
Meaningful work (Q3.66, 64/314, 20.4%) was the second most cited response,
barely eclipsing money. Money (Q3.66, 63/314, 20.1%) works, too. See
Figure 26.

Most effective retention method (Q3.66, n = 314): Career progression, 29.6%;
Meaningful work, 20.4%; Money, 20.1%; Training, 18.5%; Shifting roles and
responsibilities regularly, 9.9%; Other, 1.6%.

Figure 26. Employee Retention (Q3.66, n = 314)

After hiring and retaining staff, there's a lot to explore about how they
work. A current hot topic is the concept of working from home. It has long
been these authors' opinion that working from home is viable for all SOC
roles given appropriate data protections.



To illustrate the idea of remote work: monitoring a DOD classified network
could be done using remote workers, if the transport networks were
themselves protected at the appropriate level. Some people balk at this, but
would simultaneously accept that there are networks that provide video
conferencing, with people remotely discussing the content stored on the
computers under monitoring by the SOC. Of course, the cost of securing the
necessary transport and storage network of a SOC like this becomes onerous
at some point. So, organizations decide on boundaries where SOC staff must
be within a physical location that can be secured at the appropriate
protection level. Hence, most SOCs monitoring classified networks require
on-premises staff.

Most readers of this document don't need to meet the strict physical,
technology, and administrative requirements of a US DOD Secret (or higher)
network. So, they're left to decide whether the data can go to a SOC
analyst's home computer or not—usually without any rigorous standard of
quantitative assessment. We suggest using the aforementioned "value of a
record calculation" as a start to this effort, but regrettably can't provide
a simple equation to do this risk vs. expense calculation.

See Figure 27 describing the factors involved in allowing a SOC analyst to
work remotely. Tied for first (Q3.25, 125/289, 43.3%) were the role of the
SOC staff and whether platforms can securely support a remote workforce. We
presume that some SOCs deal with data that is considered "on premises only,"
and analysts supporting those SOC-monitored systems aren't allowed to work
remotely. There might also be a rationale that the data doesn't need to be
on premises per se, but remote access technology is not adequate for the
security sensitivity. Most SOCs would err on the side of caution within
these parameters.

What factors are considered in determining whether a SOC staff analyst can
work remotely? Select all that apply.

Role  43.3%
Platforms securely support remote workforce  43.3%
Skill set  40.5%
Individually negotiated  32.2%
Seniority  31.5%
Work ethics  27.3%
Other  10.7%

Figure 27. Remote Work Factors (Q3.25, n = 289)

The counterargument to working from home is the need to preserve work–life
balance and avoid expectations of constant availability. A certain way to
drive employees to a breaking point is to enable work from home, then foster
an environment in which the person is expected to be always available.

Making sure the work–life balance is appropriate for the long term and adding tooling
to relieve analysts of needless and frustrating tedium are likely to give them a sense of
career progression, wherever they work.



Technology

Our technology question section is very long and optional. Roughly half of
those who made it this far in the survey opted to skip this part; roughly
50% said yes to continuing (Q3.37, 188/380, 49.5%).

Those who persisted were subject to an extensive set of questions on how
much they like each technology and how completely it is deployed.

We first show the technology on a GPA-ranked basis. Figure 28 has the
GPA-ranked tech based on our respondents. Top of the list is host-based EXDR
(Q3.39, GPA: 2.89). Bottom is a tie between network-based packet analysis
and artificial intelligence/machine learning (Q3.39, GPA: 2.18). Note that
no GPA was above a C … unless we grade on a curve!

Next, we crosswalk the phases of deployment and respondents' satisfaction
with them. In the past two surveys, we've presented a similar picture, and
the correlation seems to hold that technology that reaches production has
higher satisfaction. Although we do not claim this as a causal relationship,
it speaks to the reality that the tech in use has a higher score when it is
fully deployed. See Figure 29 on the next page.

Technology GPA Ranking

tech-satisfy-host-exdr  2.88
tech-satisfy-net-email  2.86
tech-satisfy-log-endpoint-os  2.85
tech-satisfy-net-vpn  2.83
tech-satisfy-net-firewall-nextgen  2.83
tech-satisfy-host-mps  2.81
tech-satisfy-analysis-siem  2.79
tech-satisfy-log-mgmt  2.74
tech-satisfy-host-vulnremed  2.74
tech-satisfy-log-dns  2.72
tech-satisfy-net-nidps  2.71
tech-satisfy-host-ransom  2.70
tech-satisfy-net-webproxy  2.67
tech-satisfy-net-firewall-egress  2.67
tech-satisfy-host-conmon  2.67
tech-satisfy-net-ddos  2.66
tech-satisfy-net-firewall-ingress  2.64
tech-satisfy-log-endpoint-app  2.62
tech-satisfy-host-behavanalysis  2.62
tech-satisfy-net-dns  2.61
tech-satisfy-net-firewall-webapp  2.61
tech-satisfy-net-sslintercept  2.55
tech-satisfy-analysis-siem-custom  2.53
tech-satisfy-analysis-threatintel-platform  2.52
tech-satisfy-host-euba  2.51
tech-satisfy-analysis-threatintel-precursor  2.47
tech-satisfy-net-trafficmon  2.46
tech-satisfy-net-assetinventory  2.45
tech-satisfy-net-segment  2.44
tech-satisfy-analysis-threathunt  2.43
tech-satisfy-analysis-threatintel-opensource  2.42
tech-satisfy-net-nac  2.40
tech-satisfy-net-inlinemalware  2.37
tech-satisfy-analysis-soar  2.37
tech-satisfy-analysis-asm  2.37
tech-satisfy-host-dlp  2.37
tech-satisfy-analysis-ediscovery  2.36
tech-satisfy-other  2.35
tech-satisfy-net-netflow  2.34
tech-satisfy-analysis-risk  2.32
tech-satisfy-host-appwhitelist  2.27
tech-satisfy-net-deception  2.25
tech-satisfy-net-fullpcap  2.25
tech-satisfy-analysis-frequency-network  2.20
tech-satisfy-analysis-aiml  2.17
tech-satisfy-net-packetanalysis  2.15

Figure 28. GPA Ranked Technology (Q3.39, n = 194)



Technology A B C D F GPA Total
Host: Vulnerability remediation 38 61 41 11 6 2.7 157
Net: Network intrusion detection system (IDS)/intrusion prevention system (IPS) 40 54 42 15 4 2.7 155
Net: Next-generation firewall (NGF) 50 54 34 9 7 2.9 154
Host: Malware protection system (MPS) 44 56 40 10 5 2.8 155
Net: VPN (access protection and control) 53 51 34 14 5 2.8 157
Net: Email security (SWG and SEG) 54 49 36 10 6 2.9 155
Analysis: SIEM (security information and event manager) 47 49 41 15 3 2.8 155
Host: Endpoint or extended detection and response (EDR/XDR) 57 50 31 16 4 2.9 158
Log: Endpoint OS monitoring and logging 45 62 37 12 2 2.9 158
Log: Log management 43 58 36 13 6 2.8 156
Net: Ingress filtering 36 54 34 11 11 2.6 146
Host: Ransomware prevention 45 47 47 14 5 2.7 158
Net: Network segmentation 39 44 41 20 16 2.4 160
Net: Web application firewall (WAF) 38 57 34 16 11 2.6 156
Net: Web proxy 40 54 33 19 5 2.7 151
Net: DNS security/DNS firewall 34 57 39 12 9 2.6 151
Net: Network traffic monitoring 33 45 38 23 9 2.5 148
Log: DNS log monitoring 48 50 36 14 8 2.7 156
Net: DoS and DDoS protection 45 51 32 21 7 2.7 156
Analysis: Customized or tailored SIEM use-case monitoring 39 40 46 16 10 2.5 151
Log: Endpoint application log monitoring 44 51 33 19 11 2.6 158
Host: Continuous monitoring and assessment 41 52 33 18 6 2.7 150
Net: Asset discovery and inventory 35 43 46 13 14 2.5 151
Net: Egress filtering 35 59 37 13 6 2.7 150
Net: SSL/TLS traffic inspection 37 52 36 13 14 2.6 152
Analysis: Threat intelligence (open source, vendor provided) 29 43 46 19 10 2.4 147
Net: NetFlow analysis 29 43 38 22 14 2.3 146
Host: User behavior and entity monitoring 34 52 43 9 16 2.5 154
Net: Network Access Control (NAC) 31 41 50 13 14 2.4 149
Analysis: Attack surface management 27 48 42 16 14 2.4 147
Host: Behavioral analysis and detection 44 47 40 10 14 2.6 155
Host: Data loss prevention 27 45 53 11 17 2.4 153
Analysis: Threat hunting 32 48 41 16 14 2.5 151
Analysis: Threat intelligence platform (TIP) 34 47 38 17 10 2.5 146
Analysis: E-discovery (support legal requests for specific information collection) 32 41 39 20 15 2.4 147
Net: Malware detonation device (inline malware destruction) 32 45 40 11 20 2.4 148
Analysis: External threat intelligence (for online precursors) 29 47 39 23 6 2.5 144
Host: Application whitelisting 23 49 50 14 18 2.3 154
Net: Full packet capture 34 39 36 21 22 2.3 152
Net: Packet analysis (other than full PCAP) 23 47 36 16 25 2.2 147
Analysis: digital asset risk analysis and assessment 26 44 40 18 16 2.3 144
Analysis: SOAR (Security Orchestration, Automation, Response) 34 35 47 13 17 2.4 146
Analysis: AI or machine learning 25 38 43 23 19 2.2 148
Analysis: Frequency analysis for network connections 28 37 38 21 21 2.2 145
Net: Deception technologies such as honey potting 25 44 45 13 21 2.3 148
Other (Please specify) 15 13 14 5 10 2.3 57

Figure 29. GPA-Ranked Technology (Q3.39, n = 194)
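The GPA column in Figure 29 appears consistent with a standard 4.0-scale weighted average over the letter-grade counts (that formula is our assumption; the survey does not state it). A quick check against the first row:

```python
# Letter-grade counts for one Figure 29 row
# (Host: Vulnerability remediation: A=38, B=61, C=41, D=11, F=6).
grades = {"A": 38, "B": 61, "C": 41, "D": 11, "F": 6}

# Assumed standard 4.0-scale grade points.
points = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

total = sum(grades.values())
gpa = sum(points[g] * n for g, n in grades.items()) / total
```

Rounded to one decimal, this reproduces the 2.7 GPA and the 157 total printed in that row.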



Relatedly, staff are required to use the technology because, at least for
now, computers don't install or operate themselves or analyze the data they
contain without human oversight. The top two technologies desired by hiring
managers (by a substantial margin) were SIEM (Q3.65, 81/306, 26.5%) and
EDR/XDR (Q3.65, 83/306, 27.1%) products.

See Figure 30 for the big jump (more than double the next lower value) and
the ranking of the rest of the items.

In the long-form qualitative responses, SOC managers' most common need was
analysts with broad technical knowledge vs. individual product or technology
experience. The general feeling was that an analyst who understood both how
business process flows worked and how threats were likely to attack them
could quickly learn how to use and extend SIEM and EDR/XDR management
consoles and tools.

What are the three most important technologies/tools for your new hires to
be familiar with?

Host: Endpoint or extended detection and response (EDR/XDR)  27.1%
Analysis: SIEM (security information and event manager)  26.5%
Host: Vulnerability remediation  12.7%
Net: Network traffic monitoring  12.7%
Host: Behavioral analysis and detection  11.8%
Host: Ransomware prevention  10.5%
Analysis: Threat hunting  10.5%
Log: Endpoint OS monitoring and logging  9.8%
Net: Packet analysis (other than full PCAP)  9.5%
Host: Malware protection system (MPS)  9.2%
Log: Log management  9.2%
Net: Full packet capture  9.2%
Net: Web application firewall (WAF)  8.5%
Net: Network intrusion detection system (IDS)/intrusion prevention system (IPS)  8.2%
Host: Continuous monitoring and assessment  7.8%
Host: Data loss prevention  7.5%
Analysis: Risk analysis and assessment  7.5%
Net: VPN (access protection and control)  7.2%
Analysis: Customized or tailored SIEM use-case monitoring  7.2%
Analysis: SOAR (Security Orchestration, Automation, Response)  7.2%
Net: Asset discovery and inventory  6.5%
Net: Next-generation firewall (NGF)  6.5%
Net: DoS and DDoS protection  5.9%
Analysis: AI or machine learning  5.9%
Host: User behavior and entity monitoring  5.6%
Net: DNS security/DNS firewall  5.6%
Net: Email security (SWG and SEG)  4.2%
Log: Endpoint application log monitoring  3.6%
Net: Web proxy  3.3%
Analysis: External threat intelligence (for online precursors)  3.3%
Net: Egress filtering  2.9%
Net: Deception technologies such as honey potting  2.9%
Analysis: Threat intelligence (open source, vendor provided)  2.9%
Log: DNS log monitoring  2.6%
Net: Network segmentation  2.6%
Analysis: Frequency analysis for network connections  2.6%
Analysis: Threat intelligence platform (TIP)  2.6%
Host: Application whitelisting  2.3%
Net: Network Access Control (NAC)  2.0%
Net: NetFlow analysis  1.6%
Net: SSL/TLS traffic inspection  1.6%
Net: Ingress filtering  1.6%
Net: Malware detonation device (inline malware destruction)  1.0%
Other  0.7%
Analysis: E-discovery (support legal requests for specific information collection)  0.0%

Figure 30. Technology/Tools for New Hires (Q3.65, n = 306)



Evaluation

As mentioned in the key findings, only a small portion (Q3.47, 39/349,
11.2%) say "No," they don't provide metrics. Figure 31 shows this, plus that
reporting SOC-related metrics regularly to the board of directors and
organization executives, both within and outside of the cybersecurity
management hierarchy, is common, but not done in the majority (Q3.47,
128/349, 36.7%) of cases.

Does your SOC provide metrics that can be used in your reports and
dashboards to gauge the ongoing status and effectiveness of your SOC's
capabilities?

Yes, regularly to board of directors, and organization executives within
and outside of cybersecurity management hierarchy  36.7%
Yes, regularly to organization executives within and outside of
cybersecurity management hierarchy  31.5%
Yes, but not outside of cybersecurity management  16.0%
No  11.2%
Unknown  4.6%

Figure 31. Metrics Reported Audience (Q3.47, n = 346)

We asked what metrics are in use, and Figure 32 shows the answers sorted on
the value of "Used" for outsourced capabilities. In some cases, the idea of
these metrics being "enforced" seems untenable, but people answered that way
nonetheless. For example, enforcing a metric of "monetary cost per incident"
would mean that incident handling is terminated once a certain amount of
resource is expended. Perhaps this is what people are reporting they do. The
authors sincerely hope this is not the case.

For outsourced functions (or capabilities), what KPIs (key performance
indicators) and/or metrics do you request or receive from your MSSP for
tracking performance? Indicate whether these metrics are used to enforce
service level agreements (SLAs) and whether your SOC consistently meets the
service level represented by that metric. Indicate N/A those that are not
used.

(Figure 32 charts, for each metric below, the percentage of respondents
answering "Used," "Enforced," "Consistently," and "All Three": number of
incidents handled; thoroughness of eradication (no recurrence of original or
similar compromise); time from detection to containment to eradication;
incident occurrence due to known vs. unknown vulnerability; number of
incidents closed in one shift; avoidability of incident (could the incident
have been avoided with common security practices in place?); threat actor
attribution (using threat intelligence); time to discover all impacted
assets and users; losses accrued vs. losses prevented; downtime for workers
or duration of business outage per incident; monetary cost per incident;
thoroughness and accuracy of enterprise sweeping (check all information
systems for indicators of compromise); and other.)

Figure 32. MSSP Metrics/KPIs/SLAs Used, Enforced, and Consistently Met
(Q3.50, n = 256)
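Enforcing a metric such as "time from detection to containment to eradication" against an SLA reduces to a simple timestamp comparison. A minimal sketch follows; the incident record, field names, and the 24-hour threshold are all hypothetical, for illustration only.

```python
from datetime import datetime, timedelta

# Hypothetical incident timeline; timestamps and the 24-hour SLA threshold
# are illustrative assumptions, not values from the survey.
fmt = "%Y-%m-%d %H:%M"
incident = {
    "detected": datetime.strptime("2023-05-01 08:00", fmt),
    "contained": datetime.strptime("2023-05-01 13:30", fmt),
    "eradicated": datetime.strptime("2023-05-02 07:00", fmt),
}

sla = timedelta(hours=24)

# The SLA window runs from detection to eradication.
detect_to_eradicate = incident["eradicated"] - incident["detected"]
meets_sla = detect_to_eradicate <= sla
```

Aggregating `meets_sla` over all incidents in a reporting period would yield the "Consistently" figure an MSSP customer might track.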



Another aspect of metrics we asked about was cost per record, discussed
briefly in the key findings. Details are provided here for comparison to
your in-house calculations (see Figure 33). Based on popularity, the top
values for each type are:

• Internal user account: $1–$5 (Q3.54, 24/103)

• Customer account information: $1–$5 (Q3.54, 23/103)

• Credit card: $5–$10 (Q3.54, 22/103)

Estimated cost-per-record (Q3.54, n = 103), reported as internal user
account / customer account information / credit card / other:

Unknown: 13.6% / 14.6% / 16.5% / 14.6%
<$1: 14.6% / 7.8% / 7.8% / 4.9%
$1–$5: 23.3% / 22.3% / 19.4% / 10.7%
$5–$10: 20.4% / 18.4% / 21.4% / 7.8%
$10–$25: 5.8% / 11.7% / 8.7% / 4.9%
$25+: 6.8% / 8.7% / 9.7% / 5.8%

Figure 33. Cost per Record (Q3.54, n = 103)

The definition of "cost per record" varies and is hard to estimate; a high
percentage of respondents indicated they were not using cost per record.
Large incidents can be the most damaging but actually show the lowest cost
per record. Conversely, ransomware attacks can disrupt an entire business by
encrypting one key file with a small number of records, if any.

The SOC metric of time to detect/respond/restore represents the only part of
cost per record that the SOC actually owns. Having an accurate estimate of
that metric enables the SOC to support business needs for a cost-per-record
estimate.
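The percentages in Figure 33 follow directly from response counts over n = 103; for example, the $1–$5 bucket for internal user accounts:

```python
# Q3.54 reports 24 of 103 respondents choosing $1-$5 for
# internal user accounts; the chart shows this as a percentage.
n = 103
pct = round(24 / n * 100, 1)
```

The same division reproduces the other values cited in the text (23/103 and 22/103).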

Budget and Funding

How much does all this cost? Figure 34 shows the responses at varying budget
sizes. That the most popular answer is "Unknown" by a wide margin (Q3.68,
68/307, 22.1%) indicates that most SOC staff don't have accounting duties
for what this all costs. It's as though the topic of the SOC's expenses
isn't shared, or they don't ask about it.

What is your estimated annual budget for new hardware, software licensing
and support, human capital, and any additional costs?

Unknown  22.1%
Less than $100,000 USD  9.8%
$100,001–$250,000 USD  11.1%
$250,001–$500,000 USD  13.4%
$500,001–$750,000 USD  11.1%
$750,001–$1,000,000 USD  8.5%
$1 million–$2 million USD  8.8%
$2 million–$4 million USD  6.2%
$4 million–$8 million USD  3.9%
$8 million–$16 million USD  2.6%
$16 million–$48 million USD  1.3%
Greater than $48 million USD  1.3%

Figure 34. Estimated Annual Budget (Q3.68, n = 307)

Accountability, reasonable expense, frugality, and alignment come from an
understanding of the real cost of resources for the SOC. The SOC offers
potential loss prevention. SOC staff should understand that this cost of
doing business comes at the expense of other potential business
expenditures. It is necessary, and it should be based in financially sound
practices. Understanding this financial reality provides important
visibility into the business context for the SOC. If you flip back up to
Figure 10, you'll see that the top response (Q3.79, 50/313, 16.0%) is "Lack
of context related to what we are seeing." There's a disconnect here between
the business owners, the SOC cost and expenses, and the information systems
used by the business. There's no simple solution to this; it requires
diligence and ongoing effort to gain context for awareness and
understanding.



Returning to the notion of metrics, we asked what metrics were used to
justify budget for the SOC. Figure 35 identifies the responses. The number
of incidents handled is the most common response (Q3.75, 131/282).
Hopefully, this doesn't reinforce the bad behavior of showing management
that the SOC is needed by running up the count of incidents. The more likely
use of this metric is showing how many issues arise that need appropriate
detection and response, and possibly feeding a "cost per record" type of
justification.

Which of the following metrics do you use to justify funding? Select all
that apply.

Number of incidents handled  46.5%
Time from detection to containment to eradication  37.9%
Incident occurrence due to known vs. unknown vulnerability  33.0%
Avoidability of incident (could the incident have been avoided with common security practices in place?)  29.4%
Thoroughness of eradication (no recurrence of original or similar compromise)  28.0%
Time to discover all impacted assets and users  23.0%
Number of incidents closed in one shift  22.7%
Downtime for workers or duration of business outage per incident  20.6%
Losses accrued vs. losses prevented  19.1%
Threat actor attribution (using threat intelligence)  17.7%
Monetary cost per incident  17.0%
Thoroughness and accuracy of enterprise sweeping (check all information systems for indicators of compromise)  12.8%
Other  3.9%

Figure 35. Metrics Used to Justify Funding (Q3.75, n = 282)

The next two most highly cited metrics (time to detect/eradicate and
percentage of incidents exploiting unknown vulnerabilities) are of much more
value to both corporate management and SOC operations. The board is not
interested in how many raindrops are hitting the roof; they want to know if
we are getting better at finding the leaks and fixing them before business
damage occurs.

Summary

We asked a lot of questions, but we also wanted to know what respondents
would ask other SOCs. Here at the closing, the authors have selected their
favorite question: "How have you managed to be effective despite heavy staff
and resource constraints?"

It's worthwhile to note that there were many questions around people, AI,
and risk. How about a word cloud to summarize it? It's in Figure 36.

Figure 36. Word Cloud of Questions for Other SOCs (Q3.81, n = 101)



This year's SOC Survey covered many points. Threat hunting and threat intel
are important parts of SOC processes. The most popular response on the
question of the key challenge to SOCs is a lack of context about the systems
being protected. Hiring and retaining staff continues to be a challenge.
Most SOCs are using metrics, and most are reporting to entities outside the
SOC itself.

In 2023 there will be several additional discussions of the survey and the
data. It also should be noted that a deidentified data set and a Jupyter
notebook are provided by the lead author (Crowley) for follow-up analysis.
This is intended to help readers and respondents answer their own questions.
If you have specific questions that you would like answered, the authors are
interested in hearing them; they want to understand how to improve the
report in the future and what additional information would be valuable to
the community.
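The deidentified responses are published at https://soc-survey.com for exactly this kind of follow-up analysis. A minimal sketch of tallying one question's answers with the standard library follows; the column name and values are assumptions for illustration, not the actual schema of the published file.

```python
import csv
import io

# Stand-in for the downloaded file; the "q3_60_tenure" column name and the
# row values here are hypothetical.
raw = io.StringIO(
    "respondent,q3_60_tenure\n"
    "1,1-3 years\n"
    "2,3-5 years\n"
    "3,1-3 years\n"
)

# Tally responses per answer bucket, as one might before charting them.
counts = {}
for row in csv.DictReader(raw):
    bucket = row["q3_60_tenure"]
    counts[bucket] = counts.get(bucket, 0) + 1
```

Swapping `io.StringIO(raw)` for `open("soc_survey.csv")` (or loading the published Jupyter notebook directly) would apply the same tally to the real data set.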

Sponsors

SANS would like to thank this survey’s sponsors:
