

SANS 2024 SOC Survey:


Facing Top Challenges
in Security Operations
Written by Christopher Crowley
May 2024

©2024 SANS™ Institute


Executive Summary
As we’ve seen in the past, security operations centers (SOCs) are a core component of an
organization’s cybersecurity practice. We’re exploring what a SOC is, and we hope you use this
survey to recalibrate your near-term and longer-term plans. In the author’s experience, many
organizations are currently looking for a basis to compare their SOC’s performance with other SOCs.
This includes capabilities, budget, staffing, and challenges, all of which are covered in this report.
In addition to the details covered here, there are a multitude of additional items we simply don’t
have space to address. To help you help yourself, the de-identified responses and a Jupyter
notebook are available for you to do some additional analysis at: https://soc-survey.com.
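The notebook workflow amounts to loading the response export and tallying answers, the same kind of frequency count used throughout this report. A minimal standard-library sketch; the column names (role, org_size) and sample values are hypothetical and may not match the actual files at soc-survey.com:

```python
import csv
import io
from collections import Counter

# Hypothetical stand-in for the de-identified response export;
# real column names at soc-survey.com may differ.
sample = io.StringIO(
    "role,org_size\n"
    "SOC Analyst,Small\n"
    "Security manager or director,Medium\n"
    "SOC Analyst,Large\n"
)

rows = list(csv.DictReader(sample))

# Tabulate respondents per role, as the report's demographic
# figures do.
role_counts = Counter(row["role"] for row in rows)
print(role_counts["SOC Analyst"])  # 2
```

The same pattern extends to any column in the export: swap the key expression to group by organization size, industry, or region.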

[Figure 1. Survey Demographics: top four industries represented (Technology, Government, Banking and finance, Cybersecurity); organizational size from Small (up to 1,000) through Small/Medium (1,001–5,000), Medium (5,001–15,000), and Medium/Large (15,001–50,000) to Large (more than 50,000); operations and headquarters locations by region; top four roles represented (Security administrator/security analyst, SOC Analyst, Security manager or director, SOC manager or director); and participants’ locations.]


Figure 1 is dense with information. Among the items it expresses: the top sectors
represented by respondents were Technology, Government, Banking and finance, Cybersecurity,
and Education. The respondents were a mix of technical and managerial; the top roles
were Security administrator/Security analyst, SOC Analyst, Security manager or director, and
SOC manager or director. 334 out of 403 respondents were headquartered in North America;
301 of those were based in the United States of America. But there were responses from companies
headquartered around the globe, including Europe, Latin or South America, Asia, the Middle East,
Australia/New Zealand, and Africa.



What’s your budget? “Unknown” is by far the most common response, answered by 151
people. This seems odd. It is the author’s opinion that it is a result of a fundamental
misalignment between the SOC staff/management and the organizational budget process.
The author’s interpretation of this response and others is that the SOC is misaligned with
the organization it is intended to protect.

What is your estimated annual budget for new hardware, software licensing and support, human capital, and any additional costs?

[Figure 2. SOC Budget: “Unknown” was by far the most common answer at 38.2%; the remaining responses were spread across eleven budget bands from less than $100,000 to greater than $48 million, each between 1.8% and 10.4%.]


You’re reading this report to understand how it is going in other people’s SOCs. To
provide a consistent basis of comparison, we frequently use metrics. The survey asked if
metrics are reported. 260 of 384 responses to Q3.77 said they provide metrics to senior
management to justify resources for the SOC, representing 67% of the responses.

This is a relatively small increase from 2023, where 66% said the same. Both of these last
two years, however, are a fairly substantial drop from 2022, where 74% reported using metrics
to senior management for justifying SOC resources. Prior to 2022, we asked the question as an
open-ended response, so the percentages aren’t available.

What might cause such a change? We can only speculate: maybe a more mature approach to
metrics. Regardless, we’ll dig into specifics of metrics in a later section.

What is the total internal staffing level (i.e., all related positions) for your SOC, expressed
in terms of full-time equivalents (FTEs)? What is the number of FTEs specifically assigned
to the management of your SOC systems, not just to analysis of the data from your SOC
systems? Note: Include both employees and in-house, dedicated 1099 contractors who
function as employees in your SOC. If responsibilities are shared across a team, estimate
the equivalent FTE amount of time spent among the team.

[Figure 3. SOC Staffing: responses for total staffing and for FTEs specific to SOC systems management, across the bands <1 (part-time), 1, 2–10, 11–25, 26–100, 101–1,000, >1,000, and Unknown; 2–10 is the most common answer.]



“How many people work in your SOC?” The figure for question Q3.61 shows that most
respondents report 2–10 people. In a later section, we’ll dissect this into industry,
organization size, and outsourcing. This has been the most common answer since the
inception of the SOC survey in 2017. So, it’s no surprise that it is the same this year.

“What’s your biggest barrier in the SOC currently?” Lack of automation and orchestration
is the single highest answer, with 71 responses out of 388. But combining the next two
answers, which are directly related (“high staffing requirements” and “lack of skilled
staff,” 56+55=111), we see that staffing represents the greatest barrier. The third issue
commonly cited is a lack of enterprise-wide visibility, with 50 responses.

What is the greatest challenge (barrier) with regard to full utilization of your SOC capabilities by the entire organization? Select the best option.

Lack of automation and orchestration: 18.3%
High staffing requirements: 14.4%
Lack of skilled staff: 14.2%
Lack of enterprise-wide visibility: 12.9%
Silo mentality between security, IR, and operations: 8.5%
Lack of management support: 7.0%
Too many tools that are not integrated: 5.9%
Lack of context related to what we are seeing: 4.6%
Too many alerts that we can’t look into (lack of correlation between alerts): 4.6%
Other: 4.1%
Lack of processes or playbooks: 3.6%
Regulatory or legal requirements: 1.8%

Figure 4. SOC Automation

“How does the SOC know there’s a problem?” EDR/XDR is the highest reported initial
trigger for incident response by the SOC team in question Q3.32. The SIEM, user reports,
other anomalous activity, and third-party intelligence represent the items that received
over 200 responses out of 394 respondents to this “select all that apply” question. We’ll
update the answer options in 2024, because “anomalous activity” doesn’t get to the heart
of the question we asked, “How does the SOC know there’s a problem?”

What triggers a response from your SOC team? Select all that apply.

Alerts from our endpoint security: 334
Automated SIEM alerts: 291
User-reported issues (direct to SOC or via help desk): 266
Anomalous activity: 263
Alerts from our IDS/IPS and firewalls on the network: 257
Searching our SIEM: 248
Alerts and reports from our third-party intelligence providers: 216

Figure 5. Response Triggers

Commentary on Trend Analysis of Responses


An important note about the change and consistency trends identified in the next
sections: we can’t guarantee the same population responds year over year, and we
don’t vet the identity of the respondents. Nonetheless, polling provides reasonable
insight into the state of things in the world. One way to measure the quality of the
respondents is how long people spent answering this extremely long survey! The mean
time was 52 minutes and the median time was 33 minutes, against a Qualtrics projected
time to complete of 36 minutes. In fact, people frequently tell the author of the survey
that answering the questions has substantial value as a thought exercise!



Highlights of Changes and Trends
in SOC Survey Responses
Section Summary: Changes: cloud-based is the new top structure; “everything goes in the
SIEM” is more common; a single, central SOC is more common; vendor-tool-based threat
hunting is more common; fewer are planning on deploying AI/ML; people give AI/ML a lower
grade than last year; TLS inspection is decreasing; employee tenure is increasing; career
progression is more important for retention.

We hope you’ve been reading the SOC Survey since it was first created in 2017. Since you
might not remember the charts from last year, let’s look at a few things that changed from
previous years.

Cloud-Based Architecture
The first one we’ll explore is a big one: “cloud based” now exceeds “single central” SOC
as the most common architecture. The trend of moving to the cloud has been observed
in IT for years and is now embedded in SOC architecture.

What is the primary approach you use to decide what data to ingest into your SOC?

Everything goes in SIEM/syslog: 152
Some selection criteria applied (risk-based): 110
Only high-priority systems (selection based on system type): 40
Highly selective use case/detection engineering: 38
Unknown/Unsure: 30
Data adjustment applied prior to ingesting data to apply selection criteria: 25
Other: 8

Figure 6. SOC SIEM

SIEM Everything
We asked how people deal with the massive volume of data, and they seem to be
exerting less effort filtering things, instead dumping everything into the SIEM.
This may seem counterintuitive, but it may be more economical than exerting lots of
engineering effort to figure out what is actually needed before collecting it. See Figure
6 for the answers to the question, “What is the primary approach you use to decide what
data to ingest into your SOC?”

This represents an increase from 2023, when the percentage was 29% of
600 answers; this year it’s 38% of 403 answers for the same question.
We didn’t ask the question prior to 2023.
Single, Central Architecture at a Greater Ratio
There are a few ways to build out your SOC. Having a single, centralized SOC is the most
common way to do it, as shown in Figure 7, chosen by 242 out of 403 (60%) of respondents.
That’s an increase from 2023 at 49% and from 2022, when 53% answered single, central SOC.

Single, central SOC: 242
Multiple, hierarchical SOCs: 59
Multiple, unorganized SOCs: 25
Multiple, standalone/siloed SOCs: 24
Multiple, parallel, redundant SOCs looking at same data: 30
Other: 23

Figure 7. SOC Architecture



Vendor Threat Hunting Automation on the Rise
Threat hunting has a primary objective of looking for compromise that wasn’t detected
by our alerting systems. One important but simple approach to this is applying newly
discovered indicators to historical data repositories.

We asked if threat hunting activities were automated, and 179 out of 388 responses
indicated they are at least partially automated using vendor-provided tools, as
visualized in Figure 8. Last year, only 38% of 457 responses indicated the same
“partially automated with vendor tools” response, compared to this year’s 46%.

It’s the author’s opinion that retroactive analysis using updated IOCs is just the bare
minimum of hunting; real hunting entails thoughtful seeking of the previously
undiscovered. Our advice: keep automating the retroactive analysis, and strive to do
sophisticated hunting.

Are your threat hunting activities automated?

Partially automated using mainly vendor-provided tools: 46.1%
Manual: 37.3%
Partially automated using mainly home-grown tools: 18.3%
Fully automated using mainly vendor-provided tools: 5.7%
Fully automated using mainly home-grown tools: 2.6%

Figure 8. Hunting Automation
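The retroactive approach described above, sweeping historical data for newly discovered indicators, can be sketched in a few lines. The event fields, hostnames, and indicator values here are illustrative, not from the survey:

```python
# Newly published indicators of compromise; values are examples
# (the hash is the well-known EICAR test file MD5).
new_iocs = {
    "44d88612fea8a8f36de82e1278abb02f",
    "bad-domain.example",
}

# Stand-in for a historical data repository (SIEM export, log lake).
historical_events = [
    {"host": "ws01", "file_md5": "44d88612fea8a8f36de82e1278abb02f", "dns": "intranet.local"},
    {"host": "ws02", "file_md5": "d41d8cd98f00b204e9800998ecf8427e", "dns": "bad-domain.example"},
    {"host": "ws03", "file_md5": "d41d8cd98f00b204e9800998ecf8427e", "dns": "intranet.local"},
]

def sweep(events, iocs):
    """Return hosts whose historical events match any new indicator."""
    hits = []
    for event in events:
        observables = {event.get("file_md5"), event.get("dns")}
        if observables & iocs:  # set intersection: any IOC match
            hits.append(event["host"])
    return hits

print(sweep(historical_events, new_iocs))  # ['ws01', 'ws02']
```

In practice the event source would be a SIEM query rather than an in-memory list, but the logic is the same: that is why this style of hunt automates so readily, and why it is only the bare minimum.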

AI/ML Tech and Satisfaction
Last year we quickly added AI/ML to our technology satisfaction list, and it was unsurprisingly
at the very bottom. We’ll show you the overall grade-based comparison again this year in a
later section. But first, let’s look at some of the technology changes from 2023 to 2024.

From 2023 to 2024, the percentages for full or partial production, or in the midst of
implementing, didn’t change much. But look at the drop in planned implementations:
in 2023, 21% said it was planned; in 2024, 11% said it is planned. From this picture
it looks like the people who were going to do it have already done it, and the rest
have decided to pass.

The other thought to explore is whether people are having buyers’ remorse. We provide a
GPA-based grading each year. In 2023, “Analysis: AI or machine learning” got a GPA of 2.17,
beating only network packet analysis, which scored the lowest GPA of 2.15. How did it do
in 2024? It came in second to last again, but with a lower GPA of 1.99.

We considered that the drop was due to respondents being harder graders in 2024. But
there was a higher high than last year: EXDR kept the top spot, with a 2.88 in 2023 and
a 3.13 in 2024. So our interpretation is that cybersecurity staff are more unhappy with
AI/ML in 2024 than in 2023.

But the new lowest was a new addition to our list of technology. Are you ready for the
new lowest? Making its debut at the bottom of the list is “Analysis: AI or machine learning-
Generative (GPT)” at a GPA of 1.80. Let’s look at it again in 2025. In 2025 we plan to have a
more detailed list of AI/ML technology options because the products are proliferating.

[Figure 9. AI Implementation: “Analysis: AI or machine learning” deployment status, 2024 vs. 2023, across Production (all systems), Production (partial systems), Implementing, Purchased but not implemented, and Planned.]



TLS Intercept
TLS intercept addresses blindness, or lack of visibility into data. With laudable
privacy advances come reduced enterprise visibility into network traffic. One
approach to this is deploying transport layer security (TLS) intercept technology to
peer into encrypted communication. This is becoming harder to do, and the 2024
responses indicated a slight decrease from 2023.

In 2024, 34% indicated “We’re not using any TLS interception to see inside HTTPS or
other encrypted communications,” whereas in 2023 only 25% indicated the same.
In 2023, 38% indicated “We have TLS intercept implemented; some categories of
websites are excluded from intercept due to company policy and/or user privacy
considerations.” In 2024 that percentage dropped to 34%.

SOCs are losing visibility into the traffic leaving the network, which likely means
more reliance on endpoint protection tools.

Average Tenure Increasing
Staffing is always a concern for the SOC. It takes skilled analysts to perform well under
high pressure for a long time, so retention is a perennial challenge. The survey asks how
long the average tenure is, and slightly longer tenures of three to five years are just
barely eclipsing one-to-three-year tenures. This is a positive trend for long-term,
career-oriented staff and for organizations looking to minimize the cost and uncertainty
of constantly hiring and retraining. See Figure 10 depicting this inflection. We’ll keep
an eye on it for 2025.

[Figure 10. Employment Duration: responses to “What is the average employment duration for an employee in your SOC environment (how quickly does staff turn over)?” showing the 1–3 year and 3–5 year bands converging from 2022 to 2024.]

Retention
What has been compelling people to stay? The survey asks how to retain employees. We don’t
cover macro-economic conditions, but those could also play a factor. See Figure 11:
meaningful work took the top spot this year, but the reported differences have narrowed.

[Figure 11. Retention: responses to “What is the most effective method you have found to retain employees?” tracking Money, Career progression, and Meaningful work from 2022 to 2024.]



More of the Same
Section Summary: Same old story: internal SOC is mandatory; NOC and SOC are not
integrated but coordinate.

We’ve explored some of the changes observed over the past couple of years. What seems
to be consistent?

Internal SOC Mandatory
For one thing, most of the time use of the SOC is not optional, and this is consistent
across all the years we’ve run the survey. Q3.2 asked if internal SOC use was mandatory.
Figure 12 indicates that the spread has changed slightly, but the ratios are about the
same, with no major movement.

[Figure 12. SOC Mandatory: 2022–2024 responses to “Within your organization, is use of the internal SOC viewed as mandatory or is it acceptable for members of your organization to acquire services from external parties/providers?” with the options “Yes, use of the internal SOC is mandatory,” “No, we may acquire services from an external provider,” and “No, we have no internal SOC.”]

The NOC and SOC have about the same relationship year over year, as shown in Figure 13.
Next, we deep dive into some other questions in the survey, leaving behind the
year-over-year comparisons.

[Figure 13. NOC/SOC Relation: 2022–2024 responses across the options “Our SOC and IT/NOC teams have very little direct communication,” “Our SOC and IT/NOC teams work together only when there are emergencies,” “Our IT/NOC team is an integral part of our detection and response, although our SOC and IT/NOC activities are not technically integrated,” and “Our IT/NOC team and SOC team are kept well-informed through integrative dashboards with shared information, APIs, and workflow, where needed.”]



SOC Technology
Section Summary: 47 listed technologies graded; EXDR is still the top GPA technology;
AI/ML is lowest.

Since 2020, we’ve taken a GPA approach to the depiction of technology satisfaction.
Surprise: this year one technology received an A! Barely in the A range at 3.1, EXDR tops
the list. Figure 14 has the full list. It’s worth noting that AI/ML occupy the bottom two
spots in satisfaction. We added AI/ML Generative Transformer this year, since ChatGPT has
captured the public imagination since Generative Pretrained Transformer (GPT) version 3
started spouting useful stuff. SOC staff don’t seem impressed yet.

Figure 15 on the next page ranks the technology list by deployment phase and shows the
corresponding GPA of that technology.

Another interesting way to look at this is to rank technology based on the top of each
category. Let’s take each in turn.

Production (all systems) has a top product of “Net: Email security (SWG and SEG)” with
111 out of 161 overall responses. This mature technology is easy to bring to full
coverage, and it is so commonplace and necessary that email would be unusable without it.
Plus, it would likely be criminally negligent to run an email server with no filtering in
place. Or maybe criminally profitable, but offering bulletproof hosting and no-trace mail
servers is the other side of the cyber industry.

Production (partial systems) has a top technology of “Analysis: Threat hunting” with 61
responses. This aligns with the aforementioned increase in threat hunting being driven by
third-party hunting tools. Threat hunting is easy to deploy into production but a
challenge to bring to full coverage because of visibility issues. These issues may stem
from inadequate authorization or mandate, but it may also simply be a challenge to
provide effective hunting across all systems. It’s trivial to say, “Go look for a hash on
a computer.” Doing so across tens of thousands of globally deployed systems on commodity
internet with varying bandwidth becomes a substantial challenge.

[Figure 14. Grade Point Average: the 47 technologies ranked by satisfaction GPA in response to “Rate your satisfaction with the technologies you are using,” from “Host: Endpoint or extended detection and response (EDR/XDR)” at 3.13 down to “Analysis: AI or machine learning-Generative (GPT)” at 1.79; the GPA of each technology also appears in the rightmost column of Figure 15.]
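The GPA scoring behind these rankings amounts to a weighted mean of grade points; a minimal sketch, assuming a standard 4.0 letter-grade scale (the survey’s exact mapping isn’t given in the report) and a hypothetical grade distribution:

```python
# Assumed letter-grade scale; the report does not publish its mapping.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(grade_counts):
    """Weighted mean of grade points over all graded responses."""
    total = sum(grade_counts.values())
    points = sum(GRADE_POINTS[g] * n for g, n in grade_counts.items())
    return round(points / total, 2)

# Hypothetical grade distribution for one technology.
print(gpa({"A": 10, "B": 20, "C": 15, "D": 4, "F": 1}))  # 2.68
```

On a scale like this, the 2.10–2.40 scores clustered at the bottom of Figure 14 read as solid C grades: functional, but leaving respondents unenthusiastic.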


Rate your satisfaction with the technologies you are using.

Technology | Production (all systems) | Production (partial systems) | Implementing | Purchased, not implemented | Planned | GPA
Net: Email security (SWG and SEG) | 111 | 25 | 11 | 3 | 2 | 2.90
Host: Endpoint or extended detection and response (EDR/XDR) | 103 | 39 | 10 | 3 | 2 | 3.13
Net: VPN (access protection and control) | 102 | 39 | 9 | 1 | 1 | 2.93
Host: Malware protection system (MPS) | 90 | 37 | 16 | 1 | 3 | 2.83
Net: Next-generation firewall (NGF) | 89 | 39 | 12 | 2 | 3 | 2.87
Analysis: SIEM (security information and event manager) | 88 | 43 | 10 | 3 | 5 | 2.90
Host: Vulnerability remediation | 87 | 45 | 12 | 1 | 4 | 2.67
Net: Network intrusion detection system (IDS)/intrusion prevention system (IPS) | 87 | 34 | 14 | 2 | 8 | 2.80
Net: DNS security/DNS firewall | 87 | 29 | 20 | 3 | 1 | 2.70
Host: Ransomware prevention | 86 | 45 | 11 | 4 | 5 | 2.73
Host: Behavioral analysis and detection | 82 | 46 | 14 | 0 | 5 | 2.66
Net: Network segmentation | 81 | 49 | 18 | 1 | 4 | 2.56
Log: Endpoint OS monitoring and logging | 80 | 51 | 12 | 1 | 7 | 2.75
Net: Network traffic monitoring | 80 | 50 | 11 | 3 | 4 | 2.69
Net: Ingress filtering | 79 | 35 | 9 | 2 | 9 | 2.66
Log: Log management | 78 | 56 | 13 | 3 | 3 | 2.68
Host: Continuous monitoring and assessment | 77 | 46 | 13 | 3 | 6 | 2.78
Net: DoS and DDoS protection | 77 | 46 | 16 | 4 | 2 | 2.78
Log: DNS log monitoring | 77 | 45 | 20 | 3 | 6 | 2.56
Net: Asset discovery and inventory | 75 | 49 | 17 | 2 | 5 | 2.49
Net: Web proxy | 71 | 34 | 21 | 2 | 13 | 2.42
Log: Endpoint application log monitoring | 69 | 50 | 17 | 5 | 7 | 2.62
Net: Web application firewall (WAF) | 68 | 44 | 26 | 1 | 6 | 2.59
Net: Egress filtering | 67 | 42 | 20 | 2 | 12 | 2.66
Host: User behavior and entity monitoring | 64 | 48 | 25 | 3 | 7 | 2.60
Net: Network Access Control (NAC) | 64 | 40 | 23 | 3 | 6 | 2.53
Analysis: Customized or tailored SIEM use-case monitoring | 61 | 47 | 22 | 2 | 6 | 2.81
Analysis: Attack surface management | 61 | 40 | 28 | 4 | 6 | 2.55
Net: Network Detection and Response (NDR) | 61 | 36 | 26 | 2 | 9 | 2.48
Net: SSL/TLS traffic inspection | 59 | 49 | 22 | 1 | 7 | 2.45
Analysis: Threat intelligence (open source, vendor provided) | 54 | 54 | 25 | 2 | 9 | 2.47
Host: Data loss prevention | 53 | 50 | 37 | 3 | 3 | 2.41
Host: Application whitelisting | 53 | 47 | 32 | 2 | 10 | 2.39
Analysis: External threat intelligence (for online precursors) | 50 | 53 | 27 | 3 | 7 | 2.44
Net: NetFlow analysis | 50 | 45 | 26 | 3 | 14 | 2.17
Analysis: SOAR (Security Orchestration, Automation, Response) | 50 | 38 | 33 | 6 | 11 | 2.23
Analysis: E-discovery (support legal requests for specific information collection) | 48 | 48 | 27 | 3 | 8 | 2.33
Analysis: Threat intelligence platform (TIP) | 45 | 41 | 35 | 6 | 9 | 2.32
Analysis: Threat hunting | 43 | 61 | 33 | 1 | 8 | 2.41
Net: Malware detonation device (inline malware destruction) | 42 | 43 | 28 | 3 | 16 | 2.11
Analysis: Digital asset risk analysis and assessment | 42 | 42 | 34 | 2 | 10 | 2.35
Net: Full packet capture | 39 | 43 | 36 | 2 | 14 | 2.13
Net: Packet analysis (other than full PCAP) | 38 | 54 | 29 | 3 | 16 | 2.21
Net: Deception technologies such as honey potting | 38 | 29 | 43 | 2 | 20 | 2.10
Analysis: Frequency analysis for network connections | 36 | 45 | 33 | 2 | 11 | 2.15
Analysis: AI or machine learning | 30 | 37 | 48 | 3 | 14 | 1.99
Analysis: AI or machine learning-Generative (GPT) | 26 | 33 | 51 | 2 | 17 | 1.79

Figure 15. Technology Satisfaction Report Card



Implementing is topped by generative AI, “Analysis: AI or machine learning-Generative
(GPT),” with 51 responses. Funding is a challenge for SOCs, and the GPT products have
rained out of the sky recently to try to optimize efforts within most businesses. It’s
the author’s opinion that GPT can be a phenomenal enabler for better communication and
analyst understanding of information, but it is not yet a replacement for analysts.

Purchased, not implemented sees a tie for the top of the list: “Analysis: Threat
intelligence platform (TIP)” and “Analysis: SOAR (Security Orchestration, Automation,
Response),” with six responses each. It is probably due to shifting priorities. It’s an
oversimplification, but when a product is purchased and the implementation gets
sidelined, there are usually two major parties to blame: us and them. We, the SOC, are to
blame because we frequently underestimate the time to deploy items and often don’t have
a clear comprehension of how the technology will fit into our tech stack. Or the SOC
finds it isn’t as easy to accomplish the original intention. With respect to “them,” the
organization is to blame because there is often last-minute budgeting without allocation
of resources from other teams, usually IT.

Finally, Planned has “Net: Deception technologies such as honey potting” with 20
responses. Deception has been slowly increasing in its deployment and satisfaction
according to the SOC survey. But it hasn’t reached the production deployment levels of
other technology.

Incident Response Satisfaction
Section Summary: Most satisfied with endpoint-based incident response capability;
visibility and asset correlation continue to be a challenge.

We asked about satisfaction with incident response capability. Figure 16 is sorted by the
sum of very satisfied and satisfied. Endpoint and network detection and response are well
regarded, whereas deception and reverse engineering receive low rankings.

Indicate your level of satisfaction with your incident response capabilities. Select all that apply.

Capability | Very Satisfied | Satisfied | Not Satisfied
Network forensic analysis | 86 | 180 | 94
Endpoint detection and response (EDR) | 163 | 176 | 41
Adversary interruption | 73 | 185 | 83
Adversary containment | 83 | 195 | 64
Command center | 82 | 183 | 72
Host-based forensic analysis | 93 | 168 | 90
Public relations coordination | 76 | 139 | 72
Reverse engineering of malware | 54 | 103 | 137
Network detection and response (NDR) | 98 | 183 | 74
Customer interaction (call center) | 92 | 149 | 62
Playbook-based response actions | 98 | 158 | 97
Workflow-based remediation | 89 | 177 | 83
Threat campaign tracking | 76 | 139 | 113
Constituent communications | 75 | 182 | 67
Threat attribution | 68 | 150 | 103
Adversary deception | 60 | 106 | 133
Hardware reverse engineering | 47 | 80 | 109
Utilization of threat intelligence | 96 | 167 | 95
Other | 10 | 14 | 9

Figure 16. IR Satisfaction
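The ordering used in Figure 16 (by the sum of very satisfied and satisfied) can be sketched as a simple sort over the response triplets, shown here with a few rows taken from the figure:

```python
# (capability, very satisfied, satisfied, not satisfied) rows
# from Figure 16.
rows = [
    ("Reverse engineering of malware", 54, 103, 137),
    ("Adversary containment", 83, 195, 64),
    ("Network forensic analysis", 86, 180, 94),
]

# Sort descending by very satisfied + satisfied, as the figure does.
ranked = sorted(rows, key=lambda r: r[1] + r[2], reverse=True)
print(ranked[0][0])  # Adversary containment
```

Note that this key ignores the “not satisfied” column entirely; a ranking by net satisfaction (satisfied minus not satisfied) would reorder some of the middle of the list.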



Visibility
Visibility into systems is important, and correlating systems to what they are doing and
who is using them is important for context. We asked, “Select the option that most
accurately represents your method of correlating assets to responsible system owner or
user for servers and user endpoints in your environment.” Figure 17 shows that the most
common method is mostly automated, augmented by manual efforts. A surprising number rely
on a manual effort each time. A surprising number have integration with physical badging
systems and into the SIEM!

Method | Both | User Endpoints | Servers
Through full integration between our physical badging system, authentication system, and our SIEM/workflow tool | 39 | 25 | 17
Fully automated through our user authentication system (such as Active Directory, IPAM), which is fully integrated into our SIEM/monitoring workflow tool | 56 | 52 | 53
Mostly automated, but must fall back to manual log inspection and correlation sometimes | 78 | 68 | 66
Manual effort each time (manually looking up IP addresses, comparing against directories, privileged user access logs, etc.) | 64 | 39 | 49

Figure 17. Correlation
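The most common pattern in Figure 17, mostly automated with a manual fallback, can be sketched as an inventory lookup that flags misses for manual follow-up. The inventory entries, IP addresses, and owner names are illustrative:

```python
# Stand-in for an asset inventory fed by an authentication system;
# entries are hypothetical.
inventory = {
    "10.0.1.15": {"hostname": "ws-finance-03", "owner": "j.doe"},
    "10.0.2.40": {"hostname": "srv-db-01", "owner": "dba-team"},
}

def resolve_owner(ip):
    """Return (owner, needs_manual_review) for an observed IP."""
    asset = inventory.get(ip)
    if asset is None:
        return None, True  # unknown asset: manual lookup required
    return asset["owner"], False

print(resolve_owner("10.0.1.15"))  # ('j.doe', False)
print(resolve_owner("10.0.9.9"))   # (None, True)
```

The rate of `needs_manual_review` hits is itself a useful SOC metric: it measures how often analysts fall back to the manual log inspection that this question asks about.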


SOC Capabilities and Outsourcing
Section Summary: Capabilities are consistent across almost all respondents; frequently
outsourced items are pen-testing, forensics, threat intel, and alert triage.

About two-thirds of the way into the report, we define what we consider a SOC! We’ve
reused the capabilities list for years, since there’s a strong consensus on what people
do in a SOC. Slightly more than 400 people answered the question of whether each activity
is done in-house, outsourced, or both. The highest total answer for an activity was 401
(alerting) and the lowest was 378 (purple-teaming). Basically, everyone answering
performs all the capabilities in some way. Even for the lowest-count capability, only 25
people, or about 6%, don’t perform it. To illustrate this, look at Figure 18.

What activities are part of your SOC operations? What activities have you outsourced, either totally or in part, to outside services through a managed security service provider (MSSP) or as a result of hosting in the cloud? (Total responses per activity)

Alerting (triage and escalation): 401
Security monitoring and detection: 399
Incident response: 398
Security administration: 396
Security architecture and engineering (of systems in your environment): 396
SOC maturity self-assessment: 395
Vulnerability assessments: 395
Pen-testing: 391
Remediation: 391
SOC architecture and engineering (specific to the systems running your SOC): 391
Digital forensics: 390
Security road map and planning: 390
Threat research: 390
Threat hunting: 389
Compliance support: 388
Security tool configuration, integration, and deployment: 388
Data protection: 385
Red-teaming: 380
Purple-teaming: 378

Figure 18. SOC Capabilities



Let's see alternate visualizations to depict the internal/outsourced variation within these responses. First, we'll focus on what's primarily done internally. If we sum "In-house" and "Both," we see that security administration, security planning, and architecture are at the top (Figure 19).

What activities are part of your SOC operations? What activities have you outsourced, either totally or in part, to outside services through a managed security service provider (MSSP) or as a result of hosting in the cloud? (In-house + Both)

Security administration: 378
Security road map and planning: 374
Security architecture and engineering (of systems in your environment): 372
Remediation: 368
Security tool configuration, integration, and deployment: 362
Incident response: 361
Security monitoring and detection: 361
Data protection: 360
Vulnerability assessments: 357
Compliance support: 349
Alerting (triage and escalation): 347
SOC architecture and engineering (specific to the systems running your SOC): 339
SOC maturity self-assessment: 338
Threat hunting: 334
Threat research: 315
Digital forensics: 291
Purple-teaming: 245
Red-teaming: 227
Pen-testing: 211

Figure 19. SOC Internal and Internal/Outsourced

For measuring the maturity of those capabilities, Figure 21 shows that NIST-CSF and MITRE ATT&CK are about equal in the capabilities assessment.

What model(s) are you using to determine what capabilities your SOC needs? Select all that apply.

NIST-CSF: 273
MITRE ATT&CK: 274
SOC-CMM: 98
SOC-Class: 27
Other: 48

Figure 21. Capability Basis

Flipping the combination, look in Figure 20 at the sum of purely outsourced and done both in and out. Here we see that our typical outsourcing items of pen-testing, forensics, threat intel, and initial alert triage are most commonly outsourced.

What activities are part of your SOC operations? What activities have you outsourced, either totally or in part, to outside services through a managed security service provider (MSSP) or as a result of hosting in the cloud? (Outsourced + Both)

Pen-testing: 276
Red-teaming: 237
Purple-teaming: 214
Digital forensics: 199
Threat research: 184
Alerting (triage and escalation): 181
Incident response: 171
Security monitoring and detection: 170
SOC maturity self-assessment: 161
Threat hunting: 155
Vulnerability assessments: 150
SOC architecture and engineering (specific to the systems running your SOC): 129
Security tool configuration, integration, and deployment: 127
Remediation: 116
Compliance support: 110
Data protection: 94
Security architecture and engineering (of systems in your environment): 78
Security road map and planning: 68
Security administration: 62

Figure 20. Outsourced + Both
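Because each respondent who performs an activity is counted exactly once in the Figure 18 total, the published totals from Figures 18, 19, and 20 also let you back out how many respondents chose "Both" for an activity, by inclusion-exclusion. A quick check with three of the rows above:

```python
# Per activity: (total from Fig. 18, in-house+both from Fig. 19,
# outsourced+both from Fig. 20), taken directly from the report.
figures = {
    "Pen-testing": (391, 211, 276),
    "Security administration": (396, 378, 62),
    "Alerting (triage and escalation)": (401, 347, 181),
}

# both = (in-house + both) + (outsourced + both) - total
both = {name: inh + out - total for name, (total, inh, out) in figures.items()}
print(both)
# {'Pen-testing': 96, 'Security administration': 44,
#  'Alerting (triage and escalation)': 127}
```

So pen-testing is the activity most often split between internal staff and a provider, while security administration is rarely shared.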



Architectures
Section Summary: Most SOCs run 24x7, and about half follow the sun; most allow work-from-home; 68% have some OT component to monitor, with about equal portions monitoring IT/OT separately and converged.

Most SOCs are 24x7. Only 20% of 402 answered “No” to Q3.24, “Does your SOC operate
24/7?” Of the 314 operating 24x7, 36% are in-house only, 16% are outsourced only, and
26% are mixed internal and outsourced. Of these 314, 49% indicated there’s a “follow
the sun” model in place.

Other interesting facts that affect architecture:

• 76% of 403 responses to Q3.26 indicated SOC staff can work remotely.

• Regarding the IT/OT split, 68% of the 397 respondents to Q3.30 acknowledged there was some OT component. Of these, 10% said separate monitoring systems but the same staff were used, 29% said OT was monitored separately, and 30% said it was monitored together with IT resources.

SOC Staff
Section Summary: Staff with analytical skills on EDR and vulnerability remediation are
in demand; workload calculation per analyst is typically based on historical ticketing or
SIEM data.

We mentioned earlier that the most popular SOC staff size is a consistent 2–10. So, let's dig into some other details on staff. The overall top three most important technologies for new hires to be familiar with are SIEM for analysis, host-based endpoint/extended detection and response, and vulnerability remediation. See Table 1.

Table 1. Top SOC Skills
Analysis: SIEM (security information and event manager): 138
Host: Endpoint or extended detection and response: 98
Host: Vulnerability remediation: 73

Most SOCs are trying to figure out what the right workload is per analyst. So, we asked the hard question of how they calculate per-analyst workload. Figure 22 shows that most people use the ticket data for start and stop time on a ticket. While this can have some error if ticket opening and closure aren't done consistently between analysts, it's a good approximation of level of effort.

Select the best description of how you calculate per-analyst workload.

40.1%: We base it on the ticketing data when an analyst starts and closes a ticket.
29.8%: We use SIEM data to calculate how many alerts are present and indicate how much time an analyst has to work each ticket.
14.6%: Our service level agreements dictate how quickly we must review content, and we allocate that much time per analyst per shift to make a decision.
15.4%: Other

Figure 22. SOC Workload

Presumably there are some further calculations to gauge busy time, optimize for expensive work, and look for per-analyst discrepancies to address skills, knowledge, and training deficiencies, as well as varying performance levels. Or, probably not all of that. "Other" answers aren't worth a full word cloud: they are primarily "we don't do this" answers, "it's outsourced, so it's the MSP's problem" answers, and a variety of SIEM and other variations or nuanced tunings of the offered answers.
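The most common approach, deriving level of effort from ticket open and close timestamps, can be sketched in a few lines of pandas; the ticket export layout and analyst names here are assumptions for illustration:

```python
import pandas as pd

# Hypothetical ticket export: one row per ticket with the handling analyst
# and the open/close timestamps (the fields the 40.1% approach relies on).
tickets = pd.DataFrame({
    "analyst": ["kim", "kim", "ravi", "ravi"],
    "opened": pd.to_datetime(["2024-05-01 09:00", "2024-05-01 11:30",
                              "2024-05-01 09:15", "2024-05-01 13:00"]),
    "closed": pd.to_datetime(["2024-05-01 10:30", "2024-05-01 12:00",
                              "2024-05-01 12:15", "2024-05-01 13:45"]),
})

# Handling time per ticket, then ticket count and mean minutes per analyst.
tickets["handle_minutes"] = (tickets["closed"] - tickets["opened"]).dt.total_seconds() / 60
workload = tickets.groupby("analyst")["handle_minutes"].agg(["count", "mean"])
print(workload)
```

Per-analyst gaps in the mean then become a starting point for the skills and training conversations above, keeping in mind that inconsistent ticket-handling habits skew the numbers.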



Threat Intelligence
Section Summary: Threat intel is used for incident response and hunting; analysis is typically based on intuition.

Threat intelligence is supposed to be used to gain tactical and strategic advantage over the threats to our environment. In Q3.21 we asked a "check all that apply" question on how threat intelligence is being used, and the top response, with 194 affirmations, was "Incident Response," followed closely by "Threat Hunting" with 191 responses out of 276 respondents to this question.

We also wondered about the analysis work in threat intelligence, since there are no clear parameters of accuracy and the data pieces can fit together in multiple seemingly meaningful ways, like a mosaic. The top "used frequently" method for CTI analysis was "Intuitive or experience-based judgment" with 152 responses out of 263 answers. Threat modeling was the top "used occasionally" method with 123 of 163 answers.

Metrics

Section Summary: For outsourced functions, metrics are commonly used; the most common is "number of incidents handled."

A SOC uses metrics to assess performance. As we saw, there are several activities that are outsourced, so we asked about metrics for outsourcing in Q3.52. Time-based metrics are great when paired with quality metrics. We have the list ranked based on total in Figure 23.

For outsourced functions (or capabilities), what key performance indicators and/or metrics do you request or receive from your MSSP for tracking performance? Select all that apply. (Used / Enforced / Consistently)

Number of incidents handled: 143 / 78 / 94
Time from detection to containment to eradication: 119 / 74 / 85
Time to discover all impacted assets and users: 100 / 82 / 65
Thoroughness of eradication (no recurrence of original or similar compromise): 86 / 81 / 65
Incident occurrence due to known vs. unknown vulnerability: 97 / 64 / 63
Thoroughness and accuracy of enterprise sweeping (check all information systems for indicators of compromise): 82 / 67 / 55
Avoidability of incident (could the incident have been avoided with common security practices in place?): 87 / 59 / 54
Threat actor attribution (using threat intelligence): 85 / 60 / 53
Number of incidents closed in one shift: 84 / 54 / 55
Downtime for workers or duration of business outage per incident: 83 / 57 / 43
Losses accrued vs. losses prevented: 65 / 67 / 47
Monetary cost per incident: 72 / 50 / 50

Figure 23. KPIs Used, Enforced, and Consistently Met
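The ranking above sums the three counts per KPI. Re-ranking the first few rows of Figure 23 the same way:

```python
# Figure 23 rows as (used, enforced, consistently-met) counts.
kpis = {
    "Number of incidents handled": (143, 78, 94),
    "Thoroughness of eradication": (86, 81, 65),
    "Time from detection to containment to eradication": (119, 74, 85),
}

# Rank by the combined total, highest first.
ranked = sorted(kpis.items(), key=lambda item: sum(item[1]), reverse=True)
for name, counts in ranked:
    print(f"{sum(counts):>3}  {name}")
# 315  Number of incidents handled
# 278  Time from detection to containment to eradication
# 232  Thoroughness of eradication
```

The same one-liner extends to all twelve KPIs, or to the four-count rows of Figure 24.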



Instead of judging only the external service provider, SOCs also assess themselves and their performance for their constituents. It has been mentioned in the past that the "number of incidents" seems a reasonable enough metric, but establishing a service level agreement on the number of incidents and meeting it seems improbable. The items in Figure 24 are sorted by the total of used, enforced, consistently met, and all three.

If you provide metrics to your constituents (customers, internal resources protected by SOC), indicate whether these metrics are used to enforce SLAs and whether your SOC consistently meets the service level represented by that metric. (Used / Enforced / Consistently / All Three)

Number of incidents handled: 103 / 33 / 36 / 83
Time from detection to containment to eradication: 84 / 50 / 34 / 63
Thoroughness of eradication (no recurrence of original or similar compromise): 66 / 50 / 32 / 51
Time to discover all impacted assets and users: 76 / 32 / 29 / 56
Incident occurrence due to known vs. unknown vulnerability: 66 / 39 / 36 / 44
Avoidability of incident (could the incident have been avoided with common security practices in place?): 74 / 31 / 33 / 48
Thoroughness and accuracy of enterprise sweeping (check all information systems for indicators of compromise): 67 / 28 / 44 / 45
Threat actor attribution (using threat intelligence): 73 / 35 / 33 / 40
Downtime for workers or duration of business outage per incident: 70 / 37 / 30 / 44
Number of incidents closed in one shift: 58 / 37 / 31 / 51
Losses accrued vs. losses prevented: 60 / 38 / 25 / 39
Monetary cost per incident: 58 / 34 / 28 / 38

Figure 24. Metrics to Constituents

Conclusion

Cloud-based is the new top structure. "Everything goes in the SIEM" is more common than it has been in the past.

Changes from past years: a single, central SOC is more common; vendor-tool-based threat hunting is more common; fewer SOCs report planning to deploy AI/ML; people express a lower grade for AI/ML than last year; TLS inspection is decreasing; employee duration of employment is increasing; and career progression is more important for retention.

Similar to past years, use of the internal SOC is mandatory, and the NOC and SOC are not integrated but do coordinate.

The SOC's budget isn't known to most survey respondents. Metrics are provided by 67% of respondents, and the most common metric is the number of incidents handled.

Capabilities of the SOC are very consistent across almost all respondents. Frequently
outsourced items are pen-testing, forensics, threat-intel, and alert triage.

The most commonly reported SOC size is 2–10 people. The highest cited barrier is lack of automation. EDR/XDR is the most common initial indication of a problem. Most SOCs are 24x7, about half follow the sun, and most allow work-from-home. 68% have some OT component to monitor, with about equal portions monitoring IT/OT separately and converged. Threat intel is used for incident response and hunting, which is commonly based on intuition.

Forty-seven listed technologies were graded; EDR/XDR still has the top GPA, and AI/ML the lowest. Respondents are most satisfied with their endpoint-based incident response capability, but visibility and asset correlation continue to be a challenge.



Sponsors

SANS would like to thank this survey’s sponsors:

