Taking a holistic and systematic approach to cybersecurity
Against a constantly shifting cyber threat landscape, many organisations are tackling the challenge of security analytics with niche tools, out-of-the-box analytics, incomplete or subpar data, and ad hoc processes. But, says Stuart Bradley, the result is going to disappoint.
A January 2017 Ponemon Institute report based on a survey of 621 IT and IT security practitioners found some disheartening realities about how organisations are positioned to defend against cyberattacks. Some key findings of the report – When Seconds Count: How Security Analytics Improves Cybersecurity Defences – were:
- There is high reliance on internally developed tools. Half of the organisations surveyed use in-house tools with a data lake, even though such tools lack the adaptability and scalability to handle evolving cyber threats.
- “Out-of-the-box” solutions are anything but. Fifty-six percent of respondents said their packaged solutions required extensive configuration and adaptation before they could be used.
- Data is a fundamental challenge. Half of respondents said they struggled with too much data, and 45 percent said they had issues with data quality or getting all the data they needed.
The inherent challenges of security analytics
Security analytics is a complex proposition. Take a look at the top objectives named by respondents in the Ponemon survey:
- Detect security events in progress.
- Conduct forensics on past security events.
- Provide advance warning about potential internal and external threats and attackers.
- Prioritize alerts, security threats and vulnerabilities.
These objectives are all very different – and they each bring different data into play. That speaks to the breadth and depth of analytics sophistication needed for an organisation to develop all the right capabilities. Success requires not only a confluence of different analytics disciplines but also a carefully plotted road map to mature security analytics competency. With such a road map, organisations can make the most of limited security personnel.
The Ponemon research tells us this has not been happening. For the most part, organisations have not done well aligning their security solutions to objectives – or at least have had trouble succeeding at it. For example, detection capabilities don’t measure up to expectations for all four of the most important security tasks respondents cited – data exfiltration, adversary reconnaissance, adversary lateral movement and insider threats. Even where current security solutions are showing the most promise – such as for detecting account compromise, malware and privilege escalation – they are only doing the job for 40 to 50 percent of respondents.
These results suggest that current security analytics approaches have not been effective, and that the market needs to rethink its traditional approach to data and analytics.
With more organisations adopting security analytics as part of their cybersecurity strategies, Ponemon Institute conducted survey research to determine their experience and the impact on security postures. The report presents the findings of that research, covering the deployment, use and effectiveness of these solutions.
Ponemon Institute Report
A lifecycle approach to security analytics
The Ponemon report is a call to action for the marketplace to take a more proactive and holistic approach to data and analytics to get the results organisations are expecting. Addressing the challenges calls for a lifecycle approach – one that doesn’t just focus on data and algorithms. What’s needed is a consistent, governed process for deploying analytics. And the analytics must be consumable across a broad range of resources.
The security analytics lifecycle can be visualised as two closed loops around discovery and deployment, with the following stages.
Ideation.
At this stage, stakeholders generate ideas, assess the challenges, determine what questions security analytics should answer and what data may hold the answers. Ideas may come from amassing, visualising and mining the data. They may also come from external notifications about emerging threats.
Data preparation.
Organisations often want to jump to the analytics and shortcut the basics of getting the data right. If they don’t address the data upfront, they’ll suffer for it later.
The prevailing concept of a data lake is insufficient. It’s not enough to store and analyse massive amounts of security data after the fact. That’s a huge technical challenge, requiring massive processing times and delaying results. There needs to be a process to enrich the data for analysis before it’s stored. This approach provides more context and identifies more subtle behaviours.
At this stage, security analysts work with other stakeholders to source the data, establish a time frame, ensure data quality and integration, and plan how to serve the data to answer critical questions.
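To make the idea of enriching data before it is stored concrete, here is a minimal sketch in Python. The field names, lookup tables and "off hours" rule are illustrative assumptions, not a prescribed schema or any specific product's approach:

```python
from datetime import datetime, timezone

# Hypothetical lookup tables an organisation might maintain for enrichment.
ASSET_OWNERS = {"10.0.4.17": "finance-db-01", "10.0.8.23": "hr-laptop-042"}
HIGH_VALUE_ASSETS = {"finance-db-01"}

def enrich_auth_event(raw: dict) -> dict:
    """Add context to a raw login event before it is written to storage."""
    event = dict(raw)                             # keep the original fields intact
    ts = datetime.fromtimestamp(raw["epoch"], tz=timezone.utc)
    host = ASSET_OWNERS.get(raw.get("dest_ip"), "unknown")
    event["dest_host"] = host
    event["high_value_target"] = host in HIGH_VALUE_ASSETS
    event["event_time_utc"] = ts.isoformat()      # normalised timestamp
    event["off_hours"] = not (8 <= ts.hour < 18)  # simple behavioural flag
    return event

if __name__ == "__main__":
    raw_event = {"user": "jsmith", "dest_ip": "10.0.4.17", "epoch": 1484892000}
    print(enrich_auth_event(raw_event))
```

Storing the enriched record rather than the raw one means later analytics can work with context – asset criticality, normalised time, behavioural flags – instead of reconstructing it after the fact.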
Exploration and visualisation.
Data utilities enable users to explore and visualise security data, begin to understand connections among influential variables, and draw conclusions from that data. Intuitive visualisation makes this possible without the skills of a data scientist. At this point, users generate data-driven hypotheses that are passed to the next stage.
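As a hedged sketch of this kind of exploration, the snippet below summarises a toy set of authentication events with pandas; the columns and the resulting hypothesis are invented for illustration only:

```python
import pandas as pd

# Toy authentication log; in practice this would come from the prepared data store.
events = pd.DataFrame({
    "user":   ["jsmith", "jsmith", "adoe", "adoe", "adoe", "svc_backup"],
    "hour":   [9, 22, 10, 11, 3, 3],
    "failed": [0, 4, 1, 0, 7, 0],
})

# Summarise failed logins per user to surface candidates for a hypothesis.
summary = (events.groupby("user")["failed"]
                 .agg(total_failed="sum", attempts="count")
                 .sort_values("total_failed", ascending=False))
print(summary)

# Example data-driven hypothesis to pass to threat modelling:
# "a burst of failed logins outside business hours indicates account probing".
off_hours = events[(events["hour"] < 8) | (events["hour"] >= 18)]
print(off_hours.groupby("user")["failed"].sum())
```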
Threat modelling.
Users now actually get their hands on the data, start to mine and define the data, and create the analytic models that will help solve the challenges identified in the ideation phase.
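As one possible illustration of turning a hypothesis into an analytic model – not the author's prescribed method – an off-the-shelf anomaly detector such as scikit-learn's IsolationForest can be trained on behavioural features produced in the data preparation stage; the features and data here are assumptions:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative per-user, per-day features from the prepared data:
# [failed_logins, megabytes_uploaded]
baseline = np.array([
    [0, 12], [1, 8], [0, 15], [2, 10], [1, 9], [0, 11], [1, 14], [0, 13],
])

# Fit the model on normal behaviour observed during the baseline period.
model = IsolationForest(contamination=0.1, random_state=42).fit(baseline)

# Score new observations; -1 marks behaviour that deviates from the baseline.
new_days = np.array([[1, 10], [9, 480]])  # second row: many failures, large upload
print(model.predict(new_days))            # e.g. [ 1 -1 ]
```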
Simulation and validation.
Once a series of models has been created, the models are simulated across past data and the outputs tested against past known outcomes. This process pits champion and challenger models against each other to validate them and determine which models will perform best in the production environment.
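A minimal sketch of the champion/challenger idea: replay labelled historical events through both candidates and compare a simple metric. Precision and recall are used here purely as an example; the labels and alert outputs are invented:

```python
from sklearn.metrics import precision_score, recall_score

# Known outcomes from past incidents (1 = confirmed malicious, 0 = benign).
historical_labels = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]

# Alerts each candidate model would have raised on the same historical data.
champion_alerts   = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]
challenger_alerts = [0, 0, 1, 0, 1, 1, 0, 0, 1, 1]

for name, alerts in [("champion", champion_alerts), ("challenger", challenger_alerts)]:
    print(name,
          "precision:", round(precision_score(historical_labels, alerts), 2),
          "recall:",    round(recall_score(historical_labels, alerts), 2))
```

Whichever candidate performs better against the past known outcomes becomes the model promoted to the next stage.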
Model deployment.
The best-performing models are deployed. These models must be flexible enough to run in multiple run-time environments – batch and real time – and to adapt to reflect ongoing flux in the security environment.
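One way to keep a model flexible across run-time environments, sketched below under assumed names: separate the scoring logic from how events arrive, so the same function serves a nightly batch job and a per-event streaming consumer.

```python
from typing import Dict, Iterable, List

def score_event(event: Dict, threshold: float = 0.8) -> bool:
    """Return True if the event should raise an alert.
    In production this would wrap the validated champion model."""
    return event.get("risk_score", 0.0) >= threshold

def score_batch(events: Iterable[Dict]) -> List[Dict]:
    """Batch mode: score a whole day's events at once."""
    return [e for e in events if score_event(e)]

def score_stream(event: Dict) -> None:
    """Real-time mode: score one event as it arrives."""
    if score_event(event):
        print("ALERT:", event)

if __name__ == "__main__":
    day = [{"id": 1, "risk_score": 0.2}, {"id": 2, "risk_score": 0.91}]
    print(score_batch(day))   # nightly job
    score_stream(day[1])      # streaming consumer
```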
Monitor and evaluate.
Once deployed, models must be monitored and evaluated to track their ongoing efficacy. Ideally, this cyclical feedback process works to broaden and deepen the data sources, feed new insights back into the ideation phase, and continuously improve the organisation's security analytics.
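A sketch of what that feedback loop might track, assuming analysts record whether each alert was a true or false positive: weekly precision is compared against an agreed floor, and a drop signals that the model, the data or attacker behaviour has shifted and the cycle should return to ideation. The figures and threshold are illustrative only.

```python
# Weekly alert dispositions recorded by analysts: (true_positives, total_alerts).
weekly_results = {
    "2017-W01": (42, 60),
    "2017-W02": (38, 61),
    "2017-W03": (20, 75),  # precision drops: something has changed
}

PRECISION_FLOOR = 0.5  # illustrative threshold agreed with stakeholders

for week, (tp, total) in weekly_results.items():
    precision = tp / total
    status = "OK" if precision >= PRECISION_FLOOR else "REVIEW MODEL"
    print(f"{week}: precision={precision:.2f} -> {status}")
```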
Keep evolving.
Cyber threats are always adapting, and so are the data and analytics to detect and deflect them. Tomorrow’s channels and attacks will be different from today’s. Data sources evolve and multiply. Analytical methods improve. A successful security analytics program has to be in motion, comfortable with change, and able to cope with the surprises and challenges that will inevitably arise.
It’s a difficult challenge. But building sophisticated analytics ultimately pays off in improving organisations’ abilities to detect, investigate and respond to security events in a reliable, repeatable way.
Stuart Bradley is Vice President of Cybersecurity Solutions at SAS Institute.