Capture DB - they all use a NoSQL database, which solves the ad hoc query and 'go back in time' problem of current best-of-breed SIEM and DLP solutions that rely on real-time analysis of incoming logs (and don't store them). This means deeper and quicker iterative threat analysis and assessment that resolves the provenance and impact of a risk or threat raised by incoming logs.
Anomaly detection - by using a baseline and anomalies to surface and rank incoming logs and the associated threat/risk, these tools are better able to 'separate the wheat from the chaff' and avoid the alarm fatigue and false positives plaguing current log-aggregation security solutions. Further, these security analytics 'learn' in the background with much more agility than current solutions, which require an explicit 'learning mode' running for an extensive period as part of setup.
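To make the baseline idea concrete, here is a minimal sketch of baseline-plus-deviation scoring. None of these vendors publish their models, so the per-hour counts, window size, and z-score threshold below are purely illustrative:

```python
from statistics import mean, stdev

def anomaly_scores(event_counts, window=24, threshold=3.0):
    """Flag entries whose log volume deviates sharply from a rolling baseline.

    event_counts: per-hour log event counts (hypothetical data).
    window/threshold: illustrative tuning knobs, not vendor defaults.
    """
    flagged = []
    for i in range(window, len(event_counts)):
        baseline = event_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # flat baseline: no meaningful z-score
        z = (event_counts[i] - mu) / sigma
        if abs(z) >= threshold:
            flagged.append((i, round(z, 2)))
    return flagged

# A quiet baseline of ~10 events/hour, then one burst:
counts = [10, 11, 9, 10, 12, 10, 11, 10, 9, 10, 11, 10,
          10, 9, 11, 10, 12, 10, 11, 9, 10, 11, 10, 10, 95]
print(anomaly_scores(counts))  # only the burst at index 24 is flagged
```

The point of the sketch: the baseline adapts as new logs arrive, so there is no separate 'learning mode' phase - the model is simply whatever the recent window contains.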
'Fuzzy logic' rules - morphing the term to describe how these solutions are much more agile and relative in interpreting risks and threats than current-generation correlation rules, which rely on very discrete criteria to prioritize incoming logs. Very important, as malware authors and cyber criminals are equally agile at morphing their attack vectors.
Shop floor to top floor - the UI and dashboards move the querying, decision making, and resulting assessments up to the executive suite (C level), instead of remaining a back-room SIRT/InfoSec tool. This improves response time and threat/risk assessment (TRA).
Kill chain - these solutions build a non-linear attack 'genealogy' showing the direct chain of custody of events leading to a data breach AND the related events, users, and endpoints involved, passively or as middle men, over time. This not only gives the provenance of the breach but also points to weak spots in your attack surface that can be hardened proactively, in advance of future attacks.
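The attack 'genealogy' above is essentially a directed graph walked backwards from the breach. The sketch below shows the idea with an entirely made-up event chain (the event names and graph shape are hypothetical, not output from any of these products):

```python
from collections import defaultdict, deque

# Hypothetical kill chain: each edge points from cause to effect.
edges = [
    ("phishing email -> user A", "malware dropped on laptop A"),
    ("malware dropped on laptop A", "credential theft: user A"),
    ("credential theft: user A", "lateral move to file server"),
    ("usb insert on laptop B", "lateral move to file server"),  # passive middle man
    ("lateral move to file server", "data exfiltration"),
]

# Index the graph by effect so we can walk from breach back to root causes.
parents = defaultdict(list)
for cause, effect in edges:
    parents[effect].append(cause)

def provenance(event):
    """Breadth-first walk backwards from a breach event to every
    contributing event, including passive middle men."""
    seen, queue = set(), deque([event])
    while queue:
        node = queue.popleft()
        for cause in parents[node]:
            if cause not in seen:
                seen.add(cause)
                queue.append(cause)
    return seen

print(sorted(provenance("data exfiltration")))
```

Note that the backwards walk picks up laptop B even though it never touched the exfiltrated data directly - that is the non-linear part, and those secondary nodes are exactly the future weak spots worth hardening.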
Room for Improvement:
Like any new product, the traditional enterprise-readiness criteria around scaling, support, robustness, integration, and deployment need to be proven out over the maturity curve. That said, their architectures provide confident remedies for scaling and robustness. Further, as a 'pro to the con', these tools 'play nice in the security sandbox': they expose public APIs that integrate easily into existing security suites, adding value to the log-aggregation solutions already in place in an enterprise, with significantly reduced setup cycles compared to their predecessors.
Use of Solution:
Assessed/used the following next-gen security analytics tools. There may be more competitors in this space, but these are the ones I am most familiar with and endorse:
- Interset (formerly Filetrek)
- Cybereason
- SQRRL
This is a compare-and-contrast relative to the best-of-breed DLP/SIEM solutions in the Gartner MQ that are widely deployed.
Interset - further to the above, the key differentiator of this product is its focus on insider threat: by tracking file activity and correlating it against user endpoints and risky activities (read: file exfiltrations), the resulting dashboards present an organizational risk profile with actionable events prioritized by risk = probability × impact. If one supports the notion that layered security needs to focus on inside-out risk instead of trying to secure the perimeter, this is a very compelling tool for deciding where to focus your InfoSec/forensic brain power.
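The risk = probability × impact prioritization is simple arithmetic, and worth seeing worked through. Interset's actual scoring model is proprietary, so the events, probabilities, and impact scores below are invented for illustration only:

```python
# Hypothetical insider-threat events, each with an estimated probability
# of being malicious (0-1) and a business impact score (0-10).
events = [
    {"event": "bulk file copy to USB",             "probability": 0.7, "impact": 9},
    {"event": "login from new location",           "probability": 0.4, "impact": 3},
    {"event": "source tree sync to personal cloud", "probability": 0.5, "impact": 10},
]

# Score each event: risk = probability x impact.
for e in events:
    e["risk"] = e["probability"] * e["impact"]

# Highest-risk events first, i.e. where analyst attention should go.
for e in sorted(events, key=lambda e: e["risk"], reverse=True):
    print(f'{e["risk"]:.1f}  {e["event"]}')
```

The payoff is the sort order, not the absolute numbers: a moderately probable, high-impact exfiltration outranks a more common but low-impact anomaly, which is exactly how the dashboard focuses scarce forensic talent.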
Cybereason - similar in mindset to the above (inside-out risk), this application focuses on Malops, i.e. the notion that malware has and will continue to penetrate the perimeter, but will exhibit telltale patterns of behaviour when trying to exfiltrate files (in a manner similar to an insider). This tool excels at identifying potential attacks in a manner easily understandable at an executive level, again maximizing the efficiency of your deep security talent.
SQRRL - similar in intent to Cybereason. Its major differentiators are tight AD coupling and labeling functions that can decisively evaluate the impact and importance of the data under attack and the provenance of the attack (which users are involved, which machines are infected).
As a final thought, my recommendation would not be an 'either/or' selection: they all support the notion of a security ecosystem where every tool gets better with more data. Using these tools in a sort of proactive round-robin log assessment, pushing logs to each other, would provide the best all-round solution.