Log Management Forum

Content Specialist
IT Central Station
Sep 13 2018
One of the most popular comparisons on IT Central Station is IBM QRadar vs Splunk. People like you are trying to decide which one is best for their company. Can you help them out? Which of these two solutions would you recommend for Log Management? Why? Thanks for helping your peers make the best decision! --Rhea
Loren Buhle: It depends on the intended purpose of the tool and the type of people implementing it. QRadar tends to focus its out-of-the-box reports on compliance reporting, as well as the kind of behavior-based tracking that is arduous for the DIY script writer. Having used both, they are both great platforms that take quite a bit of training to fully understand and wring the most value from. Once you are at a steady state of log analysis, QRadar tends to be more useful for exploring "what we don't know," while Splunk tends to focus on confirming what I suspected but didn't have the evidence for. If you love scripting and going after known deviations, there are a lot of Splunk consultants and expertise for hire. This makes Splunk slightly better for small organizations. If known deviations are "table stakes" and your focus is on exploring risks currently unknown to you...then QRadar is the better option, in my opinion. QRadar's learning curve used to be slightly steeper than Splunk's...but I've heard there has been more automation and better training on QRadar in the past few years.
Darius Radford: As all consultants say...it depends. The elements I would factor in are: 1) How are they staffed? 2) What groups outside of security will use this tool? 3) Is this for SIEM or log management? 4) Size of environment.

For the "how are they staffed?" question: if you have developers and scripting expertise in house, that makes a strong case for Splunk. If not, then QRadar may be a better fit.

The next question: "What groups outside the security group will use this tool?" Splunk does a lot of things that are really nice to have but don't necessarily fall into the security space. So if folks outside the security team will use the tool, and subsequently help fund the endeavor, that makes a strong case for Splunk. If this is a pure-play security need, then out of the box, I feel this is a strong case for QRadar.

Is this for SIEM or log management? By default, Splunk is not a SIEM; once you buy the SIEM/security license, it becomes a SIEM. That being said, it does log management and analytics very well. Out of the box, QRadar is a very effective SIEM with tons of preset rules. So obviously, if this is a pure-play log management move, Splunk becomes a strong choice here.

Size of environment: because the Splunk licensing model is based on the volume of events being produced in your environment, this is a factor that must be considered. QRadar, on the other hand, is one of the most straightforward SIEM installations, with one of the shortest times to value out there. As such, it has often been associated with small to mid-sized organizations.

There are other factors out there to consider...this is by no means an all-encompassing list. However, I feel that if you ask yourself these questions, at a minimum, your answers become a lot clearer.
Eduardo Perez: I had been looking at the security analytics platforms in the top-right quadrant of the Gartner and Forrester reports and found that the [architectural] use case really matters. For my business I was looking to build a shared environment that would service multiple customers, so multi-tenancy, data security, role-based access controls, and self-serviceability were key requirements. For the purposes of providing the SOC a single pane of glass, I needed a single configurable dashboard. In a single-tenant environment both Splunk and QRadar could do it, but in a multi-tenant scenario only one could, at least without adding unnecessary systems. Also, I didn't want to spend too much time on integration, setup, and configuration, so having [SIEM] use cases and compliance reporting available out of the box, plus integration with common devices and OSes, had to be part of the base offering, allowing the team to install and start using it immediately. For me, QRadar ticked all the boxes. Additionally, a vast range of free apps is available, including user behavioral analytics, which let you leverage its analytics engine. That said, Splunk is an effective analytics platform that has use cases outside of SecOps. You will need depth of certified knowledge, expertise, and deep pockets to make effective use of it.
Content Specialist
IT Central Station
Aug 16 2018
We know it's important to conduct a trial and/or proof of concept as part of the buying process.  Do you have any advice for our community about the best way to conduct a trial or PoC? How do you conduct a trial effectively?  Are there any mistakes to avoid?
Carl Phillips: At the risk of sounding flippant, I personally believe that the best way to trial log management tools is best encapsulated in these three words: "clarity, clarity, clarity." Clarity as in having an agreed-upon internal understanding of exactly what problem(s) you and your team are trying to solve. This is critically important because a fair number of consumers inadvertently conflate log management with the components of a SIEM, i.e., security event management, security information management, etc.

If there is currently no one on staff who possesses the background and experience to develop and internally socialize a list of criteria for the trial/PoC, I would definitely leverage my interactions with the vendors to develop my list of criteria. If budget and internal strategies permit, I would recommend utilizing a third-party resource with demonstrated real-world experience and knowledge in this space to assist with providing the thought leadership and direction for the effort.

Some suggestions for decision/evaluation points that you may want to consider as part of your trial/PoC:
1. Scope
2. A defined set of use cases
3. Metrics
   a. Response time
   b. Retention requirements
   c. Other
4. Log sources
   a. COTS (diversity of log sources: Windows, Unix, Linux, mainframe, routers, firewalls, etc.)
   b. In-house developed
5. Problem set(s)
   a. Regulatory compliance
   b. Security monitoring / analysis / response
   c. Security audit remediation
6. Key indicators for success / product selection

Something to think about: if you perform a PoC/trial with vendor equipment, do you have a policy/plan that prevents the unintentional retention of internal log data on the vendor's hardware (disks, etc.)? You absolutely want to ensure that critical/sensitive log data does not go out the door with the vendor!
Kent Gladstone-USA: Mark is correct, but there are things to look for. Do you have a set of requirements? Not all log managers collect the same information, not all log managers are easy to navigate, and not all log managers provide the reports you are looking for. Check how much data the tool collects so you can plan storage. Does the log manager compress the data, or does it depend on a third-party tool? Do you know what you are collecting, and why? Are the logs used for security, SOX audits, or something else? My advice, before testing, is to gather and review your requirements and test against them. There are lots of free trials; in fact, if there isn't one on the web, contact the vendor and they'll give you something to try out for 30 days.
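The storage-planning advice above can be turned into quick arithmetic. This is a minimal sketch; the event rate, event size, retention window, and compression ratio below are hypothetical placeholders to be replaced with measurements from your own environment.

```python
# Rough log-storage sizing sketch. All input figures are illustrative
# assumptions -- substitute numbers measured in your own environment.

def storage_estimate_gb(events_per_sec: float,
                        avg_event_bytes: float,
                        retention_days: int,
                        compression_ratio: float = 1.0) -> float:
    """Return the on-disk storage (in GB) needed for the retention window."""
    seconds_per_day = 86_400
    raw_bytes = events_per_sec * avg_event_bytes * seconds_per_day * retention_days
    return raw_bytes / compression_ratio / 1e9

# Example: 2,000 events/sec at ~500 bytes each, kept for 90 days,
# assuming the log manager compresses at roughly 8:1.
needed = storage_estimate_gb(2_000, 500, 90, compression_ratio=8)
print(f"{needed:,.0f} GB")  # 972 GB under these assumptions
```

Running the same calculation with and without the compression ratio also answers Kent's question about whether compression matters: at 8:1 it is the difference between roughly 1 TB and 8 TB of disk for the same retention window.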
UmbertoAlloni: Hello, in my experience a good log management PoC must include:
- PoC business requirements and drivers of the customer
- PoC scope agreement
- PoC success criteria agreement
  - Use cases to prove, reflecting the primary interests of the customer
- PoC collecting environment acquisition and definition
  - Customer IT environment
  - Requirements of the technology in the PoC for installation and configuration
- PoC archiving environment acquisition and definition
  - Customer IT environment
  - Requirements of the technology in the PoC for installation and configuration
- PoC task plan

Also:
- Make an evaluation score matrix for the primary log management capabilities:
  - Log collection capabilities: agent-based or agentless approach, out-of-the-box log collection support for 3rd-party commercial IT products
  - Parsing and normalization capabilities: how collected logs will be parsed and normalized
  - CIA guarantee capabilities: information security's primary focus is the balanced protection of the confidentiality, integrity, and availability of data (also known as the CIA triad) while maintaining a focus on efficient policy implementation, all without hampering organizational productivity
- Avoid proving capabilities or features outside the success criteria (wasted time)
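The evaluation score matrix described above can be sketched in a few lines. The capability names, weights, and per-vendor scores below are purely illustrative assumptions; they would be replaced by the success criteria and weightings agreed with the customer during PoC scoping.

```python
# Minimal sketch of a weighted evaluation score matrix for a PoC.
# Weights and scores are placeholder assumptions, not recommendations.

WEIGHTS = {
    "log_collection": 0.35,          # agent-based/agentless, 3rd-party support
    "parsing_normalization": 0.35,   # parsing and normalization of collected logs
    "cia_guarantees": 0.30,          # confidentiality, integrity, availability
}

def weighted_score(scores: dict) -> float:
    """Combine per-capability scores (0-10) into a single weighted total."""
    return sum(WEIGHTS[cap] * scores[cap] for cap in WEIGHTS)

# Hypothetical scores from two PoC candidates:
vendor_a = {"log_collection": 8, "parsing_normalization": 6, "cia_guarantees": 9}
vendor_b = {"log_collection": 7, "parsing_normalization": 9, "cia_guarantees": 6}

print(weighted_score(vendor_a), weighted_score(vendor_b))
```

Keeping the weights in one place makes the "avoid proving features outside the success criteria" point concrete: anything without a weight in the matrix is out of scope for the PoC.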
Content Specialist
IT Central Station
May 31 2018
One of the most popular comparisons on our site is Compare ELK Logstash vs Graylog.  One user says about ELK Logstash, "ELK documentation is very good, so never needed to contact technical support." Another user says about Graylog, "UDP is a fast and lightweight protocol, perfect for sending large volumes of logs with minimal overhead." In your opinion, which is better and why? Thanks! --Rhea
Martin Labelle: The question has two parts. You need to choose the back end to aggregate the logs/information you want to centralize to allow advanced queries. On our side, we decided to go with Elasticsearch as the back end and leverage Kibana for advanced queries by our users. Also, on our project we did many integrations into Elasticsearch, like application logging. For the client side / log shipping mechanism, you have many ways to do it. Graylog / syslog forwarders have minimal overhead for forwarding events/logs. ELK supports Graylog and many other methods. We decided to leverage the Beats project (Filebeat) to forward all file logs to ELK. In conclusion, both products are very powerful, and the real value is having a central point with all relevant information to make the right decision.
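The lightweight UDP shipping mentioned in the question can be sketched in a few lines. The snippet below builds a GELF 1.1 message, the JSON format Graylog accepts over UDP; the server name, port (12201/udp is Graylog's conventional GELF input port), and the extra field names are illustrative assumptions.

```python
# Sketch of shipping one log event to Graylog over UDP in GELF 1.1 format.
# Hostname, port, and custom fields below are placeholder assumptions.

import json
import socket
import time

def build_gelf(short_message: str, host: str, **extra) -> bytes:
    """Build an uncompressed GELF 1.1 payload; extra fields get a '_' prefix."""
    payload = {
        "version": "1.1",
        "host": host,
        "short_message": short_message,
        "timestamp": time.time(),
        "level": 6,  # syslog severity "informational"
    }
    payload.update({f"_{k}": v for k, v in extra.items()})
    return json.dumps(payload).encode("utf-8")

def send_gelf(message: bytes, server: str = "graylog.example.com",
              port: int = 12201) -> None:
    """Fire-and-forget UDP send -- the minimal-overhead property noted above."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, (server, port))

msg = build_gelf("user login", host="web-01", app="checkout", request_id="abc123")
# send_gelf(msg)  # uncomment once a Graylog GELF UDP input is listening
```

The fire-and-forget nature of UDP is exactly the trade-off the quoted reviewer describes: minimal overhead on the sender, at the cost of no delivery guarantee.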
Sr. Director of Community
IT Central Station
Apr 26 2018
Let the community know what you think. Share your opinions now!
Marc Boitel:
- Real-time remediation
- Ease of customization (collectors/connectors)
- Integration with identity management stacks (for enriched information)
- Scalability (possible split between collection, correlation, remediation, reporting, ...)
- No hardware constraints
- PCI, SOX, ISO, ... reporting
MarkBrown:
- Log compression and metadata storage capability
- Ease of implementation/integration
- Relational or full-text English query support, efficient query response
- Compatibility with existing security vendors/products
- Responsiveness of tech support and integration support services
- Support for a breadth of security vendors and speed of new security product log integration
- ID management, ticketing, and geolocation visualization support
RanjanSandeep:
1. Automatic remediation
2. Correlation engines
3. Real-time threat visibility
4. Pre-built dashboards
User at a tech services company with 10,001+ employees
From a few reviews I saw that Elastic Stack, an open-source stack solution, is gaining popularity. Splunk has been in the market for quite some time but is a commercial product. Is it possible to replace Splunk with Elastic Stack? If so, what benefits might we lose in this decision? Does Elastic Stack also have a retention policy? Is Kibana roughly equivalent to what Splunk provides? Is it advisable to set up Elastic Stack for an enterprise application? What might the challenges be if we want to set up Elastic Stack for an application that runs on two nodes behind a load balancer?
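On the retention question: Elastic Stack does not ship a single retention knob comparable to Splunk's `frozenTimePeriodInSecs` index setting; time-based retention is typically implemented with index lifecycle management (ILM) policies. Below is a minimal sketch of such a policy body; the policy name, rollover thresholds, and 30-day retention are placeholder assumptions.

```python
# Sketch of an Elasticsearch ILM policy body enforcing ~30-day retention.
# Thresholds and names are illustrative; apply with: PUT _ilm/policy/<name>

import json

policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    # Roll to a new index when either threshold is hit.
                    "rollover": {"max_size": "50gb", "max_age": "1d"}
                }
            },
            "delete": {
                "min_age": "30d",             # drop indices older than 30 days
                "actions": {"delete": {}}
            }
        }
    }
}

print(json.dumps(policy, indent=2))
```

This is one of the operational differences the question hints at: Splunk manages bucket aging for you, while in Elastic Stack retention is something you configure and monitor yourself.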
