From a purely technical and cybersecurity point of view, the most important aspects are: (1) the detection rate and (2) the breadth of coverage (how much of the attack surface is protected).
For the first, it is unfortunately very difficult to assess the detection rate of a solution unless you are an expert with a large dataset of threats (known and unknown) at your disposal to benchmark the solution against. In any case, you should make sure the solution is capable of detecting unknown and novel threats; that is, the solution must go beyond heuristics and possess a deep understanding of cyber threats.
Second, breadth of coverage means that the solution covers a large number of threat verticals and, more importantly, is deployed everywhere a threat may appear. In many cases, customers do not cover all areas of their network.
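To make the benchmarking point above concrete, here is a minimal sketch of how a detection rate and false-positive rate could be measured against a labeled corpus. The corpus and the scan_sample callable are hypothetical placeholders for whatever samples and verdict interface you actually have; this is an illustration, not a vendor-specific procedure.

```python
# Minimal sketch: measure detection rate and false-positive rate against
# a labeled benchmark. `scan_sample` is a hypothetical wrapper that returns
# True when the solution under test flags a sample as malicious.

def evaluate(samples, scan_sample):
    """samples: iterable of (sample, is_malicious) pairs."""
    tp = fn = fp = tn = 0
    for sample, is_malicious in samples:
        flagged = scan_sample(sample)
        if is_malicious:
            if flagged:
                tp += 1
            else:
                fn += 1
        else:
            if flagged:
                fp += 1
            else:
                tn += 1
    detection_rate = tp / max(tp + fn, 1)       # share of malicious samples caught
    false_positive_rate = fp / max(fp + tn, 1)  # share of benign samples wrongly flagged
    return detection_rate, false_positive_rate
```

Splitting the corpus into known threats and samples the tool cannot have seen before, and reporting both rates separately, is one simple way to probe the "beyond heuristics" claim.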
- Capabilities: if we don't understand what these are, it is unlikely we will have a success story.
- The expertise to operate the solution
- Product documentation
- Training provided by the supplier
- Best practices
- Successful use-case scenarios (ideally from the same industry)
- Pricing (this matters for local government), etc.
Education, documentation, use cases and best practices.
Documentation. Algorithmic transparency. The ability to get someone smart at the vendor on the phone FAST, without going through gatekeepers. Confidence levels (statistical validity).
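On the confidence-levels point, a quoted detection rate is only as meaningful as the sample it was measured on. A minimal sketch, assuming nothing beyond a count of detections over a benchmark of known size, is a Wilson score interval around the measured rate:

```python
import math

def wilson_interval(detected, total, z=1.96):
    """Approximate 95% Wilson score interval for a measured proportion,
    e.g. a detection rate observed on a finite benchmark."""
    if total == 0:
        return 0.0, 1.0
    p = detected / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return max(0.0, center - half), min(1.0, center + half)

# 98 detections out of 100 samples: the 95% interval is roughly 93% to 99.5%,
# a much wider range than the headline "98% detection rate" suggests.
print(wilson_interval(98, 100))
```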
There are many cybersecurity tools available, but some aren't doing the job that they should be doing.
What are some of the threats that may be associated with using 'fake' cybersecurity tools?
What can people do to ensure that they're using a tool that actually does what it says it does?