What is our primary use case?
It is used primarily to put a layer of security around some of our legacy applications that were built quite some time ago. It also provides better-quality assessments of the vulnerabilities in some of these applications, compared to the other tools we've been using.
We're using the SaaS platform.
How has it helped my organization?
The solution’s OSS feature, through which we can look at third-party open-source software libraries, gives us better visibility into such libraries than any other tool on the market; it's the only tool I'm aware of that offers that capability. It's not affecting our software development a whole lot because we're not holding developers accountable to that level of metrics, but it's valuable insight to have.
In a way, Assess helps developers incorporate security elements while they are writing code. Not while they're actually writing it, but certainly while they're fixing it, because it provides really impactful feedback on how to go back and fix that code, and the best practices on how to fix it.
It also saves time and money by helping us fix software bugs earlier in the software development life cycle. The enterprise that I'm with has not, historically, prioritized any kind of security remediation at all. It considers all of it to be in a context they call "technical debt." This solution allows the organization to prioritize how to best use the labor hours allocated for technical debt. The savings are an intuitive inference to make in this case. I'm personally seeing that it's easier to get things remediated, versus where they weren't being remediated at all because the quality of the results from those other tools was just terrible. Now that I'm seeing that action being taken on them, it's very rewarding. I can nearly guarantee that we've saved time and money. I just don't know exactly how much.
What is most valuable?
The most valuable feature is the IAST part. Institutionally, we're not quite at the point of using Contrast for the Protect functionality because we have other tools that overlap with its web application firewall component. But for the Assess component, there's a direct correlation between the other tools we've used and the ways those tools failed. In terms of vulnerability assessment, Contrast provides an immediate benefit.
The effectiveness of the solution’s automation via its instrumentation methodology is a solid eight out of 10.
The accuracy of the solution in identifying vulnerabilities is better than any other product we've used, far and away. In our internal comparisons among different tools, Contrast consistently finds more impactful vulnerabilities, and also identifies vulnerabilities that are nearly guaranteed to be there, meaning that the chance of false positives is very low. The number of false positives from this product is much lower compared to competing tools that we use right now: WebInspect and AppScan. It reduces the number of false positives we encounter by more than 50 percent.
What needs improvement?
The effectiveness of the solution’s automation via its instrumentation methodology is good, although it still has a lot of room for growth. The documentation, for example, is not quite up to snuff. There are still a lot of plugins and integrations that are coming out from Contrast to help it along the way. It's really geared more for smaller companies, whereas I'm contracting for a very large organization. Any application's ability to be turnkey is probably the one thing that will set it apart, and Contrast isn't quite to the point where it's turnkey.
Also, Contrast's ability to support upgrades on the deployed agents is limited. Our environment is pretty much entirely Java, and there is no automatic update mechanism for the Java agent. You have to download a new version of the .jar file and push it out to the servers where your app is hosted. That can be quite cumbersome from a change-management perspective.
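Because the agent doesn't update itself, a rollout like that usually ends up scripted. Here is a minimal sketch, shown in dry-run form (it only prints the commands); the hostnames, paths, file names, and service name are placeholders for illustration, not our actual environment:

```shell
# Hypothetical rollout sketch: push a new Contrast agent .jar to each
# app server, then restart the JVM so it picks up the new agent.
# All hostnames, paths, and the service name are placeholders.
NEW_JAR="contrast-agent-new.jar"
AGENT_PATH="/opt/contrast/contrast-agent.jar"
SERVERS="app1.example.com app2.example.com"

for host in $SERVERS; do
  # Dry run: print the commands a real rollout would execute.
  echo "scp $NEW_JAR $host:$AGENT_PATH"
  echo "ssh $host systemctl restart my-app-service"
done
```

In practice this is exactly the change-management burden mentioned above: every agent version bump touches every server hosting an instrumented app.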
For how long have I used the solution?
I've been using Contrast Security Assess since October of last year, making it about nine months.
What do I think about the stability of the solution?
Overall, the stability is quite good.
We've had a couple of support-related problems. Contrast is funny because there are many aspects of it that they don't support. For instance, we have ColdFusion applications and, on paper, Contrast did not support ColdFusion. However, it will still work with ColdFusion, kind of. But that has caused some problems when it comes to isolating issues during troubleshooting. It's left us in a position where we have to make generalized assumptions about what can and can't be supported. So, out-of-the-box, we've made the decision not to try to support ColdFusion because of the issues that can pose for us.
What do I think about the scalability of the solution?
The scalability ties back to something I said before about change management. So far, we haven't seen anything that would prevent us from scaling upwards significantly. However, it requires the organization to have a pretty robust way of handling the changes for Contrast: for instance, the updates of the application itself. Because those updates aren't bundled into Contrast, it behooves the organization that's deploying Contrast to ensure it has a very robust change-management strategy to work with the product.
We have hundreds of applications in total. Of our customer-facing applications on the perimeter — more than 200 of them — Contrast is deployed to about 20, an adoption rate of around 10 percent. Those onboarded applications are key ones, including key revenue-driving applications, but the product is still being used in only a minority of our portfolio. We have plans to increase usage of Contrast Security.
We have about 130 users registered to use the product. The majority, about 80 percent, are developers, while about 10 percent are security personnel, and 10 percent are managers. We have a dedicated staff for maintaining the solution. That's the staff that I'm part of right now.
How are customer service and technical support?
Their level of support and troubleshooting for the product is limited because of how they handle troubleshooting. It's done through a log file that's very cumbersome to work with.
Their technical support staff is very responsive. Personally, I've put in about 60 support tickets with Contrast. Some of the support tickets have ended up being actual changes to the product itself. Overall, I'm pretty pleased with that. But they're definitely still growing. They're a small company that is on the verge of growing into a very big company. I can tell from the quality of support I'm getting that they're struggling to keep up with that demand.
Which solution did I use previously and why did I switch?
We use WebInspect and AppScan. We're evaluating the possibility of switching from them to Contrast, but right now Contrast is still in trial. We're not quite at that point in making a decision to drop one of those other tools yet.
How was the initial setup?
The initial setup is straightforward. The version we're using is built for Java, and the setup procedure involves associating the Contrast .jar file with the JVM arguments of the app server itself. The instructions on that are relatively clear, and they've broken those instructions out per container platform that the JVM can run in. It's as clear as it can be for that product.
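Concretely, attaching a Java agent of this kind comes down to adding a `-javaagent` entry to the JVM arguments. The sketch below just prints the resulting launch command; the file paths are placeholders, not our actual layout:

```shell
# Placeholder paths for illustration; point these at wherever the
# Contrast agent .jar and your application actually live.
CONTRAST_JAR="/opt/contrast/contrast-agent.jar"
APP_JAR="/opt/myapp/app.jar"

# A Java agent attaches through the standard JVM -javaagent argument.
JAVA_OPTS="-javaagent:${CONTRAST_JAR}"

# Print the launch command this configuration would produce.
echo "java ${JAVA_OPTS} -jar ${APP_JAR}"
```

For containerized app servers, the same `-javaagent` flag typically goes into the platform's own JVM-options setting (for example, a `JAVA_OPTS`-style environment variable), which is why the vendor's per-platform instructions matter.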
We're still deploying. We have many apps and there's an onboarding process associated with it. But on a per-app basis, it can take us less than an hour. For a larger app, in a clustered environment, it might take closer to a week.
Because we have a very large organization, we have a different team per application. We have an onboarding process where we work with an application team to onboard the Contrast product into their workflow, and then follow up with them to ensure that they're using it correctly. It's a multi-stage approach on a per-app basis.
What about the implementation team?
We've mostly done it ourselves, although we have Contrast Security Professional Services on staff to assist with harder problems, and to follow up directly with our development teams. We've been happy with Professional Services.
What was our ROI?
We have seen ROI, but I can't get into specific numbers because those are sensitive to the organization. But some of these applications are key revenue drivers. Contrast's ability to help secure them, even if it is just those applications, gives us a little confidence that they are being looked at in terms of security. That is always going to be a significant return on investment, compared to the other tools that, frankly, weren't driving the progress necessary to secure those applications.
What's my experience with pricing, setup cost, and licensing?
If you know your needs upfront, and if you're more concerned about vulnerabilities and you already have a web application firewall that you're happy with, then focus on the Assess component of it, because the Assess component has a very straightforward licensing strategy.
If you need the web application firewall and you have a highly clustered environment, then you will be paying that license cost per server. Unfortunately, that does not scale as well for us. It helps to understand what your use case is upfront and apply that with Contrast, knowing whether or not you need it per application or per server.
Which other solutions did I evaluate?
We have not evaluated other IAST platforms.
What other advice do I have?
Make sure that you have a very good change-management strategy in place ahead of time.
Also, it's not enough to have the solution itself. It still requires proactive management on the part of your developers to make sure they understand what the product is offering and that they are using it in a way that will benefit them.