Contrast Security Assess Review

Because they're not waiting on security to complete scans for them, dev teams are not seeing delays in deployment


What is our primary use case?

We've been using Contrast Security Assess for our applications that are under more of an Agile development methodology, those that need to deliver on faster timelines.

The solution itself is cloud-based: the TeamServer aspect, the consolidated portal, is hosted by the vendor, while the actual Assess agents are deployed on-prem in our own application environments.
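
To make that architecture concrete, here is a minimal, hypothetical sketch (in Python, with placeholder values) of generating the contrast_security.yaml that points an on-prem agent at the hosted TeamServer. The api connection keys are the standard ones the agents read; everything else is illustrative, not our actual setup.

```python
# Hypothetical sketch: render a contrast_security.yaml so an on-prem agent
# reports to the vendor-hosted TeamServer portal. All values are placeholders.
from pathlib import Path
from textwrap import dedent

def write_agent_config(app_name: str, dest: Path) -> None:
    """Write a minimal agent config pointing at the hosted TeamServer."""
    dest.mkdir(parents=True, exist_ok=True)
    config = dedent(f"""\
        api:
          url: https://app.contrastsecurity.com/Contrast  # hosted TeamServer
          api_key: <YOUR_API_KEY>
          service_key: <YOUR_SERVICE_KEY>
          user_name: <AGENT_USER_NAME>
        application:
          name: {app_name}  # how this app appears in the portal
        """)
    (dest / "contrast_security.yaml").write_text(config)

# Placeholder app name and path, purely illustrative.
write_agent_config("billing-service", Path("/opt/contrast"))
```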

How has it helped my organization?

We've historically run dynamic and static scans for all of our applications, but for the teams that need to deploy on a much faster basis, we prefer using Contrast because no point-in-time scans are required. There isn't a lot of triage required when reviewing the results. Everything is instant, it creates almost no bottleneck on the security-team side, and the developers can continue with their development and testing without us.

We currently have a very large backlog of DAST scan requests from our application teams. That backlog has grown so much that some teams have missed their initial deployment timelines because they're waiting on us to become available to run dynamic scans. Now, teams that have Contrast aren't seeing any delays in their deployment process, because they're not waiting on us to complete scans on their behalf. The tool identifies the vulnerabilities automatically.

What is most valuable?

The most valuable feature is the continuous monitoring aspect: the fact that we don't have to wait for scans to complete for the tool to identify vulnerabilities. They're automatically identified through developers' business-as-usual processes.

The automation of the actual vulnerability identification is great. I would give it a very high rating, given how little it requires of the security team or developers to understand and start reviewing the results it identifies.

The low false positive rate is another good feature. It means my team, the security team, spends less time looking at results and findings compared to our historical static and dynamic scans, where the false positive rate is much higher. In percentage terms, somewhere around 90 percent of the time we used to spend has been given back to our team, because the false positive rate with Contrast is less than 5 percent.

In terms of the accuracy of vulnerability identification, so far we've had tens of thousands of issues identified in applications that have historically been scanned by dynamic and static tools. The large majority of those findings have been true positives. I may have seen a handful of false positives so far, five or ten, in the scope of tens of thousands. That's a very low rate.

We also use the solution's OSS feature, through which we can look at third-party open source software libraries. It is a great tool, and it has affected our software development greatly. We've never had a solution for software composition analysis, nor have we required fixes for vulnerable third-party libraries, so this has changed the way developers look at their usage of third-party libraries upfront. It's changing our development model and culture to ensure more thought is put into the use of third-party libraries.
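
Conceptually, the SCA check boils down to matching the libraries an application actually uses against known-vulnerable versions. A toy illustration only, not Contrast's implementation; the library names, versions, and advisory data are just examples.

```python
# Toy sketch of the SCA idea: flag third-party libraries whose versions
# appear in a known-vulnerable list. Data below is illustrative only.
KNOWN_VULNERABLE = {
    ("commons-collections", "3.2.1"),
    ("struts2-core", "2.3.15"),
}

declared = [("commons-collections", "3.2.1"), ("guava", "31.1")]

for name, version in declared:
    if (name, version) in KNOWN_VULNERABLE:
        print(f"vulnerable library in use: {name} {version}")
```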

The solution is definitely helping developers incorporate security while they are writing code. Since we're able to install Assess in Development, QA, and all the other pre-production environments, developers can start making use of the tool as soon as they have a deployed version of their product. As they code new features and test them out in their development environment, Contrast is already automatically identifying issues. We are identifying issues much earlier in the software development life cycle, which makes the findings much less costly for developers to fix.

We're saving time and money by fixing software bugs earlier in the software development life cycle. We're saving time on the developers' side, as well as on the security auditors' side.

What needs improvement?

Regarding the solution's OSS feature, the one drawback we have found is that it does not have client-side support. We miss identification of client-side libraries such as jQuery and other JavaScript libraries.

The same is true on the custom code side: client-side technology support. Although client-side technologies are inherently less risky than the server-side technologies where Contrast focuses its testing, it would definitely help for this tool to identify both server-side and client-side findings, in libraries as well as in custom code. That would help us move away from using multiple tools. For example, even with Contrast for our server-side testing, we still need some sort of static scanner for the client side. In a perfect world, Contrast Assess would do both.

For how long have I used the solution?

I have been using Contrast Security Assess for five months.

What do I think about the stability of the solution?

So far, the stability has been good. We've only had two applications where performance was affected by the agent. For the hundreds of other agents we've deployed thus far, there's been no impact.

What do I think about the scalability of the solution?

Scalability ties back to automation. It's very tough to scale the deployment in an automated way, so we've just been doing manual installs from the beginning. An easier, automated way to deploy the solution would be one of our hopes for the product roadmap.
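
To give an idea of what such automation could look like, here is a rough, hypothetical Python sketch that stages the agent jar and its config across a server inventory. The inventory format, mount paths, and copy mechanism are all assumptions, not our real tooling.

```python
# Hypothetical automation sketch: stage the Contrast agent and its config
# on each application server in an inventory. Paths are placeholders.
import shutil
from pathlib import Path

AGENT_JAR = Path("artifacts/contrast.jar")     # agent jar, pre-downloaded
AGENT_CONFIG = Path("contrast_security.yaml")  # shared connection config

# Toy inventory: host name -> that host's deployment root (e.g. a mount)
INVENTORY = {
    "app-server-01": Path("/mnt/app-server-01/opt/contrast"),
    "app-server-02": Path("/mnt/app-server-02/opt/contrast"),
}

def stage_agent(target_root: Path) -> None:
    """Copy the agent jar and config into the host's contrast directory."""
    target_root.mkdir(parents=True, exist_ok=True)
    shutil.copy2(AGENT_JAR, target_root / "contrast.jar")
    shutil.copy2(AGENT_CONFIG, target_root / "contrast_security.yaml")

for host, root in INVENTORY.items():
    stage_agent(root)
    print(f"staged agent on {host}; restart the app with the -javaagent flag")
```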

How are customer service and technical support?

On a scale from one to five, Contrast technical support is about a four. I haven't had too many support issues just yet, but in each one that I have had, they have been very quick to respond; within hours as opposed to days. I haven't rated it a five just because I haven't had enough support requests to see if they are any different than other software vendors out there.

Which solution did I use previously and why did I switch?

We did not use something else specifically for interactive app testing or software composition. We've only had tools for static and dynamic testing.

Our decision to go with Contrast goes back to the issue of our application teams needing faster results and fewer bottlenecks. We use Fortify for static and dynamic scanning, and that creates a lot of time delays, either waiting for a scan or waiting for review of the scan results to be completed. With Contrast, there are no delays. The teams that are more Agile and deploy much more often require that.

How was the initial setup?

The setup of the solution is different for each application, and that has been the one challenge for us. The deployment itself is simple, but it's tough to automate because each application is different, so each Contrast installation process is different. Manually installing or deploying the tool, though, is very simple.

The setup of the Contrast Assess agent is quite simple. Not much time is needed upfront to get this working and, thereafter, ongoing maintenance is very trivial for Assess.
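
For a Java application, for instance, the manual attach comes down to the JVM's standard -javaagent flag pointing at the agent jar. A hypothetical sketch, with placeholder paths:

```python
# Hypothetical sketch: launch a Java app with the Assess agent attached
# via the JVM's standard -javaagent flag. All paths are placeholders.
import subprocess

subprocess.run(
    [
        "java",
        "-javaagent:/opt/contrast/contrast.jar",  # instrument with Assess
        "-jar", "/opt/myapp/app.jar",             # the application itself
    ],
    check=True,
)
```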

We're still deploying. We have thousands of applications and thousands of teams around the world that we're deploying to. But if we're talking about just one application, at most it would take one to two hours.

The implementation strategy is that we are deploying it firm-wide within our organization to at least make use of the software composition analysis, because that part of the agent is a free feature. Once we have the agent deployed, we start working with application teams to give them an understanding of the findings being identified, just for software composition analysis. In the meantime, the interactive application security testing feature of the same agent is working in the background. So as teams see custom code vulnerabilities being identified as well, we work with those teams to apply licenses as needed.

From the deployment perspective, we're focusing holistically on deploying the agent for software composition analysis, and thereafter making risk-based decisions on which teams or applications get a license for interactive testing.

The adoption rate will be 100 percent, because we're deploying these agents to all of our application servers; for now, we're at about 30 percent. We currently have a little over 100 users. They range from application security testers and managers, like myself, to product managers who care about the business side of getting the application deployed, and then the development teams and build engineers who comprise those teams. Each application team maintains its own instance.

What about the implementation team?

We're working with Contrast. They've provided a very helpful technical solution architect who has been helping with the deployment.

What was our ROI?

From a security team perspective, we're able to free up a lot more time. We spend less time reviewing results and can spend our time elsewhere. Developers have the same thing. They can spend more of their time working on actionable results rather than looking at false positives and waiting for the security team to complete testing on their behalf.

What's my experience with pricing, setup cost, and licensing?

The good news is that the agent itself comes in two different forms: the unlicensed form and the licensed form. 

The unlicensed form gives you the software composition analysis for free. If you then apply a license to that same agent, the instrumentation takes hold. So one of my suggestions is to do what we're doing: deploy the agent to as many applications as possible with just the SCA feature turned on and no license applied, and then be selective about which teams get a license. Thankfully, the agent is always working; you just won't see the IAST results until you apply a license.

There are no fees apart from the licensing fee. Some teams might run into issues where they need to spend more money on their servers and increase memory to support the Contrast Assess agent running while the application is running, but that is a small amount.

Which other solutions did I evaluate?

We did not evaluate other options. We met with Contrast and they were the leader in the space for instrumentation, so we went forward with them.

What other advice do I have?

Make sure you understand your environment before deploying. Try to get an idea of which technologies your applications use so you can group them and group the deployment and implementation accordingly. That way you can focus on automating .NET deployments first, for example, and then move on to Java, and so on.
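
As a trivial illustration of that grouping step, here is a hypothetical Python sketch that buckets an application inventory by technology stack; the inventory itself is made up.

```python
# Hypothetical sketch: group an app inventory by stack so the agent
# rollout can be automated one technology at a time.
from collections import defaultdict

apps = [
    {"name": "billing", "stack": ".NET"},
    {"name": "portal", "stack": "Java"},
    {"name": "reports", "stack": ".NET"},
]

by_stack = defaultdict(list)
for app in apps:
    by_stack[app["stack"]].append(app["name"])

for stack, names in sorted(by_stack.items()):
    print(f"{stack}: deploy agent to {', '.join(names)}")
```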

The biggest lesson I have learned from using this solution is that there is a tool out there that is really changing the way we run security testing. In the security realm we're used to the static and dynamic testing approaches. Contrast Assess, along with some other tools out there, offers this new approach of interactive application security testing, which really is the future of developer-driven security, rather than injecting security auditors as bottlenecks into the software development life cycle.

I would rate Contrast Security Assess at eight out of 10, because of the lack of client-side support and the difficulty of automating the deployment holistically across an organization.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.