Contrast Security Assess Review

Continuously looks at application traffic, adding to the coverage of our manual pen testing


What is our primary use case?

We use the solution for application vulnerability scanning and pen-testing. We have a workflow where we use a Contrast agent and deploy it to apps from our development team. Contrast continuously monitors the apps.

When any development team comes to us and asks, "Hey, can you take care of Assess, run a pen test, and do vulnerability scanning for our application?" we follow our workflow and deploy a Contrast agent to their app. Because Contrast continuously monitors the app, notifications from Contrast go to the developers who are responsible for fixing that piece of code. As soon as they see a notification, especially when it's a high or critical one, they go back into Contrast, look at how to fix it, and make changes to their code. It's quite easy to then go back to Contrast and say, "Hey, just consider this as fixed, and if you see it come back again, report it to us." Since Contrast continuously looks at the app, if the finding doesn't come back in the next two days, we say, "Yeah, that's fixed." It's been working out well in our model so far.
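
Where teams want to automate that notification step outside of the email alerts, a small poller against the Contrast REST API can feed the same workflow. Below is a minimal sketch of that idea; the base URL, endpoint path, query parameters, field names, and the notify() helper are illustrative assumptions rather than the vendor's documented interface.

```python
"""Minimal sketch: poll a Contrast-style REST API for open high/critical
findings on one app and route them to the owning dev team. Endpoint paths,
parameters, and credentials below are assumptions, not documented API."""
import requests

BASE_URL = "https://contrast.example.com/Contrast/api/ng"  # assumed TeamServer URL
ORG_ID = "your-org-uuid"    # placeholder
APP_ID = "your-app-uuid"    # placeholder
HEADERS = {
    "API-Key": "your-api-key",             # placeholder credentials
    "Authorization": "base64(user:servicekey)",
}

def fetch_open_criticals():
    """Pull open CRITICAL/HIGH findings for one application (assumed endpoint)."""
    resp = requests.get(
        f"{BASE_URL}/{ORG_ID}/traces/{APP_ID}/filter",
        headers=HEADERS,
        params={"severities": "CRITICAL,HIGH", "status": "Reported"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("traces", [])

def notify(finding):
    """Stand-in for whatever channel routes findings to the owning dev team."""
    print(f"[{finding.get('severity')}] {finding.get('title')} -> dev team")

if __name__ == "__main__":
    for finding in fetch_open_criticals():
        notify(finding)
```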

We have pre-production environments where dedicated developers look at the findings. We also have the solution running in production for some apps, so that we can switch back and forth between the two.

It's hosted in their cloud and we just use it to aggregate all of our vulnerabilities there.

How has it helped my organization?

If an app team is going to deploy new features to prod, they put in a ticket saying, "We are including these features in our 2.0 release." The ticket comes to our team. We deploy Contrast Security and then we do a bunch of manual pen tests. While we're doing those manual pen tests, Contrast will come up with a bunch of additional findings, because Contrast is sensor-based. It's an agent-based solution which continuously looks at traffic coming in and going out of the application. When my team does manual penetration tests, Contrast looks through those flows, and that makes our coverage better. It goes hand-in-hand with our pen-test team: when the manual pen-test team tests the application, Contrast is looking at that traffic. Another tool, like Qualys, doesn't go hand-in-hand with a manual pen-test team. Contrast really helps us because it's like another resource looking at traffic and at logs, a watchman watching what goes in and out of the application. I literally consider it another resource looking at traffic, day in and day out.

Contrast has also reduced the number of false positives we have to deal with, by something like 10 to 20 percent over the 18-plus months that we've had it.

The solution is accurate 90 percent of the time. Most of the time, when Contrast has identified top vulnerabilities from the OWASP Top 10, our manual pen-test team has gone in and said, "Yes, for sure." There were times when, because of resourcing issues, we did not have people pen-testing, and we would just say, "Okay, we'll see what Contrast says." Sure enough, Contrast would come back with 10 to 20 critical vulnerabilities. Then we would backtrack and have the manual team do some pen tests. They would come back and say, "Yes, it has identified most of them," things like SQL Injection, which is in the OWASP Top 10. We've seen that happen in the past, and that's why I feel the accuracy of Contrast is pretty good.

The advantage of using Contrast is that it is continuous.

I've seen some development teams completely take up Contrast themselves and work in Contrast. For example, a developer will be notified of an issue and will fix the code. He will then go back to Contrast, mark it as remediated, and keep watching the portal; he will be notified if the same vulnerability is found again. We have seen teams that really like the information Contrast provides and work independently with it, instead of having a security team guiding them and holding their hands. There are times when we do hold hands for some of the teams, but it really depends on the software developers' maturity and secure coding practices.

In addition, it definitely helps save us time and money by enabling us to fix software bugs earlier in the software development lifecycle. It really depends on where you put Contrast. If you put Contrast in your Dev environment, then as soon as the developer deploys his code and QA is testing it in that environment, it will immediately flag an issue and say, for instance, "You're not using TLS 1.2." The developer will go back and make those changes. It really depends on what model you have and where you want to use Contrast to your advantage. A lot of teams put it in a development or pre-production environment and fix vulnerabilities before something is released.

I've also seen the other side of the fence where people have deployed it in production. The vulnerabilities keep coming. Newer hacks develop over time. When teams put it in prod and an exploit happens, they can use Contrast Protect and block it on the other side. You can use it as you need to use it.

The time it saves us is on the order of one US-based FTE, a security person at an average pay level. At a bare minimum, Contrast gives us the equivalent of that resource. It's like having a CISSP-certified person, in the US, on our payroll. That's how we quantify it in our team and how we presented it in our project proposal.

What is most valuable?

Contrast has a feature called Protect. When a real exploit comes through, we can look at it and say, "Hey, yeah, this is a Cross-Site Scripting or SQL Injection," and then we can block it.

Another especially valuable feature is the stack trace. I've been in the application security space for 15-plus years now. I saw it when it was in its infancy, when people thought of it as the "icing on the cake," something they would look at only when they had money to spare: "Yeah, we can now look at security." Now, security is part of the SDLC. So when Contrast identifies a vulnerability, it provides very important information, like the stack trace and the variables involved.

It also has another feature called IAST, interactive application security testing. When I started out I was actually an embedded developer, and now I'm managing an OWASP team. I've seen both ends of the spectrum, and I feel that the information Contrast provides for every vulnerability is really cool and amazing, enabling us to go and fix the vulnerabilities.

It also has features so you can tweak a policy. You can make a rule saying, "Hey, if this vulnerability comes back, it is not an issue." Or you can go and change some code in a module and tell Contrast, "This is per-design." Contrast will cleverly identify and recognize that it was marked as per-design. It will not come back and say that's a vulnerability.

We use the Contrast OSS feature that allows us to look at third-party, open-source software libraries, because it has a cool interface where you can look at all the different libraries. It has some really useful additional features that show how many times a library has actually been used. For example, out of, say, 500 total calls, it tells us whether and how many times an OSS library was actually invoked, such as 10 times across 20 workflows. Then we know for sure that the OSS component is in use. There are tools that will tell you a library is included, but developers can include libraries that are never used. Contrast goes one step further and tells you how many times something has actually been used.
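
To illustrate how we act on those usage counts, here is a minimal sketch that flags libraries which are packaged with an app but never exercised at runtime. The JSON layout (fields like "name" and "classes_used") is a hypothetical export format for illustration, not Contrast's documented schema.

```python
"""Minimal sketch: separate libraries that merely ship with the app from
libraries that are actually exercised at runtime, using a hypothetical
JSON export of per-library usage counts."""
import json

def unused_libraries(report_path: str):
    with open(report_path) as fh:
        libraries = json.load(fh)
    # A library that is packaged but never invoked is a removal candidate;
    # one with a non-zero usage count is confirmed to be in the code path.
    return [lib["name"] for lib in libraries if lib.get("classes_used", 0) == 0]

if __name__ == "__main__":
    for name in unused_libraries("oss_report.json"):
        print(f"Packaged but never used at runtime: {name}")
```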

I can't quantify the effect of the OSS feature on our software development, but it gives us a grading from A to F. In this evolving security world, customers come back to us and say, "Hey, do you guys have a pen test report?" We can go back to Contrast, pull all this stuff, and provide it to customers.

What needs improvement?

Contrast Security Assess covers a wide range of application stacks like .NET Framework, Java, PHP, Node.js, etc. But there are some, like .NET Core on Ubuntu, which are not covered. They have these agents on their roadmap. Once they have them, we will have complete coverage.

Let's say you have .NET Core in an Ubuntu setup. You probably don't have an agent that you could install at all. If Contrast builds those out and provides wide coverage, that will make it a masterpiece. So they should explore more of the technologies that they don't yet support, including newer and future technologies. For example, Google is coming up with its own OS. If they can support agent-based or sensor-based technology there, that would really help a lot.

For how long have I used the solution?

I have been using Contrast Security Assess for a year and a half.

What do I think about the stability of the solution?

There isn't much to say about stability. It runs on autopilot, like any agent; it's similar to a process monitor that keeps looking at traffic. Once you put it on there, it just hangs in there until the infrastructure team decides to move the old apps from PCF to another environment. Once it has been deployed, it's done. It's all auto-maintained.

What do I think about the scalability of the solution?

It depends on how many apps a company or organization has, but whatever apps you have, you can scale it to them. It has wide coverage. Once you install it on an app server, even if the app is very convoluted and has many workflows, that is no problem. Contrast is licensed per app. It's not like source-code tools, which charge by lines of code, per KLOC. Here, it's per app. You can pick 50 apps or 100 apps and then scale it. If an app is complex, that's still no problem, because it's all per app.

We have continuously increased our license count with Contrast, because of the ease of deployment and the ease of remediating vulnerabilities. We had a fixed set for one year. When we updated about six months ago, we did purchase extra licenses and we intend to ramp up and keep going. It will be based on the business cases and the business apps that come out of our organization.

Once we get a license for an app, folks who are project managers and scrum masters, who also have access to Contrast, get emails directly. They know they can push defects from Contrast straight into JIRA. We also have other tools that we use for integration, like ThreadFix, and risk, compliance, and governance tools. We take the results and upload them to those tools for the audit team to look at.
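
Because the JIRA hand-off comes up so often, here is a minimal sketch of pushing a single Contrast finding into JIRA as a defect using Jira's standard issue-creation REST endpoint. The project key, credentials, and the shape of the finding dict are placeholder assumptions; in practice the project managers use Contrast's built-in JIRA integration rather than a script like this.

```python
"""Minimal sketch: create a JIRA defect from one Contrast finding via
Jira's standard /rest/api/2/issue endpoint. Values are placeholders."""
import requests

JIRA_URL = "https://jira.example.com"   # placeholder Jira base URL
AUTH = ("svc-appsec", "api-token")      # placeholder credentials

def create_defect(finding: dict) -> str:
    payload = {
        "fields": {
            "project": {"key": "APPSEC"},                 # assumed project key
            "summary": f"[Contrast] {finding['title']}",
            "description": finding.get("details", ""),
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

if __name__ == "__main__":
    key = create_defect({"title": "SQL Injection in /login", "details": "Stack trace from Contrast..."})
    print(f"Created {key}")
```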

How are customer service and technical support?

They have a cool, amazing support team that really helps us. I've seen a bunch of other vendors where you put in tickets and they get back to you after a few days. But Contrast responds really fast. From the word "go," Contrast support has been really awesome.

That's their standard support. They don't have premium support. I've worked with different vendors, doing evaluations, and Contrast is top-of-the-line there.

Which solution did I use previously and why did I switch?

Before Contrast, we were using regular manual pen-testing tools like Burp and other common tools. We switched to Contrast because the way it scans is different. Back in those days, security would do a pen test on a Friday or Saturday, over the weekend when traffic was lighter. We used to set aside time for it. Contrast doesn't work that way; it's continuous scanning. We install an agent and it scans continuously. Continuous is way better than having a separate window where you say, "We're going to scan at this time." The Dev-SecOps model is continuous and Contrast fits well there. That's why we made the switch.

Contrast is above par compared with the other tools I've used in the past, like Veracode. I saw false positives and false negatives with all of those tools, but Contrast is better than all the others I've used.

How was the initial setup?

The initial setup was straightforward. At the time, I was doing a proof of concept of Contrast Security to see how it works. It was fairly simple. Our company has a bunch of apps in various environments. Initially, we wanted to make sure that it works for .NET, Java, and PCF before we procured it. It was easy.

Our implementation strategy was coverage for a complete .NET application and then coverage for a complete Java application, in and out, where you find all the vulnerabilities and you have all the different remediation steps. Then we set up meetings with the app teams to go over some of it and explain things. And then, we had a bunch of apps in PCF. These were the three that we wanted: .NET, Java, and PCF. They are our bread and butter. We did all three in 45 days.

From our side, it was just me and another infrastructure guy involved.

What about the implementation team?

We only worked with Contrast. There were times when Contrast worked with Pivotal, internally, for PCF. But they pulled it off because they have a fairly good agreement with Pivotal and its support team. Initially, we had a few issues with deploying a Contrast tile to Pivotal. But Contrast worked things out with Pivotal and got all of it up for us. It was easy for us to just deploy the tile and bind the application. Once the application is bound, it's all about the vulnerabilities and remediation.
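
For readers running PCF, the binding step itself is scriptable with the standard cf CLI once the Contrast tile and service instance exist. The sketch below drives that from Python; the app names and service-instance name are placeholders, and the exact naming comes from your own PCF marketplace.

```python
"""Minimal sketch: bind a list of PCF apps to a Contrast service instance
and restage them, using the standard cf CLI commands."""
import subprocess

APPS = ["payments-api", "orders-web"]     # placeholder app names
SERVICE_INSTANCE = "contrast-security"    # placeholder service-instance name

def bind_and_restage(app: str) -> None:
    # cf bind-service and cf restage are standard cf CLI commands.
    subprocess.run(["cf", "bind-service", app, SERVICE_INSTANCE], check=True)
    subprocess.run(["cf", "restage", app], check=True)

if __name__ == "__main__":
    for app in APPS:
        bind_and_restage(app)
```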

What was our ROI?

We expect to see ROI with the architecture team, the infrastructure team, and the development teams, especially when it comes to how early in our development cycle vulnerabilities are found and remediated. That plays a big part because the longer it takes to find a software vulnerability, the substantially higher your cost to market will be.

What's my experience with pricing, setup cost, and licensing?

I like the per-application licensing model, but there are reasons why some solutions want to do per KLOC. For us, especially because it's per app, it's really easy. We just license the app and we look at different vulnerabilities on that app and we remediate within the app. It's simpler.

If you have to go to somebody, like a Dev manager and ask him, "Hey, how many thousands of lines of code does your application have?" he will be taken aback. He'll probably say, "I don't know." It's difficult to cost-segregate and price things in that kind of model. But if, like with Contrast, they say, "Hey, your entire application — however big it is, we don't care. We're just going to use one license," that is simpler. This type of license model works better for us.

Which other solutions did I evaluate?

Before choosing Contrast Assess, we looked at Veracode and Checkmarx. 

Contrast does things continuously, so it's more of an IAST. Checkmarx didn't work that way. With Checkmarx, you would have to upload a .war file and then it would do its analysis. You would then go back to the portal and see the vulnerabilities there.

It was the same with Veracode. With a SAST piece or a DAST piece, you have to schedule specific times for certain workflows, upload all of the stuff to their portal, and wait for results. The results would only come after three or five days, depending on how long it took to scan that specific workflow.

The way the scanning is done in Contrast is fundamentally different from how those solutions do it. You just install Contrast on the app server and voilà, within five minutes you might see some vulnerabilities as you exercise that application workflow.

What other advice do I have?

If you are thinking about Contrast, you should evaluate it for your specific needs. Companies are different, and the way they work is different. I know a bunch of companies that still use the Waterfall model. So evaluate it and see how it fits your model. It's very easy to go and buy a tool, but if it does not fit well into your processes and your software development lifecycle, it will be wasted money. My strongest advice is: see how well it fits your model and your environment. For example, are developers working more in pre-production? Are they using a Dev sandbox? How is QA working and where do they work? It should work in your process and it should work in your business model.

"Change" is the lesson I have taken away by using Contrast. The security world evolves and hackers get smarter, more sophisticated, and more technology-driven. Back in the day when security was very new, people would say a four-letter or six-letter password was more than enough. But now, there is distributed computing, where they can have a bunch of computers trying to compute permutations and combinations of your passwords. As things change, Contrast has adapted well to all the changes. Even five years ago, people would sit in a war room and deploy on weekends. Now, with the DevOps and Dev-SecOps models, Contrast is set up well for all the changes. And Contrast is pretty good in providing solutions.

Contrast is not like other, traditional tools that tell you there is a security issue as you write the code. With Contrast, once the agent is in place, something is deployed, and somebody is using the application, that's when it's going to tell you there's an issue. I don't think it has an on-desktop tool that tells the developer about an issue at the moment he writes the code, like Veracode Greenlight. It is more of an IAST.

We don't have specific people for maintenance. We have more of a Dev-SecOps model. Our AppSec team has four people, so we distribute the tasks and share them with the developers. We set up an integration or a notification with each team, so that as soon as Contrast finds something, they get notified. We try to integrate teams and integrate notifications. Our concern is more about when a vulnerability is found and how long it takes for the developer to fix it. We have worked all that out with Power BI, so it actually shows us, when a vulnerability is found, how long it takes to remediate it. It's more like autopilot; it's not a maintenance type of thing.
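
For context, the remediation metric we chart in Power BI is simple to reproduce: given each finding's discovered and closed dates, compute how long it stayed open. The sketch below shows that calculation; the CSV column names are assumed for illustration, not an actual Contrast export schema.

```python
"""Minimal sketch: compute days-to-remediate per finding and the average,
from a CSV export with assumed discovered_date/closed_date columns."""
import csv
from datetime import datetime
from statistics import mean

DATE_FMT = "%Y-%m-%d"

def remediation_days(path: str):
    days = []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            if not row.get("closed_date"):
                continue  # still open; not counted in the remediation metric
            opened = datetime.strptime(row["discovered_date"], DATE_FMT)
            closed = datetime.strptime(row["closed_date"], DATE_FMT)
            days.append((closed - opened).days)
    return days

if __name__ == "__main__":
    durations = remediation_days("contrast_findings.csv")
    if durations:
        print(f"Findings remediated: {len(durations)}, average days open: {mean(durations):.1f}")
```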

I would rate Contrast at nine out of 10. I would never give anything a 10, but Contrast is right up there.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.