Contrast Security Assess Review

Our dev team can see vulnerabilities and start mitigating them before the test-app team inquires about remediation


How has it helped my organization?

The daily reporting of vulnerabilities is very helpful for our development team. They can log in to the Contrast tool and see the vulnerabilities and start working to mitigate them before my test-app team reaches out to them inquiring about when certain vulnerabilities are going to be remediated. A case in point was last week, when I followed up with one of my developers. I said, "We need to mitigate this set of vulnerabilities," and he said, "Well, I've already started mitigating them. You should see the JIRA ticket out pretty soon." It's that type of response that we really like with Contrast. It allows us to move faster than if we were just using a SAST tool.

Before Contrast, everything was done manually. The developers were doing their own code reviews as best they could. When I came in and started holding application security meetings, I found that most of the developers were very adept at building code for functionality, and at testing functionality with their unit tests. We had something like 25 or 30 developers in my class, and only one person was familiar with application security. That should tell you how far behind we were. So we made a heavy educational push, bringing in Contrast personnel for onsite application security training and to show us how to integrate Contrast into our SDLC. They showed the developers what the vulnerabilities are and how to mitigate them. The change from three years ago to now is one of the biggest benefits.

Once we got it implemented, deployed the agents onto the application servers, and got those vulnerabilities populating into our team server, the next step was coming up with a Visio diagram of our processes for Agile development and for Waterfall development. It really turned around how our company is able to identify, mitigate, and roll out fixes for our security vulnerabilities.

It also helps developers incorporate security elements while they're writing code. Our development team has it on their local boxes, through the IDE, and as they are building functionality they're running the scans at that time. They correct some of the vulnerabilities right there, before passing the code along in the SDLC. Sometimes they will miss things and we'll catch them in our QA environment. It has positively affected our software development because, before that, everything was manual. When we brought in Contrast, it exposed how many vulnerabilities, criticals and highs, had been missed. The difference between doing purely manual reviews and doing a review with instrumentation was very stark.

It's hard to quantify how much time and money it has saved us by fixing software bugs earlier in the software development lifecycle. There's time, cost, and public image. In terms of the costs saved, we had something like 2,000 vulnerabilities, some critical and some high, and I don't even know how to put a price on that. Sometimes a vulnerability can end up costing 100 times what it would cost to fix in a development environment, so you can start to calculate what that cost would be per vulnerability. Then we're looking at the time to detect, mitigate, validate, and roll out to production. And correcting these vulnerabilities before they reach our production network is crucial to our image. If we were still doing manual reviews, we probably would not know of the critical and high vulnerabilities that we've found using Contrast. It would just be a matter of time before some hacker exploited those vulnerabilities to get at PHI data.
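
As a rough illustration of that calculation (all figures below are hypothetical placeholders, not our actual numbers), the math is simple:

    # Hypothetical back-of-the-envelope estimate of cost avoided by fixing
    # vulnerabilities in development instead of production. All figures
    # are placeholders, not real data from our organization.
    DEV_FIX_COST = 500        # assumed cost to fix one finding in dev ($)
    PROD_MULTIPLIER = 100     # a production fix can cost ~100x a dev fix
    num_findings = 2000       # criticals and highs found by Contrast

    dev_total = num_findings * DEV_FIX_COST
    prod_total = dev_total * PROD_MULTIPLIER
    print(f"Fixed in dev:   ${dev_total:,}")
    print(f"Fixed in prod:  ${prod_total:,}")
    print(f"Estimated cost avoided: ${prod_total - dev_total:,}")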

Another great benefit that Contrast has allowed us to enjoy is that there was no push-back from our development teams. Normally, in an organization, when you bring up security, developers gripe and moan because they look at security as a hindrance. But they were very receptive, very eager, and asked a lot of questions. We had two or three sessions with Contrast and, even today, developers are highly engaged with using the tool. They have implemented it into their development lifecycle process, both for our Agile teams and our Waterfall teams. It's been a huge turnaround here at FEPOC.

Management loves it. Having Contrast expose so many vulnerabilities in our applications means there's heavy pressure now, for 2020, to mitigate them. But it's a funny thing. Normally this task would be very cumbersome and problematic because of the number of vulnerabilities, but everyone loves the tool and the Contrast personnel are very helpful and very responsive. I'm enjoying it and I think our development and our test-app teams are as well. We have a very high adoption rate in our company.

What is most valuable?

What I find most valuable is the fact that we can install the agents onto the web server and it then does the automatic scanning. Every day when I come in, I log in to Contrast and see the agents' real-time reports: a list of security vulnerabilities, updated daily.

We are using the OSS feature and looking at those libraries. It's very thorough in terms of what we're looking for at the application level. What we want to know is which libraries we're using, which applications use those libraries, which ones are out of date, and which ones are recommended. Contrast gives us all of that information. It affects our development of new code: when the development teams are developing new code, they know which libraries not to use.
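
As a sketch of how that library information could be pulled programmatically: the snippet below assumes Contrast's REST API, but the endpoint path, headers, and response fields are my assumptions and should be verified against the Contrast API documentation for your TeamServer version.

    # Hypothetical sketch: list out-of-date libraries reported by Contrast.
    # Endpoint path, headers, and response fields are assumptions; verify
    # them against the Contrast REST API docs before relying on this.
    import requests

    BASE = "https://teamserver.example.com/Contrast/api/ng"  # placeholder host
    ORG = "YOUR-ORG-UUID"
    HEADERS = {
        "Authorization": "base64(username:service_key)",  # placeholder credential
        "API-Key": "YOUR-API-KEY",
        "Accept": "application/json",
    }

    resp = requests.get(f"{BASE}/{ORG}/libraries", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    for lib in resp.json().get("libraries", []):
        # Flag libraries whose in-use version lags the latest release.
        if lib.get("file_version") != lib.get("latest_version"):
            print(lib.get("file_name"), lib.get("file_version"),
                  "->", lib.get("latest_version"))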

What needs improvement?

There is room for improvement in the reporting. We're looking for a dashboard. One of the things that I have to do right now is export to Excel spreadsheets to get the management-level view that I need to present to the leadership team. We can do a report on applications within the tool, but as far as management goes, they want to see a high-level view of all the applications. They want to see the total number of applications, the total number of criticals, the total number of highs, and then they want to break each of those down and be able to manipulate that data in a dashboard. That's something that's missing.
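
In the meantime, a workaround is scripting the roll-up ourselves instead of hand-building Excel spreadsheets. Below is a minimal sketch assuming Contrast's REST API; the endpoint paths, parameters, and field names are my assumptions and need to be checked against the API docs.

    # Hypothetical sketch: roll up critical/high counts across applications
    # for a management-level view. Endpoints and fields are assumptions.
    import requests

    BASE = "https://teamserver.example.com/Contrast/api/ng"  # placeholder host
    ORG = "YOUR-ORG-UUID"
    HEADERS = {"Authorization": "base64(username:service_key)",  # placeholder
               "API-Key": "YOUR-API-KEY", "Accept": "application/json"}

    apps = requests.get(f"{BASE}/{ORG}/applications", headers=HEADERS,
                        timeout=30).json().get("applications", [])

    totals = {"CRITICAL": 0, "HIGH": 0}
    for app in apps:
        traces = requests.get(
            f"{BASE}/{ORG}/traces/{app['app_id']}/filter",   # assumed endpoint
            params={"severities": "CRITICAL,HIGH"},
            headers=HEADERS, timeout=30).json().get("traces", [])
        for trace in traces:
            severity = trace.get("severity", "").upper()
            if severity in totals:
                totals[severity] += 1

    print(f"Applications: {len(apps)}")
    print(f"Criticals: {totals['CRITICAL']}, Highs: {totals['HIGH']}")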

Because we have it on-premise, anytime we have maintenance we have to schedule it and tell our developers to get out of the tool because we're doing an upgrade. That's our main gripe with the on-premise version. We also don't get updates as soon as the cloud version does, so two weeks ago we started talking about going to the cloud. We had our meeting about what we need to do and we have a follow-up meeting this Thursday to continue that.

For how long have I used the solution?

We've been using Contrast for three years.

What do I think about the stability of the solution?

We've had no problem with the stability. It's been able to handle all our users, and more are being added.

What do I think about the scalability of the solution?

We have 128 users between development, test, our Jenkins automation team, and management.

As soon as other business units found out about Contrast, I was getting requests to onboard their applications and their users. I just explained to our leadership that at some point we need to look at buying more on-premise licenses or going to the cloud, because it's scaling up pretty quickly. So far, I don't see any limitations. It's scaling up perfectly for our needs.

It is being heavily used. Once the management at the VP level found out how many vulnerabilities were in the applications, all types of bells and whistles and alarms went off. From there it just started rolling downhill. It was: Let's come up with a plan of getting these remediated. It's being widely used and, because it's coming from the VP down to the directors, the managers, and the staff, it's ramping up based on interest with other teams. That's one of the main reasons why I'm pushing for us to go to the cloud.

How are customer service and technical support?

It's really uncanny how excellent the Contrast support staff has been. I work very closely with the staff. As a matter of fact, I tell people that we don't consider Contrast a vendor. They're more like a part of our team. They're really integrated with our development and our growth.

When we put in tickets for tech support, we get a response immediately. I have really had no gripe or problem or issue with the tech support team.

Which solution did I use previously and why did I switch?

When I came aboard we had SonarQube. Our teams weren't using it religiously. They would only spot check. There was really no one pushing to use it. Only a few developers knew how to use it. It was one of those things they bought that sat on the shelf unless someone pulled it down to use on their code base. There was no management push to scan code for X number or types of vulnerabilities before putting that code into production.

We did an analysis of what FEPOC needs right now. We looked at several tools and we settled on Contrast Assess because:

  1. it was scalable for our needs
  2. it was easy to set up
  3. there wasn't a high bar or steep learning curve.

The major reason was that we didn't really have a lot of time to spend on a learning curve. The Contrast tool and the Contrast team were there guiding us every step of the way.

We still use SonarQube in our Jenkins pipeline. But the developers are no longer using it. Now they're using Contrast, 100 percent.

How was the initial setup?

It was fairly straightforward. We got the Contrast team on the line, along with our solution architect, me, our AppSec team, our system engineer, and our development manager, so we had about six people from our side at the PoC and two people from Contrast. We did a conference call and I don't remember any hiccups or problems. It went smoothly. It was painless.

The main part of the installation was making sure that our configuration, our hardware, and our software stack were compatible with Contrast.

I believe we did it all in one session, in one day, and got everything set up and working. We got everything installed and restarted so the vulnerabilities would start populating into the team server.

We thought it would take a month. We allotted a month to get it stood up, tested, and shaken out. We definitely beat that timeline. Contrast had given us a list of tasks to do before we got everybody on the phone, to make sure our hardware was good, our configurations were set, and that we had the proper people, passwords, and access. They gave us all of that information beforehand. Once we confirmed that we were ready, we scheduled the call and we were able to knock everything out.

The way we rolled out Contrast to our organization was that we first started with a project that we call e-services. It has about 50 applications or sub-applications or APIs. Once we got that stood up, we went to FEP Direct which has about 70 applications.

In terms of the effectiveness of the solution's automation via its instrumentation methodology, there are two aspects. On the Contrast side, we understand that you have to instrument the application to get the vulnerabilities to report to the team server. That was fine. The lift involved on our side of the house was education. Many people with histories using static tools assumed that Contrast is similar to a static tool. So we had to educate our development teams, our tech teams, and our management that Contrast is not like the static tools. That's one of the reasons we drew up the Visio diagram: to show where in the process Contrast fits and how it works. Now that the teams are aware of how it works, they enjoy playing around with it, whereas their comparison with a static tool led to questions like, "Well, how are we going to get code coverage?"

We decided we don't really need code coverage right now because Contrast includes route coverage. When we want to look at how much of our application is covered by a test, we look at route coverage. This seems to be working great for the developers, but even more so for our test teams in the QA environment. It lets them know, when they run their regression tests, how many routes those tests have missed; then they go back and modify their regression tests so that those routes are included.
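
To make the route-coverage idea concrete, here is a toy sketch of my own (not Contrast's implementation, and the routes are made up): the missed routes are simply the difference between the routes the agent discovered in the application and the routes a regression run actually exercised.

    # Toy illustration of route coverage (not Contrast's implementation):
    # compare routes the agent discovered against routes a test run hit.
    discovered = {"/login", "/claims", "/claims/{id}", "/profile", "/logout"}
    exercised = {"/login", "/claims", "/logout"}

    missed = discovered - exercised
    coverage = len(exercised & discovered) / len(discovered)
    print(f"Route coverage: {coverage:.0%}")               # 60%
    print("Routes missed by regression tests:", sorted(missed))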

What was our ROI?

It took us about eight months to get it really set up and operational, in terms of starting to see ROI. After that eight-month period, it gave us a good view of the vulnerabilities that we had. 

We onboarded one suite of applications as our pilot in the first year and right after that we onboarded another suite. So the increase in vulnerabilities wasn't tracked per suite; we just looked at them in bulk. We didn't say, "Alright, e-services is reporting this many more per month versus FEP Direct, which is reporting this many per month." The metrics are still something we'll have to delve into. For now, our push has been getting rid of our criticals and highs.

What's my experience with pricing, setup cost, and licensing?

The pricing was a point of contention even within our organization. There are some folks who felt we could get a cheaper tool, but there's a tradeoff there. We could have gotten a cheaper SAST tool, but what we would have saved in money we would have spent in learning-curve time. We didn't want to have a learning curve. We wanted something that we could set up and run now, so we felt the cost was justified by our requirements.

Regarding the OSS feature, when we got Contrast it came with the free version of OSS, but after Contrast found out how popular it was they started packaging it separately, so new customers have to pay for it. If we want to expand on Contrast's OSS offering, I think we'll have to pay for that, but I'm not sure. Right now, the OSS offering we have works for what we need it to do.

Which other solutions did I evaluate?

Before a tool is brought into the FEPOC, there are a bunch of criteria it has to meet, and Contrast went through our review board.

We looked at SonarQube, Checkmarx, and a few others. Most of them were static tools. Contrast was the only IAST tool. I don't recall if any DAST tools were on the list, but we wanted something that did not have a steep learning curve. We wanted something that was easy to set up and get running fairly quickly without causing an impact to our release cycles. 

We really weren't keen on code freezes and having to scan the code. With that approach, while we would be parsing through the reports, the developers would still be building code, and by the time we got the reports to them the findings might be outdated. We really weren't looking forward to that type of methodology. Contrast gave us a methodology that worked with our goals and objectives.

What other advice do I have?

My advice is: Don't think about it. Do it. The benefits that you'll get from implementing it are enormous, especially for your development teams. They'll be able to look at these vulnerabilities and start remediating them in their environments before even passing them along during the SDLC.

In terms of the accuracy of Contrast in identifying vulnerabilities, my assessment is that so far we've had no false positives. However, if you speak to our developers, they will say it does have false positives, but that's not true. Let me give an example. We have several high vulnerabilities that Contrast found. It will say: "Application disables secure flag on cookies." In the lower environments, our dev and QA environments, we want it that way. We want those cookies to be shared during development, but we want that flag set in the higher environments. So developers will say, "Well, that's a false positive." My argument to them is that it's not a false positive. We need to make sure those cookies are protected in the higher environments, even if the development team is okay with leaving them open in the lower environments. We need to know that that flag gets set in the higher environment. As far as AppSec is concerned, this is not a false positive; we can mark it as not an issue. There's that type of "tennis" between AppSec and the development teams.
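
For illustration only, in Python/Flask (our stack differs, and APP_ENV is a placeholder variable name), the remediation amounts to driving the Secure flag off the deployment environment, so lower environments stay open while higher environments enforce it:

    # Illustration in Python/Flask (our apps differ): set the Secure flag on
    # session cookies only in higher environments. APP_ENV is a placeholder.
    import os
    from flask import Flask

    app = Flask(__name__)
    env = os.environ.get("APP_ENV", "dev")

    # In dev/QA we deliberately leave cookies unsecured for testing;
    # in higher environments the Secure flag must be on.
    app.config["SESSION_COOKIE_SECURE"] = env not in ("dev", "qa")
    app.config["SESSION_COOKIE_HTTPONLY"] = True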

We are using the OSS feature and looking at those libraries. Our leadership team is planning around the vulnerable libraries we have in our code base, but rather than fixing these issues individually, application by application, they're going to take an enterprise look at it. There's an initiative comparing what Contrast provides versus Black Duck and WhiteHat, and they're doing that assessment at an enterprise level. What we're looking for is a tool that alerts us about the status of a library before we pull it down and bring it in-house. The leadership wants something farther left that looks at these libraries even before the developer downloads them. Right now, Contrast doesn't give us that because it sits further right in the pipeline than what the leadership is looking for.

We are really building around Contrast. When we bring in new tools, one of the questions we ask is how well they integrate with Contrast. When a new tool comes in, it goes to our board, and they look at how well it integrates. We're looking for tools that complement our growth with Contrast rather than anything that replaces it.

I would rate it at about nine out of 10. If they do come out with a management-level dashboard, that would be the icing on the cake.

Which deployment model are you using for this solution?

On-premises
Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.