What is our primary use case?
The product scans at runtime, and that is our main use case. We have deployed it for one application in our testing environment and for another in our Dev environment. Whatever routes are exercised in those environments are scanned by Contrast.
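Deployment amounts to attaching the Contrast agent to the application's runtime in whichever environment will exercise the most routes. As a hedged sketch only (this assumes a JVM-based application, which the review does not state; the URL and all key values are placeholders, and the exact configuration keys should be confirmed against Contrast's agent documentation), the agent reads a `contrast_security.yaml` along these lines:

```yaml
# contrast_security.yaml -- illustrative placeholders, not real credentials
api:
  url: https://app.contrastsecurity.com/Contrast   # SaaS endpoint (assumed)
  api_key: <API_KEY>
  service_key: <SERVICE_KEY>
  user_name: <AGENT_USER>
server:
  # Tags findings with the environment whose routes are being exercised
  environment: development        # or: qa, production
```

For a Java application, the agent would then be attached at startup with something like `java -javaagent:/opt/contrast/contrast-agent.jar -jar app.jar`; once routes are exercised, findings show up in the dashboard.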
How has it helped my organization?
It has helped us improve the overall security posture of the company. We are able to address findings before they are reported by a third party, and to identify issues before someone else reports them or they are widely exposed. It definitely improves the security posture of our applications as a whole, and it improves our own security processes within the company: the way we catch findings and resolve them. It has also helped us gain our customers' trust.
Contrast helps save time and money by fixing software bugs earlier in the software development life cycle. We have installed the agent in our Dev environment, so scanning happens well before anything goes into production. It helps us shift left in our SDLC, and it definitely helps us fix findings before the code is pushed to production.
What is most valuable?
The tool has good, strong findings. We have other static analysis tools, but Contrast has found high-priority issues which other tools have not found. The capability of the tool to scan and throw errors that other tools don't catch is important.
No other tool does runtime scanning the way Contrast does. Other static analysis tools do static scanning, but Contrast performs runtime analysis: the scan happens when the routes are exercised. That is a unique capability compared to other tools, and it's what I like most about Contrast, that it's runtime.
There is also a feature in the tool where you can specify that something is not a problem and mark it as a false positive, so it doesn't show up again on your dashboard. It's pretty easy: you filter out your false positives and you're good to go. We have seen a reduction in the number of false positives because, once you mark something as a false positive, it stops showing up.
What needs improvement?
I would like to see them come up with more scanning rules. I don't know how it was done within the tool, but there is always room for improvement.
We recently had a call with the vendor about a finding where the tool combines all instances of a finding into one. Whenever a new instance shows up, the old finding is reported again. We want it to work so that, once we mark a finding as "not a problem," any new instance is reported as a new finding, rather than the old finding popping up again as a new instance.
For how long have I used the solution?
I have been using Contrast Security Assess for about eight or nine months. I joined my current company last September and I've been using it since then. In our company we each have applications to work on, as subject matter experts for security. I have onboarded my applications into Contrast. After onboarding, I scan, tune the scan, and then separate the true positives from the false positives. I work with the governing team to fix the issues.
What do I think about the stability of the solution?
It's been stable. It hasn't gone down since we installed it on our cloud, and the scans run every day. We have very good support from the Contrast team, so they would be able to help us if we were stuck anywhere.
What do I think about the scalability of the solution?
It's easily scalable. We are planning to spread it to other teams and to onboard one more application from within our own team. It's just a matter of installing it on the proper cloud and it's good to go. It's easy to configure: you just decide which environment you want it on and make a few configuration changes.
In our company, it's mainly security who maintains and uses the tool. We haven't onboarded any of the developers or security champions within the company because we just started with it and we want to get to know the tool entirely. Then we can pass it on to other people in the company. For now, we, as the security team, are using it. Our team has 10 to 11 people. There are a few people from the DevOps team who have access to it to do the configuration stuff, and that team is another four or five people.
How are customer service and technical support?
Contrast's tech support is very helpful. They answer our questions and address our concerns. It's been easy and smooth with them.
Which solution did I use previously and why did I switch?
We did not have a previous solution. Contrast is a one-of-a-kind tool. It does runtime scanning so this is the only runtime scanning tool we have had.
Before me, one of my teammates was working on a different application and he was the first person to use Contrast. Then we bought three licenses. One more person used it before me, for a different application, and we have had good findings there as well. I have put the second license to use, and we have one more license available. We have identified an application to onboard, and we have also spread the word to other teams within the company; they're working closely with the Contrast team to use it in a different way. We are using the cloud version and they're still deciding how to use it. We are just starting with Contrast, but use of it is expanding within our company.
By "application" I mean monolithic, big applications. We currently have two such applications in Contrast and we will be working on the third one. We are looking to do more.
How was the initial setup?
The setup wasn't complex; it was pretty simple. We worked with an internal team that deals with the firewalls, because that's how it has to be configured. Because it was new to us, it took time to understand, but otherwise it was smooth and we were able to configure it pretty quickly. Everything together took under three months. It might have taken less time, but it fell during the December/January time frame, when we and people from other teams weren't available.
We have an internal process where we connect with other stakeholders to come up with a plan. We worked with a different team to configure the tool and run scans, and we work closely with them on key rotation and other maintenance connected to it. We have internal processes and our own strategy for managing and maintaining the tool: making sure scans run continuously and that key rotation is done.
There is also regular maintenance from Contrast, making sure that it doesn't go down.
What was our ROI?
We have definitely seen ROI. We have been able to onboard our applications and scan them. The scans run continuously, every day, and they do report new findings. We have been able to triage and fix them, addressing defects in the software even before they reached Prod. This helps reduce our attack surface and makes our products more secure.
What's my experience with pricing, setup cost, and licensing?
You only need one license per application. Ours are very big, monolithic applications with millions of lines of code, and we were able to apply one license to one monolithic application, which is great. We are happy with the licensing. Pricing-wise, they are industry-standard, which is fine.
Which other solutions did I evaluate?
There were other companies that the people involved in evaluations were looking at, but I was not involved in that process.
What other advice do I have?
It depends on the company, but if you want to manage, maintain, and onboard applications, I would recommend having Contrast as part of your toolkit. It is definitely helpful. My advice would be to install it in the environment in which the most routes are exercised, whether that is the testing environment or Dev, to get the most out of the tool.
In terms of configuration, we have Contrast on one of the applications in our testing environment and we have the other in the Dev environment. To decide on that took us some time because we didn't have access to all the environments of a single application.
Findings-wise, Contrast is pretty good. It's up to the app engineer to identify whether a finding is due to the functionality of the application or it really is a finding.
Contrast does report some false positives, but there are useful findings as well. It cannot give you only true positives, so it's up to humans to work out which ones are true and which are false. Applications behave in different ways, and the tool might not understand that. But there are definitely findings that have been helpful. It's a good tool; every tool has false positives, and it's better than some others in that regard.
We are not actively using the solution's OSS feature, through which you can look at third-party open source software libraries, because we have other tools internally for third-party library scanning.
It's been a good journey so far.