What is our primary use case?
Up to this point, as an information security company, we had very limited visibility into the testing of our code. We have 25 Scrum teams working, but we were only included in very specific projects where information security feedback was required and our presence was mandatory. With the use of Contrast, including the evaluation we did and the applications we have onboarded, we now have clear visibility into the code.
How has it helped my organization?
In our most critical applications, we now have a deep dive into code evaluation, which was something we previously did with periodic vulnerability assessments, code reviews, etc. Now, we have real-time access to it. It's something that has greatly enhanced our code's quality. We have actually embedded a KPI regarding the improvement of our code health. For example, Contrast provides a baseline where the libraries and the usability of the code are evaluated, and it produces a score. We always aim to improve that score, and on a quarterly basis we have added this to our KPIs.
We have a site that serves many different products. We have a sportsbook and casino, where a lot of the casino games use the provider's code. Our false positives are mainly due to missing data points, since we have not integrated the application on the provider's side. Therefore, a request that is not checked on our side is checked on their side, leading to knowledge gaps that cause the false positives.
In regards to the applications that have been fully onboarded, we have had very effective results. Everything that it has identified has given us value, either in fixing it or in knowing what's there and avoiding doing it again in other parts of our code. It's been very effective and straightforward.
What is most valuable?
The real-time evaluation and library vulnerability checks are the most valuable features, because we have code that was inherited from the past, and we are trying to optimize it, improve it, and remove what's not needed. In this respect, we have had many unused libraries. That's one of the key things we are striving to carve out at this point.
An additional feature that we appreciate is the PCI report. We are a Level 1 merchant due to the number of our transactions, so we use it to test application compliance. We also use the OWASP Top 10 type of reports, since that standard is used by our regulators in some of the markets we operate in, such as Portugal and Germany.
The solution's automation via its instrumentation methodology is very effective, and it was a very easy integration. Because of the way we have designed our release methodology, it is not affected by how many releases we perform. It has clear visibility over every release we do, because it is the production code that is being evaluated.
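To illustrate the instrumentation approach in general terms (this is a generic sketch, not our exact setup; the agent path and application name below are hypothetical), an agent of this kind attaches to the application process at startup, for example on a JVM, so the running production code itself is observed rather than a separate scan artifact:

```
# Hypothetical launch configuration: attaching an instrumentation agent
# to a Java service. The agent watches real requests inside the running
# application, so every production release is evaluated automatically.
java -javaagent:/opt/contrast/contrast.jar \
     -Dcontrast.application.name=sportsbook-web \
     -jar sportsbook-web.jar
```

Because the agent rides along with the normal application process, no extra step is added to the release pipeline, which is why the number of releases does not affect it.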
The solution has absolutely helped developers incorporate security elements while they are writing code. The great part about the fixes is that they provide a lot of guidance about what you should avoid doing in order to prevent future occurrences in your code. Even though the initial assessment is done by senior, more experienced engineers in our organization, we provide the fixes to more junior staff so they have a clear marker for what they shouldn't do in the future. So, they are receiving a good education from the tool as well.
What needs improvement?
During the period that we have been using it, we haven't identified any major issues. Personalization of the dashboard, and how to make it appealing to an organization, is something that could be improved on their end. The reports could be more adaptable to the customer's preferences, but this isn't a big issue, as it's something the customer can do as they build experience with the tool.
During the initial approaches, in the PoC and the preparation of the solution, it would have been more efficient if we had been presented with a wider variety of scenarios aimed at our main concern, which is system availability. However, once we fine-tuned things using the scenarios they provided later in our discussions, we resolved it and went ahead.
For how long have I used the solution?
We evaluated the product twice: once in a PoC and once in a 30-day trial. Then, we proceeded to use it in production, where it has been for four months. Our initial approach was almost nine months ago, so we have had a fair bit of experience with them.
What do I think about the stability of the solution?
The application is very stable because it is on-premise, so we have had no issues with it. The stability of the solution is at a level where we just run a health check on it and nothing more is needed. We don't have issues with capacity, nor with very high levels of requests or delays. It is very smooth at this point. We fine-tuned it during the testing week, and after that, nothing changed. It handles the traffic in a very easy way. We just configure it through the Contrast tool, if needed, which is very straightforward.
The maintenance is very simple. We have had two patches applied, so I have only needed to involve our systems team twice during these four months, for an hour of their time. The health check of the system has been added to our monitoring team's tasks, so there is no overhead for us.
What do I think about the scalability of the solution?
At this point, we have provided access to 20 people on the Contrast platform. However, it is being used by more people than that, because once a vulnerability is identified and marked as something we should fix, it is handled by a person who may not have access to Contrast and is only presented with the specific vulnerability to fix. Top management receives the reports that we give them, as well as the KPIs. So, it's used across the organization; it's not really limited to the teams who have actual access to it.
At this point, we see great value for the applications we have it on, and we want to spread it across lower-criticality applications. This is a positive thing, because if we want to run it on a larger scale, we'll just add another web node and filter different apps onto it. It's very scalable and easy to manage. We are more than sure that it will cover the needs we'll have in the future as well. We have weekly releases with no issues so far.
How are customer service and technical support?
Every time that we approach them with a request, we have had an immediate response, including the solution, with the exact point in the documentation. Therefore, they have been very helpful.
It was a very smooth completion of the paperwork with the sales team. That's a positive as well, because we are always wary of the contract stage, but they handled it very efficiently.
I really want to highlight how enthusiastic everyone at Contrast is, from day one of the evaluation up until the release. If we think that we should change something and improve upon it, they have been open to listening and helping. That is something that greatly suits our mentality as an organization.
Which solution did I use previously and why did I switch?
Prior to this, we did not have such a solution and relied on other controls.
Our initial thought was that we needed a SAST tool, so we proceeded to approach some vendors. What sparked our interest in Contrast is its real-time evaluation of requests from our users and its identification of vulnerabilities in real time.
We have now established specific web nodes serving those requests. We get all the feedback from there along with all the vulnerabilities identified. Then, we have a clear dashboard managed by our information security team, which is the first step of evaluation. After that, we proceed with adding those pieces of the vulnerabilities to our software development life cycle.
Prior to using Contrast, we didn't have any visibility. There were no false positives; we had just emptiness, where even false positives would have been a good thing. Then, within the first week of having the tool, 80 or 90 vulnerabilities had been identified, which gave us a lot to work on, with only minor false positives.
How was the initial setup?
The setup is very straightforward. Something that has worked greatly in their favor: the documentation, although extensive, was not very time-consuming for us to prepare from. We have a great team and had a very easy integration. The only problems we stumbled onto were when we didn't know which deployment option would work better for our production environment. Once we figured that out, everything went very smoothly and the operation was a success.
The final deployment, once the solution design was complete, took us less than a day. However, deciding which option we would go with took a discussion that lasted two or three working days, split up over a week or so, in order to get feedback from all the teams. The deployment itself was very fast; it took one day, tops.
What about the implementation team?
Their support was one of the best I have seen. They were always very responsive, which is something that we appreciate. When you assign a person and time to work the project, you want it to be as effective as can be and not have to wait for responses from the provider.
Their sales team gave us feedback from the solution architects, who wanted to be involved in order to help us with some specific issues we were dealing with, since we were using two different technologies. We wanted some clarifications there, but this was not customer support; it was more at a solution level.
The integration of the solution's automation via its instrumentation methodology was very simple. We had excellent help from the solution architects on the Contrast Assess team. We had the opportunity to engage many teams within our organization: our enterprise architects, DevOps team, systems team, and information security team members. Therefore, we had a clear picture of how we should implement it, not only systems-wise, but also its organization-wide effect. At this point, we have embedded it in our software development life cycle (SDLC), and we feel that it brings value on a day-to-day basis.
We prepared a solution with the solution architect that we agreed upon. We had a clear picture of what we wanted to do. Once we put the pieces together, the deployment was super easy. We have a dedicated web node for that. So, it only runs that. We have clear applications installed on that node setup, so it's very straightforward and easy to set up. That's one of the key strengths of Contrast: It is a very easy setup once you decide what you want to do.
On our end, we had one person from the systems team; the enterprise architect, who consulted on which applications we should include; myself from information security; and DevOps, who were there just to provide information about the technologies we use on the CI/CD front. However, the actual implementation work was done by the systems team along with me.
From their end, they had their solution architect, and their salesperson acted as a project manager, who helped tremendously with response times. There were just two people.
What was our ROI?
The solution has helped save us time and money by fixing software bugs earlier in the SDLC. Our code health and quality improve by removing unused libraries and extensive code where it's not needed. From many aspects, it has a good return on investment, because we have to maintain less code and a smaller number of libraries, things which otherwise greatly increase the cost of our software development.
What it saves is this: when developers write something, they can feel free to post it for review and then release it. We are sure that if something comes up, it will be raised by the automated tool, and we will be ready to assess and resolve it. We are saving time on the extensive code reviews that happened in the past.
What's my experience with pricing, setup cost, and licensing?
For what it offers, it's a very reasonable cost. The way it is priced is extremely straightforward: it works on the number of applications you use, and you license a server. It is extremely fair, because it doesn't take into consideration the number of requests, etc.; it is only priced on the number of applications. It suits our model as well, because we have huge traffic but our number of onboarded applications is not that large, so the pricing works great for us.
There is a very small fee for the additional web node we have in place; it's a negligible cost. If you decide to apply it to existing web nodes, even that is eliminated. It's just something that suits our solution.
Which other solutions did I evaluate?
We had an extensive list that we examined, and we dove into some comparable solutions. We did have some excellent competitors, because they gave us a clear indication of what we wanted to do. We examined SonarQube and Veracode, which presented us with great products, but they were not a great fit for us at the time. These solutions gave us the idea of going with something much larger and broader than just a tool that produces findings. So, many competitors were examined, and we selected the one that best fit our way of doing things.
The main thing to note, and the key differentiator between Contrast and everything else we evaluated, is its value in production, since we had the chance to examine actual requests to our site using our code. Contrast eliminated the competition with its ability to capture the live aspects of each request as it is handled. That was something we weren't able to find in other solutions.
Some of the other competitive solutions were more expensive.
What other advice do I have?
I would recommend a try-and-buy approach. This solution is something that everyone should try in order to enhance their security. It's a very easy, fast way to improve your code's security and health.
We do not use the solution’s OSS feature (through which you can look at third-party open-source software libraries) yet. We have not discussed that with our solutions architect, but it's something we may use in the future when we have more applications onboarded. At this point, we have a very specific path to raise the volume of those critical apps, and then we will proceed to more features.
During the renewal, or maybe even earlier than that, we will go with more apps, not just three.
One of the key takeaways is that in order to have a secure application, you cannot rely on just pentests, vulnerability assessments, and periodic reviews. You need real-time feedback on top of that, and Contrast Assess offers it.
We were amazed to see how much easier it is to be PCI-compliant once you have the right solution applied. We were humbled to see that we had vulnerabilities which were so easy to fix, but which we wouldn't have noticed if we didn't have this tool in place.
It is a great product. I would rate it a nine out of 10.
Which version of this solution are you currently using?