What is our primary use case?
The primary use case for FOSSA is in building mobile applications; we have a few dozen mobile applications that we build. The build script calls FOSSA automatically, the FOSSA scan determines what licenses are being used in the dependencies, and it checks whether they comply with our policies. If they don't, it notifies us that there's a problem. If they do, it generates information that we turn into a report, which we then use to disclose what licenses we're using in our product.
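As an illustration, the build-time integration can look something like the following pipeline step. This is a minimal sketch, not our actual build script; it assumes the public FOSSA CLI is installed and a `FOSSA_API_KEY` is set in the CI environment, and exact flags may differ by CLI version.

```shell
#!/bin/sh
# Hedged sketch of a CI build step that gates the build on license compliance.
set -e

# Scan the project's resolved dependencies during the build and upload results.
fossa analyze

# Exit non-zero if the scan found policy (license) violations,
# which fails the pipeline and blocks the release.
fossa test
```

Because the step runs inside the pipeline itself, a rushed engineer cannot skip it; a failed `fossa test` fails the build.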
There's a secondary use case, which isn't about mobile apps but about looking at code in general and running it through FOSSA to see what it can find, but that's a very rare occurrence. Most of the time we're just automatically scanning mobile apps.
How has it helped my organization?
FOSSA is at the heart of the license compliance part of our open-source management program.
We have an obligation to comply with open-source licenses on our products. Anyone in the business of distributing products has a certain obligation, and our job is to meet that obligation as best we can without spending too much time or money to do so. We're trying to efficiently ensure that we're doing the right thing, and FOSSA enables us to do that.
And because we've been able to integrate FOSSA into our build, it's not something that we have to run as a separate step after the fact, one that people could circumvent if they're really busy and rushed and want to skip it. They can't avoid it. It's part of our build process. As long as people are building software using our build pipeline, which they have to, we automatically scan for compliance, and that gives us a tremendous amount of confidence that if there's a problem we'll catch it.
There are two risks when it comes to compliance. There's the risk that you do the wrong thing, but the bigger risk is that you don't know you're doing the wrong thing. If you don't know there's a problem, you think there isn't one until you find out. FOSSA makes sure that we're looking at everything. We also have to do the right thing, but that's on us, and FOSSA makes it easy to do that too. The fact that we've integrated it into our process so it runs automatically means we're confident we're not missing the compliance step.
The solution is comprehensive. It does what we need.
It allows us to deploy software at scale. We have a CI/CD pipeline and build dozens of mobile apps, many times a day, and we deploy them frequently. We deploy more mobile apps than most companies in the world do; we're one of the larger providers of mobile apps in terms of both the number of apps and the number of customers we deliver them to. Ours is a fairly high-scale operation, and FOSSA is part of the build for all of it.
In addition, it has significantly decreased the time our staff spends on troubleshooting. I can't estimate by how much, because prior to FOSSA we didn't really have an effective troubleshooting process. It was hard to manage because there was a lack of process; we did it when we felt we had to. FOSSA gave us a regular process, and that allows us to predict how much time things take.
What is most valuable?
I view FOSSA as a singular tool, not really one that has components, so it's hard for me to say that there's a valuable feature. FOSSA, to me, is something that scans, and it determines what you have and if there's a problem. But if I were to call it a feature, the most valuable would be the deep dependency scanning.
If I were to use another tool that scans source code but not during build-time, one that just scans the source code as a static thing, then it would tell me what it thinks goes into my mobile app. But when I use FOSSA, and it scans during the build, it tells me what actually goes into my mobile app. I believe it's a more accurate way to determine what's in my code, because it not only tells me what my code says should go, but it tells me what actually goes during the build. When the build pulls artifacts from an artifact store, FOSSA detects that happening.
I found FOSSA's out-of-the-box policy engine to be accurate and that it was tuned appropriately to the settings that we were looking for. The policy engine is pretty straightforward. When it comes to licenses, there are always very strange cases that no policy engine can predict, so those happen once in a while. But for the most part, in terms of an automated policy engine, I find it to be very straightforward to make small modifications to, but it's very rare that we have to make modifications to it. It's easy to use. It's a four-category system that handles most cases pretty well.
The way we've set it up, it's compatible with our build and packaging system, and that's all we need it to be. So it's perfectly compatible.
The solution provides us with contextualized, actionable data, but I wouldn't call it "intelligence." It gives us a signal which says, "We have found a violation." As an analogy, data tells you what the temperature is outside. Intelligence tells you that the temperature is so cold that you have to wear a sweater, and wisdom is to know that you can't wear a sweater over a coat. Those are different. So FOSSA gives us data. It tells us there's a problem. It doesn't tell us much about how to fix the problem. It tells us where the problem is, but it doesn't give us intelligence. It gives us data. It provides us the signal of the problem and the component that is causing the problem, and allows us to inspect that component to see if it really is a problem.
What needs improvement?
Security scanning is an area for improvement. At this point, our experience is that we're only scanning for license information in components, and we're not scanning for security vulnerability information. We don't have access to that data. We use other tools for that. It would be an improvement for us to use one tool instead of two, so that we just have to go through one process instead of two.
Another area for improvement stems from the size of our project list. We have over 700 projects that we're scanning through FOSSA right now on my dashboard, and it's very hard to arrange 700 things. Any way that it would allow me to categorize projects better would help, so that I could say, "Automatically categorize all the Android projects, all the iOS projects, the ones that are for the US or Taiwan, and the ones related to this customer or that customer." Better ways to categorize the projects would be very helpful.
Another thing that would be very helpful is during issue triage. I would like a better way to get access to the component without leaving the app. When the app tells me, "Your mobile product uses this component and this component has a problem," I want to look at that component to see, first of all, am I really using it? And second, is the problem real? I have to do some triage. It would be better if the tool made that triage step easier within the tool, without me having to go outside it and search around: Am I really using that component? Does the component really have the problem FOSSA thinks it has?
One other thing that would help is a report of all the components that I'm using. For example, suppose I have two apps and each app uses a hundred components. These two apps are very similar. Are the hundred components they're using the same or different? I would like to be able to run a report that takes a list of apps and tells me: here are all the components of one, here are all the components of the other, and whether they are the same version. If I have two versions of the same app, how different are the components? Being able to do that kind of holistic analysis of the components would be helpful, beyond just the scan itself. It has scanned them, so it has all this data; I want to be able to put that data on a chart. Not all 700 of my apps, but three of them; I want to take three apps and compare their bills of materials. Are they using the same SDKs, and are they the same versions of those SDKs? That could be very helpful.
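The comparison described above can be approximated outside the tool today. Here is a minimal sketch; the component lists and names are hypothetical illustrations, not real FOSSA output, and assume each app's bill of materials is available as (name, version) pairs.

```python
# Sketch: compare two apps' bills of materials (hypothetical data).

def compare_boms(bom_a, bom_b):
    """Return components at the same version, version mismatches,
    and components unique to each app."""
    a, b = dict(bom_a), dict(bom_b)
    same_version = {n for n in a if n in b and a[n] == b[n]}
    mismatched = {n: (a[n], b[n]) for n in a if n in b and a[n] != b[n]}
    only_a = set(a) - set(b)
    only_b = set(b) - set(a)
    return same_version, mismatched, only_a, only_b

app_one = [("okhttp", "4.9.0"), ("gson", "2.8.6"), ("glide", "4.11.0")]
app_two = [("okhttp", "4.9.1"), ("gson", "2.8.6"), ("lottie", "3.4.0")]

same, mismatched, only_a, only_b = compare_boms(app_one, app_two)
print("same version:", sorted(same))        # ['gson']
print("version differs:", mismatched)       # {'okhttp': ('4.9.0', '4.9.1')}
print("only in app one:", sorted(only_a))   # ['glide']
print("only in app two:", sorted(only_b))   # ['lottie']
```

Having this kind of diff built into the product, over the scan data it already holds, is what would make the holistic analysis easy.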
For how long have I used the solution?
I've been using FOSSA for about two years.
What do I think about the stability of the solution?
We have a problem no more than about three or four times a year. Relatively speaking, it's very stable with very few incidents. We do have a few. Every so often there's a problem and we have to reboot or do something else, but it's rare.
What do I think about the scalability of the solution?
We've been able to scale the solution to meet our needs.
How are customer service and technical support?
Their technical support got better. Initially, it seemed that our demands exceeded their capacity. When we started, they were a smaller company, and we just had more demands. They eventually staffed up and provided a better route for receiving requests and tracking the time to resolve those requests. So initially it was a little slow. We understood that, and then it got better.
Which solution did I use previously and why did I switch?
We used Black Duck Protex. We switched to FOSSA because we found that FOSSA addressed our use cases better than Black Duck was able to. When we got Black Duck, it was before we were doing mobile apps, and the nature of the scan was different: we weren't scanning during build, we were scanning large repositories. As our business shifted to mobile, we found that we really needed to scan during build. We weren't able to do that with Black Duck, and we were able to do it with FOSSA. Black Duck provides the capability, but we weren't able to get it to work the way we needed. FOSSA looked like it was going to be better for us, even though FOSSA actually costs us more money.
How was the initial setup?
The initial setup was relatively straightforward. They gave us the hardware specification and we were able to set up the server, and it was up pretty quickly. The initial setup was pretty good. I've seen much harder setups, relatively speaking. It wasn't plug-and-play, because it was an internal install, but it was actually quite good.
From the time that we wanted it up and running to the time that it was up and running was probably a week or two, but that was largely because of delays on our end. Reality is complicated. Their process was pretty straightforward; it took two days. But our process took about two weeks.
They helped us with an implementation strategy. Our experience with their staff during deployment was favorable and positive. They were very good. They were motivated to help and they did a great job. We didn't have any incidents.
What was our ROI?
We have seen ROI because, at the end of the day, we're getting the value we needed. With Black Duck, we weren't really getting the value. We weren't able to get what we needed out of that tool.
What's my experience with pricing, setup cost, and licensing?
I don't love the license model where FOSSA charges per engineer, given that we don't really have engineers who use FOSSA. The metering they use is based on a model that doesn't make sense to us. We pay for some number of engineers, but only one engineer uses it, and we're not paying for just one engineer, because that's not fair; we use it for dozens of projects. So the method by which the price is determined doesn't make sense.
The amount that we pay, we pay. We're okay with it. We've negotiated a price that works for both parties, so that's not an issue.
Which other solutions did I evaluate?
We had a good relationship, and I was very familiar with Alameda. We looked at WhiteSource, and we looked at FossID and at two other small ones that never really made it into the market. They were startups that were trying to get in.
There were two reasons that we chose FOSSA over these other vendors. One was because it met our use cases. And the second was because I believed in the direction of the company. I was impressed with Kevin, and I believed that this was a company that would either meet our needs today or would be meeting our needs as our needs changed. FOSSA, as a company, was going in the direction that we, as a company, were going. Whereas, some of the other vendors that compete were going in a different direction and were solving problems that I was having less of and were ignoring problems I was having more of. I believed in their long-term prospects less, and believed in FOSSA's long-term viability more.
What other advice do I have?
Focus on those applications that pose licensing risks. I don't believe that one needs to use FOSSA to scan everything. You need to use FOSSA to scan products that you distribute to third parties.
The biggest lesson I have learned using the solution is that command-line integration is the most important part of a scan tool.
We don't really have "users" using it. There's really only one person who does most of the work around FOSSA and everything is automated. Very rarely does somebody go into the tool and do anything, but we have many apps that are scanned with FOSSA. The one person who goes into it is one of the mobile build engineers and he is responsible for the mobile build process, which includes FOSSA scanning.
We do not use FOSSA's security or vulnerability management features yet. We want to. We know that that was recently released, but we haven't enabled them on our internal build yet.
There's opportunity for the solution to grow. There are things that it needs to do that it doesn't yet do. There are things it does that I'm satisfied with. I can't give it a 10 because there's opportunity to grow, so this is really less a measure of the product and more a measure of my expectations for the product. I'll give it a favorable eight, because it does what I need it to do, but I need it to do more as well.
Which deployment model are you using for this solution?