Reporting Tools Forum

Rhea Rapps
Content Specialist
IT Central Station
Sep 21 2018
We all know that it's important to conduct a trial and/or proof of concept as part of the buying process.  Do you have any advice for the community about the best way to conduct a trial or PoC? How would you conduct a trial effectively?  What mistakes should be avoided?
Alberto Guisande
Hi Rhea, this is my personal opinion and shouldn't be taken as a best-practice manual, but here is the best approach I've found so far for PoCs:
- Be very clear about what you want to test in the tool. Let's say your current problem with data is volume: please don't test only volume handling, or you risk not assessing other functionality that may become the "new problem" in the future.
- Ask the vendor/partner to help you with the PoC. This is the best way to avoid getting stuck troubleshooting specifics of the tool instead of actually testing it, which can be very frustrating and can bias your judgement of the tool.
- Use a specific business case/business question to drive the PoC, and involve the business user in it (in the end, they are the SME).
- Don't hold back ANY questions from the vendor. They are there to sell, but once the buying decision is made, they should also be there to provide support, enablement, training, and so on.
Hope this helps you. Best, _AG_
Bob Gill
Hi Rhea, this is a great question. Just like Alberto, this is my opinion, so please take it with a grain of salt. Here are some tips I would suggest for any group looking at purchasing reporting tools.
* Define your current business challenge - What kind of decisions is this reporting supposed to support? How many users are involved? How are they organized into teams? What are the steps in the business process?
* Decide on some metrics in key categories - If you decide on metrics before looking at all the alternatives, it can make comparison conversations much clearer. Here are some examples (see the scoring sketch after this post):
A) Ease of Use - {Training required, Usable from Excel, Usable from Web, Usable from Mobile, ...}
B) Aesthetics - {Visuals, Responsiveness, Look and Feel}
C) Functionality - {Ease of development, Business rules, Change management, Output}
D) Support - {Community Support, Vendor Support, Partner Support, In-house Support}
E) Effectiveness - {Solves the business problem, Flexibility for change, Transparency, Accountability}
* Ask for a proof of concept - Vendors are happy to build a proof of concept with your own data. They know that showing you your own data will prove (or reject) the idea of the software helping the organization. After the vendor builds the proof of concept, ask multiple users from the organization to play with it.
As for mistakes to be avoided, here are some common pitfalls I've seen folks encounter.
* "My buddy uses it, so it must work" - Please don't assume that something that worked for someone else will work for your organization. Test all the key functionality in the PoC, or at least talk through it with your vendor/partner.
* "We need something now, so let's just do this quickly" - Skipping through the selection process adds a great deal of risk. If you don't have the time to invest in the selection process, just wait.
* "We want everything in the PoC" - PoCs are not meant to be full solutions with everything. Ask for the most important features to be developed as part of the PoC. Even if it does cover everything, you'll want to participate in the development process to ensure you know how to change things without having to go back to the vendor.
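As a rough illustration of how categories like the ones above can be turned into a side-by-side comparison, here is a minimal weighted-scoring sketch in Python. The category names, weights, vendor names, and scores are all placeholders, not recommendations; adjust them to your own requirements.

```python
# Minimal weighted-scoring sketch for comparing reporting tools in a PoC.
# All weights and scores below are placeholder values, not recommendations.

CATEGORIES = {            # weight per category (should sum to 1.0)
    "Ease of Use": 0.25,
    "Aesthetics": 0.10,
    "Functionality": 0.30,
    "Support": 0.15,
    "Effectiveness": 0.20,
}

# Example PoC scores on a 1-5 scale, filled in by evaluators after hands-on testing.
scores = {
    "Vendor A": {"Ease of Use": 4, "Aesthetics": 3, "Functionality": 5, "Support": 3, "Effectiveness": 4},
    "Vendor B": {"Ease of Use": 5, "Aesthetics": 4, "Functionality": 3, "Support": 4, "Effectiveness": 3},
}

def weighted_total(tool_scores):
    """Combine per-category scores into one weighted number for comparison."""
    return sum(CATEGORIES[cat] * tool_scores[cat] for cat in CATEGORIES)

# Print vendors from highest to lowest weighted score.
for vendor, tool_scores in sorted(scores.items(), key=lambda kv: -weighted_total(kv[1])):
    print(f"{vendor}: {weighted_total(tool_scores):.2f}")
```

The single number is only a conversation starter; the per-category scores usually matter more than the total when discussing trade-offs with stakeholders.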
Jhornber
All good guidance so far! A few things I'd add:
1) Before you even start a POC, put together a core set of requirements and ask vendors to complete an RFI (Request for Information) - basically a checklist of capabilities as they pertain to your use cases. This will help you identify early show-stoppers and rule out some vendors from the beginning, should they not be able to provide some of the core functionality you're looking for. For example, when we last evaluated tools, automated report generation and distribution was a requirement that several vendors could not meet. Likewise, more specifically, we needed a tool that could display image thumbnails within a table. Again, many vendors were unable to do this, and we were thus able to cross them off our list without spending any more time evaluating those solutions.
2) Echoing what Bob said, don't rely only on the vendor to prove something can be done by going away and doing it for you. Be sure you fully understand what it takes to do it yourself. Is it more or less out-of-the-box functionality, or does it require a lot of custom coding, extension building, etc.? A lot of solutions can accomplish nearly anything with enough time and technical expertise, but you probably don't want to rely on that for your primary use cases. Look for something that meets most of your core requirements while still enabling rapid development and deployment.
3) As much as possible, make sure you're testing under real-world conditions to properly gauge performance. I agree that data volume shouldn't be the only thing you assess, but do use one of your largest data sets rather than a sample, and make sure the POC is taking place in an environment that has relative parity with your production environment (see the timing sketch after this post). A lot of products demo well on a sample data set and the vendor's architecture, but fall short once fully deployed. Performance is crucial to adoption: you can build all the insightful, cool data tools in the world, but if they're slow, nobody's going to use them.
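To make the "test at production scale" point concrete, here is a minimal timing sketch, assuming the report under evaluation can be invoked from Python. The run_report callable and the dataset handles are hypothetical placeholders for however the tool is actually driven (API call, SQL query, CLI export, etc.).

```python
# Sketch: compare report runtime on a sample data set vs. a production-sized one.
# "run_report" and the dataset arguments are placeholders for the real invocation.
import statistics
import time

def time_report(run_report, dataset, repeats=5):
    """Run the report several times against the given dataset; return the median seconds."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_report(dataset)
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

def compare(run_report, sample_dataset, full_dataset):
    """Print how much slower the report is at production volume than on the demo sample."""
    sample_s = time_report(run_report, sample_dataset)
    full_s = time_report(run_report, full_dataset)
    print(f"sample: {sample_s:.2f}s, full: {full_s:.2f}s "
          f"({full_s / sample_s:.1f}x slower at production volume)")

if __name__ == "__main__":
    # Dummy stand-in so the sketch runs on its own; replace with a real report invocation.
    dummy_report = lambda rows: sum(i * i for i in range(rows))
    compare(dummy_report, sample_dataset=100_000, full_dataset=5_000_000)
```

Capturing even rough numbers like these during the PoC gives you something objective to discuss with the vendor before performance becomes an adoption problem.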