Bringing objectivity to a major upgrade at Maritz
Jon Hulbert was working as System Delivery Manager at Maritz Research, a global research agency. He was in the market for a data collection solution that would satisfy a cross-departmental group of around 30 executives and operational managers.
For Jon, the cornerstone of success in choosing software is to have a good, systematic process that can be seen by all stakeholders to be fair. “Bringing in a lot of different stakeholders certainly involves careful managing,” he states, “and it’s possible for the scope of the project to start to creep right there. This can be managed by drawing up very clear agendas and having quite firm chairing of the meetings.”
Jon chose to work with meaning to identify aspects of the project that were difficult to handle internally. He comments: “It lends objectivity to the whole project to have an unbiased outside person supporting the process. The long-term success of the project depends on the overall acceptability of the decision you reach.”
Having an outsider involved also helped to drive the project forward. “When projects like these arise they tend to be workload that is in addition to your day job. Once you have engaged someone from outside you are able to follow the process through, and reach a positive decision with all expediency.”

For Jon, the key to receiving an informative software demo is “a realistic scenario to work to as a case study. It is important to set some hurdles for the vendors that should be from real situations and real requirements. It is important to set realistic scenarios for them to demonstrate so that they are not just running through their prepared demonstration, as that gives them scope to smooth over all sorts of things.”
In Maritz’s case, this strategy proved to be highly revealing. A process that all the suppliers claimed they could handle was examined carefully in the demos. In one demo, it turned out to be a very complicated process; in another, the solution proposed was incomplete and unworkable.
The software Maritz was evaluating was far too complex to download as a demo disk, so the evaluation had to be done with the vendor presenting the software, which gives more scope for glossing over missing features. To avoid this, Jon suggests introducing some interactivity into the demo. “Pose questions to the supplier during the demo,” he suggests. “Introduce some subtle changes to the scenario and make sure they show you what happens when you do this. But it is very difficult to police this area. A clever sales pitch built on slideware will look very much like real software.”
Even after the demos produced a conclusive winner, there were still questions from the stakeholders that had not been answered definitively.
“So that led us, once we had selected a preferred supplier, to have them back and interview them in much greater depth about the technical aspects. In fact we spent two days with them: longer than we originally thought. We had a good poke round with the software, and even found that in some cases it did a lot more than we were expecting. I would not have been comfortable knowing we had made the right choice without having the answers to all our questions.”
Wise advice from Jon Hulbert
- Prepare an agenda for the presentation session.
- Have a real case study for your vendors to work on.
- Develop a list of criteria by which to evaluate the demonstrations.
- Be careful that you are not having the wool pulled over your eyes in the presentation.
- Go back in more detail, as a duty of care, once you have selected a preferred supplier.
This case study is taken from an article first published in research magazine.