Online research tools: some purchasing considerations

By Tim Macer

With hundreds of competing web survey systems to choose from, picking a winner for your needs is a major research exercise in itself. There is always the fear – which vendors have little motivation to allay – that things move so fast that whatever you choose will have been superseded before you finish the training session. And it is not just the technology that is moving fast: the needs of online researchers are too.

In last year’s Confirmit MR Software Survey (PDF, 1.7MB), 24% of the research agencies surveyed were planning to change their survey software in the next year, and a further 27% were not sure they would be sticking with what they had. Of those seeking a change, 62% cited the desire for more capabilities and features as a reason to switch; only 33% mentioned cost as a driver. There is clearly an appetite in the market for more sophisticated tools, and vendors report that much of the action is now with companies outgrowing one system and wanting to upscale.

Behind the froth and the heady buzzwords are some major differences in the kinds of solutions on offer. Understanding these from the outset can make the difference between outgrowing a package in a couple of years and finding one that grows with you. These differences are rarely apparent from suppliers’ websites or rolling demos – and unless you set out an agenda of the requirements you want to see covered in the sales presentation, you are unlikely to appreciate just what is missing from a well-rehearsed demo either.

Here, we provide some of the key differentiators across the market today. You can judge for yourself how important these may be to you, but they will all help you gauge the strength of the products you review and their ability to stick with you for the long haul.

Your place or theirs?

The most obvious differentiator is where the software runs and where the data are collected: does the software run entirely on the vendor’s servers, as a hosted or ASP (application service provider) solution, or does it sit on your own network and web server? It is not just the bargain-basement solutions that are hosted – some top-end ones make a virtue of it too. Having someone else manage all the technical and resource-management issues for you can save no end of trouble and expense.

Ask your potential supplier what contingency plans are in place to ensure continuity of service and cope with hardware or communications failures – and ensure these are reflected in the Service Level Agreement they offer as part of the contract. If inadequate guarantees are forthcoming, this is a provider to avoid.

Only the largest research agencies are likely to benefit from a system they host themselves. A happy medium for intensive users is software where the set-up and analytical tools run on your own systems, while the live surveys are hosted on your provider’s servers. For corporate clients, a hosted solution can mean independence from corporate IT, whose agenda is often at odds with the needs of an in-house research team.

Panel beaters

There are massive differences in the extent to which various web survey tools support the respondent selection and invitation processes. Some leave the whole task to you; others provide invitation and reminder capabilities but still expect you to manage all the contacts and load them in each time – even some of the more expensive ones. But there are plenty now that offer a good range of integrated respondent panel capabilities, so that both recorded participation and responses given in prior surveys can be used when selecting respondents for new projects, and even in predicting the sample frame size needed to achieve your desired quota matrix without wasting invitations.
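
As a rough illustration of that last point, here is a minimal sketch of the invitation arithmetic a good panel tool automates; all quota cells and response rates are invented for the example:

    import math

    # Completes wanted per quota cell, and response rates observed in
    # earlier surveys (all figures invented for illustration).
    quota_targets = {
        ("male", "18-34"): 100,
        ("male", "35+"): 150,
        ("female", "18-34"): 100,
        ("female", "35+"): 150,
    }
    expected_response = {
        ("male", "18-34"): 0.12,
        ("male", "35+"): 0.20,
        ("female", "18-34"): 0.15,
        ("female", "35+"): 0.25,
    }

    # Invitations needed per cell = target completes / expected response rate
    for cell, target in quota_targets.items():
        invites = math.ceil(target / expected_response[cell])
        print(f"{cell}: send {invites} invitations to expect {target} completes")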

If you want to set up your own panels, a proper panel solution will also make it easy for you to run the panel recruitment side, run multiple panels side by side, create distinctive respondent panel websites and look after reward management and redemption activities too. Only a few tools go that far yet.

Avoiding the boredom factor

Though online surveys can look visually interesting and engaging, they rarely do. The Internet is now essentially a visual medium, and the trend towards greater interactivity continues. Many web survey tools limit the range of forms and questions you can use, and make it hard to move beyond a meagre range of stock templates for questions, buttons and progress bars without engaging a website designer each time – an unrealistic prospect for most surveys. But some providers are pioneering more engaging ways to create surveys that look compelling and are enjoyable and stimulating to complete, through interactive components such as animated graphics, drag-and-drop selections and pre-built Java components.

This should not be dismissed as mere gimmickry: engagement feeds directly into response. Dull, uninteresting surveys do appear to result in fewer completes than those that convey a sense of care and effort in their presentation and keep the respondent wondering what the next screen may contain. Look hard at what effects the different packages can offer and, in particular, how much of this you can use without needing a designer each time.

What You See and What They Get

Most web survey tools are not fully WYSIWYG, for good reason: you design questions, and the system interprets them as forms and lays them out. This means the overall look and feel can be applied systematically for consistency, and you do not have to spend hours hand-crafting each page – which could easily become an obsessive compulsion. But the disconnect between what you see on screen and what actually appears in front of the respondent can be considerable. Some tools let you preview each question as you go; others make you jump through hoops before you get to see the finished article. The greater the distance between creating and previewing, the harder it is to improve the overall visual appearance of the questionnaire. Count how many steps you need to go through between writing a question and seeing what it will really look like on screen.

The Language Barrier

No system will translate your interview into other languages for you yet – that day is still far off. But the handling of multiple languages is a good litmus test of survey tools. At one extreme, some expect you to duplicate the survey and retype the text: really no support at all. Some require a programmer to go into the script and paste each text in one by one, a painful and error-prone activity. At the other end of the scale, some tools provide a special translator’s interface that lets a translator log in over the web and translate every text easily and reliably.

Others let you export all of the texts into an Excel spreadsheet to email to your translator, who types each translation into the adjacent column; you simply load the translated file back in. The more efficient methods can save precious days, especially when it comes to making last-minute changes.
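
To make the round trip concrete, here is a minimal sketch of that export-translate-reimport workflow; real tools typically use Excel, but CSV keeps the example self-contained, and the question texts are invented:

    import csv

    # Question texts keyed by identifier (invented for illustration)
    texts = {"q1": "How satisfied are you?", "q2": "Would you recommend us?"}

    # Export: one row per text, with an empty column for the translation
    with open("texts_for_translation.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "source_text", "translation"])
        for key, text in texts.items():
            writer.writerow([key, text, ""])

    # ... the translator fills in the third column and returns the file ...

    # Re-import: read the translations back in, keyed by question id
    translations = {}
    with open("texts_for_translation.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["translation"]:
                translations[row["id"]] = row["translation"]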

Check too that system messages, such as error prompts, either have been translated or can be translated on a once-and-for-all basis. It is no good having a stray English text pop up when something untoward happens because it is hard-coded somewhere in the system. And what about non-Roman scripts such as Greek, Cyrillic or Chinese? What about right-to-left languages like Arabic and Hebrew? You might not need them today, but one day…

The hidden costs

Confusion pricing, worthy of the mobile phone networks, now stalks the online survey world. Some firms price according to the number of complete interviews you achieve, some by the maximum concurrent connections allowed, some by panel size and one even by the number of interviewing minutes consumed (like a real mobile phone tariff). Worse, there are often price breaks that commit you to a particular level of activity too.

If you run the software on your own server, some vendors charge per server, some per site, some per live connection. Charges can be one-off with, say, a 20% annual maintenance charge; others charge an annual subscription – in advance, if they can.

To make sense of all this, project the total cost of ownership over three, four or five years. Put in all the start-up costs, including servers and hardware, design and template creation, panel development, training and so on. Work out different scenarios for volume-based pricing against optimistic and pessimistic predictions of your interviewing over the coming years. The true picture of the best deal for you will emerge – and it could be quite surprising.
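
A back-of-envelope comparison along these lines might look like the sketch below; every price and volume is invented purely to illustrate the method, not quoted from any vendor:

    YEARS = 5
    SETUP = 15_000  # templates, panel build, training – invented figure

    def hosted_tco(completes_per_year, price_per_complete=1.50):
        """Pay-per-complete hosted tariff over the whole period."""
        return SETUP + YEARS * completes_per_year * price_per_complete

    def licence_tco(licence=25_000, maintenance_rate=0.20):
        """One-off licence plus annual maintenance over the whole period."""
        return SETUP + licence + YEARS * licence * maintenance_rate

    # Pessimistic vs optimistic interviewing volumes
    for volume in (10_000, 40_000):
        print(f"{volume} completes/yr: hosted {hosted_tco(volume):,.0f} "
              f"vs licensed {licence_tco():,.0f}")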

Going for gold

That is not all. You are likely to find revealing differences in many other areas as you dig down. Here are another half-dozen that can sort the sprinters from the marathon runners:

  • The ease with which you can obtain response rates, quota metrics and other performance data, such as drop-off analysis, during the fieldwork period (a simple sketch of drop-off analysis follows this list);
  • The extent to which you can edit and clean your data, either case-by-case or systematically (both would be nice), and keep an audit trail of any changes made;
  • The range and quality of reporting tools provided for cross-tabs and report production;
  • The range of data exports available, which will avoid any retyping of definitions when loading data into your preferred analysis software;
  • The extent to which changes can be made safely, on the fly;
  • The level of technical skill required to operate the system – can anyone do it, or is it for techies only?
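
On the first of those points, here is a minimal sketch of the kind of drop-off analysis a good system should report during fieldwork; the data are invented for illustration:

    from collections import Counter

    # Last question reached by each respondent who abandoned the survey
    # (invented sample data)
    abandoned_at = ["q2", "q7", "q7", "q2", "q12", "q7", "q7", "q2"]

    for question, count in Counter(abandoned_at).most_common():
        print(f"{question}: {count} respondents dropped off here")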

This report is based on two articles published in the Online Research supplement of Research magazine, September 2007.
