The latest news from the meaning blog


SPSS Text Analytics for Surveys Reviewed

In Brief

What it does

Textual analysis software that uses natural language processing (NLP) to analyse textual data from verbatim responses to surveys. It will categorise or group responses, find latent associations and, if required, perform classification or coding.



Our ratings

Ease of use: Score 3.5 out of 5

Compatibility with other software: Score 4.5 out of 5

Value for money: Score 3.5 out of 5


One-off costs: standalone user £2,794; optional annual maintenance £559; single concurrent network user: £6,985 software, plus maintenance £1,397


  • Flexible – can use it to discover and review your verbatims individually, or to produce coded data automatically under your supervision
  • User interface is simple, straightforward and productive to use, once you are familiar with the concepts
  • Lets you relate your open-ended data to closed data from other questions or demographics
  • Easy imports and exports from SPSS data formats or Microsoft Excel


  • This is an expert system which requires time and effort to understand
  • System relies on dictionaries, which need to be adjusted for different subject domains
  • Rules-based approach for defining coded data requires learning and using some syntax

In Depth

One of the greatest logistical issues with online research is handling the deluge of open-ended responses that often arrive. While much of the rest of the survey process can be automated, analysing verbatim responses to open questions remains laborious and costly. If anything, the problem gets worse with Web 2.0-style research. A lot of good data gets wasted simply because it takes too long and costs too much to analyse – which is where this ingenious software comes in.

PASW Text Analytics for Surveys (TAfS) operates as either an add-on to the PASW statistical suite – the new name for the entire range of software from SPSS (see box) – or as a standalone module. It is designed to work with case data from quantitative surveys containing a mixture of open and closed questions, and will help you produce a dazzling array of tables and charts directly on your verbatim data, or provide you with automatically coded data.

A wizard helps you to start a new project. First, you specify a data source, which can be data directly from PASW Statistics or PASW Data Collection (the new name for Dimensions), an ODBC database, or an Excel file (via PASW Statistics). Next, you select the variables you wish to work with, which can be a combination of verbatim questions, for text analysis, and ‘reference questions’ – any other closed questions you would like to use in comparisons, to classify responses or to discover latent relationships between text and other answers. Another early decision in the process is the selection of a ‘text analysis package’ or TAP.

SPSS designed TAfS around the natural language processing method of text analysis. This is based on recognising words or word stems, and uses their proximity to other word fragments to infer concepts. The method has been developed and researched extensively in the field of computer-based linguistics, and can perform as well as, if not better than, human readers and classifiers, if used properly.
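
As an illustration only, the core idea of mapping recognised words and stems onto shared concepts can be sketched in a few lines of Python (the stem table and responses below are invented for the example, not drawn from SPSS's dictionaries):

```python
import re
from collections import Counter

# Toy stem/synonym table: each recognised word form maps to a concept label.
# Entirely invented - a real NLP lexicon holds many thousands of entries.
STEMS = {
    "deliver": "delivery", "delivered": "delivery", "delivery": "delivery",
    "slow": "speed", "slowly": "speed", "quick": "speed", "quickly": "speed",
    "staff": "staff", "assistant": "staff", "assistants": "staff",
}

def extract_concepts(verbatim):
    """Map each recognised word in a verbatim onto its concept label."""
    tokens = re.findall(r"[a-z']+", verbatim.lower())
    return [STEMS[t] for t in tokens if t in STEMS]

responses = [
    "Delivery was slow and the staff unhelpful",
    "Quick delivery, friendly assistants",
]
concept_counts = Counter(c for r in responses for c in extract_concepts(r))
```

Different surface forms ("slow", "quickly") converge on one concept ("speed"), which is what makes the later category counts meaningful.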

A particular disadvantage of using NLP with surveys is the amount of set-up that must be done. It needs a lexicon of words or phrases and also a list of synonyms so that different ways of expressing the same idea converge into the same concept for analysis. If you wish to then turn all the discovered phrases and synonyms into categorised data, you need to have classifiers. The best way to think of an individual classifier is as a text label that describes a concept – and behind it, the set of computer rules used to determine whether an individual verbatim response falls into that concept or not.
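
A toy sketch of that idea, with an invented rule format (TAfS has its own rule syntax; the categories and rules here are hypothetical):

```python
# Each classifier is a label plus a rule that decides whether a response's
# extracted concepts fall into that category. Rules and labels are invented.
classifiers = {
    "Praise for staff": lambda concepts: "staff" in concepts and "positive" in concepts,
    "Delivery problems": lambda concepts: "delivery" in concepts and "negative" in concepts,
}

def classify(concepts):
    """Return every category whose rule matches the extracted concepts."""
    return [label for label, rule in classifiers.items() if rule(concepts)]

codes = classify({"delivery", "negative"})
```

One response can match several classifiers at once, which is how multi-coded data arises from a single verbatim.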

TAfS overcomes this disadvantage by providing you with ready-built lexicons (it calls them ‘type’ dictionaries), not only in English, but in Dutch, French, German, Spanish and Japanese. It also provides synonym dictionaries (called ‘substitution dictionaries’) in all six supported tongues, and three pre-built sets of classifiers – one for customer satisfaction surveys, another for employee surveys and a third for consumer product research. It has developed these by performing a meta-analysis of verbatim responses in hundreds of actual surveys.

Out of the box, these packages may not do a perfect job, but you will be able to use the analytical tools the software offers to identify answers that are not getting classified, or that appear to be misclassified, and use them to fine-tune the packages or even develop your own domain-specific ones. Selecting dictionaries and classifiers takes just a couple more clicks in the wizard; the software then processes your data and you are ready to start analysing the verbatims.

The main screen is divided into different regions. One region lets you select the categories into which the answers have been grouped; another lets you review the ‘features’, or words and phrases, that have been identified; and in the largest region there appears a long scrolling list of all your verbatim responses to the currently selected category or feature, with all of the extracted phrases highlighted and colour-coded. A further panel shows the codeframe, or classifiers, as a hierarchical list. As you click on any section of it, the main window is filtered to show just those responses relating to that item. It also shows you all of the cross-references to the other answers, which is very telling. There is much to be learned about your data just from manipulating this screen, but TAfS has much more up its sleeve.

One potentially useful feature is sentiment analysis, in which each verbatim is analysed according to whether it is a positive or a negative comment. Interface was not able to test the practical reliability of this, but SPSS claim that it works particularly well with customer satisfaction type studies. In this version, sentiment analysis is limited to the positive/negative dichotomy, though the engine SPSS uses is capable of other kinds of sentiment analysis too.
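
The positive/negative dichotomy can be illustrated with a trivial lexicon-based sketch (the word lists are invented; the engine SPSS actually uses is far more sophisticated than this):

```python
# Toy sentiment scoring: count positive and negative words and compare.
# Word lists are invented for the example.
POSITIVE = {"good", "friendly", "quick", "helpful"}
NEGATIVE = {"slow", "rude", "poor", "unhelpful"}

def sentiment(verbatim):
    """Classify a verbatim as positive, negative or neutral."""
    words = set(verbatim.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

label = sentiment("friendly and helpful staff")
```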

The software also lets you use ‘semantic networks’ to uncover connections within the data and build prototype codeframes from your data, simply by analysing the frequency of responses to words and phrases and combinations of words and phrases – rather like performing a cluster analysis on your text data, except it is already working at the conceptual level, having sorted the words and phrases into concepts.
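
At its simplest, that frequency-and-combination idea is co-occurrence counting, which a short sketch can illustrate (the coded responses are invented):

```python
from collections import Counter
from itertools import combinations

# Each response has already been reduced to a set of concepts (invented data).
coded = [
    {"delivery", "speed"},
    {"delivery", "speed", "staff"},
    {"staff", "price"},
]

# Count how often each pair of concepts appears together in one response.
pair_counts = Counter(
    frozenset(pair)
    for concepts in coded
    for pair in combinations(sorted(concepts), 2)
)
strongest = pair_counts.most_common(1)[0]
```

Frequently co-occurring pairs suggest candidate categories for a prototype codeframe.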

You can build codeframes with or without help from semantic networks. It’s a fairly straightforward process, but it does involve building some rules using some syntax. I was concerned about how transparent and how maintainable these would be as a project is handed from one researcher to another.

Another very useful feature, which takes you beyond anything you would normally consider doing with verbatim data, is a tool that looks for latent connections between different answers, and even between the textual answers and closed data, such as demographics or other questions.

This may be a tool for coding data, but it is not something you can hand over to the coding department – the tool expects the person in control to have domain expertise and moreover, to possess not a little understanding of how NLP works, otherwise you will find yourself making some fundamental errors. If you put in a little effort, though, this tool not only has the potential to save hours and hours of work, but to let you dig up those elusive nuggets of insight you probably long suspected were in the heaps of verbatims, if only you could get at them.

A version of this review first appeared in Research, the magazine of the Market Research Society, June 2009, Issue 517

Cognicient Link

In Brief

Cognicient Link version 1.0

Cognicient, UK
Date of review: May 2009

What it does

Software utility that lets you create a consolidated database of numerous survey datasets for meta-analysis, aggregating survey questions and their responses across different surveys. It does this by creating flexible taxonomies that work independently of the original data structures to resolve variations at the survey level. Data may be queried directly within Link or extracted to statistical packages for analysis or modelling.

Our ratings

Ease of use: Score 3 out of 5

Openness: Score 4 out of 5

Value for money: Score 3.5 out of 5


Annual fee: £20,000 for the core system and a single data-upload user, plus £5,000 for each additional user.


  • All your survey data in one place – analyse anything by anything
  • Works with just about any format of survey data, and will import directly from Triple-S and SPSS and SPSS legacy formats (Quancept, Surveycraft etc)
  • Flexible in how you treat and resolve similarities and differences
  • Robust and highly scalable


  • Steep learning curve – requires some technical expertise to use
  • Some complex manual intervention required during set-up stages
  • Exports limited to SPSS or raw data files

In Depth

It is a pipe dream for many big research buyers that one day they will have a single database that contains absolutely all of their surveys, so they can play with all their data as if it was just one big survey. Talk to the people at Cognicient, and that dream can become reality. Cognicient, a small UK/US consulting company that specialises in survey data management, decided to make available the tools it had developed for its own internal use to create vast data warehouses of survey data, offering them as a package for others to install and use for themselves, or as a managed service through Cognicient.

It’s a common problem that, as research agencies or research buyers accumulate surveys on different products, different markets and at different times, it becomes increasingly difficult to pull these together in any meaningful way in order to make comparisons. The individual dataset becomes a straitjacket. Minor differences in the formats of the questions, the answer lists or rating scales used, or even discrepancies in the underlying format of the data make it impossible to put the data together for analysis, unless you are able to grab a few figures from previously reported data that just happen to meet all your criteria for comparison.

Cognicient Link is a database application based on a standard Microsoft SQL Server database, that lets you create a data warehouse of all your past surveys. In doing so, it breaks down the artificial barriers that normally exist between surveys. The importer lets you load data from each survey without needing to reformat it, and alongside that, to load in metadata on the survey too, such as how the sample was constructed, the fieldwork dates and method used or any other relevant information. This metadata can also be used, alongside the actual survey data, for queries and comparisons, adding another dimension to the data.

At the other end of the process is a range of tools that let you query the data and extract sets of variables selected from the database, whether they were original survey questions or metadata added at the import stage, to provide a working set of variables for analysis or modelling in stats programs like SPSS or SAS. Link does not attempt to offer any analytical tools other than providing simple counts when querying the data, and currently only outputs data as SPSS or raw data.

At the heart of the system is what Cognicient call the “taxonomy”. This is the truly clever bit – it’s a sophisticated master list of all of the fields you have in the database which categorises them for comparison. At the point where you import new data into the database, you supplement the taxonomy to point it to the specific variables in the incoming survey, and provide enough information for Link to be able to bring in the data and, if necessary, transform the data into a standardised format. It therefore maintains an indirect link between the source data and the consolidated data in the database, so you can add more data later if you need to. You can create your taxonomies to be very specific – to define a particular rating used on one product in a tracker – or make them generic, such as a ‘value for money rating’ which might be found in any survey. And the taxonomy can even accommodate variations in how that question was asked, essentially adding another dimension to that question which can be used in analysis or filtering. The concept is simple, but taking decisions about the best taxonomy to use is a complex process, and one that the people at Cognicient prefer to be involved with directly when introducing Link to new customers.
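
As a rough illustration of how a taxonomy entry might mediate between survey-specific variables and a master field (the field names, survey names and scales below are all invented for the example):

```python
# A master field maps each survey's local variable name, plus a transform
# onto the standardised scale. All names and scales are hypothetical.
taxonomy = {
    "value_for_money": {
        "tracker_2008": ("Q14", lambda v: v),             # already on a 1-5 scale
        "adhoc_2009":   ("VFM1", lambda v: (v + 1) // 2), # 1-10 scale -> 1-5
    }
}

def standardise(field, survey, raw_value):
    """Look up the survey-specific variable and transform to the master scale."""
    _local_var, transform = taxonomy[field][survey]
    return transform(raw_value)

std = standardise("value_for_money", "adhoc_2009", 8)
```

The indirection is the point: analysis always addresses the master field, while each survey keeps its original variable names and scales.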

Taxonomies are created as Excel worksheets, to a model specified by Cognicient. You also create a Survey Information Sheet, in which you specify the survey-specific metadata and information on file names. Link will cope with multi-level or hierarchical data, and you specify this here too. With the definition complete, you move on to loading the data, and here some rather ugly edits are required to the SQL Server database tables – something that Cognicient are planning to automate in a future version. You are then ready to open the Link Manager tool, from where all the other operations are performed, including importing the data, resolving differences and associating the questions in the current survey with the master list in the taxonomy.

The interface is functional, rather than elegant – but this is not software you spend time looking at – it is the means to an end. The tools it contains to query the data and make up extracts are basic but easy to use and exports are performed surprisingly quickly, usually in seconds or minutes. It is a pity that you cannot see more within the database itself though, and my plea would be for Cognicient to provide some tools that allow some real-time analysis in the future. Link is a fairly expensive product, and the process of adopting it requires a considerable commitment, but what it will let you do, through pitching all of your surveys into one big melting pot, is very exciting indeed.

Customer viewpoint: Brett Matheson, Vice President of Synovate MarketQuest

Brett Matheson is Vice President of Synovate MarketQuest, the firm’s global product design and development practice, where he has been co-ordinating an initiative to build a global database of concept, product, and packaging ratings for international consumer product clients. This database allows their analysts to perform a wide range of meta-analyses across the entire range of consumer studies, ranging from understanding the performance of a format of product packaging across different international markets and identifying seasonality effects on concept and product test results, to observing the relative effectiveness of different kinds of questions in obtaining consistent results.

Brett explains: “Cognicient Link fundamentally does two things: it provides the platform for combining data from a wide variety of sources and a wide variety of formats into a single database, and it provides the tools to efficiently get the data out in a meaningful way. The Link software allows us to do things that really set our database apart. First, the database is at the respondent level, not aggregate. That allows us to use much more powerful analytics. Second, instead of requiring strict adherence to common data collection protocols, Link allows us to embrace the variability of different approaches used by different clients. This makes construction of the database much more difficult, but it also gives us the flexibility that our clients demand.

“We already have hundreds of studies in the database and this is just the beginning – we add to it every day. We have many thousands of product, concept, and package tests, and we need a lot of data because when you start to drill down into individual categories or regions, you need to have data there to support your analysis.”

Analysts in Synovate MarketQuest use the query tools provided within Link to interrogate the database and extract relevant datasets to the question or hypothesis they are working with. Usually, the data are then exported in SPSS format for analysis in SPSS, or for some advanced modelling in SAS, which can also read the SPSS formatted data delivered from Link.

Brett continues: “We are very excited about what we’ve been able to do with it — it has already been an enormous benefit to us and our clients. There is certainly a lot of client interest in it. We now have all this data in one place and we are discovering new uses for it all the time.

“Some of the research on research we have done with this is focused on how we can help our clients make more efficient use of the funds they have for research. This includes things like examining the impact of sample size and composition on research conclusions.

“There is a very steep learning curve, because what you are trying to do is complex. But in the end you have an extremely effective process that allows you to do a lot of things that you might have always wished but you couldn’t very easily because they were just too expensive and took too long to do.”

A version of this review first appeared in Research, the magazine of the Market Research Society, May 2009, Issue 516.

Key Survey reviewed

In Brief

Key Survey version 7.1

Date of review: April 2009

What it does

Online survey data collection and reporting system provided on a software-as-a-service basis by a US provider represented in the UK and elsewhere. A sophisticated mid-range web survey tool offering a lot of flexibility at a highly competitive price.

Our ratings

Ease of use: Score 4 out of 5

Compatibility with other software: Score 3.5 out of 5

Value for money: Score 5 out of 5


Annual fee for an unrestricted licence in US dollars: single user $3,950, additional users at around $3,500 (cost reduces by number of users). Annual usage-based licence for up to 10,000 completes at $1,950 per user and 25¢ for extra pre-paid completes.


  • Browser based and system independent – works on PC or Mac browsers
  • Simple GUI for most operations with a powerful scripting language in the background
  • Highly customisable and extensible through a series of plug-ins and a software interface (API)
  • Built-in error checking capabilities


  • No panel management
  • Analysis capabilities somewhat restrictive
  • Only exports to SPSS and Excel. No Triple-S.
  • Cannot create your own plug-ins.

In Depth

Researchers looking for a flexible web survey tool will be familiar with the conundrum that the more affordable entry-level tools lack the flexibility you need, while the grown-up ones not only cost a fortune but can be bewildering to use. Key Survey is a relatively new entrant to the UK market which focuses on the middle-ground and is aimed at researchers who want to build and deploy sophisticated surveys for themselves. Compared to other products on the market, Key Survey sits towards the high end of the middle ground. For a relatively low fixed cost, based on the number of users, you get a hosted solution with no restrictions on the number of interviews you do each year – something that is almost unheard of in the software-as-a-service market.

Key Survey not only covers all the basics in style – like all the standard question types, all kinds of survey routing and logic, text piping, logic on answer lists, question libraries, look-and-feel templates, multiple languages, survey invitations and reminders – it also leads the non-technical researcher into some very sophisticated territory. It offers analysis and online reports, but does not provide any panel management support.

It has a question library, which comes populated with hundreds of well-worded questions organised by subject, and very useful search capabilities that other tools often lack. You can add to the library, or treat any prior survey as part of the library. Using this could help you to standardise survey design and harmonise demographics across surveys.

There are over a hundred different design templates to choose from, and it is easy to take any of these as a starting point and design your own templates too. These are all based on CSS (cascading style sheets), so they are sound at a technical level, but you do not have to get your hands dirty with any actual CSS coding, unless you actually want to.

One of the most versatile capabilities is the collection of plug-ins. These allow you to extend the basic functionality into untold areas of sophistication – and there are no extra costs for using them. Each essentially goes off and performs a task and then returns any resulting data back into the questions you have defined in your survey. There are plug-ins for various kinds of Flash-animated questions, such as sliders, calendars for selecting dates, or using Google Maps to choose a location. There are plug-ins to validate and clean data on the fly, reach out to an external database, or to perform geo-IP checking on survey respondents, to detect those pretending to be in countries they are not. There are already over 20 plug-ins, and whenever a customer requests a new one, WorldAPP makes it available to all customers too.
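
The general pattern – a plug-in that performs a task and writes its result back into a survey question – might be sketched like this (the registry, plug-in name and geo-IP stub are entirely invented; Key Survey’s actual plug-ins are proprietary):

```python
# Hypothetical plug-in registry: each plug-in runs a task against a response
# and its result is written back into a named survey question.
plugins = {}

def plugin(name):
    """Register a function under a plug-in name (illustrative only)."""
    def register(fn):
        plugins[name] = fn
        return fn
    return register

@plugin("geo_ip_country")
def geo_ip_country(response):
    # Stand-in for a real geo-IP lookup service
    return "UK" if response["ip"].startswith("81.") else "unknown"

def run_plugin(name, response, target_question):
    """Execute a plug-in and store its result in the target question."""
    response[target_question] = plugins[name](response)
    return response

resp = run_plugin("geo_ip_country", {"ip": "81.2.69.160"}, "Q_country")
```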

Logic-related errors are a common and expensive problem with surveys as they become more complex. I particularly like the Key Survey approach, which has a level of logical scrutiny built into it, so that it will advise you of ill-formed logic, questions that may never get asked, and so on. Furthermore, wherever you make logic selections for routing, filters or text piping, a simple on-screen assistant anticipates the choices you are about to make, and presents you with a context-sensitive overlay of the questions and answer options available – it is ingenious and very intuitive, and will actively help you stamp out most common scripting errors at source.
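
The “questions that may never get asked” check amounts to a reachability test on the survey’s routing graph, which can be sketched as follows (the routing data is invented):

```python
# Treat routing as a directed graph: each question lists where it can route.
# Invented example - Q5 is never routed to, so it can never be asked.
routing = {
    "Q1": ["Q2", "Q3"],
    "Q2": ["Q4"],
    "Q3": ["Q4"],
    "Q4": [],
    "Q5": ["Q4"],
}

def unreachable(routing, start="Q1"):
    """Return questions that cannot be reached from the survey start."""
    seen, stack = set(), [start]
    while stack:
        q = stack.pop()
        if q not in seen:
            seen.add(q)
            stack.extend(routing.get(q, []))
    return sorted(set(routing) - seen)

dead_questions = unreachable(routing)
```

A scripting tool can run this check after every edit and flag any question that drops out of the reachable set.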

My only concern over the design interface is that the survey questions and answers are presented as a single scrolling page (though they can be displayed in the survey on different screens), and on long and complex surveys this could become difficult to navigate, especially as the buttons and actions you need at the top of the page quickly scroll out of sight. It feels more web page than web app, in this respect.

The software does offer support for other self-completion streams of data too – there are additional cost modules offered for print/mail/scan surveys on paper, and for IVR on the phone, which share the same design and reporting environment.

The analysis and reporting side is a bit of a disappointment after all the capabilities in the upstream areas. It does make it relatively easy to create simple reports showing frequencies and charts, and these can also include a wide range of statistics. For some users, this may be sufficient, but the reporting formats are inflexible and most researchers will struggle to use the tool to probe their data fully and then build reports to communicate what is salient. What it does seem to be good at is letting you create quick snapshot reports that can be updated regularly as results arrive and which you can publish and share with your clients.

Users wanting to do deeper analysis are likely to want to move the data into other cross-tab or statistical tools, and here your luck runs out if it is not SPSS that you use, as that is currently the only export route that provides you with labels and definitions ready to run. WorldAPP is working on further exports, though the next one is likely to be SAS.

A particularly nice feature of the software is the live help button – this connects you immediately to a support representative who will answer any question you have using instant messaging chat. WorldAPP provides support from its three locations in Massachusetts, London and Kiev, in the Ukraine, so someone is usually available to answer questions around the clock. Support is also included in the fixed annual fee.

Customer viewpoint: Monica Coetzee, Research Manager, The College of Law

The College of Law is the UK’s largest dedicated postgraduate law school, operating from six campuses across England. It’s been using Key Survey for the last nine months in its research division, which carries out a broad range of surveys including student surveys on course quality, HR surveys, a continuous customer satisfaction survey on IT services and even ad testing, all done online with samples that are often in the thousands.

Monica Coetzee, Research Manager, explains: “We were previously using a desktop survey tool, but because there was only one person trained to use this software, it was causing a bottleneck. We wanted to restructure the department anyway and I wanted to change to working online.”

“As a non-profit organisation, price was critical for us when choosing a replacement, but we also needed something with a lot of advanced features. We reviewed several tools and Key Survey came out on top.”

Monica’s team now have around 30 surveys done with Key Survey under their belts. “What I like about it is that it allows all three of us to have access to the software online from anywhere, giving us greater flexibility, for example when working in different College centres. Each person has responsibility for an entire survey project and not just for specific tasks.” This, in many cases, includes exporting report-ready data into SPSS. “It exports it directly with all the data and value labels correctly filled in”, she reports.

“We now know there is nothing that we cannot do – and we use all sorts of advanced features like show/hide options and complicated skip routing. We very often bring data we already have into the survey and use the autofill function to use it for verification or routing. And it all works very well.”

A particular requirement for Monica is the visual appearance of surveys, and she was pleased to be able to produce a set of templates matching the corporate visual identity of the College that could be used consistently across all surveys. “Although they have a good range of templates, there is the option to go into the HTML programming, so I can play around and see what happens to make my own. With very little HTML knowledge we have produced some really nice-looking surveys.”

Monica also appreciates the built-in warnings and safety features of the system. She remarks: “If you try to do some things that would result in an error, it will give you a warning or prevent you from doing it. But once you have gone live with a survey, if you need to make a change, you can react so quickly with this – it will make that change instantly. It must be said, my favourite function is the live help button. They almost always answer within a few seconds, and they are always very friendly and helpful. I actually prefer using a typed interface to a phone call – I find I am generally more comfortable communicating online in this way.”

Published in Research, the magazine of the Market Research Society, April 2009, Issue 515

Catglobe reviewed

In Brief

Catglobe version 5.5

Catinét a/s, Denmark
Date of review: February 2009

What it does

Web-based software-as-a-service (SaaS) product for mixed mode data collection and analysis, including CATI, CAPI, CAWI and integrated panel management.

Our ratings

Ease of use: Score 3 out of 5

Compatibility with other software: Score 4.5 out of 5

Value for money: Score 4 out of 5


Variable cost based upon usage. Start-up costs typically €3,500 for configuration and training then €0.015 per panel member and €0.03 per interview minute, with some additional charges applicable


  • Completely web browser-based – supports Internet Explorer or Firefox on PC or Mac
  • Simple GUI for most operations with a powerful scripting language in the background
  • Strong on panel management and sampling capabilities
  • Good range of imports and exports including Triple-S, SPSS and Excel


  • Data analysis is inflexible and limited in scope
  • GUI questionnaire editor is cumbersome to use
  • Some performance issues – complex sample queries can be slow to run

In Depth

It may seem as if we are spoilt for choice with data collection software packages, but if you are looking for a multimodal interviewing solution that is also web-based, the choice is relatively narrow – especially if web-based CATI is part of the mix. So it’s good to welcome a new entrant on the scene in the guise of Catglobe, a mixed mode interviewing system offered as a SaaS (software as a service) product by the software division of the Danish fieldwork company Catinét. The SaaS model makes it very easy to get started with little infrastructure in-house – all that is needed is a reasonable internet connection and a web browser. It is not fussy about which one: Firefox on Windows or Mac, or Internet Explorer on Windows, works equally well.

Catglobe is a surprisingly vast system, and the fact that it has been extensively road-tested by Catinét’s in-house fieldwork team is evident in the range of capabilities and options provided. There are different modules for sampling, questionnaire authoring, fieldwork management, reporting and some report automation. The interviewing module supports CATI, laptop CAPI and CAWI, and even has a special hall test mode for a temporary local network of interviewing stations. All of these modules are accessed from a central home page through a pop-up menu similar to the Windows start button.

Behind all of this is a single relational database which holds all of the assets or resources relating to your surveys – questionnaires, survey responses, respondents or panellists, interviewers, reports and so on. This is one of those advanced systems that moves you away from the rigid boundaries of the survey in defining how data are organised. The concept of the survey still exists, though more as a workflow concept. The system presents the surveys you have available to work on as a folder structure, which you can model as you wish.
However, in the background, all the survey does is provide a convenient, organising view of the central data repository. Questions and response data from one survey are easily accessible from others, if you can make a connection through questions or respondents in common. This opens up endless possibilities for using your data more intelligently, both in sampling and in analysis, and it makes the logistics of running one or more panels really simple.

Panel management is an area Catglobe handles particularly well. At the sample selection stage, there is a wonderful tool for building ‘groups’ – which are effectively database queries. You use a group to pull a sample from the respondent database. However, this is a query tool that understands concepts such as key demographics, sample frames, frequency of previous response and interviewer resting rules. It then ties in seamlessly with the ‘communications’ module that serves invitations and reminders for web surveys. These work directly from a library of templates, so it is very quick to set up an invitation from an existing project and adapt it slightly for the survey. The system is fully multilingual, so invitations can be templated in several languages and then dispatched in the appropriate one for each respondent. As it is also truly multimodal, samples can be drawn for CATI or CAWI in parallel.

The workflow is well designed, so it is not only quick to run through the process from end to end, but also flexible when changes are needed, or if the sample requires a boost part-way through the fieldwork. Panel recruitment works equally well, and there is considerable scope to automate this, including ongoing top-up recruitment. Recruitment can be by web or by telephone, and a phone recruit can be used to trigger an immediate web survey invitation for new panellists to complete their profile data. There is also an elaborate points allocation and redemption capability, if you wish to incentivise your panel.
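
A ‘group’ of this kind behaves like a database query with panel-specific predicates. A minimal sketch of the idea (the panel data and the 90-day resting rule are invented for the example):

```python
import datetime

# Invented panel records; a real panel database would hold far more fields.
panel = [
    {"id": 1, "age": 34, "country": "DK", "last_interviewed": datetime.date(2008, 10, 1)},
    {"id": 2, "age": 51, "country": "DK", "last_interviewed": datetime.date(2009, 1, 20)},
    {"id": 3, "age": 29, "country": "SE", "last_interviewed": datetime.date(2008, 6, 15)},
]

def draw_sample(panel, country, rest_days, today):
    """Select panellists in a country who are outside the resting period."""
    cutoff = today - datetime.timedelta(days=rest_days)
    return [p["id"] for p in panel
            if p["country"] == country and p["last_interviewed"] <= cutoff]

sample = draw_sample(panel, "DK", 90, datetime.date(2009, 2, 1))
```

The respondent interviewed in January is excluded by the resting rule even though they match the demographic criteria.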
Access to surveys and functionality is managed from the HR module, which allows you to define roles and allocate individuals to them. Respondents and panellists are treated in the same way: everyone from the system administrator to the panel member is registered as a user with usage rights associated with them.

The majority of the system has a very cohesive appearance, which is simple to follow – it passes the test of deceiving you into thinking you are using a desktop program, when in fact it is a browser-based web app. At the bottom of the screen are two buttons: one labelled Tools, which is the ‘start’ button that gets you to all the different modules; the other is the Folders button, which takes you to a tree view of folders containing surveys, questionnaires, templates – essentially all your data.

The questionnaire editor has a somewhat different feel, and is not as well crafted as the other modules. It provides pretty much everything you need, but it feels clumsy to use. You can view a list of the questions, but important details, such as the answers, cannot be seen without going into the question itself. There is no overview of the logic or routing, which makes life difficult for the scripter. There is a powerful scripting language available, and parts of the questionnaire created in the GUI can be exported into it, which makes writing it (and learning it) much easier. This is an excellent capability. Unfortunately, you are likely to need it more than you should when writing web or CATI interviews of even medium complexity, such as creating a constrained-sum set of questions. Really, more options should be built into the questionnaire editor.

Also on the downside, users have reported sluggish performance with some database operations, such as drawing sample or exporting results, once the number of records runs into the hundreds of thousands – though Catinét report that they have worked to improve this.
The reporting capabilities are also unlikely to meet most users’ needs at present – there are some nice features there, but Catinét have ambitious development plans for the reporting side, so this is likely to be improved over the current year.

Customer viewpoint: Ólafur Thor Gylfason, Market and Media Research, Reykjavik

MMR moved to Catglobe a year ago, in order to move to a single interviewing platform for CATI, CAPI and web interviewing, replacing a range of different packages in use until that point. As Ólafur explains: “The good thing is we have been able to use this platform for everything we do, from CATI recruiting of panels to CATI phone interviews, CAPI and CAWI.

“There is a powerful programming language within the software, so when we do complicated surveys, such as international surveys where you have to produce an exact data map afterwards, we can write the data handling programs in advance, so that when the survey is finished, we can export the data in the exact format the client requires straight away. With this programming language there is nothing you can’t do with the software, provided you have a little bit of programming experience.

“Another positive thing about the software is that we use it for open-ended coding, and this capability is very powerful – we can do this on the fly, so the turnaround time on projects can be reduced considerably.

“We use it for CATI recruiting, and once the phone phase is completed, the automated CAWI questionnaire is sent out immediately, and everything is always interlinked, so it is very good for us.

“With panel management, there are two key points. Firstly, it is completely multimodal: all recruitment is done by phone, but recruits are immediately served a web survey to complete their profile. Everything happens at the same time. Secondly, the sampling is very easy to work with. It makes sure there is the right load across the sample, ensures that panellists get the right number of invitations, and keeps track of invitations and reminders. Their ‘group builder’ is very powerful and very easy to use, and the communicator, which is the email part of the system, links in with the group builder.
“We have run into some problems, but the main thing for us has been that the support has been excellent – they are almost acting as a division within our company when it comes to support, so you forget about the bad things very quickly.”

Published in Research, the magazine of the Market Research Society, February 2009, Issue 512

Marketsight reviewed

In Brief

Marketsight version 7.3

Marketsight Inc., USA
Date of review: January 2009

What it does

Web-based research data reporting environment offered as a hosted solution and aimed at research data consumers, either to browse existing tables and charts, or to produce their own analysis. Offers capabilities for research agencies to publish results to clients through the software.

Our ratings

Score 4.5 out of 5Ease of use

Score 4.5 out of 5Compatibility with other software

Score 4.5 out of 5Value for money


Professional $995, Enterprise (includes portal features) $1495. Academic licences at 90% discount. Charges priced in US dollars, per user per annum and include training and support. Reduced fees for agencies providing licences to end-users.


  • Very easy to upload your own projects as either SPSS SAV files or Triple-S data
  • Excellent support for charting both within the tool and when exporting to Excel or PowerPoint
  • Can use it simply as a means to distribute reports, to interrogate data, or both
  • Rich set of capabilities for recoding and transforming variables


  • Though web-based, currently only works under Microsoft Windows with IE6 or IE7
  • A little prescriptive in the kinds of reports it can produce – not necessarily for the power user
  • Ignores variable names in data imported from Triple-S or SPSS: you have to work with the question text

In Depth

The transformation that MarketSight has gone through since we last reviewed this web-based cross-tab tool two and a half years ago is a bit like getting a visit from the son of a friend who was a teenager the last time you saw him, and is now a confident and capable adult with a university degree who wants to come and work for you.

Back then, MarketSight was a simple end-user tab tool with a few nice touches, but quite a lot of limitations too. Though it was provided as a self-drive tool, it really relied on purchasing some consulting time in the background to get surveys set up, or to carry out the kinds of transformations you were likely to need on the data. Then, the product was developed and marketed by a division of the Monitor Group, a large business consulting firm based in Cambridge, Massachusetts. This provenance showed in the kinds of features the software had, or more importantly, did not have. It was very SPSS-like in its approach to tables and lacked support for filters and even multiple response data.

Since then, Monitor Group has spun off the original MarketSight team, which now owns and develops the software independently. Development is now strictly MR-focused and the result is a much more research-centric approach to data analysis. At heart it remains an easy to use cross-tabbing tool but with a new drag-and-drop interface. You can build reports and save them for re-use later, or if someone else has set up the report for you, you can simply open the tool and review the reports.

Gone are the irritations about not being able to define or apply filters, or create tabs with multiple response data: they are all in place now. You can also drag as many variables as you like into the rows and the columns of the table.

Charting has been integrated with the tables in a very practical way. Each one-by-one combination of variables in the cross tab is presented in the output display with its own small histogram icon. Click and a window opens to display the data graphically in a way that makes any interesting variations in the data immediately obvious to any lay user. A further button lets you tailor the chart, print it or export it.

There is better-than-typical support for ranking or sorting of answers in cross-tabs, and you can rank by any column, by the base or the mean. A simple arrow icon highlights which column has been used for ranking. Charts too are easily ranked.

You can also export whole groups of tables as charts, and post them directly to PowerPoint or Excel. Within Excel, the program will helpfully provide you with a tabbed worksheet containing the chart and another containing the table – and both look extremely presentable without any tweaking, which in my experience is an accomplishment in itself. The program will not produce a completely presentation-ready PowerPoint deck, but it will get you very close to it.

Other strong areas within the product are the creation of calculated variables and categories to combine variables or categories or break numbers into ranges, and a powerful way for end-users to create very similar transformations on a lot of variables, such as to add a top-two box to a rating scale. It means that researchers or end-users can be very self-sufficient, and avoid the need to keep going to their DP supplier. Whole sets of analyses can be copied from one dataset to another.
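A top-two box is a simple derived variable, and the sketch below shows the idea in miniature – this is purely illustrative of the transformation described above, not MarketSight’s actual code.

```python
# Collapse the top two points of a five-point rating scale into a single
# derived 'top-two box' category, a common survey transformation.
def top_two_box(ratings, scale_max=5):
    """Return 1 where a rating falls in the top two scale points, else 0."""
    return [1 if r >= scale_max - 1 else 0 for r in ratings]

satisfaction = [5, 3, 4, 1, 2, 5]   # five-point scale responses
t2b = top_two_box(satisfaction)     # derived top-two-box variable
pct = 100 * sum(t2b) / len(t2b)     # share of respondents in the top-two box
```

The value of doing this in the analysis tool, rather than at the DP stage, is exactly the self-sufficiency the review describes: the same derivation can be applied across a whole battery of rating scales without a round-trip to the data supplier.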

A big breakthrough is the importing: anyone can upload their own data and variables, provided they have either a SAV file or a Triple-S data and metadata file – which means you can load in data from a very wide range of survey data collection tools. My only grumble is that, while it imports all the text, it does not import the variable names, and that can make identifying questions difficult in many surveys.
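To see what is being discarded, consider how Triple-S metadata pairs a short variable name with the full question text. The fragment below is hand-written for illustration and simplified from the real standard, but it shows the name/label pairing that gets lost on import.

```python
# Read variable names and question text from a simplified Triple-S
# metadata fragment (hand-written for illustration).
import xml.etree.ElementTree as ET

TRIPLE_S = """<sss version="1.1">
  <survey>
    <record ident="A">
      <variable ident="1" type="single">
        <name>Q1</name>
        <label>How satisfied are you overall?</label>
      </variable>
      <variable ident="2" type="quantity">
        <name>Q2AGE</name>
        <label>What is your age?</label>
      </variable>
    </record>
  </survey>
</sss>"""

root = ET.fromstring(TRIPLE_S)
# Map each variable's short name to its question text -- the pairing a
# tool loses if it keeps only the label on import.
names = {v.findtext("name"): v.findtext("label")
         for v in root.iter("variable")}
```

With hundreds of similarly worded questions, the terse name (`Q2AGE`) is often the only reliable handle on a variable, which is why dropping it is a genuine nuisance.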

MarketSight is still a bit prescriptive in what it will allow you to present in a table, which could frustrate the power-user. It also lacks the means to examine cases individually, to check outliers or view verbatims. It does not handle duplicated datasets or files and reports saved from multiple users as well as it should – unaided, your report libraries could descend into chaos. Plus, it currently only works in a Microsoft Windows environment, under IE6 or IE7, which is not everyone’s browser of choice – though this is planned to change.

If you pay a bit more and get the enterprise edition, you also get a portal environment in which you can upload other files relating to a project, and use it to start building your own research library. The system also contains a full permission control system, so that different users can be given different access rights to surveys and also to have functionality turned on or off. It therefore makes the program an attractive proposition for research agencies wishing to provide a data portal to their clients.

MarketSight’s developers deserve praise for providing users with a wealth of online help, tips, tutorials and advice all through the product. It makes this web-based tool feel like a cross between a program and a website: and what could be more appropriate for a product focused on providing information?

Customer Viewpoint: Renée Zakoor at KB Home, Los Angeles, USA

Renée Zakoor is the Director of Market Research at KB Home, a new home building company that operates in 15 states in the USA.

MarketSight is used across the business to distribute market research information. Renée explains: “We do a specific survey in each of our nine divisions and that data becomes the basis for major decisions each division has to take about what to build, where to build it and so on. We upload each survey onto MarketSight. My team works with it to do analysis, but it is also put there for the people in the divisions to make use of.

“What I love about MarketSight is that non-market researchers can easily go in and answer their own questions. Then the ability to export it into Excel so they work with it that way, and do graphics to PowerPoint is just great as well. We tend to give staff members an hour’s worth of training and usually they can run with it. I also have senior managers who find they can go in and answer their own questions. It is very user-friendly, which I think is critical.

“We have now started to work with using MarketSight as a repository for all kinds of files we want everyone out in the divisions to have access to. Previously we were using an intranet, which meant using another internal resource. Using MarketSight, this is easy for me to do for myself. You do not need a lot of sophisticated computer skills to be able to upload files to it.”

Another improvement that Renée welcomes is the ability to replicate sets of analyses for different regions or users, where the project is essentially the same but different users will each work with their own dataset. “We can set up analysis for one market and it is then easy for us to copy it over to all the other markets without having to recreate it – so there are a lot of efficiencies for us in that.”

Asked about any anxieties Renée might have about making research data so widely available for non-researchers to run their own analyses, she is unequivocal: “Over the years, I have become less concerned [about this]. I feel the more transparency there is in the data and the more people you get using data, the better. The first step is trying to get people to use research to make decisions and this is a tool that will help them do that. I find it frees up a lot of researchers’ time to be more consultants to the non-researchers. If people have bought into the methodology, it can prevent a lot of misinterpretation. Ultimately, the research is just another tool, and it is down to the researcher to be the partner that will help business people make the most of those tools: MarketSight just helps make those tools more accessible.”

Published in Research, the magazine of the Market Research Society, January 2009, Issue 511

Vovici 4.0 reviewed

In Brief

Vovici Community Builder and Feedback Intelligence, version 4.0

Vovici, United States
Date of review: November 2008

What it does

Web-based suite for building online research communities and custom panels, for both quantitative and qualitative research. Allows you to create fully branded respondent community portals easily, using an online point-and-click interface. The Feedback Intelligence module offers sophisticated dashboard and drilldown reporting for individualised reporting to stakeholders across the enterprise from multiple data sources, and is integrated with Business Objects.

Our ratings

Score 4.5 out of 5Ease of use

Score 5 out of 5Compatibility with other software

Score 3.5 out of 5Value for money


Annual fees in US dollars: Community Builder module starts at $24,995 for up to three named users; Enterprise Feedback Management module starts at $24,995 plus $1,500 for each portal user.


  • Standardise and co-ordinate surveys, questions and measurements across all a company’s survey activities
  • Requires no web programming or HTML skills for the most part
  • Platform independent – Windows, Mac or Linux with any modern browser
  • Integrates with Oracle CRM and a range of industry standard CRM systems


  • No built-in incentive or reward management
  • Can only execute surveys created in the Vovici EFM survey module
  • Relatively expensive

In Depth

Today more and more companies are realising the benefit of building online panels of customers to involve in research. The idea is simple, but the reality can be complex and costly to deliver from a technical standpoint. Whether firms try to build them for themselves or park the problem with a research agency, it is an area crying out for an off-the-shelf solution like the new Vovici Community Builder, which was released last month and effectively relaunches the concept of the panel management tool for the demands of Web 2.0-style research.

The product is a completely web-based suite which sits astride a database of contacts or panellists, and allows you to interface directly with the Vovici EFM survey engine as well as other enterprise platforms or CRM systems – such as Siebel or Hyperion – so that sample selections can refer to real behavioural data from recent transactions for that customer. Configuring the interfaces with other enterprise data sources is, understandably, beyond the lay user, but once these have been set up, a customer’s purchase history can be used just like any other piece of panel profile data, such as age or location, or be used to drive sample selections for just-in-time research.

The Portal Builder

At the heart of the software is the Portal Builder, in which you design the pages of the community site your target panel members will visit. It is effectively a content management system which allows you to lay out pages with placeholders for content that will be streamed in from other sources, and which you can arrange neatly in different columns and boxes in the way most websites are organised these days. So, in the centre you could choose to put a list of the surveys the respondent is invited to, with some introductory text above, headlines from the current community newsletter on the left, highlights of recent survey results on the right, and so on. The portal has built-in support for just about all of the objects you are likely to need when building a research community site: a profile editor so panellists can view and update their personal data; current survey invitations; past surveys taken; containers for welcome messages, help and links to more information or contacts. The list reaches far into the Web 2.0 milieu: you can add forums for collaborative discussion, blogs for respondents to view and react to, and data mash-ups.

There is also a wealth of collaborative tools, from a simple suggestion box to access-controlled forums that can be used for asynchronous focus groups, so that quantitative surveys can be backed up by some selective qual work or vice versa. The highly modular approach means that any tool can be access controlled, and only available to invited participants. And if you are concerned that this portal page is getting a bit busy, it is easy to spread it across a series of tabbed pages, which you can title and organise how you like. There is a large template library, and it is very simple to create an overall theme with your own imagery and branding.

You can also publish results through the portal and make these relevant to the respondent – you could present each member with a report showing their answers compared to the survey as a whole, for instance, show highlights and add commentaries. Vovici emphasise this as the means to build interest and engagement, and work on the assumption that the kind of interest a community member will derive from the experience as a whole will eliminate the need to offer financial inducements. As a consequence, there is no built-in incentive and reward management capability in the product – something that will not go down well with agencies wishing to build panels.

Though the Vovici name may be unfamiliar to many, what is now branded as Vovici EFM was originally developed as Perseus EFM. The main web survey capabilities and engine are an incremental development of the Perseus EFM software which Interface reviewed in Research July 2006 when it was already a mature and capable offering for online research. Vovici has recently established sales and support offices in London and Singapore alongside three existing locations in the United States.

The other major addition since Vovici took over is in reporting. There is now a dashboard reporting system largely in place, with some development ongoing. It follows a similar philosophy to the Community Builder by allowing you to arrange graphical and tabular reports across the screen in columns and rows – as designer, you choose what reports to show simply by pointing and clicking, selecting them from menus and so on. Again, the overall appearance is controlled by externally defined templates and stylesheets, so the entire reporting experience can be themed and branded to match a corporate intranet site.

Reporting in Business Objects

The reporting system is, in fact, built on Business Objects (using Crystal Reports), which is a widely used reporting tool in the mainstream corporate database and business intelligence sector. However, Business Objects is typically of limited use with survey data, because it does not understand common survey concepts such as multiple-response data, respondent bases that may differ from the number of responses to a given question, or one-off data formats for each short ad hoc survey. The breakthrough with Vovici is that the developers have created a data model and accompanying metadata to make research data comprehensible to Business Objects.
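The mismatch described above is easy to see in miniature: multiple-response data is naturally stored ‘tall’, with one row per answer, so a generic reporting tool that percentages on row counts gets the base wrong. The sketch below is an invented illustration of the modelling problem, not Vovici’s actual data model.

```python
# In a 'tall' table of multiple-response answers, the number of rows
# exceeds the number of respondents, so percentages must be based on
# distinct respondents rather than on row counts.
responses = [  # (respondent_id, brand_mentioned)
    (101, "A"), (101, "B"),
    (102, "A"),
    (103, "B"), (103, "C"),
]

rows = len(responses)                      # 5 answer rows
base = len({rid for rid, _ in responses})  # 3 distinct respondents

# Percentage mentioning brand A, on the respondent base, not the row count
brand_a = len({rid for rid, b in responses if b == "A"})
pct_a = 100 * brand_a / base               # 2 of 3 respondents
```

A survey-aware metadata layer essentially teaches the reporting engine which identifier defines the base for each question, which is what makes concepts like this comprehensible to a tool such as Business Objects.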

The beauty of this is that any reporting can be a composite of hard commercial data alongside softer attitudinal and intentional survey data. Questions too can be analysed across different surveys. By smashing through the old silo approach, Vovici is also working towards delivering true benchmark capabilities. The idea is that any question can be reused across any survey, and once the same question has been reused, all responses to it can be used to provide a benchmark, or by filtering that benchmark to provide sector-specific comparisons.

Enterprise Feedback Management providers like Vovici are probably more aware than most MR software suppliers that their products will appeal to both the corporate user wishing to do their own research, and the research agency – and the platform lends itself to collaborative working between client and supplier. For example, the community portal and interfaces with corporate data sources could all be under the responsibility of the corporate client, while the creation of actual surveys and the preparation and publishing of results can be contracted out to one or more research suppliers, using the same platform.

This is a vast system with massive potential, which in its very design reveals that some research-literate minds were behind it. Users I spoke to report that the community functionality is stable, reliable and relatively easy to learn, and has enabled them to standardise and systematise their research and harmonise measures across very large enterprises. Perhaps the product’s greatest strength is its ability to integrate with CRM systems and other business intelligence sources, making research more relevant and mainstream within the corporate enterprise.

A version of this review first appeared in Research, the magazine of the Market Research Society, November 2008, Issue 509