The latest news from the meaning blog


Cognicient Link

In Brief

Cognicient Link version 1.0

Cognicient, UK
Date of review: May 2009

What it does

Software utility that consolidates numerous survey datasets into a single database for meta-analysis, aggregating survey questions and their responses across different surveys. It does this by creating flexible taxonomies that work independently of the original data structures to resolve variations at the survey level. Data may be queried directly within Link or extracted to statistical packages for analysis or modelling.

Our ratings

Ease of use: Score 3 out of 5

Openness: Score 4 out of 5

Value for money: Score 3.5 out of 5


Annual fee: £20,000 for the core system and a single data-upload user, plus £5,000 for each additional user.


  • All your survey data in one place – analyse anything by anything
  • Works with just about any format of survey data, and will import directly from Triple-S, SPSS, and SPSS legacy formats (Quancept, Surveycraft etc.)
  • Flexible in how you treat and resolve similarities and differences
  • Robust and highly scalable


  • Steep learning curve – requires some technical expertise to use
  • Some complex manual intervention required during set-up stages
  • Exports limited to SPSS or raw data files

In Depth

It is a pipe dream for many big research buyers that one day they will have a single database containing absolutely all of their surveys, so they can play with all their data as if it were just one big survey. Talk to the people at Cognicient, and that dream can become reality. Cognicient, a small UK/US consulting company that specialises in survey data management, has decided to make available the tools it developed for its own internal use to create vast data warehouses of survey data, either as a package for others to install and use themselves, or as a managed service run by Cognicient.

It’s a common problem that, as research agencies or research buyers accumulate surveys on different products, different markets and at different times, it becomes increasingly difficult to pull these together in any meaningful way in order to make comparisons. The individual dataset becomes a straitjacket. Minor differences in the formats of the questions, the answer lists or rating scales used, or even discrepancies in the underlying format of the data make it impossible to put the data together for analysis, unless you are able to grab a few figures from previously reported data that just happen to meet all your criteria for comparison.

Cognicient Link is a database application built on a standard Microsoft SQL Server database that lets you create a data warehouse of all your past surveys. In doing so, it breaks down the artificial barriers that normally exist between surveys. The importer lets you load data from each survey without needing to reformat it and, alongside that, to load in metadata on the survey too, such as how the sample was constructed, the fieldwork dates and method used, or any other relevant information. This metadata can also be used, alongside the actual survey data, for queries and comparisons, adding another dimension to the data.

At the other end of the process is a range of tools that let you query the data and extract sets of variables from the database, whether they were original survey questions or metadata added at the import stage, to provide a working set of variables for analysis or modelling in stats programs like SPSS or SAS. Link does not attempt to offer any analytical tools beyond simple counts when querying the data, and currently only outputs data as SPSS or raw data files.

At the heart of the system is what Cognicient call the “taxonomy”. This is the truly clever bit: a sophisticated master list of all of the fields you have in the database, which categorises them for comparison. When you import new data into the database, you supplement the taxonomy to point it to the specific variables in the incoming survey, and provide enough information for Link to bring in the data and, if necessary, transform it into a standardised format. The taxonomy therefore maintains an indirect link between the source data and the consolidated data in the database, so you can add more data later if you need to. You can make your taxonomies very specific – to define a particular rating used on one product in a tracker – or generic, such as a ‘value for money’ rating that might be found in any survey. The taxonomy can even accommodate variations in how a question was asked, essentially adding another dimension to that question which can be used in analysis or filtering. The concept is simple, but deciding on the best taxonomy to use is a complex process, and one the people at Cognicient prefer to be involved with directly when introducing Link to new customers.
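The taxonomy idea is easier to see in miniature. The following Python sketch is purely illustrative – Link defines its taxonomies in Excel worksheets, and every name below (the variable names, the surveys, the transform) is a hypothetical example, not Cognicient's format. It shows how one generic concept can point at differently-structured source questions and resolve them to a common scale:

```python
# Hypothetical illustration of the taxonomy concept: each entry maps a
# survey-specific variable onto a standardised one, with an optional
# transform to resolve differences in how the question was asked.

def rescale_10_to_5(value):
    """Collapse a 1-10 rating onto a 1-5 scale."""
    return (value + 1) // 2

# One generic concept ("value_for_money") pointing at different
# variables, on different scales, in two imagined source surveys.
taxonomy = {
    "value_for_money": {
        "tracker_2008": {"variable": "q14", "transform": None},          # already 1-5
        "adhoc_2009": {"variable": "vfm_rating", "transform": rescale_10_to_5},
    }
}

def consolidate(concept, survey_id, record):
    """Pull the standardised value for one concept out of one respondent record."""
    entry = taxonomy[concept][survey_id]
    value = record[entry["variable"]]
    return entry["transform"](value) if entry["transform"] else value

print(consolidate("value_for_money", "tracker_2008", {"q14": 4}))       # 4
print(consolidate("value_for_money", "adhoc_2009", {"vfm_rating": 9}))  # 5
```

The indirection is the point: the consolidated database only ever sees the standardised concept, while the taxonomy remembers where each survey keeps its own version of it.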

Taxonomies are created as Excel worksheets, to a model specified by Cognicient. You also create a Survey Information Sheet, in which you specify the survey-specific metadata and information on file names; multi-level or hierarchical data are also specified here. With the definition complete, you move on to loading the data, and here some rather ugly edits are required to the SQL Server database tables – something Cognicient are planning to automate in a future version. You are then ready to open the Link Manager tool, from where all the other operations are performed, including importing the data, resolving differences and associating the questions in the current survey with the master list in the taxonomy.

The interface is functional, rather than elegant – but this is not software you spend time looking at – it is the means to an end. The tools it contains to query the data and make up extracts are basic but easy to use and exports are performed surprisingly quickly, usually in seconds or minutes. It is a pity that you cannot see more within the database itself though, and my plea would be for Cognicient to provide some tools that allow some real-time analysis in the future. Link is a fairly expensive product, and the process of adopting it requires a considerable commitment, but what it will let you do, through pitching all of your surveys into one big melting pot, is very exciting indeed.

Customer viewpoint: Brett Matheson, Vice President of Synovate MarketQuest

Brett Matheson is Vice President of Synovate MarketQuest, the firm’s global product design and development practice, where he has been co-ordinating an initiative to build a global database of concept, product and packaging ratings for international consumer product clients. This database allows their analysts to perform a wide range of meta-analyses across the entire range of consumer studies: anything from understanding the performance of a format of product packaging across different international markets, to identifying seasonality effects on concept and product test results, to observing the relative effectiveness of different kinds of questions in obtaining consistent results.

Brett explains: “Cognicient Link fundamentally does two things: it provides the platform for combining data from a wide variety of sources and a wide variety of formats into a single database, and it provides the tools to efficiently get the data out in a meaningful way. The Link software allows us to do things that really set our database apart. First, the database is at the respondent level, not aggregate. That allows us to use much more powerful analytics. Second, instead of requiring strict adherence to common data collection protocols, Link allows us to embrace the variability of different approaches used by different clients. This makes construction of the database much more difficult, but it also gives us the flexibility that our clients demand.

“We already have hundreds of studies in the database and this is just the beginning – we add to it every day. We have many thousands of product, concept, and package tests, and we need a lot of data because when you start to drill down into individual categories or regions, you need to have data there to support your analysis.”

Analysts in Synovate MarketQuest use the query tools provided within Link to interrogate the database and extract relevant datasets to the question or hypothesis they are working with. Usually, the data are then exported in SPSS format for analysis in SPSS, or for some advanced modelling in SAS, which can also read the SPSS formatted data delivered from Link.

Brett continues: “We are very excited about what we’ve been able to do with it — it has already been an enormous benefit to us and our clients. There is a certainly a lot of client interest in it. We now have all this data in one place and we are discovering new uses for it all the time.

“Some of the research on research we have done with this is focused on how we can help our clients make more efficient use of the funds they have for research. This includes things like examining the impact of sample size and composition on research conclusions.

“There is a very steep learning curve, because what you are trying to do is complex. But in the end you have an extremely effective process that allows you to do a lot of things that you might have always wished but you couldn’t very easily because they were just too expensive and took too long to do.”

A version of this review first appeared in Research, the magazine of the Market Research Society, May 2009, Issue 516.

Key Survey reviewed

In Brief

Key Survey version 7.1

Date of review: April 2009

What it does

Online survey data collection and reporting system provided on a software-as-a-service basis by a US provider represented in the UK and elsewhere. A sophisticated mid-range web survey tool offering a lot of flexibility at a highly competitive price.

Our ratings

Ease of use: Score 4 out of 5

Compatibility with other software: Score 3.5 out of 5

Value for money: Score 5 out of 5


Annual fee for an unrestricted licence in US dollars: single user $3,950, additional users at around $3,500 (cost reduces with the number of users). Annual usage-based licence for up to 10,000 completes at $1,950 per user, and 25¢ per extra pre-paid complete.


  • Browser based and system independent – works on PC or Mac browsers
  • Simple GUI for most operations with a powerful scripting language in the background
  • Highly customisable and extensible through a series of plug-ins and a software interface (API)
  • Built-in error checking capabilities


  • No panel management
  • Analysis capabilities somewhat restrictive
  • Only exports to SPSS and Excel. No Triple-S.
  • Cannot create your own plug-ins.

In Depth

Researchers looking for a flexible web survey tool will be familiar with the conundrum that the more affordable entry-level tools lack the flexibility you need, while the grown-up ones not only cost a fortune but can be bewildering to use. Key Survey is a relatively new entrant to the UK market which focuses on the middle-ground and is aimed at researchers who want to build and deploy sophisticated surveys for themselves. Compared to other products on the market, Key Survey sits towards the high end of the middle ground. For a relatively low fixed cost, based on the number of users, you get a hosted solution with no restrictions on the number of interviews you do each year – something that is almost unheard of in the software-as-a-service market.

Key Survey not only covers all the basics in style – like all the standard question types, all kinds of survey routing and logic, text piping, logic on answer lists, question libraries, look-and-feel templates, multiple languages, survey invitations and reminders – it also leads the non-technical researcher into some very sophisticated territory. It offers analysis and online reports, but does not provide any panel management support.

It has a question library, which comes populated with hundreds of well-worded questions organised by subject, and very useful search capabilities that other tools often lack. You can add to the library, or treat any prior survey as part of the library. Using this could help you to standardise survey design and harmonise demographics across surveys.

There are over a hundred different design templates to choose from, and it is easy to take any of these as a starting point and design your own templates too. These are all based on CSS (cascading style sheets), so they are sound at a technical level, but you do not have to get your hands dirty with any actual CSS coding unless you actually want to.

One of the most versatile capabilities is the collection of plug-ins. These allow you to extend the basic functionality into untold areas of sophistication – and there are no extra costs for using them. Each essentially goes off and performs a task, then returns any resulting data back into the questions you have defined in your survey. There are plug-ins for various kinds of Flash-animated questions, such as sliders, calendars for selecting dates, or using Google Maps to choose a location. There are plug-ins to validate and clean data on the fly, reach out to an external database, or perform geo-IP checking on survey respondents, to detect those pretending to be in countries they are not. There are already over 20 plug-ins, and whenever a customer requests a new one, WorldAPP makes it available to all customers too.
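The "perform a task, return data into survey questions" pattern is worth pausing on, because it is what keeps the plug-ins loosely coupled from the survey script. The Python sketch below is a hypothetical illustration of that pattern only – none of these function names, and nothing about the toy geo-IP lookup, reflects Key Survey's actual plug-in API:

```python
# Hypothetical sketch of the plug-in pattern: a plug-in runs a task
# during the interview and merges its result back into the answer set.
# All names and data here are illustrative, not WorldAPP's API.

def lookup_country(ip):
    # Toy lookup table standing in for a real geo-IP database.
    return {"81.2.69.0": "GB", "203.0.113.5": "AU"}.get(ip, "??")

def geo_ip_check(respondent):
    """Flag respondents whose claimed country does not match the
    country resolved from their IP address."""
    ip_country = lookup_country(respondent["ip"])
    return {"ip_country": ip_country,
            "country_mismatch": ip_country != respondent["claimed_country"]}

def run_plugin(plugin, respondent, answers):
    """Run a plug-in and merge whatever it returns into the answers."""
    answers.update(plugin(respondent))
    return answers

answers = run_plugin(geo_ip_check,
                     {"ip": "203.0.113.5", "claimed_country": "GB"},
                     {"q1": "yes"})
print(answers["country_mismatch"])  # True
```

Because the plug-in only ever hands back named values, the survey script can route or filter on them exactly as it would on an ordinary question.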

Logic-related errors are a common and expensive problem as surveys become more complex. I particularly like the Key Survey approach, which has a level of logical scrutiny built in, so it will advise you of ill-formed logic, questions that may never get asked, and so on. Furthermore, wherever you make logic selections for routing, filters or text piping, a simple on-screen assistant anticipates the choices you are about to make and presents you with a context-sensitive overlay of the questions and answer options available – it is ingenious and very intuitive, and will actively help you stamp out the most common scripting errors at source.

My only concern over the design interface is that the survey questions and answers are presented as a single scrolling page (though they can be displayed in the survey on different screens), and on long and complex surveys this could become difficult to navigate – especially as the buttons and actions you need at the top of the page quickly scroll out of sight. It feels more web page than web app in this respect.

The software does offer support for other self-completion streams of data too – there are additional cost modules offered for print/mail/scan surveys on paper, and for IVR on the phone, which share the same design and reporting environment.

The analysis and reporting side is a bit of a disappointment after all the capabilities in the upstream areas. It does make it relatively easy to create simple reports showing frequencies and charts, and these can also include a wide range of statistics. For some users, this may be sufficient, but the reporting formats are inflexible and most researchers will struggle to use the tool to probe their data fully and then build reports to communicate what is salient. What it does seem to be good at is letting you create quick snapshot reports that can be updated regularly as results arrive and which you can publish and share with your clients.

Users wanting to do deeper analysis are likely to want to move the data into other cross-tab or statistical tools, and here your luck runs out if it is not SPSS that you use, as that is currently the only export route that provides you with labels and definitions ready to run. WorldAPP is working on further exports, though the next one is likely to be SAS.

A particularly nice feature of the software is the live help button – this connects you immediately to a support representative who will answer any question you have using instant messaging chat. WorldAPP provides support from its three locations in Massachusetts, London and Kiev, Ukraine, so someone is usually available to answer questions around the clock. Support is also included in the fixed annual fee.

Customer viewpoint: Monica Coetzee, Research Manager, The College of Law

The College of Law is the UK’s largest dedicated postgraduate law school, operating from six campuses across England. It’s been using Key Survey for the last nine months in its research division, which carries out a broad range of surveys including student surveys on course quality, HR surveys, a continuous customer satisfaction survey on IT services and even ad testing, all done online with samples that are often in the thousands.

Monica Coetzee, Research Manager, explains: “We were previously using a desktop survey tool, but because there was only one person trained to use this software, it was causing a bottleneck. We wanted to restructure the department anyway and I wanted to change to working online.”

“As a non-profit organisation, price was critical for us when choosing a replacement, but we also needed something with a lot of advanced features. We reviewed several tools and Key Survey came out on top.”

Monica’s team now have around 30 surveys done with Key Survey under their belts. “What I like about it is that it allows all three of us to have access to the software online from anywhere, giving us greater flexibility, for example when working in different College centres. Each person has responsibility for an entire survey project and not just for specific tasks.” This, in many cases, includes exporting report-ready data into SPSS. “It exports it directly with all the data and value labels correctly filled in”, she reports.

“We now know there is nothing that we cannot do – and we use all sorts of advanced features like show/hide options and complicated skip routing. We very often bring data we already have into the survey and use the autofill function to use it for verification or routing. And it all works very well.”

A particular requirement for Monica is the visual appearance of surveys, and she was pleased to be able to produce a set of templates matching the corporate visual identity of the College that could be used consistently across all surveys. “Although they have a good range of templates, there is the option to go into the HTML programming, so I can play around and see what happens to make my own. With very little HTML knowledge we have produced some really nice-looking surveys.”

Monica also appreciates the built-in warnings and safety features of the system. She remarks: “If you try to do something that would result in an error, it will give you a warning or prevent you from doing it. But once you have gone live with a survey, if you need to make a change, you can react so quickly with this – it will make that change instantly. It must be said, my favourite function is the live help button. They almost always answer within a few seconds, and they are always very friendly and helpful. I actually prefer a typed interface to phone calls – I find I am generally more comfortable communicating online in this way.”

Published in Research, the magazine of the Market Research Society, April 2009, Issue 515

Catglobe reviewed

In Brief

Catglobe version 5.5

Catinét a/s, Denmark
Date of review: February 2009

What it does

Web-based software-as-a-service (SaaS) product for mixed mode data collection and analysis, including CATI, CAPI, CAWI and integrated panel management.

Our ratings

Ease of use: Score 3 out of 5

Compatibility with other software: Score 4.5 out of 5

Value for money: Score 4 out of 5


Variable cost based upon usage. Start-up costs typically €3,500 for configuration and training, then €0.015 per panel member and €0.03 per interview minute, with some additional charges applicable.


  • Completely web browser-based – supports Internet Explorer or Firefox on PC or Mac
  • Simple GUI for most operations with a powerful scripting language in the background
  • Strong on panel management and sampling capabilities
  • Good range of imports and exports including Triple-S, SPSS and Excel


  • Data analysis is inflexible and limited in scope
  • GUI questionnaire editor is cumbersome to use
  • Some performance issues – complex sample queries can be slow to run

In Depth

It may seem as if we are spoilt for choice with data collection software packages, but if you are looking for a multimodal interviewing solution that is also web-based, the choice is relatively narrow – especially if web-based CATI is part of the mix. So it is good to welcome a new entrant on the scene in the guise of Catglobe, a mixed mode interviewing system offered as a SaaS (software as a service) product by the software division of the Danish fieldwork company Catinét.

The SaaS model makes it very easy to get started with little infrastructure in-house – all that is needed is a reasonable internet connection and a web browser. It is not fussy about which one: Firefox on Windows or Mac, or Internet Explorer on Windows, all work equally well.

Catglobe is a surprisingly vast system, and the fact that it has been extensively road-tested by Catinét’s in-house fieldwork team is evident in the range of capabilities and options provided. There are different modules for sampling, questionnaire authoring, fieldwork management, reporting and some report automation. The interviewing module supports CATI, laptop CAPI and CAWI, and even has a special Hall Test mode for a temporary local network of interviewing stations. All of these modules are accessed from a central home page through a pop-up menu similar to the Windows start button.

Behind all of this is a single relational database which holds all of the assets or resources relating to your surveys – questionnaires, survey responses, respondents or panellists, interviewers, reports and so on. This is one of those advanced systems that moves you away from the rigid boundaries of the survey in defining how data are organised. The concept of the survey still exists, though more as a workflow concept. The system presents the surveys you have available to work on as a folder structure, which you can model as you wish.
However, in the background, all the survey does is provide a convenient, organising view of the central data repository. Questions and response data from one survey are easily accessible from others, if you can make a connection through questions or respondents in common. This opens up endless possibilities for using your data more intelligently, both in sampling and in analysis, and it makes the logistics of running one or more panels really simple.

Panel management is an area Catglobe handles particularly well. At the sample selection stage, there is a wonderful tool for building ‘groups’, which are effectively database queries. You use a group to pull a sample from the respondent database – but this is a query tool that understands concepts such as key demographics, sample frames, frequency of previous response and interviewer resting rules.

The group builder ties in seamlessly with the ‘communications’ module that serves invitations and reminders for web surveys. These work directly from a library of templates, so it is very quick to set up an invitation from an existing project and adapt it slightly for the survey. The system is fully multilingual, so invitations can be templated in several languages then dispatched in the appropriate one for each respondent. As it is also truly multimodal, samples can be drawn for CATI or CAWI in parallel. The workflow is well designed, so it is not only quick to run through the process from end to end, but also flexible when changes are needed, or if the sample requires a boost part-way through the fieldwork.

Panel recruitment works equally well, and there is considerable scope to automate this, including ongoing top-up recruitment. Recruitment can be by web or by telephone, and a phone recruit can trigger an immediate web survey invitation for new panellists to complete their profile data. There is also an elaborate points allocation and redemption capability, if you wish to incentivise your panel.
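A sampling ‘group’ of this kind is, at bottom, a query over the respondent database that also knows about panel hygiene. The Python sketch below illustrates that idea only – the field names, the resting rule and the panel records are all hypothetical, and this is not Catglobe's query language:

```python
# Hypothetical sketch of a sampling "group": a reusable query that
# combines demographics with panel-hygiene rules such as resting time.
from datetime import date, timedelta

panel = [
    {"id": 1, "age": 34, "region": "Copenhagen", "last_invited": date(2009, 1, 5)},
    {"id": 2, "age": 41, "region": "Copenhagen", "last_invited": date(2009, 2, 20)},
    {"id": 3, "age": 29, "region": "Aarhus", "last_invited": date(2008, 12, 1)},
]

def build_group(panel, min_age, region, resting_days, today):
    """Select in-scope panellists who have 'rested' long enough
    since their last invitation."""
    cutoff = today - timedelta(days=resting_days)
    return [p for p in panel
            if p["age"] >= min_age
            and p["region"] == region
            and p["last_invited"] <= cutoff]

sample = build_group(panel, min_age=30, region="Copenhagen",
                     resting_days=30, today=date(2009, 3, 1))
print([p["id"] for p in sample])  # [1]
```

Panellist 2 matches on demographics but is excluded by the 30-day resting rule, which is exactly the kind of constraint a plain demographic filter would miss.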
Access to surveys and functionality is managed from the HR module, which allows you to define roles and allocate individuals to them. Respondents and panellists are treated in the same way: everyone from the system administrator to the panel member is registered as a user and has usage rights associated with them.

The majority of the system has a very cohesive appearance which is simple to follow – it passes the test of deceiving you into thinking you are using a desktop program, when in fact it is a browser-based web app. At the bottom of the screen are two buttons: one labelled Tools, which is the ‘start’ button that gets you to all the different modules; the other is the Folders button, which takes you to a tree view of folders containing surveys, questionnaires, templates – essentially all your data.

The questionnaire editor has a somewhat different feel, and is not as well crafted as the other modules. It provides pretty much all of what you need, but it feels clumsy to use. You can view a list of the questions, but important details, such as the answers, cannot be seen without going into the question itself. There is no overview of the logic or routing, which makes life difficult for the scripter.

There is a powerful scripting language available, and parts of the questionnaire created in the GUI can be exported into it, which makes writing it (and learning it) much easier. This is an excellent capability. Unfortunately, you are likely to need it more than you should when writing web or CATI interviews of even a medium level of complexity, such as creating a constrained-sum set of questions. Really, more options should be built into the questionnaire editor.

Also on the downside, users have reported sluggish performance with some database operations, such as drawing sample or exporting results, once the number of records is in the hundreds of thousands – though Catinét report that they have worked to improve this.
The reporting capabilities are also unlikely to meet most users’ needs at present – there are some nice features there, but Catinét have ambitious development plans for the reporting side, so this is likely to be improved over the current year.

Customer viewpoint: Ólafur Thor Gylfason, Market and Media Research, Reykjavik

MMR moved to Catglobe a year ago, in order to move to a single interviewing platform for CATI, CAPI and web interviewing, replacing a range of different packages in use to that point. As Ólafur explains: “The good thing is we have been able to use this platform for everything we do, from CATI recruiting of panels to CATI phone interviews, CAPI and CAWI.

“There is a powerful programming language within the software, so when we do complicated surveys, such as international surveys where you have to produce an exact data map afterwards, we can write the data handling programs in advance, so that when the survey is finished, we can export the data in the exact format the client requires straight away. With this programming language there is nothing you can’t do with the software, provided you have a little bit of programming experience.

“Another positive thing about the software is that we use it for open-ended coding, and this capability is very powerful – we can do this on the fly, so that the turnaround time on projects can be reduced considerably.

“We use it for CATI recruiting, and once the phone phase is completed, the automated CAWI questionnaire is sent out immediately, and everything is always interlinked, so it is very good for us.

“With panel management, there are two key points. Firstly, because it is completely multimodal, all recruitment is done by phone but recruits are immediately served a web survey to complete their profile – everything happens at the same time. Secondly, the sampling is very easy to work with. It makes sure there is the right load across the sample, that panellists get the right number of invitations, and keeps track of invitations and reminders. The ‘group builder’ is very powerful and very easy to use, and the communicator, which is the email part of the system, links in with the group builder.
“We have run into some problems, but the main thing for us has been that the support has been excellent – they are almost acting as a division within our company when it comes to support, so you forget about the bad things very quickly.”

Published in Research, the magazine of the Market Research Society, February 2009, Issue 512

Marketsight reviewed

In Brief

Marketsight version 7.3

Marketsight Inc., USA
Date of review: January 2009

What it does

Web-based research data reporting environment offered as a hosted solution and aimed at research data consumers, either to browse existing tables and charts, or to produce their own analysis. Offers capabilities for research agencies to publish results to clients through the software.

Our ratings

Ease of use: Score 4.5 out of 5

Compatibility with other software: Score 4.5 out of 5

Value for money: Score 4.5 out of 5


Professional $995, Enterprise (includes portal features) $1,495. Academic licences at a 90% discount. Charges are priced in US dollars, per user per annum, and include training and support. Reduced fees for agencies providing licences to end-users.


  • Very easy to upload your own projects as either SPSS SAV files or Triple-S data
  • Excellent support for charting both within the tool and when exporting to Excel or PowerPoint
  • Can be used simply as a means to distribute reports, or to interrogate data, or both
  • Rich set of capabilities for recoding and transforming variables


  • Though web-based, currently only works under Microsoft Windows with IE6 or IE7
  • A little prescriptive in the kinds of reports it can produce – not necessarily for the power user
  • Ignores any variable names in any data imported from Triple-S or SPSS: you have to work with the question text

In Depth

The transformation that MarketSight has gone through since we last reviewed this web-based cross-tab tool two and a half years ago is a bit like getting a visit from the son of a friend who was a teenager the last time you saw him, and is now a confident and capable adult with a university degree who wants to come and work for you.

Back then, MarketSight was a simple end-user tab tool with a few nice touches, but quite a lot of limitations too. Though it was provided as a self-drive tool, it really relied on purchasing some consulting time in the background to get surveys set up, or to carry out the kinds of transformations you were likely to need on the data. Then, the product was developed and marketed by a division of the Monitor Group, a large business consulting firm based in Cambridge, Massachusetts. This provenance showed in the kinds of features the software had, or more importantly, did not have. It was very SPSS-like in its approach to tables and lacked support for filters and even multiple response data.

Since then, Monitor Group has spun off the original MarketSight team, which now owns and develops the software independently. Development is now strictly MR-focused and the result is a much more research-centric approach to data analysis. At heart it remains an easy to use cross-tabbing tool but with a new drag-and-drop interface. You can build reports and save them for re-use later, or if someone else has set up the report for you, you can simply open the tool and review the reports.

Gone are the irritations about not being able to define or apply filters, or create tabs with multiple response data: they are all in place now. You can also drag as many variables as you like into the rows and the columns of the table.

Charting has been integrated with the tables in a very practical way. Each one-by-one combination of variables in the cross tab is presented in the output display with its own small histogram icon. Click and a window opens to display the data graphically in a way that makes any interesting variations in the data immediately obvious to any lay user. A further button lets you tailor the chart, print it or export it.

There is better-than-typical support for ranking or sorting of answers in cross-tabs, and you can rank by any column, by the base or the mean. A simple arrow icon highlights which column has been used for ranking. Charts too are easily ranked.

You can also export whole groups of tables as charts, and post them directly to PowerPoint or Excel. Within Excel, the program will helpfully provide you with a tabbed worksheet containing the chart and another containing the table – and both look extremely presentable without any tweaking, which in my experience is an accomplishment in itself. The program will not produce a completely presentation-ready PowerPoint deck, but it will get you very close to it.

Other strong areas within the product are the creation of calculated variables and categories, used to combine existing variables or categories or to break numeric values into ranges, and a powerful way for end-users to apply very similar transformations to many variables at once, such as adding a top-two box to a rating scale. It means that researchers or end-users can be very self-sufficient, and avoid the need to keep going back to their DP supplier. Whole sets of analyses can be copied from one dataset to another.
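The kind of bulk transformation described here can be pictured with a short sketch. This is not MarketSight's actual mechanism, just an illustration of what a "top-two box" recode on a rating scale does:

```python
# Illustrative sketch only (not MarketSight's API): derive a "top-two box"
# indicator from ratings on a numeric scale. Missing answers stay missing.

def top_two_box(ratings, scale_max=5):
    """Return 1 where a rating falls in the top two scale points, else 0;
    None (no answer) is passed through unchanged."""
    top = {scale_max, scale_max - 1}  # e.g. 4 and 5 on a 5-point scale
    return [None if r is None else int(r in top) for r in ratings]

responses = [5, 3, 4, None, 2, 5]
print(top_two_box(responses))  # -> [1, 0, 1, None, 0, 1]
```

Applying the same derivation across a whole battery of rating questions at once, rather than one variable at a time, is what saves the end-user a trip back to the DP department.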

A big breakthrough is the importing: anyone with a SAV file or a Triple-S data and metadata file can upload their own data and variables – which means you can load in data from a very wide range of survey data collection tools. My only grumble is that, while it imports all the text, it does not import the variable names, which can make identifying questions difficult in many surveys.
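A Triple-S metadata file does in fact carry both a variable name and its question text, which is what makes the dropped names frustrating. A minimal sketch, using Python's standard library and a simplified, invented fragment (element names follow the Triple-S standard), shows the two side by side:

```python
# Hedged sketch: reading variable names and question text from a
# simplified Triple-S XML metadata fragment. The fragment is invented
# for illustration; real files carry more detail (positions, values).
import xml.etree.ElementTree as ET

TRIPLE_S = """<sss version="1.2">
  <survey>
    <record ident="A">
      <variable ident="1" type="single">
        <name>Q1</name>
        <label>How satisfied are you overall?</label>
      </variable>
      <variable ident="2" type="quantity">
        <name>Q2</name>
        <label>How many times did you visit last month?</label>
      </variable>
    </record>
  </survey>
</sss>"""

root = ET.fromstring(TRIPLE_S)
# Map each variable name to its question text (label)
variables = {v.findtext("name"): v.findtext("label")
             for v in root.iter("variable")}
print(variables)  # both name and label are present in the metadata
```

An importer that kept the `<name>` alongside the `<label>` would let users locate Q1 or Q2 directly rather than hunting through question wording.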

MarketSight is still a bit prescriptive in what it will allow you to present in a table, which could frustrate the power-user. It also lacks the means to examine cases individually, to check outliers or view verbatims. It does not handle duplicated datasets or files and reports saved from multiple users as well as it should – unaided, your report libraries could descend into chaos. Plus, it currently only works in a Microsoft Windows environment, under IE6 or IE7, which is not everyone’s browser of choice – though this is planned to change.

If you pay a bit more and get the enterprise edition, you also get a portal environment in which you can upload other files relating to a project, and use it to start building your own research library. The system also contains a full permission control system, so that different users can be given different access rights to surveys and can have functionality turned on or off. It therefore makes the program an attractive proposition for research agencies wishing to provide a data portal to their clients.

MarketSight’s developers deserve praise for providing users with a wealth of online help, tips, tutorials and advice all through the product. It makes this web-based tool feel like a cross between a program and a website: and what could be more appropriate for a product focused on providing information?

Customer Viewpoint: Renée Zakoor at KB Home, Los Angeles, USA

Renée Zakoor is the Director of Market Research at KB Home, a new home building company that operates in 15 states in the USA.

MarketSight is used across the business to distribute market research information. Renée explains: “We do a specific survey in each of our nine divisions and that data becomes the basis for major decisions each division has to take about what to build, where to build it and so on. We upload each survey onto MarketSight. My team works with it to do analysis, but it is also put there for the people in the divisions to make use of.

“What I love about MarketSight is that non-market researchers can easily go in and answer their own questions. Then the ability to export it into Excel so they can work with it that way, and do graphics to PowerPoint, is just great as well. We tend to give staff members an hour’s worth of training and usually they can run with it. I also have senior managers who find they can go in and answer their own questions. It is very user-friendly, which I think is critical.

“We have now started to work with using MarketSight as a repository for all kinds of files we want everyone out in the divisions to have access to. Previously we were using an intranet, which meant using another internal resource. Using MarketSight, this is easy for me to do for myself. You do not need a lot of sophisticated computer skills to be able to upload files to it.”

Another improvement that Renée welcomes is the ability to replicate sets of analyses for different regions or users, where the project is essentially the same, but different users will each work with their own dataset. “We can set up analysis for one market and it is then easy for us to copy it over to all the other markets without having to recreate it – so there are a lot of efficiencies for us in that.”

Asked about any anxieties Renée might have about making research data so widely available for non-researchers to run their own analyses, she is unequivocal: “Over the years, I have become less concerned [about this]. I feel the more transparency there is in the data and the more people you get using data, the better. The first step is trying to get people to use research to make decisions and this is a tool that will help them do that. I find it frees up a lot of researchers’ time to be more consultants to the non-researchers. If people have bought into the methodology, it can prevent a lot of misinterpretation. Ultimately, the research is just another tool, and it is down to the researcher to be the partner that will help business people make the most of those tools: MarketSight just helps make those tools more accessible.”

Published in Research, the magazine of the Market Research Society, January 2009, Issue 511