The latest news from the meaning blog


Revelation reviewed

In Brief

What it does

Online qualitative research environment for asynchronous or bulletin-board style depth interviewing, discussions and auto-ethnography, allowing research to take place over several days or even weeks. No special software or plug-in is required to participate.

Supplier

Revelation Inc., Portland, OR

Our ratings

Ease of use: 5 out of 5

Compatibility with other software: 4 out of 5

Value for money: 4 out of 5

Cost

Single project licenses start at $1,500 US.  Discounts available for larger volumes, annual licenses and longer projects (3 months+).  Helping Hands project support and translations are costed on a per-project basis.

Pros

  • Daily activities can be set up in advance and launched automatically
  • Participants can blog, view stimulus material and upload content using any browser
  • Integrated translation service for discussion guides and transcripts
  • Full transcripts easily exported at any time

Cons

  • Discussion groups lack versatility
  • Can be difficult to analyse qual-quant type pre-coded questions
  • Can be overwhelmed by emails from a busy board
  • Currently cannot personalise the welcome email text

In Depth

Online qualitative research does not need to be a pale imitation of conventional face-to-face groups and depths, and it is better not to try. Revelation is a piece of web-based software that provides a rich environment for qualitative researchers to design Internet-age research projects that play to the strengths of the medium. Respondents tend to welcome the convenience of being able to participate whenever they choose from the comfort of their own home or office. Researchers and clients may enjoy the same, but more importantly, by moving beyond the temporal and spatial constraints of the single-point-in-time group, they may find they get richer and more considered insights.

A Revelation group can take place over several days, or even weeks, with new questions or exercises being presented on a daily basis. Participants can be encouraged to contribute much more in the way of content, taking the focus group into the realms of auto-ethnography and co-creation. Revelation allows you to decide, for each question you ask, whether the responses are to be visible to others – before they respond (“influenced”), only after they respond (“uninfluenced”) or even withheld from all but the moderator and any client observers (“private”).
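
To make the three visibility modes concrete, here is a minimal sketch in Python of how per-question response filtering might work. The data model and names are hypothetical illustrations of the idea, not Revelation’s actual API.

```python
from dataclasses import dataclass
from enum import Enum

class Visibility(Enum):
    INFLUENCED = "influenced"      # others' answers visible before you respond
    UNINFLUENCED = "uninfluenced"  # others' answers visible only after you respond
    PRIVATE = "private"            # visible only to the moderator and observers

@dataclass(frozen=True)
class Participant:
    name: str
    is_moderator: bool = False
    is_observer: bool = False

def visible_responses(visibility, viewer, responses):
    """Filter one question's responses for a given viewer.

    `responses` is a list of (Participant, text) tuples.
    Hypothetical data model, for illustration only.
    """
    if viewer.is_moderator or viewer.is_observer:
        return responses  # moderators and client observers see everything
    own = [(p, t) for p, t in responses if p == viewer]
    if visibility is Visibility.PRIVATE:
        return own  # each participant sees only their own answer
    if visibility is Visibility.UNINFLUENCED and not own:
        return []   # withheld until this participant has answered
    return responses
```

So, for an “uninfluenced” question, a participant who has not yet answered sees nothing; once they respond, the full set of answers opens up – exactly the behaviour the modes above describe.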

As a researcher, you log into your account on the Revelation server and create a series of activities for your respondents to complete. You can cue exercises to start on different days, and if you are planning some kind of diary activity, an exercise can be made to repeat. Each exercise is built very simply from a toolbox of components.

There’s everything you would expect, from open questions and closed, pre-coded questions to places for descriptions, welcome texts and explanations, cues to present any multimedia stimulus material you wish to display, and questions that request the upload of a photo, document or even a video. You could simply present a series of open questions each day, or you could lead your participants through creating a daily blog illustrated with photos they have taken of their actual experiences. Either way, you can also probe away to your heart’s content, and even change the direction of the research part way through.

The software also lets you manage your participants, send out initial invites and get them to fill out a short profile survey, which you can customise. You can also import participant lists from Excel. Projects are divided into segments, which you can use in multiple ways: to divide your participants into smaller subgroups, assign them to different moderators or to assign different tasks to different subgroups.

The respondent interface presents some very simple tabbed areas to view – things to do, things already done and direct messages to or from the moderator. This and all the other interfaces have a very pleasing “Facebook era” design which makes them pretty much self-explanatory.

As a moderator, it is very easy to track participation, view all the new content, add probes and send email reminders or messages to participants who don’t seem to be logging in. You too will get emails whenever anyone completes one of your tasks – if you are running several large groups, there can be a tidal wave of emails coming your way.

Revelation also allows you to conduct discussion groups online. Here, I found the software to be a little less flexible. It forces you to divide each discussion topic into a series of different tasks. You can follow up any point with a probe to that individual, but the tool currently lacks the ability to ask follow-up questions of the group as a whole, or simply to open a probe out to everyone without setting up an entirely new discussion task.

In the current version, there are some other minor niggles, such as not being able to personalise the welcome email, and it not being very easy to output, analyse or present the answers to closed questions. However, in the piece of research I used this for (a group of six IT professionals), I was astonished by the quality and clarity of the responses I got. Going online clearly cuts out the waffle, as respondents draft their responses carefully and consider what they are saying. The result is data that is relatively easy to analyse, with very little padding to cut away.

It would be wrong to consider this method a replacement for all groups or depths, but it does provide a credible alternative, and this software certainly encourages creativity in the actual research design.

Client perspective: Claire Dally, GfK Automotive, London

Claire Dally is a Research Manager at GfK Automotive in London, and has recently completed a multi-country study of over 190 people across 15 participant groups using Revelation.

She describes her experiences: “Revelation is very intuitive, easy to use and has a visually appealing interface.  We took advantage of the Helping Hands package, where we were given a dedicated member of the Revelation team to guide us through this multi-market project and set up some of the scripts. They were extremely supportive throughout the whole process.  We also used their translation service and found the quality of the translation was excellent.

“A transcript created during a Revelation session will often be much longer than that of a focus group, because respondents generally have more time to consider their answers and to write down their opinions in detail.  With online qual I find you need to recruit double the number of respondents you actually need.  There are a number of reasons why you lose people: sometimes they are not available during the whole fieldwork session; others may lose interest along the way.

“You do lack some of the rapport you gain face-to-face.  Participants log on at different times, so you don’t necessarily get the chance to probe on things straight away, and therefore you can lose momentum.  However, there are ways of building rapport; you need to invest some time in the start of the fieldwork warming up respondents, and making sure you keep on top of each person’s response, so that they feel someone is reading their comments.  Participants can also upload photographs or videos to illustrate their ideas.

“Clients find the software easy to use too.  They are able to log on as observers, watch comments being made in real time and suggest probes via the moderator.   This means that clients feel really involved in the moderation process and that all of their questions are being addressed.  However, Revelation is not suitable for all types of projects.  If your client wants participants to view confidential stimuli, you have no guarantee that any material tested is not screen grabbed, copied down or viewed by others.

“Participants find the experience very positive too; they are able to log on at a time suitable for them, and many have commented on how much they have enjoyed taking part. You can be slightly less structured in your approach, allowing the topic guide to evolve over the fieldwork session, rather than relying on a pre-determined list of questions. New questions can be loaded up on a daily basis if required.

“It is a cost-effective way of running online qual.  You can bring together participants from different locations without needing them to be in one place at the same time.  For less than the price of a UK focus group of eight respondents you could run a Revelation session of around 20 participants.  Our clients are becoming increasingly interested in online qual and we believe that this interest will only become stronger in the future.  We’ve been very happy with this software and what you can get out of it.”

A version of this review first appeared in Research, the magazine of the Market Research Society, April 2010, Issue 527



Marsc.net 1.1 reviewed

In Brief

What it does

Web-based panel and sample management tool, based on a subset of the most useful features in the desktop/server version of the MARSC sampling software.

Supplier

MARSC Ltd

Our ratings

Ease of use: 4.0 out of 5

Compatibility with other software: 4.5 out of 5

Value for money: 4.5 out of 5

Cost

Prices start from around £175 per month for a small-scale operation, plus hosting fees if required; £250-£300 per month for a mid-scale operator with multiple panels. Price determined by volumes.

Pros

  • Allows precise targeting of samples with extremely accurate incidence calculations and estimates
  • No limit on size, demographics or history kept
  • Can reuse sample jobs to draw fresh samples, create ‘favourites’  and define default sample templates
  • Provides resources for creating multiple panel members’ sites

Cons

  • Does not yet support non-interlocked (margin-defined) quotas
  • Some reports rather cryptic and confusing
  • Windows and IE only for admin interface
  • Panellist portal module needs programming skills to configure

In Depth

MARSC has always been the heavyweight among the sampling and panel platforms, but now a new software-as-a-service version of the tool has emerged with the aim of lowering the barrier to entry. It makes it easier to get up and running and offers something of a break on the price too, which should appeal to the smaller-scale operator. Just how useful this is will depend on how sophisticated your requirements are, but if you need something with all the bells and whistles, the desktop and server-based MARSC is still available and being developed.

These days, most users tend to look on MARSC as a panel management tool, but it wasn’t always that way: MARSC started out as a sophisticated sampling tool to allow corporate research clients to draw balanced, quota-controlled samples directly from their own CRM databases. It was a program in the right place at the right time when researchers realised that the most efficient way to do research online was to have a panel of pre-screened, actively managed respondents with a known pedigree, both in terms of demographics and past participation.

MARSC maintains its own database of contacts, so compiling and drawing on all of this history is second nature to it. It is not an interviewing system – to use MARSC you will also need a survey data collection platform, though it is agnostic about which one: it supports SPSS Data Collection and Confirmit directly, and many others via Triple-S.

The new .net version rationalises the process of sample selection by setting out all the options across six tabbed screens. You start by stating who you do want – referring to any of the profile variables in your panel, overall targets and incidence estimates. In the filters tab, you set exclusions, which can also be on demographics or past participation, such as people who have already received a survey invitation to another study in the last two days, or who have been interviewed on a previous wave of the same study in the last year.

In the third tab, you choose the variables you want to pass across to the interviewing system, and in the fourth you define the quota targets you are aiming for. While you may also need to reinforce these with quotas applied in the interviewing module, a great advantage of MARSC is that, over time, it builds up a very detailed response history, and it will use this, for each sample segment you are selecting, to predict just how much sample you need to meet each quota target without over-inviting respondents or wasting sample.
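
The arithmetic behind that prediction can be sketched in a few lines of Python. This is a deliberately simplified illustration: the exclusion thresholds are the ones mentioned above, but the field names and the flat response-rate model are my own assumptions, not MARSC’s internals.

```python
import math
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class Panellist:
    last_invited: Optional[date]  # last invitation to any study
    last_wave: Optional[date]     # last interview on this study

def eligible(p: Panellist, today: date) -> bool:
    """Apply exclusion filters of the kind described above (thresholds illustrative)."""
    if p.last_invited and today - p.last_invited <= timedelta(days=2):
        return False  # invited to another study in the last two days
    if p.last_wave and today - p.last_wave <= timedelta(days=365):
        return False  # interviewed on a previous wave in the last year
    return True

def invites_needed(target_completes: int, response_rate: float, incidence: float) -> int:
    """Predict invitations per segment from historical response data."""
    return math.ceil(target_completes / (response_rate * incidence))

# 50 completes at a 30% historical response rate and 80% incidence:
# invites_needed(50, 0.30, 0.80) -> 209 invitations
```

In practice, a panel system refines the response rate and incidence per segment from its accumulated history – which is precisely the intelligence that makes accurate estimates possible.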

The fifth screen handles notifications – who the reports of the sample jobs get emailed to – and the last one, ‘properties’ (misnamed, in my view), holds not job options but the metadata for the job: the type of project, its name, who the client and the exec are, and where you specify the reward points for participation.

All of this set-up is saved as a job, which you can give a name to and save in a folder structure that is always visible on the left of the screen. Once a job is saved it can be queued for execution, when it will draw a sample and mark them in the database as having been selected for a survey. You can also do a trial run, when it will simply report on what it would draw – a useful prior step in ensuring you do have enough sample to run with.

Saved jobs can also be stored as “favourites”– a nice web-like touch. Indeed, generally, the program has ported well to the web environment. However, the report displays could be improved as they present a mass of data and tend to use rather cryptic two-letter codes as column headers taken straight from the desktop version, whereas so much more is possible using dynamic HTML on the web.

The respondent portal is a vital part of the panel management tool, and MARSC provides a versatile portal module and set of tools for configuring it. The module is common to the desktop and .net versions. In this, participants can update their profile, review what surveys are available for them to take, review and redeem incentive points and so on. The tool is designed for those with developer skills, however – sadly, there is no simple point-and-click interface for creating or customising panellist portals, which the SaaS user is likely to expect and which other systems now provide. Those without an in-house web programmer are likely to need to buy some consulting services from MARSC to get set up.

MARSC say it is their intention in time to move all of the desktop functionality over to the .net interface – at the moment, it lacks a handful of the more advanced features, the most serious being support for non-interlocked quotas, where an iterative model is used as a direct counterpart to rim weighting on tables. MARSC.net won’t appeal to everyone yet, but it does go a long way to democratising efficient panel management by making it available to smaller operators without the expense of dedicated servers and teams of specialist programmers.

Client perspective: Robert Favini, Research Results, Massachusetts, USA

Research Results, a research company in Massachusetts, uses desktop MARSC to host a number of custom panels for clients in a number of consumer sectors including the entertainment industry. Robert Favini, VP Client Services, discusses some of the changes that MARSC has enabled.

“Originally, we had our own internal systems with screener surveys attached, which we used in a rudimentary way to pull sample, but it was a bit of a patchwork of things and wasn’t very sophisticated. About three years ago we started to look at what other people were doing, and we came across MARSC. Shortly before that, we had also decided to use SPSS as our main survey authoring environment, and as the two tools fit together really seamlessly, that was a big draw for us. As a result, our level of sophistication in what we can offer to our customers for managed panels has jumped up a lot.

“Our clients need something very robust because often they are doing quite a lot of analysis on the data that we provide. With our home-grown tools, the problem we were having was compatibility. This has a nice agnostic format that talks to everything.

The full implementation took place over a two-month period, including converting all of the data, though setup and training required only a few days.

“It was relatively easy to get it up and running: we had a couple of training sessions and someone from MARSC came over to work with our in-house developers. They are in the industry so they are aware of what we are trying to do and use the same terminology as us.

“About the time MARSC came along, I think the industry started using sample in a different way. Gone were the days when people would happily take surveys – we were having to use sample much more carefully. What we were looking for with MARSC was something that would use sample wisely, and let us treat it as a precious resource.”

“What we found was that as we used it, we appreciated it more and more. We like its ability to gain intelligence within the panel. We find we can target sample really precisely, and the incidence calculations it provides – the ‘guesstimates’ of how much sample to broadcast – have been phenomenally accurate. The bottom line is we are not over-using sample: we are basically being very efficient, which is where we were looking to be.”

Robert welcomes MARSC’s strategy to migrate the product to the web, although the lack of non-interlocked quotas, along with some other advanced features, makes it unviable for his firm yet. “We’d still be interested in using a web-based product because of the portability it brings. Sometimes we have staff scattered all over the place and at the moment we have to use VPN to give them remote access. It would be useful for client users, but we too would like that bit of greater flexibility, to be able to work out of the office.”

A version of this review first appeared in Research, the magazine of the Market Research Society, March 2010, Issue 526.

Rosetta Studio 3.3 reviewed

In Brief

What it does

Report automation platform which takes tabular output from almost any MR cross-tab program and transforms it into well-crafted PowerPoint presentations. Works with existing slide decks or will generate new ones selectively, directly from tables.

Supplier

ATP Canada

Our ratings

Ease of use: 4 out of 5

Compatibility with other software: 5 out of 5

Value for money: 4 out of 5

Cost

Annual subscriptions from £5,000 for a single user, £5,850 for 2 named users, £10,500 for 10. Concurrent pricing options also available. Prices include training, support and updates.

Pros

  • Saves hours when converting tables into presentations
  • Greatly reduces the scope for errors when creating presentations
  • Shared templates reduce work and allow for a custom look for each client
  • Presentations can be updated with new data even after they have been modified in PowerPoint

Cons

  • No full preview without exporting to PowerPoint
  • No undo when editing or making changes
  • Windows only

In Depth

It’s a program that few profess to love, but PowerPoint shows little sign of loosening its iron grip on the boardroom presentation yet. Researchers often feel they are slaves to the god PowerPoint twice over: not just when presenting, but when preparing too, due to the sheer monotonous drudgery of creating each slide by hand – slowly patting figures and graphs into shape, often against the pressure of the clock.

Rosetta Studio automates this process, and does it in a very efficient and comprehensive way. As a tool, it’s been around for over five years now. We reviewed version 1 back in 2005, when it was simple and straightforward to use, but fell short of doing all you needed it to. There wasn’t enough control over style and layout to create client-ready presentations, which inevitably meant a lot of finessing of the final output within PowerPoint to get the look right, and left you vulnerable to repeating all that work if you needed to re-run the data.

Improvements since then, culminating in version 3.3, have removed all these limitations. The range of capabilities has, quite simply, exploded. Pretty much any tabular input is now supported, either through built-in importers or by using an upstream mapping utility. Within the tool, there is now fine control over every aspect of output, and a lot of attention has gone into providing ways to prevent anyone from ever having to do anything more than once.

As an example, colouring, shading and chart options are all created and controlled within Rosetta and are not limited to what Excel or PowerPoint can produce. Colours can be linked with products or even applied automatically from the labels in your data for brand names, company names, countries and so on. It eliminates any fight you may have with PowerPoint showing different colours from the ones you had hoped for, because of the clumsy way that PowerPoint templates and document themes interact. Instead, all of this is controlled safely and predictably from within Rosetta, yet it is still standard PowerPoint that comes out of it.

A very powerful feature of the tool is the template. Templates take a little time to master, but they have the advantage that, once defined, they can be used across the whole organisation and shared easily among teams. Using templates, it takes just seconds to build charts from tables. Templates not only apply the styling, but work out what to pick out of the raw table – e.g. just the percentages or the frequencies, and not the total columns and rows.
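
To illustrate the kind of extraction a template automates, here is a small Python sketch that pulls chart-ready percentages out of a raw crosstab while dropping the total row and column. The table layout is a hypothetical stand-in; Rosetta’s internal representation is not public.

```python
def chart_series(table, measure="pct"):
    """Pull chart-ready values out of a raw crosstab.

    `table` maps row labels to {column label: {"freq": n, "pct": p}};
    a hypothetical layout. Drops the 'Total' row and column and keeps
    only the requested measure, as a template would.
    """
    return {row: {col: cells[measure]
                  for col, cells in cols.items() if col != "Total"}
            for row, cols in table.items() if row != "Total"}

raw = {
    "Aware": {"Males": {"freq": 48, "pct": 48.0},
              "Total": {"freq": 101, "pct": 50.5}},
    "Total": {"Males": {"freq": 100, "pct": 100.0},
              "Total": {"freq": 200, "pct": 100.0}},
}
print(chart_series(raw))  # -> {'Aware': {'Males': 48.0}}
```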

Not everyone needs to become a template guru. It is not hard to modify or adapt them – but if you want to ensure a consistent look, and to control this, they can also be password-protected against unwanted or unintentional changes.

There are now three modes of operation: generate, populate and update. In version 1 only generate was possible – this limited Rosetta to the “push” method of production, where you effectively created then exported a PowerPoint from scratch. Generate is ideal for ad hoc work, but not much help for any kind of continuous or large scale production.

Populate mode introduces the alternative “pull” method, where you take an existing PowerPoint and link it to tables within Rosetta Studio by going through the PowerPoint document, stripping out the variable content and replacing it with tags that pull in the relevant data from Rosetta. Tags can pull rows, columns, individual cells, tables or sections of tables, and are to some extent smart – e.g. you can pull the first row, the second-from-last cell, or the column headed ‘females’. ATP’s implementation is delightfully simple, though it does take some effort to get your mind around the process. But it is ideal for large-scale report production on trackers, where many similar reports are produced on different cuts of the data, and the suite provides some batching and automation modules for power users that go beyond the desktop interface.
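
The addressing idea behind these tags can be sketched generically. The tag format below (‘row:1’, ‘col:females’, ‘cell:2,1’) is invented for illustration and mimics the kinds of references described above; it is not Rosetta’s actual syntax.

```python
def resolve_tag(tag, table):
    """Resolve an illustrative pull-tag against a crosstab.

    `table` holds 'cols' (column headers) and 'cells' (row-major lists).
    Tags such as 'row:1', 'row:-1', 'col:females' or 'cell:2,1' mimic
    the kinds of references described above; the syntax is invented.
    """
    kind, _, ref = tag.partition(":")
    if kind == "row":
        i = int(ref)
        return table["cells"][i - 1 if i > 0 else i]  # 1-based, or negative from the end
    if kind == "col":
        j = table["cols"].index(ref)  # look a column up by its header
        return [row[j] for row in table["cells"]]
    if kind == "cell":
        i, j = (int(x) for x in ref.split(","))
        return table["cells"][i - 1][j - 1]
    raise ValueError(f"unknown tag: {tag}")

table = {"cols": ["males", "females"],
         "cells": [[48, 61], [52, 39]]}
print(resolve_tag("col:females", table))  # -> [61, 39]
```

Addressing a column by its header rather than its position is one reason a tagged deck can survive re-runs in which the table layout shifts.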

Even more ingenious is the new “update” mode, which achieves a kind of magic with PowerPoint. There is nothing to stop you from going in and making extensive changes to your presentation and still being able to update the figures afterwards – for example, because you had to remove a case and re-weight the data. Rosetta invisibly watermarks each chart it produces and uses these hidden tags to identify each value correctly and substitute the updated one. It’s very clever.

All this increased sophistication does come at a cost, however: the price has nudged upwards, and so too has the investment you need to make in learning it. This is not something you can pick up for the first time and guess your way through. ATP Canada encourages all new users to take their two-day training course before getting started, by including it ‘free’ within the licence fee. The program is reasonably intuitive once you have got to grips with the fundamentals, though it’s a pity that you cannot get a better preview of what you are doing within the Rosetta interface – you only see it properly when you generate the PowerPoint. If you find yourself going down the wrong track, there is no real undo capability to rescue you either.

Producing presentations is complex and, without something like this, very time-consuming. Speaking to actual users, it’s clear that they not only find learning it an investment worth making, but soon wonder how they ever managed without it.

Customer viewpoint: Leger Marketing, Canada

Leger Marketing is a full-service research company with nine offices across Canada and the United States. Here Christian Bourque, VP Research, and Patrick Ryan, a Research Analyst at Leger Marketing, speak of their experiences with Rosetta Studio.

CB: “At the time we were looking for something, we felt most automation software was aimed at large trackers. About eighty per cent of our work is ad hoc. We needed something where the up-front time would be quite small. Rosetta Studio seemed to be better designed for ad-hoc research, certainly at that time.”

PR: “Now, nearly every one of our quantitative projects runs through Rosetta at some stage. We’d even use it for a four-question omnibus study, where it is still faster than creating a report slide by slide. It means the bulk of our time is no longer taken up with the work of creating a report. The focus is now on analysing and interpreting the data.”

CB: “Once you have spent a little bit of time devising your own templates, you will save 60 to 75 per cent of your time analysing the data.”

PR: “Something that would take four days or a week to put together is now taking us one or two days.”

CB: “It not only saves the analyst time, but you also need to consider the quality review perspective. We used to do a line-by-line review. Now, because it is automated, this is no longer necessary. It’s a load off our shoulders. It means we can spend more time improving the quality of the insights. We also find we can include more complex cuts of data in the report that we would not have had time to do beforehand, like that little bit of extra multivariate analysis.”

PR: “Something we like a lot is the flexibility it gives you to try different things. You might be creating a set of graphs and you realise it could be better presented another way.  Now the hassle of changing your graphs or charts isn’t such a big deal. It takes you two seconds.

“It takes two days to learn, though the basics can be covered in a morning. It is fairly intuitive. We have a couple of reports where the analysts use the tagging. The interface is the same but the logic is different. You have to get your mind around how to use and place tags, but once you have done one it is fine. It’s actually very simple.”

CB: “We like the flexibility it provides from a look-and-feel point of view. We can have different templates for different companies. Many of our clients have a corporate library of anything they generate, so when it circulates on the client side, it needs to look as if it’s their document.

“This is something we introduced to add value, not to reduce staffing. It’s the nature of our business that you constantly have to be faster than the year before. The demand on time is extreme. This is one of the ways we’ve been able to meet that challenge, while improving quality. And the other major demand is for better insights, and this is one of the tools that allows us to do that.”

A version of this review first appeared in Research, the magazine of the Market Research Society, February 2010, Issue 524

Infotools Viewers and Consoles reviewed

In Brief

What it does

A collection of graphical data delivery tools, each based around a single interactive and highly responsive chart, for integrating streams of research and non-research data. May be used as an alternative to dashboard reporting.

Supplier

Infotools

Our ratings

Ease of use: 5 out of 5

Compatibility with other software: 4.5 out of 5

Value for money: 3 out of 5

Cost

Priced per project. Viewers: from £1,000 for a one-off project, £2,000 annually for trackers; unlimited users, includes the Espri data analysis tool. Consoles: from £20,000.

Pros

  • Integrates diverse data streams into a single interactive view
  • Compatible with any web browser
  • Compelling style of presentation makes complex data highly accessible

Cons

  • Architecture and interface place limits on the number of variables or dimensions that can be included
  • No tool to build databases: available only in conjunction with Infotools’ data import services
  • Expensive for small-scale uses

In Depth

Simplicity is the hallmark of good design, whether it is for a mobile phone, a website or a piece of software. It is something we are surprisingly haphazard at achieving in market research. Compared to other kinds of corporate data, survey data tends to be complex, and the analysis and reports that come out at the other end all too often reinforce that complexity rather than reduce it by stripping the data back to the essentials. For New Zealand-based MR software provider Infotools, the aim of a new range of data reporting tools, which they call simply Viewers, is to pare back the clutter and bring radical simplicity to data analysis.

In this initial release, there are four different Viewers to choose from: InfoPlot, InfoSwitch, InfoTrend and InfoWorld. Each follows the same principles and uses a single chart, accompanied by a small amount of tabular data, to present a related series of measures. These would typically be brands or product categories. Each Viewer is aimed at presenting particular kinds of data. InfoSwitch, for example, presents brand-switching behaviour. They are all minimalist Web 2.0 products – users are hardly aware they are using software. Instead, they interact with a single webpage containing a chart, sometimes a few figures too, and some lists of hyperlinks to other items.

The clean design approach means there are no legends with the charts. Each bar, line or pie slice always has its own label, and the software ensures that the labels are always visible, even on quite crowded charts. For instance, in the InfoTrend viewer, mouse over a trend line and it will be highlighted while the others shade out. Click on the trend line, and it will split at that point, and the chart will be redrawn to show trends before and after it.

All across the range of tools, the charts respond intuitively to pointing, clicking and dragging, in ways that avoid the need for any kind of explicit controls. There are no lists of properties, no menus of options, no clutter: all the control is ceded to the actual data being viewed. Other items are listed on screen to allow you to vary the view. These are organised into measures and demographics. Measures could be from the same survey – e.g. brand awareness, affinity or association – but, very powerfully, they could also be from other sources: marketing spend, retail performance and so on, all scaled to the same axis. You just pick the ones you want to see together.
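
Plotting marketing spend and awareness scores against the same axis implies some normalisation. The review does not say how Infotools does this; min-max scaling, sketched below in Python, is just one plausible approach.

```python
def scale_to_unit(series):
    """Min-max scale a series to 0..1 so unlike measures can share one axis.

    Illustrative assumption only: the measures are 'scaled to the same
    axis', but the actual method used by the product is not documented here.
    """
    lo, hi = min(series), max(series)
    if hi == lo:
        return [0.5] * len(series)  # a flat series sits mid-axis
    return [(x - lo) / (hi - lo) for x in series]

awareness = [32, 35, 41, 40, 44]            # per cent, per tracker wave
spend = [120e3, 90e3, 260e3, 180e3, 150e3]  # marketing spend, per wave
overlay = {name: scale_to_unit(s)
           for name, s in [("awareness", awareness), ("spend", spend)]}
```
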
Other Viewers offer different views. InfoWorld presents data plotted over a world map, and maps covering other geographic regions are also feasible. InfoPlot brings to life two-dimensional plots or perceptual maps, with the same highly intuitive approach to encourage experimentation.

Brand managers could get very interested in InfoSwitch – if data are available on brand loyalty, previous brands or alternative brands used. This shows each brand as a sphere orbiting other brands, with the strength of the links between them, and all the Viewer possibilities of moving items around, filtering them and focusing on subgroups, to really understand the interplay of brands as consumers divide or shift their loyalties.
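
The data behind such a view is essentially a brand-switching (transition) matrix. Here is a minimal sketch of building one from previous/current brand pairs – a generic construction for illustration, not Infotools’ implementation:

```python
from collections import Counter

def switching_matrix(pairs):
    """Count (previous brand, current brand) transitions.

    `pairs` is one (previous, current) tuple per respondent; the counts
    give the link strengths an InfoSwitch-style view would draw.
    """
    return Counter(pairs)

pairs = [("Brand A", "Brand A"), ("Brand A", "Brand B"),
         ("Brand B", "Brand A"), ("Brand B", "Brand B"),
         ("Brand A", "Brand A")]
links = switching_matrix(pairs)
print(links[("Brand A", "Brand B")])  # -> 1 respondent switched from A to B
```
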
The Viewers offer great flexibility in deployment. You could build them into an intranet, publish them on a website, or email links to recipients so they can download a package to run locally on their laptop without a permanent internet connection. The technology is built on Microsoft’s excellent Silverlight – something we are going to be seeing much more of in software applications for MR. It will run on any browser and any platform.

Infotools also offer what they call Consoles, which are bespoke data delivery portals based on one or more customised Viewers – they are deliberately avoiding calling these dashboards, though the result is definitely dashboard-like in its approach.

Is there a catch? Well, there are potentially two. For many, a deterrent will be that neither Viewers nor Consoles are offered as standalone software – they are tied into Infotools’ service offering. Only Infotools can build the underlying database and populate your Viewer with your data. However, the database has the same structure as that of its full-blown Espri cross-tab and analysis tool, and they will throw in a copy of this for free if you are using a Viewer. This has the advantage that the numbers in published data and any subsequent ad hoc requests are likely to agree, as they come from the same data source.

The other catch is the constraint imposed by the interface on the number of variables or dimensions you can include. There is space for around 70 items, but as this includes all measures and all the categories of any demographics, you could struggle to accommodate all you need from a sizeable tracker. However, even here, this discipline could be an advantage, as it forces you to focus on the essential.  After all, some do say that less is more.

Customer viewpoint: Othman El Ouazzani, The Coca-Cola Export Corp., Casablanca, Morocco

The Coca-Cola Export Corporation, North & West Africa Business Unit has implemented an Infotools Console based around the new Viewer technology. Knowledge & Insights Manager Othman El Ouazzani, based in Casablanca, Morocco, worked closely with Infotools during the implementation, and introduced the new console to around 25 regional marketing managers and national sales managers.

The console brings together data from a wide range of sources, merging sales data, retail audit data and consumer tracking data from different fieldwork suppliers, media data and even data on weather and temperature.

“All these streams of data can be very difficult to look at or grasp in one view,” Othman explains. “So the idea was how to integrate all of this data into one platform – so as to make it easy for the user to see not all of it, but the most important parts of it, so they can make sense of what is happening with the business and with the brands.”

Infotools took the team through a process of identifying the data and what the users needed to see. “We had to limit ourselves to the most important things, which was difficult, but it was also a valuable activity. This process took us about 3 months, from inception to the first run of the Console.”

The Console was designed to meet the needs of different users within the company – providing high-level views for the marketing unit, as well as individual country views for the different national managers.

“The success we have had lies in the fact that we are able to integrate all of this information within one interface. We can chart sales data with retail data, brand equity data plus media data. You can see all of them, all at once, on the same chart. This makes it very easy to navigate your way through all of this information and intuitively drill down if you wish.

“It does not require any training – it is so self-evident. If you are an Internet user, it is exactly like using the Internet, except that it does not take you to a new page each time; it just displays everything on the same page.”

Othman advises anyone considering this approach to “make it simple for the users from day one – don’t try to complicate it. You have to be very picky about which dimensions to include in each matrix. Recognise you cannot satisfy everyone’s needs with one tool. The user eventually finds that all the most important information is there, and they can refer to the databases directly if they need more.”

He concludes: “One of our biggest constraints is time. This allows us to have a very quick view of the current situation – how the brands are doing, the volumes and so on. It is quite easy to use and it saves you quite a lot of time. Before, it would have been hours and hours of work to do this, but now it is available in just a click of the mouse.”

A version of this review first appeared in Research, the magazine of the Market Research Society, January 2010, Issue 523