The latest news from the meaning blog

 

Read the results: Globalpark Annual MR Software Survey 2009

We’re excited to be able to share with you the results of the Globalpark Annual Market Research Software Survey, which is published today. We have been conducting this international survey – which looks at the software and technology used in the market research industry – every year since 2004, and we make the results freely available to all.

Pie chart showing the perceived viability of mobile self-completion research

In 2009 we introduced some new questions about mobile research and communities. These reveal that the MR industry is split over the merits of self-completion mobile-based research: 45% see it as viable, 48% see it only as close to becoming viable (and not really viable at present), and a further 7% never expect it to be viable (chart to left). It is also very striking that large companies take a far more optimistic view of the viability of mobile research.

Our questions on communities also proved interesting. For example, we learnt that only one in six research firms was operating a community in 2009, and those that were ran very few of them.

Four- and five-year trends

Chart showing the growth of the mixed-mode integrated platform

Over the years, our tracker questions have revealed some interesting trends. For example, we ask respondents whether they use an integrated platform for their multimode research or whether they switch between modes (chart to right). We first asked this question in 2006 and since then we have detected a gradual but consistent trend towards integrated platforms. In another question we found that a large majority of respondents (84%) thought that multimode data collection was either essential, very important or moderately important when choosing a new data collection tool. Software developers, please take note!

Where are the tools to enable Web 2.0 research?

Researchers cannot afford to ignore Web 2.0 approaches to research, as Forrester analyst Tamara Barber makes clear in a persuasive article on Research Live, in which she settles on market research online communities (MROCs) as the most effective way to achieve this. How to do Web 2.0 research, from a methodological point of view, is generating a great deal of discussion at MR events this year.

In her piece, Ms Barber focuses on the social or participatory characteristics of Web 2.0, where there is obvious value to research. But the other characteristics of Web 2.0 lie in the technological changes that have emerged from its 1.0 antecedents – the Internet becoming a platform for software, rather than a delivery channel for information. Indeed, the technology – Ajax, web services, content integration and powerful server-side applications – is as much the hallmark of Web 2.0 as the outward manifestations of the social web. It is on the technology side that market research has a lot of catching up to do, and until this gets sorted out, Web 2.0 research will remain an activity for the few – for patient clients with deep pockets.

The specialist tools we use in research are starting to incorporate some Web 2.0 features, but nowhere does this yet approach a fully integrated platform for Research 2.0 – far from it. Panel management software is morphing into community management software, but the web survey tools it links to do not yet make it easy to create the kind of fluid, interactive surveys the Web 2.0 researcher dreams of. Nor are the tools for analysing the rich textual data that these new kinds of research produce truly optimised for all forms of Web 2.0 research data. There are pockets of innovation, but multi-channel content integration – a key feature of Web 2.0 sites – is still difficult, so researchers are left drowning in data and running to catch up on the analytical side.

Another problem arises as more ambitious interactive activities and research methods emerge: the demands on both the respondent and the respondent’s technology increase, and some are getting left behind. Participants find themselves excluded because their PC at home or at work won’t run the Java or other components needed to complete the activity – whether it’s a survey, a trip into virtual reality or a co-creation exercise – or won’t let them upload what you are asking them to upload. Even relatively modest innovations, such as presenting an interactive sort board in the context of an online survey or focus group, will exclude some participants because their browser or their bandwidth won’t handle it. Others simply get lost because they don’t understand the exercise – there is a growing body of research into the extent to which respondents fail to understand the activities they are being asked to engage in.

New Scientist recently reported on innovations in gaming technology where the game learns from the level of competence demonstrated by the player and uses this to adjust its behaviour. It’s the kind of approach that could help considerably in research. Unlike online gamers, research participants can’t be asked to spend more than a few seconds learning a new task, and we can’t afford to lose respondents because of the obvious bias that introduces into our samples.

For Web 2.0 research to move beyond its current early-adopter phase, not only do researchers need to take on these new methods, but research software developers also need to be encouraged to take a Web 2.0-centric approach to their development.

Globalpark EFS 7 Panel and Communities Reviewed

In Brief

What it does

Fully hosted software-as-a-service online research suite that offers a high level of performance and flexibility, with tightly integrated panel management capabilities. The panel module now offers support for online research communities.

Supplier

Globalpark

Our ratings

Ease of use: 3.5 out of 5

Compatibility with other software: 4 out of 5

Value for money: 4.5 out of 5

Cost

Three components: a set-up and customisation fee for the panel, typically £10,000-£14,000; an annual company-wide licence fee of £2,700 for the survey module plus, for the panel, a sliding scale from £6,830 (10,000 members) up to £20,630 (half a million members or more); and a usage fee per completed interview, also on a sliding scale, e.g. 49p for 10,000-20,000 interviews a year, down to 12p for 2 million.

Pros

  • A captive application for CATI interviewers and supervisors rather than a web browser interface
  • Integrated question and media library for rapid survey development
  • Works with any modern browser or OS
  • Provides a full web content management system (CMS) for multiple panel/community sites
  • Panel can work standalone with other interviewing software, e.g. for other modes

Cons

  • Online and mobile interviewing are the only survey modes supported
  • Steep learning curve
  • A lot of web technical knowledge needed to fully exploit panel customisation
  • Contains qual research elements but no obvious workflow for qual projects

In Depth

How a panel differs from a community has become a talking point in the research profession of late: how to avoid influence, whether incentives should be paid or not, or even whether the two differ at all. It’s clear that there is diversity in understanding and practice, and in introducing community support to the Globalpark EFS interviewing suite (EFS stands for Enterprise Feedback Suite), this research software provider leaves those decisions to the individual. You could use the software to run multiple communities, multiple panels or any combination of the two, with a different website for the members of each, and behind the scenes you may choose to keep all your panel members in one database and segregate them logically, or physically segregate them into separate databases.

Globalpark EFS splits the task into three essential components: the panel (or panels), projects and websites (the panel members’ portals). Therefore, if you had a panel of customers and wanted to create a community of premium customers, as an elite group drawn from the panel, you could create a special website for them. Surveys are deployed through the respondent-facing website, and can be deployed to more than one site. They can even be skinned differently, so the survey the premium customers get could be the same survey as in the general panel, but take on a different look, consistent with the premium site’s theme. This also makes it a very appropriate pick for research companies, alongside the corporate EFM customers that Globalpark target, since panels and surveys can easily be branded for different customers or contexts.

The real power of the system is in its ability to create multiple panel and community websites, and for these sites to contain dynamic content driven from a number of sources. It means that once a site has been configured, no further technical tweaking is required, provided you do not fundamentally change the scope of what you are doing. All the routine activities – putting surveys live, inviting panellists to participate, collecting demographic and contact updates from members, handling reward redemption – and the more community-oriented tasks, such as adding content to news feeds and featuring snap polls and survey results, are managed through a set of attractive and straightforward control panels.

The site builder is another matter, though – it is aimed squarely at the web technician, and it will tax even the specialist, as there is a lot to learn and a lot of layers to work through. What Globalpark give you is a fully functioning web content management system (CMS) which conveniently happens to understand surveys and panels. It is HTML- and PHP-based, browser-independent and, following best web practice, rigorously separates presentation from function. To make it a little less complex, most text content can be written in Smarty, a templating language, rather than in PHP code. This makes it easy to pull fields from the panel database for display, and to put logic into the text too.
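To give a flavour of what that Smarty-based approach looks like, here is a minimal, hypothetical sketch of the kind of template fragment a site builder might write. The field names (panelist.first_name, panelist.bonus_points) and the threshold are illustrative assumptions, not Globalpark’s actual variable names:

  {* Greet the panel member by name, with a generic fallback.
     Field names are illustrative, not Globalpark's own. *}
  {if $panelist.first_name}
    <p>Welcome back, {$panelist.first_name|escape}!</p>
  {else}
    <p>Welcome to the community!</p>
  {/if}

  {* Pull a profile field straight from the panel database into the page *}
  <p>You currently have {$panelist.bonus_points|default:0} bonus points.</p>

  {* Simple logic in the text: only show the redemption link above a threshold *}
  {if $panelist.bonus_points >= 500}
    <p><a href="/redeem">Redeem your points now</a></p>
  {/if}

Because the logic lives in the template rather than in PHP, a reasonably technical researcher can tweak page content without touching the application code – exactly the separation of presentation from function described above.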

It’s a highly accomplished implementation of a CMS and you could certainly use this software to build big, fast-moving, content-rich sites in which the survey activity was only a small component. It is a clever stance to take, though the trade-off for all this flexibility is the time and expertise required to create a new site. This will not let you pop up a new community in a couple of hours. To be fair, the people at Globalpark recognise that only a minority of customers would be able to do the configuration from scratch, and tend to quote for doing the initial configuration work with new customers.

Version 7 also introduces a number of new Web 2.0-style ‘community’ building blocks. Forums allow you to create threaded discussions, with members contributing responses or, optionally, defining new topics too. Whiteboards let you create a simple single-topic forum. Blogs let you turn the commenting over to your participants, who can add their own content and upload images, documents and so on; you can also feature selected blogs on the home page. Chat lets you hold one-to-one or group discussions in real time, though it stops well short of a full online focus group.

You can restrict access to forums and all the community components, so you can work with an invited subset of members only. Wherever content upload is an option, you can restrict the file types you permit – for example, only JPEG images or Word documents – and limit file sizes too. It’s all very sensible, but it does not yet quite gel for the qualitative researcher wanting to pull panel members into open, semi-structured research. There is no built-in workflow in the way there is for a quant survey, and your data is likely to end up scattered all over the place. This needs more thinking through, and no doubt later versions will improve the situation.

However, praise must go to Globalpark for providing these features and making the software entirely DIY, if you have the skills to do the CMS configuration work behind the scenes, because many other community tools do not give you this degree of control or flexibility. You could do a lot of novel and interesting community-based and collaborative research with what this offers.

Customer perspective

Sony Music in Germany started using Globalpark EFS a year ago for a range of research activities carried out in-house using their own panel. These include new product and concept testing, as well as song cover and artist image tests for upcoming artists or newcomers. Michael Pütz, Director of CRM, Web Strategy and Research, explains: “We also create target group profiles, including information about media usage, which is useful for developing marketing and media plans later on, and we use it to gain additional overall consumer insights.

“It is sometimes said that the music industry is failing to meet consumers’ needs and adapting too slowly to new business models and technologies; our activities with our online panel at www.musikfreund.de (along with other initiatives) show evidence to the contrary. For some years now, our consumers have become a regular part of A&R [artist and repertoire] and marketing decisions and our reliable partners in developing new business models and proofs of concept.”

The market research team was therefore seeking something that would let them create well-structured, well-designed surveys and offer integrated panel management capabilities too – and allow them to expand some of these panels into communities, something else EFS offered.

Mr Pütz continues: “The possibilities with EFS are huge. We are constantly challenging EFS and the Globalpark team, and they nearly always come up with good ideas on how to transfer what we want to do into solutions.” He notes in particular the ways in which Globalpark lets users save time and improve consistency through standardised, ready-made question types and a media library that makes it easy to insert the audio and video clips which are fundamental to his research.

“The basic functionalities of EFS are easy to learn and to teach. However, the configuration and tool menus of EFS can be a little bit confusing to beginners – it is not self-explanatory, which is when the help of Globalpark’s support teams and experts is needed.”

A version of this review first appeared in Research, the magazine of the Market Research Society, October 2009, Issue 521

Do research communities provide a new way to engage with customers for research – or are they just panels on steroids?

Communities do seem to be generating a buzz this year. I spoke at the Online Research Conference in London – on technology, of course – and I’ve put my slides into our downloads area. Several of the speakers talked about communities, particularly the client-side presenters. There was a discussion about whether communities should offer an incentive or not and, surprisingly, many of the client-run communities run very effectively with none; indeed, some practitioners were of the view that an incentive alters the dynamic and effectively misses the point of building a community.

The event revealed a divide in practice emerging among those developing and using communities. I fear that for some, the community is conceived merely as a kind of ‘panel plus’, where a bit more feedback is provided, a bit more branding is built in and a bit more effort is put into keeping people happy (no bad thing in itself), but the relationship is still fundamentally one of the researcher and/or client wishing to control the process. In this context, it is understandable that incentives are necessary, because to the respondent the experience differs little from the best practice of some of the really reputable and more respondent-centric panel companies.

Communities, on the other hand, are subversive of the research process. It is clear that several firms attempting to implement a community struggle with the control they have to cede to the participants in order to make it work. If people are to be encouraged to speak freely and exchange ideas with one another, with blogs, forums and the like, the corporate message-control wonks often get restless, and many an initiative has probably been crushed by the fear of what could come out in a public or semi-public arena. Yet those who do go the whole way with their communities and allow members to set the discussion agenda seem to come away with surprisingly positive experiences.

Both speakers at the conference and practitioners I’ve chatted with seem to agree that negative opinion is always in the minority across the board, and when it does arise, other voices will often defend or moderate the company, or sing its praises elsewhere – in ways that carry a credibility within the Web 2.0 milieu that no PR department could hope to match.

Yet I have heard others describe communities as ‘glorified panels’, and that I find worrying. True, they do share some characteristics, and indeed the underlying technology used for panels can sometimes be tweaked to run a community too. But it is important that we, as an industry and as practitioners, are able to distinguish between the two. Perhaps one useful differentiator is whether an incentive is involved or not – though even then, an incentive can be ethically appropriate if the research activity is particularly time-consuming, such as keeping a daily blog for an extended period.

Another differentiator, and certainly a challenge to the researcher, is that a community is likely to have a wider remit than just research, with responsibilities shared among marketing, product development and PR as well as research – indeed, research may be a relatively minor player. This did not surface at the Online Research Conference, but Pat Molloy and I mentioned it in our presentation at Casro Tech 09: according to the Tribalisation of Business 2008 study, it is marketing departments that tend to be running communities, not the MR or insight teams. Building more communities for each fiefdom is hardly going to be the answer – researchers are going to have to find ways to align their goals and methodological approaches with colleagues who have a very different take on communities. It seems that communities demand power-sharing with more than just the participants.

Vovici 4.0 reviewed

In Brief

Vovici Community Builder and Feedback Intelligence, version 4.0

Vovici, United States
Date of review: November 2008

What it does

Web-based suite for building online research communities and custom panels, for both quantitative and qualitative research. Allows you to create fully branded respondent community portals easily, using an online point-and-click interface. The Feedback Intelligence module offers sophisticated dashboard and drill-down reporting for individualised reporting to stakeholders across the enterprise from multiple data sources, and is integrated with Business Objects.

Our ratings

Ease of use: 4.5 out of 5

Compatibility with other software: 5 out of 5

Value for money: 3.5 out of 5

Cost

Annual fees in US dollars: Community Builder module starts at $24,995 for up to three named users; Enterprise Feedback Management module starts at $24,995 plus $1,500 for each portal user.

Pros

  • Standardises and co-ordinates surveys, questions and measurements across all of a company’s survey activities
  • Requires no web programming or HTML skills for the most part
  • Platform independent – Windows, Mac or Linux with any modern browser
  • Integrates with Oracle CRM and a range of industry standard CRM systems

Cons

  • No built-in incentive or reward management
  • Can only execute surveys created in the Vovici EFM survey module
  • Relatively expensive

In Depth

Today more and more companies are realising the benefit of building online panels of customers to involve in research. The idea is simple, but the reality can be complex and costly to deliver from a technical standpoint. Whether firms try to build them for themselves or park the problem with a research agency, it is an area crying out for an off-the-shelf solution like the new Vovici Community Builder, which was released last month and effectively relaunches the concept of the panel management tool for the demands of Web 2.0-style research.

The product is a completely web-based suite which sits astride a database of contacts or panellists, and interfaces directly with the Vovici EFM survey engine as well as other enterprise platforms or CRM systems – such as Siebel or Hyperion – so that sample selections can draw on real behavioural data from a customer’s recent transactions. Configuring the interfaces with other enterprise data sources is, understandably, beyond the lay user, but once these have been set up, a customer’s purchase history can be used just like any other piece of panel profile data, such as age or location, or be used to drive sample selections for just-in-time research.

The Portal Builder

At the heart of the software is the Portal Builder, in which you design the pages of the community site your panel members will visit. It is effectively a content management system which allows you to lay out pages with placeholders for content streamed in from other sources, arranged neatly in the columns and boxes most websites use these days. So in the centre you could choose to put a list of the surveys the respondent is invited to, with some introductory text above, headlines from the current community newsletter on the left, highlights of recent survey results on the right, and so on. The portal has built-in support for just about all the objects you are likely to need when building a research community site: a profile editor so panellists can view and update their personal data; current survey invitations; past surveys taken; containers for welcome messages, help and links to more information or contacts. The list reaches far into the Web 2.0 milieu: you can add forums for collaborative discussion, blogs for respondents to view and react to, and data mash-ups.

There is also a wealth of collaborative tools, from a simple suggestion box to access-controlled forums that can be used for asynchronous focus groups, so that quantitative surveys can be backed up by some selective qual work, or vice versa. The highly modular approach means that any tool can be access-controlled and made available only to invited participants. And if you are concerned that the portal page is getting a bit busy, it is easy to spread it across a series of tabbed pages, which you can title and organise how you like. There is a large template library, and it is very simple to create an overall theme with your own imagery and branding.

You can also publish results through the portal and make these relevant to the respondent – you could present each member with a report showing their answers compared to the survey as a whole, for instance, or show highlights and add commentaries. Vovici emphasise this as the means to build interest and engagement, and work on the assumption that the interest a community member derives from the experience as a whole will eliminate the need to offer financial inducements. As a consequence, there is no built-in incentive and reward management capability in the product – something that will not go down well with agencies wishing to build panels.

Though the Vovici name may be unfamiliar to many, what is now branded as Vovici EFM was originally developed as Perseus EFM. The main web survey capabilities and engine are an incremental development of the Perseus EFM software which Interface reviewed in Research in July 2006, when it was already a mature and capable offering for online research. Vovici has recently established sales and support offices in London and Singapore alongside three existing locations in the United States.

The other major addition since Vovici took over is in reporting. There is now a dashboard reporting system largely in place, with some development ongoing. It follows a similar philosophy to the Community Builder by allowing you to arrange graphical and tabular reports across the screen in columns and rows – as designer, you choose what reports to show simply by pointing and clicking, selecting them from menus and so on. Again, the overall appearance is controlled by externally defined templates and stylesheets, so the entire reporting experience can be themed and branded to match a corporate intranet site.

Reporting in Business Objects

The reporting system is, in fact, built on Business Objects (using Crystal Reports), which is a widely used reporting tool in the mainstream corporate database and business intelligence sector. However, Business Objects is typically of limited use with survey data, because it does not understand common survey concepts such as multiple-response data, respondent bases that may differ from the number of responses to a given question, or one-off data formats for each short ad hoc survey. The breakthrough with Vovici is that the developers have created a data model and accompanying metadata to make research data comprehensible to Business Objects.

The beauty of this is that any report can be a composite of hard commercial data alongside softer attitudinal and intentional survey data. Questions, too, can be analysed across different surveys. By smashing through the old silo approach, Vovici is also working towards delivering true benchmarking capabilities. The idea is that any question can be reused across any survey and, once it has been reused, all responses to it can be used to provide a benchmark – or, by filtering that benchmark, to provide sector-specific comparisons.

Enterprise Feedback Management providers like Vovici are probably more aware than most MR software suppliers that their products will appeal to both the corporate user wishing to do their own research, and the research agency – and the platform lends itself to collaborative working between client and supplier. For example, the community portal and interfaces with corporate data sources could all be under the responsibility of the corporate client, while the creation of actual surveys and the preparation and publishing of results can be contracted out to one or more research suppliers, using the same platform.

This is a vast system with massive potential, and its very design reveals that some research-literate minds were behind it. Users I spoke to report that the community functionality is stable, reliable and relatively easy to learn, and has enabled them to standardise and systematise their research and harmonise measures across very large enterprises. Perhaps the product’s greatest strength is its ability to integrate with CRM systems and other business intelligence sources, making research more relevant and mainstream within the corporate enterprise.

A version of this review first appeared in Research, the magazine of the Market Research Society, November 2008, Issue 509