The latest news from the meaning blog

Industry taking a twin-track approach to social media research

Social media research continues to be one of the hottest topics in research. I’ve just been reviewing the abstracts for this year’s CASRO Technology Conference in New York in June, which I will be co-chairing, and of all the topics, it’s the one with the longest string of submissions. Not only that, but there is considerable diversity of opinion about what it is, how to do it, and whether it adds anything at all to the existing researchers’ toolkit. Closer to home, it’s a topic that will be debated at next week’s Research conference in London too.

[Chart: Analysis technology used on social media research projects, based on the 17% of firms active in social media research]

Social media research is also one of the new topics we focused on in our 2010 annual software survey, sponsored by Globalpark, the results of which are published today. There are some curious findings – and some predictable ones too – that add perspective to the current debate.

Our survey of over 200 research companies of all sizes around the world shows social media research is still at the early-adopter stage, accounting for revenue-generating activity in just 17% of the firms surveyed. Close to the same number – 19% – say they are unlikely to offer social media research, and of the remaining 63% who gave an answer, 31% say they are experimenting with it and 32% are considering it for the future. Small firms and research companies in Europe are the least likely to be doing social media research and are also the most likely to have ruled it out, whereas large firms are the most active. The actual volumes of work are still low – we also asked how much revenue social media research accounted for. It is 5% or under for two-thirds of the agencies that do it, and it tails off beyond that – but there appear to be some specialists emerging, with a handful of firms deriving more than 20% of their income from it.

Many firms are bullish about the future, though, with 20% predicting strong growth and a further 52% anticipating some growth. North American firms, and again the larger ones, are the most optimistic about its future.

As a technologist, I was most interested to see what technology firms were applying to what is, after all, something born out of technology. Were the tech-savvy gaining the upper hand, or were researchers taking the conventional, low-tech approach beloved of qualitative researchers? Again, it’s a bit of both. Of all the software-based or statistical methods we suggested for data analysis, the one that came top was “manual methods”, used by 57%. This was followed by “text mining”, cited by 54% (respondents could pick all the methods they used). Text mining, though it uses some computing power, is also very much a hands-on method – but it’s good to see more than half turning to it. Other methods make much less of an appearance, and the one I consider shows most promise for dealing with the deluge of data, machine learning-based text classification, came bottom of the list, cited by just one in six practitioners.
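For readers unfamiliar with the technique, machine learning-based text classification means training a model on a set of hand-labelled examples so that it can assign themes to new posts automatically – the computer does the routine sorting, while the researcher defines the categories. The sketch below is a minimal, hypothetical illustration in Python using the open-source scikit-learn library; the posts, theme labels and example query are all invented for the demonstration and are not drawn from our survey.

    # A minimal sketch of machine learning-based text classification
    # applied to social media posts. All data here is invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Hand-labelled training examples (hypothetical posts and themes)
    posts = [
        "I love the new packaging, so easy to open",
        "Customer service kept me on hold for an hour",
        "Great value for money, would buy again",
        "The app crashes every time I try to log in",
    ]
    themes = ["product", "service", "product", "service"]

    # TF-IDF word features feeding a Naive Bayes classifier
    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(posts, themes)

    # Classify a previously unseen post
    print(model.predict(["Why is the helpline never answered?"]))

In practice a usable classifier needs a substantial stock of labelled examples per theme, which is where the up-front human effort goes – but once trained, it can process volumes of posts that would defeat any manual approach.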

For data collection, technology was much more apparent – although it is hard to avoid here. We were still intrigued by the massive 54% who say they are using manual methods to harvest their social media data from the web; 57% were using web technologies to collect the data, and the more exotic methods were also fairly abundant, including bots (43%), crowdsourcing (41%) and avatars (24%).
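To make “web technologies” for data collection concrete, the sketch below shows the shape of a simple harvesting script: poll a public feed over HTTP and keep the posts that mention a topic. The endpoint URL and response fields are hypothetical placeholders – every real service has its own API, authentication scheme and terms of use, which any harvesting must respect.

    # A minimal, hypothetical sketch of harvesting public posts over HTTP.
    # The endpoint and response fields are invented placeholders.
    import requests

    FEED_URL = "https://api.example.com/public/posts"  # hypothetical endpoint

    def fetch_posts(query, limit=100):
        """Fetch up to `limit` public posts mentioning `query`."""
        response = requests.get(
            FEED_URL, params={"q": query, "limit": limit}, timeout=30
        )
        response.raise_for_status()
        return response.json()["posts"]  # assumed response structure

    for post in fetch_posts("brand X"):
        print(post["author"], post["text"][:80])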

I’ll pick up on some of the other intriguing findings from the study later. But as the report is out now, you can pick up your own copy by visiting this webpage – and there will be a full report in the May issue of Research magazine.

NVivo 8 reviewed

In Brief

What it does

Analysis software for qualitative research data, now with multimedia support, allowing integrated analysis of textual transcripts, native audio files, video recordings and other source materials. Offers a wide range of analytical and visualisation methods to support both rapid and in-depth analysis.

Supplier

QSR International

Our ratings

Ease of use: 3.5 out of 5

Compatibility with other software: 4 out of 5

Value for money: 4.5 out of 5

Cost

Single-user licence for commercial users: £1,155, plus optional annual support and maintenance for an extra £231. Upgrade from NVivo 7: £405. Volume discounts available. Substantial discounts for educational and public sector users.

Pros

  • Very flexible: offers many ways of analysing qualitative data in many formats
  • Excellent support for video and audio recordings of groups or depths
  • Offers visual as well as textual ways to handle and present data
  • Great help and tutorial material

Cons

  • Steep learning curve: not intuitive to the uninitiated
  • Limited multi-user capabilities

In Depth

NVivo, the stalwart of academic qualitative researchers, has suddenly embraced multimedia files with a passion, and in doing so has widened its appeal to a much broader spectrum of qual researchers. While video recordings of qualitative interviews and groups are now commonplace, handling video is often unwieldy and can force researchers to fall back on textual transcripts that fail to capture expression or nuance.

The breakthrough with NVivo 8 is its ability to import a wide range of source materials, including video and audio, and make these easy for researchers to tag with comments and observations. You can also import Word files and even PDFs, and you can link them together if you have a full transcript and a video of your group.

At the heart of the tool is a multimedia player with a timeline of the video or audio. You can view and hear the recording, pause it or slow it down, or start it and stop it from any point simply by dragging a cursor to any position on a time track along the top of the window. Dragging also previews the video, giving you an extremely efficient way to cue in on the part you are interested in. As you add coding or annotations, you can apply these to the timeline. Each is then colour-coded as a band running parallel to the timeline, giving you a very useful graphic representation of the data and where themes and overlaps occur – or even simply the parts you have not yet reached.

Importing any of these file formats could not be easier – NVivo 8 recognises all the main audio and video file formats and deals with them appropriately, including AVI, QuickTime, MPEG and WMV, or for audio, MP3 and simple WAV files. And you can also output video or audio – the software will enable you to create a collage or summary for your client to view.

The power of NVivo as an analysis tool lies in its concept of nodes. Nodes let you bring together strands of data, observations or comments however you wish – used creatively, they become the essence of the analysis, mapping out the concepts and the relationships between them. For example, if working from a topic guide, each topic could be represented as a node, and nodes can be stacked within nodes to form a hierarchy. Nodes can be used more freely too, to ‘mind map’ the data in a post hoc way.

As you work through the data, you assign as many examples as you can find to each node, or attach your own observations or interpretations as you go, so that, ultimately, when you examine any node, you have a rich and relevant collection of examples and ideas for each one, as the basis of your report. Those examples coded directly in the transcript or video timeline will allow you to jump straight back to the source, so you can see the context, and in the case of video, will cue you directly to the segment where the very words were being spoken. It is the moment that makes all the upstream preparation worthwhile.

NVivo 8 does support a degree of collaborative working, in that different members of a team can log into the project and any coding and annotations they make will be tagged with who made them. You can even analyse the variance in the use of coding between different users, to check for consistency. However, the software falls short of a true multi-user system: for users to work concurrently on, say, a large international project, you will have to provide each user with their own separate version of the project and merge the files together later.

The greatest obstacle in using the software, however, is likely to be its complexity. This is not a tool that you can easily figure out for yourself. It is one of the consequences of a design that offers a wide range of tools but does not seek to impose its own order by reducing the art of qualitative research to a series of wizards. Proper training and probably some coaching too is therefore essential.

To get the best out of many of the tools within NVivo, you really need to spend time coding and tagging your data first – which can easily take a day or two for a couple of groups.

If this level of rigour is considered overkill, or there just is not the time to achieve it, NVivo’s sister program, XSight, is more amenable to quick-turnaround jobs where the analysis does not have to go into such depth. But at present, XSight cannot handle audio or video. NVivo does not force you to do coding, however, and the ease with which you can analyse audio and video material makes NVivo 8 much more appealing to researchers with short deadlines to meet, as it can actually save time over other methods.

The client view

Silvana di Gregorio, owner of SdG Associates, is an independent qualitative analysis consultant with a specialty in software support and integration. NVivo is one of several qualitative analysis packages she uses to analyse data for both social policy and commercial market research projects.

“I first used the software in 1995 when I was working as an academic. At that time, academics had similar reservations to market researchers today about using software for analysis of qualitative research – but this is based on a fundamental misunderstanding of how this software will support the analysis and a fear that it will reduce everything to numbers.

“NVivo 8 has made an extraordinary leap forward, with the ability to analyse video, audio and graphics. I think it can revolutionise ways of analysis. For example, if you code the video, NVivo adds coding stripes along the top and suddenly you have an entirely different picture of the data. It offers new ways to analyse and also to present your data, which may be more attractive to the commercial market researcher.

“With NVivo 8, you don’t have to transcribe everything, you can import the audio or video and then you can just write notes as you work through it. I have found coding directly onto the video timeline works well. With video it is quite easy to do as you can see where you want to stop: it is harder to do with audio. If there are parts that interest you, you can then do partial transcripts just on those parts.

Silvana is also impressed with the data visualisation and charting features that have been introduced with NVivo 8. “These are quite straightforward to use. With a focus group, for example, you can instantly get a visual picture to show you if anyone is dominant in the group. Recently, I created a matrix query between the different speakers in two focus groups. I had coded for the different types of statements made. I simply turned the matrix query into a radar chart, with each spoke representing one of the speakers. You could quickly see there were two people who made no balanced comparisons. It offers another quick visual feel for the data. NVivo supports a lot of ways of analysing data – and I am still discovering more.”

A new book, ‘Qualitative Research Design for Software Users’ by Silvana di Gregorio and Judith Davidson, will be published by Open University Press in October.

A version of this review first appeared in Research, the magazine of the Market Research Society, May 2008, Issue 503

XSight 2.0 reviewed

In Brief

What it does

Windows-based qualitative researcher’s workbench which helps researchers to organise all their material and their thoughts in one place. It supports thorough and systematic analysis of transcripts or other qualitative data using a variety of approaches, ranging from the quasi-quantitative through to unstructured, intuitive or ideas-led methods.

Supplier

QSR

Our ratings

Ease of use: 3.5 out of 5

Compatibility with other software: 4 out of 5

Value for money: 4.5 out of 5

Cost

£795 for a single user, with volume discounts of up to 25% and special rates for educational and government sector users. Annual support and maintenance is £160.

Pros

  • Much improved ease-of-use over version 1
  • Now incorporates a mapping tool for free-format thinking or mind-mapping
  • Very flexible: offers many different ways to sift and analyse data
  • Can share work and collaborate on the analysis locally or internationally

Cons

  • Hard to analyse groups whose participants don’t all share the same sample characteristics
  • Performing queries can still be a challenge
  • No easy way to import structured lists or tagged items from Excel or Word
  • Word export uses styles in an illogical way; PowerPoint export virtually unusable

In Depth

XSight, when it launched midway through 2004, was the first serious attempt to bring IT to bear on the work of the qualitative market researcher. It presented a welcome breakaway from the many code-and-tag qual tools used by social researchers. But two and a half years and a couple of minor upgrades on, the program was looking tired – and some researchers were finding fault with it for being a bit stiff and unfriendly.

It is clear that QSR, authors of both XSight and NVivo, which is widely used in academia, have listened carefully to users and critics, because the transition from version 1 to version 2 is like taking a flight from wintry Britain and emerging into the warmth and sunshine of the tropics. The program is brighter, more colourful and altogether more fluid in its approach. There are many great new features to save time and effort on the journey that starts with a wall of words and ends with those nuggets of wisdom and understanding that get clients excited.

Not only does the program look much nicer, but it addresses several serious gaps in version 1 – for example, the lack of free-form tagging outside of analysis frameworks. Now, you have a stock of coloured circles, stars and the like that you can apply, giving each one your own annotation. You can track these all the way through to your report, and your annotation will appear as a tooltip whenever you mouse over the shape, which is handy if you use a lot of them.

The biggest change is what QSR calls maps, which let you lay out a tree diagram of ideas for analysis. Anyone familiar with mind maps will recognise the similarity, though it is not a true mind map: your concepts sit in discrete but linked boxes, and you cannot annotate the links as you can with a mind map, only the boxes. But it is fit for purpose, as you can grow your ideas organically, as on a whiteboard, and when you are ready, transform the map into a ready-to-go analysis framework. It partly overcomes another of my bugbears, in that setting up analysis frameworks can be extremely tedious – especially if you have a book-length discussion guide to work to. But the program still lacks a decent import from structured lists in either Word or Excel, for cases where the guide or the analysis data has already been given a structure externally.

The new interface makes the software much more usable overall. It now supports drag and drop throughout. It also provides plenty of toolbars, which are all configurable, so you can put your favourite tools together, or hide the ones you rarely use. All the options are also available from the toolbar and many also by keyboard shortcuts or from right mouse clicks.

In the old program, your display would split vertically to let you have two work areas in view at once, but the analysis framework and outputs from queries always had exclusive use of the top pane. In version 2 you can use your two display regions however you like, simply by dragging the tab of a view from one pane to another, which alone makes the application much more productive to use.

Querying is still a bit of a dark art, and in my view the default display remains somewhat cluttered and unfriendly, though the capability is extremely useful. Perhaps this will be the focus for version 3. The report-writing capabilities, however, are greatly improved, and the export to Word works a bit better than it did. Unfortunately it writes out headings as character styles, not paragraph styles, which means you cannot generate a table of contents, and a lot of the rest of the formatting is hard-coded as local overrides, so you could spend hours trying to turn an almost-nice document into something presentable. I cannot imagine who would use the ‘presentation’ feature, which is a bit like PowerPoint, but not enough like it to be much help.

But these niggles are fairly peripheral and should not deter anyone from requesting an evaluation copy. Many users write their presentations in a separate window in Word or PowerPoint, and it is terribly easy to cut and paste between the two.

Now I have used the software for analysing some transcripts myself, I cannot think of a better way to do it. If you are still shuffling documents on paper or trying to organise material in Excel and Word, a copy of this should save you enough time to be able to leave the office on time most days.

Customer perspective: Liz Montgomery at GfK London

GfK NOP now make significant use of XSight, with 23 licences in the UK and further afield. Liz Montgomery, Divisional Director, GfK NOP Business and Technology, is a regular user and has moved from version 1 to version 2. She comments on her experiences with XSight 2:

“One of the things I have noticed is that you can use it in a lot of different ways. Some people like to start by using the whiteboards, which they (QSR) call maps. This works well if you are time-rich and have a more open palette to analyse. Other people will start with a much more structured approach, with a detailed analysis framework; yet others will just put themes in and work their transcript items back into the themes.

“It is a very good collaborative tool, especially for large international projects, and is getting stronger in this. We often have five or more consultants working on a project in different locations; it can really help to give them a structure to write things up, overall. But it also gives them the freedom to add stuff in without messing up the layout – and we can import it, incorporate it and re-analyse it later.

“I particularly like the free search capability, and also ‘ideas’ – if you have an idea, it encourages you to write it down there and then, when you think of it. Even if the idea falls over later when you come back to look at it, sometimes it gives you something very exciting.

“You could do a lot with version 1, but the old interface was clunky and slow. In the new version it is much faster, and the toolbars seem to be more logical.

“The biggest improvement is maps, which I think of as whiteboards, and along with them the ability to link everything together onto the whiteboard. And tagging is great. It was absent before and was really needed. Tags give you lots of flexibility to organise your data.

“Another big improvement is the way you can select and output quotes, which worked before but was rather ugly in the old program. You can even write your whole presentation in this if you want to, and I know of some people who do just that. For me, it is really useful being able to output the analysis and the quotes into a nicely formatted Word document.

“At GfK NOP, we see it as adding a lot of richness. It is a very positive marketing tool with some clients, for example, but it also genuinely gives you efficiency and the ability to handle the data in ways you could not easily do before. I have found, particularly with some complex studies where there is a lot of data, that it would be quite hard to analyse really thoroughly with traditional methods – this lets you do that, and do it quickly too.

“The fear for some qualitative researchers is that this is automating thinking, and it definitely doesn’t do that. You have to do the thinking. It enables you to think more and look at everything quicker – and just get more out of the data.”

A version of this review first appeared in Research, the magazine of the Market Research Society, April 2007, Issue 492