The curse of seeing everything

From Research 2010, MRS, London, 23-24 March 2010

A major issue with post-modern research methods, or ‘new MR’ as it is sometimes called – a recurrent theme at the Research 2010 conference – is the sheer amount of data they generate and the consequent effort that goes into extracting any meaning from it. This came home in the new technology session, chaired by Robert Bain and billed as ‘Research Unlimited’. Not that any of the technology presented was essentially new – naming the session “incremental developments in technologies based around memory, newly applied to market research” might have added precision, but it would not have made the message any clearer.

The pursuit of clarity should be at the heart of any new method – and that is a challenge with two of the methods showcased, both based on neurometrics, from Nunwood’s head of R&D Ian Addie and Millward Brown’s new head of ‘Consumer Neuroscience’, Graham Page. Page is probably the first MR staffer to have the N-word in their job title.

Neurometrics

Improvements in EEG measurement and analysis technology make the approach more affordable and slightly more applicable to surveys in the real world, but it still has a long way to go. The electrode caps and camera-rigged spectacles modelled on stage by Addie, and even the slimmed-down version shown by Page, are still pretty clunky and intrusive. Addie also cautioned that ‘noise’ in the data collection meant that 30 per cent of the data they had collected had to be discarded.
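Addie did not describe Nunwood’s clean-up process, but a common first pass on noisy EEG recordings is simple amplitude-based artifact rejection: epochs whose peak-to-peak swing exceeds a plausible physiological range are assumed to be contaminated by blinks or movement and dropped. A minimal sketch in Python (the 100 µV threshold is a conventional rule of thumb, not Nunwood’s figure):

    import numpy as np

    def reject_noisy_epochs(epochs, threshold_uv=100.0):
        # Keep only epochs whose peak-to-peak amplitude stays within a
        # plausible physiological range; larger swings are treated as
        # blink or movement artefacts and discarded.
        return [e for e in epochs if np.ptp(e) <= threshold_uv]

    # Ten one-second epochs of fake EEG sampled at 256 Hz, in microvolts
    epochs = [np.random.randn(256) * 40 for _ in range(10)]
    clean = reject_noisy_epochs(epochs)
    print(f"kept {len(clean)} of {len(epochs)} epochs")

Whatever the exact method, the point stands: a large slice of what you collect never even reaches the analysis stage.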

Positivism with a big P

Both speakers showed that this kind of data can aid understanding, and can usefully cast new light on some deeply held assumptions about consumer behaviour, which is no bad thing. Nunwood respondents who had been wired up with electrodes for supermarket visits revealed that a significant amount of the time spent selecting products seemed to go on rejecting other products – not something that is much questioned in conventional recall studies. As research was busy going po-mo in other sessions, this looked like a rallying call for Positivism with a big P.

Page cautioned: “Hype means it is very easy to get carried away with exaggerated claims [for neuroscience]. The results don’t stand on their own: you have to combine this with something else.”

Not only that, but you quickly accumulate a vast amount of data that takes time and effort to process. Furthermore, to give it any meaning, you must apply the qualitative judgement of the researcher or neuroscientist. This additional burden was also true of the other novel method in the session. Here, Bob Cook from Firefly presented an interesting extension to diary research – particularly those studies that lean towards the auto-ethnographic – with a methodology based on lifelogging, or ‘glogging’, using a small fish-eye camera worn around the participant’s neck. The camera takes a shot at one-minute intervals throughout the day, capturing everything the respondent sees. Cook reckons it can overcome the usual problems of incomplete recall that arise over the more mundane and automatic activities respondents may be asked about.
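The capture side of the method is mechanically simple. As a rough illustration only – Firefly’s device is dedicated wearable hardware, not a laptop webcam – the timed-capture loop amounts to something like this sketch using OpenCV:

    import os
    import time
    import cv2

    def lifelog(interval_s=60, max_shots=480, out_dir="frames"):
        # Grab one frame a minute from the default camera and save it,
        # timestamped, building up a browsable visual diary of the day.
        os.makedirs(out_dir, exist_ok=True)
        cap = cv2.VideoCapture(0)
        try:
            for _ in range(max_shots):
                ok, frame = cap.read()
                if ok:
                    stamp = time.strftime("%Y%m%d-%H%M%S")
                    cv2.imwrite(os.path.join(out_dir, f"{stamp}.jpg"), frame)
                time.sleep(interval_s)
        finally:
            cap.release()

    lifelog()

At one frame a minute, that still amounts to the best part of a thousand images per waking day per respondent – which is exactly the analysis burden this session kept returning to.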

Making sense of the data

The problem, in trying to move such techniques into the mainstream, comes at the analysis stage. Getting meaning from these techniques takes extraordinary effort – and they are not amenable to the analytical methods conventionally applied to either qual or quant. We are not usually short of data these days, but we are short of tools to make sense of these new streams of it. Without them, analysis is inordinately time-consuming. Technology makes it easy to add precision and volume, but with all these new methods it falls heavily on the researcher to bring out the message.

Translation on the fly (or on the sly)

World Wide Lexicon Toolbar is a new plug-in for the Firefox web browser that promises to take webpages in any unfamiliar language and, as you browse, simply present the pages in English (or, for non-English speakers, the language of their choice). My preparations for the trip I am about to make to Korea have focussed my mind on the frustrations of being unable to read webpages. But I was also curious to see how useful this would be to Web 2.0 researchers who are analysing social media content and the like.

It is a very smart add-on: if you browse to a page that isn’t in a language you understand, the page will be machine-translated and presented to you. If a human translation has been made, it will show that instead. It surpasses Google’s option to machine-translate pages in a couple of other ways, too: more languages are covered, and the translated version is presented in the format and style of the original page. There is even an option to double up the text so you can see the original and the translation together. Of course, the translated text may still disrupt the layout, but it gives you a much better idea of the context of the text, which aids understanding considerably.
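That human-first fallback is what distinguishes the toolbar from plain machine translation, and the logic is worth spelling out. Here is a toy sketch of the idea in Python – the names and data structures are purely illustrative, not the World Wide Lexicon’s actual API:

    # Hypothetical store of community-contributed human translations,
    # keyed by (page URL, target language).
    COMMUNITY_TRANSLATIONS = {
        ("http://example.com/page", "en"): "A human translation of the page.",
    }

    def machine_translate(text, target):
        # Stand-in for a call to a real machine-translation service.
        return f"[machine translation into {target}] {text}"

    def translate_page(url, text, target="en"):
        # Prefer a human translation when one exists; otherwise fall
        # back to machine output.
        human = COMMUNITY_TRANSLATIONS.get((url, target))
        return human if human is not None else machine_translate(text, target)

    print(translate_page("http://example.com/page", "Bonjour", "en"))
    print(translate_page("http://example.com/other", "Bonjour", "en"))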

The software is currently in beta and can be installed free of charge from the Mozilla Firefox add-ons page. Reports from early adopters are that it is extremely useful, provided you are willing to put up with the limitations of machine translation. The human translations it shows are those entered by volunteer contributors to the World Wide Lexicon community. It’s a fantastic idea and another example of the wisdom of the crowd at work on the Web. Yet the reality for any social Web researcher is that the blogs and community forums you are likely to visit will not have attracted the attention of a community-minded translator, and you will still need to endure the inadequacies of the machine translation.

Machine translations are not bad with well-constructed texts that have been written in a stylistically neutral way, but the more colloquial and idiomatic the text is, the more bizarre and worthless the translation becomes. I don’t have the means to try this out, but I suspect this tool may be more useful when doing Web-based desk research into more authoritative sources than the general Web 2.0 free-for-all. For that, we need machine translations to get smarter.

Why on the sly? You need to register and log in to use the service, and the server must, by definition, be aware of every page you visit – so you are giving the plug-in’s owner a complete trail of your browsing activity. This is not made clear when you sign up. If it bothers you, you could use Firefox only when you wish to translate something, and another browser for whatever you wish to keep private.

Tim is at the First International Workshop on the Internet Survey this week, organised by Kostat, Korea’s national statistics service, and will be posting highlights from the event.

Has the Insight Show overheated?

Technology was an aspect of this week’s Insight Show that the exhibition’s promoters were majoring on, yet on the ground the number of technology providers exhibiting was thinner than ever – I found just 13. Who was there? End-to-end mixed-mode providers were represented by Askia, Confirmit, Merlinco, Nebu and Snap, plus online specialists Itracks and the newcomers on the block, ebox software. The niche providers were represented by E-Tabs (a niche maker in their own right for report automation), Centurion and Cint for panel management, Intellex Dynamic Reporting for interactive analysis, OnePoint for mobile data collection, Think Eyetracking for, well, eye tracking, and Visions Live, a new qualitative research platform – plus, rather strangely, a presence from Panasonic, featuring their Toughbooks as a rugged CAPI device.

Part of the reason for the shift of the Insight Show from the back end of the year to the middle (last year’s show was barely seven months ago, in November) was to merge four of Centaur’s marketing-related shows under one roof, where they were colour-coded and branded as MarketingWeekLive! Insight was in the orange corner. But lo and behold, over in the blue corner was SPSS, a big fish in the diminutive Data Marketing Show. They weren’t the only MR-relevant supplier to show up in the other quadrants – some research and fieldwork firms had taken up positions elsewhere too. To the visitor, it was a bit of a muddle.

The Insight Show does have the feel of being on the wane since its heyday, if you listen to the crowd. But then I hear exhibitors moan each year that traffic is very slow and that most time is spent standing around in an excruciatingly expensive way – so pinning down when that heyday actually was proves elusive. This year, it seems day one was busier than day two, when I was there. Yet I can remember being told there wasn’t a busy day at all in past years. Still, the day I was there seemed to be the one when competing sales teams converged on the orange carpet between their stands to chat about who was up to what and complain about the heat.

I had assumed much of the reason for the merged format was that the Insight Show (which used to be big and standalone) was in danger of disappearing altogether, and that alongside the other shows it would find itself in the naughty corner. Not so. The Insight Show was second in size only to the big and bold In-Store show. If the point-of-sale people can’t put on a good show, what hope is there for us research boffins? But it did make me wonder how many people out shopping for illuminated fascias and storefront signage might find some online focus groups coming in handy, or how many looking for a decent panel provider would be wowed by the ‘innovative trolley and basket systems’ on display next door.

Apart from the exhibitors, what was hot in the orange corner? 2009 seems to be the year of online qual. Not only does Visions Live have a very interesting new multilingual real-time and asynchronous (or bulletin-board) product, which has come out of New Zealand and already has a significant footprint in Asia Pacific, but the other newcomers, ebox, seem to have put as much effort into developing qual tools as they have into quant online data collection. It’s all very Research 2.0 – although Itracks, who were also there, would make the point that they’ve been doing online qual since the days when people were still discovering their @ signs. And today I’ve been given a private preview of yet another virtual qualie tool (a very nice one in the making, too) that locates the group experience in a virtual-world paradigm.

Beyond that, software providers are talking seriously about automation – as they have for a long time – but they were also showing me things that were starting to make sense in simplifying tasks and saving time. Centurion have a new web-based interface out for their panel and sampling platform, called Marsc.net, which looked very nice – and they have built in lots of heuristic models for drawing samples for trackers. Intellex Dynamic Reporting had a number of smart new reporting goodies on display to make life easier, and can now go straight out to PowerPoint for report automation. The bright people at Nebu, on the other hand, have simplified the panel set-up process so that someone using their panel solution could create and start populating a new online panel or custom community in just an hour or so – or as long as it takes to create the branding and imagery, in fact – their ‘panel in a box’.
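To give a flavour of what ‘straight out to PowerPoint’ involves under the hood – this is a generic sketch using the open-source python-pptx library, not Intellex’s actual implementation – pushing a results table into a slide deck programmatically looks something like this:

    from pptx import Presentation
    from pptx.util import Inches

    def table_to_slide(title, headers, rows, path="report.pptx"):
        # Build a one-slide deck containing a single results table.
        prs = Presentation()
        slide = prs.slides.add_slide(prs.slide_layouts[5])  # title-only layout
        slide.shapes.title.text = title
        shape = slide.shapes.add_table(
            len(rows) + 1, len(headers),
            Inches(0.5), Inches(1.5), Inches(9.0), Inches(0.4 * (len(rows) + 1)),
        )
        table = shape.table
        for c, header in enumerate(headers):
            table.cell(0, c).text = header
        for r, row in enumerate(rows, start=1):
            for c, value in enumerate(row):
                table.cell(r, c).text = str(value)
        prs.save(path)

    table_to_slide(
        "Brand awareness by region",
        ["Region", "Aware %"],
        [("North", 42), ("South", 57)],
    )

Run something like that once per table in a tracker and the appeal of report automation becomes obvious: the tedium is in the pasting, not the thinking.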

But as I left, I was wondering if someone at Centaur had misheard what I certainly heard last year – that ‘the show would make more sense as a biennial event’ – and optimistically decided to make it a biannual one. Hardly more than six months on was really too soon for this event, and from the visitor’s point of view the show definitely suffered as a result.