What it does
A report automation platform that takes tabular output from almost any MR cross-tab program and transforms it into well-crafted PowerPoint presentations. It works with existing slide decks or will generate new ones selectively, directly from tables.
Ease of use
Compatibility with other software
Value for money
Annual subscriptions from £5,000 for a single user, £5,850 for two named users, £10,500 for 10. Concurrent pricing options also available. Prices include training, support and updates.
- Saves hours when converting tables into presentations
- Greatly reduces the scope for errors when creating presentations
- Shared templates reduce work and allow for a custom look for each client
- Presentations can be updated with new data even after they have been modified in PowerPoint
- No full preview without exporting to PowerPoint
- No undo when editing or making changes
- Windows only
It’s a program that few profess to love, but PowerPoint shows little sign of yielding its iron grip on the boardroom presentation yet. Researchers often feel they are slaves to the god PowerPoint twice over: not just when presenting, but when preparing too, due to the sheer monotonous drudgery of creating each slide by hand – slowly patting figures and graphs into shape, often against the pressure of the clock.
Rosetta Studio automates this process, and does it in a very efficient and comprehensive way. As a tool, it’s been around for over five years now. We reviewed version 1 back in 2005, when it was simple and straightforward to use, but fell short of doing all you needed it to. There wasn’t enough control over style and layout to create client-ready presentations, which inevitably meant doing a lot of finessing on the final output within PowerPoint to get the look right – and left you vulnerable to repeating all that work if you needed to re-run the data.
Improvements since then, culminating in version 3.3, have removed all these limitations. The range of capabilities has, quite simply, exploded. Pretty much any tabular input is now supported, either through built-in importers or by using an upstream mapping utility. Within the tool, there is now fine control over every aspect of output, and a lot of attention has gone into providing ways to prevent anyone from ever having to do anything more than once.
As an example, colouring, shading and chart options are all created and controlled within Rosetta and are not limited to what Excel or PowerPoint can produce. Colours can be linked with products or even applied automatically from the labels in your data for brand names, company names, countries and so on. It eliminates any fight you may have with PowerPoint showing different colours from the ones you had hoped for because of the clumsy way that PowerPoint templates and document themes interact. Instead, all of this is controlled, safely and predictably, from within Rosetta – yet it is still standard PowerPoint that comes out of it.
A very powerful feature of the tool is the template. Templates take a little time to master, but have the advantage that, once defined, they can be used across the whole organisation and shared easily among teams. Using templates, it takes just seconds to build charts from tables. Templates not only apply the styling, but work out what to pick out of the raw table – e.g. just the percentages or the frequencies, and not the total columns and rows.
Not everyone needs to become a template guru. It is not hard to modify them, or adapt them – but if you want to ensure a consistent look, and to control this, they can also be password protected against unwanted or unintentional changes.
There are now three modes of operation: generate, populate and update. In version 1 only generate was possible – this limited Rosetta to the “push” method of production, where you effectively created then exported a PowerPoint from scratch. Generate is ideal for ad hoc work, but not much help for any kind of continuous or large scale production.
Populate mode introduces the alternative “pull” method, where you can take an existing PowerPoint and link it to tables within Rosetta Studio by going through the PowerPoint document, stripping out the variable content and replacing it with a tag that will pull in the relevant data from Rosetta. Tags can pull rows, columns, individual cells, tables or sections of tables, and are to some extent smart – e.g. you can pull the first row, the second from last cell, or the column headed ‘females’. ATP’s implementation is delightfully simple, though it does take some effort to get your mind around the process. But it is ideal for large-scale report production on trackers, where many similar reports are produced on different cuts of the data, and the suite provides some batching and automation modules for power users that go beyond the desktop interface.
Even more ingenious is the new “update mode”, which achieves a kind of magic with PowerPoint. There is nothing to stop you from going in and making extensive changes to your presentation and still being able to update it afterwards – for example, because you had to remove a case and re-weight the data. Rosetta invisibly watermarks each chart it produces and uses these hidden tags to identify each value correctly and substitute the updated value. It’s very clever.
All this increased sophistication does come at a cost, however: the price has nudged upwards, and so too has the investment you need to make in learning it. This is not something you can pick up for the first time and guess your way through. ATP Canada encourages all new users to take its two-day training course before getting started, by including it ‘free’ within the licence fee. The program is reasonably intuitive once you have got to grips with the fundamentals, though it’s a pity that you cannot get a better preview of what you are doing within the Rosetta interface – you only see it properly when you generate the PowerPoint. And if you find yourself going down the wrong track, there is no real undo capability either, which is a shame.
Producing presentations is complex and, without something like this, very time-consuming. Speaking to actual users, it’s clear that they not only find learning it an investment worth making, but that they soon wonder how they ever managed without it.
Customer viewpoint: Leger Marketing, Canada
Leger Marketing is a full-service research company with nine offices across Canada and the United States. Here Christian Bourque, VP Research, and Patrick Ryan, a research analyst, at Leger Marketing speak of their experiences with Rosetta Studio.
CB: “At the time we were looking for something, we felt most automation software was aimed at large trackers. About eighty per cent of our work is ad hoc. We needed something where the up-front time would be quite small. Rosetta Studio seemed to be better designed for ad-hoc research, certainly at that time.”
PR: “Now, nearly every one of our quantitative projects runs through Rosetta at some stage. We’d even use it for a four question omnibus study, where it is still faster than creating a report slide-by-slide. It means the bulk of our time is no longer taken up with the work of creating a report. The focus is now on analysing and interpreting the data.”
CB: “Once you have spent a little bit of time devising your own templates, you will save 60 to 75 per cent of your time analysing the data.”
PR: “Something that would take four days or a week to put together is now taking us one or two days.”
CB: “It not only saves the analyst time, but you also need to consider the quality review perspective. We used to do a line-by-line review. Now, because it is automated, this is no longer necessary. It’s a load off our shoulders. It means we can spend more time improving the quality of the insights. We also find we can include more complex cuts of data in the report that we would not have had time to do, beforehand, like that little bit of extra multivariate analysis.”
PR: “Something we like a lot is the flexibility it gives you to try different things. You might be creating a set of graphs and you realise it could be better presented another way. Now the hassle of changing your graphs or charts isn’t such a big deal. It takes you two seconds.
“It takes two days to learn, though the basics can be covered in a morning. It is fairly intuitive. We have a couple of reports where the analysts use the tagging. The interface is the same but the logic is different. You have to get your mind around how to use and place tags, but once you have done one it is fine. It’s actually very simple.”
CB: “We like the flexibility it provides from a look and feel point of view. We can have different templates for different companies. Many of our clients have a corporate library of anything they generate, so when it circulates on the client side, it needs to look as if it’s their document.
“This is something we introduced to add value, not to reduce staffing. It’s the nature of our business that you constantly have to be faster than the year before. The demand on time is extreme. This is one of the ways we’ve been able to meet that challenge, while improving quality. And the other major demand is for better insights, and this is one of the tools that allows us to do that.”
A version of this review first appeared in Research, the magazine of the Market Research Society, February 2010, Issue 524
We will only know whether we were at the turning point of an L, a W or a U when looking back. Nevertheless, my own unscientific poll of firms I’ve spoken to in the past few weeks confirms that, although the recession has been acutely felt by MR technology providers, things seem to have been looking up slightly since people got back to work after the summer. Some tech firms have been busy, even very busy, and some have continued to grow despite the downturn. Being inexpensive or on a short track to adoption seems to help here. Another factor seems to be the needs-led solution: an agency client needs a custom panel, a web-based analysis tool, a dashboard – reactive rather than strategic purchases.
At the same time, others have been putting on a brave face, weathering the storm and continuing to develop their products. Exhibition organisers are going to have a tough time of it next year. Tech providers are wincing at the costs of going to the big shows at a time when very few are buying. For one, just the charge for electricity levied by the venue was sufficient to wipe out all profit.
Even the firms that have remained busy are reporting that it is taking much longer to close the deal. People will talk for 18 months or longer about a £25K order but it never seems to materialise. Others find the orders they do land have been scaled back considerably from what they were asked to bid for.
It makes me feel that MR firms are still not approaching their technology from a strategic point of view. As I reported in June, several research companies at CASRO Tech were seeing a slowdown in work as being the opportunity they needed to get their processes and tools in order for when life got busy again. Is this opportunity being squandered?
There are many tales out there of firms never coming to a decision, seeing almost everyone and rejecting them all, or holding virtually an annual review and still sticking with the same set of ageing or complicated tools that require high levels of skill and effort to operate. This is symptomatic of technology decisions being delegated down to those in the organisation who are perceived to understand them: unfortunately, it is often those with the greatest investment in being indispensable masters of the dark arts who hold most sway. Perhaps they did reach the right decision, though I’m often unconvinced – and even when they did, I doubt it was for the right reasons.
Technology was an aspect of this week’s Insight Show that the exhibition’s promoters were majoring on, yet on the ground the turnout of technology providers was thinner than ever – I found just 13. Who was there? End-to-end mixed-mode providers were represented by Askia, Confirmit, Merlinco, Nebu and Snap, plus online specialists Itracks and the newcomers on the block, ebox software. The niche providers were represented by E-Tabs (a niche maker in their own right for report automation), Centurion and Cint for panel management, Intellex Dynamic Reporting for interactive analysis, OnePoint for mobile data collection, Think Eyetracking for, well, eye tracking, and Visions Live, a new qualitative research platform – plus, rather strangely, a presence from Panasonic, featuring their Toughbooks as a rugged CAPI device.
Part of the reason for the shift of the Insight Show from the back end of the year to the middle (last year’s show was barely seven months ago, in November) was to merge four of Centaur’s marketing-related shows under one roof, where they were colour-coded and branded as MarketingWeekLive! Insight was in the orange corner. But lo and behold! Over in the blue corner was SPSS, a big fish in the diminutive Data Marketing Show. They weren’t the only MR-relevant supplier to show up in the other quadrants – some research and fieldwork firms had taken up positions elsewhere too. To the visitor, it was a bit of a muddle.
The Insight Show does have the feel of being on the wane since its heyday, if you listen to the crowd. But then I hear exhibitors moan each year that traffic is very slow and that most time is spent standing around in an excruciatingly expensive way – pinning down that heyday proves elusive, even illusory. This year, it seems day one was busier than day two, when I was there. Yet I can remember being told there wasn’t a busy day at all in past years. Still, the day I was there seemed to be the one when competing sales teams converged on the orange carpet between their stands to chat about who was up to what and complain about the heat.
I had assumed much of the reason for the merged format was because the Insight Show (which used to be big and standalone) was in danger of disappearing altogether, and that alongside the other shows it would find itself in the naughty corner. Not so. The Insight Show was second in size only to the big and bold In-Store show. If the point-of-sale people can’t put on a good show, what hope is there for us research boffins? But it did make me wonder how many people out shopping for illuminated fascias and storefront signage might find some online focus groups coming in handy – or how many looking for a decent panel provider might be wowed by the ‘innovative trolley and basket systems’ on display next door.
Apart from the exhibitors, what was hot in the orange corner? 2009 seems to be the year of online qual. Not only does Visions Live have a very interesting new multilingual real-time and asynchronous (or bulletin board) product, which has come out of New Zealand and already has a significant footprint in Asia Pacific, but the other newcomers, ebox, seem to have put as much effort into developing qual tools as they have into quant online data collection. It’s all very Research 2.0, although Itracks, who were also there, would make the point that they’ve been doing online qual since the days when people were still discovering their @ signs. And today I’ve just been given a private preview of yet another virtual qualie tool (a very nice one in the making too) that locates the group experience in a virtual-worlds paradigm.
Beyond that, software providers are talking seriously about automation – as they have for a long time – but they were also showing me things that were starting to make sense in simplifying tasks and saving time. Centurion have a new web-based interface out for their panel and sampling platform, called Marsc.net, which looked very nice – and they have built in lots of heuristic models for drawing samples for trackers. Intellex Dynamic Reporting had a number of smart new reporting goodies on display to make life easier, and can now go straight out to PowerPoint for report automation. The bright people at Nebu, on the other hand, have simplified the panel set-up process so that someone using their panel solution, could create and start populating a new online panel or custom community in just an hour or so – or as long as it takes to create the branding and imagery, in fact – their ‘panel in a box’.
But as I left, I was wondering if someone at Centaur had misheard what I certainly heard last year – that ‘the show would make more sense as a biennial event’ – and optimistically decided to make it a biannual one instead. Hardly more than six months later was really too soon for this event, and the show definitely suffered as a result from the visitor’s point of view.
An interesting lunch with B, who is VP of a research software provider, visiting London. “So, what are the changes you see in research software?” he asks, and I find myself answering at some length about the changes I don’t see happening, and how unambitious research companies are when it comes to using technology to move the research process on. We both agree that too many research firms are timid with their research software decisions: perhaps there are too many vested interests in retaining the status quo.
We have both been in the industry a long time, but we are both still surprised by how uninterested many rank-and-file researchers are in the data. So many seem content to allow others to push the buttons, rather than get their hands dirty with the actual data. We swap stories of surveys we have seen designed for the web which are just paper forms, with no understanding of the whole context of doing research online. Again, it is the technicians who are left to bridge the gap between intention and action. We wonder whether this goes some way to explaining the ongoing reluctance of research companies to automate through better use of technology – so many of the decision makers probably have only a hazy grasp of the actual wastefulness of many of the processes that are still commonplace. We think of the reality of coding, of cross-tab production, of chart preparation. I mention the reluctance we uncovered in many CATI centres to introduce predictive dialling technology, where there can easily be a six-month ROI and a hike in profits thereafter (Confirmit MR Software survey).
I think back to the Online Research Conference the previous week: the subtitle of which was “cheaper, better, faster” in reference to what the research industry perceives as being the drivers from their clients (and the hope that the conference speakers might be able to provide some survival tips and thereby pull in an audience). The event was extremely well attended, yet speakers and questioners repeatedly challenged the placing of “cheaper” in the title. “Cheaper” should not be the goal, they asserted, even though there was constant pressure to bring down costs. “Better and more efficient” is the public ambition of the industry, according to the conference attendees.
But those at the conference are clearly not a representative sample of the research industry as a whole. Those fixated on cost don’t do conferences. Those fixated on cost seem content to keep cranking the same handle – squeezing more product out of the same tired production line. It was not a strategy that resulted in success for much of the automotive industry – it proved disastrous for GM, for instance.
It is not a perfect analogy. The automobile industry is not greatly threatened by customers going off and building their own cars. Research is expensive and DIY survey tools are cheap, which makes professional research vulnerable at times like these. We do need to talk about cost, and we need to look to better technology to reduce cost by changing the process and making research inherently frugal. The problem is that there are too many gas-guzzling SUVs on offer from the research industry at a time when customers are seeking more frugal hybrids. And what is a threat to some is always an opportunity to others, especially those that get clever with the technology.