There are plenty of options out there for online discussion platforms, each with its own pros and cons, advocates and detractors. When Covid struck earlier this year, Agfora needed to switch two important qual projects from face to face data collection to online – ‘from bricks to clicks’ – and for this we trialled two very different research platforms. We thought we would share our experiences with you.

Project 1:

The original plan was to run two face to face workshops among owners/managers of small businesses, one in London and one in Glasgow. We were interested in their attitudes and behaviours, their openness and barriers to change, and how best to overcome such barriers. For this project we chose to work with Web Creator Suite.

What we liked about this platform/approach:

  • It is a very user-friendly platform (a bit like Facebook) – participants threw themselves fully into the process from day 1
  • It is easy for participants to see each other’s comments and to react to them (possibly more so than using the second platform)
  • Key words and illustrative quotes could easily be highlighted using colour-coded highlighters, allowing us to group together key issues such as ‘communication’ and ‘concerns’ for easier analysis later

What we didn’t like quite as much/could do better:

  • Honestly, there was nothing very obvious or memorable to improve. The ‘helicopter view’ of all the comments simultaneously can feel a little overwhelming at first, but as you get to know the tool you find ways of filtering what you see and, of course, when the discussion is complete you can export the full transcript to an Excel file to ‘slice and dice’ at will.

Project 2:

For the second project we needed to test a series of poster designs and audio messages among consumers. We decided to use Incling for this project, making full use of their ‘Concept Evaluation tool’ which incorporates a pinning/heat map exercise that generates detailed insights on likes and dislikes.

What we liked about this platform/approach:

  • The platform looks rather more comms-oriented than Web Creator Suite
  • Participants liked using it
  • We were a little unsure about the heat maps* to start with, but in the end, they provided really useful insight into the strengths and weaknesses of each concept. Incling works very well for commentary on advertisements and other visuals
  • For this project, we asked our participants to upload a short (up to 2 minute) video* before signing out on the last day, summarising their views on the campaign. It was great to be able to put a face to the ‘voices’ we had been interacting with over the week and to hear them summarise their opinions in their own words

*Note: Web Creator Suite also offers this functionality, but we didn’t use it in project 1

What we didn’t like quite as much/could do better:

  • The highlighting and tagging function could be improved – a bit clunky
  • Some of the user instructions come as standard and can’t (yet) be edited
  • The layout of the platform, which is attractive but rather image heavy, means there is less space to view individual responses on the same page, and more scrolling back and forth is necessary
  • Encouraging interaction and cross-commenting was more of a challenge with this platform, although we got around it in the end by setting cross-commenting up as a specific end of day task. Still, not everyone took the time to cross-comment, and I’m aware from conversations with other qual moderators that this seems to be the main ‘issue’ with asynchronous qual in general

Our conclusions:

  • Overall, our experiences have been very positive, and we will certainly consider both platforms (and others) for future qual projects. The people we worked with on all sides were incredibly helpful and willing to make all the changes and tweaks required. We felt supported all the way through.
  • There are some obvious practical advantages to online methods vs face to face – no one has to leave their home or office, for starters, which also makes online qual great for overcoming geographical spread – the only potential barrier is access to decent broadband. Asynchronous qual also means there is no rigid time schedule for either moderators or participants – you can log in at a time that suits you.
  • Asynchronous online discussions certainly allow you to ‘sweat your resources’, with each person recruited completing all the tasks set, generating very complete and insightful data.
    • This isn’t the case with your typical face to face or live online focus group, where the airtime available to each participant is limited to the group duration divided by the number of participants – typically 8 people over 2 hours, so around 15 minutes each.
  • Our only regret with the asynchronous discussion approach is that it doesn’t allow for the same level of interaction between participants, and it is impossible to pick up on body language and tone of voice, which can be so revealing in face to face focus groups and workshops. However, this can be overcome to some extent with video tasks.

Our 8 Top Tips for any asynchronous online qual project:

  1. Recruitment: As with any qual project, recruitment is everything – and we can’t recommend highly enough the great work provided by Plus4, who recruited our participants for both projects in very challenging circumstances. So, choose your recruiter wisely and agree up front what their role should be in managing any participants who fall behind on their tasks. A few of ours needed encouragement to get/keep going, and the recruiter provided this.
  2. Topic guide/daily tasks: Less is more – keep it simple. Asynchronous discussions generate huge amounts of data (think multiple depth interviews with a little group interaction). When designing your discussion protocol or daily tasks, stick to the essentials and keep the ‘nice to haves’ to a minimum. Avoid setting too many tasks or sub-tasks. If necessary, distinguish between obligatory and optional tasks. Break everything into ‘bite-size pieces’ so that the time demand on participants is roughly the same every day.
    • Be very clear about what you expect of your participants
    • If there are multiple parts to a task, ask them to copy/paste the question and provide a response against each part to avoid confusion over which question they are actually answering
    • Ask them to clearly identify what it is they are referring to in their answers e.g. poster A, B or C
    • Be very careful to avoid repetition/overlap between tasks to reduce participant fatigue.
  3. Planning: Once the tasks are signed off by your client, allow plenty of time to set up, review and test the protocol before going live. Make sure your participants understand what the cut-off time will be for completing each task and what the consequences are of missing the deadlines. Launch each day’s tasks early in the morning to cater for the early risers. Allow lots of time for analysis – as mentioned before, asynchronous qual delivers huge amounts of data.
  4. Pilot: Before launch, ask someone who isn’t a research expert and who isn’t too tech savvy to test the tasks for clarity, duration and overall user experience so they can flag anything that isn’t clear. Ask them to record how long each task takes to complete. Carry out a multi-device test too (laptop, tablet, mobile phone).
  5. Tone of voice and building rapport: Upload a welcoming video introduction message – let the participants know that there is a real person interacting with them throughout. Adopt a friendly and empathetic tone of voice both when setting the tasks and when moderating – as you would in a face to face discussion. You will be amply rewarded both in terms of participant engagement levels and through positive feedback at the end of the study.
  6. Moderation: If the protocol is well designed, and each day the tasks and instructions are crystal clear, you can focus your moderation time on building rapport, probing for more clarity on responses and encouraging greater interaction between participants.
  7. Videos: At the end of the week (or each day), ask your respondents to upload a short video of themselves summarising their feelings about the topic – this can add huge value to the debrief by really bringing the insights to life for your clients.
  8. Analysis: Most platforms seem to offer highlighter and tagging tools, which can be used for filtering on key topics or creating word clouds, and you can export the transcripts straight into Excel for analysis. That said, we really liked being able to scroll through the responses on the live boards, copy/pasting illustrative quotes and writing daily summaries. These are really helpful when writing up your more detailed report at the end.

All in all, we would strongly recommend experimenting with these platforms. I know that Agfora will continue to use them as one of our key research approaches and as a useful precursor to quantitative work. Such online platforms are life savers at this time, allowing us to reach a targeted public and obtain first-class information in a way that is safe and secure.

Sally Alsop – MD of Agfora