Explore Workshop - top tips for user testing
Following our second Explore workshop, Ufi Grants Manager Sarah Axon has some great advice on testing your ideas...
Our second VocTech Challenge Explore workshop (Testing Your Ideas) included a great panel discussion to support potential applicants to the Challenge. Rachel Simnett (Simnett Consulting), Rachel Smith (INTO University Partnerships) and John Casey (Citizen Literacy CIC) shared their wide experience of working with users and explained how testing is essential to the development of a successful project.
You can view the recording of the event below.
User Research
Before sharing the top tips, perhaps the first thing to learn is that it may be better to call it 'user research' rather than 'testing' when talking with users – the idea of being 'tested' may put some of your users off participating, while the word 'research' can be more encouraging.
- “If you build it they will come” doesn’t work! There is no way to avoid doing your homework and engaging with users. Get your foundations right – if your idea is not based on the facts of what users really need, it’s unlikely to be fit for purpose. You can tweak things at a later stage if the foundations are sound; if they are not built first, things are likely to fall over.
- Don’t just test the tech. The key to success is the user experience of your learning solution. If you only ask whether the technical aspects work, you run the risk of missing the point.
- Test at regular intervals. Book it in advance so that you have users lined up and ready. You may not know exactly what you will have ready for testing at that time, but a commitment to regular user engagement is really good practice.
- Use a mixture of methods to gain insights into what your users really think.
- Surveys can be great to get a broad-brush idea, but you can trip up if you don’t know what base of knowledge your users are coming from. Always follow up with some 1-2-1s to check they understood the questions and the context.
- In-depth interviews are great for understanding users more deeply. Think about using a script so you are consistent between users (and don’t overly influence what they say!).
- Observe them in action – either ‘doing’ the thing that the training will address so you can see where they struggle, or ‘using’ the learning solution so you can see how they interact with what you have built. Follow up with questions once you’ve seen what they actually do.
- Try ‘guerrilla testing’ if you’re having trouble getting users to participate. Show up where they hang out. Use a ‘pop up’ booth and invite them in to play. Be inventive in how you go to your users rather than expecting them to come to you. As this call will need you to already be engaged with your users, guerrilla testing may be less relevant here, but it could help you identify other communities of learners to scale your idea out to.
- Test systems as a whole, not just components in isolation. Try to see the whole user journey and offer things up for testing in that context. What might work alone may not work at all when used in a different context. Think about the testing cycle as a journey from hypothesis to validation, becoming more task oriented as you go.
- Be really, really clear on your learning outcomes. What behaviours do you expect to change as a result of doing the learning? What will the user be able to do after they complete the learning that they could not do (or could not do as well) before?
- Think about a ‘theory of change’ model that maps what inputs you make, how you will make them and what outputs will come from those. Use the structure and logic of the model to help you define your learning outcomes.
- Keep the learner front and centre of your thinking at all times. In all of the define/design/build/evaluate steps of the development process, put in place testing that allows you to see how your users respond to the emerging ideas.
- And finally, DON’T PUT IT OFF! If you know there are aspects of your design that you’re not sure about, test early. Test often. Test in different ways. Test on different people. But keep testing until you get some consistency in responses.