One of the problems companies face in trying to become more “UX-aware” is getting everyone on the team to actually talk to their customers, and, more importantly, to listen to and observe them.
I recently helped an online insurance broker that was struggling with exactly this problem. They were already on the right track – last year they successfully shifted to true Agile development and were busily integrating many Lean Startup methods into their business. But they were still a long way from being a company where everyone really “gets it”.
Previously they’d outsourced their user research to agencies, conducting sessions whenever time, resources and development schedules permitted. But this approach wasn’t having the impact they needed. Not only was traditional research slow and expensive; it was often seen as being “owned” by the marketing department or the product manager, not by the team as a whole. Developers and designers felt that customer research was something that happened “over there”, not something that was part of their job. Findings often seemed tactical, abstract and remote from their actual work; worse, they often arrived too late to be of use.
Over the years I’ve developed some tools to help companies deal with these problems. One of these we came up with while I was at Amberlight Partners (a fantastic UX agency here in London). It brings solid user research into the product development process without costing a lot or slowing teams down. But more importantly, it actively involves the whole team in gathering data, analysing results and agreeing subsequent actions based on user evidence. So everyone participates in the research, everyone is actively involved in watching and listening to users on a regular basis, everyone agrees outcomes. This is why I call the method “Active Testing”.
So why is this different? Well, the first thing about Active Testing is that it involves your whole team. Everyone working on your product should participate, every time you test. Front-end developers, back-end developers, visual designers, industrial designers, UXers, systems people, product owners: everyone. Participating should be as close to mandatory as possible in your team. If you’re practising Agile, it may help to think of it like Stand-ups or Iteration Planning: something that you all just do as part of the way you work.
The second important thing about Active Testing is that it happens regularly and it happens often – one day every two weeks is ideal; once a month is really the minimum to be effective. The point is that it should become routine. “Every second Thursday is Active Testing day” is a good approach.
And the third thing about Active Testing is this: you test what you have available on the day. In other words, you don’t build stuff just to test, you don’t spend ages planning and fretting. You just decide, the day before testing, what you can test and then you go ahead and do it. Test what you’re working on now, or what you’re thinking of building next, or something you sketched up yesterday but aren’t sure about pursuing. If Active Testing day comes around and you *really* have nothing, just test your existing site. Or a competitor’s. Or another product your target market uses. In all cases your team builds up valuable knowledge about your customers and their behaviour that you didn’t have before because you’re all listening to customers on a regular basis.
Problems & Challenges
The first challenge is persuading the whole team to commit to a full day of Active Testing every two weeks. This can be hard. Developers, in particular, often feel that if they aren’t coding, they aren’t really working. But this can be overcome. Getting the whole team to see how customers respond to their work is a powerful way to change attitudes. No-one wants to build stuff that customers hate or don’t need, least of all developers. Adopting an Active Testing routine helps prevent this type of waste. People start to understand that discovering problems early means less rework later, and that a single day of learning every couple of weeks more than pays for itself over time. And as your team learns about customers, they naturally start to prioritise work that improves the customer experience, avoiding resource squabbles.
Another tip for dealing with this type of resistance: In the discussions between test sessions, really try to engage the developers and designers – get them to pitch in and explore the problems you’re seeing. Then, at the end-of-day discussion, get them brainstorming solutions that they can work on right away. That way they see the benefits immediately. (If you’re really struggling and resources are tight, consider dropping from five test sessions a day to three, and just do a half-day every two weeks. Very few people can’t commit to four hours every two weeks. And one final trick: Order in a great breakfast and a proper lunch. You may magically see more willingness to get involved.)
The second big challenge is staying true to the “test what you have” mantra. I often find teams drifting towards creating stuff to test, rather than deciding what to test the day before based on what’s available. Guard against this habit. The only people who might routinely be producing stuff “just to test” should be UXers building clickable prototypes to test before putting them into development, and product managers who want feedback on new concepts.
Active Testing is a mindset as much as a method. Persuading the whole team to watch customers using our products was a challenge, but it’s been a revelation. There have been two main benefits. Firstly, I’ve noticed that discussions about future features now focus much more on “what we saw customers doing in testing” than on “how I think this feature should work”. And this reflects the second, and to my mind much deeper, benefit: the realisation that creating a great customer experience is everyone’s job, not just the UX team’s. This is a huge step forward.