Category Archives: UX methods

Lost and found in the design process

Which is better: Having courage to explore the unknown, or a map to help you find the way?

When heading out on a design journey, I’m beginning to think courage and a good compass are better than a map.

A year or two back I tentatively drew a map of the design process for a specific client and their design challenge…

It became a 2m long poster to help a team of healthcare professionals new to design see what lay ahead, and refer to as they progressed towards their goal. A ‘you are here’, ‘look how far we’ve come’, (and sometimes ‘oops, we missed that bit!’) kinda thing.

Yet another design process map. Obsolete the moment it was printed.

I enjoyed pulling this together, thinking about the likely journey ahead for this team, reflecting on my own experience of design, and building on top of classic frameworks promoted as ‘best practice’ by the likes of IDEO and the Design Council, as well as what I’d learned during my time with the Stanford d.school as part of the Better by Design program.

The cynic in me believes these design process diagrams were likely derived from sales tools for design agencies, each claiming to have a point of difference and perhaps some ‘secret sauce’ the others hadn’t discovered… but their 3, 4 or 5 step bubble diagrams have become ubiquitous to the point they are almost redundant.

Semantics aside, they all promise gold at the end of a rainbow if you’re prepared to challenge your thinking upfront.

Other metaphors are funnels, diamonds, snakes, vortexes, washing machines, the list goes on, but essentially they describe a few phases of lost and found with a bit of loopback before finding that gold.

They make nice visuals and the theory seems sound enough, but when you start to actually apply one of these to a live project you realise how futile it is to try to map the design process.

A sequence of steps is convenient and tidy but the reality of design is more like this:

… and there’s no ‘one squiggle fits all’ here either…

Maps and guidelines give us comfort. They can provide a sense of shared understanding of where we are going and what to expect…

… but the reality of most meaty design projects is that we don’t know where we are going, and we need to find a new kind of device to help clients feel comfortable with the unknown.

What makes the journey-maker comfortable with the unknown?

Is it time to throw away these maps and find a design compass?

Yes, it’s another metaphor, but perhaps it’s the designer’s job to be that compass. Something the client trusts to navigate through the messy reality.

Designers are comfortable with the ‘lost’ feeling, because we’ve ‘been there done that’ and believe in great outcomes based on our own experiences.

So how do YOU convince a nervous client you know where ‘north’ is?

Do you roll out a map, or are you the compass?

I’d love to hear…

To uncover the story, first lose the script.

A while back I clocked up my 1000th interview. This got me thinking about how much my approach has evolved over the years.

Interviews with customers / end users of products and services are often the foundation of my research.

In the earliest projects I’d work from a page or two of questions all lined up in advance, in the shape of a ‘script’, or discussion guide. These were questions I’d literally recite to each participant. Sometimes these had been contributed to, signed off by, or even provided by the client.

I’d been told I should ask the same questions of all participants to maintain consistency, but I found it awkward to work to the script, and at times felt like I was only hearing half of the story from the subject.

Over time, I found the questions I asked in response to the answers revealed more than the questions on my script, so I developed a more conversational approach.

Sounds like a convenient way to take the effort and rigour out of the process, but it doesn’t make it any easier.

Whilst interviewing, you’re running a mental cache of what’s been said, where you need to take the conversation, how much time is left etc. …and all the while you’re trying to make the participant feel like the conversation is following natural twists and turns, rather than being steered by you, the interviewer.

There are plenty of techniques to learn in the craft of interviewing: building rapport, non-verbals, open-ended questions, asking ‘the 5 whys’, repeating their words etc. In my book – USERPALOOZA – A Field Researcher’s Guide – I cover the three types of question I use in every interview: starting points, prompts and qualifiers.

These techniques, combined with your curiosity, will get you only so far. …But they are not enough.

When clients ask (and they still do) “So, what are the questions you’ll be asking them?” …

I explain:
When it comes to asking the right questions, there is no substitute for actually wanting to know the answer.

Instead of a script, I agree on a set of objectives with the team. This describes the ground we’d like to cover during the conversations and reads like a list of topics around which we’d like to learn.

Some of these might be framed as questions, but it’s far from being a ‘script’.

As an interviewer, you need to truly understand the context and objectives of your client / project sponsor:

It all starts with a set of questions I need answered in my own head before I begin planning the interviews…

  • Where is the business and product at in the development process?
  • Why is this the right time to conduct the study?
  • Which aspects do stakeholders agree / disagree on?
  • What assumptions exist about the market, end user or value of the product to end users?
  • How will the client measure market success for the product / service?
  • How will the research be used, by whom?
  • What design decisions do the team need to make based on the insights you uncover?
  • Why are we including these types of participant in the study?
  • Which areas does the team have enough insight about already?

This goes beyond the due diligence of taking the brief, scoping the study etc.
It’s a deep understanding of the business, product and design context and should be embedded in your curiosity.

The flow of the conversation and lines of questioning should all come naturally if you’ve built this level of empathy for your client’s position.

In the end it’s about user-centred design – the user of the research is your client, so you need to understand your end user’s needs to be able to design the product (the interview structure) that gives them the best outcomes. In this case, rich and useful insights.

Design thinking – One size doesn’t fit all

“It’s not about going from left to right and some magic happens on the other side, it’s about understanding the intention”.

…said the Director of Design Innovation at Intuit, a stop on the Better by Design Study Tour I was part of in 2012.

During our visit, Intuit shared how design thinking changed the culture – and eventually the profits – of the company.

Their story: how they tried and failed to install ‘design as a process’ in their teams, before arriving at a more engaging and successful model of ‘principles’.

A couple of key points I took away:

Design thinking – to the rescue?
With a history of incremental usability improvements but no real innovation, Intuit took a big swig from the design thinking cup. Their aim was to integrate design thinking into the business, to be more user-focused – exceeding customer expectations, rather than just meeting them.

Things didn’t go as planned…

“We made a mistake in that we started with design thinking as a process. When we brought a process back to Intuit, they puked all over it. Because in a culture where product development has an agile process, marketing have a go-to-market process and legal have a compliance process, they couldn’t reconcile the design thinking process on top of theirs, so they did nothing”.

So, the conventional design thinking process wasn’t flexible enough for the realities of their culture and practice, and it simply wasn’t engaging teams.

Principles, not process.
To replace the rigid ‘process’ approach, Intuit arrived at three principles to underpin all development work – teams could use whatever methods they liked, so long as they adhered to those principles.

So, how did that go?

“A seminal moment in our journey was when we took the process of design thinking and made it into principles. It’s not about going from left to right and some magic happens on the other side, it’s about understanding the intention behind these principles, then you can make it your own”.

“We have watched teams come up with their own methods and tools at any time in the process and it totally works. That’s when we saw uptake in the culture, when we started to see behaviours change, when we gave them permission to make it their own”.

Here are their principles:

  1. Deep Customer Empathy (Know your customers better than they know themselves)
  2. Go Broad to go Narrow (Quantity of solutions, then focus)
  3. Rapid Experimentation with Customers (Prototype, test, iterate)

Sounds like a win to me, especially if this has taken hold in an organisation of thousands.

So, how do Intuit involve customers in their design process?
To deliver on their first principle (beyond having a dedicated UX team), Intuit has committed to getting their teams out of the office and into the context of their customers using their products. Unsurprisingly, this has proven to build empathy for the customer and, as a positive by-product, engagement with the ongoing design process.

Two big wins.

“We went from listening and fixing problems, to watching to find what they really need but can’t tell us.

This changed the way the organisation makes decisions by watching people’s behaviours versus listening to what they say”.

If you’re wanting to institutionalise design thinking, there’s a video on the Adaptive Path website which tells this story from another Intuit insider’s viewpoint: http://youtu.be/HrxD_BaZlcU

All-hands-on-deck …for rapid user insights

Taking notes from the user’s point of view…

This year I’ve surprised myself by recommending some super short approaches to user research.

When there’s no time, money or buy-in for a ‘full noise’ project I’ve been running a 2 day process where I put my clients in the research seat as they work together to make their own observations, draw their own conclusions and insights.

It felt risky and compromised at first, but it’s working out well so far.

Here’s how…

(Once the objectives and scope are nailed down)

  • I invite stakeholders to attend and observe interviews with customers.
  • I set the stakeholders up to take notes.
  • Then I facilitate interviews with paid participants.
  • Between sessions we gasp for breath and I draw out the top-of-mind observations from each stakeholder.
  • After the last session, I guide them through a hands-on exercise where they match and group individual observations into themes.
  • Together we agree on what these mean for the design/business and prioritise them into an action list.

This is a collaborative, intense and compressed way to work but has massive value to the client. … even if you are exhausted at the end of it.

Some things I’ve learned from working this way:

PLANNING:

Critically, this requires time investment and commitment from the stakeholder team – be crystal clear from the start that this is totally a ‘get out what you put in’ scenario. Participation is required if the client is going to see value.

It’s best to have a mix of stakeholders involved: different parts of the business, levels of seniority, familiarity with the product and market etc.

I can’t imagine doing it justice with fewer than 3 stakeholders.

Try to make this an off-site activity to minimise distractions.

Make sure food for them and you is arranged in advance. The sessions will be almost back to back so there will be no skipping off to lunch.

Recruitment – You should consider all-day ‘standby’ participants in case of a ‘no-show’.

THE SESSIONS:


Stakeholders need a strong briefing around observation. Reinforce that it’s a team effort; several stakeholders observing the same behaviour can take different meanings away – it’s all valuable.

Keep note taking physical and portable (paper / sticky notes).

Don’t be precious about format; what matters most is that notes are actually taken, not how.

Suggest notes are written from the customer’s point of view. This helps the stakeholder to think through what they are writing, and these ‘quotes’ really come to life during the analysis.

For a usability-type project, you could have a sheet of paper for each participant with three columns – Where, What and How: where was the customer, what did they say/do, and how does it impact their experience?

Pinning the objectives up on the wall can remind observers what they are looking for.

Start a ‘discuss’ list and encourage observers to add items as they come up rather than talking through the session.

You need 5-10 mins between each session to conclude what was learned, what was confirmed etc. Asking each stakeholder to write down then share their ‘Top 5’ observations works well.

AFTER THE FINAL SESSION:

Aim for a 2-hour analysis and wrap-up.

Collate all the notes and get them up on walls, grouped by customer, topic etc.

Have everyone spend time (10-15 mins) scanning the data and writing down what they feel are key observations. Go for quantity. 100 is a good start.

Go for some sort of ‘KJ’ collaborative analysis to group individual observations into themes. Name each theme and note what it means for the product and customer.

Roll this into a prioritisation exercise by ranking / voting, plotting on a scale etc.

OUTCOMES FOR THE CLIENT:

Making decisions based on first hand observations is a powerful experience.

Getting answers in hours to questions which have been hovering for weeks is a liberating feeling for clients.

Clients arrive at conclusions, reach consensus and create the output together.

This approach can also show the client it’s something they can do themselves.

… and of course, questions emerge which they didn’t know they needed to answer.

Suddenly… where time, budget and buy-in for customer research was lacking… it miraculously appears!

I was nudged over the fence into taking this approach by Dana Chisnell, so thanks Dana for the nudge!

I’d love to hear other people’s experience with this…
In another blog post I’ll tell you how it goes when you send the stakeholders out into the field to do their own research.

Ethno-unpacked – A design research toolkit.

Design research toolkit (updated Feb 2016)

Every band needs a manager and a ‘roadie’. The manager books the gigs; the roadie sets the stage so the band can focus on playing the gig. Between them, they’ve usually got a big truck full of kit, and lots of gaffer tape.

With design research (contextual inquiry or ethnography, if you like), there’s a fair amount of planning and kit required too. When running in-home interviews I need to play both manager and roadie roles, but isolate these activities as much as possible from my role as researcher.

Every minute spent with a customer is valuable, so I can’t afford to be distracted by practicalities like recording equipment and timings.

After a few years of experimenting with these practicalities I’ve arrived at a ‘toolkit’ of things in my backpack, so when I pull up at the customer’s house the ‘roadie’ can take a back seat and let me get on with capturing the insights.

Here’s what’s in my bag:

The contents of my bag when I hit the road on an ethnography / contextual inquiry / design research project

1. Discussion guide. I try to keep this to a one-pager with topic areas rather than ‘script’-like questions. I have the research objectives embedded in my curiosity, so by the time the first interview kicks off, this serves as prompts only. As you can read in my article “To uncover the story, first lose the script”, I’ll be completely free-styling after the first few interviews.

2. Livescribe Pen & Paper. Records every word and lets you playback what was said when you took notes or sketched. Here’s a detailed article about how I use a smartpen to free my mind and eyes during user research.

I tape spare ink refills to the book, as they run dry with no warning after about 50 pages. I use the display on the pen for timing – it’s less obvious and distracting to check the time on here than glancing at your phone. If a subject seems interested in the pen (or any technology you use), take the time to explain what it does and why you use it; this removes the distraction, so you can get on with it.

When I can’t use the pen, but know I need to record, I use ‘Highlight’, a great iPhone recorder app with the ability to add ‘moments’ just by tapping the screen… It’s very discreet. Olympus and Sony voice recorders also let you ‘highlight’ moments in recordings, but they are not so discreet.

3. Video camera.
I’ve tried many, many cameras and always come back to a handycam with accessories.
Everything else has compromises in battery life, audio quality, zooming etc.

I use a Sony, with a stack of SD cards and a Sennheiser shotgun microphone – I find audio is more important than video quality.
A wireless lapel mic is essential when you want to cut out all the noise except the person you’re interviewing, or when you’re in a sensitive context (like a hospital ward project I worked on) where the subjects may tend to whisper. I use a Sony ECM-AW4.
I also have a beast of a battery on there, which can do a whole day of fieldwork on one charge and has made #4 below obsolete.

4. Extension cord. I used to carry this but opted for bigger batteries for the video camera. They are pricey, but essential.

5. Tripod. I carry a GorillaPod SLR tripod with a Manfrotto ball mount and quick release; it’s compact, instant to set up and perfect for tabletop work. When you need to ‘walk and talk’ with someone, it’s a snip to just grab it and use it as a handle.

When I need the camera to be further out of the conversation, I go with an entry-level Sony tripod. It’s discreet, smaller when folded than higher-quality ‘mini’ tripods, and goes up to about 1.2m high. I set this up with the same quick-release mount for the camera so there’s no screwing things on and off while you’re with the participant. This comes in handy.

6. Laptop. I use this immediately after sessions to type up my reflections while they are still fresh. I always drive a bit down the road first… best they don’t see you frantically typing about them from behind their curtains.

7. Schedule. Who, When, Where and sometimes demographics; age, segment, occupation etc. I usually have a pared down version on the dash, but the full version stashed away in case I need phone numbers etc. This is a part of how I maintain my curiosity.

8. Map. As well as Google Maps I try to have a hard copy with all participants located, named, numbered and timed. This comes into its own in a city you’re unfamiliar with, when there’s a change in the schedule and you need to know whether you can actually shoehorn in a replacement participant and make it from A to B in the timeframe.

9. Cables, chargers etc. Including 12V in-car USB for boosting phone and livescribe pen while driving.

10. GPS / Satnav. Yes, I can use my phone, but sometimes I prefer a dedicated tool for the job, leaving my phone free for other things. I input all the addresses with names the night before, so when I arrive, the satnav will tell me ‘arriving at Dave’s, number 26’. Geeky, I know, but this really helps.
Yes. I make sure I delete all this data before handing it back to the rental co.

11. Smartphone. I use the alarm clock for when I can’t afford to run over the allotted time in a session, voice-to-text to brain-dump my thoughts while driving between sessions, the camera for impromptu shots, messaging for contacting participants about timing / directions etc., and ‘Highlight’ for recording those moments.

12. Stills camera. As unobtrusive as possible. Must be usable by ‘feel’ alone (real buttons) and with one hand, so I can maintain my connection and focus while snapping away. Good as a secondary video camera too. I use a Canon S120.

13. Rental car. Small & discreet – depending on the context, I sometimes park round the corner or out of sight of the address and appear to arrive on foot. …unless I’m in a rural area.

14. Cash incentives. In marked envelopes – for the participant’s time and involvement. Folding cash speaks everyone’s language – I avoid vouchers or direct payments. I always pay the participant at the start of the session and reinforce that it’s a payment for their time, not ‘for saying the right things’.

15. Receipts / NDAs. To be signed by the participant. This keeps accountants and lawyers happy. I always include permission to video record the session and detail the rights of use.

16. Smart/casual clothes. I dress up or down a bit depending on the topic I’m working with and neighbourhood I’m visiting – Dress smart enough to be credible, but not authoritative or superior in any way.

And the most important tools of all…

2 eyes

2 ears

1 mouth

…but I’m all ears if you’d like to add to my list, or suggest how I might adapt for different contexts?

Visualising UX research

I’ve never seen clients stand around a written report gesturing at various pages discussing their implications… but when this happens with a drawing, I really feel like my job is done.

A written report can be restrictive when working with rich, emotive material, so I often use visuals to communicate insights and what they mean to my clients.

The same drawings I use to help myself ‘see the wood for the trees’ can be a valuable tool for sharing findings and concepts.

Until recently I’ve produced these to a simple but polished level:

Polished visuals can extend beyond initial graphic impact to tell stories, build context, explain relationships and show processes. Until now I’ve used these as part of a final deliverable: they can be absorbed in a fraction of the time it takes to read a report, circulate well, and are fantastic for getting buy-in.

…more recently I’m using sketches earlier in a project as a different kind of tool – a platform for discussion.

Although clients don’t always consider it up-front, consensus building can be a valuable outcome from customer research. Teams across design, product, marketing etc. often need to just ‘get on the same page’.

Bringing the voice of the customer, or insights from their behaviour, alive with a simple cartoon can really get people talking.

A polished deliverable always has its place, but the pencil is getting a workout earlier in the process these days. I’ve realised different stages of a project require different styles of visual, and by using the appropriate level of detail for the audience and the decisions they face at the time, visuals can be one of the most powerful tools in the box.

Update:
By popular demand I’ve put a few more examples on the ‘approach’ page of my design research consulting website. … and there’s a link there to request a fuller set.

Getting a Grip. Prodesign Magazine showcases my approach to UX

Design Research and User Experience article in Prodesign

This month I’m featured in Prodesign mag.

The article harks back to my days designing surfboards and the moment I became ‘hooked on usability’ during a project for Sony Playstation.

Read the Prodesign article ‘Getting a grip’ here as a PDF.

It turns out this is the last issue of this magazine after 16 years.

What does that say about design in New Zealand?

…or does it say more about print publishing?

End to end customer experience for Swiftpoint

All too often, I’m working on one aspect of a product while valuable insights emerge relating to other areas of the broader customer experience.

Classic example: A website usability study generates feedback around physical product, brand, delivery, billing or in-store interactions.

In theory this offers a double or triple whammy for the sponsor of the project. …but not always in practice.

…In some (often larger) organisations, each channel of the customer experience is ‘owned’ by a separate department, and there’s no guarantee insights will be shared with those who can use them to improve their part of the product or service.

In a welcome change, I worked with a bite-sized firm where it was possible to actually ‘get everyone in the same room’: industrial, web, marketing and packaging designers, and copywriters, were all able to benefit from each round of research, acting on insights relevant to their design process.

Swiftpoint, a nimble Kiwi start-up, were well aware their customers would interact with more than just their website, or the physical product.

I ran several streams of user research, covering all customer touch-points, knowing every insight would be put to good use.

…A refreshing change to know each part of the team could have their part of the customer experience informed by the research.

Here’s a step-by-step case study to reveal the approach I took.

Anyone else had similar experience getting this level of buy-in with small teams? … or better still, with departments in larger companies?

10 tips for usability studies with children


Children are some of the most demanding and discerning users of interactive products, making them difficult to design for and challenging to moderate in a usability or UX research situation.

Whilst they can’t always articulate their thoughts and you can’t rely on what they say, with a careful approach you can generate incredibly useful design feedback by watching them use a product.

Before kicking off a recent project with 7-14 year olds, I spoke to teachers, parents and others who have worked with this age group. I’ve added to their collective advice:

1. Have a guardian introduce you

Kids will trust you if their parents do, so meet the parents first, then have them introduce you – they’ll also do a better job of it than you can.

2. Avoid letting the guardian sit in
Kids behave differently when they know their parent is watching.

3. Explain everything
Kids have an amazing bullshit detector. Be transparent about the purposes of the research and why they are involved.
The usual upfront introduction to the purpose of the research cannot seem like a formality with kids: tell them why they are involved and let them ask questions at the beginning, or they may ‘sit’ on a question waiting for a time to ask it.
Explain any recording equipment. Kids will be distracted by their curiosity, so get all the waving at the camera etc. out of the way at the start.

4. Use pairs of friends
Pairs feel more comfortable with a stranger (safety in numbers) and are less likely to get stage fright.
Have them take turns interacting, leaving the other free to talk. This can be difficult to manage at times but creates a great dynamic generating rich feedback.

5. Start easy
Kids (especially boys) don’t like to be wrong. Make sure they feel confident and reassured by asking super easy icebreaker questions like “What are your favourite…?” etc.

6. Free range
Thinking aloud while using a product can be very distracting for kids and results in unnatural behaviour, so aim for free-range activities with absolute minimum instruction. Slip into the background as much as possible while they are interacting with the product.

7. Together, then one at a time
Start off directing questions at both kids together before addressing them individually; this saves you putting one of them on the spot and will also help you work out and manage the dynamic when one kid dominates.

8. Choose your words carefully
Try to match your language to the kids’ (particularly nouns). This might mean you refer or point to ‘things’ until they fill the gaps, then gradually adopt their descriptive terms.

9. Leave the room
Choose your timing and make an excuse to exit the room (assuming you have an observation facility). This is the best possible way to observe natural behaviour. Don’t blow your cover though; if you say you’re going to get them a drink, bring one back.

10. Don’t load them up on e-numbers
Their concentration levels can quickly evaporate once sugary or coloured foods kick in, leaving you short-changed on feedback. …and their parents will curse you on the ride home.

Anyone want to make it a top 11 or 12 ?

UPDATE: I stumbled across a more theoretical article about usability testing with children, coming from a more academic and psychological angle.

UX research. Making every minute count


Facilitating face-to-face interviews with your clients’ customers is central to UX research. It’s also one of the most difficult of all UX skills to develop.

Valuable insights can be generated and captured during each session but you’re often having to cover a lot of ground in the time allocated.

After hundreds of these sessions I still find myself so deeply immersed in observing the participant’s experience of the product that it’s easy to lose track of these precious minutes, dwelling on one activity or area of focus at the expense of another.

I thought I’d share this very simple tool to help keep track of time and make the most of every minute of these sessions. It’s basically a modified clock face.

Grab yourself a wall clock and cut out a new paper face. Cut a slit from the edge to the centre so it can be slipped around the arms of the clock. Mark and name your time allocations on the new face.
At the beginning of each session, just wind the minute hand back to the start and you’re on your way.

For a 90-minute session, mark the segments on a spiral going inward, like in the picture of this one I used recently. Oh, and I removed the hour and second hands too.
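If you like to plan the face before you cut it, here’s a minimal Python sketch of the arithmetic behind the spiral – my own illustration rather than anything from the post above, and the segment names and lengths are made up. Each boundary is just the cumulative elapsed time mapped onto a 60-minute dial, with anything past the first hour landing on the second, inner lap.

# Hypothetical 90-minute session plan: (segment name, length in minutes)
segments = [
    ("Intro & consent", 10),
    ("Background chat", 20),
    ("Core tasks", 40),
    ("Debrief & wrap-up", 20),
]

elapsed = 0
for name, length in segments:
    elapsed += length
    lap = 1 if elapsed <= 60 else 2    # outer ring for the first hour, inner ring after that
    minute_mark = elapsed % 60 or 60   # where the minute hand points at this boundary
    print(f"{name}: ends at {elapsed} min – mark the {minute_mark}-minute position (lap {lap})")

The point is simply that minute 70 and minute 10 share the same spot on the dial, which is why the inner lap of the spiral (and ditching the hour hand) matters.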