Converting more customers without spending more money

Context

Rated People is a two-sided marketplace, which means balancing the customer mix is always key. Two of the major KPIs are the number of jobs posted by homeowners and the number of tradespeople on the platform to buy them. Each of these forks off into various other important indicators - on the homeowner side there are things like the quality of the jobs (will they be purchased?) and the acquisition cost of the jobs (free being ideal).

For this project our objective was to increase the overall volume of job posts without a drop in quality. As a product-driven project this wouldn't mean simply buying more jobs through PPC, but figuring out what other levers we had at our disposal.

Research

The job post flow had existed as long as the business (16 years) and had been largely untouched - it was desktop-first and retrofitted to mobile. Our market had traditionally skewed towards desktop users, but usage had now flipped to 70% mobile. With a slightly older demographic, desktop usage had remained higher than industry averages even once smartphone penetration was ubiquitous. Now that it had finally caught up, there was a clear opportunity to optimise the job post flow to take full advantage of mobile capabilities.

The other quantitative data point of interest was a 40% abandonment rate in the existing flow, which presented a further opportunity.

We ran an in-page survey to collect homeowner feedback, with an option to leave details for a follow-up call. The main themes from the survey results and customer interviews were:

  • Uncertainty over what information they should provide in their job description
  • Uncertainty over how to answer the budget question - would it lock them into a price?
  • Concerns about entering personal information (privacy)

Design

Armed with these insights, the team (1 PM, 1 researcher and 2 designers) workshopped some ideas. We plotted them on an impact/effort matrix and, balanced against other constraints (e.g. very little back-end resource), settled on two approaches we wanted to test.

The other designer and I each worked up a separate clickable prototype to test with users. The first would be a one-page solution with increased use of icons (to provide larger tap areas on mobile) and the second would take a 'one thing at a time' approach.

Validation

I worked on the ‘one thing at a time’ approach and created a clickable prototype to demonstrate the whole flow. My goals were to keep each section as focussed as possible, make the process feel fast but thorough, and help the homeowner feel in control.

Here are a few of the ideas incorporated:

  1. I broke down each step of the job post into its own screen, hence the 'one thing at a time' moniker. Creative, huh?
  2. A new tooltip-style device would help explain each step; it would be displayed by default, making it harder to miss.
  3. After each step, we’d present the homeowner with a playback panel to remind them what they’d already done and let them make amendments if necessary.
  4. Feedback from tradespeople, backed up by our data, told us that jobs with photos attached were more likely to be sold. So, giving greater prominence to photos was a no-brainer.
  5. We were also keen to help homeowners feel in control, so I added a progress bar at the top to indicate (roughly) how long it would take to create a job.

Testing

We invited a group of homeowners into our offices to test the prototypes. Both ideas tested well but there was an overall preference for the ‘one thing at a time’ approach.

Test structure

1-to-1 usability sessions with 10 users - screens and voice recorded on Lookback

  • Tested with mobile web wireframes (not high-fidelity prototypes)
  • Recruited a mixture of 6 existing Rated People homeowners and 4 homeowners who hadn’t used the platform
  • Ages 27-67, London based
  • Mixture of iOS and Android users
  • Users tested both prototypes (think-aloud protocol) to post a job and were asked to compare them afterwards

General feedback

  • Users felt the steps were self-explanatory, intuitive and logical - it was clear on the whole, which helped them focus on getting the job done quickly. “Just a couple of taps.”
  • Users found the form generally quite easy to complete, with no difficult questions or questions that were too long. “Everything asked is required info.”
  • Users found it straightforward, basic and simple to do. “My father-in-law would be able to use this website on his tablet – easy.”
  • The copy was straightforward - most users read all of it and liked the jargon-free tone of voice
  • The Single Ease Question (SEQ) - overall, how difficult or easy was the task to complete? - yielded an average score of 5.8/7. Decent.

Limits of wireframes

Neither the progress bar nor the tooltips were consistently noticed during the test. When prompted, though, all users thought they were useful additions.

We decided this was largely a fidelity problem and that a proper visual design would likely take care of the issue. 

Head to head

The icon-based approach also tested quite well but received more negative feedback overall. There was more observable confusion amongst users, reflected in a lower SEQ score.

Iteration | Prototype | Handoff

The user testing results gave us enough confidence to proceed with the ‘one thing at a time’ version. I created the high-fidelity assets and worked closely with the developers to bring them to life.

Results

When the new flow was ready to deploy - following QA and the addition of appropriate tracking tags - the release went out as a throttled A/B test, initially to 5% of total traffic so we could monitor for errors and general performance. As this is such a fundamental funnel for the business, a cautious approach was necessary.
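
For context, a throttled rollout like this typically works by deterministically bucketing each visitor, so the same person always lands in the same variant while the percentage is increased. The sketch below is illustrative only - the function name, visitor ID and hashing approach are assumptions, not our actual implementation.

```typescript
import { createHash } from "crypto";

// Deterministically assign a visitor to the new flow based on a rollout percentage.
// Hashing the visitor ID keeps each visitor's assignment stable across sessions
// while the throttle is gradually increased.
function inNewFlow(visitorId: string, rolloutPercent: number): boolean {
  const hash = createHash("sha256").update(visitorId).digest();
  const bucket = hash.readUInt32BE(0) % 100; // map the hash onto the range 0-99
  return bucket < rolloutPercent;
}

// Start at 5% of traffic, then raise the percentage as confidence grows.
console.log(inNewFlow("visitor-1234", 5));
```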

Satisfied that the new flow was working, we gradually increased traffic until it was at 100%. After its first three months live we observed the following results:

  • 10% uplift in jobs with photos
  • 20% uplift in overall conversion
  • 4-5% uplift in service rate (service rate is the metric we use for the proportion of job leads purchased at least once; this number jumped from 70% to 75% following the redesign - a rough sketch of the calculation follows this list)
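
As a minimal sketch of how a service-rate figure like this might be computed - the Job shape and field name below are assumptions for illustration, not the real data model:

```typescript
// Minimal sketch of the service-rate calculation described above.
// The Job interface and timesPurchased field are illustrative assumptions.
interface Job {
  timesPurchased: number; // how many tradespeople bought this lead
}

// Service rate = share of posted jobs that were purchased at least once.
function serviceRate(jobs: Job[]): number {
  if (jobs.length === 0) return 0;
  const sold = jobs.filter((job) => job.timesPurchased > 0).length;
  return sold / jobs.length;
}

// Example: 3 of 4 jobs sold at least once gives a 75% service rate.
console.log(serviceRate([
  { timesPurchased: 2 },
  { timesPurchased: 0 },
  { timesPurchased: 1 },
  { timesPurchased: 3 },
])); // 0.75
```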

These numbers were huge and exceeded our expectations. We were able to significantly impact both quantity and quality by providing our users with a more modern, more bespoke experience, purely through product-focussed work.

Challenges and what I might have done differently

It’s always tricky to tamper with one of the core flows, and - as this one captures data and involves using the keyboard - prototyping was challenging because we had to simulate those steps.

I’m always slightly skeptical of testing with wireframes: whilst they’re useful for answering high-level layout questions, I think they’re best used internally. Real-world users can struggle to figure them out, and they don’t give an accurate representation of the finished product. This was reflected in some of the user testing feedback we received.

So not testing a hi-fi prototype wasn’t ideal but the results of the project weren't impacted. All this goes to underscore: there’s no perfect process, there’s no right way to do things, everything is an experiment and sometimes it pays off. There are of course good practices to follow, but nothing is guaranteed. We just need to make sure we give ourselves the best chance of success. 
