
Hit objectives quicker through Validated Learning. Here's how.

Validated Learning is another key concept from Eric Ries' book The Lean Startup. He talks about how startups can learn from what they've launched already, to influence what they build next.

For bigger companies, installing a validated learning approach is more complicated.

The problem is that a lot of them see Agile like this:

"Yeah, Agile! We do that. We hand the dev team a prioritised list of things to build, and they tell us how much they can get done in 4 weeks.

We draw straws. Whoever picks the short one goes to see the developers once a day to discuss progress. They do this at a whiteboard with cards and coloured pens. And no-one's allowed to sit down in case they can't get up again.

At the end of the 4 weeks the developers will have built and launched all our stuff. We hand them a new list and they start again. I think we pay them in cake."

Sounds good, right?

Actually, there are two things wrong with it:

  1. It assumes that whoever hands the dev team the prioritised list is the rare kind of genius that knows exactly what should be done and in what order. That they need no feedback at all to help them grow your company faster than mere mortals could possibly achieve; and
  2. It assumes that the products are always perfect first time, every time. That you can just launch them and forget them, confident that the world will be enthralled by their sheer magnificence. And without a single tweak required.

Thing is:

  1. They're not that rare kind of genius; and
  2. Products are never right first time.

Assuming we're agreed on those two points, we should try to change the way we do things to accommodate them.

So along with our developers, editorial team, sales people and anyone else with a say in what the dev team do, we need to always:

  1. Improve our knowledge of which products and features will delight our audience; and
  2. Check that the products and features we launch are having the desired effect on our audience

The good news is that just by doing number 2, we get number 1 for free!

Validating launched products and features

I'm assuming that your dev team is using an Agile approach to development. If not, at least go and sort that bit out first. I don't care which flavour of Agile they use, just that the basic concepts are being followed.

Done? Good. Your dev team should be proudly standing by a whiteboard with their workflow mapped out across it in columns and cards.

Good for your dev team. Now get everyone else to join in.

Everyone needs to feel responsible for whether or not the products and features being delivered are the right products and features to deliver. And everyone needs to feel responsible for whether or not they are performing as well as they could be.

To achieve this, all you need to do is:

  1. Make sure, as a team, you've agreed your objective along with an immediate KPI to improve. Read objective setting for dummies for more on this
  2. Write the KPI, the current score and a target score on the top of the whiteboard
  3. Add a column to the end of the workflow board entitled 'Validating'

User stories (Agile speak for a feature that needs building) normally finish their journey across an Agile workflow board once they're released to the live environment and no bugs show up. They are considered DONE.

But not when you add a Validating column.

The story instead moves from DONE to VALIDATING, meaning that the team are in the process of measuring and monitoring the story to decide whether it's working in business terms, i.e. whether it is positively affecting the chosen KPI.
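
If it helps to picture the mechanics, here's a rough sketch of a board set up this way, in Python. The Story and Board structures, the column names and the KPI fields are illustrative assumptions to make the idea concrete, not any particular tool's API:

from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Column(Enum):
    """Columns on the workflow board, with Validating added after Done."""
    BACKLOG = "Backlog"
    IN_PROGRESS = "In Progress"
    DONE = "Done"
    VALIDATING = "Validating"


@dataclass
class Story:
    """A story card. kpi_result stays empty until results data is written under it."""
    title: str
    column: Column = Column.BACKLOG
    kpi_result: Optional[float] = None


@dataclass
class Board:
    """The whiteboard: the KPI, its current score and a target written across the top."""
    kpi_name: str
    current_score: float
    target_score: float
    stories: List[Story] = field(default_factory=list)


def release(story: Story) -> None:
    """On release to live the story doesn't stop at Done; it moves on to Validating."""
    story.column = Column.VALIDATING

The only change from a standard board is that releasing a story parks it in Validating rather than calling it done.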

I recently launched a pilot programme to try this approach at the company I work for.

How WE do it

Every two weeks, about 10 people who work on a brand website (publishing, editorial, development and, in some cases, even the Managing Director) stand at the board and discuss the stories in the Validating column.

By this time, the product manager will have ensured there is results data against the chosen KPI, and they will have written the relevant numbers under the story card.

As a story's merits are debated, there are three possible outcomes for it, each of which normally triggers an action:

    1. It is proven to be a success and gets to live out its days proudly in the 'Good Idea' envelope at the bottom of the board
      Actions:

      • New similar stories might be generated because of its success.
      • Some further A/B testing might be agreed, to see if its results can be improved even further
      • Related stories in the backlog might be prioritised higher because there is more confidence that they will improve the KPI
    2. It is deemed a failure and is banished to the 'Bad Idea' envelope at the bottom of the board
      Actions:

      • The product or feature might be rolled back so it can't do any more damage
      • Related cards in the backlog might be removed, ripped up and binned
      • We may learn that it was our approach that was flawed, rather than the idea, and a new story is generated to do it the way it should have been done first time round
    3. It's deemed too early to tell if it is a good or a bad idea
      Actions:

      • We might agree to wait another two weeks to debate again
      • We decide whether or not related stories should be put on hold until this one is validated

We weren't so harsh as to actually call the envelopes "Good Idea" / "Bad Idea" in practice, but you get the idea.
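
For the code-minded, the fortnightly debate boils down to something like the sketch below, in the same illustrative Python as before. The KPI comparison and the "enough data" flag are assumptions standing in for what is, in reality, a judgement call made by the whole team at the board:

from enum import Enum


class Verdict(Enum):
    GOOD_IDEA = "Good Idea envelope"
    BAD_IDEA = "Bad Idea envelope"
    TOO_EARLY = "Debate again in two weeks"


def debate(kpi_before: float, kpi_after: float, enough_data: bool) -> Verdict:
    """Decide a validating story's fate based on how the chosen KPI has moved."""
    if not enough_data:
        # Outcome 3: wait another two weeks; maybe put related stories on hold
        return Verdict.TOO_EARLY
    if kpi_after > kpi_before:
        # Outcome 1: spawn similar stories, try A/B tests, bump related backlog items
        return Verdict.GOOD_IDEA
    # Outcome 2: consider rolling back, binning related cards, or retrying a better way
    return Verdict.BAD_IDEA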

Once this debate is finished, we go into a meeting room and build a backlog for the next two weeks with all of this data and debate fresh in our minds.

This is Validated Learning.

We've been doing it this way for 3 months and the sense of pace towards a focused goal has increased significantly.

Also, because everyone is motivated by the KPI, there is a natural collective push for Minimum Viable Product. The team now crave fast feedback through metrics. They know they can get it quicker by first validating a less sophisticated version of the planned end product, so they do.

Overall, they are more engaged because they always have a common goal and are regularly measuring themselves against it. As a team.

And suddenly, the trusty workflow board has developed:

  • from a tool used to help a product manager and a dev team work through their backlog of work,
  • to a tool used by the wider business to pinpoint the tactics required to hit the business' objectives.

And it's the same £100 whiteboard it was before. Pretty good value if you ask me.

So, in summary:

  1. Pick a KPI
  2. Add a VALIDATING column
  3. Debate the value of things in the Validating column regularly
  4. Take action based on the outcome of these debates

And that's Validated Learning.

By Kevin Heery.
