Product / Feature discovery process
Research block. Session 3.
Discovery is an ongoing process: it’s advisable for PM teams to have at least one new feature in discovery at any given time, roughly one per quarter.
1. Assess the Problem space. Think it
1.1. Comprehend the situation (define the goal as a product outcome, not an output)
How? By doing secondary research
- External:
  - Space: search Reddit forums. Understand how the features are currently used.
  - Competitors
- Own: search through your feedback database (Monzo)
1.2. Identify the user. Then map the user journey (all steps from start to end, including how they feel at each point). How: by talking to them.
ProTip: use the 5 Whys technique. Ask “why” five times to reach the root of the actual problem.
Schedule interviews. Use the data from research when talking to users.
1.3. Build a stakeholder map (who needs to be involved in the decision-making process throughout the project).
1.4. Define the Problem
Look at intensity and frequency. What type of problem is it?
- Friction: a barrier, something missing
- Other types: more ambiguous
Is this problem worth solving?
2. Solution Space
Focus on the WHAT from a user perspective. Not the HOW yet.
2.1. Come up with Solution hypotheses (How-Might-We)
Suss it out: determine how the solution will work at a high level in Figma. Prepare a prototype, pilot, or proof-of-concept (simulate the real-world setup).
2.2. Data collection to validate the solution hypotheses
- Get quantitative data
  Work with an analyst to help you gather the data to validate the solution hypotheses.
  Example: if you send an NPS survey after a call, ask whether the connection was an issue.
- Get qualitative data
  Work with the UX research team and run studies. Bring in CS and see how many complaints there are.
  Use the prepared prototype simulating the real-world setup.
=> test the Solution hypotheses
2.3. Testing the Solution for the following risks:
- value prop risk
- usability / design
+ check feasibility (below)
+ check the business viability
Why?
1. To account for these risks and mitigate them -> plan ahead of time
2. To show you foresee them and to address questions (create a Slack channel and add the team and stakeholders)
3. They become part of the PRD
2.3.1. Business viability risk
What are the constraints (e.g. regulatory)? How would they affect market entry, pricing of the product, etc.?
Example: if you need to store personal info, compliance work will be required.
How to assess?
- talk to legal team
- consultants
2.3.2. Value prop risk
How to assess?
By interacting with the buying persona. Read more on user research here.
ProTip: in B2B the buying persona is often different from the user persona.
Test with 5 users (stop after 6 if there are no new findings). You could use https://dovetailapp.com and bring stakeholders in to give feedback on the findings.
ProTip: How to bring the team / stakeholders on board?
- record the sessions with users
- get a 1-minute snippet
- share it with the team: UX, engineers. Why?
  - increased motivation
  - to understand the priority
  - to get ideas early on -> saves you time
Things to take into account:
Key metrics
These depend on the stage.
- At launch: is it working? Are people able to use the feature?
- Later: engagement with the feature, NPS (see the sketch below)
- After that: retention
How does this affect the North Star metric?
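Since NPS comes up both in data collection and in the launch metrics, here is a minimal sketch of how the score is computed from 0–10 survey answers (the promoter/detractor cut-offs are the standard NPS definition; the example responses are made up):

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 survey answers.

    Promoters score 9-10, detractors 0-6; NPS = %promoters - %detractors.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Example: 4 promoters, 3 passives, 3 detractors -> NPS = 10
print(nps([10, 9, 9, 10, 8, 7, 8, 3, 6, 5]))
```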
If all is good, put all of this into the PRD (do this every time, even if the feature request comes from the CEO) and move to stage 5.
ProTip: until this stage, only go to developers when something you need doesn’t already exist.
Start drafting:
- 1-pager / 6-pager -> PRD
  Synthesize all the findings and write them concisely into a business case:
  - Competition (pros and cons)
  - Trends (3D)
- Think of a prototype (how do I imagine this working?)
2.3.3. Usability risk
Whether the user understands how to access the value we’re proposing.
It’s done by testing the prototype or a mockup (ask the designer to create one). It’s an incremental process.
Prototype 1 -> Prototype 2 -> which then turns into an MVP (a value unit)
Tool to use: https://www.userlytics.com
2.3.4. Feasibility risk assessment evaluates:
- engineering capability
- support team capability
1st meeting with the Engineering Lead (EL).
High-level feedback on the PRD draft.
Designers / Execs / Engineers to comment on the doc -> Engineers will write a doc describing the build (tech solution) -> Then we list epics together (in-person)
ProTip: involve them early on, share links. If they see the impact is high, they will prioritise this.
Outcome options:
- too expensive to build
- there are more impactful things to build (score with RICE, or the Kano model (delighters) if impact is difficult to estimate, or a combination of both; a rough RICE sketch follows this list). If that’s the case, bring it into a later refinement once the most impactful work is done.
- if positive, discuss capacity (Is this backend? Is this front-end? Rough timeline?)
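For reference, a minimal RICE scoring sketch (RICE = Reach × Impact × Confidence / Effort; the epics and numbers below are purely illustrative):

```python
# RICE = (Reach * Impact * Confidence) / Effort
# Reach: users affected per quarter; Impact: 0.25-3; Confidence: 0-1; Effort: person-months
epics = {
    "hypothetical epic A": {"reach": 4000, "impact": 2, "confidence": 0.8, "effort": 3},
    "hypothetical epic B": {"reach": 1500, "impact": 1, "confidence": 1.0, "effort": 1},
}

def rice(e):
    return e["reach"] * e["impact"] * e["confidence"] / e["effort"]

# Print highest score first: A ~2133, B 1500
for name, e in sorted(epics.items(), key=lambda kv: rice(kv[1]), reverse=True):
    print(f"{name}: {rice(e):.0f}")
```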
3. Build it
ProTip: the PM’s goals here are:
1st: ship something to learn and get feedback, so you can change stories ahead of time.
2nd: make work as efficient as possible by removing barriers.
3.1. Refinement.
2nd meeting with the EL. Deep dive.
EL: functional and non-functional descriptions.
PM: effort required.
Goal: feedback and clarifications until approved.
Read detailed refinement process here.
3.2. Prioritisation
If several epics have the same score, go with the one that brings an outcome (something to show). Engineers write the tickets (e.g. in Gherkin style).
Read detailed prioritisation process here.
ProTip: be upfront about what’s a must-have and what’s a nice-to-have.
3.3. Sprint Planning
Depending on the stage you may need to start with onboarding the team, setting up the Scrum: milestones and prioritisation.
Read detailed sprint planning process here.
Gauge the team’s progress with weekly meet-ups, plus daily stand-ups (3 people: PM and the leads). Stand-ups are there to talk about roadblocks!
Add monthly check-points
4. Ship it
Roll out the feature with an A/B test and watch the metrics:
- time spent
- number of successfully completed sessions (incremental value)
The data science team will tell you how long to run the test and how many users are needed to avoid a false positive.
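As a rough illustration of what that sizing involves, here is a standard two-proportion sample-size calculation (the baseline conversion and hoped-for uplift are hypothetical; your data science team will choose the real inputs):

```python
from scipy.stats import norm  # pip install scipy

def sample_size_per_arm(p_baseline, p_treatment, alpha=0.05, power=0.8):
    """Users needed in each arm to detect a lift from p_baseline to p_treatment
    with a two-sided test at the given significance level and power."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    variance = p_baseline * (1 - p_baseline) + p_treatment * (1 - p_treatment)
    return int((z_alpha + z_power) ** 2 * variance / (p_baseline - p_treatment) ** 2) + 1

# Hypothetical: 20% of sessions complete today, we hope the feature lifts it to 23%
print(sample_size_per_arm(0.20, 0.23))  # roughly 2,900 users per arm
```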
Why AB test?
To determine incrementality of treatment over baseline (noise vs. added value)
To protect against a bad user experience (e.g. accidentally crashing the site)
This is only possible if the following are available. To get the testing set up:
1. A feature flag, i.e. the ability to turn the feature on and off -> define the spec with the engineering team (a minimal bucketing sketch follows this list)
2. Analytics instrumentation: are the metrics being fired per user for the feature (so the data can be analysed)?
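A minimal sketch of deterministic feature-flag bucketing for the A/B rollout (in practice you would use your feature-flag service; the flag name and the 50% split are assumptions for illustration):

```python
import hashlib

def in_treatment(user_id: str, flag: str = "new_feature", rollout_pct: int = 50) -> bool:
    """Deterministically assign a user to treatment: the same user always lands
    in the same bucket, so their experience doesn't flip between sessions."""
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < rollout_pct

print(in_treatment("user-42"))  # stable True/False for this user and flag
```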
ProTip: this is not permanent. Epics on the backlog are only bets! Items go on the roadmap only once they are vetted! Many features remain just hypotheses.
5. Tweak it
Ideally a dashboarding tool will help you monitor performance. Use Mixpanel / Amplitude.
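As an illustration of the analytics instrumentation feeding that dashboard, a per-user event fired with Mixpanel’s Python client could look like the sketch below (the token, event name, and properties are hypothetical; Amplitude’s SDK works along the same lines):

```python
from mixpanel import Mixpanel  # pip install mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # hypothetical project token

# Fire one event per user action so the dashboard can track
# time spent and completed sessions for the new feature.
mp.track(
    "user-42",                    # distinct_id of the user
    "feature_session_completed",  # hypothetical event name
    {"feature": "new_feature", "duration_sec": 184, "variant": "treatment"},
)
```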
How a new feature can fail:
- nobody uses it (they don’t know how to use it or what it’s for -> extra backlog items)
- some traction, but behaviours don’t change (e.g. users still use Zoom)
What to do if it fails?
Calculate how much it costs to maintain -> put it in maintenance mode.
Credit: Teresa Alvarez, Sumanyu Sharma, Daniel Eugénio
Did you find this information helpful? If so, click the “follow” button to receive notifications when I write my next article on project management. If you would like to show your appreciation or support my future work, you can buy me a coffee at the following link: https://ko-fi.com/ayuby