pm@olin: Metrics (Class 8)
This one was a bit tough to write, because I didn’t write the class! Adam Sigel came in to do the lesson on Metrics. He’s a Product Manager at InsightSquared, a Boston-area company that provides business analytics tools. I’m incredibly glad Adam taught this lesson because I spend less time with metrics than most PMs. I learned a lot too. A huge thanks to Adam for coming!!!
Goals
- Understand how to pick the right metrics for your project.
- Understand how to track metrics and some of the tools that are available.
- Understand how to apply metrics back to your Product in order to learn and make a better version.
- Understand how metrics fit in with the other parts of your product process.
Resources
- Measuring What Matters (Yoskovitz)
- SMART Framework (Wikipedia)
- AARRR — Pirate Metrics (McClure)
- More than Metrics talks at #ProductSF and Beyond the Code (the second is a longer, expanded version of the first) (Chisa)
- Blameless Postmortems (Allspaw)
Materials
Whiteboard and markers
Software to demo
This class was more of a full-group style than usual. In many of my lessons, I split the students into groups to do projects. For this class we stayed in one group, seminar style, and worked through the issues together. It was a nice variation from the usual!
Introduction — 15 minutes
Since Adam was new to the class, he took some time to explain his background:
- Previous roles in PM / before PM
- Why he transitioned into PM
- The structure and responsibilities of his current PM team
- The differences between his roles at Aereo and InsightSquared
I think it’s always valuable to know where your teacher is coming from. I try to be very up front about my own background, strengths, and weaknesses, but I did want the class to be exposed to other perspectives.
I particularly liked that Adam comes from a non-technical background. As a technical PM, early in your career it’s easy to miss what non-technical PMs bring that you can’t.
Picking Metrics — 40 minutes
Adam then transitioned into how to pick metrics. He did this by helping the students work through examples.
We started with one student’s (Ryan’s) current project. I won’t share what his project is, but we worked through how he’d measure acquisition and what else he cares about users doing.
Then, we worked through some other examples using popular companies. In particular, we talked about Snapchat Discover. We discussed what metrics Snapchat might have been trying to optimize for when they made the feature. Then we talked about how they’d measure success of the feature.
It’s a little hard to explicitly go through everything we mentioned for this piece, so I’ll recommend the two articles in the “Resources” section as possible frameworks for looking at metrics. In general, I like the “SMART” framework (specific, measurable, actionable, relevant, timely). I don’t often go through the entire framework; I just think “what do I want to know?” and “how can I measure that so I know if it’s changing?”
One of the things we particularly wanted to highlight was the idea that you should be picking a metric with a point of view. Metrics aren’t about measuring random data; they’re about having a hypothesis and testing it, or about something that you specifically want to learn.
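To make “having a hypothesis and testing it” concrete, here’s a minimal sketch of how you might check whether a metric actually moved. This wasn’t part of the lesson, and all the numbers are invented: it’s a standard two-proportion z-test comparing conversion rates between a control and a variant.

```python
import math

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B's conversion rate
    change significantly relative to variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up example: 120/2400 conversions vs. 156/2400 after a change
z, p = conversion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point isn’t the statistics; it’s that you decided in advance what “success” means and how you’d know if it changed.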
Use of Metrics (and Funnels) — 40 minutes
To talk through use of metrics, Adam also showed us what he uses for his own projects. Since he works at InsightSquared, he uses their tools for measuring success.
He introduced the concept of a funnel, which we hadn’t previously covered in class, and how you want to measure different things at different parts of the funnel. Again, I’m not going to share the specifics. A comparable explanation is Dave McClure’s talk on the AARRR metrics.
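The core mechanic of a funnel is just measuring how many users survive each stage. Here’s a tiny sketch using McClure’s AARRR stage names; the counts are entirely invented for illustration, not from Adam’s data or InsightSquared’s tools.

```python
# Hypothetical funnel counts (invented); stage names from AARRR.
funnel = [
    ("Acquisition", 10000),  # visited the product
    ("Activation",   4000),  # completed signup / first key action
    ("Retention",    1800),  # came back within a week
    ("Revenue",       300),  # paid for something
    ("Referral",      120),  # invited someone else
]

prev = funnel[0][1]
for stage, count in funnel:
    pct = 100 * count / prev  # conversion from the previous stage
    print(f"{stage:<12} {count:>6}  ({pct:5.1f}% of previous stage)")
    prev = count
```

Looking at stage-to-stage conversion (rather than just totals) is what tells you where users drop off, and therefore which metric to pick a point of view on.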
He also talked about which specific numbers matter to him right now, why those are the numbers that matter, and what he does to impact them.
AMA — 30 minutes
We concluded with a bunch of time to just ask Adam questions about his experiences as a PM. This covered a number of topics we’d already discussed, plus some new areas.
Post Mortems — 30 minutes
Towards the end I also took the time to explain the post-mortem process that we didn’t get to in “building” or “feedback.”
We discussed where postmortems come from (engineering failures), and the Swiss cheese model of things going wrong. I liked this section because it isn’t just applicable to software: you can go through the same process to consider a hardware failure.
This slide is accessible from this talk by Noah Sussman. I think it’s originally from John Allspaw’s talk on the same topic as the article in Materials. I’ve always found it to be a very clear explanation of how things go wrong.
After explaining the basic concept, I shared a framework I like to use for Product postmortems. My framework is to do them post-launch, and then analyze how the project went:
- The PM should create a detailed, factual timeline of how everything went, circulated to the attendees so they can add to it. The important part is that this stays about facts.
- During the meeting, everyone should walk through the timeline and add anything that’s missing, and note patterns.
- The group can then come up with a list of “things we did well” and “things we can improve upon next time.”
This provides learning from the project that will make future projects with the same team go more smoothly. One downside of teaching this process in schools is that students often work with different people the next time around — there isn’t the project consistency that you get at work.
Selected Changes / Notes
- I should’ve taken better notes while Adam was talking. I learned things, but two weeks after the fact it’s hard for me to be sure I’ve documented this as well as some of the other lessons.
- I liked having the more seminar-discussion style participation. With a class of 10 it isn’t too hard to get everyone talking.
- I also learned some of the students didn’t know about this blog, which is a pretty big “whoops” given I document everything here. HI STUDENTS!