Strategic Execution, Negotiation & Full Mock Problems
Roadmap Prioritization Frameworks for EM Interviews
Every EM interview includes some version of this question: "You have five projects competing for three engineers. How do you decide?" The interviewer is not looking for gut instinct. They want a structured framework, clear communication of trade-offs, and evidence that you can say "no" with data.
The RICE Framework
RICE was created by Sean McBride at Intercom as a way to objectively score product and engineering initiatives. It stands for:
RICE Score = (Reach x Impact x Confidence) / Effort
| Component | Definition | How to Estimate |
|---|---|---|
| Reach | Number of users/customers affected per quarter | Use product analytics, funnel data, or user counts |
| Impact | How much each user is affected (scale: 0.25 = minimal, 0.5 = low, 1 = medium, 2 = high, 3 = massive) | Align with business goals -- revenue, retention, satisfaction |
| Confidence | How certain you are in the estimates (100% = high, 80% = medium, 50% = low) | Based on data quality, research, past accuracy |
| Effort | Person-months of engineering work required | Use t-shirt sizing converted to numeric values |
Example: A search improvement project reaches 50,000 users/quarter, has high impact (2), you are 80% confident, and it takes 3 person-months.
RICE Score = (50,000 x 2 x 0.8) / 3 = 26,667
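The arithmetic above can be sketched as a small helper. This is a minimal illustration, not a standard library; the function name `rice_score` is our own:

```python
# Minimal sketch of the RICE calculation, reproducing the worked example above.
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE = (Reach x Impact x Confidence) / Effort, with effort in person-months."""
    if effort <= 0:
        raise ValueError("effort must be positive")
    return (reach * impact * confidence) / effort

# Search improvement: 50,000 users/quarter, impact 2, 80% confidence, 3 person-months.
score = rice_score(reach=50_000, impact=2, confidence=0.8, effort=3)
print(round(score))  # 26667
```

Keeping the calculation in code (or a shared spreadsheet) makes the scoring auditable: stakeholders can challenge an input rather than the ranking.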
The ICE Framework
ICE is a simpler alternative when you need faster estimation. Each component uses a 1-10 scale:
ICE Score = Impact x Confidence x Ease
| Component | Scale | Notes |
|---|---|---|
| Impact | 1-10 | Business value if the project succeeds |
| Confidence | 1-10 | How sure you are it will work |
| Ease | 1-10 | How easy it is to implement (inverse of effort) |
ICE works well for early-stage prioritization when you lack precise data. RICE is better when you have quantitative reach data.
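An ICE calculation is even simpler; a hedged sketch with basic range checks (the `ice_score` name and example inputs are illustrative):

```python
# Minimal ICE sketch: three 1-10 scores multiplied together.
def ice_score(impact: int, confidence: int, ease: int) -> int:
    """ICE = Impact x Confidence x Ease, each on a 1-10 scale."""
    for name, value in (("impact", impact), ("confidence", confidence), ("ease", ease)):
        if not 1 <= value <= 10:
            raise ValueError(f"{name} must be between 1 and 10")
    return impact * confidence * ease

# Hypothetical project: high impact (8), moderate confidence (6), fairly easy (7).
print(ice_score(8, 6, 7))  # 336
```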
Weighted Scoring Models
When RICE or ICE feel too narrow, weighted scoring lets you add custom dimensions:
| Criterion | Weight | Project A | Project B | Project C |
|---|---|---|---|---|
| Revenue Impact | 30% | 8 | 5 | 9 |
| Strategic Alignment | 25% | 7 | 9 | 6 |
| Technical Risk (10 = lowest risk) | 20% | 4 | 7 | 3 |
| Customer Demand | 15% | 9 | 6 | 8 |
| Team Readiness | 10% | 6 | 8 | 5 |
| Weighted Score | 100% | 6.90 | 6.85 | 6.50 |
The power of weighted scoring is transparency. Stakeholders can see exactly why Project A ranked above Project C, and they can challenge specific weights rather than arguing about feelings.
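The weighted-sum computation behind the table is straightforward to reproduce. This is a sketch using the scores from the table above; the dictionary names are our own, and technical risk is assumed to be scored so that 10 means lowest risk (otherwise it would need to be inverted before weighting):

```python
# Sketch of the weighted scoring model from the table above. Weights sum to 1.0.
WEIGHTS = {
    "revenue_impact": 0.30,
    "strategic_alignment": 0.25,
    "technical_risk": 0.20,   # assumed: 10 = lowest risk, so higher is better
    "customer_demand": 0.15,
    "team_readiness": 0.10,
}

PROJECTS = {
    "A": {"revenue_impact": 8, "strategic_alignment": 7, "technical_risk": 4,
          "customer_demand": 9, "team_readiness": 6},
    "B": {"revenue_impact": 5, "strategic_alignment": 9, "technical_risk": 7,
          "customer_demand": 6, "team_readiness": 8},
    "C": {"revenue_impact": 9, "strategic_alignment": 6, "technical_risk": 3,
          "customer_demand": 8, "team_readiness": 5},
}

def weighted_score(scores: dict) -> float:
    """Sum of criterion score x criterion weight."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Rank projects from highest to lowest weighted score.
for name, scores in sorted(PROJECTS.items(), key=lambda p: -weighted_score(p[1])):
    print(f"Project {name}: {weighted_score(scores):.2f}")
```

Changing a single weight and re-running the ranking is exactly the kind of "challenge the weights, not the feelings" conversation the model enables.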
The 20% Rule: Balancing Tech Debt vs. Features
A common EM interview question is how you balance feature work against tech debt. A widely used answer is the 20% rule -- allocating roughly 20% of each sprint's capacity to tech debt, platform improvements, and developer experience work.
Why 20% works:
- It is large enough to make meaningful progress on debt reduction
- It is small enough that stakeholders do not feel the roadmap is stalled
- It creates a predictable cadence -- engineers know they will get time for improvements
- It avoids the "big bang rewrite" trap where everything stops for months
How to implement it in practice:
- Reserve 1 day per week per engineer for tech debt (or 2-3 days per sprint in a 2-week cycle)
- Let the team choose which tech debt items to tackle -- they know the pain points
- Track tech debt reduction with metrics (build time, deploy frequency, incident count) so you can show ROI to stakeholders
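The allocation above reduces to simple arithmetic; a hypothetical sketch (the `tech_debt_days` helper is our own naming):

```python
# Sketch: converting the 20% rule into concrete sprint capacity numbers.
def tech_debt_days(engineers: int, sprint_working_days: int = 10,
                   debt_fraction: float = 0.20) -> float:
    """Engineer-days per sprint reserved for tech debt and platform work."""
    return engineers * sprint_working_days * debt_fraction

# A 6-person team on 2-week sprints (10 working days):
print(tech_debt_days(6))  # 12.0 engineer-days, i.e. 2 days per engineer per sprint
```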
Communicating Trade-Offs to Stakeholders
The hardest part of prioritization is not the math. It is explaining to a VP why their project ranked fourth. Use this structure:
- Lead with the framework -- "We scored all six proposals using RICE. Here are the results."
- Show the data -- Present the actual scores side by side.
- Acknowledge what is being deferred -- "Project X is valuable. Its RICE score is 12,000 versus 26,000 for the top project. We are scheduling it for Q3."
- Offer alternatives -- "If Project X is urgent, we can accelerate it by descoping feature Y or adding one contractor."
Saying "No" Constructively
Engineering Managers must say "no" constantly. The skill is making it feel collaborative rather than adversarial:
- Never say "no" without data. Replace "We can't do that" with "That scores 4,200 on RICE versus 18,000 for our current top priority. Here is the trade-off."
- Offer a "yes, if" alternative. Instead of rejecting a request, state the conditions: "Yes, if we can defer the mobile redesign by 3 weeks" or "Yes, if we get one more engineer."
- Document decisions. Keep a prioritization log so you can reference past decisions. This prevents re-litigating the same arguments every quarter.
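A prioritization log can be as simple as structured records. This is one possible shape, not a prescribed format; the `Decision` dataclass and its fields are illustrative:

```python
# Sketch of a prioritization log entry, kept as structured records so past
# decisions can be referenced instead of re-litigated every quarter.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Decision:
    project: str
    rice_score: float
    outcome: str       # e.g. "committed", "deferred", "rejected"
    rationale: str
    decided_on: date = field(default_factory=date.today)

log: list[Decision] = []
log.append(Decision("Mobile redesign", 12_000, "deferred",
                    "Scored below the top priority (26,667); rescheduled for Q3"))
print(log[0].outcome)  # deferred
```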
Quarterly Planning in Practice
In an interview, describe your quarterly planning process as a cycle:
- Collect inputs -- Engineering tech debt list, product roadmap requests, customer support escalations, OKRs
- Score everything -- Apply RICE or weighted scoring to every candidate project
- Capacity check -- Map scored projects against available engineering weeks (subtract holidays, on-call, and the 20% tech debt allocation)
- Stakeholder review -- Present the draft plan, collect feedback, adjust weights if the business context has changed
- Commit and communicate -- Lock the plan, publish it broadly, and define what "done" means for each project
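The capacity-check and commit steps above can be sketched as code. All numbers and function names here are hypothetical assumptions (team of six, 13-week quarter, one engineer on-call at a time, greedy commit by score), shown only to make the mechanics concrete:

```python
# Sketch of a quarterly capacity check: map scored projects onto engineer-weeks.
def available_weeks(engineers: int, quarter_weeks: int = 13,
                    pto_weeks_per_eng: float = 1.5, oncall_weeks: float = 13,
                    debt_fraction: float = 0.20) -> float:
    """Engineer-weeks left for roadmap work after PTO, on-call, and the 20% rule."""
    gross = engineers * quarter_weeks
    after_pto = gross - engineers * pto_weeks_per_eng
    after_oncall = after_pto - oncall_weeks   # one engineer on-call at any time
    return after_oncall * (1 - debt_fraction)

def fit_projects(projects: list[tuple[str, float, float]], capacity: float) -> list[str]:
    """Greedily commit projects in descending score order until capacity runs out.
    Each project is (name, rice_score, estimated_engineer_weeks)."""
    committed, remaining = [], capacity
    for name, _, weeks in sorted(projects, key=lambda p: -p[1]):
        if weeks <= remaining:
            committed.append(name)
            remaining -= weeks
    return committed

cap = available_weeks(engineers=6)  # 78 gross, minus 9 PTO and 13 on-call, then 80%
projects = [("Search", 26_667, 20), ("Checkout", 12_000, 30), ("Admin", 4_200, 15)]
print(f"{cap:.1f} engineer-weeks ->", fit_projects(projects, cap))
```

Note the greedy fit is deliberately simple: in practice you would also respect dependencies and deadlines, but this level of detail is usually enough for an interview answer.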
Next, we will cover how to communicate your prioritization decisions to executives using structured frameworks that match how they think.