I recently coached a Product Manager (PM) on Pricing Strategy and this article is a result of that interaction. My PM friend did not have a VP of Product Management, and this is the kind of work that a VP would coach on.

The old way of thinking about pricing is largely unchanged since "Positioning: The Battle for Your Mind" by Al Ries and Jack Trout. Since that book, Apple has added to the conversation on pricing by becoming the example everybody uses to explain "value-based pricing" as distinct from "commodity pricing." My friend was already aware of these distinctions, and they weren't helping him do the work of pricing. This is a common problem in the industry: we know the distinctions, but not what to do with them. I drew on my Decision Analysis and Product Management background to help him build a pricing analytic model and approach. He was able to build the levers needed to examine multiple pricing strategies, evaluate the specific (and different) factors that were key in each strategy, and finally make a defensible recommendation to his Director and Vice President of Product Marketing. If you are in a similar situation, this article will help you think through the pricing question in powerful ways and do the heavy lifting your job requires.

Pricing Analysis is really Uncertainty Analysis!

It is extremely important to realize that the pricing conversation is really an uncertainty conversation. You are trying to price your product without knowing how the market will respond. There are subtle details around the knobs you can turn, and you will need to build intuition around those details. That intuition makes all the difference between going blind into your pricing negotiations and knowing where you have some leeway. How do we get this intuition, you ask? We will need to build some basic tools that let us experiment with different strategies and work out their implications. The challenge is to keep it simple. Before we can play with uncertainty, we need to understand how to build parametric models. During my SmartOrg days, I helped build a free course on this with Alejandro Martinez; I suggest completing it so you can get a feel for parametric models. A parametric model is a fancy term for a model that is driven by parameters.

Build your own Parametric Model

It is hard for the human brain to put together cashflows directly. It is much easier to think about specific input parameters that are intuitive and have a natural strategic interpretation, and let the computer build the cashflows. We will be focusing on that mindset. Let's start with a set of basic inputs that can each be assessed individually. The example below works well for companies selling a large number of units of their product.

Set the inputs below to reflect your pricing strategy. The NPV of EBITDA shows the valuation of this strategy without considering any uncertainty.

Model Start Year: 2021 (keep this at the current year)

Launch Year: 2030 (set this to when your product will hit the market)

Total Addressable Market (TAM): 500 million units (how many units are addressable)

Market Growth Rate: 25% (how much this market is growing year on year)

Penetration: 19% (what percentage of the addressable market will be open to solutions like yours)

Peak Market Share: 65% (based on competition, what will be your share?)

Ramp to Peak Market Share: 6 years (how long to achieve the share above?)

Peak Market Duration: 10 years (how long will you stay at that share? The ramp eats into this.)

Price per unit: $2

Price change year-on-year: -33% (negative values mean the price will drop; positive values mean it will increase)

Cost per unit: $0.50

SG&A: 30% of revenue

Tax Rate: 35%

Discount Rate: 10%

NPV of EBITDA: -$791.87 million (derived from the cashflow analysis below)

Chart of Cashflow

[Chart: yearly Units Sold, Op. Revenue, Op. Cost, Op. Margin, SG&A, Net Margin, and EBITDA, 2025-2050]
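To make the "let the computer build cashflows" idea concrete, here is a minimal sketch in Python of a parametric model driven by the inputs above. The function name and the exact cashflow recipe are my own simplifications (no taxes, a linear ramp folded into the peak duration), so it will not reproduce the -$791.87M figure from the full model, but it shows how each strategic input becomes a lever:

```python
def npv_of_ebitda(
    start_year=2021,      # model start year (current year)
    launch_year=2030,     # year the product hits the market
    tam=500.0,            # total addressable market at start, million units
    growth=0.25,          # market growth rate, year on year
    penetration=0.19,     # fraction of TAM open to solutions like yours
    peak_share=0.65,      # peak market share
    ramp_years=6,         # years to ramp from launch to peak share
    peak_duration=10,     # years in the market (the ramp eats into this)
    price=2.0,            # price per unit at launch, $
    price_change=-0.33,   # price change year on year (negative = price drops)
    cost=0.5,             # cost per unit, $
    sga=0.30,             # SG&A as a fraction of revenue
    discount_rate=0.10,
):
    """NPV at start_year of yearly EBITDA, in $ millions (simplified)."""
    npv = 0.0
    for t in range(peak_duration + 1):                 # t = years since launch
        year = launch_year + t
        market = tam * (1 + growth) ** (year - start_year)
        share = peak_share * min(1.0, t / ramp_years)  # linear ramp to peak
        units = market * penetration * share           # million units sold
        unit_price = price * (1 + price_change) ** t
        revenue = units * unit_price
        ebitda = revenue - units * cost - sga * revenue
        npv += ebitda / (1 + discount_rate) ** (year - start_year)
    return npv

print(f"NPV of EBITDA: ${npv_of_ebitda():,.1f}M")
```

Changing any one argument re-derives the entire cashflow, which is exactly what makes comparing strategies cheap.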

Introduce Uncertainty

As we mentioned above, pricing analysis is really uncertainty analysis, so let's introduce uncertainty into our inputs. This is where we reach the limit of what Coda (the platform on which I've created this document) can do, so I will switch to good old Excel for the example model. I have open-sourced a pricing model spreadsheet for this purpose. The first thing you will note in the pricing model is that there are three inputs for each parameter. You can also select pricing scenarios. We will start by loading up the Commodity Pricing strategy from the dropdown as shown:

Click on "Load Inputs" and the input table will be populated, and the uncertainty analysis will also be run.

Commodity Pricing Inputs

| Factor | Units | Low | Med | High |
| --- | --- | --- | --- | --- |
| Total Addressable Market | M units | 250 | 500 | 1000 |
| Market Growth Rate | % | 10% | 25% | 40% |
| Penetration | % | 5% | 10% | 20% |
| Peak Market Share | % | 25% | 30% | 35% |
| Ramp to Peak Market Share | years | 2 | 3 | 4 |
| Peak Market Duration | years | 3 | 5 | 8 |
| Price Per Unit | $ | 1.5 | 2 | 3 |
| Price change year-on-year | % | -1% | -5% | -10% |
| Cost Per Unit | $ | 0.2 | 0.5 | 2 |
| SG&A | % | 8% | 10% | 15% |

Each factor is assessed at its extremes first. The question to ask is: what value would make you fall off your chair in disbelief if the factor came in any lower (or higher)? Probabilistically, the low (or high) assessment has a 10% chance of being breached. The median has a 50% chance of the outcome landing above or below it. This is now sufficient for us to put together a Tornado model that shows us which factors are the most important:
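Mechanically, a Tornado is simple to build once you have a parametric valuation function: hold every factor at its median, swing one factor at a time to its low and high assessments, and sort the factors by the size of the resulting NPV swing. Here is a sketch using a deliberately trivial stand-in for the valuation model (so the ordering here will not match the spreadsheet's):

```python
# Three-point assessments (low = 10th, med = 50th, high = 90th percentile),
# taken from the Commodity Pricing table above.
ranges = {
    "tam":    (250, 500, 1000),   # M units
    "growth": (0.10, 0.25, 0.40),
    "price":  (1.5, 2.0, 3.0),    # $ per unit
    "cost":   (0.2, 0.5, 2.0),    # $ per unit
    "sga":    (0.08, 0.10, 0.15),
}

def npv(tam, growth, price, cost, sga):
    # Deliberately trivial one-period valuation so the sketch runs on its
    # own -- substitute your real cashflow model here.
    units = tam * (1 + growth)
    revenue = units * price
    return revenue - units * cost - sga * revenue

base = {k: med for k, (low, med, high) in ranges.items()}  # all medians

tornado = []
for factor, (low, med, high) in ranges.items():
    lo_val = npv(**{**base, factor: low})    # swing this factor low...
    hi_val = npv(**{**base, factor: high})   # ...and high, others at median
    tornado.append((factor, lo_val, hi_val, abs(hi_val - lo_val)))

# Widest swing first: that factor deserves the most attention.
for factor, lo_val, hi_val, width in sorted(tornado, key=lambda r: -r[3]):
    print(f"{factor:8s} {lo_val:9.1f} {hi_val:9.1f}   width {width:8.1f}")
```

Each printed row is one bar of the Tornado; plotting the low/high pairs as horizontal bars gives the familiar chart.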

In this example, you can see that our assumption about cost per unit has the biggest influence on this particular pricing strategy, followed by market growth rate. As a PM, you would then craft a narrative to make this clear to stakeholders. For instance, the above could represent a commodity pricing strategy that only makes sense if you have strong reasons to believe that you have tight control on costs and that the market is expanding quite a bit year on year. The Tornado makes quite clear that if your costs go out of control, you will lose a lot of money on this strategy. What's also interesting is that, prior to seeing the Tornado, you might have fussed a bit over the price change based on the pressure your customers are putting on you. The Tornado shows that this is not as important; don't fuss too much over it.

The model also gives you a summary Cumulative Distribution Function that visualizes the gain and the loss, like so:

Summary Statistics

Mean: $82M

Low: -$45M (10% chance you will lose more than this!)

Med: $53M (50% chance you will be above or below this)

High: $224M (10% chance you will make more than this)

Notice that the mean and the median of this strategy are somewhat close. Since we have built the knobs, we can start to do interesting things.
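The cumulative distribution itself can be approximated by simulation: sample each factor from its assessed points and push every draw through the valuation model. The sketch below uses the Extended Swanson-Megill discretization, a common choice that weights the 10/50/90 fractiles at 30/40/30; the one-line valuation function is a stand-in for your real cashflow model, so the numbers are illustrative only:

```python
import random

random.seed(7)  # reproducible sketch

# (10th, 50th, 90th percentile) assessments -- illustrative numbers only.
ranges = {
    "tam":   (250, 500, 1000),   # M units
    "price": (1.5, 2.0, 3.0),    # $ per unit
    "cost":  (0.2, 0.5, 2.0),    # $ per unit
}

def npv(tam, price, cost):
    # Stand-in one-period valuation; substitute your cashflow model.
    return tam * (price - cost)

def draw(points):
    # Extended Swanson-Megill: weight the 10/50/90 fractiles at 30/40/30.
    return random.choices(points, weights=(0.3, 0.4, 0.3))[0]

samples = sorted(
    npv(**{k: draw(v) for k, v in ranges.items()}) for _ in range(10_000)
)

def percentile(p):
    return samples[int(p * (len(samples) - 1))]

print(f"Low  (P10): {percentile(0.10):8.1f}")
print(f"Med  (P50): {percentile(0.50):8.1f}")
print(f"High (P90): {percentile(0.90):8.1f}")
print(f"Mean      : {sum(samples) / len(samples):8.1f}")
```

Plotting the sorted samples against their cumulative rank gives the CDF curve; the P10/P50/P90 readouts correspond to the Low/Med/High summary statistics.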

Value of Capping Costs

For instance, if your product team is able to guarantee the high-end of the cost will never exceed $1 (instead of $2), we can easily run this scenario. In the drop-down below, select the strategy highlighted and click on the "Load Inputs" button.

You will see that the Tornado changes:

In this scenario, you wouldn't want to fuss over Cost Per Unit. It is now a lot more important to get on top of the market-facing factors: your assumptions about market growth rate, penetration, TAM, and price per unit. Your new summary statistics are:

Mean: $132M

Low: $19M (10% chance you will make less than this)

Med: $77M (50% chance you will be above or below this)

High: $307M (10% chance you will make more than this)

Notice that the high-end changed a little bit (by around $80M). The median did not change meaningfully. The low-end improved from -$45M to $19M. If you can figure out how to cap your cost/unit at $1, you would have de-risked the commodity pricing strategy but overall, you would have made it only slightly more valuable. The median and the mean have more of a difference now, and this is because there is more upside here if your market growth assumptions end up at the high-end.
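The value of an intervention like this cost cap is just the difference between two runs of the same simulation: one with the original cost range, one with the capped range. A sketch, again with a stand-in one-period model and illustrative numbers rather than the spreadsheet's:

```python
import random

def mean_npv(cost_high, trials=20_000, seed=11):
    """Mean NPV of a stand-in one-period model, with the cost factor's
    high-end (90th percentile) set to cost_high. Illustrative numbers only."""
    rng = random.Random(seed)  # same seed => paired draws across scenarios
    total = 0.0
    for _ in range(trials):
        # 30/40/30 weights on the low/med/high assessments.
        tam   = rng.choices((250, 500, 1000), weights=(0.3, 0.4, 0.3))[0]
        price = rng.choices((1.5, 2.0, 3.0), weights=(0.3, 0.4, 0.3))[0]
        cost  = rng.choices((0.2, 0.5, cost_high), weights=(0.3, 0.4, 0.3))[0]
        total += tam * (price - cost)
    return total / trials

uncapped = mean_npv(cost_high=2.0)  # original high-end cost
capped   = mean_npv(cost_high=1.0)  # product team guarantees cost <= $1
print(f"Mean value of capping cost at $1: {capped - uncapped:,.1f}")
```

Using the same seed for both runs pairs the draws, so the difference isolates the effect of the cap rather than simulation noise.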

Value Pricing

Now, let's try a value-pricing strategy. I am going to make up a story for a possible scenario. The TAM drops by half, as there are fewer takers for a value-priced product. Market growth is much slower in this segment. Penetration drops roughly by half, as customers are likely to be locked in to whatever they already have and there are higher barriers to entry. It is a rarefied market, and you will be the first one in, so we will raise the peak market share range to 40%-60% to reflect that. Price per unit will be in the range of $7 to $12 to reflect the value pricing. The cost per unit range will increase to $0.5-$2 to reflect higher reliability pressure. Also, your customers will tend to keep you in business longer, so the peak market duration range is 7 to 15 years. Load up the new set of inputs from the dropdown and click on "Load Inputs":

Here are the new set of inputs.

Value Pricing Inputs

| Factor | Units | Low | Med | High |
| --- | --- | --- | --- | --- |
| Total Addressable Market | M units | 125 | 250 | 500 |
| Market Growth Rate | % | 2% | 3% | 4% |
| Penetration | % | 5% | 10% | 20% |
| Peak Market Share | % | 40% | 50% | 60% |
| Ramp to Peak Market Share | years | 2 | 3 | 4 |
| Peak Market Duration | years | 7 | 10 | 15 |
| Price Per Unit | $ | 7 | 9 | 12 |
| Price change year-on-year | % | 1% | 2% | 3% |
| Cost Per Unit | $ | 0.5 | 1 | 2 |
| SG&A | % | 8% | 10% | 15% |

This results in a Tornado like so:

The first thing to note is that Market Growth Rate is no longer on top, and neither is Cost Per Unit. The top uncertainties are around our entry barrier assumptions (Penetration), TAM, how long the market lasts (Peak Market Duration), and Price Per Unit. This intuitively makes sense, now that we see it!

In this scenario, our Cumulative distribution looks like:

Summary Statistics:

Mean: $405M

Low: $107M (10% chance you will make less than this)

Med: $301M (50% chance you will be above or below this)

High: $828M (10% chance you will make more than this)

Notice that this is telling a rather interesting story. Almost every aspect of the economics looks much better here. The lowest end of this story is better than the median (and close to the mean) of the cost-controlled commodity strategy. This tells us that it is a vastly better strategy if we really believe our assumptions. Overall, on a probability-weighted mean basis, this pricing strategy is about 3x more valuable than the cost-controlled commodity strategy ($405M versus $132M).

Adding New Strategies

We have the knobs to build new strategies easily. Simply go to the "Pricing Strategies" sheet, name a new strategy and give it a starting set of values. Or, after naming it, go back to the Tornado sheet, load up the new strategy and play with the inputs. When you are happy, save the inputs to that strategy.

Debrief

You may have noticed that we just simulated a case study here. Debriefing what we've been able to do, we note that we can now make all of our assumptions transparent. My PM friend used a model like this and took a few steps. First, he sat with his Finance team and got their sign-off on the cashflow table structure. Every company does its accounting a little differently, so this is an important step. Once Finance is willing to sign off on your economic structure, you will feel more confident about proceeding. Remember, Finance cannot help you with the uncertainty analysis; they can only help with the deterministic structure of your cashflows.

Then, my PM friend worked hard on his assumptions. Each range was a conversation about what is defensible. He then set up a conversation with his Director and VP to go through the process. Having the model handy allowed him to change assumptions on the fly and make the work collaborative. I also helped him make his decision architecture clear in just four slides (that will be another post). On the big day, he presented and was asked several questions. He was able to handle all of them, describe the intuition behind his recommended pricing strategy, and tackle what-ifs, because he had done the heavy lifting. The result: his VP left the pricing decision in his hands and asked him to go ahead. When we debriefed the coaching session, he acknowledged that this intuition building would not have been possible without getting into the mathematics. He set the bar really high for his peer Product Managers, and I hope you can too.

The purpose of any decision model like this is to drive really good conversations between stakeholders. You know you are meeting that purpose if all stakeholders are learning something useful about the problem at hand, which here is how to price your product. If you find this useful, do share your experiences with it.

Acknowledgments

A lot of my uncertainty analysis learning around this conversation came from David Matheson's pioneering work at SmartOrg.