Episode 52

Published on: 21st Oct 2024

Going Deep into Marketing Mix Modelling and Incrementality - Pranav Piyush

Attribution remains one of the hardest challenges in marketing.

It affects literally EVERYTHING: how we’re perceived as a discipline, the strategies we pick, the activities we decide to do—even how we justify our existence.

B2B companies generally use some combination of first touch, last touch, or multi-touch attribution. They may apply those approaches very diligently and rigorously, but few stop to consider whether those methods are valid and sound.

How do we know whether attribution actually predicts anything? Are we just deluding ourselves? And if we tear down MTA, what do we put in its place?

The good news is that we may not need to reinvent the wheel. There are established methods for measuring correlation and causation between different variables in the real world. In today's conversation with Pranav Piyush—CEO of Paramark—we discuss how to apply marketing mix modelling and incrementality testing to understand the effectiveness of any channel or asset.

Thanks to Our Sponsor

Many thanks to the sponsor of this episode - Knak.

If you don't know them (you should), Knak is an amazing email and landing page builder that integrates directly with your marketing automation platform.

You set the brand guidelines and then give your users a building experience that’s slick, modern and beautiful. When they’re done, everything goes to your MAP at the push of a button.

What's more, it supports global teams, approval workflows, and it’s got your integrations. Click the link below to get a special offer just for my listeners.

Try Knak

About Today's Guest

Pranav Piyush is the Co-Founder and CEO of Paramark, a platform providing marketing measurement and forecasting for fast-growing businesses. Prior to Paramark, he held growth and marketing leadership roles at companies like Magento, Pilot.com, and BILL.

https://www.linkedin.com/in/pranavp/

Key Topics

  • [00:00] - Introduction
  • [01:45] - Why multi-touch attribution isn’t valid
  • [07:19] - Why MMM overcomes the limitations of MTA
  • [11:05] - Correlation vs. causation
  • [16:00] - Measuring the impact of content
  • [22:47] - How MMM works under the hood
  • [27:12] - Running an incrementality test without MMM
  • [32:26] - Qualitative insights
  • [37:39] - Incrementality deep dive
  • [43:07] - Brand
  • [49:30] - Paramark

Thanks to Our Sponsor

This November, MOps-Apalooza is back in sunny Anaheim, California, and it's going to be the marketing ops event of the year, packed with hands-on learning from real practitioners.

This is the only truly community-led tech-agnostic MOPS conference out there. It's got the best speakers, the best networking, the best social events, and maybe even a trip to Disneyland. This isn't your 50,000 person tech company conference. It's an intimate gathering of folks who are in the trenches every day.

Registration is capped at 700 attendees, and tickets are going fast.

MOps-Apalooza 2024


Learn More

Visit the RevOps FM Substack for our weekly newsletter:

Newsletter

Transcript
Justin Norris:

Welcome to RevOps FM. Attribution remains one of the hardest challenges in marketing. It affects literally everything: how we're perceived as a discipline, the strategies, the activities we decide to do, how we justify our existence, all of it. And at the end of the day, there's still so little rigor in most companies around attribution. Most teams I've worked with are doing something either very basic, like first or last touch, or using methodologies pushed by a vendor that are kind of just invented out of thin air, that aren't backed by a lot of evidence. I don't say that to be critical, because I've used those methodologies too. I've been a consultant for those methodologies. But I think there is a growing awareness, as we scratch a little bit beneath the surface, that a lot of the ways that we're doing attribution, and historically have done attribution, especially in B2B, don't have a very solid grounding.

So today we're joined by Pranav Piyush, the CEO of Paramark, which is a company that provides media mix modeling and incrementality testing software. We actually got connected through a LinkedIn discussion. I was posting about attribution, and he was very politely pushing back on some of the things that I was saying. So I thought it was a great opportunity to bring him onto the show and talk about what he's seeing. So, Pranav, thank you so much for joining us today.

Pranav Piyush: Thanks for having me. And I have to give you your flowers for being engaging, and really sort of engaging in discussion and debate, on LinkedIn. I think that's the way it should roll. So excited to chat live about it.

Justin Norris: Agreed. So, I mean, maybe let's just start. I've set the stage a little bit with some of the problems out there, but give us your view from 1,000 feet. What is the state of B2B attribution today? How is it going from your perspective in the industry?

Pranav Piyush: One of the things to be excited about is the higher level of conversation around the concepts of incrementality, and even just the concepts of correlation and causation. And that's really good. I'm glad that we're having that conversation. I'm glad that these types of podcasts are coming across. At the same time, there's almost a tale of two cities, or two worlds, where there's a whole spectrum of conversation that is either still stuck in the multi-touch attribution world or is inventing pseudoscience in the name of incrementality. And that gets me really upset, because I have this point of view that the word attribution has been maligned. It's actually a really beautiful word, but it's literally just been maligned, and something very similar is going to start happening with incrementality, because of a whole host of reasons that we can talk about. So I'm both excited and optimistic, but at the same time a little bit disappointed by some of the conversations going on.

Justin Norris: Yeah, I am seeing this play out exactly the way you've described. And I go back to ten years ago, or maybe even a little bit more, when Bizible first came on the scene, and it was like this amazing thing: you could track all these different touch points, and then you could choose all these different ways to divide them up. You take your opportunity credit and you just kind of slice it up like a pizza: this touch point gets a slice, and we do W-shape or equal weight or whatever. And we never really stopped to question, I guess, as marketers, and I'll take ownership for this myself: we didn't stop to question, is this justifiable? Does it make sense to do this? Does it actually produce a good outcome in terms of decision making if we do this? And now, I guess, there is a lot more pushback against this. Can you just walk us through that sort of historical multi-touch attribution methodology? Is it a valid way of looking at the world? And if not, why not?

Pranav Piyush: I'll give you three points of evidence that will hopefully help inform this conversation. The first one: if you look at a whole variety of channels, and this is arguably more biased towards larger brands, whether you're B2B or B2C, but if you look at channels like social, video, podcasts, like the one that we're talking about right now: these do not generate clicks or touches, by definition. You are going to exclude a significant part of marketing and media from your models if you rely on multi-touch attribution models to assess the impact of these types of channels. And some people will tell me, well, what about view-through attribution, right? Isn't that the solution? And I'm like, yes and no, because that brings me to the second point, which is: there is coincidence, there is correlation, and there is causation. The three C's. Everything that we are talking about in the MTA world is coincidence. There's a Latin term that I just recently came across: post hoc ergo propter hoc. All this means is "after this, therefore because of this." That's a logical fallacy, and a very popular one: just because something happened right prior to something else happening, we assume that there's a cause-and-effect relationship there, where there isn't even a correlation. So that's the second point. And the third is, when you add all the privacy changes that have happened in the last five years, people are realizing that even the touch-based data that they do have is incomplete, because guess what: 20 to 30 to 50 percent of people do not accept cookies. And most of your first-click, last-click data is coming through either cookies or UTM codes. And now there's increasing evidence that UTM codes are probably going to get stripped out from pretty much every browser. Safari is already doing that in multiple cases; Firefox is already doing that in multiple cases. So when you look at all three of those points, it's like: how can multi-touch attribution work? And that's how I generally think about this conversation.

Now, it doesn't mean that you shouldn't track clicks and touches. That's not what I'm saying. There are perfectly legitimate use cases for tracking a user journey. And that's how I think about that data: it's behavioral analytics. It's not attribution. Attribution is very simply cause and effect. If we're not talking about cause and effect, you can't call it attribution.

Justin Norris: So that is an important distinction to make. Tracking the touch points, understanding them: these are observable facts that we can detect. We're not saying it's the entirety of everything that happened. We're just saying that this happened, and we could track it. That can be useful. Taking it and then starting to dole out credit and saying, therefore this channel drove X million dollars in pipeline: that's a fallacy. And I think that makes sense. I've yet to see a strong argument against it; I think those three things that you mentioned are kind of devastating. So let's go there. I've created a nice foil now for MMM, and I've heard it said both "media mix modeling" and "marketing mix modeling"; I don't know which of those you prefer. Maybe just introduce us to that. Why is it different, and why does it not suffer from those same limitations?

Pranav Piyush: I don't have a preference. I think people can call it whatever they want; we invent so many new terms. What's interesting about MMM is that it predates all of us. It predates the Internet. This was actually invented, I believe, in the 70s or 80s by folks at P&G. This is folklore, and I don't think there is good attribution for it, but I think it was literally built by a partnership of academics at MIT and Harvard and practitioners at companies like P&G. And they had a hard job, right? Because you didn't have any touch and click data. So how do you know which of your ads are working or not working? And they were working with TV and radio and newspapers, these types of media channels. And so they had to invent something new.

So you had a bunch of these statisticians, who we now call data scientists, who in the 70s and 80s are like: hey, there's actually a way that we can model the data about readership and listenership and find correlations between the increase or decrease in readership and the sales of P&G products. So as you increase the number of ads in the Wall Street Journal (maybe that's a bad example), how many more sales can be attributed to the region that that newspaper is distributed in? And that was the beginning of MMM.

You look at time series data. So you're looking at it day by day: you have 10,000 people reading a newspaper, you have 15,000 people reading a newspaper, you have 20,000 people reading a newspaper. And as that trend goes up and to the right, do you have a corresponding increase in your sales, and vice versa? When you decrease the distribution of ads through newspapers, do you see a decline in your sales? And when you do that across multiple channels at the same time, you can build very sophisticated models. And these were all being done through spreadsheets and manual work back in the 80s. Now it's been digitized, and we can talk about what that's been like.

So when you think of it that way, let's talk about all three issues. First, you no longer have to sacrifice channels. This can work for newspapers. It can work for Google. It can work for podcasts. It can work for pretty much any channel out there. Second, you are no longer making an assumption about one thing happening and therefore the second thing happening. You're actually looking at the statistical correlation between a quantity increasing and its impact on your sales increasing or not. So you can imagine, if those two numbers are moving independently, there's no correlation; but if both numbers are moving together, there is a correlation. So that's the second piece. And third, because you are not actually tracking individual users, you're doing your analysis on aggregated data, you have no privacy concerns. We're not trying to spy on a certain user and say, did you click or touch on this particular piece of advertising? We're saying: as overall numbers have increased in terms of impressions or reach or frequency, have your sales numbers increased? It's a very different approach, and completely privacy-friendly and future-proof.
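[Editor's note: to make the time-series intuition concrete, here is a minimal sketch with invented weekly numbers (not from the episode) of the "do the two series move together?" check that early MMM work formalized:]

```python
import numpy as np

# Hypothetical weekly data: ad impressions per week and sales per week.
impressions = np.array([10_000, 15_000, 20_000, 12_000, 18_000, 25_000, 9_000, 22_000])
sales       = np.array([   410,    520,    640,    450,    600,    760,    380,    700])

# Pearson correlation between the two series: near +1 means they rise and
# fall together; near 0 means they move independently.
r = np.corrcoef(impressions, sales)[0, 1]
print(f"correlation: {r:.2f}")
```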

Justin Norris: That makes total sense to me, and let me ask you a question. I'm going to try to poke a hole, not out of skepticism, but because I would just like to hear how you field it. And this is probably coming from a place of ignorance, because I am not a statistician or a mathematician. So we've solved the post hoc fallacy, because we've seen that two things trend in a similar direction: let's say placing ads in the Wall Street Journal, like you said, and sales. What it doesn't solve for is, let's say the reason why we decided to place more ads was an economic forecast: consumer confidence is high, we predict there'll be more demand, so we're going to place some more ads. We have correlation, but we don't necessarily have causation, because the increase in spending could have been caused by that. How do you resolve that problem in MMM?

Pranav Piyush: There is a certain amount of correlation that you will not be able to convert into causation through MMMs. MMMs are not causal. I'm probably the only MMM vendor that will say that out loud and be willing to stand by it. So just because I'm talking about MMM being better than MTA doesn't mean that I'm saying MMM is the end-all-be-all and gives you perfect causality in your models. It doesn't. It is a correlation. It's an estimate of causality, but it is not causality. To get to causality in marketing, you have just one option, and that is experimentation. By the way, that is also the only known way of getting to causality in any other field.

Justin Norris: I was just thinking that it sounds a lot like medical testing: you can do long-term studies and establish correlation, but if you want causation, you have to do a controlled trial.

Pranav Piyush: Precisely. So, randomized controlled trials: RCTs. Everything in all other fields of science rests on the shoulders of RCTs, and there is no difference in marketing. If you want to get to causality, you have to run experiments. So one of the things that we talk about at Paramark all the time is that the purpose of your marketing mix models, media mix models, attribution models, is not to establish causality. It is to understand the hints of causality; it's estimates of causality. And if you really want to get precise about causality, that should inform a series of experiments that you are running. And we can talk about how experiments can be run in media. It's harder than running an A/B test on your website, because you don't control the surface, right? You don't control TV, you don't control radio, you don't control Meta. But there are reasonable ways of doing experimentation on those channels that give you a sense of causality. And I can talk about that, if you think this is a good time to jump into it.

Justin Norris: Yeah, let's go there. Let's say a podcast. Here's a perfect example; obviously this is my own podcast. Let's say a company's looking to run a podcast. How should we understand, causally, if it's affecting sales or not?

Pranav Piyush: That's an interesting one. Podcast advertising is far easier to test for causality, right? Because if you're advertising on Spotify or on YouTube or any other channel where you have podcast advertising, almost all those platforms are going to give you geotargeting capabilities. So if you're doing testing on podcast ads, the way to construct your test and control groups is by geography. You'll tell Spotify: hey, target my ads just to (I'm just making this up) San Francisco and Miami, and let's keep New York and Austin as the control. And you're going to monitor your conversions from those locations over the next six weeks to see if there was a statistically valid increase in your conversions as a result of you running those podcast ads. That is pretty much the blueprint for any channel. The hardest parts about this are identifying the right geographies, identifying the time frame, and identifying the amount of budget that you have to dedicate to be able to see an effect. That's pretty simple.

Podcasts by themselves? I think of them as an asset, not a channel. So let's talk about this. You and I are recording this podcast. Well, how does it make it out to people? There's only one way: you have to distribute it. So the real question is, how are you distributing it, and does that distribution plan have a positive effect on your pipeline? Whether that's splicing it up on social, whether that's cutting it up into social ads, whether it's through email to my list of email subscribers: that's where you get into testing. Not whether me producing a podcast has an effect or not. It's like asking about an ebook you created: that by itself doesn't matter. What really matters is, how did you deliver your ebook to your audience? And you can test that.
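[Editor's note: as a concrete, hypothetical illustration of the test/control blueprint Pranav outlines, here is a minimal sketch of comparing conversion rates between test and control geographies with a two-proportion z-test; the counts and the six-week window are invented:]

```python
import math

def two_proportion_z(conv_test, n_test, conv_ctrl, n_ctrl):
    """Z-score for the difference in conversion rates between two geo groups."""
    p_test, p_ctrl = conv_test / n_test, conv_ctrl / n_ctrl
    p_pool = (conv_test + conv_ctrl) / (n_test + n_ctrl)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_ctrl))
    return (p_test - p_ctrl) / se

# Hypothetical six-week totals: visitors and demo conversions per group.
# Test = geos where the podcast ads ran; control = geos held out.
z = two_proportion_z(conv_test=180, n_test=12_000, conv_ctrl=130, n_ctrl=11_500)
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly "statistically valid" at the 5% level
```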

Justin Norris: If a tree falls in the forest and no one hears it, does it make a sound, kind of thing. But let's get into content, because this is a huge question. I work with a content team, and I've worked with many content teams over time, and everybody wants to know: is this working? Is it helping? And I saw a post, I think it was Dale Harrison, who also posts a lot on these topics; maybe it was on a post of yours. It might have been around incrementality: seeing a blog versus not seeing a blog. And it's like, well, of course, the people that are reading your blog are people that are already more likely to buy. And I can see that argument. But then you're also saying, well, we're investing a ton in content: hundreds of thousands, even millions in a bigger company. There has to be some way to tell, aside from the distribution channel, whether that content is actually valuable, whether it influences people's decisions in any way. How would you go about trying to do that?

Pranav Piyush: So first off, I would say, if you think about any creative production: at the end of the day, content is at the heart of what we do as marketers, right? If you don't have content, you have nothing; you're distributing nothing. So the idea that you have to justify your investments in content comes from a place of severe anxiety and under-confidence in you as a marketer. That's the first thing I'm going to say: that's the wrong conversation to be having. Now, you might still get forced into having that conversation, because you have a finance team and you have a CEO who maybe don't understand that, and you're having to talk about the justifications for why you're investing hundreds of thousands of dollars in content. I get it. I understand where that's coming from.

I would flip the script. And flipping the script means: if you can explain that a hundred percent of your marketing budget, say you're spending 5 million a year, is bringing in an incremental 50 million in pipeline, then nobody cares how you split the 5 million between content and distribution. The problem and the question arise because you don't have good math on your side about how much incremental pipe has been generated as a result of that 5 million in marketing. And I don't care if it's brand, performance, content, creative, paid, earned, owned: it doesn't matter. So that's my answer to that question. Trying to pinpoint the efficacy of every single piece of content is a losing proposition.

If you had to do it, I would do it based on the engagement metrics of that content. What do I mean? I had a conversation with Ashley Faust from Atlassian, and I pitched this concept to her. I was like: look at the total volume of consumption of a piece of content. So if you're talking about a blog post, the total minutes that have ever been spent reading that blog post. If it's a video, the total view time. Guess what? If you go and talk to YouTube creators, that's what they're going to talk about: the lifetime viewership of their content. They're going to look at the percentage of people who make it to the end of the video, right? That's the video retention rate. So you look at the consumption metrics of the pieces of content to understand if the content is good enough or not. But again, you still have to figure out how you're going to distribute the content, and that's a different metric altogether.
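[Editor's note: the consumption metrics Pranav mentions reduce to simple aggregates over a view log. A hypothetical sketch (the log and field names are invented):]

```python
# Hypothetical per-view log for one 300-second video: seconds watched per viewer.
VIDEO_LENGTH = 300
watch_seconds = [300, 45, 300, 120, 300, 10, 290, 300]

total_view_time = sum(watch_seconds) / 3600  # lifetime hours watched
# Share of viewers who made it to the end (Pranav's "retention rate").
retention_rate = sum(1 for s in watch_seconds if s >= VIDEO_LENGTH) / len(watch_seconds)

print(f"lifetime view time: {total_view_time:.2f} h")
print(f"viewers reaching the end: {retention_rate:.0%}")
```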

Justin Norris: That makes sense to me, and I think I agree with you, but I want to drill down on it one more level, just because this really is a conversation we have all the time, and I'm sure you have a similar experience. Part of it, you're quite right, is the insecurity I think we've developed as marketers in the face of these purely short-term, activation- and performance-driven metrics, and the way that executive teams ask questions of marketers. So there's that. But then I think there's also: you want to know sometimes, is my content any good? Like, I put out podcast episodes. I want to understand, are these episodes good? Are the ones with the highest viewership the best episodes? Or is it, you know, there are lots of factors that come into it. So, to give a tangible example to test what you're saying: we have a blog post at my company. It's a piece of content, very top of funnel, very general, not really product-related. So of course it gets a ton of traffic. It's about onboarding or something like that, something that's relevant to everyone. So it reaches a lot of people, but there's no guarantee that the people who are consuming that content have any sort of product-related intent, are ever going to buy anything, or connect that content back to our product. So I totally buy into the engagement metrics, and that is the same feedback I give to my team. And yet I struggle sometimes: it could be engaging, but is it engaging the right people, who eventually will lead to the outcomes that you want? Like, how do you control for that?

Pranav Piyush: You know, it's the same topic as the question of a marketing qualified lead. This is the exact same conversation. And if you think about the word "qualified": who are we to qualify our prospects? They are qualifying us on whether we solve their need or not. So that, to me, is not a measurement question; that's a strategy question of why you are putting that content out in the first place. And if you're putting the content out in the first place to, you know, boost your engagement metrics, then obviously it's going to attract the wrong traffic. You see what I'm saying? Because your strategy is focused on juicing a number rather than serving your audience. If your strategy were to serve your audience, you would immediately think about a different way of measuring it, or your measurement would be more real. So I think we conflate different things when we talk about content strategy. I don't care for the qualified metric. I don't care for "qualified" as a term. The reason we don't do any of that at Paramark is because it's very clear who we serve, and if you don't find that out on your first visit to Paramark, we did something wrong. So it doesn't matter what engagement metrics I get from that number. And again, it goes back to having that conversation with your leadership. It's like: "No, no, no, I want to increase my organic visits to the website by 20 percent," and you're making shit up to do that. But why are we doing that? Why is 20 percent the right number?

Justin Norris: Basically, content engagement is a useful proxy for how valuable the content is. But if you're putting out content about, like, how to pick the winning lottery numbers, you shouldn't necessarily expect more people to buy your contract management software. It just doesn't work that way. That's logical, and reframing it as a strategy question rather than a measurement question, I think, is interesting. So, digging a little bit into this, and I don't expect you to unpack complex math here in a conversation, but how does MMM actually work under the hood? What are the inputs? What are the outputs? And then how are teams, how are your customers, actually making decisions using this information?

Pranav Piyush: Yeah, there's a lot there. No, I think we should talk about it. The best framing I have found is to visualize the question of marketing measurement as a formula. On the right-hand side, you've got your success metric: might be pipeline, might be sales, might be orders, might be leads, whatever the metric is that you're optimizing towards. And on the left-hand side, you've got every single marketing channel, with a multiplier on each channel that represents the strength of the correlation. And you have a few other variables representing seasonality, representing the organic trend of your business, representing other factors that may be outside of your marketing team's control. When you sum up all of those things on the left-hand side, you're trying to predict the right-hand side. That's what's happening under the hood. And I'm not even kidding: if you actually go look up the academic work, that's literally what's happening under the hood. So it's just a way to solve that formula, and you're applying a whole bunch of machine learning to predict, or to simulate, based on all the data that you have, the answer to that equation.
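[Editor's note: in symbols, that formula is roughly sales(t) = baseline + trend(t) + seasonality(t) + sum over channels of beta_i * spend_i(t). A minimal sketch of fitting those multipliers with ordinary least squares, using invented data (an editor's illustration, not Paramark's model):]

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 104  # two years of weekly data

# Hypothetical weekly spend per channel.
search = rng.uniform(10, 50, weeks)
social = rng.uniform(5, 30, weeks)
podcast = rng.uniform(0, 20, weeks)

trend = np.arange(weeks) * 0.3                            # organic growth
season = 10 * np.sin(2 * np.pi * np.arange(weeks) / 52)   # yearly cycle

# Simulated "true" sales: baseline + trend + seasonality + channel effects + noise.
sales = (100 + trend + season
         + 1.5 * search + 0.8 * social + 2.0 * podcast
         + rng.normal(0, 5, weeks))

# Design matrix: intercept (baseline), trend, seasonality, and the channels.
X = np.column_stack([np.ones(weeks), trend, season, search, social, podcast])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

for name, c in zip(["baseline", "trend", "season", "search", "social", "podcast"], coef):
    print(f"{name:>8}: {c:6.2f}")  # channel multipliers recovered near 1.5, 0.8, 2.0
```

Production MMMs layer carryover (adstock) and saturation transforms onto the channel terms and usually fit them with Bayesian methods, but the core structure is this same weighted sum.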

Pranav Piyush: So how do customers make use of this? When you get output out of an MMM, you essentially get a few different interesting tidbits of information. You understand the percentage of your success metric that came from a certain channel, or that can be attributed back to a certain channel; and again, this is an estimate. You understand the cost, obviously, of acquiring an incremental conversion from that channel. You get that across every single channel. And you also get what is known as a baseline. The baseline is essentially your organic demand: your word of mouth, your brand equity. These are all things that were not driven by marketing or sales in the short term; these are longer-term things that are happening in your business. And then you also get a sense of what the future might look like if you were to invest more in every single channel. You can think of it as a forecast: if I put another 10,000 into Meta, what will that do to my success metric? And you can imagine all of this being shown to you on a monthly or a weekly basis, depending on how often you are refreshing your analysis. So that's what you get.

So now you have a very rich understanding of: as you have increased or decreased your spend and your strategies in different channels, what has that done in terms of contribution to your metric? And if you invest more, what is the likelihood of that increasing even further or not? Our customers will use that to inform a series of experiments, where you take the most efficient channels and you figure out: oh, if this channel is looking so good, can I just dump another million dollars here? What is the point of diminishing returns? So you run an experiment, an actual incrementality test, to test that hypothesis, and that becomes an ongoing set of experiments that you're constantly running every month, every quarter, every year. And that's essentially your marketing roadmap. For other channels where it's highly inefficient, where you've spent a lot of money but it's not statistically correlated with your success metric, you might have a different hypothesis. Maybe we need to pull back on spending, or maybe we need to completely change the creative execution in that channel. Maybe static ads are not good; maybe we need video ads, maybe thought leadership ads. I'm just making this stuff up. So the question is not necessarily to cut spend; the question is to find the winning channel combination that'll help you extract even more growth. And the only way to do that is to constantly be experimenting. So, summarizing: when you run an MMM, you get a whole bunch of output. Think of those outputs as informing your experimentation roadmap, and then go out and experiment on a monthly cadence.

Justin Norris: I want to steel-man your case here, because on many levels I would love for this to be the answer. Not that I have a horse in this race, but the status quo is so bad for marketing attribution that it would be wonderful if this was the solution. But it feels like a big-company thing. It feels like: yeah, all right, if I've got millions of dollars to play around with, an extra million dollars in spend for an experiment, sure. But if I'm a smaller company, or even a scale-up-stage company: 400 people, 50 million ARR (I'm picking a fake number, it's not the real number), let's say a 5 million or even a 10 million marketing budget. You don't necessarily have that sort of wiggle room. So what do you do?

Pranav Piyush: First off, you're absolutely right that if you are just spending 100,000 a year, all of the stuff that I said is way too complicated for you, and I do not recommend it. This is not the go-to methodology for seed-stage startups, or even Series A startups, or, you know, your mom-and-pop store around the corner. That's not it. This is meant for when you have lots of channels and a lot of spend to optimize. That's the reality. Now, what is the threshold? Probably somewhere around a million or two, where you start to see that transition from mostly one channel to many channels. And this number will be different for different types of businesses, right? You could have an e-commerce store where all you do is Facebook. That's it. That's your only distribution method, and you have 10 million in spend on Facebook, and you don't need to do attribution modeling, because that's the only channel you have. There is nothing else. So I'm giving you some lines here, but they're not precise lines.

Now, having said that: Paramark doesn't do advertising just yet. We are going to start in Q4, maybe in Q1, and the fundamental way that we're going to do this is through an incrementality test. So here's the fun fact: you don't need to have an MMM to be able to do incrementality testing. You can run an incrementality test, a geo test. So how does one do it? If I'm spending for the first time ever, I can look at all of my traffic today and where that traffic comes from by geography. I don't need any privacy-invading software to do that; that's just IP address matching with geolocation. Now I can see that, oh, 30 percent of my traffic is coming from California, and the remaining is split across these five states: New York, Texas, Washington, whatever. I'm going to run this campaign just in Texas and see: does my traffic go up, do my demos go up, does my pipeline go up, based on the location of the people who are entering the funnel? That's it. I don't need any complicated software to do this. So the interesting thing is, everything that I just talked about, you don't need software to do it on your own at a smaller scale. You can do it yourself. You need basic math skills and a basic understanding of how to do geotargeting in all the ad platforms. So that's my answer for smaller-stage companies.
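[Editor's note: a minimal sketch of that DIY analysis (state names and numbers invented): compare the test state's growth against the untouched states, pre-campaign versus during-campaign, a simple difference-in-differences:]

```python
# Weekly demo counts by state, aggregated from your own funnel data.
# "pre" = 6 weeks before the campaign, "during" = the 6 campaign weeks.
pre =    {"TX": 40, "NY": 55, "WA": 30, "CA": 120}   # weekly averages
during = {"TX": 52, "NY": 56, "WA": 29, "CA": 123}

test_state = "TX"                                   # campaign ran only here
controls = [s for s in pre if s != test_state]

# Growth in the untouched states estimates what TX would have done anyway.
control_growth = sum(during[s] for s in controls) / sum(pre[s] for s in controls)

expected_tx = pre[test_state] * control_growth      # counterfactual TX
lift = during[test_state] - expected_tx             # incremental demos per week

print(f"control growth: {control_growth:.2%}")
print(f"expected TX without ads: {expected_tx:.1f}, actual: {during[test_state]}")
print(f"estimated incremental demos per week: {lift:.1f}")
```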

Justin Norris: That's really interesting. Is there anything to be said about confounding variables? As we record this, you know, there are terrible events with Hurricane Helene on the East Coast of the U.S. That's going to affect demand to some degree for some period of time. How do you adjust for that in these models?

Pranav Piyush: Listen, I think for smaller businesses, when you're just starting, you literally will know what's happening in your test market. You're not going to set it and then not look at the news for the next six weeks, right? So you can always kind of reset if you run into any issues that have a potentially negative effect on your test. This happens all the time. Even in A/B testing: you launch an A/B test, and, oh shit, there's a bug in our test version. Okay, we're going to have to fix that bug and then relaunch the test. So that's a very acceptable answer to that question.

It gets a little bit more interesting when you're at a large scale. When you're at a large scale, you may not have your eyes on every single DMA and every single state and location. That's impossible. And so the art is in constructing your test and control groups in a way that minimizes the confounding variables. So you're not just looking at one DMA, one city; you're looking at a collection of test states. You're grouping them together in a way that will sometimes average out the noise that might come from location-specific variables. Even having said that, if you were doing this test in March of 2020, I would probably not look at the results of that test. So the point that I'm making is that you always have to use judgment. That's why us humans will have jobs for a very, very long time: you're applying human judgment on top of the data, and not letting the data do the judgment for you. And that's how I think about confounding variables: you have to have a good hypothesis about what else could have happened that impacted a result that doesn't pass your intuition.

Justin Norris: That makes a lot of sense. I want to bring it back to the original post that got us chatting on this topic. I was sharing what was kind of a history of one opportunity. One opportunity: look at all these touch points; this is interesting. And I can't even remember exactly what I was saying about it, but I think I was saying, you know, this isn't representative, but it's still kind of interesting; it's useful as a communication tool for sales. It's one opportunity; it's not statistically valid. Now, let me ask you: am I deluding myself when I look at that and think, oh, this is interesting? Is it just irrelevant? Should we ignore it? What's your take on these very local datasets, and are they worth looking at?

Pranav Piyush: It's a great question. Humans are interesting creatures, right? We're visual in nature. We like to see things. We like to be able to touch things. We like to be able to feel things. And there's a natural tendency for us to do the same thing when it comes to analytics. If it's not on a chart, it's really hard for us to visualize, right? Which is why MMM has struggled so much: it's such a probabilistic, statistical sort of thing that people's intuitive mental models don't map to that way of thinking. So that's what's going on in your brain when you see it laid out very neatly on a chart: this thing happened, this thing happened, this thing happened, this thing happened. And it gives you a sense of comfort: I know that this is what happened.

So, are you deluding yourself? Maybe that's a strong word. What I would say is, I don't think it adds anything to your reality. Think about the additional information that you got out of it that you didn't already know. My way of thinking about this is: if you are looking at that data, you're doing it to understand the behavioral journey of somebody. And that's literally, "this is what I did, and then this is what I did, and then this is what I did." If you view it from that lens, it's perfectly reasonable: hey, we understand our customer's journey through the buying process; here are the typical things involved in the behavioral journey. But does that mean that there is cause and effect? Probably not. And those two things are very distinct things. You have to just have an intellectually honest conversation about what you are actually looking at.

Justin Norris: I think that's reasonable, the way you put it about how people process information. William Carlos Williams, a modernist poet from the United States, had this famous expression: "no ideas but in things." And I often think about that, because I actually have a lot of trouble dealing with mathematical abstractions. I really find that my insights and my understanding come from concrete particulars. So would it be valid to say: I'm looking at this journey, and obviously there's nothing even correlative about one opportunity's journey, let alone causative, but I say, oh look, before they bought, all these people started attending these workshops we were having; maybe there's something to that. And, like what you said for MMM, it could inform a larger-scale experiment. Is that an okay way, in your opinion, to look at it?

Pranav Piyush: Absolutely. And I would also look at the individual things that are happening in that journey, right? So, the one that you described: workshops. Workshops, webinars, events: these are real things that are happening in the world, as opposed to "I sent an email to my entire email database" and email showed up as a touch point in that journey. So you have to apply human judgment when you're evaluating these journeys: what is really going on? And I say this to many marketers: if you're not talking to your audience on a daily basis, what are you even doing? Right? You can't call yourself a marketer. So the other part of this is: let's break out of our charts and visuals, and let's go talk to actual human beings. That's going to tell you a lot more about whether it was the email or the workshop that got them excited to engage with your buying process.

Justin Norris: So even those conversations are another example of something that doesn't really scale mathematically, but is very rich in terms of giving ideas. It feeds the sort of intuitive, emotional side of your brain, you could say, for lack of a better word.

Pranav Piyush: Totally. And I go back to the P&G example, right? Tying it all the way back: MMM started with this partnership between operators and academics at P&G. Guess what? P&G was also a pioneer in how you do customer research. They were spending time in people's homes, understanding how they used P&G products, and they still, to this day, have a huge team of people who just does that. So you can have both: methodologies to understand the impact of your products on your customers' lives and how they make decisions at an individual level, where you really get to understand them and talk to them and observe them and their reality; and also looking at the aggregate impact of your marketing strategies on buying behaviors and your sales metrics. It's not an either-or; it's a both. So, great line of questioning.

Justin Norris: So, turning to incrementality again: you've mentioned it a few times. I see some vendors using incrementality, probably in ways that you would disapprove of. I'm sure that I've used incrementality, probably in ways that you would disapprove of, in terms of communicating about it internally. The example I wrote down, which I think was maybe the same example from your blog post: you show two charts, people that viewed blog content and people that didn't, and look, the people that viewed blog content have a greater likelihood of converting. That kind of seems reasonable. It's better than multi-touch; we're not just saying, because they viewed a blog, we're giving it some credit, which is probably the most reductive. We're at least dividing them into groups. Why is this not useful, let's say, not valid?

Pranav Piyush: Yeah, it's a great question. In that analysis, you are comparing two cohorts, two groups. Both groups had the option of viewing the content or not viewing the content; one group chose to view the content, one group chose not to. So these are very different groups in themselves, and therein lies the challenge of claiming incrementality. If you take a step back and think about the concept of incrementality testing, right, we talked about test and control groups for RCTs: your groups have to be identical. So in an RCT world, in an incrementality testing world, you would have a test group made up of both people who are viewing the blog and people who are not, and a control group where there is no blog, because you're testing the effect of the existence of the blog in getting to the conversion.

Justin Norris: Not just whether they chose to do it, because they've already self-selected into it. It's like...

Pranav Piyush: Selection bias is exactly the term; that's precisely it. Anytime you have a question of incrementality, the immediate question you have to ask is: what is the control? And the control has to be the non-existence of the marketing or the media or the asset that is in the test bucket. Otherwise, it wasn't a causal analysis at all.
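[Editor's note: a quick simulation makes the selection-bias point vivid. In this invented example the blog has zero true effect, yet the "viewed blog vs. didn't" comparison still shows a big lift, purely because high-intent visitors are more likely to read the blog:]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hidden buying intent drives BOTH blog reading and conversion.
intent = rng.uniform(0, 1, n)
viewed_blog = rng.random(n) < 0.1 + 0.5 * intent   # high intent -> more reading
converted = rng.random(n) < 0.01 + 0.05 * intent   # blog has NO causal effect

rate_viewers = converted[viewed_blog].mean()
rate_nonviewers = converted[~viewed_blog].mean()
print(f"viewers: {rate_viewers:.3%}, non-viewers: {rate_nonviewers:.3%}")
# Viewers convert noticeably more even though the blog caused nothing:
# the comparison measures who the readers were, not what the blog did.
```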

Justin Norris: And still on the subject of incrementality: I saw something on LinkedIn the other day which I actually found disturbing on some level, so I'd like you to comment on it. It was the comment that, let's say you bid on AdWords and you acquire a customer that way. We're not saying that AdWords was the only thing responsible for that, but at least, by common sense, you would feel: I'm going to give AdWords a little bit of credit. And then someone in the comments (again, I can't remember who) is like: actually, there could have been no incrementality whatsoever from doing that, meaning no sense in which spending that money gave you some sort of incremental lift that you weren't otherwise going to get. And that's disturbing, I think, because if you can't rely on the fact that this person came in, that we have a trackable touch point here from this channel, if we can't rely on that to say that at least we're getting something from this, what can we rely on? What can we trust? It undermines a lot of ways of thinking that I think are very standard, very accepted. Tell us about incrementality from that point of view.

Pranav Piyush: Yeah, so I'm going to play it back to you, right? The example is: somebody bids on a keyword in AdWords, somebody clicks on that ad, comes and converts on your website. Is that incremental? Is it not incremental?

Justin Norris: Yeah.

Pranav Piyush: I think the devil's in the details a little bit. I'll give you two examples, and I can prove incrementality, or the lack thereof, in both examples, okay? If that AdWords purchase was for a branded keyword, and nobody else was bidding on that keyword, it's actually probably not incremental at all, right? Because there's no competition for that keyword, and if you hadn't bid on it, your organic search result would probably have taken the top spot. Now, here's the caveat: maybe your organic is awful, because you're new and you haven't done anything, and you know that your organic result is on page three. Then obviously that AdWords click is incremental; there's no way somebody is going to page three. So the devil's in the details. Unfortunately, you can't have this conversation in a LinkedIn post without all the additional context.

Now let's take another example, right? If it's a non-branded search keyword, a long-tail keyword, nothing to do with your brand: I would argue it's very hard for that bucket not to be incremental. Again, the same conversation would apply, right? For it to have been non-incremental means that that person would have done a non-branded search, found your organic page, clicked through, and then bought something. So your organic has to be A-plus, and generally speaking, the chances of that happening are pretty low. So you have to view these things through a subjective lens.

Now, here's my argument: run it as an incrementality test, and you will know the answer. It's not that hard. If you limit your AdWords buy to a certain geography, you can literally compare and contrast a test geography and a control geography, and you can see the incrementality of whatever strategy you're employing, if you're in doubt. If you're not in doubt, and you can have a reasonable conversation like this, you should be able to get to the answer pretty quickly.

Justin Norris: That answer seems commonsensical, at least, and I think, in part, I'm probably being unfair, as I'm asking you to justify somebody else's comment, which doesn't make a lot of sense. But we were chatting a little bit before the show about how MMM and related concepts are becoming much more mainstream. I suppose they already were: mainstream for decades, as you were saying, but sort of unknown, running along this parallel track to the world of B2B SaaS, and all of a sudden they start poking in. And even over the last six months, I see more and more other podcasts, more and more people popping up, more vendors. So it's a really interesting time. And in a lot of these discussions, MMM and the notion of brand seem to be fellow travelers. I don't suggest they're the same thing, but on the one hand we have multi-touch attribution and trackable short-term activation; on the other hand, we have brand, the long term, and MMM. And so that's why I sort of correlate those things together. And I think it's a really interesting conversation, because for the longest time brand spend was either derided or people were afraid to do it, because you couldn't justify it. And now there are a lot of people coming out, or at least a small group of vocal people, saying unapologetically: brand is super important; we really need to do this; people are forming the consideration set before they even search. And maybe that's where that notion comes from: is the search even incremental if they've already decided who they want to buy from? So maybe just talk about brand. How do you view brand, and how does it relate to the MMM discussion?

Pranav Piyush: Brand is another one of those words that has been maligned a lot. And the first conversation I have about brand is: what is brand, what is branding, and what is brand marketing? These are three completely different things, and we conflate them. So when people ask me about brand, it's: which version of brand are you talking about? Are you talking about brand as in the concept of brand, which to me is the perception that an audience of people has about your product or service? It's merely a perception. Or are you talking about brand marketing, which again I hate as a term, because the way people think about brand marketing is: oh, it's just harder-to-measure marketing, so I'll call it brand marketing. To me, there is no such thing as brand marketing. Every piece of marketing performs. You just didn't have a way of measuring it in the past. What MMMs and incrementality testing have done is make it possible for you to measure any type of marketing, whether you call it brand marketing or performance marketing or what have you. It doesn't matter to me if it's a billboard, if it's TV, if it's radio, if it's a feel-good, hilarious ad on YouTube. Whatever it is can be measured through incrementality testing or MMM.

Now, if you flip the script and say, oh, but I want to understand the impact of this type of marketing on my brand perception, that's a very different thing, because you're talking about peeking into somebody's brain and quantifying their perception. This is an incredibly hard thing to do, and it's almost not worth doing. So until Elon Musk commercializes Neuralink and we all have direct access to each other's brains, I would recommend not trying to measure brand perception. There are better things to measure than that to understand the impact on the long term. So, again, not a satisfying answer, but if you're talking about brand marketing, it's a myth that you can't measure it. You can absolutely measure it. If you're talking about brand perception or brand awareness, it gets a little bit more finicky, and I think it's very, very hard to do well.
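[Editor's note: for readers curious what "measuring it through MMM" means mechanically, below is a minimal, illustrative regression of the kind that sits at the heart of a media mix model. The channel names, numbers, and fixed adstock decay are assumptions for demonstration, not Paramark's method; production MMMs add saturation curves, seasonality, and uncertainty estimates.]

```python
import numpy as np

def adstock(spend, decay=0.5):
    """Carry a fraction of each week's spend into later weeks,
    modelling the lagged effect of advertising."""
    out = np.zeros(len(spend))
    carry = 0.0
    for t, s in enumerate(spend):
        carry = s + decay * carry
        out[t] = carry
    return out

rng = np.random.default_rng(0)
weeks = 104

# Hypothetical weekly spend for a "performance" and a "brand" channel.
search_spend = rng.uniform(10, 50, weeks)
tv_spend = rng.uniform(0, 80, weeks)

# Simulated weekly revenue: baseline + channel effects + noise.
revenue = (200
           + 2.0 * adstock(search_spend)
           + 0.8 * adstock(tv_spend)
           + rng.normal(0, 10, weeks))

# Fit revenue ~ baseline + adstocked spends by ordinary least squares.
X = np.column_stack([np.ones(weeks), adstock(search_spend), adstock(tv_spend)])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)

print(f"baseline={coef[0]:.1f}  search={coef[1]:.2f}  tv={coef[2]:.2f}")
```

The point the sketch illustrates is Pranav's: nothing here needs click-level tracking, so a billboard or a TV flight is just another spend column.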

Justin Norris: On the subject of brand, based on your experience in this field and working with your clients: as a marketer (you know, I'm in operations, I work adjacent to marketers), you feel trapped. You feel like, all right, AdWords is safe; directly attributable, direct-response channels are safe; I can justify them. As soon as I go into brand, it's very, very difficult; I don't have an MMM today. And yet if you don't invest in brand, what you find is everyone's going to fish at the same spots. There's the, whatever, 5 percent of people that are in market, everyone's competing and throwing money at them, and you can't scale those channels. All of a sudden it's, all right, we're going to increase your targets by 20 percent, and here's another 20 percent of budget. You know, I can't turn this tap open any wider; there's no more water coming out. So I believe strongly you do need to do something, whether you call it brand marketing or demand creation or whatever the terms are. I think they're all referring to a similar thing: trying to reach people that aren't in market today and get in their heads, so that they want to come to you when they are in market. How do you perceive the value of that? Do you have any data that suggests that, yes, this really is a very important thing to do?

Pranav Piyush: A hundred percent. I may not like the words brand or brand marketing, but the concept is A-plus. You have to be able to talk to people who are not actively searching for a thing in your category, because to your point, that's three or five percent of the market. For certain SaaS categories, it's even lower. So for the remaining 95 to 97 percent, what are you going to do? Wait until they get into market, when they've already sort of decided, and play from behind the starting line? Or do you want to be 10 or 20 percent ahead of the starting line? And that's the perfect way to talk about brand, right? You could be starting from ahead of the starting line if you spent enough time and energy creating that perception.

So I think it's incredibly important, and I think more people need to figure out a way to measure those things. It's not that hard, frankly. The one thing I would say is that if you can get a great analyst on your team who is not being pulled in different directions at the same time, it can be a game changer. Or, there are so many consultants and vendors out there in the ecosystem, wink, wink, that you should really have somebody who can be an extension of your marketing analyst or marketing operations team, giving you that firepower to go test new channels that you don't have the internal infrastructure to do yourself. And if you can make the case to go build that internal infrastructure, hell yes, go for it. But I find that's a much harder hill to climb than getting an external solution, just because hiring people is significantly harder in this new environment.

Justin Norris: There are a lot of different vendors popping up in the space right now, and you're one of them. Tell us about your vision for Paramark. Why is it different? What are you doing that's special in this field?

Pranav Piyush: I'd say three things. One is, I'm a former VP of marketing myself. I've been in the hot seat. I've had to defend budgets. I've had to present to the board and report into the CMO. I've managed large budgets, and I know how anxiety-inducing that week before QBR is, when you have to stand up in front of everybody and talk about how marketing is driving the business, but you yourself are a little bit unsure about how exactly it's going. And it's not for lack of trying. It's not for lack of creative. It's not for lack of strategy and vision. It's because your hands are tied behind your back, because you don't have the internal infrastructure to do the measurement that you ought to. So that's our whole purpose. We are built for CMOs. Literally, the name Paramark comes from being on the side of the marketer. We're never going to go and sell to CROs and CEOs and CPOs. Our single mission is to make CMOs successful and position them back into the leadership role they should have always been in, through better measurement, through calmer measurement, not death by a thousand paper cuts. So that's our vision. I don't get into features and capabilities; people can figure that out on their own.

Justin Norris: I love features and capabilities, so I'm just going to ask: if I come to you, is it sort of self-service, like, give me your data and, all right, here's your model, go have fun with it? Or is it blended with a professional service to help interpret the data and to help run experiments? How are you partnering with clients in that way?

Pranav Piyush: Complete white glove. We get you up and running in about four weeks. We hook into your warehouse. We hook into your ad platforms. We hook into spreadsheets. You have Slack access or Teams access or whatever you use. And every two weeks you have a call with a dedicated customer success rep who is literally white-glove hand-holding you through the entire process: interpreting the data, designing experiments, making recommendations. And obviously you have all the data that you could possibly need, with dashboards and reports that help you slice and dice every single campaign, every single channel, every single time frame.

Justin Norris: Setting realistic expectations, let's say: are people signing up and it's like, oh my gosh, I've found the way the world makes sense now, life as a marketer is amazing? Or are there still challenges, ambiguities, uncertainties (and it's fine if there are)? Does all that just go away, or is it still a fact of life?

Pranav Piyush: Think of it as calmness, because you know the way to get to an answer is running experiments. So there's the confidence that if there is uncertainty or ambiguity, let's go run an experiment, and that uncertainty will be removed in about six weeks. It's a different way of operating, coming from a position of strength and confidence and predictability, a way of knowing the answer, because you have experimentation as part of the platform. If you were just doing marketing mix modeling or attribution modeling, which is much more backward-looking, you get "well, I don't believe it; why should I believe it?" and you get into these philosophical debates, rather than: no, no, no, let's go run an experiment. It will be very clear what's happening at the end of that.
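[Editor's note: a minimal sketch of how an experiment "removes" that uncertainty. Here a permutation test asks whether the gap between exposed and holdout groups could plausibly be chance; every number below is invented for illustration and is not from the episode.]

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily conversions over a six-week test:
# markets exposed to the campaign vs. markets held out.
exposed = np.array([52, 48, 55, 60, 47, 58, 61, 53, 57, 50,
                    62, 49, 54, 59, 56, 51, 63, 55, 58, 52])
holdout = np.array([45, 43, 48, 44, 46, 42, 47, 45, 44, 43,
                    49, 41, 46, 44, 45, 43, 48, 46, 44, 42])

observed = exposed.mean() - holdout.mean()

# Permutation test: if the campaign did nothing, shuffling the
# group labels shouldn't change the gap between group means.
combined = np.concatenate([exposed, holdout])
n = len(exposed)
diffs = np.empty(10_000)
for i in range(len(diffs)):
    shuffled = rng.permutation(combined)
    diffs[i] = shuffled[:n].mean() - shuffled[n:].mean()

p_value = (diffs >= observed).mean()
print(f"lift={observed:.1f} conversions/day, one-sided p={p_value:.4f}")
```

If the p-value is small, the debate about whether the channel works is settled by the data rather than by the model's assumptions.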

Justin Norris: It's interesting. I've seen a good handful of vendors in this space now, and you seem to be emphasizing incrementality testing in your go-to-market in a different way, or at least much more prominently, than others are. And there is something to your point of, all right, I've got this model, but do I believe it? The testing feels active. It feels like you're going on offense, so to speak. And like you said, it resolves the question and puts agency in your control, which is very comforting.

Pranav Piyush: Exactly. You can do something about it. You don't have to sit there and debate models.

Justin Norris: Let me ask just one last question. I'm curious: with your own go-to-market, you have an unfair advantage. You see all the data; you see what's working, in theory. You should be able to, like, do anything. Of course I'm being somewhat facetious, but how are you thinking about getting the word out about what you're doing?

Pranav Piyush: I say this all the time: measurement is Robin; creative is Batman. So the secret to go-to-market is that I could be sitting on a trove of data, but to differentiate and to make a place in the audience's mind, you have to have great creative. For us, up until now, that has been our organic social activity: having good, high-quality conversations on social, some warm outbound. That's been sufficient to get us from zero to one. To go from one to five, we're probably going to need to do something different, and it's going to be figuring out new ways of reaching B2B audiences that are not the cookie cutter. And that's hard, right? Everyone is kind of doing the same thing. We have some tricks up our sleeve, but I won't talk about that just yet. Hopefully you'll see it in market, and you'll tell me if you liked it or you didn't.

Justin Norris: That's great. And I will say it's an exciting time. It makes me happy to see what feel like more substantial conversations, and I'm sure there's a lot more debate to be had around measurement; it's not just going to resolve itself. But at least it's bringing marketing, or at least B2B marketing, further along the maturity curve, to a place where every day isn't day zero of "how should we measure this thing?", which really is how it has felt for the last 10 years. Thank you so much for coming on the show. This was really, really interesting. I wish you well, and we'll watch eagerly to see how Paramark does going forward.

Pranav Piyush: Thank you for having me, and great work on this podcast. It's awesome.


About the Podcast

RevOps FM
Thinking out loud about RevOps and go-to-market strategy.
This podcast is your weekly masterclass on becoming a better revenue operator. We challenge conventional wisdom and dig into what actually works for building predictable revenue at scale.

For show notes and extra resources, visit https://revops.fm/show

Key topics include: marketing technology, sales technology, marketing operations, sales operations, process optimization, team structure, planning, reporting, forecasting, workflow automation, and GTM strategy.

About your host


Justin Norris

Justin has over 15 years as a marketing, operations, and GTM professional.

He's worked almost exclusively at startups, including a successful exit. As an operations consultant, he's been a trusted partner to numerous SaaS "unicorns" and Fortune 500s.