Better Together 22 min

The Importance of Data in Content Marketing


Anthony Kennada sits down with Tim Sanders, VP of Research Insights at G2, to talk about why good data is the only sustainable competitive advantage in today's landscape.



Anthony Kennada: Well, Tim, first of all, congratulations on your new role at G2, VP of Research Insights. You know, G2 is sitting on such a treasure trove of data. Last I pulled it up: 2.5 million verified reviews, 90 million annual visitors who are researching products on G2.com, and over 160,000 products and services listed. I'm curious, as you're thinking about approaching this role, can you speak to the power of data, insights, and original research as a function of a content marketing program?

Tim Sanders: Well, to quote one of my favorite artists, Wu-Tang Clan, data rules everything around me, right? So when you think about what's powering artificial intelligence, it is data. And this is especially true with the rise of generative AI, because generative AI allows natural language, like what we're doing, to become the interface to all of the world's data. However, the outputs are only as good as the data: is it high quality and relevant to the situation?

The other thing I think about when I ponder this concept of how important data is, because we say it's existential, it's the most important thing, but I'm going to put it in a different kind of light. When you think about technology today, where AI is becoming "always included" more than "artificial intelligence," what's happened is that for the first time in human history, technology, software, has decoupled prediction from judgment. First time ever. In the past, prediction and judgment happened at the same time in a human being called an expert. They'd say: based on my experience, I think this is going to happen, so I prescribe this action. And oftentimes we would integrate it into a firm as a rule of thumb. The problem with rules of thumb is that they have invisible expiration dates. You really don't know that the rule of thumb doesn't work until you've had your third quarter in a row of things falling off a cliff, and then you're in a doom loop.

So what happens is, if we let the machines make predictions based on high-quality data, then the humans simply pass judgment. And there is an opportunity now to scale personalized decisions for the enterprise, right? Every situation you're now going after with code or codified solutions. So it creates an incredibly dynamic organization. I think about Ant Financial in China, which is a Harvard business case, the fastest growing financial services company in the world. They got to 500 million users with literally 10% of the employee base of Wells Fargo.

Anthony Kennada: Wow.

Tim Sanders: Because they had decoupled this thing. The analogy I would use here is the data behind Waze or Google Maps. It's profound, the quality and the quantity of that data. Why is that important? Well, I'll give you an example from a recent trip to the United Kingdom. Love London, always loved my London cab experience. There's always been a shortage of London cabs. It created huge queue times during rush hour and surges. And the reason why is the industry thought they had a logistics problem: I've got to buy more cabs and I've got to recruit more drivers. No, the problem was they had a prediction problem. You had to go to school for almost four years, it's called The Knowledge, before you could be a driver for a London cab, because you needed to learn every possible route scenario so you never had to pull over and look at a map, because riders hate that. And as a result, it created a huge crimp in the funnel, much like we've seen now with go-to-market. And then along comes Waze, and along comes Uber, who thinks differently and says, "Really, all you need is human judgment, so let's put an app in their hands." And if they have the judgment not to honk, not to tailgate, not to have music on, not to take calls, then they can be a great driver. And now you've got a thousand percent more drivers in the United Kingdom. Mobility has been restored with artificial intelligence; it's just invisible. So my message is: data is your only moat. For any organization, there is no other sustainable competitive advantage.

Anthony Kennada: Well, let's stay on that topic of AI while we're here. You referenced Harvard. You were recently appointed an executive fellow at the Digital Data Design Institute at Harvard, and my understanding is this is a function that enables researchers to deliver insights that drive the adoption of AI more broadly. I'm curious, since we're here at this Better Together conference thinking about application, and we just referenced the mobility exercise that Waze introduced, how is AI going to shape these emerging use cases for the go-to-market functions within businesses as well?

Tim Sanders: So to pivot from D-cubed, that's what they call it at Harvard, D-cubed's research has been very use-case focused. They've also looked at governance, they've looked at AI safety. I'm looking a lot at packaging the use-case work they've done to meet their charter, which is to democratize digital transformation and AI for all business people. I think it's a fantastic purpose, so I get behind it. I'm the flavor of evangelist called an explainer, and I'll talk about that in a minute.

So I want to pivot to McKinsey research, because this is really interesting. McKinsey did an analysis at the end of last year, and it was focused on gen AI, which is the fast-moving area in AI right now. They looked at which functions in an organization can have the highest revenue impact, and first and second were sales and marketing. Now, that doesn't have the highest cost take-out; the highest cost take-out is customer operations, which doesn't have quite the revenue impact, and it has a lot more cost behind it up front, though in the long term it could take out cost. But go back to sales and marketing: incredibly high revenue opportunity, low-hanging fruit for an organization. Why is that? Because when you think about prediction, and then you think about automation based on prediction, letting the machine take an action on your behalf or following the machine's advice, it's only as good as your tolerance for hallucinations. So the last thing we'll ever see is autonomous driving. The first thing we saw was cadence improvement with products, right, AI writing assistants like Grammarly, because we have a tolerance for it being slightly wrong; we can fix it with the human in the loop. So sales and marketing, of all organizations, more than pricing, more than R&D, certainly more than customer operations, can experiment, can fast-fail, and can figure out what that production model looks like to take advantage of prediction and not collapse the business. So that's why, immediately, they are the functions, go-to-market, where generative AI can produce incredible results and perhaps build a moat behind it.

Anthony Kennada: Yeah, it's super interesting. I have to ask off script a little bit, but we've used this language in B2B or go-to-market for some time of "marketing automation," which, in the roughly 20-year context in which that term came up, almost feels mistimed or too early for its time to some extent. But now, as we think about predicting actions using data and, potentially, agentic workflows to take the action on our behalf, and covering that, I'm excited about the agentic economy, I'm very curious: is marketing automation, true marketing automation, now coming? Well, humans have to be in the loop, given some of the lack of tolerance for hallucination, maybe at least for a high-profile prospect or whatever. Just curious, where do you see the marketing mix headed?

Tim Sanders: So first of all, when you think about the disciplines of artificial intelligence, there's two. There's machine learning, which is rather mature. It is very simple: zeros and ones, that's the way I talk about it. Very simple yes/no, this-or-that kinds of predictions. And then there's natural language processing, the artist known as gen AI, much more complicated, much more sexy, because any fool can get value from it. ML is a black box. As a result, machine learning has become the Nickelback of artificial intelligence: even though you know it works, you would never tell your friends at a party. It's a private pleasure for data scientists.

Automation right now should really be called marketing velocity, because that's what you're really talking about. You're talking about increasing the velocity of go-to-market motions that capture the value of innovations. Automation is a path to velocity. The issue, though, is that there's automation, semi-automation, and path to automation. I think right now, with generative AI, we're still at path to automation, because it has a high human-in-the-loop requirement. Automation is only a goal in as much as velocity has an impact on your business. You have to ask yourself: is faster to market an advantage for us? It could be.

The other thing, getting back to this McKinsey research: for sales and marketing, what they looked really hard at was the idea that when you're selling artificial intelligence, and this is important for the sellers, you've got two models to sell it to the customer. You've got the replacement model; they call that the cost model: I'm going to bring down your cost, measured by headcount reduction. Then you have the capacity model, where you say: I'm going to augment your existing team members so they can increase production without increasing headcount. Very sexy. What they say is that organizations that sell based on the capacity value proposition will create more satisfaction, find more true value, and, in our terms, get better renewal and expansion. Why? Because we're never satisfied with the substitute. That's the inherent flaw in looking monolithically at a concept like automation. The idea is that AI is not going to take your job and kill your company, but humans that are armed with AI and strategically focused? They will take your job and kill your company.

Anthony Kennada: Totally. Well, you know, I want to come back to your role at G2. A big part of it, as I understand, is evangelism as well. So obviously doing the research, but then getting the word out to market, maybe doing conversations like this and several others. Evangelism as a concept has been around for a while, but it feels like over the last few years we've seen new terminology around things like creators or influencers really start to pop up as well. I'm curious, how do you think about evangelism as a function of helping brands build these relationships with a market? And maybe, how does it differentiate from some of these other emerging functions as well?

Tim Sanders: A creator produces content. An influencer produces results. The influence, that's the difference. There's an old Chinese proverb: if they're not following you, you're just taking a walk, right? You're not a leader. So an influencer inherently grows their following in a measurable sort of way. They grow the experience of people taking their advice and putting it into action. So I think a lot about connecting with influencers. One of the things I'm doing for D-cubed at Harvard is syndicating insights to certified digital influencers to spread the word beyond what we can do with, say, Harvard Business Review or whatever.

Let's talk about evangelists. I got interested in this concept maybe 25 years ago when I was working for Mark Cuban. Guy Kawasaki, by all accounts, would be your first codified chief evangelist, for Apple. So he was a mobilizer. What he did is he mobilized the emotion of design excellence and transferred it from Steve Jobs' mind to the end users, where they said: you know what? Personal computers should be beautiful. They should be able to be up and running in five minutes, they should be simple, and they should almost disappear into the fabric of our lives instead of being this thing that a Hewlett-Packard or a Gateway is sending us. So his job was to mobilize people to make decisions based on design and not price. And as a result, when you think about the effect of Guy Kawasaki, and here we are in 2024, to quote Prof G, Scott Galloway, my favorite podcaster out there on markets: Apple has become the number one luxury retailer in the world, who now sells scarcity. We never think about the price of an iPhone. It is ridiculously high, because we think about the design opportunity.

So I fast forward, because that was exciting to me. I fast forward just a few years, when Yahoo buys broadcast.com and I go to Silicon Valley. I work in the ValueLab, which is kind of what I'm doing at G2: converting data into actionable insights to increase leverage for the sales team and get clients and keep them. I really had my eye on being the next Guy Kawasaki, and I got very lucky, because our CEO at the time felt like we had what you would call a go-to-market problem. No one believed in digital advertising. The trope was banner ads didn't work, and all of a sudden marketers say: we're going to pay only on clicks. Well, we ship impressions, and that should be rewarded, but there was a real problem in the market associating us with the kind of impact of television, print, or radio. So now we had to sell on clicks, which could take 80, 90 percent of the profit out of the model. So the job of the ValueLab was to create data, but what we lacked was something public-facing to make it simple for people to understand. So I became Chief Solutions Officer, which was a proxy for Chief Evangelist, and my job was to get in front of all of our significant, quality opportunities and serve as what I call an explainer.

Okay, so there are two kinds of evangelists right now, and I'm just making this up as I go. There's the explainer, who takes the mystery out of something and democratizes it for the listener. The listener feels empowered, and the listener usually becomes an evangelist. Then there's the mobilizer. Mobilizers connect, and this is really based on the Corporate Executive Board research from The Challenger Customer, the fantastic follow-up to The Challenger Sale. The mobilizer knows how to get people within their organization to embrace change, and they have the gravitas to unseat incumbents. Mobilizers are powerful. They don't just have influence; they have power. So that's a different kind of evangelist, because that evangelist's job isn't to democratize the product and the technology and the innovation, because the problem for them isn't mystery. Their job is to do change management, because there's some kind of scaffolding at the customer level that's inhibiting utilization, which shows up as "where's the value?" Because I believe right now, for software, the biggest problem when it comes to renewal or perceived value is that we as sellers don't take ownership of utilization. We put it on them. We do a couple of cursory moves. So right now the mobilizer's job is to convince everyone in the organization, through the acquisition of more mobilizers who become evangelizers too, that change is critical, there's a burning platform, change is easier than you think with rapid collaboration, and change is going to be your competitive advantage. Because once you learn how to be agile and adaptive, learning effects, think of them like network effects on steroids, will create a win for you that no one can catch up with.

So those are the two types, and they approach different problem spaces based on the enterprise. You don't have to be both. But what I found in the last year is that if you're selling artificial intelligence especially, you're not selling Nickelback. You're selling this new kind of AI with a higher cost structure, probably a higher on-ramp that you have to suffer through, and you have to have a higher tolerance for mistakes, because it's not as clean as ML. If you're selling that, I think you need to be an explainer.

Anthony Kennada: Interesting. Okay.

Tim Sanders: Right. So I find that when it comes to getting people to the table with their checkbook and their CFO right next to them, just democratizing the idea that AI is simply a prediction machine, and that's all it does, it takes information that you currently have and produces data that you don't have, just that bit can unlock a C-suite who currently see it as a threat, see it as a fad, see it as something that is currently overpriced, who say, "we're going to wait for the bubble to break." All of those really come from that internal ego pushing back and saying: I don't get it, and I'm a smart person, and that bothers the crap out of me. So that's what I think the most important motion is for AI. However, when I think a little bit downstream and I look at software solutions that you'd define as being in competition with the status quo, in that situation I think you need a mobilizer. So, according to the authors of The Challenger Customer, Matt Dixon in particular, you need a dog whistle that brings the mobilizers out. And that's where content marketing, getting connected with the revenue organization and product and success, where all of those functions coming together can absolutely bring mobilizers out so we can create relationships with them. We can enable them, through a variety of different tools, to go back home and sell, especially to all the decision makers we can't round up and talk to. So I think you have to make a decision about what your problem is. Are you selling something that's just new and confusing, or are you selling something that requires some internal change in behavior for them to utilize and realize value? And then you should go pick your evangelizers wisely.

Anthony Kennada: Right, that's so good. Well, look, we're at time for one last question here. We're here at the Better Together event, and I'm curious. Obviously, you're ramping up at G2 in this new role, or even at Harvard, as you think about the work required to educate, to build the insights, and so on. What are two kinds of tools, technologies, or services that you leverage that work better together in your own work?

Tim Sanders: You know, I have to say, one of the most powerful collaborative tools that I use anywhere is Miro. I love it, and it's because I've learned it. You can use a variety of different tools, but I like Miro. And the reason why is because it helps us prototype and visualize concepts and bring them to life for people, because we're very visually oriented. One of my old mentors, Tom Peters, wrote a great book 40 years ago called In Search of Excellence. He said the value of a prototype is that someone can point at it and say, "that's not it," and then, oh, I understand. It solves the ambiguity of you and I just sitting around talking. So I really like Miro as a tool that I'm going to use all the time.

And I can't live without my Slack. I mean, Slack is so much better than traditional messaging products, so much more collaborative than asynchronous email, et cetera. So I've started to try to find more and more innovative ways to use Slack. I've also tried to become much more conscious of not reaching out to people on Slack during times I should know better. So if they don't have their calendar on, before I Slack someone I literally take a 10-second journey to Google Cal to see where they are. And I've learned that if you know when to get them live, and once you have them live you can clearly pursue the next decision, Slack's a killer application right now.

Anthony Kennada: I love that.

Tim Sanders: I do want to say one thing before I forget, because people ask me all the time: what have you figured out in the last few years that caused you to leave your career at Upwork, after five successful years, and go to G2? And it was this realization. I read a book several years ago by Daniel Kahneman, a great researcher on statistics and data and all that type of thing; he just recently passed away. He wrote a book called Noise that everyone watching should read. He talked about the idea that you should look at data like oil: it has grades. And crude oil is not really usable by an enterprise. I think that's very interesting. So what I began to look at a few years ago is the concept of data bias. Data is only as good as the context in which it was collected.

Anthony Kennada: Oh, interesting. Okay.

Tim Sanders: So what I figured out, and this caused me to say, I've got to go work for G2, is that a customer's first-party voice-of-the-customer data is so biased and noisy that it would actually decrease your sales if you completely relied on it, because the context for that gathering is usually perceived by the customer as a business development conversation. The most biased data in the world is data from a buyer before a renewal. Like, "how are we doing?" "Well, I'll tell you how we're doing, because I want a discount; I'm getting ready to churn." I've learned that the G2 data, the second-party, verified, well-structured data, is rocket fuel, especially for training a large language model, buyer intent, market intelligence, or just informing decision making. So to me, this idea that second-party data is completely better than third-party data is obvious, because the context is terrible there; we're literally making stew out of first-party data. And that's an insight, because when I talk to organizations, they always say, "we never thought of it that way," because they're so precious about what they collect. So that's the reason I joined G2: because they're the leaders in the world, by far, much more than Gartner, at capturing high-quality, well-structured second-party data, because it's in their DNA, from Godard all the way down to the rest of the org.

Anthony Kennada: Well, congrats again on being a part of that team, and I appreciate you being on the show. Looking forward to seeing all the great things to come out of the research arm.

Tim Sanders: Thank you so much, man. Nice to meet you. Thank you.
