The SEO Podcast: Page 2 Podcast Hosted by Jon Clark & Joe DeVita

SEO Testing Secrets from SearchPilot’s Will Critchlow 🚀 | Enterprise A/B Testing That Actually Works

Episode Summary

Will Critchlow reveals why instinct isn’t enough anymore and how enterprise brands are using rigorous A/B testing to turn SEO into a measurable, controllable growth engine. From spinning out SearchPilot to navigating the rise of LLM-driven traffic, this conversation will change the way you think about search.

Episode Notes

https://page2pod.com - What if your SEO “best practices” are actually costing you traffic?

In this episode of the Page 2 Podcast, Jon Clark and Joe DeVita sit down with Will Critchlow, CEO and co-founder of SearchPilot and former founder of Distilled, to unpack the evolution of SEO experimentation, the transition from agency to software, and why instinct-driven SEO is no longer enough.

Will shares the behind-the-scenes story of spinning SearchPilot out of Distilled just before the pandemic, how enterprise brands are running statistically rigorous SEO A/B tests across thousands of pages, and why even “obvious” fixes (like breadcrumb schema updates) can reduce traffic.

We also dive deep into how SEO testing is changing in the age of LLMs. With ChatGPT and other AI tools emerging as traffic sources, traditional attribution models are breaking down—and experimentation is more critical than ever.

If you care about measurable SEO impact, enterprise experimentation, or navigating the future of search, this episode is packed with insight.

🔬 In This Episode

• Why “SEO best practices” can actually hurt your traffic
• The real story behind spinning SearchPilot out of Distilled
• Why most SEO tests fail—and what makes a strong hypothesis
• How enterprise brands structure scalable SEO A/B testing programs
• The surprising breadcrumb schema test that reduced traffic
• Why title tag tests are high-risk, high-reward
• How LLMs like ChatGPT are becoming measurable traffic sources
• Why net impact matters more than isolated traffic gains
• The shift from services to software—and leadership lessons along the way
• Why focus, not innovation, is often the real competitive advantage

This episode is a masterclass in bringing scientific rigor to SEO and building a controllable performance engine with provable ROI.

If you found this conversation valuable, make sure to Subscribe to the Page 2 Podcast so you never miss an episode.

And we’d love to hear from you—What’s the most surprising SEO test result you’ve ever seen? Drop your thoughts in the comments 👇

🔗 Tools and Resources Mentioned:

• Will Critchlow on LinkedIn → https://www.linkedin.com/in/willcritchlow/
• Will Critchlow on X → https://x.com/willcritchlow
• Will Critchlow on Threads → https://www.threads.com/@willcritchlow
• Will Critchlow on BlueSky → https://bsky.app/profile/willcritchlow.bsky.social
• SearchPilot → https://www.searchpilot.com
• Brainlabs acquisition of Distilled → https://theygotacquired.com/agency/distilled-acquired-by-brainlabs/
• SearchPilot's Testing Distinction → https://www.searchpilot.com/resources/blog/what-is-seo-split-testing
• 75% of SEO Tests are Inconclusive → https://www.searchpilot.com/resources/blog/seo-a/b-testing-realizing-the-value-of-a-good-experimentation-program
• SearchPilot's GEO testing → https://www.searchpilot.com/resources/blog/searchpilot-geo-testing

Episode Transcription

Jon Clark (00:00)

What if your SEO best practices were actually costing you traffic? Will Critchlow has the data to prove it and a platform that tests it at scale. Will is the CEO and co-founder of SearchPilot, the SEO A/B testing platform used by some of the largest e-commerce and travel brands in the world. But before SearchPilot, he spent 15 years building Distilled into a global search agency, complete with conferences, training products, and a content engine.

What started as an internal R&D project at Distilled became its own software business spun out just months before the pandemic hit. This episode isn't just about SEO. It's about the shift from services to software and what it takes to bring scientific rigor to a notoriously squishy industry. We talk about how enterprise companies are navigating the complexity of SEO testing, why most tests fail, and what actually makes a good hypothesis. Will explains how testing has evolved in the age of LLMs,

where ChatGPT is now a traffic source worth measuring, and what traditional metrics may no longer tell the full story. There's even a breadcrumb schema test that reduced traffic. Yes, you heard that right. Will built a company based on prioritizing experimentation over instinct. But as he admits, he still has to fight the temptation to chase every shiny object. Focus, it turns out, is the real test. This one hit close to home for me.

As someone who's worked in both agencies and platforms, I really appreciated how candid Will was about the mistakes and mindsets that shaped his second act. If you learned something new today, take a second to subscribe to the Page 2 Podcast, leave us a rating, a review, and let us know what resonated. We'd love to hear your thoughts. Okay, I think you're really going to enjoy this one.

Jon Clark (01:37)

Welcome to episode 105 of the Page 2 Podcast. I'm your host, Jon Clark, and I'm joined as always by my partner at Moving Traffic Media, Joe DeVita. Today we're joined by a 20-year SEO veteran, someone that I've definitely admired throughout my career. He's not only a Cambridge-trained mathematician, but was responsible for running Distilled, a global search agency for over 15 years. He eventually spun that out into SearchPilot in 2020,

Joe (01:46)

Hi.

Jon Clark (02:03)

an enterprise testing platform. And Joe and I are really excited to dig into all things testing today. Will Critchlow, welcome to the show.

Will Critchlow (02:10)

Great to be here, thanks for having me.

Jon Clark (02:12)

So I have a little bit of a confession to make. Back in 2015, going back in time a bit, Rob Ousbey and Chris Hart and I were on a panel, and that relationship eventually resulted in a discussion about me potentially joining Distilled. I was incredibly flattered but ultimately didn't make the move. But back then that was sort of like

peak Distilled, right? That was 60 employees, three offices, the SearchLove conferences, and really just leading the charge in everything SEO. You've also described that period as one of the worst times in your business career. And the only reason I bring that up is because that seemed to be the impetus for spinning out Distilled ODN,

Jon Clark (02:53)

which I believe if my research is right, eventually became SearchPilot. Can you take us back to that period in time? Like what was the driver behind sort of that R&D effort on the distilled side?

Will Critchlow (02:58)

That's right.

Yeah, I mean, the arc and ebb and flow of things is funny to look back on, really. I put off talking about this in public for a long time, and you've seen, I think, some of the places where I've started talking about it more recently.

Will Critchlow (03:18)

The lows were actually a little earlier than that, I think, probably around 2014-ish. So there was a ton of excitement in the first 10 years of growing things, you know, when we were doing new things all the time, whether it was going to the States, opening up offices over there, whether it was launching the SearchLove conference series or DistilledU. And we ran into a little bit of a wall around that 2014 time. And...

I think I've shared this somewhere before, but we had our first ever year where top-line revenue was lower than the previous year, which, you know, on the one hand, champagne problems, right? Nice problems to have; we'd had almost a decade straight of growth. But on the other hand, it was really existentially challenging. And, you know, you kind of have baked into your mindset, I think, as an entrepreneur, that growth is everything.

Jon Clark (03:53)

All right.

Will Critchlow (04:05)

Obviously it's not, but it's such a kind of, I don't know, core part of how you see yourself, I think. So that was part of it. I was struggling with that. Other people in the business were struggling with that. And we didn't have the next exciting thing, you know, the next vine to swing to, if you like, as we were swinging through the forest. The time period you're mentioning was actually kind of as we figured out what that next thing was going to be, which ultimately turned out to be

Jon Clark (04:06)

Right.

Will Critchlow (04:29)

what back then was called ODN and what is now called SearchPilot. And I would say we were on the upswing again by 2016, of excitement at least. And I was getting reinvigorated, enjoying the new things. Novelty has always been, I think, a big driver for me personally, and that's where all of my energy comes from. So although it was probably a leading indicator, and things hadn't turned the corner in everybody's mind, I think in my mind I was back to...

enjoying things and that excitement by that point.

Joe (04:57)

Can you dig in a little bit more on that period of time when your role was, it seemed like, just changing all the time, from leading a service-based organization to the beginning stages of what would become an incredible software platform? At some point in those years you decided, I think I can shift from services to software.

Will Critchlow (05:19)

Yeah, so that mindset shift actually in a sense came a little later, because at the point we're talking about here in the story, we were still building with the Distilled mindset. We were building software, but it was software within the context of that business that we'd been building. And it was a division anyway; in fact, it started as R&D. We formed a small R&D team as part of the re-energization project, if you like.

Will Critchlow (05:46)

My co-founder was by then working in that team. Tom Anthony, who's now CTO of SearchPilot, was in that team. And they were hunting for ways that software could make Distilled better at that time. So I hadn't yet gone on the journey to "what we are is a product business, what we are is a software business." That hadn't, I mean, it was obviously there in the back of the mind, but at that time we were building the software-enabled agency. And so that particular mindset came later, and it was actually early to mid

Will Critchlow (06:12)

2019, when we first started talking to Brainlabs about the possible acquisition of Distilled. And in the course of that, we can get into the weeds of that story, but in the course of that, it became clear that the way we were going to make this work was by spinning out the software. It then quickly became clear to me that that's where I wanted to be. If we were doing that, that was my future. It was also Duncan, my co-founder's, future. It was also...

Will Critchlow (06:36)

and a whole bunch of the team who'd been working on that division of the business. And yeah, that was actually a fairly short time period. It was kind of, I guess, six months or so, maybe even less, during which time we're doing due diligence and going through an acquisition. And so there wasn't really a lot of time to think. And we planned, you know, it's okay, we'll do all the thinking after spin-out. You know, we'll spin out, we'll be on our own two feet, we'll set this up for everything it's gonna be. January 2020,

Will Critchlow (06:59)

we do the spin out and we're all set. We're all ready to go for that exciting new year. 2020, I don't think, was the year any of us anticipated it was going to be, but that's a different story. But yeah, so it was, in a sense, a combination of opportunity, serendipity, timing and all the rest of it. And before you know it, we're in it and embracing it.

Jon Clark (07:10)

Right.

Sometimes not taking the time to overthink those types of big decisions is the best thing to do. You just have to trust what your gut says and do it. Because if you sit back and start to think through all the possibilities and potential issues, right? You may end up not doing it at all. So Joe and I recently acquired an agency, so we're sort of fresh off the acquisition, I'll say,

Will Critchlow (07:27)

Sure.

Jon Clark (07:42)

and so the nuances of Brainlabs' acquisition of Distilled and you spinning out what you could arguably call a pretty important piece of that agency, right? Like the R&D and the testing capability. I imagine that probably impacted the valuation, and obviously not getting into specifics, but was that a hard conversation to have with them about, well, we actually want to keep this thing?

Will Critchlow (07:52)

Mm.

Jon Clark (08:06)

You know, you guys can have everything else, the conferences, the training, all that sort of stuff. Was that a tough negotiating element of that, you know, acquisition?

Will Critchlow (08:15)

Funnily enough, no. I mean, it would have been had... So there was never a deal on the table to buy the whole thing with the software business valued as a software business, right? That was not a thing. The software business was very nascent at that time. It was its own division, but it was small. We had the vision for what it might become.

Jon Clark (08:24)

Got it.

Will Critchlow (08:33)

but it was loss-making; it was a startup, right? Being incubated. And I think I can talk about this bit publicly. So Brainlabs is private-equity-backed, and the private equity folks were very interested in the bottom line, right? The actual profit that it was making and the growth rate of both the top line and the bottom line. And so they were actually more interested in the profitable agency where you stripped out the

Jon Clark (08:35)

Right.

Will Critchlow (08:59)

at best break-even, at the time loss-making, software startup. So

Jon Clark (09:03)

Sure, that makes sense actually. It's a cost center.

Will Critchlow (09:06)

they actually wanted it without this sort of founder excursion into product land. My co-founder and I were kind of the opposite. We were like, well, we've done agency. At one point, they thought there might be this big sticking point where they wanted us to sign a non-compete, like, you guys aren't going to go and start a new agency, are you?

Jon Clark (09:10)

All right.

Will Critchlow (09:24)

Don't worry. We need no incentivization to sign. We're not worried about that. We'll sign any agency non-compete you like. We're out of that game, having done it, as you said, for an entire 15 years. So interestingly, no, this was the path to getting a deal done. And it was one of those moments where you see the finance guys actually earn their keep,

Will Critchlow (09:44)

because weirdly I already knew Brainlabs' backers, because I'd spoken at their portfolio company conference a couple of years prior. And so I knew the partner on the finance side. Dan, the founder of Brainlabs, and I were trying to figure out this deal and we couldn't quite make it work, couldn't figure out how to navigate all the bits. And yeah, it was one of the finance guys who was like, wait, there's a solution here, literally on the back of an envelope.

He's like, this bit goes here, this bit goes here. We retain this much profit. You take this thing. Money moves like this. And yeah, it was the kind of financial engineering that was needed to stand the startup on its own two feet, have them take the bits that they most wanted, and have all of that be sustainable. And everybody got the bit that they were most excited about. Yeah, it was kind of quite serendipitous in that way. And probably not,

I mean, yeah, not the short-term cash we'd have been able to get selling, I guess, the software business at a software valuation, but that wasn't something that we were hunting for at that moment in time anyway.

Jon Clark (10:43)

I love these stories. It's just incredible how these points in time, right, like you knowing the financier and, you know, having the relationships with Brainlabs. It's just incredible to me, the full circle, how all of this stuff comes together just through these...

Will Critchlow (10:57)

Well, you know, I know Dan, the Brainlabs founder, from SearchLove. He'd actually spoken at SearchLove, I think, I forget, 2013, 2014, something like that. I remember, because he was the paid search guy, he'd done the one paid search session at SearchLove London,

Jon Clark (11:01)

Right.

Will Critchlow (11:15)

wearing, if I remember correctly, a Superman outfit, because he was talking about, I can't even remember now, something to do with how paid search supercharges your organic, how the two things work really well together. And yeah, as you say, it's funny where these dots join up and how everything

Jon Clark (11:17)

It's amazing.

Yeah, I love that.

Joe (11:34)

I'm really interested in your professional evolution, from running an agency

to running a software business. But even when you were running the agency, it was way more than just running an agency. You guys had hundreds of clients, you had 60 employees, but you had the conference, you had the training business, you guys were a content machine too with the newsletter. I mean, you were doing so much at the time. Do you find, I guess, is it easier now that you can stay focused on just this one product?

Will Critchlow (12:00)

Yes is the short answer. I am not a very focused person.

Will Critchlow (12:03)

I have to work really, really hard on focus. I see the value of it and I appreciate myself when I do it. Certainly historically, certainly in my twenties, I was all over the place. It was like, ooh, shiny thing, let's build that. We built DistilledU in a hack week and wrote most of the content on my commute on the train. That was my contribution.

And other team members were doing similar things around the edges, you know, hammering out a quick module here and there. And it was ludicrous. And SearchLove was kind of similar. We fell into that. It was an opportunity with the Moz folks to run a conference together. So we did that. And then they branded theirs, what became MozCon, and we rebranded ours. And there's a lot of those years that I look back on, like, how did we do that? Why did we do that?

In particular, those years where I mentioned we were expanding to the US. So we opened up our Seattle office in 2010 and New York in 2011. And both my co-founder and I had our first kids in 2010. And I think I might have told this story somewhere before, but I remember Rob, who you mentioned, Jon, was going to head out to Seattle to open up our first US office,

Jon Clark (13:06)

Mm.

Will Critchlow (13:10)

and he would have been, I guess, 26 or something at the time. And we had to get his visa. And to do that at the time, we had to prove that we were really starting a US business. So we had to have this whole business plan with the number of people we were gonna hire. And it wasn't just put on paper how many people we were gonna hire. We actually had to have the lease on an office space for that many people, you know, like all this stuff upfront before anything else could happen. And...

I remember I flew out that year, I guess it would have been right between 2009 and 2010, at the end of that year, beginning of the following year. And it was like a couple of months before, yep, my first child was due in June. It was like the latest I could just about pull off traveling at that point. And so I signed a whole bunch of stuff. And then my co-founder, who was a month behind us on the due date, he took another trip out

three or four weeks after I did. And that was his last trip. And I don't really remember the next six months. Like, I remember some of the home stuff, having a baby and that stuff. The work stuff is a complete blur. I do not remember the summer of 2010. And by the end of that, we were traveling again, all this kind of stuff. Anyway, it was chaos and it was crazy. And it was fun

Jon Clark (14:04)

Incredible.

Will Critchlow (14:26)

in your 20s,

early 30s kind of time. But I've worked really hard to do things a little differently at SearchPilot. And that I think is good for me at my age. But also, I think it's good for the business and makes a lot of things so much easier. And actually, we've been really working hard to focus in more and more narrowly on who are our ideal customers? What exactly are we building for them? What exactly is the value proposition

Will Critchlow (14:53)

for the ideal customer? And yeah, it's never been a natural skill, but it's one of those key areas where I like to think of SearchPilot as doing this entrepreneurial thing for the one-and-a-halfth time. It's not a completely second time around the block, because it's so connected and the team is all connected back. Some of the folks we've been working with for well over a decade now,

Will Critchlow (15:14)

longer than technically the company has existed. So it wasn't a clean break and it's not a brand-new startup, but it was enough of a new thing to be able to say, well, we're going to take these three or four things that we don't feel like we did as well the first time around and fundamentally do them differently. And yeah, the focus is 100% part of that.

Joe (15:34)

Can you talk a little bit more about the differences? You personally, you led a successful service business, you're leading a successful software business, but some of the things that you did to lead a service business, I imagine, didn't help you lead a successful software business. Were there some habits maybe you had to dump to be successful on the software side?

Will Critchlow (15:57)

The first thing that springs to mind, actually, I don't know if I'd call it a habit. Well, yeah, maybe an organizational habit, not me personally, but something that I drilled into the

org that I had to undrill into the org, I guess, is in the professional services game. Not everybody will agree with what I'm about to say, but we found in our world, in the Distilled era of doing professional services work, that if you're not careful,

especially as you grow and it's not just the founders and not just, you know, the senior team on every project, you can be suckered into doing so much free work, right? You know, people ask for everything during the sales process, just that one more thing, that one more thing. And we had to be quite rigorous with saying there are some things that you only get when you sign on the dotted line. I remember actually, before founding Distilled, I had a couple of jobs out of university, and

it was a big thing in both of those places. It was kind of formational in my professional career, I guess; it was always the partner or the senior consultant or whoever it was kind of going, you've got to stop working until they pay, until they sign. There are other ways of doing that. I think, especially if you're very small, very boutique, especially if you're a one-man band, you can get away with just building the goodwill by actually kind of doing a lot of the work and then...

figuring out how to bill for it later. But when you're building an agency, you have to be kind of quite rigorous about that stuff. And we've found that in the software world, it doesn't really work like that, because the value is in the software. They're not getting the software until they sign, because of legal liabilities and IP; we can't deploy the software until they sign a contract. Everything else we can do.

So we will now onboard folks who are not customers yet into our Slack, for example, and have a shared Slack environment. We're pretty enterprise, right? We're working with large organizations, and we will start treating them as customers before they've signed. It's still going through procurement, it's still going through legal, but we will invite them in, right? We'll have a shared channel. We will start work for them. We'll start ideation. We'll start coming up with test ideas. We might even start building them tests on our platform.

And of course they can't have that value until they sign. And so that natural gate means that we can be much, much more relaxed about saying, hey, no problem, right? We will start working with the business team, with the SEO team, now, while we wait. And that actually helps put the time pressure and the momentum back on the legal and the procurement folks, because now the business is saying, hey, we're ready to go. We've set up these five tests and we've got a backlog of 10 more; when can we sign?

And so they start actually pushing for that. But because I spent so many years kind of bullying the team into not doing free work, that was actually quite a big mindset shift, in not just me doing it, but the organization thinking that way.

Jon Clark (18:37)

That's definitely a lesson that Joe and I had to learn as well, especially when it was

just the two of us and we started to grow and you're trying to create revenue however you can to grow and to hire. And sometimes you end up working longer than you should when folks aren't paying and you sort of learn the lessons of that, unfortunately, but yeah, that's a good lesson.

Will Critchlow (18:54)

Yeah, I don't know

how transferable it is back the other way, but it's definitely one of those things that kind of shifted for me.

Jon Clark (18:59)

Yeah. So maybe that's a good segue. SEO testing in general, I guess not always, but oftentimes, brings with it some confusion just in terms of how to set up a test. What are you actually doing? What are you serving to Google? And in researching the tool, there's a very distinct difference between how your software runs versus maybe a traditional

A/B test, right? So you're splitting out pages into control and variant groups versus users, right? You're not identifying a user and then serving up a different page. I'm probably explaining that much less eloquently than you would. So if you want to take a stab at just sort of, you know, the distinct differences between how your software runs and maybe a traditional A/B test.

Will Critchlow (19:21)

Yeah, no, you basically got it. That is the fundamental key difference. And the reason it's important is, if you're bucketing users, you kind of have to put Google or whatever crawler you're thinking about into one bucket or the other. So they get either a control experience or a variant experience, and then you've lost the split nature of the test. And the only way to split without cloaking and everything else is to give all users and all bots the same experience, which is: some pages are in the control,

Will Critchlow (20:06)

and some pages are in the variant. And yeah, a lot of the innovation work that we actually do is in improving what we call bucketing. Where you might just purely randomly bucket users, in fact, you kind of have to, right? You don't know anything about a user when they arrive anonymously on your site, so they get randomly put into control or variant. The innovations in SEO testing are a lot about building balanced buckets of control and variant pages. And we can get into the...

Will Critchlow (20:30)

statistical, technical weeds, but the gist is basically you can build better buckets, and that's where a lot of our data science energy goes. But yeah, the fundamental thing is we want to make server-side changes. We want to make them present for every visitor to that page, so there's no cloaking. It's the same whether it's Googlebot or an LLM bot or a user. And we want to make sure that they're seeing that consistent experience site-wide and there's no duplication for the crawlers.
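
The balanced-bucketing idea Will describes can be sketched in a few lines. This is purely an illustration of the general approach, not SearchPilot's actual algorithm: the pair-by-traffic heuristic, the function name, and the sample page data are all assumptions for the sake of the example.

```python
import random
from statistics import mean

def bucket_pages(pages, seed=42):
    """Split pages into balanced control/variant buckets.

    `pages` is a list of (url, daily_sessions) tuples. Sorting by
    traffic and then coin-flipping within adjacent pairs keeps the
    two buckets' traffic profiles close, so their pre-test trends
    track each other. (With an odd page count, the lowest-traffic
    page is simply left out.)
    """
    rng = random.Random(seed)
    ranked = sorted(pages, key=lambda p: p[1], reverse=True)
    control, variant = [], []
    for i in range(0, len(ranked) - 1, 2):
        a, b = ranked[i], ranked[i + 1]
        if rng.random() < 0.5:      # randomise which side of the pair
            a, b = b, a             # lands in which bucket
        control.append(a)
        variant.append(b)
    return control, variant

# Hypothetical product pages with a long-tail traffic distribution.
pages = [(f"/product/{i}", 1000 / (i + 1)) for i in range(200)]
control, variant = bucket_pages(pages)
print(len(control), len(variant))            # equal-sized buckets
print(round(mean(s for _, s in control), 1),
      round(mean(s for _, s in variant), 1))  # near-identical mean traffic
```

Pairing pages by traffic rank before randomising keeps the two buckets' aggregate trends close, which is what makes any post-change divergence between them readable as a test result.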

Jon Clark (20:56)

So when you're setting up a test, I think one thing that's always interesting to talk about, at least for

me, maybe not for everyone else, but is the parameters around the test. Like how do you get to a successful result? And I've heard you mention, I think it's like 30,000 organic sessions per month is sort of like the page threshold that you're sort of looking for. Are there other criteria or parameters that are needed?

in order to set up a successful test, outside of what you might think of as traditional traffic? Like, is total conversions important, or number of pages?

Will Critchlow (21:29)

So, number of pages, yes; conversions is a funny one, we can kind of come back to that. But fundamentally, traffic and pages is the key thing. We have less of a hard-and-fast rule of thumb, because the number of pages varies a little bit depending on the traffic. You can get away with fewer pages if you have enough traffic to them, but you can never drop all the way down. You can't ever do it with just two pages, for example. So we're typically talking about

Jon Clark (21:50)

Right.

Will Critchlow (21:51)

I mean, normally hundreds of pages, if not thousands of pages, in each bucket. And as I mentioned, part of the focus thing actually is that we've really narrowed in and focused on the folks where we're doing our best work. And that tends to be enterprise, retail, some big travel sites, those kinds of things. And so we don't spend a lot of time thinking about these minimum thresholds, because most of them are way above that level. But we talk about them a little bit because it's useful to the broader community, the broader audience,

Will Critchlow (22:17)

to understand what can I do on a smaller site. We actually have a tool coming out pretty soon, it's in a kind of beta at the moment, where folks can upload their own traffic data and get told how sensitive a test they might be able to run, how many tests they could run on that site section, how long it would take to get to a result, those kinds of things. But yeah, good rule of thumb: 30,000 organic sessions a month, a thousand a day, is kind of what I tend to have in the back of my mind. And you need that distributed across a good number of pages. We have definitely run tests on dozens,

But, you know, hundreds into thousands is better if you can.
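
That 30,000-sessions-a-month rule of thumb can be sanity-checked with a rough power calculation. The sketch below is our illustration, not SearchPilot's model: it treats each bucket's daily sessions as Poisson noise, which understates real-world variance (seasonality, page mix), so the answer is an optimistic floor on test duration.

```python
from math import ceil

def min_days(daily_sessions, uplift, z_alpha=1.96, z_beta=0.84):
    """Optimistic lower bound on run time for a 50/50 page-split test.

    Models each bucket's daily organic sessions as Poisson (variance =
    mean), so treat the result as a floor, not a forecast. Standard
    two-sample sizing: n = 2 * ((z_a + z_b) * sigma / delta)^2 with
    sigma = sqrt(per_bucket) and delta = uplift * per_bucket.
    """
    per_bucket = daily_sessions / 2
    n_days = 2 * ((z_alpha + z_beta) ** 2) / (uplift ** 2 * per_bucket)
    return ceil(n_days)

# ~30,000 organic sessions a month is about 1,000 a day; hunting a 5% lift:
print(min_days(1000, 0.05))   # roughly two weeks, in the best case
# The same 5% lift on a site with a tenth of the traffic:
print(min_days(100, 0.05))    # months, which is why thresholds matter
```

Note that the required days scale with 1/(uplift² × traffic): halving the detectable uplift quadruples the run time, which is why small wins need big sites.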

Jon Clark (22:46)

Got it. And then as far as the types of tests that you're running, how do you think about organizing them? Do you always start with something that is very simple, or can you onboard a client and jump into something that's maybe a little bit more advanced? How do you bucket them when you're thinking about the opportunities?

Will Critchlow (23:03)

So that really depends how we're working with the organization in question, because we do everything from, at one end of the spectrum, kind of completely self-service. So just giving folks the software and they go ahead and run those tests. And we will typically do some training, some, you know, helping them out, getting up and running in the early days. I guess for the very first tests, we might suggest they do some basic ones, but then from there they can run with that. The other extreme is,

and this is one of the lessons that I don't know whether it's a good one or not, but has flowed from the agency professional services days into the way that we run a software business, is we also do quite a lot of professional services, and it's highly attached to the software revenue. And we can get into the weeds of the business model, but fundamentally there are times where we're actually helping folks out in building their experimentation program. And that can be anything all the way up to coming up with test ideas for them.

And in those cases, again, those ideas can come from anywhere. We had a good webinar on this recently with folks from Wayfair and from Adidas, and sometimes it's from their ideas, their team, the SEO team. Sometimes it's from elsewhere within their organization, whether it's the product team or wherever it might be, the content team. And sometimes it's from an agency. So there is no hard-and-fast rule on that. Some folks come to us with a wish list of 10 tests that they're desperate to run. Some of them are like, we don't know where to start.

Will Critchlow (24:19)

Can you help? And yeah, we're often looking for easier quick wins right at the beginning, but that's just to get the program going; there's no kind of fundamental SEO reason to need to do that.

Jon Clark (24:28)

So when you bring on a client and they say, you know, we know we need to run some tests and figure out some incrementality somewhere, but we're not sure what to run. Is that a situation where you sort of fall back to your SEO background and say, okay, well, let's run a technical audit, we'll identify some low-hanging fruit and use those as our initial test criteria? Or is it a different approach? Are you thinking about things, I guess, less...

service oriented and more business oriented. Does that make sense?

Will Critchlow (24:55)

Yeah, it does. So we don't get fully into the weeds of doing everything we would have done back in the day, full audit and so forth. But we do have a kind of playbook that can help us get them up and running with the testing program. And that could be a combination of

obviously reviewing those pages. Some competitor analysis can be quite valuable too. So, you know, what do these pages look like on your key competitors? But actually, we rarely find that folks come to us with no ideas. Again, because we're working with large orgs, we're pretty much always working with folks who have a strong in-house team. And they obviously have things they want to do. So...

Yeah, again, whether they're their ideas, whether they're from an agency they've been working with or whatever else it might be, typically there are tons of ideas. And it occasionally comes up in the sales process, you know: is this something we're only going to need for six months, because then we'll run out of test ideas? Our experience is that the backlog of test ideas grows faster than your ability to run the tests. And yeah, that's a little bit like saying, well, will our website be finished one day...

Jon Clark (25:45)

Yeah. Okay.

Will Critchlow (25:55)

obviously, it's always evolving because user needs change, markets change,

Google's preferences change, and increasingly new startups come along and challenge Google's dominance. So all of that's kind of quite exciting. Yeah, thinking back over the last few that we've onboarded, I would say if anything, they've come to us with a list of things that they've got ideas for. And our help has been more in the prioritization, kind of saying, we think these ones are more likely

to move the needle, we think these ones are more likely to be easy to build, for example, and obviously looking at that sweet spot of the easy ones that are going to have the big results.

Jon Clark (26:32)

Okay.

Joe (26:30)

Your team is in a unique spot to see the types of tests that succeed and the types of tests that fail. And I read recently how often tests fail. They fail more often than they succeed, I think that was the gist of something you wrote recently.

Do you have a system internally to review who's producing great tests and who's producing failures to try to teach each other?

I mean, you can then teach the industry, sure, if you want, but you've got such a great, you've got such great information at your fingertips. How do you share it internally?

Will Critchlow (27:04)

Yeah, so obviously we have to be somewhat cautious to make sure that we do this in a very compliant way, right? So some of our customers are very much opted in to wanting to share their information in exchange for receiving shared information back, right? And you see the benefits of that on the outside. Some of them are even up for having their test results shared publicly, and we write the case studies on the website and so forth.


Will Critchlow (27:26)

Other folks are more locked down and kind of say, we don't want this; we will forego some of those insights, but we don't want our data being shared. But it's an area of active exploration for us. And the answer is yes, obviously we have this massive database. I think we must now have run more of these tests than anyone else in the world.

The folks who we know are doing it at scale are doing it on a single website, right? There are some very big orgs who do this in-house without us, but we do it across all these different verticals, all these different sites. And I think our team has probably seen more of this now than

Jon Clark (27:48)

Okay.

Will Critchlow (28:00)

anyone else. But the more exciting thing, I think, is that the technology is coming on stream to make it possible to analyze this in a way that wasn't possible before, right? AI is giving us tools to analyze these things and tease out those

similarities, differences, those indicators that something might be a weak test, for example. We have seen less of the spotting that a certain individual isn't building good tests or whatever else. I think we've generally had really good knowledge sharing.

Obviously, not necessarily everybody's cut out for that work, but the folks who've gone through that onboarding, training, and development and have stuck with us, that team has been together for a long time now, and they have

all been actively learning from each other in a very proactive way. What I think we see more of is sometimes a customer will come to us, or we'll go to them and say, the win rate isn't where we want it to be on the portfolio of tests that you've been running. So it tends to be more customer by customer rather than team member by team member. And typically that's not because the ideas aren't there. It's to do with organizational bottlenecks or sign-off, or


Will Critchlow (29:01)

some other team saying, you can't change that thing, that's our remit. And so normally where we see those problems, what we're trying to do is break down those barriers between teams: help, for example, convince the product

team that they should be running some above-the-fold tests on the PDP, the product detail page. In some orgs that's a hard ask. And yet not doing it dooms the whole program to,

maybe not failure, but certainly some kind of averageness, mediocrity, because that's what users see first, and that's the most important real estate on the page. So I would say, yeah, we are trying to learn from the whole database. But actually, what I think we see as a bigger determinant of success is more organizational. The problems tend to come when sign-off is slow, or permission is not granted to run certain kinds of tests that the SEO team know would be

worth running. So either everything slows down, right, the cadence is too slow, or there are certain bits of the page that are off limits, or whatever. And so actually trying to break those silos down has been higher ROI for us and our customers than, I think, you know, just the test ideas aren't good enough.

Jon Clark (30:11)

I've been a subscriber of your email for a long time. And I think it's one of the emails that I open almost 100% of the time, because I'm always curious what the test output is going to be. And oftentimes, I think you've talked about this as well, it's different than

what you would anticipate on paper, just being in the industry for a long time. The one that jumps out to me is there was a broken breadcrumb schema. And as SEOs, right? You're doing a technical audit, you either don't see that schema implemented or it's broken, you call it out. It's just something that you would naturally go in and fix. You guys did that and actually saw a 7%, I think, decrease in traffic.

When you see these sorts of best practices

turn out opposite to what you expect, is it applicable to other sites as well? I think this maybe builds on Joe's question. You have this corpus of test data. Are you finding things that are regularly consistent across all sites, or is it really a situation where every site is highly unique, and so you have to run basically that same test on that new site to validate whether it's going to work or not?

Will Critchlow (31:22)

So that specific test you're referring to is a really interesting one. I'll come back to why that turned out to be confounding.

in a second. But to answer the main question first, luckily for us, it turned out that it's mainly the latter, right? You have to rerun the test in your own situation. And that is a combination of the uniqueness of everything else about your website, the competitors you're up against, the space you're in, the vertical, a million different things. And also these things change over time with user preferences, with algorithm updates, with

competitors changing things. Something that might be a competitive advantage today might be completely copied in six months' time, and now maybe you're even better off going back to how you were before. And so we're often retesting things as well. So, we didn't know that was going to be true when we started out. I guess there was a chance, and back to the idea that we were building this originally

Will Critchlow (32:12)

within the context of an agency, we thought maybe one of the outcomes would be we would end up with the answers that nobody else had, right? We would know.

Jon Clark (32:12)

All right.

Will Critchlow (32:19)

better than anybody else what to do for all of our clients. And we would have this secret weapon, and we wouldn't necessarily have to deploy the software for everyone. We would have just figured it out, have all the answers. But no, that's not where we ended up. And that's maybe an accident of history. But I think, yeah, we do end up learning some things, but what we learn, more than

Jon Clark (32:30)

Yeah.

Will Critchlow (32:39)

this specific hypothesis is gonna win, is the kinds of changes that move the needle. But you don't necessarily know which way they're gonna move the needle. And I can give you some examples. And we learn the converse too. So actually, one of the things that I've been banging on about internally, we haven't got it built yet, but I've been wondering about, is I think AI is gonna help us build


Will Critchlow (33:01)

a "this test is not going to do anything" detector. Right? So

I don't think we can build an AI that will tell you the answer to a test, but I think we might be able to build something that says: this isn't going to work, this is just going to be flat, inconclusive, because the combination of the kind of hypothesis and the kind of change to the page is just underpowered, right? Even if this is a good idea, it's not a powerful good idea.

Or you're tweaking a thing that we've never seen any evidence of moving the needle. And I've talked about this publicly before, but one example is alt attributes on images. There are many, many reasons to put alt attributes on your images. SEO, I don't think, is one of them.

We've not seen a test where it's moved the needle in either direction. And so I think we might be able to build an underpowered-test-idea detector.
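One rough way to see why some hypotheses are "underpowered" is a back-of-the-envelope minimum-detectable-effect calculation. This is an illustrative sketch, not SearchPilot's actual methodology; the function name and the coefficient-of-variation assumption are mine:

```python
import math

def minimum_detectable_effect(sessions_per_variant: float,
                              cv: float = 1.0,
                              z_alpha: float = 1.96,   # two-sided alpha = 0.05
                              z_beta: float = 0.84) -> float:
    """Smallest relative traffic lift a two-sample z-test could reliably
    detect at ~80% power, under a normal approximation.

    cv is the coefficient of variation of per-variant traffic -- an assumed,
    site-specific number you would estimate from your own data.
    """
    return (z_alpha + z_beta) * math.sqrt(2.0 / sessions_per_variant) * cv

# With 10,000 sessions per variant you can only detect lifts of roughly 4%,
# so a tweak expected to move traffic ~1% will almost certainly come back flat.
mde = minimum_detectable_effect(10_000)
```

A change whose plausible effect sits far below the detectable threshold for the available traffic is exactly the kind of test a "this isn't going to do anything" detector would flag.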

But what we have learned is there are certain things. So title tags are the classic example. A title tag test is the most likely test to be plus 20% and the most likely test to be minus 20%. And that's because it is uniquely powerful, right? It is a very powerful SEO attribute, but it is also a click-through rate attribute. Even with Google rewriting a lot of titles, they don't rewrite everything. They often start from what you've got, blah, blah, blah. And so

it factors into the organic click-through rate as well as the rankings, and so it's an exceptionally powerful attribute. And one of my kind of more shameful admissions is that I have gone back and reread some of my old recommendations, some things that I wrote back in 2009, let's say. And I found one where we did some keyword research, and

we delivered, I still think, some pretty good keyword research, some great competitor insights and so forth. But then we did have this one recommendation that was like, and you should think about implementing some of these keywords that we've helped you find in your title tags. And now I look at that and I'm like, wow, that is a high-variance recommendation that could be plus 50%, minus 50%, right? Depending on what exactly you write. And I should have spotted this because

Jon Clark (34:48)

Correct.

Will Critchlow (35:04)

back in that era, pre-AI stuff, a huge part of the job of a paid search analyst was writing adverts. And we all knew that one advert could out-compete another by 5x just by being written better, not because it had more keywords or fewer keywords or whatever else, but by being more compelling. And why on earth would we not think the same thing would be true for the organic snippet? So what you write really, really, really matters.


Will Critchlow (35:28)

And that was one of our earliest insights. I remember a test that we ran in 2017 or 2018 where, luckily, we got to iterate on it, because the first version was catastrophically bad. Like minus 20-something percent in organic

Jon Clark (35:42)

Wow.

Will Critchlow (35:42)

traffic. But the idea of changing the title tags was good, and we managed to iterate to a plus 18% or something, if I remember correctly. And it was the same keywords, right? It was the same insight, but the actual implementation

was highly variable. Yeah, I think we've learned powerful things; there's transferable knowledge about what kinds of things might make a big difference. But the knowledge of whether it's

gonna be good for you or not is less transferable, if that makes sense.

Jon Clark (36:10)

Right.

Yeah, that was something that always was incredible to me about the paid side. Like you had all this rich data and you could get it so quickly, right? You just turn it on and you start collecting data. And it was so rarely used in SEO, especially in the early days. That was always such a great opportunity. You mentioned AI a couple of times. And I believe you guys recently rolled out

Will Critchlow (36:13)

Yep.

Jon Clark (36:31)

the ability to do some A-B testing on, I think you used the GEO acronym for ChatGPT.

Talk to us a little bit about that. How are you setting up those tests? What are you looking for? What data are you analyzing? Because we also know that attribution through LLMs is pretty tough at the moment.

Will Critchlow (36:48)

Yeah, so this is one of those areas of focus paying off because I don't have great answers for you if you're in media, for example, but in our core space of e-commerce, there is still an action at the end of it.


Will Critchlow (37:02)

Right? ChatGPT isn't going to ship you a pair of sneakers. So there has to be an actual purchase that the retailer knows about, right? You can't secretly sell a pair of sneakers. And so even with some of the new agentic commerce protocol, all that kind of stuff, there's still a conversion. And actually, in most cases, and it'll be interesting to see where that goes, but in most cases there's still actually traffic, right? So there might be less of the top-of-funnel stuff, but the actual converting traffic is still coming to your website and still buying from your website. And so what we're focusing on right now

Jon Clark (37:06)

Okay.

Right.

Will Critchlow (37:29)

is running those experiments focused on

driving LLM traffic to your website. That is working well. Interestingly, actually, it's even better in travel. So the biggest percentage growth, like percentage of traffic coming from LLM sources, is in the travel parts of our customer base. And that's kind of getting ahead of some of the retail

folks right now, but it's all changing and all growing so fast. Who knows where that'll be in three months


Will Critchlow (37:57)

or six months. But what we're doing right now is we've got two kinds of approach. One is a blended net-traffic approach. And this is actually my favorite, although I think it's going to be less exciting for the outside world. This is the idea that, firstly, a lot of Google search is AI-powered. And I don't just mean AI Overviews or AI Mode.

Even the ten-blue-links stuff has a lot of what maybe they would have called machine learning, rather than AI, over the last few years. And so that whole thing is increasingly opaque, increasingly black box. This is why we exist in the fundamental sense: you can't just approach it with a checklist mindset or a best-practices mindset. You have to stop guessing and start testing on that stuff. And so I talk a lot about net impact

Jon Clark (38:30)

Okay.

Will Critchlow (38:44)

of the idea that, imagine you could come up with something that was amazing for your ChatGPT referrals but cratered your

Google traffic. You probably don't want to do that right now, right? Maybe one day that's a good idea if some of these trajectories carry on, but for most of our customers, Google traffic is 10 to 100x bigger. So you can't afford even a 1% drop in your Google traffic in order to go hunting some of this stuff. And so I think a lot about net impact. And so there we're looking at total traffic to those pages.
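The net-impact arithmetic Will describes can be made concrete with a tiny sketch. The function name and numbers are illustrative, assuming the roughly 10-100x Google-to-LLM traffic ratio he mentions:

```python
def net_traffic_impact(google_sessions: float, llm_sessions: float,
                       google_lift: float, llm_lift: float) -> float:
    """Relative change in total traffic after a test, blending both sources."""
    before = google_sessions + llm_sessions
    after = (google_sessions * (1 + google_lift)
             + llm_sessions * (1 + llm_lift))
    return after / before - 1

# A +50% jump in LLM referrals cannot offset even a -1% dip in Google traffic
# when Google is 100x larger: the blended result is still a net loss.
loss = net_traffic_impact(100_000, 1_000, google_lift=-0.01, llm_lift=0.50)
```

This is why a pure LLM win can still be a bad change overall: the blended number is what the business feels.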


Will Critchlow (39:13)

We've got some interesting ideas bubbling around at the moment, like maybe

we even include some of the dark traffic, right? Some of the so-called direct traffic to deep pages. Because sure, direct traffic to your homepage might be somebody typing in the whole URL, but direct traffic to a product page, it's not really, is it? Somebody clicked on a link that didn't send a referrer, right? It's from WhatsApp or it's from Slack or it's from a desktop app or it's from an LLM. And so actually I think maybe there's something there, but...
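The dark-traffic intuition here — that a referrer-less hit on a deep page probably wasn't a typed URL — could be sketched as a simple heuristic. This is a hypothetical illustration, not a SearchPilot feature; the depth threshold is an assumption:

```python
from urllib.parse import urlparse

def is_probable_dark_referral(landing_url: str, referrer: str) -> bool:
    """Flag 'direct' hits on deep pages as likely dark referrals
    (WhatsApp, Slack, desktop apps, and LLMs often omit the referrer)."""
    if referrer:
        return False  # a referrer is present, so normal attribution applies
    path = urlparse(landing_url).path.rstrip("/")
    # Assumed threshold: two or more path segments counts as a "deep" page.
    return path.count("/") >= 2

is_probable_dark_referral("https://example.com/products/sneaker-123", "")  # True
is_probable_dark_referral("https://example.com/", "")                      # False
```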

Jon Clark (39:27)

Right. Totally unlikely, right?

Will Critchlow (39:40)

Let me just back out of that rabbit hole a second. So there's a whole net-impact side of things. The stuff that I think is going to be more exciting for the audience at large is where we are running a pure LLM test. So we're saying, okay, let's just look at the traffic from ChatGPT or Perplexity or whatever it might be. And often we're doing a bundled look at that traffic, you know, an analytics segment that combines some of those sources. And

essentially the mechanism for that test is that primarily it's attacking the fan-out queries. So what is happening for most of these things is, when you do something that looks a bit like a search in a conversation, in a ChatGPT prompt or whatever, it's grounding its answer in search results. And it's doing that because the model itself has a cutoff in its training information.
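The bundled analytics segment Will mentions is essentially a referrer filter. A minimal sketch — the hostname list is my assumption and would need to track new assistants as they appear:

```python
from urllib.parse import urlparse

LLM_REFERRER_HOSTS = {  # assumed, illustrative list
    "chatgpt.com", "chat.openai.com", "perplexity.ai",
    "copilot.microsoft.com", "gemini.google.com",
}

def llm_segment(sessions: list[dict]) -> list[dict]:
    """Keep only sessions whose referrer hostname is in the LLM bundle."""
    def norm(url: str) -> str:
        host = (urlparse(url or "").hostname or "").lower()
        return host[4:] if host.startswith("www.") else host
    return [s for s in sessions if norm(s.get("referrer", "")) in LLM_REFERRER_HOSTS]
```

In practice you would maintain this segment in your analytics tool rather than in code, but the logic is the same.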

I think, as SEOs, one of the things that we've learned, that Google has learned over the last quarter century, is that freshness is critical, speed of refresh is critical. All that stuff is so important. You remember when query-deserves-freshness came along, and all the news search and all that kind of stuff. So in, again, our space, it's things like stock levels, it's things like price offers, it's all the kind of competitive dynamics of that space.


Will Critchlow (40:49)

And so there's no way that you can train right now with current technologies, no way you can train that stuff into your LLM. And actually the trend has been in the other direction. So GPT-5 was

a smarter but less knowledgeable model than prior iterations. And that was deliberate, apparently. I mean, I'm not on the inside there, but they essentially realized that they can pipe the knowledge in, and what they want is intelligence in the core model. You know, there's no point in it being absolutely certain who won the Super Bowl when

another Super Bowl comes along and your knowledge about who won the Super Bowl is out of date. In a product sense, there's no point having this perfect knowledge of who has a particular sneaker in stock when that's changing day by day, hour by hour, minute by minute. And so all that stuff has to be fresh. And so all of these things are working off grounding on search results. And this is actually why I'm quite bullish on Google. Grounding on search results, grounding on freshness, getting that from the

fan-out queries. And those fan-out queries are amenable to our kind of testing. So we are running the same test, the same implementation, and measuring the result in that traffic. And, I'm hoping, we're just working with some of our customers on the permissions and what we can say about which test, but we've got some tests with differing results: positive for Google, negative for LLM; positive for LLM, negative for Google.

Will Critchlow (42:08)

I think the industry is going to be quite excited about that when we finally get to write about it, because

if there's one thing that we love, it's a good argument about whether GEO is SEO, whether SEO is dead, and so on.

Jon Clark (42:13)

Definitely. Well, we know SEO is dead. I think the GEO, or sorry, I know we know SEO is not dead. Let me clarify.

Joe (42:21)

Hehehehe

Jon Clark (42:27)

The GEO debate is always a fun one. I feel like we asked that on most podcasts to get people's opinions, but I wanted to talk a little bit more about the AI testing because,

and you touched on this a little bit, the ability for LLMs to find information. One of those critical elements is crawlability. And so again, thinking about how you guys set up tests generally, there is an element of how the site is built that being able to get an outcome depends on. So is there a situation where,

trying to think how to ask this question best. Maybe it's two parts. One is, do you test things like UX where you say, let's take this section of the page that's currently rendered in JavaScript and let's try to turn it into HTML and just see what the impact is. So that's sort of like one piece of it. And then if you're doing things like that and someone wants to run a test on the LLM side, is that sort of the initial criteria? Like we need to get these pages out of a heavy JavaScript environment and more into a digestible format

so that the LLMs can get that information a little bit easier and execute the page, find the page, find the content on it. How are you guys thinking about that nuance there?

Will Critchlow (43:32)

Yeah,

I think the short answer is yes and yes. So we definitely have done the first thing of saying, you know, you have this content that's in the page, but it's rendered client-side or it's async or whatever; can we have that in the HTML that's served from the server? And we've seen that make no difference in some situations, and we've seen it make a difference in other situations. I personally

don't love JavaScript, but that doesn't really help in the modern world of the web. And so yeah, there are definitely cases where you could at least have the hypothesis that, hey, either for UX reasons, for speed reasons, for crawlability reasons, we're going to experiment with that. The second part of your question, I think it's even more important when you're looking at non-Google LLMs.

Jon Clark (44:11)

Great.

Will Critchlow (44:12)

Google is using

its main crawler and indexes JavaScript fine, although we still sometimes see some benefits, but we can come back to that. I like to think of it as the supply chain of this information. The supply chain is incredibly convoluted right now. And you can see this in some of the lawsuits that are flying around, right? People are crawling. Some of these startups are crawling Google.

Will Critchlow (44:33)

Some of them are using other providers who crawl Google. Some of them are using the old Bing API that you


Will Critchlow (44:40)

can't get your hands on anymore. Some of them are using other people who crawl Bing. And actually trying to figure out what the supply chain is, where did this one piece of information that is on your website get from there to the LLM, is very, very convoluted. And just like we're not trying to actually understand the Google algorithm, we're trying to answer the question: is this change a good idea or not?

Jon Clark (44:40)

Okay.

Will Critchlow (45:02)

And it's just the same kind of thing. We're not trying to actually figure out what that supply chain looks like. I'm sure it'll be different again in three months or six months. We're just trying

Jon Clark (45:04)

Okay.

Will Critchlow (45:09)

to say, hey, if we take this thing and put it in the raw HTML of the page, is that a good idea? Does that help on net across the whole thing? I'm not aware of a test that we've run yet that focuses on that specific hypothesis in the LLM measurement. I'm sure we will get to it, and I'll be interested to see what

Jon Clark (45:09)

Right.

Will Critchlow (45:29)

the result ends up looking like.

Joe (45:30)

A follow-up for clarification on where the information comes from.

And I guess I'll keep it really high level to try to clarify. The LLM has a library of information that it's trained with on the left side, and it draws from that information when it can about some historical fact or whatever. On the other side, there's the retrieval process, where maybe that information


Joe (45:56)

comes from a shared database

or a live website search. So SearchPilot can help on the retrieval side, testing within the retrieval side. Am I getting that right?

Will Critchlow (46:07)

That's correct. Yeah. We're not testing on the... You could have a hypothesis that says, you know, if we change our website in this way, GPT-6 will look more kindly on us, or something. We don't see that as a particularly fruitful marketing endeavor right now, as in, our customers

should not be spending their time doing that. Partly because we know that we couldn't help them if they did want to do that. But I also don't think it's a particularly valuable use of their time, because it's utterly unknowable, and you're doing all this work that might pay off in a year's time. But even if it does, you're not going to be able to tell if it was a good idea or not. It's actually quite similar to the concept of Google's core updates, right? Should you be spending all your time optimizing for the next core update?

Will Critchlow (46:51)

My view is no, because they bounce around all over the place. And you're much better off looking at the trajectory of the underlying, you know, the day-by-day, week-by-week work of making your website better as the algorithm stands right now. And trusting that if you do that in an aligned

way, right, so you're not trying to trick anyone, you're trying to build a good user experience, you're trying to build a valuable website, then you're skating to where the puck is going to be. You're optimizing for the same thing that Google is. I wrote a more detailed blog

article about that; the argument has a bit more nuance to it, but it's a similar kind of concept. The same reason that we don't chase core updates is the reason we don't chase the next model-training iteration.

Jon Clark (47:27)

Right.

So let's talk about log files.

I feel like they've always been a valuable analysis tool, but I feel like they've had a resurgence in recent months just because of the challenges with attribution. Chris Reynolds was on the show last season from Indeed, and he talked about some ways that they were connecting

log file data to potential LLM visits. Are you guys using those files in any way as part of your tests, to sort of validate or invalidate a test? Or is that an area that you're not really integrating into your testing plans?

Will Critchlow (48:02)

There are layers to this question. And we're getting into my favorite geek space, so forgive me if I go down too much of a nerdy rabbit hole. So first of all, we don't do anything with log files technically. Log files are a side effect, right? You

have a visitor, whether it's a human or a crawler, visit a website, and the web server writes that log file as kind of a side effect. One of the big challenges we always used to face in SEO consulting or in agency work was getting hands on aggregated log files, because those things are big for a start, right? Petabytes of data. They're scattered and they don't get kept. They tend to be quite transient,

Jon Clark (48:37)

Right.

Will Critchlow (48:43)

and they end up scattered across different systems, right? So you've got some in your CDN, some in your origin server, you're piping them to this tool, that tool, whatever else. Some of the enterprise tools for managing and analyzing this stuff

Will Critchlow (48:53)

are very expensive and, in particular, very hard to integrate. And one of the most commonly failed integrations that I've heard about is folks buying software to deal with log analysis of all kinds, not just for SEO, and struggling to integrate that stuff.


Will Critchlow (49:08)

We don't touch our customers' other log files, as in like from their web servers and from those kinds of things. However,

what we do is we often see those requests directly. So SearchPilot deploys in one of three ways. One integration method is an API integration on the server side. When we're integrated by the API, we don't get to see every crawl request, because responses could be cached; there's all kinds of

technical detail on that side of things. But of the other two deployment modes, one is deploying inside the CDN and one is actually deploying into the edge. So if our customers have a CDN that supports edge technology, then we can actually deploy into that edge. And that means that we get to see those crawl requests directly in our system without having to ingest petabytes of somebody else's log files. And so we can keep just the bits we care about, which are typically

Will Critchlow (49:55)

crawl logs, right? We don't really care about the human visits, so we can throw away the vast majority of that petabyte of data and just keep the Googlebot or the LLM-bot stuff that we care about. So that's a long preamble to saying: not exactly log files, but we do look at crawl data. And it's another area of active exploration for us right now, because everything's changing so fast on the LLM side: how indicative of actual results is crawl data?
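Keeping only crawler hits at the edge comes down to a user-agent check. A toy version of the idea — the token list is illustrative, and real bot verification would also confirm source IPs (reverse DNS or published IP ranges), since user-agent strings are trivially spoofed:

```python
# Assumed, illustrative crawler tokens; extend for the bots you care about.
BOT_TOKENS = ("googlebot", "gptbot", "oai-searchbot",
              "perplexitybot", "bingbot", "claudebot")

def keep_request(user_agent: str) -> bool:
    """Keep crawler hits; discard the far larger volume of human traffic."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)
```

Run at the edge, a filter like this lets you retain the interesting sliver of a petabyte-scale log stream without storing any of the rest.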


Will Critchlow (50:20)

Up till now, we have not used it for calling tests successful. So we've not had the case where we said, you have the hypothesis that if you make this change, Google is going to visit your website more often and crawl your pages more often, and that in itself is a win. We've always figured we want to measure something a bit closer to the business than just Googlebot visiting a lot. What we have used it for is more around sense-checking whether a test could have been successful by now,

because if the site section hasn't been recrawled since the change was made, it's not possible for it to be showing up in search results yet. If you think about changing a title tag or whatever, and Google hasn't come back and seen that the title tag has changed, it can't possibly have made a difference yet. So we sometimes use it more in that sense, to say: this test that is currently neutral, inconclusive, flat, don't worry about the fact that it's not moved yet. Only 10% of the site section has been recrawled


Will Critchlow (51:09)

so far. And we're going to wait until that's 80% before we even start to think about whether this could be beneficial or not. So we've used it more in that sense, but we have some R&D going on to figure out whether we could integrate it more directly into the measurement.
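The recrawl gate described here — don't judge a flat test until enough of the section has been refetched — is easy to sketch. The 80% threshold comes from Will's description; the function names and data shape are my own illustration:

```python
from datetime import datetime

def recrawl_coverage(last_crawl: dict[str, datetime],
                     change_live: datetime) -> float:
    """Fraction of test pages the crawler has fetched since the change shipped.

    last_crawl maps each page path to its most recent crawl timestamp.
    """
    if not last_crawl:
        return 0.0
    seen = sum(1 for ts in last_crawl.values() if ts >= change_live)
    return seen / len(last_crawl)

def ready_to_evaluate(coverage: float, threshold: float = 0.8) -> bool:
    """Only start interpreting a neutral result once coverage is high enough."""
    return coverage >= threshold
```

Until `ready_to_evaluate` returns true, a flat result says nothing: the search engine may simply not have seen the change yet.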

Jon Clark (51:22)

Got it, got it, that makes sense. You talked about AI models in general and their reliance on user-generated content. And it makes sense: an LLM can't physically try a product, right? You mentioned sneakers before; they can't put on a pair of sneakers and make a decision. So they're reliant on this user-generated content to formulate that answer.

Will Critchlow (51:29)

Yep.

Jon Clark (51:43)

We know it's incredibly impactful for LLMs. It's also getting integrated into traditional Google search results. Are you guys doing any testing on the UGC

side of things? I don't necessarily know how you would even set that up without having control in the social platforms themselves. But are you guys doing any testing around that or helping clients think through how they might get more involved there to influence the LLM results?

Will Critchlow (52:08)

So we are very focused, okay, back to that focus thing. We're very focused on what you do on your own website. So.

Will Critchlow (52:14)

we're not playing in the PR space or the social media influencer space or any of those kinds of things. I'm sure there's value in those areas, and if I was in the agency world, I would definitely be running after some new shiny things. We're very focused on what you do on your own website. And the overlap is that we do think about some of those data sources that overlap with this. So one example is

reviews and the UGC stuff that you put on your own website. That could be putting it into the HTML rather than JavaScript, embedding it so that it doesn't need some external API call, or increasing its prominence. There's a variety of things where that can make a difference. The other part is that it doesn't have to be UGC. It does need to be from some form of human experience, but it could be what the manufacturer says about a product, right? So it can be...

product information. And actually, not to give too much away, but I think some of our more successful, or some of our only successful, LLM tests so far have been about giving more information. And that source of information could be manufacturer stuff, product stuff, just exposing more details and data. Because the LLM doesn't get bored, right? A human might be bored reading some endless long spec of

things. But we are attacking some bits of that problem. We're very much not looking at the rest; we're not trying to influence what happens on Reddit or YouTube or wherever.
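The HTML-vs-JavaScript point above is easy to demonstrate. The markup below is invented for illustration; the check simply shows what a crawler that does not execute JavaScript would and would not see in the raw HTML response.

```python
# Sketch: content embedded directly in the HTML is visible to non-JS crawlers;
# content fetched by a client-side API call is not.

RENDERED_SERVER_SIDE = """
<div class="reviews">
  <p>Great fit, very comfortable after a week of wear.</p>
</div>
"""

RENDERED_CLIENT_SIDE = """
<div id="reviews"></div>
<script>fetch('/api/reviews').then(r => r.json()).then(render);</script>
"""

def visible_without_js(html: str, snippet: str) -> bool:
    """Crude check: is the snippet present in the raw HTML a crawler fetches?"""
    return snippet in html

print(visible_without_js(RENDERED_SERVER_SIDE, "Great fit"))  # True
print(visible_without_js(RENDERED_CLIENT_SIDE, "Great fit"))  # False
```

This is the kind of difference an A/B test of server-side-embedded reviews versus API-loaded reviews is probing.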

Jon Clark (53:33)

Will Reynolds had a great one, I mean, it was the simplest of A/B tests, where he added a snippet into the footer of the site, something about, you know, the average tenure of a client, I think it was, and LLMs gobbled that up so

quickly and started including it.

Will Critchlow (53:46)

They're so naive. I don't mean Will, I love Will's work; it's the LLMs that are exceptionally naive at this point, because I'm sure the black hats are having a field day. You don't hear the phrase black hat so much these days, do you? I'm sure they're still out there; they're probably just making so much money. Right, exactly. And they're just not revealing it.

Jon Clark (53:51)

Yeah,

I think when it's quieter, they're probably doing more.

Joe (54:03)

There is no black hat for LLMs

yet.

Will Critchlow (54:06)

Well, but I mean, I think there is, and they're just not telling us about it. They're just quietly making bank. Because, not to cast aspersions at all, but I saw an experiment, I think Lily Ray did it, where she was looking at, you know, writing, I forget if it was just a LinkedIn post or a LinkedIn article, I don't remember. And, you know, it was the only place on the internet where

Jon Clark (54:06)

first.

Yeah.

Will Critchlow (54:29)

a particular claim was made. And before you know it, the LLMs are repeating that claim. You're like: that's a single source from one social media post. This is wild, what are you doing? And as I said earlier, one of the reasons I'm kind of bullish on Google, another reason I think Google has a great shot right now, is that they've been playing the adversarial information retrieval game for a quarter century. And I am amazed,

consistently amazed, that these folks are apparently getting paid these huge pay packages to go and work at the large language model startups, and none of them have thought about the fact that some of the stuff you read on the internet isn't true. I know some of the people who write the internet, and I don't trust all of them. And I think Google has been fighting this battle for, well, longer than I've been in the game, and they've got pretty good,

Will Critchlow (55:15)

and they can ground their model on trust data, on truthiness scores, whatever the evolution of all this stuff is that they've got. And they don't expose that. So nobody outside of Google is getting access to that data. And I think we're going to see a lot of that kind of stuff being very important in the next few years, at least till the Singularity.

Jon Clark (55:34)

I know we're bumping up against time, so I want to jump to some rapid fires. But one question before that. One thing that I've always enjoyed reading from you is your thoughts on leadership and even just structuring your life, right? Those things that improve what you're doing. I think your brother, Tom, does this really well also. You talked a lot about focus: if you had to pick a personal metric of sorts that you use to judge whether you're spending your own time well, what would that be? I don't know if keeping score is the right analogy here, but do you think about that?

Will Critchlow (56:07)

I know where you're coming from and it's

a very interesting question. And actually, you mentioned Will Reynolds earlier. I've chatted to Will about this a little bit because we have a lot of parallel experiences in, whether it be in business or in being a dad or whatever else.

Will Critchlow (56:20)

I think he's probably a little bit more metrics-driven, and some of my co-workers are the kind of people who are very, very numbers-and-leaderboard driven, you know, the kind of person who knows their personal best for every weight in the gym and all that kind of thing. I am not naturally that. I am an optimizer at heart more than a target-setter at heart. So I'm always like: I'm here and I want to be there, as in, a bit better.

Will Critchlow (56:45)

Bit better every day, rather than, you know, I'm going to run a marathon in two years' time, so I'd better start structuring my training life around that, or whatever. I'm not going to run a marathon, incidentally; I hate running. But I have a very vibes-based answer to your question, which is that I look to feel good about what I'm doing. And I don't mean that in the kind of hedonistic sense, you know, I had a lie-in and a hot chocolate or whatever. I mean more like,


Will Critchlow (57:11)

I feel like I did valuable work. I feel like I was true to my values, was aligned with the things that I want to achieve in my life. I do take stock of that stuff, whether it's family stuff or being a good friend or whatever; I do reflect on those things, but I don't have metrics for all of it. And I look to try to just string together the wins, right? String together the...

I felt good yesterday, I feel good about today, I feel good about what I'm doing tomorrow, kind of one foot in front of the other. And yeah, I often refer to it as typing, since so much of our work is done on a computer. I'm like, I've got to do some typing, as in, the rubber's got to meet the road. It's all very well keeping it up there in your head, but you've actually got to do the thing. And yeah, obviously that's for knowledge work.

But I think probably the biggest thing that I unlocked recently on that, that might be helpful to someone, was this. For a long time I kind of bumbled along, taking work stuff quite seriously and having other things fit in around the edges, particularly health things and working out and exercise. And I've always been very active, you know, enjoyed playing sport, but...

That isn't enough at a certain point. That's maybe enough in your 20s, but at a certain point you can't just play pickup. And I was really hesitant. It took me a long time to kind of really figure out why I was reluctant to kind of commit to a schedule or say I'm going to work out this many times a week or whatever. And I realized that it felt like a compromise. It felt like I was saying, I'm going to do less of the other things that are important to me, whether that's family life, whether it's...

work or whatever. And I saw a quote and I should really look up who said it, but it stuck with me, which was how you do one thing is how you do everything. And the idea that you can take all these things seriously, actually. There's a kind of compounding benefit actually. And I found that actually adding some little goals or adding some structure or adding some commitments in all bits of my life,

have made me more, whatever you want to call it, productive, focused, output-oriented, better at some of the other bits of my life, even the ones that you wouldn't normally set a target on, if that makes sense. I feel like we're in a therapy session.

Joe (59:07)

I'd like a whole nother hour to go into that.

Jon Clark (59:07)

Yeah.

Jon Clark (59:12)

Yeah, exactly. Exactly.

Joe (59:13)

I know, but I think our audience would love it. At least Jon and I would. Maybe we can get a couple more rapid fires out of you. The Business Class Lounge podcast has brought me so many hours of enjoyment; thank you for continuing to make time for that. Is there one guest, can you think of one guest, who said something that really stopped you in your tracks and made you think differently about

Jon Clark (59:18)

So good.

Will Critchlow (59:26)

I'm pleased about that.

Joe (59:35)

your work today?

Will Critchlow (59:36)

Funnily enough, I do have an answer to that. So, Brian Hale. I first got to know Brian when he was at Facebook; he was early in his career, but a high flyer, rising rapidly.

And he's moved on now, but he told a story about his time at Facebook, about his manager, who actually I know as well, but we'll keep it about Brian. We did some work with Facebook way back when, all about doing SEO for your Facebook page and that kind of stuff, back in the Distilled days. And so we got to know Brian, and he was somebody that I really wanted to have on the show. And the story he told was about...

a personal development journey he went on with his manager. He's always been this kind of high performer, high flyer, whatever, but, as he put it, he's Canadian and he's too nice. And his manager was like: Brian, you're being too nice. The thing that's holding you back is that you need to be more forceful in certain situations. If you want to make it to that next level of leadership, you're going to have to have moments where you

break out another level of approach to some of these things. The thing that stuck with me was his manager saying to him: we will know this was a success if, by the time we have our next quarterly review, I have had negative feedback from somebody that you've gone too far the other way. The only way I'll know that you've really taken this on is if you've had a go at going too far.

Jon Clark (1:00:59)

Mm.

Will Critchlow (1:00:59)

And he's basically like, next quarter, I want somebody to come to me and go,

Brian took it a little bit too far and you need to have him rein it in a little bit. Because actually, all of us are kind of captured by our nature and our habits and our way of being, and you almost never do go too far, right? And sure enough, what happened was he tried really hard to go too far and in fact got it exactly right. And that journey is

what really stuck with me, both as a personal development thing and as a coach: the idea of having somebody bust out of their routines by trying to go too far in the other direction. Okay.

Jon Clark (1:01:34)

God, I can't think of a better spot to end than that. I mean, that's such a good lesson. It's a hard one too, right? Especially for people early in their career, being willing to have a voice is a tough thing, I think. Will, this has been incredible. I really wish we had two, three, 24 hours with you to... No, but that's totally fine.

Will Critchlow (1:01:50)

Those rapid fires were not very rapid. I'm sorry about that.

Jon Clark (1:01:55)

But before we wrap up, let our listeners know where they can find you, any interesting tests to sign up for on the email newsletter, or anything else that you would want to share.

Will Critchlow (1:02:03)

Yes. On the work side, searchpilot.com: you can check out everything from the Business Class Lounge podcast, which we also do as a webinar series, so if people want to watch the video side of it, they can do that as well. And for the test results that we've talked about a lot, it's just in the resources section; you can sign up for the email to get those every couple of weeks when we put them out. In terms of me, I'm quite easy to find: I'm Will Critchlow on all the social media platforms, most active on...

Twitter, X, and LinkedIn these days. Yeah, easy to find on both places, but also on Threads and Bluesky, if that's your bag.

Jon Clark (1:02:36)

Perfect. I think it's one of the benefits of being early in this industry: you get your actual name as your username on all these platforms. Thanks again for joining us on the Page 2 Podcast. And for those listening, if you enjoyed the show, please remember to subscribe, rate, and review. We'll see you next time. Bye bye.