Most brands are optimizing to be found. Far fewer are asking what AI actually says about them once a buyer starts asking questions. That gap is exactly where Jessica Bowman, my guest on today’s show, works.
Jessica is a long-time SEO veteran with a career spanning two decades and a specialty in in-house SEO. As the founder of Coxwell & Gain, she’s an enterprise strategist and consultant to Fortune 500 brands, pioneering the operational side of AI visibility. Her consultancy is built around a simple yet urgent idea: AI is forming judgments about your brand, and most organizations have no system to manage that.
Jessica’s work goes well beyond keywords and rankings. She helps companies see how decisions in product design, fulfillment, customer service, and even earnings calls shape what AI tells buyers about them.
In this episode, we discuss how AI builds a case for or against a brand. We cover what an AI visibility audit actually looks like in practice. We explore why the fix often lives outside the marketing department entirely. And we get into what leaders need to do now, before AI Mode becomes the norm for all Google users.
We also get into real tool costs, the new cross-functional demands this discipline creates, and why some of the most damaging AI narratives are ones brands unknowingly wrote themselves.
If you’ve ever wondered whether AI is working for or against your brand, this conversation will provide some really interesting insights to sharpen how you think about it.
So without any further ado, on with the show!
In This Episode
- [02:42] – Jessica shares her path from Enterprise Rent-A-Car to early in-house SEO and now AI-driven “recommendability” work via Coxwell & Gain.
- [09:43] – Jessica explains that future-proofing in the AI era requires leaders to implement governance focused on AI recommendability.
- [14:17] – Jessica explains AI success needs cross-functional alignment across all customer-impacting teams to avoid negative AI narratives.
- [16:41] – Jessica describes an AI visibility audit as a largely qualitative, manual process that analyzes what AI systems are saying about a brand, identifies reputational risks versus opportunities, and translates those insights into actionable guidance.
- [19:39] – Jessica explains that after delivering an AI visibility audit, her role typically shifts into guiding implementation.
- [23:36] – Jessica explains that in AI visibility work, in-house teams are often essential for cross-functional execution, but external agencies still add value through audits, strategic perspective, and supplemental expertise.
- [28:36] – Jessica shares that for prompt tracking and analysis, she’s experimented with several early-stage AI visibility tools and finds the space still immature, often with missing functionality and workflow limitations.
- [31:49] – Jessica says she still prefers ChatGPT because it’s highly personalized to her work and retains useful context, while she sees tools like Claude and Gemini as capable but less consistent for her long-term, work-specific use cases.
- [35:00] – Stephan discusses how long-term platform “lock-in” parallels AI memory retention, introduces a “memory palace”-inspired tool for preserving prompt history and improving AI recall, and then asks Jessica to share any recent AI hacks or discoveries.
- [39:34] – Jessica advises executives to approach AI visibility as a fundamentally new discipline beyond SEO, requiring governance across the organization.
- [43:18] – Stephan closes the episode as Jessica shares her website for learning more and hiring her, and they wrap up with a brief thank-you reflecting on the discussion about AI visibility and governance.
Jessica, it’s so great to have you on the show.
Thank you for having me, Stephan. It’s good to be here.
We’ve known each other for many years. You’ve been on the speaking circuit, the SEO speaking circuit, for decades, literally. Let’s start with you sharing your hero’s journey with our listeners or viewers. How did you get into online marketing and SEO, and now AEO and AI visibility? How did that all unfold?
So we have to go back. I worked at Enterprise Rent-A-Car 20+ years ago. I started in their R&D department, moved into process analysis, then user experience, and finally into a kind of project management role for front-end design. And then my job was dissolved, so I sat there. They liked me enough to let me interview for other jobs in the company, and while I did that, I went around asking people how I could help.
And one guy, who turned out to be one of my best managers, Paul Tucker, said, “We want to get into search engines. I think it’s called Search Engine Marketing. Why don’t you go check that out?” And so I did, and I came back about two or three weeks later with my first SEO audit, and I said, “It’s called Search Engine Optimization, and this is what we need to do.” He went back to his boss for a couple of weeks, and they came back to me and said, “Jessica, do you think this is a full-time job?” I said, “Yes, there are several years of work there.” And he said, “Would you like the job?” I said, “Yes,” and that’s how I got into SEO.
And what year was this?
Oh, my goodness, 2004? No, 2000? I don’t even remember; it was over 20 years ago. I mean, Ask Jeeves was still around back then, so it was very early days. Google was very, very young, I just remember that. And Yahoo was cool. So I started doing SEO, and at that time, no one at a large brand was really doing it or dedicated to it. So I would go to conferences and talk about my problems with development and other teams, and people would just kind of look at me and say, “Wow, that sucks,” and then move on, because they were consultants or independents with their own websites. There just weren’t brands represented then.
So I was learning trial by fire, how do you do SEO in a very large organization? And eventually that became my specialty. That’s what I talked about at conferences. And I realized very quickly that what I wanted to do was speak. I just started pitching, and they liked what I had to talk about, and it resonated with a lot of brands and in-house managers. And so that’s where I kind of carved out this niche of, how do you do SEO in a very large organization?
So now, fast-forward: I end up starting a consulting company, seoinhouse.com, around that. Eventually, when AI came out, I started looking at what’s really needed for AI, not just to rank, but again, on the operational side: what needs to happen in a very large organization, and across the organization. That’s when I realized the scope of AI visibility is far greater than that of any other discipline we have. It’s so different. It involves far more teams in the organization, and a lot more cross-functional work will be needed than we ever saw in SEO. And so that is when I started Coxwell & Gain, to specialize in this cross-functional need: how do you improve recommendability, not just being found?
And where did you come up with the name Coxwell & Gain?
Right? It’s actually from the first book I’m writing. So, I’m in the process of writing a book that is not coming out soon, and I don’t want to preempt it, but the idea behind it is: to navigate this huge organization, you need a coxswain. A coxswain is the person who sits at the end of the boat on a crew team; the team rows, and the coxswain calls the pace, “this is how we need to row here,” because he can see the water and call the race plan based on what he’s seeing. And that’s the role we need. It is a completely new function in the organization. It converts operations from a perceived expense to an operational revenue-generating role. And so, at the end of the day, if you “cox” well, you’ll gain. So I came up with Coxwell and Gain.
Okay, cool, all right. And so this book that you’re working on, why is it not coming out anytime soon?
So what was happening is, this is all new. We are maturing, right? And as the book’s content was being fleshed out, it kept maturing. I thought, you know what? I want to give this a go, build it and stand it up with real experience, and then finish out the book. The book is phenomenal for helping you deep-dive into every little aspect of what you need to do, and that was great.
But I felt like, you know what? I have an opportunity to start executing with clients. So let me execute and then bring those findings into the book. So the book isn’t over; it’s just delayed, and even better for it. I think that happens in a brand-new, still-maturing field, so I didn’t want to launch something too soon.
Right. But what would you tell somebody who’s thinking that things are speeding up and it’s hard to keep up? It’s like you sleep for a week and the whole world’s different. That’s kind of what’s being sold to us these days, with the accelerating gains from AI and other exponential technologies. Next thing you know, there’ll be quantum computing and nanotechnology and everything. So is a book even relevant these days?
You know, it’s a great question, and the book I’m writing is not about technology or its evolution; it’s a book for leaders. How do you navigate? What do you need to lead? What does your organization need to stand up? And I think when you look at it through that lens, that’s not going to evolve as much as if I were writing a book about how to do GEO on your website. That is going to evolve. The tools are going to evolve. The opportunities will change a bit, but what you need to do operationally won’t change nearly as much. So I do think a book can still be relevant.
Yeah. So, what would be some of the things a leader in an organization needs to do to be future-proof and be more effective in the AI era?
Yeah, so I would say a lot of it is governance. You already have governance, but not through the lens of being judged for AI recommendability, for whether or not AI is going to recommend your brand. One example: I have a client, and one of the narratives AI could tell buyers is that their customer service is bad, that buyers get passed around, that it’s hard to find the right point of contact, and so on.

And it can also tell the buyer part of why. It explained to me, as a buyer, “Oh, this company is constantly acquiring new brands, and they have seven different screens that customer service has to navigate through,” and it just went on and on. And I was like, “Where are they getting this information?” It turns out they were getting it from two sources.
One was the earnings call, because customer service issues were basically, I don’t want to say confessed to, but admitted to. That was evidence, right? I call it self-induced evidence: evidence they put out against themselves. And the other was a case study between the brand and a vendor they hired to solve this problem. That case study was often also cited as a reason behind the narrative.
Basically, it was saying that the teams have many different systems to work through. Well, nobody had evaluated the earnings call notes and preparation through the lens of “how could AI use this against us, or for us?” Because AI is always building a case for or against a brand. So for this earnings call and what you’re going to cover: is it presented in a way that’s going to help us, or is it presenting things that introduce risk that AI will be trained on? Nobody was looking at that. Same thing with the case study, right?
The brand decided it made sense to partner with this company and that it was okay to do a case study, but nobody looked at it through the lens of how it could impact the narratives AI might tell buyers, because in the pre-AI era, none of those things conflicted. What you needed to be found for in search was never affected by what was said in an earnings call or a case study.
It just wouldn’t even come across to the buyer during the buyer’s journey, and nobody was put in charge of finding those things. So things are changing. Those are just two examples of what a brand needs. I would call it governance, and that’s what the coxswain role would do. They are the brand advocate asking: What risk are we introducing? What liabilities are we creating for ourselves, and how do we minimize them and instead create more opportunities and reasons for our brand to be recommended?
Yeah, well, there’s been this side of SEO for many decades called ORM, or Online Reputation Management. It seems like this is kind of a new evolution of ORM.
Yes, to an extent, yeah. ORM is still mostly external, right? Often responsive to problems. And this is saying, let’s go further upstream so we don’t create the opportunity for a problem in the first place. I think that’s really what it is. In my opinion, when I think about this role, it’s not a marketing role; it’s an operational function, because you have to drive change in the organization with a very different mind and skill set than your traditional marketer. Can you find the unicorn? Of course, right? Like, I was trained in process analysis and thinking about things very differently, so that’s why I can see what’s needed. But I don’t think most marketers have those kinds of skill sets as their greatest core competencies.
Yeah, so you mentioned a cross-functional team to get all this in order. What are all the departments, all the functions, that need to be involved?
I think it’s every function that affects the customer in any way, or anything public-facing, right? So obviously public relations and social media, but even fulfillment and shipping, even manufacturing. If you have problems with manufacturing, suddenly that can become an AI visibility challenge. One brand I audited, ChatGPT would not recommend, because, for example, buyers are likely not going to enjoy the product, given the many complaints about, let’s say, the materials, which will cause them to return it.
But the return process is difficult, and getting a refund credited to your account is even harder, so be prepared to call the bank. It’s a huge logistics problem for this brand. So, what teams need to be involved? Well, refunds, right? That’s finance. Then manufacturing, or potentially design: whoever is approving the different materials. Sometimes it might be the quality of how things are constructed, right?
So it goes even down to quality assurance or manufacturing details, whether it’s training or the level of quality from the manufacturer, which you need to make sure is up to snuff for where you want your brand to be and how you want it presented in AI. I think we’re moving into an era where, as buyers, we’re going to experience higher levels of quality, whether it’s customer service or product quality. Those are the brands that are going to thrive. Nordstrom-level everything is how I think of it. The brands that don’t do these sorts of things are the ones that won’t be as recommended.
So, yeah, and what does an audit look like? What’s your process for doing the audit? What sort of tools are you using, what kind of output format, and what’s the depth of this document or presentation that you give?
Yeah, so there are various ways we do it, depending on the scope. A lot of them aren’t super deep, because otherwise you end up with too many action items, right? So we’re looking at what AI is saying, what we care about, and the action items that come from that. And we’re working with brands to say, “Okay, if you’re launching a new product, what do you need to put in your product plan, your roadmap, your new-launch roadmap?”
If you’re creating a new model, let’s say, or the next season’s product, what are the things you need to consider? So it’s defining what that research process is and what needs to be used to inform the other teams in the organization, so that as they think about the next iteration of the product, or a brand-new product, they know what goes into it.
And a lot of it is not “oh, this is the tool I use.” It’s manual. It’s qualitative, because I need to understand what AI is saying and what the variations of what AI is saying are. So it’s a qualitative analysis to understand the risk relative to the opportunities or strengths: what’s the asset versus the liability in the results, and which liabilities do we want to reduce?
So, like, you’re doing sentiment analysis?
Sentiment is just positive or negative; this is more like, how is the brand positioned? Do you like how your brand is positioned? You might be positioned as more or less the Trusty Rusty. Are you comfortable with that when your competitors are described as more innovative or specifically engineered for a certain use case, right? That’s a very different presentation, and it will be evaluated differently by buyers, which helps me, as the buyer, weed out which brand I want. Do I want the one that is specifically designed and engineered for my use case, or the one that’s the very reliable gold standard, you know?

And each buyer will behave differently. How does the brand feel about that? Are they okay with it? Maybe their buyers are the Trusty Rusty, gold-standard buyers, but that means they may not get the fringe use cases, the buyers who really want something uniquely engineered for their situation. So I think a lot of brands, especially big brands, are going to lose the fringe use cases they’re getting today, because whenever those buyers searched, the big brand was probably on the shortlist. Now they’re not necessarily going to be on the shortlist, or described in a way that makes them interesting enough to the buyer to make the shortlist.
And what sort of implementation do you offer them after you deliver an audit?
At that point, we walk the brand through the audit and the findings. Most of my clients do the work in-house, so they revise their materials, and then we can go back, re-audit, and provide more detailed feedback and direction. Sometimes it’s just providing the recommendations. Sometimes it’s detail like, “Okay, here’s more detail on the creative brief execution; here’s exactly what we suggest you do.”
But a lot of my clients are B2B companies that are maybe selling machinery that costs, you know, $10,000 to $2 million a pop. In those cases, they tend to want to write and present things their own way, which requires a lot of domain knowledge, so they don’t want us to write that content anyway.
And are you involved in the technical side of implementation? Like with SEO, there’s page speed. Is that an issue with AEO? Is that something you get involved with?
Definitely on the SEO side, absolutely. My clients don’t really look at it like, “Oh, we’re doing page speed for GEO.” We’re still doing page speed because it’s the right thing to do for SEO, and Google is most likely still going to be a factor in Gemini, so it would still be there. While I do often do that, right now my clients have me focus on the AI visibility opportunities we need to pursue. So a lot of my scopes now are training up all of these teams, and then the in-house SEO team tackles the GEO piece themselves.
Got it. And so what sort of training are you doing? Are you teaching them how to do better prompting and to use certain specialized tools? You teach them how to write in a more AI-friendly way. Like, what sort of training are you doing?
Yeah, so it’s a variety. There’s training for the SEO or AEO teams. For one of my clients, I’m training them to do the narrative audits, so they can then go in and start auditing their, you know, million-product portfolio in-house, so their teams will be able to do that. I built the entire framework for doing that, and all the spreadsheets that build everything out automatically, so they can focus on analysis. So that’s one layer of what we need to do.
Another thing is evaluating tools: what tools make sense for the SEO team? I don’t have a favorite tool yet, but what are the tools they should look at? So it’s kind of joining their team to evaluate some of those, and then advising SEO managers, heads of SEO and heads of AI, on the things they need to be thinking about. How do they plan for this? How do they talk about it to leadership? It’s coaching or advising them on what’s next and what that looks like, so they can better prepare. And then, how do cross-functional teams impact AI visibility, and what do they need to do about it? What are their action items?
Going back to your earlier history, when you were focusing on in-house SEO: how do you see the effectiveness of outsourcing AEO or GEO to an agency versus doing it in-house? What are the pluses and minuses on both sides?
You mean for SEO or AI visibility?
For AI visibility, I mean, SEO is still a foundational cornerstone of AEO. You cannot have effective AEO, I think, if your SEO is a hot mess. So, yeah.
I agree too. So I would say a lot of it depends on the organization and who you have in-house or who you could bring in. If you have a stellar in-house team, which few companies have but big brands often do, you can run almost exclusively in-house. It’s not my recommendation, though. I think you end up getting narrow views, you end up being highly influenced by the org structure and some of the challenges you face, so sometimes you don’t get the best strategy or strategic plans, or you don’t see things. That’s been my observation.
So I highly recommend bringing someone in for audits at least once a year. That’s a low-level engagement with an agency. Then, other teams have good people in place, maybe people who know the brand and know the teams that need to be involved, but they’re not super senior SEOs. That’s when it makes sense to bring in someone who can advise and help them be better at what they need to be doing.
And then there are also times when you just don’t have a big enough team, so you need extra bandwidth, right? That is definitely when it makes sense to bring in an agency. So I think the bigger the organization and the more complex the teams you need to work with, the more important it becomes to have in-house staff to interface with all of those teams. Agencies, I think, struggle to do that, not necessarily because they’re bad at it, but because they’re not boots on the ground, right? They’re not in the strategic meetings that the SEO team is in.
Yeah. So what sorts of costs should an organization prepare for in this area of AI visibility? What do audits run, the tools they’ll need to subscribe to, the different service providers or vendors or agencies, the in-house person or people and their salaries? What are we talking about here?
You know, my goodness, I’m somewhat guessing with a thumb in the wind. I think the first thing you have to realize is that SEO isn’t going away. It’s not like you’re replacing the SEO budget, moving it into the AEO or GEO budget, and suddenly it’s not as much of a lift. You’re starting a completely new discipline in parallel with your SEO tasks. So just assume the SEO tasks remain in place.
My observation is that these tools are far more expensive than SEO tools. If you’re monitoring prompts across a large portfolio, across multiple regions and languages, you’ve suddenly increased the volume even further, and it’s just more expensive per prompt to track than per keyword. All the tool vendors have told me it’s just very expensive. So I think you need to know that your tools are going to cost two or three times what you’re paying for your SEO tools.
Are we talking like over $1,000 a month typically?
Oh, yeah, my clients are looking at, like, $80,000 to $150,000 a year. My clients are mostly big brands, so they’re looking at very expensive tools. If you have a large portfolio and have to go with a smaller tool, you’re looking at several thousand a month minimum, and that’s for some of the lower-end tools. Even for just my audits, I’m struggling to find something for less than $500 a month, and at that level you’re not getting any insights at all. I don’t think it’s useful for managing your program over time, unless you’re an organization that sells one thing and has a very small set of prompts your buyers type in that you need to monitor. Very few organizations fall in that category.
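To see why per-prompt tracking scales so much faster than per-keyword tracking, here is a back-of-the-envelope sketch. All the counts and the per-variant rate are hypothetical illustrations, not real vendor pricing:

```python
# Back-of-the-envelope sketch of AI-visibility tool costs: every
# prompt is tracked per region and per language, so the tracked
# volume (and the bill) multiplies quickly.
# All numbers below are hypothetical, not vendor pricing.

prompts = 200            # buyer prompts you want to monitor
regions = 3              # e.g. North America, EMEA, APAC
languages = 4            # languages tracked per region
cost_per_variant = 0.50  # assumed monthly cost per tracked variant

variants = prompts * regions * languages
monthly_cost = variants * cost_per_variant

print(f"{variants} tracked prompt variants, about ${monthly_cost:,.0f}/month")
# 2400 tracked prompt variants, about $1,200/month
```

The point of the sketch: a modest 200-prompt list balloons to 2,400 tracked variants once regions and languages are layered in, which is why large portfolios land in the high four or five figures per month.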
Right? And so what are some of your favorite tools for prompt tracking and analysis?
Yeah, I’ve got to say, I do like Waikay. I think it’s an inexpensive tool to get started with, and I think they do have some interesting insights.
By the way, I’ve interviewed Dixon Jones about his tool, Waikay.
It is a good tool, but when I sat down to really use it, I saw a lot of functionality gaps they haven’t filled yet. The pieces are there; they just don’t have the functionality. Like, I uploaded a bunch of prompts, and, last time I checked, I couldn’t delete them, so I had to delete the project and start over. Little quirks like that. I think in this AI era it’s been a rush to market, and a lot of these are what I would call minimum viable products right now. They may not be, but that’s what I see when I look at them. There’s just a lot of functionality I would have expected that I cannot get.
And, yeah, I’ve looked at Profound, I’ve looked at Semrush, and I’ve looked at Cognizant, which looks like an interesting one. But I think Waikay is the only one I’ve actually purchased. One of the interesting things for smaller brands, companies, and consultants is that I found it hard to just buy. A lot of these vendors want to talk about your product and what you really need, and scope it. Three times I was in the middle of a project and thought, let me just go in and try to buy, and some of the tools I really wanted, I couldn’t buy off the rack; I couldn’t just log in and purchase. That was a little frustrating. So you want to make sure you plan for those tools.
Yeah, yeah, cool. And what do you charge for an AI visibility audit?
So it definitely depends on how many products we’re looking at. My preference is not to do just an audit, but to pilot what needs to be put in place. I have an entire framework of what needs to be stood up in an organization to manage, govern, and minimize risk for AI visibility.
So my preference is not just to do an audit, but to build the system so that audits aren’t as important, because once it’s running, you’re constantly auditing the organization. So you’re definitely looking at a six-figure, longer-term project to do that. In fact, most clients, when they see what needs to be done, say, “Oh, I see, this is my long-range planning: year one, year two, year three. This is what we would end up doing.” It will probably take most organizations three years to do it right.
Got it, okay. I can’t imagine what three years from now is going to be like, Skynet and everything, right?
Yeah, but it’s a lot. It’s so much bigger. I think brands haven’t really seen it yet, but many aren’t going to be recommended for things they didn’t even realize affect whether you’re going to be recommended by AI or what AI might say to buyers.
What are your favorite LLMs these days? ChatGPT seems to have lost some favor to Claude in recent months. I’m curious what your favorites are and why.
So I’ve got to say ChatGPT is still my favorite, though I understand that’s controversial, and I think the difference is that I’ve trained it. My ChatGPT instance is highly trained at this point on what I do, what I focus on, and how I write, and to start over somewhere new feels daunting, right? It has a history of understanding: remember this project, remember this client, remember this thing I was trying to do a while back, remember we talked about this. ChatGPT knows. It has that history.
I think a lot of people aren’t putting in the hours to do that, and I don’t blame them; it’s not a fault. I was writing a book, so my ChatGPT instance has been trained, I think, thousands of hours on one thing. And that’s another thing I learned: the moment I asked it personal questions, or one-off questions about some other topic, ChatGPT wasn’t as good. But if I kept laser-focused on my work, ChatGPT’s responses were far superior. So I now never use it for anything outside of my consulting work, and it’s much better.
Do I want to try Claude? Absolutely. I just haven’t really tried it out since it’s had some major enhancements. Gemini is okay. I like to bounce my ChatGPT output off Gemini, but Gemini is sometimes more of, I think of it as, an engineering-mind type of response. So while it’s good, it does have some limitations. I also find it’s not as good at remembering my history.
Sometimes it can remember things from the past, and sometimes it can’t. So that’s my preference. I realize it’s counter to what everyone in the field is saying right now. But again, I think this is evidence of what we’re going to see in the real world: people are going to have their LLMs trained up on them so well that they won’t want to leave, because it will be painful to retrain.
Like me, for my work, I don’t want to retrain; I’ve already got this one trained up. So that’s what I think is going to happen: people will become, not exactly stuck, but they’ll have such a poor experience switching over that they’ll eventually come back, if they keep using one at all, because hyper-personalization is already here, and it’s only going to mature from here.

Yeah, so it goes to the idea of switching costs. You know, back in the day, it was costly to switch from one cellular provider to another. I mean, the pricing and the plans and so forth have changed these days, but back then it was such a hefty penalty you’d have to pay to leave your contract early that you just stuck with it.
And then they sweetened the deal when it came time for renewal: you’d get a free phone, all these perks and discounts and everything. But now the switching cost has to do with the history and the memory. Have you heard of a tool called MemPalace? A pretty famous actress actually released it: Milla Jovovich, who starred in The Fifth Element.
She came up with this, and it’s gone viral. It’s a GitHub repository with code that helps the AI remember you, all your previous prompts and all the important stuff, without losing resolution, right? Normally, the AI will just kind of remember its own summary of what you were telling it, and stuff is lost in translation. But with this memory palace concept, because, actually, do you know what a memory palace is?
No, and this is all new to me.
Okay, so it’s a memorization technique. It’s been around for a while, so let’s say that you’re trying to remember a long number or a series of words. It might be like a seed phrase for cryptocurrency or something, right? So you want to remember it. You never want to forget it. Well, you associate each thing that you’re trying to remember with a room in a palace or a house or a museum or whatever, and you’re going from room to room. You’re sticking these things, you know, maybe it’s an elephant, maybe it’s a blueberry pie or whatever, in these different places in the room, and you’re visually, you know, remembering the layout of the room. Then you go to the next room, and so forth, and then you kind of decompile that when you’re trying to pull out all the things you’re remembering.
So this concept doesn’t just work for remembering stuff without cheating, without a notepad handy; it also works with the AIs. This algorithm, based on the memory palace concept, produces markedly better output. So I suggest checking out MemPalace. Anyway, what are some of the fun, cool, new hacks, tips, and things that you have uncovered in recent weeks that you would want to share with our audience here?
I don’t know if it’s cool or new, but for the first time, I started using AI to develop really robust formulas in Excel. It’s not new; it’s just something I never really had a use for before. But I have a project I’m working on now where I need a lot of data to be easily dropped in, then reformatted, analyzed, and used for heat maps and that sort of thing.
And I’m using ChatGPT to write all the formulas. It’s been phenomenal, unbelievable. What I can do is upload a screenshot and explain it. Sometimes I don’t even upload the screenshot; I just explain what I need, and it spits out an amazing formula. So I would say that was one of my big whoa moments. I knew AI was good, but this was a complex formula with complex functionality, and it built it out for me, and it was an easy copy and paste. That was my latest big hack.
All right, cool. And what are some things that I should have asked you about, but we didn’t get to? I know we’ve got a few minutes left in this interview.
I would say: what do leaders need to know? What would you tell executives about what’s coming and how they need to plan for it?
This is a great question. Why don’t you go ahead and answer it?
Where you’re winning today may not be where you win in the future.
So I would say executives need to look at AI visibility not through the lens of what SEO was, but with fresh eyes, because it’s totally different. Yes, there are similarities, but if you come in with the SEO connotation, you end up losing the real essence of what AI visibility is: AI is really judging your brand. And yes, your SEO team can absolutely make sure that you’re found and that AI knows the facts about your brand, but they will hit a glass ceiling if you do not address the operational problems that are causing AI to learn things about you that may limit recommendations.
So AI is asking: are you a good fit? Do you offer low risk and a high success rate? Depending on the buyer, you may or may not, based on how your brand is judged. You’re going to have millions, even trillions, of different perceptions, and AI will have different narratives it may tell, leading to different verdicts for each buyer scenario. As an organization, you need a framework and a system for governing all of that so you can nip things in the bud before they become a problem.
And I think another thing to realize is that where you’re winning today may not be where you win in the future, and you may not be able to get around it. It’s almost as if a brand needs to choose the few areas where it needs to win, because you’re not going to be everything to all buyers. You’re not going to be the best match for everyone. That’s where the market is going to completely recalibrate, and you need to ask: if it were to recalibrate today, are you comfortable with that?
And if not, what is your plan to change it? You need to understand what needs to change, because some of these things will take 18 to 24 months before you see a change in the narratives from AI, particularly if your brand is, say, a clothing company. You’re designing clothes now that won’t launch for another 12 to 18 months, and then they’re out in the open for people to experience and comment on, and for AI to be retrained on.
You need a framework and a system for governing all of that so you can nip things in the bud before they become a problem.
So your lack of planning or action now is going to stunt your ability to influence AI narratives in the future. And that future is on its way. Once Google flips the switch, Gemini will lead Google.com as the dominant response. It’s already showing up; you still have some organic results there, but I think it will eventually flip to almost all Gemini. And when that comes, are you comfortable with how Gemini is positioning you as a brand?
And most leaders cannot say yes or no to that because they haven’t really looked at or considered it. They don’t understand the operational changes needed, such as reviewing earnings call outlines for risks, opportunities, and potential liabilities, or evaluating materials, product designs, and other areas. Maybe your fulfillment needs some change management and change leadership. Maybe shipping has some issues, right? So you may have to go down to shipping and do some work. You need the ability to take action, but also to recognize what action needs to be taken.
All right, awesome. Where does our listener go to learn more from you? Maybe hire you to get your book whenever it comes out, that sort of stuff.
Yes, you can go to coxwellandgain.com.
Awesome. All right. Well, Jessica, so this was a pleasure.
Stephan, great catching up with you. It’s been a long time, and thank you very much. I hope everyone benefited from the things we discussed today.
All right, and thank you, listener. Now go out there and make it a better world through AEO, SEO, marketing, and whatever else is your specialty. We’ll catch you next episode. I’m your host, Stephan Spencer, signing off.
Your Checklist of Actions to Take
- Audit what AI is currently saying about your brand before you do anything else. AI is always building a case for or against your brand, and most organizations have no system to manage it. You need to understand your AI narrative — what is being told to buyers, what risks have already been introduced, and what opportunities you are missing.
- Review every piece of public-facing content through the lens of AI risk and opportunity. Earnings calls, case studies, and vendor partnerships can all become sources of negative AI narratives that you unknowingly wrote yourself. Ask whether each piece of content introduces liabilities or creates reasons for AI to recommend your brand.
- Treat AI visibility as an operational function, not a marketing one. It is not enough to optimize your website — AI visibility involves fulfillment, manufacturing, customer service, finance, and every team that touches the customer experience. The fix often lives far outside the marketing department entirely.
- Build a cross-functional team and governance framework to manage AI visibility across your organization. Every function that affects the customer in any way or anything public-facing needs to be involved. Without this structure, you will keep introducing risk without realizing it.
- Start planning and taking action now, because changes to AI narratives take 18 to 24 months to materialize. Your lack of planning today will stunt your ability to influence what AI says about you in the future. If you are designing products now that launch in 12 to 18 months, the decisions you make today are already shaping your future AI narrative.
- Evaluate how AI is positioning your brand relative to your competitors. Understand whether you are described as the reliable gold standard, the innovative specialist, or something else entirely — and decide whether you are comfortable with that. Some buyers you are winning today may not choose you once AI recalibrates the shortlist.
- Budget appropriately for AI visibility tools, because they cost significantly more than SEO tools. For large organizations, expect to pay two to three times what you currently spend on SEO tools, with enterprise-level platforms running between $80,000 and $150,000 per year. Plan ahead, as many tools require scoping conversations before you can even purchase access.
- Train every relevant team in your organization on how their work impacts AI visibility. The in-house SEO or AEO team cannot carry this alone — they need support from product, quality assurance, customer service, and leadership. Building that shared understanding across functions is one of the most valuable things you can do right now.
- Decide where your brand needs to win and focus your AI visibility strategy there. You are not going to be the best match for every buyer, and trying to be everything to everyone will dilute your AI narrative. Choose the use cases and buyer scenarios where you want to be strongly recommended, and build your operational strategy around those.
- Visit coxwellandgain.com to learn more about Jessica Bowman’s work and explore how her consultancy can help your organization build the governance framework and systems needed to manage AI visibility at scale. Connect with her on LinkedIn and X @jessicabowman to stay current as this field continues to evolve rapidly.
About the Host
STEPHAN SPENCER
Since coming into his own power and having a life-changing spiritual awakening, Stephan is on a mission. He is devoted to curiosity, reason, wonder, and most importantly, a connection with God and the unseen world. He has one agenda: revealing light in everything he does. A self-proclaimed geek who went on to pioneer the world of SEO and make a name for himself in the top echelons of marketing circles, Stephan’s journey has taken him from one of career ambition to soul searching and spiritual awakening.
Stephan has created and sold businesses, gone on spiritual quests, and explored the world with Tony Robbins as a part of Tony’s “Platinum Partnership.” He went through a radical personal transformation – from an introverted outlier to a leader in business and personal development.
About the Guest
Jessica Bowman
Jessica Bowman is an enterprise strategist and consultant to Fortune 500 brands, pioneering the operational side of AI visibility. She helps organizations understand how decisions in product design, fulfillment, customer experience, and more influence how AI platforms perceive and recommend their brand. She’s the author of The Executive’s AI Visibility Manifesto.