By Ardath Albee

GenAI offers opportunities for marketers spanning B2B content creation, workflows, and analysis. One of the biggest AI priorities for B2B marketers is efficient content creation: 50% of marketers use it to create content, and 51% use it to optimize content. Yet only 31% have concerns about AI's accuracy and quality.
While GenAI’s capacity to analyze data at record speeds, spot patterns in seconds, and logically put one word after another is impressive, there must be a deep foundation to enable relevant content creation that resonates with your audience.
The foundation needed to enable LLMs includes data.
Data, to an LLM, includes content. That's obvious, you may be thinking. But are you sure you have the right content to enable the AI you employ to create on-brand, compelling content? Are you sure you've set the right context for the AI to work with?
We all want to jump straight to the end game. But, without setting the foundation, you’re putting the AI cart before the B2B content horse.
Do You Want More of What You Have Now?
Is the content you have now working?
Does it resonate with your ICP (ideal customer profile)?
Is it generating inquiries from in-market buyers?
Do the “leads” you send sales reps engage with them in meaningful ways?
I read social media posts and articles insisting you must upload your content, brand guidelines, customer data, and more to train an LLM or GPT to write brand-relevant content.
You then instruct the AI to use what you've given it, along with your prompt, to generate content that is true to your brand. Yet what I see is generic content that relies on the logo to tell its audience who created it.
Generic content lacks flair, creativity, or brand “personality.” Worse, it doesn’t share new ideas or insights that arouse curiosity or invite engagement.
Before you throw all your content efforts into using AI and expect it to magically produce game-changing content, it’s important to assess what you will use to train AI.
If the content you have today isn’t the content you want tomorrow, you’ll need to put a bit of elbow grease into creating a deep foundation for AI to work from. If you don’t have a clear understanding of your ICP and personas, you need to do that work.
You also need a content marketing strategy that defines audiences, flow, momentum, jobs to be done, questions to answer, triggers, and tipping points. And that content strategy must roll up to your brand positioning and business objectives.
Without that level of guidance, how will you use AI to create a cohesive content ecosystem that builds memory structures and reels in buyers when they come in-market? More isn’t always better.
Using AI to Create the Foundation Is Not the Answer
I’ve watched webinars where the speaker shows how to use AI to do the foundational work for you. One example is creating buyer personas.
But how do you know what it tells you is truly reflective of your actual ICP? For example, not all “Directors of IT” are the same. Source matters… a lot. Whether they’ve purchased a solution like yours matters even more. Otherwise, it’s just guessing.
Whether it’s the type of solution they’re buying, the industry they’re in, or the way the problem situation presents within their companies, there are always key nuances of difference.
One of the most valuable things derived from buyer and customer interviews is the language. How buyers verbalize the problem, the situation, the reason for solving it, and the work their teams do varies across personas. Greatly.
If you want to be relevant, you need to get in the weeds with how buyers see and describe things. And how they explain their motivations for solving a specific problem your solution solves.
AI doesn’t do this well. Say, for example, you ask it to summarize a transcript and pick out the important parts in relation to X. What you’ll get back is a formalized version of what the person said. AI will often replace the speaker’s words with words that fit its pattern-matching training. It may also overlook something important in the way the speaker said it, because the AI didn’t register those words as meaningful during its analysis. AI doesn’t have opinions or emotions. At least not yet.
Words, and how they’re used, make a difference. The difference between your buyer rolling his eyes or paying rapt attention. Effective content speaks to its intended audience about something they care about or are curious about. In doing so, it helps to build those memory structures that create recall when the time is right.
If you use AI to create personas and then ask it to evaluate the relevance of your existing content for that persona, how high would you say the probability is that it does so effectively for a real-world application?
Doing the foundational work still requires human effort.
Context Is Critical, But It’s Not Foolproof
Even once you’ve set a solid context for the LLM to work from, you still have work to do.
A colleague recently shared a document with me and asked for my thoughts. It was a business-enablement and value-justification document for a customer to use in explaining the initiative and its progress at a project kickoff. The source material given to the LLM was a transcript of the customer discussing the project and answering key questions about the strategy for the initiative.
But something was off with the content.
It was subtle. Yet it felt like the content was holding me at arm’s length. The words used were not the words the customer used. (I had read the transcript several days prior.) The content wasn’t personal or personable.
My biggest question was whether his peers, executive team, and the board would believe in and invest in the case he was putting forth. The content didn’t “feel” like there was any of him in it.
The writing was fine. It was corporate. It addressed all the things. But it was off. And it was obvious.
When I gave my honest feedback, my colleague admitted that AI had done the writing. The prompt was solid. The transcript contained all the data needed to create the piece. But the content the AI created lacked resonance.
The AI didn’t have a stake in the outcome. Nor is it capable of having one.
We may want to skip straight to putting AI to work. But to do so well means doing the work to set the right context. And even then, you need a human in the loop to catch what AI misses. What looks fine on the surface may be far enough off to do more harm than good.
I use GenAI a lot. But I use it to play to its strengths. And I use my humanity to play to mine.
Originally posted on CustomerThink.com