Stop Letting Tools Do the Thinking: Why Real Keyword (and Prompt) Strategy Starts with Questions, Not Lists

A Mini Rant

Most keyword research guides — and now most prompt-building “frameworks” — make the same mistake: they start with tools.

They tell you to plug something into SEMrush, Ahrefs, or ChatGPT and get a list of keywords or prompts. That feels efficient. But those lists reflect what everyone else is doing — not what you do, what you sell, or how your customers actually think.

A few years ago, I reviewed an entire year’s worth of blog posts on keyword research. Not a single one began with the obvious: make a list of what you actually do and what problems you solve.

Maybe I’m “ancient school.” When I started, there were no tools. We had to write down what the company offered, or — shockingly — ask them. It wasn’t glamorous, but it was the only way to build a strategy that matched reality.

And that’s exactly what we need to bring back today — especially as search shifts from keywords to AI prompts.

Step 1. What Do You Do? (The Missing Foundation)

Why it matters

  • Without this grounding, every keyword list or AI prompt is noise.
  • It forces alignment on the language of your products, services, and solutions.
  • It becomes the lens for the rest of the exercise.

Workshop setup

  • Ask the team: “What products/services do you offer, and what problems do they solve?”
  • Push for plain language, not just product names.
  • Capture target audiences (patients, CFOs, small business owners).

Example

  • Product: “Accounts Payable Automation Software.”
  • Problems solved: “Invoices processed too slowly,” “Errors from manual entry,” “Lack of visibility into cash flow.”
  • Audience: “CFOs at mid-sized manufacturing companies.”

Sample table

Products / Services | Problems Solved | Target Audience
AP Automation Software | Slow invoice processing, manual errors | CFOs at mid-sized manufacturing companies
Toenail Fungus Cream | Embarrassment, discomfort, recurring infection | Adults 40+, caregivers for elderly
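
If your team works in code or spreadsheets, the workshop output above can be captured as structured rows so every later step reuses the same language. This is a hypothetical sketch, not part of the article’s method; all names and the query templates are illustrative:

```python
# Capture the Step 1 workshop output as structured rows (illustrative data).
rows = [
    {
        "offering": "AP Automation Software",
        "problems": ["slow invoice processing", "manual errors"],
        "audience": "CFOs at mid-sized manufacturing companies",
    },
    {
        "offering": "Toenail Fungus Cream",
        "problems": ["embarrassment", "recurring infection"],
        "audience": "adults 40+ and caregivers for the elderly",
    },
]

def seed_queries(row):
    """Turn one row into problem-driven and solution-driven seed queries."""
    # Problem-driven: the symptom language, before a solution is known.
    problem_driven = [f"how to fix {p}" for p in row["problems"]]
    # Solution-driven: category language plus the audience from the table.
    solution_driven = [f"best {row['offering'].lower()} for {row['audience']}"]
    return problem_driven + solution_driven

for row in rows:
    for query in seed_queries(row):
        print(query)
```

The point is not the code itself but that the seeds come from your own table, not from a tool’s export.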

Step 2. How Do You Want to Be Found? (Aspirational)

Why it matters

  • Surfaces the vanity terms companies want.
  • Often reveals arrogance (“we want to rank for cloud computing”) vs. realism.
  • Useful starting point for expectations vs. actual opportunities.

Examples

  • Traditional (keywords): “cloud computing,” “best vodka.”
  • AI (prompts): “What are the leading companies in cloud computing for enterprises?”

Facilitator note

  • Ask: “If someone were searching Google right now, what words do you want them to type that lead to you?”

Step 3. How Might Searchers Look for You? (Customer Empathy)

Why it matters

  • Forces teams to step out of their brand bubble.
  • Captures early “how do I…” or “what’s the best…” phrasing.
  • Expands beyond product names to customer problems and needs.

Examples

  • Traditional: “migraine remedies,” “project management software.”
  • AI: “What are the best over-the-counter remedies for chronic migraines?”

Facilitator note

  • Frame it generically: “What would someone type if they had no idea who you are, only the problem they’re trying to solve?”

Step 4. If Unaware of Solutions, How Would They Search? (Problem-Driven, Solution Unaware)

Why it matters

  • These are the earliest signals of demand.
  • Reveals raw symptom language before someone knows solutions exist.
  • Gold for awareness-stage content.

Examples

  • Traditional: “yellow toenails,” “manual invoice process too slow.”
  • AI: “Why are my toenails yellow and brittle?” | “How can I speed up invoice processing without hiring more staff?”

Facilitator note

  • Ask: “If someone doesn’t know a solution exists, what words would they use to describe the problem?”

Sample table

Problem-Driven Search | Solution-Driven Search
“yellow toenails won’t go away” | “fungus treatment for toenails”
“manual invoice process too slow” | “AP automation software”

Step 5. If Aware of Solutions But Not Brands, How Would They Search? (Solution-Driven, Category Focused)

Why it matters

  • Captures category-level demand.
  • Informs comparison guides, buyer’s guides, and category descriptions.

Examples

  • Traditional: “toenail fungus treatment,” “AP automation software.”
  • AI: “What’s the best accounts payable automation software for mid-sized companies?”

Step 6. If Aware of Your Brand, How Would They Search? (Brand-Driven)

Why it matters

  • Captures navigational searches and competitive-intent queries.
  • Shows where brand strength exists (and where gaps are).

Examples

  • Traditional: “Nike running shoes,” “Acme CRM demo.”
  • AI: “What are the pros and cons of Acme CRM compared to Salesforce?”

Step 7. What’s the Difference if They’ve Met a Professional vs. Not? (Educated vs. Uneducated)

Why it matters

  • A consultation, demo, or competitor pitch changes search language dramatically.
  • Shows the knowledge gap between uninformed and informed users.

Examples

  • Traditional: “stop migraine instantly,” “alternative to Salesforce.”
  • AI: “My doctor prescribed a preventive migraine medication — what side effects should I expect?”

Step 8. Would Caregivers or Proxy Decision-Makers Search Differently? (Role Variation)

Why it matters

  • In many industries, the searcher is not the end user.
  • Caregivers, executives, or managers phrase questions differently.

Examples

  • Traditional: “panic attack treatment,” “HR software.”
  • AI:
      • Doer: “How do I automate payroll tasks in my HR software?”
      • Caregiver: “How can I help my teen manage panic attacks safely?”
      • Exec: “What’s the ROI of HR automation for a 500-person company?”

Step 9. What Might Consumers Expect? (Expectation Alignment)

Why it matters

  • Surfaces unrealistic expectations (“cure,” “unlimited”).
  • Helps craft messaging that addresses hopes while grounding in reality.

Examples

  • Traditional: “cure for migraines,” “unlimited cloud storage.”
  • AI: “Is there actually a cure for migraines, or do treatments only manage symptoms?”

The Core Nuance: Problem-Driven vs. Solution-Driven

  • Problem-driven (solution unaware): “I don’t know what’s possible. I’m describing my pain.”
  • Solution-driven (solution aware): “I know there’s an answer. I’m framing my query with that in mind.”

👉 Problem-driven searchers want validation, education, and direction.
👉 Solution-driven searchers want options, comparisons, and proof.

From Keywords to Prompts: The Four-Block Structure

To make prompts systematic, use the Four-Block Model:

  1. Role & Context – “I am a CFO at a mid-sized manufacturing company…”
  2. Objective / Need – “…looking for accounts payable automation software to reduce invoice processing time…”
  3. Constraints / Preferences – “…with a focus on cost savings and integrations with QuickBooks…”
  4. Output Format – “…provide a comparison table of the top 5 options with pricing and features.”
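
The four blocks above can be sketched as a simple template assembler. This is a minimal illustration, not a real framework; the function name and field names are my own:

```python
def build_prompt(role_context, objective, constraints, output_format):
    """Join the Four-Block Model's blocks into one structured prompt."""
    blocks = [role_context, objective, constraints, output_format]
    # Drop empty blocks and join the rest into a single sentence flow.
    return " ".join(b.strip() for b in blocks if b.strip())

prompt = build_prompt(
    role_context="I am a CFO at a mid-sized manufacturing company",
    objective=("looking for accounts payable automation software "
               "to reduce invoice processing time,"),
    constraints=("with a focus on cost savings and integrations "
                 "with QuickBooks."),
    output_format=("Provide a comparison table of the top 5 options "
                   "with pricing and features."),
)
print(prompt)
```

Treating each block as a named field keeps teams honest: a prompt missing its Role & Context or Output Format is visibly incomplete.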

Side-by-Side Example

Traditional Keyword | Raw Prompt | Structured Prompt
“AP automation software” | “What’s the best AP automation software for mid-sized companies?” | “I am a CFO at a mid-sized manufacturing company looking for AP automation software to reduce invoice processing time, with a focus on cost savings and QuickBooks integration. Provide a comparison table of the top 5 options with pricing and features.”

Final Thought

Tools are helpful; they scale what you already know. But if you skip Step 1 (“What do you do?”), everything downstream is misaligned.

Old school or “ancient school,” the basics haven’t changed:

  • Start with what you do.
  • Start with the problems you solve.
  • Start with how people describe those problems.

Whether in Google or AI search, clarity beats lists every time.