You tested it yourself. You typed something like "best family law attorney in [your city]" into ChatGPT, and it recommended three firms. Yours was not one of them.
One of the firms it did recommend — you know them. They are not obviously better. They might have fewer Google reviews. Their website might be older than yours. But ChatGPT named them and not you.
Here is why that happens, what it means, and exactly what to fix first.
The 5 reasons your competitor is getting the recommendation
ChatGPT, Perplexity, and similar AI tools are not running a ranking auction. They are doing something closer to a trust check: given everything they can find about your business and your competitor's business, which one can they describe with confidence? The one they can describe with more specificity and consistency is the one they recommend.
These are the five gaps that explain almost every case where a competitor shows up and you do not.
1. Their business information is more consistent than yours
When an AI tool looks up your business, it cross-references multiple sources — your website, Google Business Profile, Yelp, directories, LinkedIn, maybe a local press mention. If those sources give different versions of your name, address, or phone number, the AI loses confidence. Not dramatically — it does not refuse to mention you entirely — but it weights you lower in any recommendation that involves trust.
Inconsistency is more common than you think. An office move that was updated on the website but never on Yelp. A business name that uses "and" in one place and "&" in another. A phone number that changed two years ago and is still listed in three directories.
Your competitor may have cleaner data not because they tried harder, but because they never moved, never rebranded, and their original directory listings have stayed accurate by default.
Google is explicit about this: the local ranking algorithm factors in relevance, distance, and prominence — and businesses with complete, accurate information are more likely to be matched to relevant searches. Google's own documentation notes that business information is sourced from the website, third-party licensed data, and publicly available sources — meaning every contradictory entry across those sources is a signal Google and AI tools have to reconcile. BrightLocal and Whitespark both document the downstream effect: inconsistent NAP data creates confusion and can reduce trust, which translates to lower visibility in local results.
The fix: Audit your NAP (name, address, phone number, website URL) across: your website footer, Google Business Profile, Yelp, LinkedIn, and any industry-specific directories. Inconsistent business details can reduce trust and make it harder for search systems to confidently surface your business. Standardize everything to match your website exactly — same spacing, punctuation, and abbreviations.
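The audit itself is mechanical enough to script. Below is a minimal sketch using hypothetical listing data — the normalization rules (expand "&" to "and", compare phone numbers digits-only) are the same ones you would apply by hand when comparing listings side by side:

```python
import re

def normalize_name(name: str) -> str:
    """Lowercase, expand '&' to 'and', drop punctuation, collapse whitespace."""
    name = name.lower().replace("&", "and")
    name = re.sub(r"[^\w\s]", "", name)
    return re.sub(r"\s+", " ", name).strip()

def normalize_phone(phone: str) -> str:
    """Keep digits only; compare the last 10 digits (drops a US country code)."""
    return re.sub(r"\D", "", phone)[-10:]

# Hypothetical listings pulled from three sources
listings = {
    "website": {"name": "Smith & Lee Family Law",      "phone": "(555) 014-2200"},
    "google":  {"name": "Smith and Lee Family Law",    "phone": "555-014-2200"},
    "yelp":    {"name": "Smith & Lee Family Law, LLC", "phone": "+1 555 014 2299"},
}

names  = {src: normalize_name(d["name"])   for src, d in listings.items()}
phones = {src: normalize_phone(d["phone"]) for src, d in listings.items()}

for field, values in (("name", names), ("phone", phones)):
    if len(set(values.values())) > 1:
        print(f"MISMATCH in {field}: {values}")
```

In this sample data, "&" versus "and" is correctly treated as a match, while the ", LLC" suffix and the differing last digits of the Yelp phone number are flagged as real mismatches — exactly the kind of drift the audit is meant to surface.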
2. Their service pages are more specific than yours
This is the most common gap, and the one that is hardest to close quickly — but it pays the most.
AI tools surface business recommendations by matching a user's query against what they understand about each business. If someone asks ChatGPT for "an estate planning attorney who handles trusts in [city]," it needs to find a page that specifically addresses estate planning, trusts, and the city. A generic "Practice Areas" page that lists six things in two sentences each does not give the AI enough to match confidently.
Your competitor might have a dedicated page for estate planning — written in plain language, naming the client problems they solve, mentioning the city, including a few specific questions that clients ask. That page gives the AI more to work with: when someone asks for that exact service in that exact area, there are more relevant signals to match, and the confidence of the match goes up.
Google's guidance on relevance is direct: how well a page's content matches the query terms determines how confidently it is surfaced, and clear titles and headings help Google understand what a page is about. Google's local ranking documentation adds that detailed business information helps match a business to relevant searches. Search Engine Land and BrightLocal have both documented the pattern in local results: localized, service-specific pages have increasingly replaced generic homepages as the primary match point for local queries. The same logic holds whether the surface is a SERP or an AI recommendation — the mechanism is a relevance match, not a popularity vote.
This is not about blog posts or content volume. One specific, well-structured service page consistently outperforms a dozen thin ones.
The fix: For each core service you offer, build (or rewrite) a dedicated page that includes: the service named clearly in the H1, the client type you serve, your city or service area, what the engagement looks like, and at least three client-centric questions answered in plain language on the page.
Two structural habits make a significant difference here. First, lead with the answer — put the most useful information at the top of the page, not at the bottom of a long preamble. AI tools pulling content from a page tend to cite the first substantive sentences, so a service page that opens with "We handle estate planning for families in [city], including living trusts, powers of attorney, and healthcare directives" will outperform one that opens with a paragraph about the firm's history. Second, write in explicit question-and-answer format on each service page — not hidden in an FAQ accordion, but as visible on-page content. A question like "What is included in a basic estate plan?" followed by a specific three-sentence answer gives AI tools a citation-ready passage. Your competitor's page may have this; your current page probably does not.
3. They have more credible third-party citations
Your website alone is not a trust signal for an AI system. It is a first-party claim. Anyone can put anything on their website. What AI tools look for is corroboration — external sources that confirm your business exists, is located where it says, does what it claims, and has served real clients.
Third-party citations include: directory listings (Avvo for lawyers, Healthgrades for medical, Clutch for consultants), review platforms, professional association membership pages, local press mentions, and any publication that names your business and links to it or describes what you do.
Your competitor may have more of these simply because they have been around longer, were more diligent about claiming directory profiles in the early days, or happened to get a local press mention a few years ago.
Google's local ranking documentation is direct on this: prominence is influenced by signals like how many websites link to a business, how many reviews it has, and how well it is referenced across the web. A separate section of Google's business information documentation confirms that brand profiles are built from third-party licensed and publicly available sources alongside the business's own website — which means your website alone is not a sufficient corroboration signal. Whitespark's citation research adds the practical version: citations help search engines verify that a business exists, and when multiple credible sources carry the same accurate business information, that corroboration helps establish legitimacy and improves local visibility.
The fix: Search "[your business name] site:yelp.com" and "[your competitor's name] site:yelp.com" — count the claimed profiles. Do the same for your industry's key directories. Claim and complete any profiles that your competitor has and you do not. Priority goes to: Google Business Profile, Yelp, LinkedIn, any industry association directory, and one or two general directories like the Better Business Bureau.
4. Their structured data is in place and yours is not
Structured data — specifically schema markup — is the machine-readable layer of a website. It tells AI tools directly what your business is, what it does, who it serves, where it is located, and how to contact you. Without it, AI tools have to infer that information from the readable text of your pages. Inference is less reliable and confidence is lower.
Many service business websites have no schema at all, or have basic schema that came from a plugin setup but was never configured to include the specifics that matter — service types, service area, FAQ content, individual practitioners.
Your competitor may have had schema set up by a developer who actually configured it correctly, while yours came from a plugin that was installed and then left at default settings that output only a partial picture.
Google explicitly says LocalBusiness structured data helps you tell Google about business details — hours, departments, reviews, and service areas — in machine-readable form. Google also recommends FAQ structured data for service pages and validates it through the Rich Results Test. For AI features specifically, Google notes there is no special AI schema required, but that structured data should match the visible text on the page. What this means in practice: without structured data, you leave more interpretation to search systems, and you miss the richer machine-readable signals and search features that help those systems describe your business with specificity and confidence.
The fix: Paste your homepage URL into schema.org's validator and do the same for your competitor's homepage. Look at what each one outputs. A LocalBusiness block with your services, service area, and contact details is the minimum. FAQPage schema on your service pages and Person schema for named practitioners make the picture significantly clearer for AI tools.
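What a correctly configured minimum looks like, as a sketch: the JSON-LD below uses the schema.org LegalService type (a LocalBusiness subtype) and goes inside a `<script type="application/ld+json">` tag in the page's HTML. Every value shown is a placeholder — swap in your real business details, and make sure they match your visible pages and your NAP exactly.

```json
{
  "@context": "https://schema.org",
  "@type": "LegalService",
  "name": "Smith & Lee Family Law",
  "url": "https://www.example.com",
  "telephone": "+1-555-014-2200",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "100 Main St, Suite 200",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "areaServed": "Springfield, IL",
  "hasOfferCatalog": {
    "@type": "OfferCatalog",
    "name": "Services",
    "itemListElement": [
      { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Estate planning" } },
      { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Divorce and custody" } }
    ]
  }
}
```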
The FAQPage schema point deserves emphasis: it is the bridge between on-page Q&A content and machine-readable structured data. If you have written question-and-answer sections on your service pages (as recommended in Signal 2), wrapping that content in FAQPage schema means AI tools can pull those answers directly when someone asks a question that matches. A page with visible Q&A content plus matching FAQPage schema is significantly stronger than either alone.
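A minimal sketch of that bridge — the question and answer below are placeholders, and whatever text you use must match the Q&A content visible on the page word for word, per Google's guidance that structured data mirror the visible text:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is included in a basic estate plan?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A basic estate plan typically includes a will, a durable power of attorney, and a healthcare directive. We prepare all three as part of every engagement."
      }
    }
  ]
}
```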
5. AI crawlers can reach their site and may not be able to reach yours
This is rarer but more absolute when it occurs. If GPTBot, ClaudeBot, or PerplexityBot are blocked by your robots.txt file — intentionally or as collateral damage from a security plugin or a past maintenance update that was never reversed — AI tools cannot read your site. At all. Your competitor's site, if unblocked, will be visible to AI crawlers while yours is invisible.
This is the best-documented of the five gaps. Google lists ensuring crawling is allowed in robots.txt and by hosting or CDN infrastructure as one of the worthwhile SEO practices for AI features. OpenAI confirms the mechanism directly: webmasters can allow OAI-SearchBot to appear in search results while separately disallowing GPTBot — which means robots.txt controls for AI crawlers directly determine what surfaces in OpenAI's search products. Perplexity is the most unambiguous: PerplexityBot will not index the full or partial text content of any site that disallows it via robots.txt. This is not a confidence penalty — it is an outright exclusion.
The fix: Open your website's robots.txt file (just go to yourdomain.com/robots.txt in a browser). Look for any `Disallow: /` rules, or any rules that block specific AI bots by name. If you see GPTBot, ClaudeBot, or PerplexityBot disallowed, that needs to be removed. If you cannot read the file yourself, this is a five-minute job for a developer or a competent website manager.
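If you would rather check programmatically than by eye, Python's standard library can parse a robots.txt and answer the question per bot. A minimal sketch — the robots.txt content below is a hypothetical example; in practice you would fetch yourdomain.com/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Documented user-agent tokens for the major AI crawlers
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "OAI-SearchBot"]

def blocked_bots(robots_txt: str, url: str = "https://www.example.com/") -> list[str]:
    """Return the AI crawlers that this robots.txt blocks from the given URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, url)]

# Hypothetical robots.txt: GPTBot is disallowed site-wide, everyone else
# only loses access to /wp-admin/
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /wp-admin/
"""

print(blocked_bots(sample))  # → ['GPTBot']
```

A result like `['GPTBot']` means OpenAI's crawler cannot read your site at all, regardless of how good your pages and schema are — which is why this check comes first in the task list below.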
Why this is not about budget or content volume
The instinct when a competitor outperforms you in any channel is to assume they are spending more. In AI search, that assumption is particularly wrong.
ChatGPT does not have a sponsored placement. There is no paid tier for local recommendations. A solo practitioner with a $500 website can outperform a funded firm with a professional agency behind them if the solo practitioner has consistent directory listings, specific service pages, and schema in place — and the larger firm does not.
Content volume is similarly irrelevant in isolation. Publishing fifty blog posts about legal topics will not move your AI visibility if your service pages are generic, your Yelp listing has the wrong address, and your robots.txt is blocking AI crawlers. The AI tools are not rewarding content production. They are rewarding specificity, consistency, and accessibility.
This is mostly good news: the work that closes the gap is finite, mostly a one-time cleanup, and cheaper than ongoing content marketing or paid search.
How to close the gap: where to start this week
Run through this in order. Each item takes between a few minutes and a few hours. The sequence matters — do not start on service page rewrites before you have confirmed AI bots can reach your site and your NAP is clean.
| Priority | Task | Time estimate |
|---|---|---|
| 1 | Check robots.txt for blocked AI crawlers | 5 minutes |
| 2 | Run NAP audit across 5 core platforms | 1–2 hours |
| 3 | Fix any NAP mismatches found | 1–3 hours depending on count |
| 4 | Validate schema on your homepage and top service page | 20 minutes |
| 5 | Add or fix LocalBusiness schema with services and service area | 1–4 hours (or your developer) |
| 6 | Rewrite your highest-value service page: lead with the answer, add Q&A sections, wrap in FAQPage schema | 3–5 hours |
| 7 | Claim and complete any directory profiles your competitor has that you do not | 2–4 hours |
Do not try to do everything at once. Completing rows 1–3 this week will already improve how confidently AI tools can describe your business. Rows 4–6 can follow, and by week three you will have done the majority of the structural work that separates businesses that show up in AI answers from those that do not.
What a competitor comparison looks like in practice
Open ChatGPT (or Perplexity) and run two prompts back to back:
- "Describe what [your competitor's business name] does and who they help."
- "Describe what [your business name] does and who they help."
Read both responses carefully. Look at:
- Does it know your city and service area?
- Does it name specific services or use vague categories?
- Does it mention individual practitioners?
- Does the description sound like someone who has read your actual service pages, or like a summary of a generic category?
If your competitor's description is more specific than yours, that specificity gap is your diagnosis. The service pages, schema, and citation work above is how you close it.
If the AI cannot describe your business at all — returns something vague, uncertain, or simply says it does not have information — your starting point is bot access and NAP consistency before anything else.
Where to go from here
The gap between you and your competitor in AI search is almost never permanent. Unlike Google rankings, which a well-established competitor can hold for years through accumulated authority, AI recommendations are reassessed constantly based on what the tools can crawl today. A business that cleans up its signals this month starts building AI confidence within weeks.
If you want to know exactly where your gaps are before you start, the LovedByAI GEO audit scans for the specific signals that matter — bot access, schema completeness, NAP consistency, and service page depth — and returns a prioritized list of what to fix first. It takes about two minutes and tells you which of the five gaps above are most likely explaining your competitor's current advantage.

