
Google Just Killed num=100 — Here’s What It Means for SEO, AI Search, and Reddit


By Maria Dykstra | Last Updated: October 7, 2025

Maria Dykstra is a GTM strategist and founder specializing in AI-powered visibility systems for B2B tech founders and service providers. After a decade-long career with Microsoft Advertising, she has spent the past 13 years building growth frameworks for visionary founders navigating algorithmic chaos.

TL;DR

Google disabled the num=100 parameter in September 2025. Rankings didn’t change—your measurement tools broke. Most rank trackers now see only the top 10–20 results instead of 100. Impressions dropped 40–70% in Google Search Console because bot traffic vanished. Clicks stayed steady. Long-tail keywords still work, but tracking them costs more. Your next move: Build entity-rich content for page one, optimize for AI citations, and measure what matters—conversions, not vanity impressions.

What Google num=100 Change Did (And Why You’re Seeing Different Numbers Now)

For years, SEO tools used &num=100 to pull Google’s top 100 results in a single request. One API call returned everything from position 1 to position 100.

In September 2025, Google disabled it.

Now, most tools default to the top 10–20 results. Fetching deeper rankings requires multiple requests—more cost, more time, more friction.

What this means: Your rank tracker didn’t lose accuracy. It lost visibility into positions 30–100. If you ranked #47 for “best CRM for dentists,” you won’t see it unless your tool pays for deeper crawls.

What didn’t change: Your actual rankings for real users. Google still shows 100+ results per query. The reporting layer broke, not the algorithm.

This is a measurement shock, not a core ranking update.


The Big Effects You’ll See First

1. Impressions Drop 40–70% in Google Search Console

Between September and October 2025, most sites saw impressions crater. The cause: bot traffic disappeared from GSC reporting.

Before num=100’s removal, scrapers, AI crawlers, and rank trackers generated “impressions” at positions 50, 70, 90. Those never represented real users. Google cleaned its data supply.

Your CFO will panic. Tell them this: “The drop is bots gone, not buyers gone.”

2. Average Position “Improves”

When deep-position impressions vanish, your average position rises. You didn’t suddenly rank better—you’re just missing the tail end of your distribution.

If you had 1,000 impressions at position 12 and 500 at position 78, losing the latter pushes your average from 34 to 12. Math, not magic.
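You can verify the arithmetic yourself with a few lines of Python (a standalone sketch; the impression counts mirror the example above and are illustrative, not pulled from GSC):

```python
def avg_position(buckets):
    """Impression-weighted average position.

    buckets: list of (position, impressions) pairs, as GSC reports them.
    """
    total = sum(impressions for _, impressions in buckets)
    weighted = sum(pos * impressions for pos, impressions in buckets)
    return weighted / total

# Before: deep-position bot impressions still counted toward the average.
before = avg_position([(12, 1000), (78, 500)])  # 34.0

# After num=100's removal: the position-78 bucket simply disappears.
after = avg_position([(12, 1000)])              # 12.0

print(before, after)
```

Same rankings, different denominator. That's the whole "improvement."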

3. Clicks Stay Steady (Or Dip Far Less)

Real users rarely scroll past page two. According to Backlinko’s 2024 CTR study, positions 21–30 capture less than 0.5% of clicks. Positions 31–100? Effectively zero.

Your clicks won’t mirror your impression drop because page-six traffic was never clicking.

4. Historical Trend Lines Break

If you’re comparing October 2025 to August 2025, you’re comparing apples to oranges. Create new baselines after September. Annotate dashboards clearly.

5. Costs Increase for Deep Rank Tracking

Tools now fire 5–10 requests to reach position 100. Expect price hikes from SEMrush, Ahrefs, and SE Ranking.
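The cost multiplier follows directly from pagination (a quick sketch; the page sizes are assumed defaults, and actual tool pricing varies by vendor):

```python
import math

def requests_needed(depth, page_size):
    """SERP requests required to cover positions 1 through `depth`."""
    return math.ceil(depth / page_size)

# Before: a single &num=100 request covered positions 1-100.
# After: tools paginate 10 or 20 results at a time.
print(requests_needed(100, 10))  # 10 requests
print(requests_needed(100, 20))  # 5 requests
```

That 5–10x request volume is what's showing up in your tool invoices.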


Why Google Did It (And Why They Won’t Reverse It)

Google rarely explains parameter changes, but the logic is clear:

  1. Cut bot and LLM scraping of deep results. OpenAI’s GPTBot, Anthropic’s ClaudeBot, and hundreds of AI training systems scraped num=100 to build datasets. Google wants control over AI access—licensed APIs or nothing.
  2. Reduce server load. Serving 100 results per query is expensive. Most users see 10. Why subsidize scrapers?
  3. Nudge the industry toward APIs. Google Search Console API, Bing Webmaster Tools API, and commercial tools like DataForSEO remain functional. If you need deep data, pay for it.

Bottom line: Google cleaned its data supply. You should too.

Stop tracking 10,000 vanity keywords. Start tracking the 100 that drive revenue.

Strategic Shift: Measuring Entity-Rich Topic Hubs

Since deep-SERP tracking is unreliable, the success of a Topic Hub (or Pillar-Cluster model) must be measured by its internal authority-building signals and its influence on higher-funnel metrics, rather than raw long-tail impressions.

Old Metric (Now Fading) → New Success Indicator (Focus Here):

  • Impressions (pos 30–100) → Branded search lift. A strong topic hub validates your authority, leading users to search for your brand directly (a powerful conversion indicator).
  • Average position (inflated) → Top-10 rank count. Focus exclusively on the number of non-branded keywords ranking in positions 1–10. Page one is the only thing that matters now.
  • Long-tail keyword volume → Internal link equity flow. Use tools like Screaming Frog or Sitebulb to measure how well the cluster pages (spokes) reinforce the pillar page (hub), checking for a high volume of authoritative, contextual internal links.
  • Traffic to deep pages → Time on site / scroll depth (on the hub). High engagement on the main hub page and clear internal navigation signal to Google that the content offers comprehensive topical depth (an E-E-A-T signal).
  • Keyword density → Entity coverage score. Use NLP-focused tools (or Google's NLP API) to ensure your content semantically covers all relevant entities connected to the topic, not just keywords.

What Happened to Long-Tail Strategy?

Long-tail didn’t die. Tracking it did.

You’ll miss early signals at positions 30–80. If a keyword climbs from #67 to #42, you won’t see it unless you’re paying for deep crawls.

Winners

Sites with topic authority and entity clarity that can break into page one. If you publish “complete guides” that Google trusts, you’ll skip positions 20–50 and land at #8.

Healthcare clinics, legal advisors, and B2B SaaS companies with strong E-E-A-T signals win here.

Losers

Sites living on page 3–10 traffic. If you ranked #35 for 800 keywords and converted 0.2% of that traffic, you’re now invisible—and you can’t prove ROI.

Affiliate sites, thin listicles, and keyword-stuffed directories lose visibility and budget.

The New Play

Shift from “thousands of keywords” to entity clusters and task completion.

Instead of tracking “best CRM,” “top CRM software,” “CRM tools 2025,” build one authoritative hub: “CRM Buyer’s Guide for Dental Practices.”

Cover:

  • Problem (why practices need CRM)
  • Solution (what CRM does)
  • Comparison (5 vendors, pros/cons)
  • How-to (setup walkthroughs)
  • Proof (case studies, ROI data)

Google’s algorithm now rewards topical depth over keyword breadth. So do AI systems like ChatGPT, Perplexity, and Bing’s Copilot.


AI and Generative Search Implications

RAG (Retrieval-Augmented Generation) systems power ChatGPT, Claude, Perplexity, and Google’s AI Overviews. These systems pull context from search results before generating answers.

Before num=100’s removal: RAG systems scraped deep result sets (positions 1–100) to build context.

After: They lean more on page-one sources, licensed feeds (AP, Reuters, Wikipedia), and structured on-page data.

Your Shot at AI Citations Improves When You:

  1. Use explicit structure. TL;DR sections, numbered steps, definition lists, pros/cons tables, and comparison charts parse cleanly.
  2. Mark up pages with schema. FAQPage, HowTo, Product, Organization, Person, and Review schema help AI systems extract facts.
  3. Show first-hand experience. “In our 2024 client cohort, 73% of dental practices using CRM saw 40% faster intake cycles.” Specific beats generic.
  4. Add authorship and credentials. Bylines with LinkedIn links, author bios with years of experience, and Person schema with sameAs links build trust.
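Point 2 in practice: FAQPage markup can be generated as JSON-LD and embedded in the page head. A minimal sketch using the schema.org vocabulary; the question, answer, and wording are placeholders, not real client data:

```python
import json

# Minimal FAQPage JSON-LD (schema.org vocabulary); values are placeholders.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How do I choose a CRM for a dental practice?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Start with intake-workflow fit, then compare "
                        "pricing, integrations, and compliance support.",
            },
        }
    ],
}

# Embed the output in a <script type="application/ld+json"> tag.
print(json.dumps(faq_schema, indent=2))
```

The same pattern extends to HowTo, Product, Organization, Person, and Review types: one explicitly labeled fact per node, so AI systems can quote it with confidence.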

Google’s Search Generative Experience (SGE), now rebranded as AI Overviews, prioritizes sources with clear expertise signals. If your content reads like an AI wrote it—vague, hedged, fluff-heavy—you’re invisible.

GEO (Generative Engine Optimization) shift: Optimize to be citable by AI and clear to algorithms, not just rank #17.



What This Means for Reddit (And Similar UGC Platforms)

Reddit still ranks. Google’s August 2024 forum boost and ongoing privileged access keep Reddit visible for queries like “best VPN Reddit” and “is therapy worth it Reddit.”

But many Reddit wins sit past position 20. A thread ranking #38 still gets upvotes and engagement—but you can’t track it in standard tools.

Net Effect

Reddit remains strong for trust, research, and discovery. Users click Reddit results because they want unfiltered opinions. But don’t bank on deep-page wins you can’t measure.

Tactic: Port Your Best Reddit Answers Into Owned FAQs

If you answered “How do I choose a therapist?” on r/mentalhealth and it got 200 upvotes, copy the structure—not the text—into your site’s FAQ page.

Add FAQPage schema. Link back lightly. Earn citations twice: once from Reddit’s authority, once from your owned content.

Google and AI systems now scrape both. You control one; you influence the other.


Old Way vs. New Way (Quick Reframe)

Old Way

  • Track 10,000 keywords
  • Report impressions as “visibility”
  • Scrape 100 results per query
  • Celebrate page-five moves

New Way

  • Build entity-rich topic hubs
  • Target page-one tasks
  • Track clicks, conversions, assisted revenue, AI citations
  • Measure pipeline impact, not vanity metrics

Sticky note: Don’t post more. Post smarter.

The New Focus: Generative Engine Optimization (GEO) / AI Citations

The shift to “AI Citations” (Generative Engine Optimization or GEO) requires content to be structured for machine extraction and verification.

Action Item → How It Earns the Citation:

  • Use schema markup strategically. Implement FAQPage, HowTo, Article, and QAPage schema using JSON-LD. This explicitly labels passages of text that AI can quote with high confidence.
  • Passage-level optimization. Structure content into short (under 120-word) paragraphs, each focused on a single fact, definition, or data point. LLMs extract and cite these specific passages.
  • Reinforce E-E-A-T. Explicitly state the author's credentials, link to official reports or proprietary data, and use verification anchors (e.g., "(Source: Q3 2025 Report)") to signal factual rigor.
  • Leverage community content. Answer questions on Reddit or Quora, then port the proven, well-received answers onto your owned FAQs with schema markup. The community validation helps build trust signals that AI models favor.
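The under-120-word passage guideline can be linted mechanically before publishing (a sketch; the threshold comes from the guideline above, and the draft text is invented):

```python
def long_passages(text, max_words=120):
    """Return paragraphs that exceed the word budget for clean AI extraction."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    return [p for p in paragraphs if len(p.split()) > max_words]

# One tight passage, one bloated 150-word passage (simulated).
draft = "Short, citable fact.\n\n" + "word " * 150

flagged = long_passages(draft)
print(len(flagged))  # 1 paragraph over budget
```

Run it over drafts before they ship; anything flagged gets split into single-fact passages.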

GEO Playbook: Quarterly Moves to Win Now

Entity Hygiene (Week 1)

Goal: Clarify who you are, what you do, and why Google (and AI) should trust you.

Actions:

  1. Add Organization schema to your homepage with sameAs links to LinkedIn, Crunchbase, and industry directories.
  2. Create or update author bios on every blog post. Include credentials, years of experience, and links to LinkedIn.
  3. Add Person schema with jobTitle, worksFor, and sameAs properties.
  4. Mark up high-intent pages (pricing, services, case studies) with Service or Product schema.
  5. Add FAQ and HowTo schema to support pages.

Why it matters: AI systems extract entities first, then context. If your entities are vague (“we help businesses grow”), you’re invisible. If they’re explicit (“we provide GEO-optimized content for dental practices in Ontario”), you’re citable.

Page-One Content (Weeks 1–3)

Goal: Own the top result for 5 revenue-driving topics.

Actions:

  1. Pick 5 topics where you have real expertise and first-hand experience.
  2. For each topic, create a “Problem → Solution → Comparison → How-to → Proof” content set.
  3. Format for AI extraction:
    • TL;DR at the top (3–4 sentences summarizing the key takeaway).
    • Numbered steps for processes.
    • Pros/cons tables for comparisons.
    • Dates and versions for software or regulatory topics.
    • Case studies with real numbers (client name optional, results mandatory).

Example:
Instead of “10 CRM Tips,” publish “How Dental Practices Reduced No-Shows by 40% Using CRM: A 90-Day Playbook.” Include setup steps, vendor comparisons, cost breakdowns, and ROI proof.


UGC Footprint (Weeks 2–4)

Goal: Answer buyer questions where they’re already asking—then mirror those answers on your site.

Actions:

  1. Identify 10 high-intent questions your ideal clients ask (e.g., “How much does therapy cost in Toronto?”).
  2. Answer them on Reddit, Quora, StackExchange, or LinkedIn.
  3. Lead with helpfulness. Link lightly (max one link per answer, preferably to a resource, not a sales page).
  4. Repurpose each answer into a site FAQ with schema.
  5. Track referral quality from UGC platforms in Google Analytics 4.

Why it matters: UGC platforms rank. AI systems scrape them. If you’re not present, competitors are.

Measurement Reset (Week 4)

Goal: Build new baselines post-September 2025 and focus on KPIs that matter.

Core KPIs:

  • Clicks (Google Search Console)
  • Sessions from organic (GA4)
  • Assisted conversions (GA4, attributed to organic)
  • Branded search lift (Google Trends, GSC brand queries)
  • AI answer presence (manual checks in ChatGPT, Perplexity, Bing Copilot)
  • Referral quality (session duration, pages per session, conversion rate by source)

Support KPIs:

  • Bing Webmaster Tools clicks
  • On-site search logs (what users search internally)
  • Backlinks from high-authority domains (Ahrefs, Majestic)

Narrative for stakeholders: “Lower impressions ≠ lower demand. Bot traffic vanished. Real buyer behavior stayed steady. We’re now measuring what converts, not what inflates dashboards.”


Reporting and Dashboards Post-num=100

Core Dashboard

Google Search Console:

  • Clicks by page (top 20 landing pages)
  • Clicks by query (top 50 queries)
  • Average position for branded vs. non-branded terms
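The branded vs. non-branded split can be done offline on a GSC query export (a sketch; the brand terms and sample rows are invented for illustration, and real exports will have thousands of rows):

```python
# Segment a GSC query export into branded vs. non-branded buckets.
# Brand terms and sample rows below are invented for illustration.
BRAND_TERMS = ("tredigital", "maria dykstra")

def is_branded(query):
    q = query.lower()
    return any(term in q for term in BRAND_TERMS)

rows = [
    {"query": "tredigital pricing", "clicks": 40},
    {"query": "best crm for dentists", "clicks": 25},
    {"query": "maria dykstra geo playbook", "clicks": 12},
]

branded = [r for r in rows if is_branded(r["query"])]
non_branded = [r for r in rows if not is_branded(r["query"])]

print(sum(r["clicks"] for r in branded))      # branded clicks
print(sum(r["clicks"] for r in non_branded))  # non-branded clicks
```

Rising branded clicks alongside steady non-branded clicks is the branded-search-lift signal the hub strategy is built to produce.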

Google Analytics 4:

  • Organic sessions
  • Conversion rate by landing page
  • Assisted conversions (organic touch in funnel)
  • Session quality (engagement rate, pages per session)

Branded Search Trends:

  • Google Trends for your brand name + category
  • GSC queries containing your brand

Page-One Share:

  • Manual or tool-based tracking (e.g., SEMrush Position Tracking) for your top 20 target keywords
  • Focus on positions 1–10 only

Support Dashboard

Bing Webmaster Tools:

  • Clicks and impressions (Bing still shows deeper position data)
  • Top queries (often different audience than Google)

Referral Sources:

  • GA4 → Acquisition → Traffic Acquisition → Source/Medium
  • Filter for Reddit, Quora, YouTube, LinkedIn
  • Track conversion rate and revenue by source

AI Visibility Tracking:

  • Manual checks: Search your target queries in ChatGPT, Perplexity, Bing Copilot, Google AI Overviews
  • Document when your content or brand appears
  • Track citation links (some AI tools provide source URLs)

On-Site Search:

  • GA4 → Events → view_search_results
  • See what users search after landing on your site (signal of content gaps)

Risks and Guardrails

1. Don’t Chase “Ghost” Long-Tail

If you can’t measure it, treat it as research, not ROI. Ranking #47 for “best CRM for orthodontists in Calgary” might drive one visit per month. That’s not a KPI—it’s noise.

Focus on the 20–50 keywords that drive 80% of your revenue.

2. Don’t Spam Forums

Moderators and AI-powered spam filters (like Reddit's AutoModerator and GPT-based detection) will bury you. One self-promotional comment per thread is the ceiling. Ten is a ban.

Lead with helpfulness. Link sparingly. Build reputation first.

3. Avoid Single-Channel Risk

If 90% of your traffic comes from Google, you’re one algorithm update away from a revenue crisis. Diversify:

  • Build email lists (owned audience)
  • Publish on LinkedIn, Medium, Substack (distribution + backlinks)
  • Invest in Bing via Bing Webmaster Tools (Bing holds roughly 11% of the U.S. search market, higher in enterprise)
  • Create YouTube content (Google owns it, but it’s a separate algorithm)

FAQs: What You’re Actually Asking

Did my rankings drop because num=100 is gone?

No. Your rankings for real users didn’t change. Your rank tracker’s visibility dropped. Google still shows 100+ results per query—you just can’t cheaply monitor positions 30–100 anymore.

Why did my impressions drop 60% overnight?

Bot traffic vanished. Scrapers, AI crawlers, and rank trackers generated impressions at deep positions. Those weren’t real users. Google cleaned its reporting. You’re now seeing actual human visibility.

Is long-tail SEO dead?

No. It’s harder to track, not to earn. Long-tail keywords still convert. But you must prove value with page-one content, not page-seven speculation. Use entity clusters (topical hubs) instead of isolated keywords.

How do I get into AI answers now?

Be clear, structured, experience-led, and entity-rich. Use TL;DR sections, numbered steps, schema markup, and first-hand data. Publish where people ask questions (Reddit, Quora, LinkedIn), then mirror those answers on your site with FAQPage schema.

Is Reddit still worth my time?

Yes—for trust, research, and discovery. Users click Reddit results because they want unfiltered opinions. Pair Reddit answers with owned FAQs to capture citations from both Google’s algorithm and AI systems.

What should I stop doing right now?

Stop tracking vanity keywords. Stop celebrating page-five moves. Stop publishing thin content without E-E-A-T signals. Stop ignoring Bing, YouTube, and email. Stop measuring impressions as success.

What should I start doing right now?

Start building entity-rich topic hubs. Start optimizing for AI citations. Start tracking conversions, not impressions. Start diversifying traffic sources. Start proving ROI with pipeline data, not dashboard screenshots.


Final Takeaway

Google’s num=100 removal didn’t kill your SEO strategy. It killed your excuses.

You can’t hide behind vanity metrics anymore. You can’t report “10,000 keywords tracked” as progress. You can’t celebrate page-six rankings that convert zero buyers.

The new reality: Page one or bust. Entity clarity or invisibility. AI citations or algorithmic obscurity.

Build content that real experts would cite. Structure it so AI systems can extract facts. Measure what drives revenue, not what inflates dashboards.

This isn’t a setback. It’s a filter. The sites that win from here are the ones that deserved to win all along.


About the Author

Maria Dykstra is a GTM strategist and founder of TreDigital, an agency specializing in AI-powered visibility systems for healthcare and B2B service providers. She's also embedded in Exactly AI Solutions, where she drives go-to-market strategy for agentic AI infrastructure. Maria builds plug-and-play AI tools for founders who want to scale smarter and runs a healthcare-focused growth agency using GEO (Generative Engine Optimization) and AEO (AI Engine Optimization). She's spent 13 years helping visionary founders navigate algorithmic chaos and build compound visibility engines.

Connect: LinkedIn
