Google Search Console FAQ

Answers to the most common questions about Google Search Console - from getting started to advanced troubleshooting.

Getting Started

Setting up your Google Search Console account

Google Search Console (formerly Google Webmaster Tools) is a free service offered by Google that helps you monitor, maintain, and troubleshoot your website's presence in Google Search results.

It provides data on:

  • Which search queries bring users to your site
  • How often your pages appear in search results
  • Click-through rates and average ranking positions
  • Crawl errors and technical issues
  • Core Web Vitals and page experience signals

Tip: GSC Wizard connects to your Search Console data to provide deeper analytics, traffic forecasting, and content optimization insights beyond what the standard interface offers.

Google Search Console is completely free. You only need a Google account and a website to get started. There are no premium tiers or paid features within Search Console itself.

To get even more value from your Search Console data, tools like GSC Wizard offer advanced analytics and insights on top of your free GSC data.

To add your site to Search Console:

  1. Go to search.google.com/search-console
  2. Sign in with your Google account
  3. Click "Add property"
  4. Choose either Domain (covers all subdomains and protocols) or URL prefix (covers a specific URL path)
  5. Complete the verification process to prove you own the site

When you add a property, you choose between two types:

Domain Property

  • Covers all subdomains (www, m, blog, etc.)
  • Covers both HTTP and HTTPS
  • Requires DNS verification only
  • Recommended for most sites

URL Prefix Property

  • Covers only URLs under a specific prefix
  • Specific to one protocol (HTTP or HTTPS)
  • Supports multiple verification methods
  • Useful for tracking specific subdomains separately

To access your Search Console account, go to search.google.com/search-console and sign in with your Google account. A few tips:

  • Make sure you're signed in with the same Google account that was used to verify the property
  • If you have multiple accounts, check the account switcher in the top-right corner of Google
  • If you were granted access by someone else, you'll see their property listed after they add your email under Users & permissions

Google Webmaster Tools and Google Search Console are the same product. Google rebranded Google Webmaster Tools to Google Search Console in May 2015. The new name better reflects its broader audience - not just developers and webmasters, but all site owners and SEOs. Since the rebrand, Google has substantially expanded the tool with reports like Core Web Vitals, the URL Inspection tool, and the Page Experience report.

If you see old tutorials referencing "Google Webmaster Tools," the instructions still apply - just look for the equivalent feature in Search Console.

There are three main ways to verify your WordPress site:

Method 1: Yoast SEO Plugin

In Search Console, copy your HTML tag verification code. In WordPress, go to Yoast SEO > General > Webmaster Tools, paste only the value from the content="" attribute into the Google field, and save.

Method 2: Google Site Kit Plugin

Install Google's official Site Kit plugin from the WordPress plugin directory. It handles Search Console verification automatically when you connect your Google account.

Method 3: Manual HTML Tag

Use a plugin like Insert Headers and Footers to paste the full meta tag into the site header, or add it directly to your theme's header.php file before the </head> tag.

To set up Search Console for a Shopify store:

  1. In Search Console, add your store URL as a URL prefix property
  2. Choose HTML tag as the verification method and copy the meta tag
  3. In Shopify admin, go to Online Store > Themes > Actions > Edit Code
  4. Open theme.liquid and paste the meta tag inside the <head> section
  5. Save the file, then click Verify in Search Console
  6. Submit your sitemap at yourdomain.com/sitemap.xml (Shopify generates this automatically)

Verification

Proving ownership of your website

After adding your property, Google will ask you to verify ownership. The recommended method depends on your property type:

  • Domain property: Add a DNS TXT record through your domain registrar
  • URL prefix property: Choose from HTML file upload, HTML meta tag, Google Analytics, or Google Tag Manager

For most users, the HTML meta tag method (adding a tag to your homepage's <head> section) is the simplest option.

Google Search Console offers five verification methods:

  1. DNS TXT record - Add a record through your domain registrar. Required for Domain properties.
  2. HTML file upload - Upload a verification file to your site's root directory.
  3. HTML meta tag - Add a <meta> tag to your homepage's <head> section.
  4. Google Analytics - Verify using your existing GA tracking code.
  5. Google Tag Manager - Verify using your GTM container snippet.

WordPress users can also use plugins like Yoast SEO, Rank Math, or Google's official Site Kit for simplified verification.

Common reasons for verification failure:

  • DNS changes haven't propagated yet - Wait 24-48 hours and try again
  • HTML file was modified or renamed - Re-download and upload the exact file
  • Meta tag isn't in the <head> - Ensure it's before the closing </head> tag
  • robots.txt is blocking Googlebot - Temporarily allow access
  • Wrong URL variation - Verify HTTP vs HTTPS or www vs non-www matches

Try clearing your website cache, then attempt verification again.

To verify using the HTML meta tag method:

  1. In Search Console, after adding your URL prefix property, choose "HTML tag" as the verification method
  2. Copy the entire <meta name="google-site-verification" content="..."> tag
  3. Paste it into the <head> section of your homepage's HTML, before the closing </head> tag
  4. Publish the change, then click Verify in Search Console

Important: Do not remove the meta tag after verification. If Google can no longer find it, your property will become unverified.

DNS verification is required for Domain properties and works across all subdomains. Steps:

  1. In Search Console, copy the TXT record value provided (it looks like google-site-verification=abc123...)
  2. Log into your domain registrar (Cloudflare, GoDaddy, Namecheap, etc.)
  3. Navigate to DNS settings for your domain
  4. Add a new TXT record with Host/Name set to @ and Value set to the copied code
  5. Save, then return to Search Console and click Verify

DNS changes can take up to 48 hours to propagate, though it usually happens within a few minutes to a few hours.
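
If you want to confirm the record has propagated before clicking Verify, you can query DNS yourself. A minimal sketch using the third-party dnspython package (an assumption - dig or an online DNS checker works just as well):

  import dns.resolver  # pip install dnspython

  # Fetch all TXT records on the root domain and look for the token
  answers = dns.resolver.resolve("example.com", "TXT")
  for rdata in answers:
      txt = b"".join(rdata.strings).decode()
      if txt.startswith("google-site-verification="):
          print("Verification record found:", txt)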

Submitting & Indexing

Getting your pages discovered and indexed by Google

To get your site on Google:

  1. Add your site to Google Search Console and verify ownership
  2. Submit your sitemap via the Sitemaps section (usually /sitemap.xml)
  3. Request indexing for important individual pages using the URL Inspection tool

Google will discover and crawl your site automatically over time, but submitting a sitemap speeds up the process significantly.

To submit a sitemap:

  1. In Google Search Console, click "Sitemaps" in the left sidebar
  2. Enter your sitemap URL (typically /sitemap.xml) in the "Add a new sitemap" field
  3. Click Submit

Google will process the sitemap and report any errors found. You can submit multiple sitemaps if your site is large or has distinct sections.

To request indexing for a specific URL:

  1. Open the URL Inspection tool (search bar at the top of Search Console)
  2. Enter the URL you want indexed
  3. Wait for the inspection results
  4. Click "Request Indexing"

Note: There's a daily quota on indexing requests, so prioritize your most important pages. Google typically processes requests within a few days.

Common reasons why Google isn't indexing your pages:

  • noindex meta tag or HTTP header on the pages
  • robots.txt blocking Googlebot from crawling
  • Low-quality or thin content that Google doesn't find valuable
  • Crawl budget issues on large sites
  • Canonical tags pointing to a different URL
  • Pages behind authentication that Googlebot can't access
  • New site that Google hasn't fully discovered yet

Check the Pages report in Search Console for specific indexing status and error details for each URL.

The "Discovered – currently not indexed" status means Google knows the URL exists (it found it via a sitemap or link) but hasn't crawled it yet. This typically happens when:

  • Google's crawl budget for your site is limited
  • Google doesn't consider the page high-priority enough to crawl immediately
  • The server was overloaded during previous crawl attempts

Improving internal linking, content quality, and overall site authority can help Google prioritize crawling these pages.

The "Crawled – currently not indexed" status means Google has crawled the page but decided not to add it to its index. This typically indicates a quality concern - Google may consider the content:

  • Too thin or lacking unique value
  • Duplicate of other content on your site or the web
  • Not valuable enough for search users

To fix this, improve the content by adding unique value, ensure it's not duplicating other pages, and strengthen internal links pointing to it.

Indexing time varies widely - from a few hours to several weeks. Factors that affect speed include:

  • Site authority and crawl frequency
  • Content quality and uniqueness
  • Whether you've submitted a sitemap
  • How new your site is (newer sites take longer)

Using the URL Inspection tool to request indexing can speed things up for individual pages.

A few reasons why the Request Indexing button may be grayed out, disabled, or not working:

  • Daily quota reached - Each property has a limited number of indexing requests per day. Wait 24 hours and try again.
  • Temporary feature limitations - Google has occasionally paused or rate-limited this feature. Check the Search Console Help for any known issues.
  • URL not eligible - The URL may be blocked by robots.txt or have a noindex tag, preventing it from being indexed regardless.

For bulk indexing, submitting an updated sitemap is more reliable than using the Request Indexing button on individual pages.

A "Couldn't fetch" error means Google was unable to fetch or parse your sitemap file. Common causes:

  • Sitemap URL returns a 404 or 500 error - Check the URL is accessible in a browser
  • Invalid XML formatting - Validate your XML at a tool like xml-sitemaps.com
  • Blocked by robots.txt - Ensure your sitemap URL isn't disallowed
  • Server timeout - The server took too long to respond when Google tried to fetch it
  • Wrong content type - The server must serve it with an XML content type

After fixing the issue, resubmit the sitemap URL in Search Console.
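
Two of these causes - the status code and the content type - are easy to check yourself. A minimal sketch using the third-party requests package (an assumption; curl -I works just as well):

  import requests  # pip install requests

  resp = requests.get("https://example.com/sitemap.xml", timeout=10)
  print(resp.status_code)                   # should be 200, not 404 or 500
  print(resp.headers.get("Content-Type"))   # should be an XML type such as application/xml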

Indexing Issues

Understanding and resolving indexing status in Search Console

The URL Inspection tool is accessible via the search bar at the top of any Search Console page. Enter any URL from your property to see:

  • Index status - Whether the URL is currently indexed
  • Last crawl date - When Googlebot last visited the page
  • Crawl details - Referring page, crawl allowed/blocked status
  • Rendered page - Click "View Crawled Page" to see how Google sees the page after JavaScript rendering
  • Enhancements - Any structured data or AMP errors detected

Use "Request Indexing" after publishing new content or making important changes to a page.

The Pages report (formerly Coverage report) shows four status categories:

Error - Pages that could not be indexed due to a critical issue (e.g., server errors, redirect errors, blocked by robots.txt when not intended).
Valid with Warning - Indexed but has an issue worth investigating (e.g., indexed despite a noindex on another version).
Valid - Successfully indexed by Google.
Excluded - Not indexed, but usually intentionally (noindex tag, redirects, duplicates, etc.).

Prioritize fixing Error status pages, especially if they are important pages that should be indexed.

Excluded pages are not in Google's index, but this is often intentional. Common excluded statuses include:

  • Crawled – currently not indexed - Google crawled it but chose not to index it (quality issue)
  • Discovered – currently not indexed - Known but not yet crawled
  • Duplicate without canonical tag selected - Google chose a different canonical version
  • Excluded by 'noindex' tag - You've told Google not to index this page
  • Blocked by robots.txt - Googlebot can't access the page
  • Page with redirect - The URL redirects to another URL
  • Not found (404) - Page returns a 404 error
  • Soft 404 - Page returns 200 but appears to have no content

Review each excluded reason to confirm it's intentional. Unexpected exclusions may indicate issues worth investigating.

A soft 404 is a page that returns a 200 OK HTTP status code but contains content that Google considers equivalent to a "page not found" - such as:

  • A nearly empty page with little or no content
  • A "no results found" search results page
  • An out-of-stock product page with no useful information
  • A page that says "Coming soon" with no other content

To fix soft 404s: if the page is intentionally gone, return a proper 404 or 410 HTTP status code. If the page should exist, add meaningful, useful content to it.
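
As an illustration of the first fix, here is how a permanently removed page might return a 410 using Flask (an assumed framework - any server can send these status codes):

  from flask import Flask, abort

  app = Flask(__name__)

  @app.route("/products/discontinued-widget")
  def discontinued():
      # 410 Gone signals a permanent removal; a plain 404 also works
      abort(410)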

The "Page with redirect" status means the URL Google discovered (or that's in your sitemap) redirects to a different URL. This is usually fine for intentional 301 permanent redirects, but becomes problematic when:

  • You have redirect chains (A redirects to B which redirects to C)
  • Important pages are being redirected unexpectedly
  • Your sitemap contains redirecting URLs instead of the final destination URLs

Best practice: update your sitemap to list only final destination URLs, and minimize redirect chains to a single hop.

Performance Reports & Data

Understanding your search performance metrics

Click "Search results" in the left sidebar to see your Performance report. It shows:

  • Total clicks, impressions, average CTR, and average position
  • Queries tab - which search terms bring traffic
  • Pages tab - which pages perform best
  • Countries, Devices, Search appearance - breakdowns by dimension

Use the date range and filter options to drill into specific time periods, queries, or pages.

Tip: GSC Wizard extends the Performance report with traffic forecasting, content decay detection, keyword cannibalization analysis, and CTR benchmarking.

The Performance report tracks four key metrics:

Clicks - The number of times users clicked through to your site from Google Search results.
Impressions - How many times your pages appeared in search results, even if nobody clicked.
CTR (Click-Through Rate) - The percentage of impressions that resulted in a click, calculated as clicks ÷ impressions. For example, 50 clicks from 1,000 impressions is a 5% CTR.
Average Position - The average ranking position in search results. Position 1 is the top result.

Google Search Console stores up to 16 months of Performance report data. If you need to keep data longer, you should:

  • Export data regularly as CSV or Google Sheets
  • Use the Search Console API to store data in your own database
  • Use a tool like GSC Wizard that automatically stores and analyzes your historical data

The Performance report shows the actual search queries driving impressions and clicks to your site, making it a powerful keyword research source. Effective strategies include:

  • Striking distance keywords - Find queries ranking in positions 8-20 that are close to page one
  • High-impression, low-CTR queries - Opportunities to improve titles and descriptions
  • Long-tail discoveries - Sort by impressions to find niche queries competitors miss
  • Unexpected keywords - Queries you rank for that you didn't intentionally target

Unlike third-party tools, this is real data directly from Google, making it highly reliable for SEO decisions.
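
As an illustration, here is how you might surface striking distance keywords from a Performance report CSV export with pandas - a sketch assuming the export's standard Top queries / Clicks / Impressions / CTR / Position columns:

  import pandas as pd

  # Load a Queries export (Performance report > Export > CSV)
  df = pd.read_csv("queries.csv")

  # Striking distance: ranking just off page one
  striking = df[(df["Position"] >= 8) & (df["Position"] <= 20)]

  # Highest-impression opportunities first
  print(striking.sort_values("Impressions", ascending=False).head(20))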

Tip: GSC Wizard automatically identifies striking distance keywords, content decay, and keyword cannibalization opportunities from your Search Console data.

Search Console and Google Analytics measure fundamentally different things:

Search Console

  • Tracks search impressions and clicks before users reach your site
  • Filters spam clicks
  • Anonymizes some queries

Google Analytics

  • Tracks actual visits and behavior on your site
  • Covers all traffic sources, not just search
  • Reports in your property's configured time zone (Search Console uses Pacific Time)

The two tools are complementary, not duplicative. Use both for a complete picture.

CTR varies significantly by ranking position, query type, and industry. General benchmarks by position:

  • Position 1: 20–30% average CTR
  • Position 2–3: 10–15% average CTR
  • Position 4–10: 2–8% average CTR
  • Position 11+: Under 2%

Branded queries typically get higher CTR. High impressions + low CTR is your biggest opportunity - it means you're visible but your title or meta description isn't compelling enough to earn the click. Optimize these pages first.
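
To put a number on that opportunity, you can compare a page's actual CTR to the rough benchmark for its position - a sketch using the illustrative midpoints from the list above, not Google data:

  # Rough midpoint CTR benchmarks per position bucket (illustrative only)
  def benchmark_ctr(position: float) -> float:
      if position <= 1.5:
          return 0.25   # position 1: ~20-30%
      if position <= 3:
          return 0.125  # positions 2-3: ~10-15%
      if position <= 10:
          return 0.05   # positions 4-10: ~2-8%
      return 0.02       # position 11+

  impressions, clicks, position = 40_000, 600, 4.2
  expected_clicks = impressions * benchmark_ctr(position)
  print(f"Actual CTR {clicks / impressions:.1%}, "
        f"benchmark ~{benchmark_ctr(position):.0%}, "
        f"missed clicks ~{expected_clicks - clicks:,.0f}")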

Tip: GSC Wizard automatically flags pages with below-average CTR for their ranking position, helping you prioritize which titles and descriptions to improve.

Average position is the mean ranking across all queries that triggered impressions. It can be misleading because:

  • A page might rank #1 for some queries and #30 for others, producing an average of around #15
  • High-volume low-ranking queries can drag down the average
  • Position can vary significantly by device, country, and personalization

For more meaningful analysis: filter by a specific query or page to see its true position, or segment by device and country. An improving average position trend is a positive signal even if the absolute number looks high.

Performance data (clicks and impressions) is typically 2–4 days behind the current date. This happens because Google processes and aggregates massive amounts of search data before displaying it. A few things to know:

  • The last 2–3 days of data are often incomplete - avoid drawing conclusions from the very latest data points
  • Index coverage and URL Inspection data updates more frequently
  • Core Web Vitals data (from CrUX) refreshes on a weekly basis
  • Performance data is stored for up to 16 months

In the Performance report, click + New under the filter bar and select Query or Page, then choose "Custom (regex)". GSC uses RE2 regex syntax. Useful examples:

  • ^(how|what|why|when|where|who) - Filter for question-based queries
  • yourbrand, with the filter set to "Doesn't match regex" - Exclude brand queries to see non-brand performance (RE2 has no negative lookahead, so a pattern like ^(?!.*yourbrand) won't work)
  • /blog/.* - Show only blog pages in the Page filter
  • best.*2025 - Find queries containing "best" and a year

Search Console's regex filters are case-sensitive by default; prefix a pattern with (?i) to make it case-insensitive.
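
Before pasting a pattern into the filter, you can sanity-check it locally. The examples above avoid lookahead and other non-RE2 syntax, so Python's built-in re module behaves the same way on them (Search Console itself runs RE2):

  import re

  queries = ["how to submit a sitemap", "gsc wizard review", "best seo tools 2025"]

  pattern = re.compile(r"^(how|what|why|when|where|who)")
  for q in queries:
      # re.search matches anywhere, mirroring Search Console's partial matching
      if pattern.search(q):
          print("Question query:", q)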

You have several export options:

  • Export button in the UI - In any Performance report view, click the Export button (top right) to download as CSV or open directly in Google Sheets
  • Looker Studio - Connect via the Search Console data source for live, interactive dashboards
  • Search Console API - For automated, programmatic exports beyond the UI's 1,000-row limit
  • Third-party tools - Tools like GSC Wizard connect via the API and archive data automatically, removing the 16-month storage limit

Note that the UI limits exports to 1,000 rows per export. Use the API for full data access.

To connect Search Console to Looker Studio:

  1. Go to lookerstudio.google.com and click Create > Data Source
  2. Search for and select the Search Console connector
  3. Authorize access to your Google account
  4. Choose your site and the data type: Site Impression (aggregated by site) or URL Impression (per page)
  5. Click Connect, then use the data source in a new or existing report

Once connected, you can build dashboards that combine Search Console data with Google Analytics 4, Ads, and other data sources for a complete marketing overview.

If your site receives Google Discover traffic, a dedicated Discover performance report will appear in the left sidebar of Search Console. It shows:

  • Impressions and clicks from Discover, separate from regular Search
  • Top performing pages in Discover
  • Trends over time

The Discover report only appears once your site has accumulated enough Discover impressions to meet Google's reporting threshold. Discover performance tends to be more volatile than search performance since it's driven by content freshness and engagement signals.

Sitemaps

Managing your XML sitemap for better crawling

A sitemap is an XML file that lists all the important pages on your website. It helps search engines discover and crawl your content more efficiently.

While Google can find pages through links, a sitemap is especially important for:

  • New websites with few external links
  • Large sites with hundreds or thousands of pages
  • Pages with few internal links pointing to them
  • Sites with frequently updated content
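
For reference, a minimal sitemap following the sitemaps.org protocol looks like this (URLs are placeholders; lastmod is optional):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://example.com/important-page</loc>
      <lastmod>2025-01-15</lastmod>
    </url>
    <url>
      <loc>https://example.com/another-page</loc>
    </url>
  </urlset>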

Most CMS platforms generate sitemaps automatically:

  • WordPress - Plugins like Yoast SEO or Rank Math create and maintain your sitemap
  • Shopify, Wix, Squarespace - Generate sitemaps automatically
  • Custom sites - Use online sitemap generators or create the XML file manually following the sitemap protocol

Your sitemap is typically accessible at yourdomain.com/sitemap.xml.

Common sitemap errors include:

  • Invalid XML format - Malformed XML syntax or encoding issues
  • URLs returning 404 errors - Listed pages that no longer exist
  • URLs blocked by robots.txt - Pages listed but prevented from crawling
  • URLs with noindex tags - Contradictory signals to Google
  • Sitemap too large - Over 50MB or 50,000 URLs (split into multiple sitemaps)
  • Sitemap URL unreachable - Server errors when Google tries to fetch it

Fix the reported issues, then resubmit the sitemap in Search Console.

Core Web Vitals

Page experience and performance metrics

Core Web Vitals are a set of metrics measuring real-world user experience on your website. Since 2021, these are Google ranking factors. The three metrics are:

  • LCP (Largest Contentful Paint) - Loading speed
  • INP (Interaction to Next Paint) - Responsiveness
  • CLS (Cumulative Layout Shift) - Visual stability

The Core Web Vitals report in Search Console shows how your pages perform based on real Chrome user data (CrUX), with separate reports for mobile and desktop.

LCP - Largest Contentful Paint

Measures how fast the main content loads. Aim for under 2.5 seconds.

INP - Interaction to Next Paint

Measures how quickly your page responds to user interactions (clicks, taps, key presses). Aim for under 200 milliseconds.

CLS - Cumulative Layout Shift

Measures how much the page layout shifts unexpectedly while loading. Aim for a score under 0.1.

Fixes depend on which metric needs improvement:

For LCP (loading speed):

  • Optimize and compress images
  • Minimize render-blocking CSS and JavaScript
  • Use a CDN and enable browser caching

For INP (responsiveness):

  • Reduce JavaScript execution time
  • Break up long tasks into smaller chunks
  • Optimize event handlers

For CLS (visual stability):

  • Set explicit width and height on images and ads
  • Avoid inserting content above existing content
  • Use CSS containment for dynamic elements

Use PageSpeed Insights to diagnose specific issues. After fixing, click "Validate Fix" in the Core Web Vitals report. Improvements can take up to 28 days to reflect in Search Console.

Technical SEO

Using Search Console to find and fix technical SEO issues

Go to Experience > Mobile Usability in the left sidebar. The report lists pages with issues such as:

  • Text too small to read - Font size below ~12px on mobile
  • Clickable elements too close together - Links or buttons within 48px of each other
  • Content wider than screen - Horizontal scrolling required
  • Viewport not set - Missing <meta name="viewport"> tag

Click any issue to see the affected URLs. After fixing the CSS or HTML issues, click "Validate Fix" to notify Google.

Go to the Enhancements section in the left sidebar. You'll see individual reports for each structured data type Google has detected on your site - such as FAQ, Product, Article, Recipe, and more. Each report splits items into:

  • Valid items - Structured data eligible for rich results in search
  • Warnings - Issues that won't prevent rich results but should be fixed
  • Errors - Problems that make the structured data invalid

Use Google's Rich Results Test to validate individual pages before publishing new structured data.

Google retired the dedicated robots.txt tester in 2023, replacing it with a robots.txt report. Current options for checking your robots.txt:

  • robots.txt report - In Search Console, go to Settings > robots.txt to see the files Google found and any parsing issues
  • URL Inspection tool - Inspect any URL; it will show "Blocked by robots.txt" if applicable
  • View directly - Access yourdomain.com/robots.txt in a browser to check the current rules
  • Crawl stats - Go to Settings > Crawl stats to see if Googlebot is being blocked from sections of your site
  • Third-party testers - Open-source robots.txt parsers or online validators
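
For a quick local check, Python's standard library ships a robots.txt parser. Note it implements the original robots exclusion standard, so edge cases may differ slightly from Google's own parser:

  from urllib.robotparser import RobotFileParser

  rp = RobotFileParser("https://example.com/robots.txt")
  rp.read()

  # Would Googlebot be allowed to crawl this URL?
  print(rp.can_fetch("Googlebot", "https://example.com/blog/some-post"))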

In the Pages report, look for the "Not found (404)" status under the Error category. Click it to see all affected URLs. For each one:

Should the page exist?

Restore the page, or redirect it to the most relevant existing page using a 301 redirect.

Is the 404 intentional?

You can leave it - Google will eventually drop it from the index. Only prioritize fixing 404s that have backlinks pointing to them or that were previously high-traffic pages.

Not all 404s are bad. Fixing every 404 is not necessary - focus on the ones that matter for SEO and user experience.

The Page Experience report (under Experience in the sidebar) combines multiple user experience signals into one overview:

  • Core Web Vitals - LCP, INP, and CLS performance
  • Mobile usability - Whether pages are mobile-friendly
  • HTTPS - Whether your site is served over a secure connection

The report shows what percentage of your URLs provide a "Good" page experience. These signals are used by Google as part of its page experience ranking system. Improving these metrics can positively influence rankings, especially in competitive niches where content quality is similar across competing pages.

In the Enhancements section of the sidebar, you'll find individual reports for each rich result type detected on your site, including:

  • FAQ - FAQ schema that can show expandable Q&As in search results
  • Product - Product prices, availability, and ratings
  • Article - News and blog articles with enhanced display
  • Recipe - Recipe cards with cook time and ratings
  • HowTo - Step-by-step instructions in search results

Fix any errors in these reports to ensure your pages are eligible for rich results in Google Search. Valid structured data doesn't guarantee rich results - Google decides whether to show them based on relevance and quality.

Troubleshooting

Fixing common Google Search Console issues

If your site isn't showing up on Google, use the URL Inspection tool to check whether your pages are indexed. Common reasons for not appearing:

  • Your site has a noindex meta tag or HTTP header
  • robots.txt is blocking Googlebot
  • The site is too new and hasn't been crawled yet
  • There's a manual action penalty
  • Your content doesn't match what users are searching for
  • You haven't submitted a sitemap or verified your site

Manual actions are penalties imposed by Google's human reviewers when a site violates Google's spam policies. Common reasons include:

  • Unnatural inbound or outbound links
  • Thin or auto-generated content
  • Cloaking or sneaky redirects
  • Structured data abuse

To fix a manual action:

  1. Check the Manual Actions report in Search Console
  2. Address all violations described
  3. Submit a reconsideration request explaining what you fixed
  4. Wait for Google to review (can take weeks)

To remove a page from Google's search results, there are two approaches depending on whether the removal is temporary or permanent:

Temporary removal (~6 months)

Use the Removals tool in Search Console (left sidebar > Removals). Enter the URL and submit the request.

Permanent removal

Add a noindex meta tag to the page, or return a 404 or 410 status code. Then use the Removals tool to speed up the de-indexing.

The Disavow Tool allows you to tell Google to ignore specific backlinks pointing to your site. It's used when you have spammy or low-quality links that:

  • You can't get removed by contacting the webmaster
  • May be harming your rankings
  • Were part of a link scheme or negative SEO attack

Warning: Use the Disavow Tool cautiously. Disavowing legitimate links can hurt your SEO. It's available at search.google.com/search-console/disavow-links.

Links

Analyzing backlinks and internal links in Search Console

Go to Links in the left sidebar. The External Links section shows:

  • Top linked pages - Your pages with the most external backlinks
  • Top linking sites - Domains that link to you most often
  • Top linking text - Most common anchor text used in links to your site

This is first-party data directly from Google, making it highly accurate for understanding your link profile. Note that GSC shows a representative sample, not necessarily all links - for comprehensive backlink research, supplement with tools like Ahrefs or Semrush.

Tip: GSC Wizard makes it easy to monitor link growth and changes to your backlink profile over time using your Search Console data.

The Links report includes an Internal Links section showing which pages on your site receive the most links from other pages on your site. This is useful for:

  • Identifying orphaned pages with very few or no internal links
  • Understanding which pages your site architecture naturally emphasizes
  • Finding opportunities to improve internal linking to important pages that rank poorly

Pages with few internal links are often harder for Google to discover and may receive less crawl budget. Add contextual internal links from related content to strengthen these pages.

The disavow tool is at search.google.com/search-console/disavow-links. To use it:

  1. Create a .txt file with the domains or URLs to disavow (one per line; domains prefixed with domain:)
  2. Select your property on the disavow page
  3. Upload the file
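
The file itself is plain text, one entry per line; lines starting with # are comments:

  # Disavow every URL on these domains
  domain:spam-example-one.com
  domain:spam-example-two.net

  # Disavow a single specific URL
  https://blog.example.org/low-quality-links-page.html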

Warning: Only disavow links if you have a manual action for unnatural links, or if you are certain specific links are harmful and cannot be removed by contacting the webmaster. Incorrectly disavowing good links can significantly hurt your rankings. Google's algorithm is quite good at ignoring spammy links on its own.

User Management

Managing access and permissions in Search Console

To add a user to your property:

  1. In Search Console, click the Settings gear icon (bottom of the left sidebar)
  2. Click Users and permissions
  3. Click Add user
  4. Enter the person's Google account email address
  5. Select their permission level: Owner, Full, or Restricted
  6. Click Add

The new user will receive an email notification and can access the property immediately. You must be an Owner to add other users.

Search Console has three permission levels:

Owner

Full access to all data and features. Can add and remove users, change settings, and verify new properties. There are two types: Verified owners (verified the property themselves) and Delegated owners (granted owner status by a verified owner).

Full user

Can view all data and take most actions (submit sitemaps, request indexing, use the URL Inspection tool), but cannot manage users or delete the property.

Restricted user

Can view most reports but cannot see all data and cannot take actions. Useful for clients or stakeholders who only need to see performance data.

Google Search Console vs Other Tools

How GSC compares to and integrates with other tools

Google Search Console

  • Focuses on search performance
  • How you appear in Google Search
  • Which queries trigger your pages
  • Indexing status and technical SEO
  • Core Web Vitals and page experience

Google Analytics

  • Focuses on user behavior on your site
  • Page views, session duration, bounce rates
  • Conversions and goal tracking
  • All traffic sources (not just search)
  • Audience demographics and interests

Both are free, and most useful when used together. You can link them for combined insights.

To link Search Console with Google Analytics 4:

  1. In Google Analytics 4, go to Admin > Property Settings > Product Links > Search Console Links
  2. Click "Link"
  3. Select your Search Console property
  4. Choose the GA4 web stream to associate
  5. Confirm the link

Once linked, you can see Search Console data in the GA4 Acquisition > Search Console reports, combining search query data with on-site behavior metrics.

Google Search Console and Semrush complement each other:

Google Search Console

  • Free
  • First-party data from Google
  • Exact clicks & impressions for your own site
  • No data on competitor sites
  • Indexing, technical, and link reports

Semrush

  • Paid (from ~$130/month)
  • Estimated data for any website
  • Competitor keyword gap analysis
  • Site audit and backlink tracking
  • Broader keyword research database

Most SEOs use both: GSC for accurate first-party data on their own site, and Semrush for competitive research and keyword discovery.

GSC provides exact, authoritative data from Google for your own site - every query you rank for, real click and impression counts. Ahrefs uses its own web crawler to estimate rankings and backlinks for any website, but the data is third-party estimates that may not perfectly reflect Google's index.

Key differences:

  • GSC: Free, exact data, your site only, direct from Google
  • Ahrefs: Paid, estimated data, any site, stronger for competitor backlink research and keyword difficulty scores

Use GSC as your source of truth for your own site's performance, and Ahrefs for broader competitive and link building research.

The Search Console API lets you programmatically access your GSC data. Setup steps:

  1. Create a project in Google Cloud Console
  2. Enable the Google Search Console API
  3. Create OAuth 2.0 credentials and authorize for your GSC account
  4. Use the API client libraries (Python, Node.js, etc.) to query data

Common use cases include:

  • Automating weekly performance data exports
  • Storing data beyond the 16-month limit
  • Fetching more than 1,000 rows per query
  • Building custom dashboards and alerts
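
As an illustration, here is a minimal query against the Search Analytics endpoint using the official Python client. The property URL and key file are placeholders, and service-account auth is just one of several options:

  # pip install google-api-python-client google-auth
  from google.oauth2 import service_account
  from googleapiclient.discovery import build

  creds = service_account.Credentials.from_service_account_file(
      "service-account.json",  # a key whose service account was added as a user
      scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
  )
  service = build("searchconsole", "v1", credentials=creds)

  response = service.searchanalytics().query(
      siteUrl="sc-domain:example.com",  # or "https://example.com/" for URL prefix
      body={
          "startDate": "2025-01-01",
          "endDate": "2025-01-31",
          "dimensions": ["query"],
          "rowLimit": 5000,  # far beyond the UI's 1,000-row export limit
      },
  ).execute()

  for row in response.get("rows", []):
      print(row["keys"][0], row["clicks"], row["impressions"], row["position"])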

Common Questions

General questions about using Google Search Console

Different reports update on different schedules:

  • Performance (clicks & impressions) - Updated daily, but with a 2–4 day delay. The most recent 2–3 days may be incomplete.
  • Index coverage / Pages report - Updates frequently, often within 24 hours of Googlebot crawling a page
  • Core Web Vitals - Updated weekly based on the Chrome User Experience Report (CrUX) dataset
  • Links report - Updates periodically, not in real time

There is no real-time data in Google Search Console. For the freshest performance data, look at dates from 3+ days ago.

Common reasons for missing or no data in Search Console:

  • New property - Data takes 3–7 days to appear after verification
  • Very low traffic - Sites with very few searches may not reach reporting thresholds
  • Date range issue - Check that your date range doesn't predate when the property was added
  • Active filter - A query or page filter may be hiding all results; clear all filters
  • Verification issue - Confirm the property is still verified in Settings
  • Looking at wrong property - Make sure you've selected the correct property from the property switcher

Search Console is a diagnostic and monitoring tool - it doesn't directly improve SEO, but it gives you the data to make improvements. By using it, you can:

  • Identify and fix indexing errors that prevent pages from ranking
  • Find keyword opportunities from queries you're already ranking for
  • Spot low-CTR pages where better titles could drive more traffic
  • Detect technical issues like mobile usability errors or Core Web Vitals problems
  • Monitor the impact of SEO changes over time

Think of it as a dashboard that tells you where to focus your SEO efforts. The improvements come from acting on the data.

Google Search Console is for websites only - it can't track a YouTube channel. YouTube has its own analytics platform - YouTube Studio - which provides data on video views, watch time, traffic sources, subscriber trends, and audience behavior.

If your website embeds YouTube videos or links to them, Search Console will track organic search traffic to those web pages, but it won't show data about the YouTube videos themselves or traffic from within YouTube.

Google Search Console stores Performance data (clicks and impressions) for 16 months. Older data is automatically deleted and cannot be recovered from Search Console. To preserve your data long-term:

  • Export regularly - Download monthly exports as CSV or Google Sheets
  • Use the Search Console API - Automate data collection into your own database or data warehouse
  • Connect to Looker Studio - Build a dashboard that retains historical snapshots
  • Use a third-party tool - Tools like GSC Wizard archive your GSC data automatically so you're never limited by the 16-month window

Get more from your Search Console data

GSC Wizard connects to your Google Search Console account to provide advanced analytics, traffic forecasting, and content optimization insights.