Find out about some behind-the-scenes SEO methods for requesting and managing the indexing of content, to increase a site's search engine visibility.
Published by Hannah
"Indexing" is what your local library would have a whole system built for its service. Imagine you have a book full of all your favourite stories. Now, say you have hundreds of favourites and you want to find a specific story quickly. You would need some help, right? That's where a 'table of contents' comes in handy.
Now you can begin to imagine how much more sophisticated and complex search engines are in comparison. While a search engine like Google may seem self-sufficient, it actually takes external input into account. You can tell Google to prioritise scheduling a 'crawler' (also called a 'robot' or 'spider') as soon as you publish a new page, make edits or remove a page altogether. Read on for technical tips on this search engine optimisation (SEO) concept.
If you're a brand new site owner, start by claiming your website domain via Google Search Console or Bing Webmaster Tools (whichever applies to you).
There are usually multiple ways to verify your ownership. Read each provider's documentation for step-by-step guidance, or use a quick-connect option, e.g. logging into your Cloudflare account for DNS verification.
Once you've verified your website "property", you can manually submit your XML sitemap in the Search Console/Webmaster Tools 'Sitemaps' settings (see example instructions for dynamic XML sitemaps for NextJS/NodeJS here).
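If your site runs on Next.js, a minimal sketch of such a dynamic sitemap could use the app/sitemap.ts convention, which generates the XML for you. The domain and the getAllPostSlugs() helper below are hypothetical placeholders for however your own site stores its content.

```ts
// app/sitemap.ts -- a minimal dynamic sitemap sketch for the Next.js App Router.
// Assumption: getAllPostSlugs() is a stand-in for your own data source (CMS, DB, filesystem).
import type { MetadataRoute } from 'next';
import { getAllPostSlugs } from '@/lib/posts'; // hypothetical helper

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const baseUrl = 'https://www.example.com'; // swap in your own domain
  const slugs: string[] = await getAllPostSlugs();

  return [
    { url: baseUrl, lastModified: new Date(), changeFrequency: 'weekly', priority: 1 },
    ...slugs.map((slug) => ({
      url: `${baseUrl}/blog/${slug}`,
      lastModified: new Date(),
      changeFrequency: 'monthly' as const,
      priority: 0.7,
    })),
  ];
}
```

Next.js then serves the result at /sitemap.xml, which is the URL you submit in the 'Sitemaps' settings.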
Search engines will regularly check your robots.txt and XML sitemap files to keep their indexing up to date. However, you can always revisit the "Sitemaps" console tool to check when your sitemap was last 'Crawled' and how many URLs were discovered. This is useful for catching issues and for resubmitting your XML sitemap whenever there's an update you want crawled sooner.
NB: resubmitting your XML sitemap doesn't guarantee prioritisation of your pages on SERPs, so don't try resubmitting multiple times in a day.
Directly from the Search/Webmaster Console, you can check the last time the search engine crawled and indexed your page by URL.
In Search Console, you can drop the URL into the 'Inspect' bar at the top, or click 'URL Inspection' in the side navigation panel. Similarly, in Bing Webmaster Tools you'll see 'URL Inspection' in the side navigation panel.
Maybe you spotted a private admin page or an outdated PR page on SERPs that you need to hide from the public asap. Or you thought you had deleted a page recently, but it's still showing up in Search. Search engines tend to cache pages, so it could be days or weeks before 'the crawlers' decide to remove it from SERPs and their index. The upside of those long waiting periods for 'crawling' and 'indexing' is that a publisher who has a hosting outage, or accidentally deletes a valuable page and intends to reinstate it, won't lose its listing straight away.
Now, if you need a specific URL removed from SERPs, go to 'Removals' in Search Console and follow the steps to remove a URL or URL prefix from SERP caching and indexing. In Bing Webmaster Tools it's under Configuration > Block URLs.
Finally, a game changer: IndexNow, a single API that lets you ping multiple search engines in real time so they can fetch your changed URLs.
You can boost your organic search visibility via IndexNow on the participating search engines, which include Microsoft Bing, Naver, Seznam.cz, Yandex and Yep.
First, check whether your domain provider/CDN or CMS already has an IndexNow integration or a third-party plugin. Common ones that do support it include Cloudflare, WordPress, Shopify and Drupal. For example, you can switch on the IndexNow integration from your Cloudflare account by going to Websites > (your-domain) > Caching > Configuration > Crawler Hints.
If yours doesn't, you can create a custom setup:
Start by generating a unique API key for your (sub)domain. Click the download button, then copy the .txt file into the root folder of your site so it's publicly reachable (e.g. https://your-domain/your-key.txt).
Next, set up a 'transformer' function to dynamically collect the key landing URLs of your site, list them in a "urlList" and submit them to the API. Follow the documentation for alternative ways to verify ownership of your domain and for details on the submission methods.
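As a rough sketch of that custom setup, a helper like the one below gathers the changed URLs and POSTs them to the IndexNow endpoint as JSON. The host value, the environment variable holding the key and the assumption that the key file sits at the site root are all illustrative; adapt them to your own project.

```ts
// submitToIndexNow.ts -- minimal sketch of a JSON submission to IndexNow (Node 18+ for global fetch).
const INDEXNOW_ENDPOINT = 'https://api.indexnow.org/indexnow';

export async function submitToIndexNow(urlList: string[]): Promise<void> {
  const host = 'www.example.com';        // assumption: your (sub)domain
  const key = process.env.INDEXNOW_KEY!; // assumption: API key kept in an env var

  const res = await fetch(INDEXNOW_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json; charset=utf-8' },
    body: JSON.stringify({
      host,
      key,
      keyLocation: `https://${host}/${key}.txt`, // where the verification .txt file is hosted
      urlList,                                   // the changed URLs you want picked up
    }),
  });

  // A 200 or 202 response means the submission was accepted.
  if (!res.ok) {
    throw new Error(`IndexNow submission failed: ${res.status} ${res.statusText}`);
  }
}
```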
Tip: another quick way to submit a URL is to do so in your browser. You'll still need an API key to verify ownership of the domain and append it to the request URL. The template is:
https://api.indexnow.org/indexnow?url=url-changed&key=your-key
For example: https://api.indexnow.org/indexnow?url=https://www.hgdrconsulting.co.uk/blog/tips-to-getting-your-site-pages-indexed-quickly&key=a5772ec89d504d5a8fba0b2204736f67
You might now be wondering whether there is a Google-provided API. There is; however, the API may only be used for certain pages containing Job Postings or Livestream (Broadcast) videos.
You can find the "Web Search Indexing API" in the Google Cloud API Library.
You'll need to have a Google Cloud account and to set up a project.
From inside your project, navigate to the "APIs & Services" dashboard, then click "+ ENABLE APIS AND SERVICES". Use the search bar to find the "Web Search Indexing API" and enable it; it may take a little while to complete.
Next, in the navigation panel, go to IAM & Admin > Service Accounts, create a service account and follow the instructions to 'Create Key' (JSON format is recommended by Google).
Tip: remember to save the generated public/private key pair in a safe folder on your machine, because it's the only copy you'll get.
Then, open your unique private key file, locate the "client_email" value and copy it to your clipboard. It will look something like: "my-service-account@project-name.google.com.iam.gserviceaccount.com".
Now, go into your Google Search Console property settings. From there, click 'Users and permissions' and then 'Add User'. Paste in the unique "client_email" and grant it Owner permission (the Indexing API requires the service account to be an owner of the property).
Back in your Google Cloud Console, navigate to APIs & Services > OAuth consent screen. Follow the instructions there to create an 'Internal' user type (or 'External' if your circumstances differ; check the documentation) and download the JSON file it generates.
With that JSON file, you can now generate an OAuth access token. Check Google's example templates for how to request an access token.
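As one possible route (an assumption rather than the only approach), the google-auth-library package for Node can exchange the service account's JSON key for a short-lived access token with the Indexing API scope. The key file path below is a placeholder.

```ts
// getIndexingToken.ts -- sketch: trading a service-account JSON key for an OAuth access token.
// Requires: npm i google-auth-library
import { JWT } from 'google-auth-library';
import { readFileSync } from 'node:fs';

export async function getIndexingToken(): Promise<string> {
  // Assumption: the downloaded key file lives next to this script.
  const keyFile = JSON.parse(readFileSync('./service-account.json', 'utf8'));

  const client = new JWT({
    email: keyFile.client_email,   // the value you added in Search Console
    key: keyFile.private_key,
    scopes: ['https://www.googleapis.com/auth/indexing'], // Indexing API scope
  });

  const { access_token } = await client.authorize(); // short-lived bearer token
  if (!access_token) throw new Error('No access token returned');
  return access_token;
}
```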
Finally, you'll be able to use the API to tell Google whenever specific URLs are updated on or removed from your site, or to send batch indexing requests.
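Building on the token sketch above, a single update or removal notification could look like the following; the notifyGoogle name and error handling are illustrative, while the endpoint and body follow Google's documented urlNotifications:publish format.

```ts
// notifyGoogle.ts -- sketch: telling the Indexing API that a URL was updated or deleted.
export async function notifyGoogle(accessToken: string, url: string, removed = false) {
  const res = await fetch('https://indexing.googleapis.com/v3/urlNotifications:publish', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      url,                                           // must be a page type the API accepts
      type: removed ? 'URL_DELETED' : 'URL_UPDATED', // which kind of change you're reporting
    }),
  });

  if (!res.ok) {
    throw new Error(`Indexing API error: ${res.status} ${await res.text()}`);
  }
  return res.json(); // echoes back the notification metadata
}
```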
There are other technical SEO methods for layering static directives to increase the chances that search engine crawlers respect which pages you prefer to be "indexed" and/or "followed". Furthermore, it's possible to "disallow" AI bots from crawling your content. These can be implemented during a technical SEO site audit.
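As one example of such directives, if your site is built with Next.js you can generate robots.txt from an app/robots.ts file; the admin path, AI bot name and sitemap URL below are placeholders you'd tailor during an audit.

```ts
// app/robots.ts -- sketch: generating robots.txt directives with the Next.js App Router.
import type { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      { userAgent: '*', allow: '/', disallow: '/admin/' }, // keep private areas out of crawling
      { userAgent: 'GPTBot', disallow: '/' },              // example: disallow one AI crawler
    ],
    sitemap: 'https://www.example.com/sitemap.xml',
  };
}
```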