How To Use The Google Search Console URL Removal Tool



Removing content from your site is rarely a good idea, but sometimes you have no choice. For example, you may have published something illegal or something that should have been kept confidential.

In this post, I will show you how to use the Google Search Console removal tool so that a page is no longer indexed, which means it will no longer appear in search results.

Prerequisites for Removal

Google wants webmasters to take accountability for page removals, so the removal request tool only applies if you meet specific requirements. You can only request removal if you have already modified or removed the page.

You must also set up your site with Google Search Console (GSC); the tool will not work unless your site is verified in GSC.

1. Robots.txt

First, check whether your robots.txt file is disallowing search engines from crawling the URL. Google has a free robots.txt tester that lets you check whether the URL is blocking Google's web crawlers from reading your pages.

To block robots from crawling your page, you have to go to your website’s control panel and look for the robots.txt file.

Open that file, and then add a directive. Here is an example:

User-agent: Googlebot

Disallow: /nogooglebot/

This directive tells Googlebot not to crawl any URL whose path starts with /nogooglebot/. You can do this per page, or you can do it for your entire site.
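For illustration, here is a minimal sketch of both cases; /private-page.html is a placeholder path, not a real page from this guide:

User-agent: *
Disallow: /private-page.html

That rule blocks a single page for every crawler. To block your entire site instead, disallow the root path:

User-agent: *
Disallow: /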

If you are using a hosted platform like Wix, you may not be able to edit this file directly. You can contact the support team and tell them what you want to happen, and they will make the change for you.

If your site does not have one yet, you can create a robots.txt file yourself. You only need Notepad or any plain text editor; do not use a word processor.

Name the file robots.txt. Keep in mind that your site must have only one robots.txt file. Type the directives you want into the file, then save it.

Next, upload it to the root folder of your website. Do not put it in a sub-directory. If you do not know how to do this, contact your site admin.

2. Noindex Meta Tag

The other thing you must do is use the Fetch as Googlebot tool to check your meta tags. You want to confirm that the URL is a noindex page.

You can prevent a page from appearing in Google Search by adding a noindex directive to it. When Googlebot crawls the page, it will see the directive and drop the page from search results, even if other pages link to it. You can implement noindex as a meta tag or as an HTTP response header.

For the meta tag, access the page's HTML code and look for the <head> section. In that section, add the tag <meta name="robots" content="noindex">. This tells Google not to index the page.
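As a minimal sketch, this is where the tag sits in a page; the title text is just a placeholder:

<!DOCTYPE html>
<html>
<head>
<meta name="robots" content="noindex">
<title>Page you want removed from search</title>
</head>
<body>
<p>Page content here</p>
</body>
</html>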

The other option is an HTTP response header, which is more complicated to use. However, it works for non-HTML content such as PDFs, images, and video files. To do this, your server needs to return an X-Robots-Tag header with the value noindex.
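For example, the response for a PDF would include a header like this:

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex

One way to add that header on an Apache server, assuming the mod_headers module is enabled and using a placeholder file name, is:

<Files "confidential-report.pdf">
Header set X-Robots-Tag "noindex"
</Files>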

If this is too complicated, consult a webmaster or ask your web hosting provider for help. Once everything is in place, use the Fetch as Googlebot tool again to confirm that the noindex directive is being served.

3. 404 or 410 Status

Check if your page is returning a 404 or 410 status code. Error 404 means that the page is no longer active or accessible. You can use tools like Fetch as Googlebot or Live HTTP Headers.
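If you have command-line access, a quick way to check the status code is curl; the URL below is a placeholder for your own page:

curl -I https://example.com/removed-page

The -I flag fetches only the response headers, so the first line of the output shows the status, for example HTTP/1.1 404 Not Found or HTTP/1.1 410 Gone.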

A 404 happens when you delete a page; the downside is that visitors will see an error, which is bad for user experience and SEO. A 410, on the other hand, tells crawlers that the page has been deleted permanently.

The two status codes behave similarly for visitors. If you go this route, make sure you point users to another page that tells them the content is gone.
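If you want to return a 410 rather than a 404, one way to do it on an Apache server (assuming the mod_alias module; the path is a placeholder) is a single line in your configuration or .htaccess file:

Redirect gone /removed-page.html

Requests for that path will then receive the 410 Gone status instead of a generic 404.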

Steps to Use the Google Search Console URL Removal Tool

If the URL meets the conditions above, you can proceed and request removal. To do this, go to http://www.google.com/webmasters/tools/removals. It looks like this:

The next step is to click New Request and type the URL. You also need to choose whether the URL is an image or a page. It looks like this:

An important note: you must type the URL of the page itself, not the URL of the Google search result. If you type the wrong URL, Google will not process the request.

You also have the option to remove an entire site or directory. The process is similar to what we discussed earlier: disallow search engines from crawling it in the robots.txt file. Once this is complete, you can request removal.
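For instance, to block a whole directory before requesting its removal, a sketch like this would go in robots.txt; /old-section/ is a placeholder directory name:

User-agent: *
Disallow: /old-section/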

Reincluding Content

What if you submitted a request in error, or changed your mind? You can cancel the request at any time, and you can also cancel requests that other people have made.

Of course, you must be a verified owner of the site. Log in to Google Search Console to request cancellation, then go to Site Configuration.

From there, choose Crawler Access, then Remove URL. You will see a list called Removed URLs; click Cancel next to any request you want to withdraw.

Summary: How To Use The Google Search Console URL Removal Tool

If possible, publish only content that has value; it saves you the hassle of going through all these steps. There is no evidence that page removal requests hurt SEO, but they do cost you time, since removal requests can take six months. Often, the better course of action is to replace the page's content or use a redirect.

John Kilmerstone

I'm an Aussie living in Japan who enjoys traveling, photography, and blogging. Please visit this website and explore the wonderful world of blogging. Discover how to turn your passions and pastimes into an online business.
