In simple terms, a backlink is a link or citation that a website receives from another site. Backlinks, also known as incoming or inbound links, play a crucial role in determining how well your webpages rank on search engines and where they show up when users make relevant search queries.
Google, Bing, DuckDuckGo, etc. assign a higher rank to websites that have a greater number of valuable backlinks.
Here we’re going to troubleshoot the reasons your backlinks are not showing up and solve the problem. But first, here’s a quick rundown on the possible reasons your backlinks are not showing up…
Backlinks not showing up? There are many reasons why your website's backlinks might not be showing up. For example, your website could be new, you could unknowingly be blocking search engines from indexing your website, or you could be violating the search engine's privacy and plagiarism rules. Let's investigate the reasons…
It is possible to fix this problem, but first we need to identify why your particular backlinks are not showing up. When we say backlinks, we effectively mean a page is not showing up: a backlink to that page (or post) has simply not been recorded or is not working.
Here are the main reasons and their possible solutions to why your backlinks (links to those pages or posts) are not appearing.
Your Website Is Possibly New
Google takes time to explore and discover new sites and web pages. If you launched your website just a couple of hours or even a week or more ago, then this is the most obvious explanation as to why search engines just haven’t indexed it yet.
So the easiest thing to check first is whether Google even knows that your website exists. Run a search for the site by typing site:yourwebsite.com into Google. You should see something like the screen below…
If you find even one result, then it’s at least some assurance that Google knows about your site. But if there are no results to be found, then Google doesn’t know about your site.
However, even if Google does know about your website, it may well not know about the specific pages you're trying to get ranked. To check whether it knows about those pages, search for the page's URL – not with the site: operator, but as an ordinary Google search…
For example, if I want to check our content audit guide on Google, I would search for "https://bloggingkarma.com/content-audit-guide/". If it appears in Google search results, then Google knows about it…
If you see no results for either of the above two searches, then the next thing to do is to create a sitemap and submit it through Google Search Console. Or check whether your site already has one auto-generated.
Designing and submitting a sitemap is beneficial for your website anyway. A sitemap informs Google about the pages on your website that are important and where to find them – after all, it’s a search engine friendly map!
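To make that concrete, here is what a minimal sitemap file can look like. The URLs and date below are placeholders – swap in your own pages before submitting it in Google Search Console.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to find -->
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourwebsite.com/content-audit-guide/</loc>
  </url>
</urlset>
```

This file usually lives at yourwebsite.com/sitemap.xml, and many CMS platforms (WordPress with an SEO plugin, for example) generate it for you automatically.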
Unknowingly Blocking Search Engines from Indexing Your Site
If you instruct Google not to show specific pages in its search results, then it simply won’t.
You can do so by using a "noindex" meta tag, a snippet of HTML code which looks like this:
<meta name="robots" content="noindex"/>
Pages with this code won't be indexed, even if you've created and submitted a sitemap in Google Search Console.
Web developers often use this meta tag to prevent Google from indexing webpages while they're still in development, and often forget to remove it before launching the site.
If search engines have already crawled the webpages in your sitemap, the coverage report in Google Search Console will tell you about the non-indexed ones.
If you've only recently submitted your sitemap to Google and the pages haven't yet been crawled, then run a crawl in Ahrefs Site Audit instead.
This will check each and every page on your site for a large number of potential SEO issues, including the presence of noindexed pages. Simply remove the noindex tag from the pages that you want to get indexed.
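If you'd rather spot-check a single page yourself, here is a minimal sketch using only Python's standard library that scans a page's HTML for a robots "noindex" meta tag. The sample HTML is invented for illustration – in practice you would feed in the source of the page you're checking.

```python
# Minimal sketch: detect a robots "noindex" meta tag in a page's HTML.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Sets .noindex to True if a <meta name="robots"> tag contains "noindex"."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots" and \
               "noindex" in (a.get("content") or "").lower():
                self.noindex = True

# Sample page source, made up for this example
sample = '<html><head><meta name="robots" content="noindex"/></head></html>'
finder = NoindexFinder()
finder.feed(sample)
print(finder.noindex)  # True -> this page asks search engines not to index it
```

If this prints True for a page you actually want ranked, removing that meta tag (and resubmitting the page in Search Console) is the fix.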
Blocking Search Engines From Gathering Page Data
Most websites have a robots.txt file. This file tells search engine bots where they can and cannot go on your website. You can stop bots from crawling parts of your site with a robots.txt file; see our guide on that here.
Search engines won’t be able to collect data from URLs that are blocked in your robots.txt file. If you have already submitted your sitemap via Google Search Console, it should inform you about such issues.
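For reference, a typical robots.txt file looks something like this. The rules below are placeholders, not taken from any real site – the key thing to check is that no Disallow line covers the pages you want indexed.

```txt
# Example robots.txt (placeholder rules)
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourwebsite.com/sitemap.xml
```

A stray line like "Disallow: /" here would block your entire site from being crawled.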
To fix this, open the coverage report in your Google Search Console and look for "Submitted URL blocked by robots.txt".
Keep in mind that this only works if Google's bots have already tried to crawl the URLs in your submitted sitemap. If you have only recently submitted it, then it's extremely likely that your URLs have not yet been crawled.
If you don't want to wait for the coverage report, you can also check manually. Simply go to yourdomain.com/robots.txt. If you receive a 404 error, that means you don't have a robots.txt file.
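You can also test whether a given set of robots.txt rules blocks a specific URL using Python's standard-library parser. This is a minimal sketch – the rules and URLs below are examples, not from a real site.

```python
# Minimal sketch: check whether robots.txt rules block a given URL.
from urllib.robotparser import RobotFileParser

# Example rules (normally you'd fetch these from yourdomain.com/robots.txt)
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
```

If can_fetch returns False for a page you want indexed, the matching Disallow rule is your problem.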
It’s important to remember though that robots.txt files can be pretty complicated, and it’s easy to mess them up. So, if you feel that your robots.txt file might be the reason your pages aren’t showing up on search engines, and you don’t have sufficient skill in manipulating this file, then it’s best to hire an expert to resolve this issue.
Your Website Has Duplicate Content Problems
If your website has content that is already present on a different URL, Google won’t index it. This is done to ensure that the same content doesn’t end up taking extra space in their index. Thus, it would normally only index the version that you have set as canonical.
If you haven’t set up any of the versions as canonical, then Google itself would select the ideal version of that page to index.
To identify duplicate content problems on your website, it's best to run a crawl with Ahrefs Site Audit and then open the "Content quality" report. Look for clusters of duplicate and near-duplicate webpages without canonicals. You can then fix these issues by either redirecting or canonicalizing the duplicate pages.
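Canonicalizing a duplicate page just means adding a tag like the one below to its <head> section, pointing at the preferred version. The URL here is a placeholder for your own page.

```html
<!-- On the duplicate page: tell search engines which version to index -->
<link rel="canonical" href="https://yourwebsite.com/original-page/"/>
```

A redirect is the stronger option when the duplicate page has no reason to exist at all; the canonical tag is better when both versions need to stay live (for example, printer-friendly or tracking-parameter URLs).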
You May Have Suffered a Google Penalty
This is one of the least likely reasons for your website not showing up on search engines; however, it is a possibility. There are two basic types of Google penalties, namely manual and algorithmic.
Fortunately, manual penalties are very rare. You’re very unlikely to get one of these unless you have done something seriously wrong. Google tends to inform you about them through the “Manual penalties” section in Search Console so you would probably know if this were the case.
On the other hand, identifying algorithmic penalties can be a pretty challenging task. If you suspect an algorithmic penalty because of a sudden drop in organic traffic, then the first step is to check whether that drop coincided with any of Google's algorithm updates.
Also keep in mind, that traffic drops can simply be because your top-performing pages, i.e. those bringing in the most traffic, were simply beaten in the SERPs by a better piece of content created by another site. Basically, your big-hitter article may have been relegated!
You can find a number of useful tools for this purpose online, which overlay known algorithm update dates on your Google Analytics traffic so that it becomes easier for you to identify the problems and rectify them.
We hope this has been useful in tracking down why your backlinks are not showing up. Let me know in the comments below if these steps fixed it, or if you encountered something else that I need to add to help others.
And, if you’d like some help starting your blog or affiliate marketing business, then check out my resource page. I have some great help to offer you and to get you started!