If Google’s crawler, Googlebot, runs into a problem when it tries to crawl a page on your site, it may simply give up and move on.
When that happens, the page is not indexed and never appears to searchers, which directly hurts your search performance.
Two of the statuses you may see are largely on Google’s side rather than yours. There is usually nothing for you to fix; start validation in Search Console and wait for Google to index the pages:
- Crawled – currently not indexed
- Discovered – currently not indexed
1. Server Error (5xx)
A server error (5xx) means the server hit an internal failure and could not fulfill the request, so Googlebot cannot crawl or index the affected pages. To fix this issue, website owners should first check their server logs for the failing requests, then confirm the server is correctly configured and has enough resources (memory, worker processes, database connections) to handle crawl traffic.
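As a quick first check from the outside, a short script can confirm whether a URL is currently returning a 5xx response. This is a minimal sketch using the third-party requests library; the URL is a placeholder for a page flagged in Search Console:

```python
# Minimal external check for 5xx responses (requires: pip install requests).
import requests

URL = "https://example.com/some-page"  # placeholder

response = requests.get(
    URL,
    # Identify roughly as Googlebot does, since some servers vary by user agent.
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    timeout=15,
)

if 500 <= response.status_code < 600:
    print(f"Server error: {URL} returned {response.status_code} {response.reason}")
else:
    print(f"OK: {URL} returned {response.status_code}")
```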
2. Page with redirect
A page with a redirect means the URL forwards to another URL, so Googlebot indexes the destination rather than the original page. This is normal if the redirect is intentional. To fix unintended cases, make sure each redirect points directly at the correct final URL and avoid long chains and loops, which Googlebot may stop following.
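To see exactly where a redirect leads, you can trace the full chain. A sketch under the same assumptions (requests installed, placeholder URL; the chain-length threshold is an arbitrary heuristic):

```python
# Trace a redirect chain and flag long chains (requires: pip install requests).
import requests

URL = "https://example.com/old-page"  # placeholder

response = requests.get(URL, timeout=15, allow_redirects=True)

# response.history holds each intermediate redirect response, in order.
for hop in response.history:
    print(f"{hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
print(f"Final: {response.status_code} {response.url}")

if len(response.history) > 3:  # heuristic threshold
    print("Warning: long redirect chain; Googlebot may stop following it.")
```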
3. Blocked by robots.txt
A page blocked by robots.txt means the site’s robots.txt file disallows crawling of that URL. Googlebot cannot fetch the page’s content, and while the bare URL can occasionally still be indexed, it will show no useful description in results. To fix this issue, review the robots.txt rules and remove any Disallow line that unintentionally covers pages you want crawled and indexed.
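You can test a rule locally with Python’s standard-library robotparser before editing the file. A minimal sketch; both URLs are placeholders:

```python
# Check whether Googlebot is allowed to fetch a URL per robots.txt (stdlib only).
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()

url = "https://example.com/some-page"  # placeholder
if parser.can_fetch("Googlebot", url):
    print(f"Allowed: Googlebot may crawl {url}")
else:
    print(f"Blocked: robots.txt disallows Googlebot from {url}")
```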
4. Excluded by ‘noindex’ tag
A page excluded by the ‘noindex’ tag has been deliberately marked to stay out of Google’s search results: Googlebot can still crawl the page, but it will not index it. To fix this issue, remove the ‘noindex’ directive from pages that should be indexed, or confirm it has not been applied by mistake (for example by a plugin or a site-wide template).
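The directive can arrive two ways: a meta tag in the HTML or an X-Robots-Tag HTTP header. A rough sketch that checks both (requests installed; placeholder URL; the regex is a heuristic, not a full HTML parse):

```python
# Detect noindex via meta tag or X-Robots-Tag header (requires: pip install requests).
import re
import requests

URL = "https://example.com/some-page"  # placeholder

response = requests.get(URL, timeout=15)

# The directive can be sent as an HTTP header...
header = response.headers.get("X-Robots-Tag", "")
if "noindex" in header.lower():
    print(f"X-Robots-Tag header contains noindex: {header!r}")

# ...or as a <meta name="robots"> tag; a regex is a rough heuristic here.
meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    response.text,
    re.IGNORECASE,
)
if meta and "noindex" in meta.group(1).lower():
    print(f"Meta robots tag contains noindex: {meta.group(1)!r}")
```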
5. Soft 404
A soft 404 occurs when a page returns a 200 status code but its content looks like an error page or is essentially empty, so Google declines to index it. To fix this issue, return a real 404 (or 410) for pages that are genuinely gone, redirect moved pages to their replacements, and add substantive content to thin pages that should be indexed.
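Soft 404s are judged by Google’s own classifiers, but a simple heuristic can flag likely candidates: pages that return 200 yet contain error wording or almost no content. A sketch, where the phrases and length threshold are assumptions you should tune for your site:

```python
# Heuristic soft-404 detector (requires: pip install requests).
import requests

URL = "https://example.com/maybe-missing"  # placeholder

ERROR_PHRASES = ("page not found", "nothing was found", "no longer available")  # assumed
MIN_LENGTH = 500  # assumed threshold for "thin" content, in characters

response = requests.get(URL, timeout=15)
body = response.text.lower()

if response.status_code == 200:
    if any(phrase in body for phrase in ERROR_PHRASES) or len(body) < MIN_LENGTH:
        print(f"Possible soft 404: {URL} returns 200 but looks like an error page")
    else:
        print(f"Looks fine: {URL} returns 200 with substantial content")
else:
    print(f"{URL} returned a real {response.status_code}, not a soft 404")
```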
6. Unauthorized request (401)
An unauthorized request (401) occurs when the page demands authentication credentials that the visitor does not supply. Googlebot never logs in or submits credentials, so it cannot crawl or index pages behind an auth wall. To fix this issue, remove the authentication requirement from pages that should appear in search, or accept that protected pages will stay out of the index.
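Since Googlebot crawls anonymously, the practical check is simply whether the URL answers 401 to an unauthenticated request. A minimal sketch (requests installed; placeholder URL):

```python
# Confirm whether a page demands authentication (requires: pip install requests).
import requests

URL = "https://example.com/members-only"  # placeholder

response = requests.get(URL, timeout=15)  # anonymous, as Googlebot would be

if response.status_code == 401:
    # Googlebot cannot log in, so this page cannot be crawled or indexed as-is.
    print(f"{URL} requires authentication; remove it or accept non-indexing")
else:
    print(f"{URL} returned {response.status_code}; no auth wall detected")
```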
7. Blocked due to access forbidden (403)
A page blocked due to access forbidden (403) occurs when the server refuses the request outright. For Googlebot this is often caused by a firewall, CDN, or security plugin rule that blocks crawler user agents or IP ranges. To fix this issue, confirm the page is meant to be public and adjust any rule that denies Googlebot access.
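Comparing a browser user agent with a Googlebot one can expose user-agent-based blocking, one common cause of crawler-only 403s. A sketch; the URL and UA strings are placeholders, and note that some servers verify real Googlebot by IP, so this is only a heuristic:

```python
# Compare responses under a browser UA vs a Googlebot UA (requires: pip install requests).
import requests

URL = "https://example.com/some-page"  # placeholder

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    status = requests.get(URL, headers={"User-Agent": ua}, timeout=15).status_code
    print(f"{name}: {status}")
# If the browser UA gets 200 while the Googlebot UA gets 403, a firewall,
# CDN, or security plugin is likely blocking crawlers by user agent.
```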
8. Not Found (404)
A page not found (404) occurs when the requested URL no longer exists on the server, so Googlebot has nothing to crawl or index there. To fix this issue, redirect outdated URLs to their current equivalents and update internal links that still point at removed pages; URLs that are intentionally gone can safely keep returning 404.
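Once you know which old URLs 404, the usual fix is a permanent (301) redirect to the replacement page. How you declare that depends on your stack; as one illustration, if the site happened to run on Flask, a redirect route might look like this (both paths are placeholders):

```python
# Illustrative 301 redirect for a moved page, assuming a Flask-based site.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")  # placeholder: the URL that currently returns 404
def old_page():
    # 301 tells Googlebot the move is permanent, so it updates its index.
    return redirect("/new-page", code=301)

if __name__ == "__main__":
    app.run()
```

On nginx or Apache the equivalent is a rewrite or Redirect directive; the important part is the 301 status code, not the tool.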
9. Crawl Issue
A crawl issue is a catch-all status: Googlebot could not crawl the page for a technical reason that does not fit the more specific categories above. This can hurt visibility in search results and reduce traffic. To fix this issue, inspect the affected URLs in Google Search Console to see what Googlebot encountered, then address the underlying cause, which may mean fixing broken links, improving server response times, or cleaning up the site’s structure and internal linking.
10. Alternate page with proper canonical tag
An alternate page with a proper canonical tag is reported when a site has several URLs with similar content and one of them points at another via a canonical tag. This is usually not an error: Google is indexing the canonical version instead of the duplicate. To resolve genuine problems, check that the canonical tag on each duplicate points at the page you actually want indexed.
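To verify a page declares the canonical you intend, you can fetch it and look for the rel="canonical" link. A sketch (requests installed; placeholder URL; the regexes are a rough heuristic rather than a full HTML parse):

```python
# Report the declared canonical URL of a page (requires: pip install requests).
import re
import requests

URL = "https://example.com/product?color=red"  # placeholder duplicate variant

response = requests.get(URL, timeout=15)

# Find the <link ... rel="canonical" ...> tag, then pull out its href.
tag = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', response.text, re.IGNORECASE)
if tag:
    href = re.search(r'href=["\']([^"\']+)["\']', tag.group(0), re.IGNORECASE)
    print(f"{URL} declares canonical: {href.group(1) if href else '(no href found)'}")
else:
    print(f"No canonical link found on {URL}")
```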
11. Blocked due to other 4xx issue
A page blocked due to other 4xx issues occurs when the page encounters a 4xx error other than those listed above. To fix this issue, website owners should identify the specific error and take appropriate actions to resolve it. This may include checking server logs, fixing broken links, and ensuring proper website configuration.
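When several pages report 4xx statuses, it helps to audit them in one pass and group by status code before deciding on fixes. A sketch; the URL list is a placeholder for whatever Search Console flags:

```python
# Audit a list of URLs and group them by HTTP status (requires: pip install requests).
from collections import defaultdict
import requests

URLS = [  # placeholders: paste the URLs flagged in Search Console
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",
]

by_status = defaultdict(list)
for url in URLS:
    try:
        by_status[requests.get(url, timeout=15).status_code].append(url)
    except requests.RequestException as exc:
        by_status[f"error: {exc.__class__.__name__}"].append(url)

for status, urls in sorted(by_status.items(), key=lambda kv: str(kv[0])):
    print(f"{status}: {len(urls)} URL(s)")
    for url in urls:
        print(f"  {url}")
```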
Bottom Line:
Indexing issues reduce a website’s visibility in Google search results and cost it traffic. The good news is that most of them are diagnosable: work through the statuses above in Google Search Console, apply the fixes described for each, and request validation so Google re-checks the affected pages.
FAQs
What is Google Search Console?
Google Search Console is a free tool provided by Google that helps website owners monitor and maintain their website’s presence in Google search results. It provides important information such as website traffic, search analytics, and indexing status, and helps website owners identify and fix issues that may affect their website’s performance in search results.
How can indexing issues impact my website’s performance?
Indexing issues can prevent search engines like Google from properly crawling and indexing your website’s pages, which can result in poor visibility in search results and lower website traffic. By identifying and fixing indexing issues, you can improve your website’s performance and search engine rankings.
What are some common indexing issues in Google Search Console?
Some common indexing issues in Google Search Console include server error (5xx), page with redirect, blocked by robots.txt, excluded by ‘noindex’ tag, soft 404, unauthorized request (401), not found (404), crawl issue, alternate page with proper canonical tag, and blocked due to other 4xx issue.
How can I fix server error (5xx) issues in Google Search Console?
To fix server error (5xx) issues in Google Search Console, website owners should first identify the specific error and its cause by checking the server logs for the failing requests. Common culprits include resource exhaustion, misconfigured server software, and faulty plugins or scripts. After deploying a fix, monitor the affected pages in Search Console and request validation to confirm the errors have stopped.
How can I prevent crawl issues in Google Search Console?
To prevent crawl issues in Google Search Console, website owners should regularly monitor their website’s performance and search engine rankings, and fix any issues that may arise. This may include fixing broken links, optimizing website speed and performance, and ensuring proper website structure and organization. Website owners should also submit sitemaps and use a robots.txt file to help Googlebot crawl and index their website’s pages properly.
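Submitting a sitemap is one of the simplest preventive steps, and sitemaps are plain XML that is easy to generate. A minimal standard-library sketch that writes one for a handful of placeholder URLs:

```python
# Generate a minimal sitemap.xml (stdlib only); the URLs are placeholders.
import xml.etree.ElementTree as ET

URLS = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in URLS:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("Wrote sitemap.xml; submit it under Sitemaps in Search Console.")
```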