How to Fix Crawled - Currently Not Indexed in Google Search Console


Are you struggling to get your content indexed by Google? Are you frustrated that the pages or posts you thought should be showing up in the search results are not appearing? If so, then this blog post is for you! We'll look at how to fix "Crawled - currently not indexed" errors in Google Search Console, and how to optimize your content for better visibility.

Identify Reasons for Not Indexing

Identifying why Google is not indexing your pages is the first step toward fixing the “Crawled - currently not indexed” issue. There are several factors to consider. First, make sure your page is not blocked by robots.txt, which prevents Google from crawling it. Next, check whether the page contains duplicate content, since Google may filter duplicate pages out of its index. Also ensure that the page’s load time is not too slow, as this can adversely affect crawling and indexing. Finally, verify that your site’s server is healthy, since any downtime can also prevent Google from indexing a given page.

Check if Pages are Blocked by Robots.txt

One of the first things to check when trying to fix the “Crawled - currently not indexed” issue is whether your pages are blocked by robots.txt. This is a plain-text file at the root of your domain (e.g., yourdomain.com/robots.txt) that tells search engine crawlers which pages they may fetch; a blocked page cannot be crawled, which keeps its content out of the index. To check a specific page, paste its URL into the URL Inspection tool at the top of Google Search Console; the report will tell you if the page is blocked by robots.txt. If it is, edit your robots.txt file to remove the rule blocking it so the page can be crawled and indexed.
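If you’d rather check from the command line, here is a minimal sketch using Python’s standard urllib.robotparser module; the domain and URL are placeholders you would swap for your own.

```python
# Minimal sketch: test whether Googlebot may fetch a URL under your
# robots.txt. example.com is a placeholder; use your own domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt file

url = "https://example.com/blog/my-post/"
if rp.can_fetch("Googlebot", url):
    print(f"Allowed: {url}")
else:
    print(f"Blocked by robots.txt: {url}")
```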

Check for Duplicate Content

Duplicate content can be a major obstacle to getting your website indexed. Duplicate content could be anything from two URLs pointing to the same page to content copied from another website onto your own. Google generally filters duplicates out of its index rather than indexing every copy, so a duplicated page may never be indexed until the issue is resolved. To check for duplicate content, use a tool such as Copyscape or Siteliner to scan your website for copied text. Once you’ve identified any issues, remove or rewrite the offending content, or consolidate duplicate URLs (for example, with a canonical tag) so each piece of content lives at a single address.
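As a rough first pass, you can compare two URLs yourself before reaching for a dedicated tool. The sketch below (URLs are placeholders) flags pages whose HTML is byte-for-byte identical; real duplicate checkers do much fuzzier matching than this.

```python
# Fetch two URLs and compare a hash of the raw HTML. This only catches
# exact duplicates; tools like Siteliner detect near-duplicates too.
import hashlib
from urllib.request import urlopen

def page_fingerprint(url: str) -> str:
    html = urlopen(url).read()
    return hashlib.sha256(html).hexdigest()

a = page_fingerprint("https://example.com/page")
b = page_fingerprint("https://example.com/page?ref=home")
print("Exact duplicates" if a == b else "Pages differ")
```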


Check Your Page’s Load Time

One of the most common reasons for Google not indexing a page is slow loading times. A slow page hurts user experience and can lead Google to deprioritize crawling and indexing it. To check your page’s loading time, use tools such as Google PageSpeed Insights or GTmetrix; we also have a list of free website speed test tools you can browse.
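For a quick spot check from your own machine, a few lines of Python will time a raw page fetch (the URL is a placeholder). Keep in mind this measures only the network round trip and download, not rendering or Core Web Vitals, which is what PageSpeed Insights reports.

```python
# Time a single page fetch. This captures network + download time only,
# not browser rendering, so treat it as a rough signal.
import time
from urllib.request import urlopen

url = "https://example.com/"  # placeholder: your page here
start = time.perf_counter()
body = urlopen(url).read()
elapsed = time.perf_counter() - start
print(f"Fetched {len(body):,} bytes in {elapsed:.2f}s")
```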

To improve your page’s loading time, try optimizing your images to reduce their file size and minifying your HTML, CSS, and JavaScript. If you’re using WordPress, you can install a caching plugin like W3 Total Cache or WP Rocket. You may also want to consider using a content delivery network (CDN) to serve your content more quickly.

Check our How to Speed Up Your WordPress Website post for additional help!

Check Your Site’s Server Status

The server status of your site is an important factor to consider when trying to fix the “Crawled - currently not indexed” issue in Google Search Console. If your server is down, Google will not be able to access your site and index its pages.

To check your server status, you can use a variety of online tools such as Down For Everyone Or Just Me and Pingdom. These tools will give you information about whether your server is up or down and how fast it is responding to requests. If the server is down, contact your web hosting provider for help in getting it back up and running.
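You can also probe your server directly. The sketch below (placeholder URL) sends a single request and reports the HTTP status; a 5xx response or a timeout points to server trouble. Note that some servers reject HEAD requests, so you may need to fall back to a normal GET.

```python
# One-shot uptime probe: report the HTTP status for a URL.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

url = "https://example.com/"  # placeholder: your site here
try:
    resp = urlopen(Request(url, method="HEAD"), timeout=10)
    print(f"{url} responded with HTTP {resp.status}")
except HTTPError as e:
    print(f"{url} returned HTTP {e.code}")  # 5xx = server-side problem
except URLError as e:
    print(f"Could not reach {url}: {e.reason}")
```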

Fix Crawl Errors in Google Search Console

To fix crawl errors in Google Search Console, first identify any potential problems. You can do this with the Page indexing report (formerly the Coverage report), which lists URLs that Googlebot has been unable to access or index, grouped by the reason each one was excluded.

Once you have identified the issue, you can start to take action to resolve it. Common causes of crawl errors include server errors, DNS problems, robots.txt blocks, and soft 404 errors. If the issue is related to a server error, you will need to contact your web hosting provider and ask them to investigate. If it is a DNS problem, you may need to reset your DNS settings or contact your domain registrar for help.
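For the DNS case, you can confirm whether your domain resolves at all before contacting anyone. Here is a minimal Python check (the domain is a placeholder):

```python
# Verify DNS resolution for a domain; a failure here would explain
# DNS-related crawl errors in Search Console.
import socket

domain = "example.com"  # placeholder: your domain here
try:
    print(f"{domain} resolves to {socket.gethostbyname(domain)}")
except socket.gaierror as e:
    print(f"DNS lookup failed for {domain}: {e}")
```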

If your crawl error is due to a robots.txt block, you will need to edit your robots.txt file in order to allow Googlebot access to the page in question. If the issue is related to a soft 404 error, you will need to review your page's content and make sure that it is relevant and useful for users.

For additional help, we’ve outlined the steps in our How to Check for Broken Links post.

Once you have fixed the underlying issue causing the crawl error, ask Google to recrawl the page using the URL Inspection tool: paste the page’s URL into the inspection bar at the top of Search Console and click ‘Request Indexing’. This tells Googlebot to visit the page again and re-index it if there are no other issues preventing it from being indexed.

Fix Soft 404 Errors

Fixing soft 404 errors is essential for resolving the “Crawled - currently not indexed” issue. A soft 404 means that a page returns a 200 (OK) status code but looks like an error page to Google: it may be empty, contain almost no content, or say “not found” without returning a real 404. Common causes include incorrect redirects, empty pages, and thin or placeholder content.

To fix soft 404 errors, you should first check your redirects to make sure they are working properly. You can do this by using an online tool such as Redirect Checker or Redirect Mapper.
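If you want to trace a redirect chain yourself, the sketch below prints every hop a URL goes through before settling on a final address (the URL is a placeholder), which is roughly what those online checkers do.

```python
# Follow a URL's redirect chain, printing each hop and the final target.
import urllib.request

class HopLogger(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        print(f"HTTP {code} -> {newurl}")  # log each redirect hop
        return super().redirect_request(req, fp, code, msg, headers, newurl)

opener = urllib.request.build_opener(HopLogger)
resp = opener.open("https://example.com/old-page")  # placeholder URL
print(f"Final URL: {resp.geturl()} (HTTP {resp.status})")
```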

Next, check your pages for empty pages or pages with little real content. If you find any, either add meaningful content or remove the page entirely and let it return a proper 404 status.
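To sweep for thin pages, you can measure how much visible text a page actually contains. The following sketch uses only the standard library; the URL is a placeholder, and the 200-character threshold is an arbitrary assumption you should tune for your site.

```python
# Rough soft-404 detector: count the visible text on a page,
# ignoring <script> and <style> contents.
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks, self.skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

url = "https://example.com/maybe-empty"  # placeholder URL
parser = TextExtractor()
parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
visible = " ".join(parser.chunks)
# 200 characters is an arbitrary cutoff; adjust it for your content.
status = "possible soft 404" if len(visible) < 200 else "looks fine"
print(f"{url}: {len(visible)} chars of visible text ({status})")
```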

Once you’ve fixed the soft 404 errors on your site, request indexing for the affected pages via the URL Inspection tool in Google Search Console so they can be recrawled.

Optimize Your Meta Tags and Content

Optimizing your meta tags and content is an important step in making sure your page is indexed by Google. Meta tags are snippets of HTML that give search engine crawlers information about the page, most importantly the page title and the meta description. (Google ignores the old keywords meta tag, so don’t rely on it.) Make sure your title and description accurately reflect the content on the page and are neither too long nor too short.

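For illustration, here is what a typical title and description pair looks like in a page’s HTML head (the text is a made-up example, not something to copy verbatim):

```html
<!-- Illustrative example only: swap in your own page's title and summary -->
<head>
  <title>How to Fix Crawled - Currently Not Indexed | Example Site</title>
  <meta name="description"
        content="Step-by-step fixes for the 'Crawled - currently not indexed' status in Google Search Console.">
</head>
```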

You should also optimize your content for SEO: include relevant keywords in the text and titles, and use clear formatting such as headings, bullet points, and images. Make sure the content is relevant to the topic of the page and is interesting and engaging for readers.

By optimizing your meta tags and content, you help ensure that your page is indexed by Google and give it a better chance of appearing in search results.

Add New Content Regularly

One of the most important factors for getting your content indexed by Google is to add new content regularly. Publishing on a steady schedule signals that your site is active, which can encourage Google to crawl it more often and can help your website stand out from the competition.

You should aim to add new content at least once a week to ensure that your pages stay fresh and relevant. You can also use this opportunity to add additional keywords, or to update existing posts with new information or insights. Doing so can help your content stay competitive in search engine rankings.

When adding new content, ensure that you are including relevant keywords and phrases throughout your post. This will help Google understand what your page is about and can help it rank higher in search engine results. Additionally, make sure that each post has an engaging title and meta description, as this can also help draw more viewers to your website.

Submit a Sitemap to Google Search Console

Submitting a sitemap to Google Search Console is a great way to get your pages indexed quickly. A sitemap is an XML file that lists all the pages in your website and tells search engine crawlers which pages are important. This makes it easier for Google to find and index your webpages.
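A minimal sitemap looks like the example below; the URLs and date are placeholders, and the lastmod field is optional.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod><!-- optional: last modification date -->
  </url>
  <url>
    <loc>https://example.com/blog/my-post/</loc>
  </url>
</urlset>
```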

To submit a sitemap, you first need to create one. You can do this manually or, if you’re using WordPress, let a plugin such as Yoast SEO or SEOPress generate it for you. If you create the sitemap by hand, upload it to the root of your domain (the same directory as your homepage); sitemap plugins typically serve the file automatically.

Once the sitemap is live, you can submit it in Google Search Console: open the Sitemaps report under the Indexing section, enter the URL of your sitemap, and click Submit.

Google will then begin crawling and indexing your pages according to the information in your sitemap. This should help to fix any ‘Crawled - currently not indexed’ errors you may have been experiencing.