Fix Excluded by noindex tag – Google Search Console

Do you really want to index the cart, checkout, my-account, login, logout, and search-filter pages, or long parameterised URLs? These are the pages most commonly reported under this status in Google Search Console. In this post I will explain how to fix "Excluded by 'noindex' tag".

What is "Excluded by 'noindex' tag"?

When Google examined the page, it detected a ‘noindex’ directive in the robots meta tag, preventing it from being indexed. To ensure the page is indexed, you’ll need to adjust or remove the robots meta tag. It’s essential to determine if the page holds significance for the website or if it’s redundant or unnecessary content.

Furthermore, being excluded by the 'noindex' tag means that the reported pages are deliberately blocked from being indexed by Google (or any other search engine). This blocking is typically intentional and is implemented either through the <meta name="robots" content...> tag in the page's <head> section or through the X-Robots-Tag HTTP response header. Note that robots.txt is a separate mechanism: it blocks crawling, not indexing, and a page disallowed in robots.txt can still end up indexed if other pages link to it.
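As a sketch, the two standard ways a 'noindex' directive can be delivered look like this (the exact `content` values are illustrative):

```html
<!-- Option 1: a robots meta tag inside the page's <head> section -->
<meta name="robots" content="noindex, follow">

<!-- Option 2 (not HTML itself): the equivalent HTTP response header,
     useful for non-HTML resources such as PDFs:

     X-Robots-Tag: noindex
-->
```

Either form tells Google not to index the page; removing it (or changing `noindex` to `index`) lifts the exclusion.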

Before we start, let's look at example cases in which you should either fix the issue or leave it as it is.

Fix it if it is an unnatural link created by the web developer to achieve some functionality. Such links get reported in Google Search Console and normally have a reference from within the website. You can mark these links with rel="nofollow", because they exist only to support site functionality, not to be indexed content.
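A minimal sketch of such a functional link (the URL and anchor text are hypothetical):

```html
<!-- A link that exists purely for site functionality;
     rel="nofollow" asks crawlers not to follow it -->
<a href="/cart/?add-to-cart=123" rel="nofollow">Add to cart</a>
```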

Leave it as it is if it is a search page: a search page accepts any input and can generate an effectively unlimited number of URLs, and spammers can enter arbitrary words into it to spam your website.


Once you have reviewed how these issues are reported and decided which pages you want to make available for crawling and indexing, you can remove the blocking line of code or update the <meta name="robots" content...> tag. In most cases, though, the URLs reported here are simply not important.
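To verify that a page no longer carries a 'noindex' directive after your change, here is a minimal sketch in Python using only the standard library (the function names and sample inputs are my own, not from any SEO tool):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in the HTML."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())


def has_noindex(html, headers=None):
    """Return True if 'noindex' appears in the robots meta tag
    or in the X-Robots-Tag HTTP response header.

    `headers` is a plain dict; real HTTP header names are
    case-insensitive, so normalise keys before calling this.
    """
    headers = headers or {}
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)


# Example usage with hypothetical page snippets
blocked = '<head><meta name="robots" content="noindex, follow"></head>'
allowed = '<head><meta name="robots" content="index, follow"></head>'
print(has_noindex(blocked, {}))                            # True
print(has_noindex(allowed, {}))                            # False
print(has_noindex(allowed, {"X-Robots-Tag": "noindex"}))   # True
```

In practice you would fetch the live page and its response headers first; the check itself is just these two lookups.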
