The First Step in Your Site Audit: Get It Indexed

You created a website for your business to make it visible to your customers online, yet it doesn't show up on Google or Bing when you type your company name or even paste the URL into the search bar.

The short answer is: yes, your website has a problem.

It's possible that your website is not being indexed by search engine bots. If it hasn't been indexed yet, no amount of SEO will improve your website's visibility, no matter how long you've been at it.

But don't panic; there are still ways to fix this.

Is your website being indexed?

First, check whether your website is indexed. You can do this right in the search engine itself.

A quick way to check is to search for your domain with the site: operator. The results show the pages that a search engine has indexed; in this case, Google. You can also check specific pages within the website.
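For instance, you can type queries like these into Google's search bar (yourdomain.com is a placeholder):

```
site:yourdomain.com
site:yourdomain.com/blog/some-page
```

If the domain is indexed, Google lists the pages it knows about; an empty result means it has nothing for that site.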

No Results Found?

If you don't see any results after trying the method above, your robots meta tag or robots.txt file may be the culprit. These "robots" directives are instructions that tell Google or Bing which pages of your website are exempt from being crawled.


Robots.txt is a file located in your site's root directory. If you are unsure whether your website has a robots.txt, check by adding "/robots.txt" to your domain name.

Here is an example:
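A typical robots.txt looks something like this (the blocked paths and domain are illustrative):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://yourdomain.com/sitemap.xml
```

This one only keeps bots out of a couple of private sections and points them to the sitemap, which is perfectly healthy.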

If you see an instruction like User-agent: * followed by Disallow: / in that file, your entire site is blocked from being crawled. You can fix this by creating a new robots.txt file that allows crawling.
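A replacement robots.txt that lets every bot crawl every page only needs a wildcard user-agent rule with an empty Disallow:

```
User-agent: *
Disallow:
```

An empty Disallow value means "nothing is off limits."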

Such a file allows all bots to read through all of your pages. You can also test your robots.txt in Google Search Console to identify errors in the file and check whether a given page is currently blocked.
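You can also run a quick local sanity check with Python's standard library. This is a minimal sketch; the rules, user agent, and URL are placeholders standing in for your own:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks everything -- the problem case described above.
blocking_rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(blocking_rules.splitlines())

# can_fetch() reports whether a given crawler may request a URL.
print(parser.can_fetch("Googlebot", "https://example.com/about"))  # False
```

Paste your real robots.txt contents into the string to see which of your URLs a crawler is allowed to fetch.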

Robots Meta tag

The robots meta tag is located in the <head> section of a page and controls indexing on a per-page basis. Note that if a page is already blocked in robots.txt, crawlers never fetch it, so they will not see its robots meta tag at all.
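If you find a tag like this in a page's <head>, that page is being kept out of the index:

```html
<head>
  <!-- Asks crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Removing the tag (or changing the content value to "index, follow") lets the page be indexed again.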

The Google Developers documentation explains the robots meta tag in more detail.

Don’t forget to add an XML sitemap

An XML sitemap provides search engines with a list of the page URLs on your website.

To check if your website has an XML sitemap, add “/sitemap.xml” at the end of the domain name.
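If a sitemap exists, you should see an XML document along these lines (the URLs and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
  </url>
</urlset>
```

Each url entry needs only a loc; lastmod is optional but helps bots prioritize recently changed pages.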

Your website doesn’t have one yet? Don’t worry! There are easy ways to create an XML sitemap.

If your website runs on a CMS, you can add a plugin that automatically generates a sitemap.xml for you, such as Yoast SEO or XML Sitemap & Google News feeds. Alternatively, use a free web-based sitemap generator and upload the resulting file to your website's root directory.
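If you'd rather script it, a sitemap is simple enough to generate yourself. Here is a minimal sketch in Python; the page list is hypothetical, and you would write the output to sitemap.xml in your root directory:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical page list for illustration.
pages = ["https://example.com/", "https://example.com/about"]
print(build_sitemap(pages))
```

Escaping each URL keeps characters like & from producing invalid XML.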

Don't forget to submit your sitemap to Google Search Console and Bing Webmaster Tools so their bots know where to find it.

No update?

Search engines, or more specifically their bots, are always busy crawling and indexing millions of websites. Sit tight, and don't feel bad if your recent changes haven't been reflected yet.

First, try to see which pages have been cached by typing cache:domain_name or cache:domain_name/page into the address bar. If Google returns an error instead of a cached copy, your pages haven't been crawled yet.

What you could alternatively do is request a crawl of your pages through Google Search Console's fetch feature.

Again, especially if you are just starting out with optimizing your website, very little of it matters as long as your pages are not yet indexed by search engines. Try these methods to get indexed and ride that SEO saddle toward getting crawled!