How to edit the robots.txt file on a Google Cloud Hosting Server? [SOLVED]

Why is my WordPress website hosted on Google Cloud not indexed by search engines?

The robots.txt file controls the permissions that allow a search engine to index your site. It is located in the root folder of your hosting server.

By default, the Google Cloud hosting server configures robots.txt to DISALLOW search engines. If you have a WordPress website hosted on Google Cloud, you need to edit the file manually to ALLOW indexing.

You can check your website’s robots.txt file in the following location:
http://yoursite.com/robots.txt


robots.txt default file extract on Google Cloud hosting:
#
# Added by Google
# Modify or delete lines below to allow robots to crawl your site.
# Block all robots
User-agent: *
Disallow: /

If your default robots.txt file is set as above, your site will not be indexed.
It is very important to edit the robots.txt file to give search engines permission to index your site.


These instructions will guide you through the process of editing your robots.txt file on Google Cloud:

1) First, install the “SSH for Google Cloud Platform” plugin in your Google Chrome web browser.

SSH for Google Cloud


2) Log in to https://cloud.google.com/, navigate to https://console.cloud.google.com/, and open the Resources tab.

Google Cloud Console


3) Write down the INSTANCE NAME and ZONE of the hosting server that runs your WordPress blog. (Please actually write them down rather than just glancing at them; you will need them later, and this saves time.)

Locate Google Cloud instance name and Zone

4) Activate “CLOUD SHELL” from the top right corner of your Cloud Console dashboard.

Locate Google Cloud Shell
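
Once the Cloud Shell is open, you can also double-check the instance name and zone you noted in Step 3; the standard gcloud listing command below prints every instance in the project along with its zone:
gcloud compute instances list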

5) After the Cloud Shell opens up, type in: gcloud compute ssh [INSTANCE_NAME]

Use the instance name you noted in Step 3, and ignore the brackets.
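
For example, with a hypothetical instance named wordpress-1-vm, the command would look like this:
gcloud compute ssh wordpress-1-vm   # replace wordpress-1-vm with your own instance name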

If you are using the SSH command for the first time, you need to set up your SSH keys. Just type in the above command; the Cloud Shell will guide you through the prompts in the command window.

If asked for a username or passphrase, enter something simple like your website name. A second prompt asks you to define a password; this is optional, so leave it blank and press ENTER.

At some stage you will need to enter the number corresponding to your instance's ZONE; pick the correct entry from the list shown on screen using the zone you wrote down earlier.
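
If you would rather skip the numbered zone prompt altogether, gcloud also lets you pass the zone directly with the --zone flag. The instance name and zone below are only placeholders; use the values you noted in Step 3:
gcloud compute ssh wordpress-1-vm --zone=us-central1-a   # example values only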

6) Once the above procedure is complete, you can edit your robots.txt file from the same CLOUD SHELL window, as described in the next step.

7) At the shell prompt of your Cloud Shell, type in: ~$ sudo nano /var/www/html/robots.txt

(Do not type the ~$; it is already displayed on your command screen.)
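
If you first want to confirm you are editing the right file, you can print its current contents before opening the editor (add sudo in front if you get a permission error):
cat /var/www/html/robots.txt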

After you enter the sudo nano command above, the robots.txt file opens in the shell for editing.

You will need to move the cursor to each character you want to delete or replace; the arrow keys in this shell editor do not behave exactly like they do in a normal text editor.

The objective is to change “Disallow” to “Allow” on the LAST LINE while retaining the same formatting.


This is the new format of my robots.txt file that ALLOWS indexing:
#
# Added by Stocksonfire
# Allow all robots
User-agent: *
Allow: /
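
If hand-editing in nano feels fiddly, the same change can also be made with a single sed command. This is just a sketch that assumes the default file shown earlier: it flips the last line from Disallow to Allow and leaves the comment lines untouched (crawlers ignore comments anyway). Because sed writes the file directly, there is nothing to save in nano afterwards and you can skip straight to the verification step below:
sudo sed -i 's|^Disallow: /$|Allow: /|' /var/www/html/robots.txt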


After you have modified the file, PRESS CTRL+O (O as in the letter O; this is where the plugin you installed in the first step becomes useful).
Now press the ENTER key to confirm the changes.

This saves the changes you made to the robots.txt file. To exit the nano editor afterwards, press CTRL+X.

Use an INCOGNITO window in your Chrome browser and navigate to http://yoursite.com/robots.txt to confirm the new changes.
It is important to check this from an INCOGNITO window so that Chrome does not serve the old cached file, as it can in a normal window.
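
To rule out browser caching completely, you can also fetch the file straight from the Cloud Shell with curl, which is normally preinstalled there (replace yoursite.com with your own domain):
curl http://yoursite.com/robots.txt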

If the robots.txt file reflects the new changes, you can close all the windows and ignore any prompts that appear on screen.

If you liked this article, please take a moment to share it on Twitter or Facebook.