Check Under the Hood of your Website

by Katie Ayres

There are many files on your website that are seen only by search engines. Learn how to check on these files to ensure they are working properly to help your search engine rankings. (2 minutes)

Download Checklist

Script

Today we are going to give your website a little tune-up and look at the hidden files that direct the search engines when they scan your website. These files are easy to see once you know how to find them.

My name is Katie Ayres, and I’m a web developer and owner of 1 Happy Place, and it is time to check under the hood of your website.

There are two views of your website, the one the human visitor sees and the one the search engine sees. Today, we are going to talk about how to check on the files that are hidden from the human visitor, but visible to the search engine.

First, let’s talk about bots.

Bots are pieces of software that automatically scan the internet. I run my own servers, and I wrote my own bot, 1 Happy Bot. It runs every few hours, scanning my clients' websites to ensure nothing has changed unexpectedly. All of this happens behind the scenes.
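The core idea behind that kind of change-detection scan can be sketched in a few lines. This is not the actual 1 Happy Bot code, just an illustration of the concept: store a fingerprint (hash) of each page, then compare a fresh copy against it on the next scan.

```python
import hashlib

def page_fingerprint(content: bytes) -> str:
    """Return a fingerprint (SHA-256 hash) of a page's raw bytes."""
    return hashlib.sha256(content).hexdigest()

def has_changed(stored_fingerprint: str, content: bytes) -> bool:
    """Compare a freshly fetched page against the fingerprint from the last scan."""
    return page_fingerprint(content) != stored_fingerprint

# A real bot would fetch the live page here; we use fixed bytes for illustration.
baseline = page_fingerprint(b"<html>Welcome to my site</html>")
print(has_changed(baseline, b"<html>Welcome to my site</html>"))  # False: unchanged
print(has_changed(baseline, b"<html>Something unexpected</html>"))  # True: changed
```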

Search engine bots, such as GoogleBot, are legitimate bots that scan the internet to index every website they find. When you search for something on google.com, the results are based on all the information collected while scanning websites.

There are also nefarious bots that try to cause trouble. If your contact form is getting spammed, it is most likely a bot.

Part of Search Engine Optimization (SEO) is to ensure that the search engines do not hit any roadblocks while scanning your website.

There are three files that give directives to the search engines, and we are going to check up on each of them:

robots.txt

This file lets legitimate bots know if there are areas of your website that are best not indexed and served up as search results. The main thing to check is that every disallowed path makes sense for your website.
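As an illustration, a typical robots.txt lives at the root of the site (yourdomain.com/robots.txt) and looks something like this. The disallowed paths here are made-up examples; yours will differ.

```text
# Example robots.txt -- the paths below are illustrations only.
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

If you see a `Disallow` line covering pages you actually want in search results, that is a roadblock worth fixing.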

sitemap.xml

Sitemaps direct the search engine bots to all the pages and resources on the website. You want to ensure that list is comprehensive.
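To see what a sitemap contains, you can open it in a browser, or pull out its list of pages programmatically. A minimal sketch, using a made-up example sitemap (your real sitemap.xml lists your actual pages):

```python
import xml.etree.ElementTree as ET

# A minimal sitemap.xml, for illustration; a real one lists your actual pages.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
  <url><loc>https://www.example.com/contact</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> entry so you can check the list for missing pages."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(sitemap_xml))
# ['https://www.example.com/', 'https://www.example.com/about', 'https://www.example.com/contact']
```

Comparing that list against the pages you know your site has is a quick way to spot anything the search engines are not being told about.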

404 page

When a visitor accidentally requests a page that doesn't exist, a 404 page appears. That page might be the server's plain default, but ideally it is a custom page that gently leads the visitor back to the website.
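How a custom 404 page gets wired up depends on your hosting. As one common example (this assumes an Apache server, and "/404.html" is just an illustrative path), a single line in the site's .htaccess file tells the server which page to show:

```text
# .htaccess on an Apache server (one common setup; details vary by host).
# "/404.html" is an example path -- use wherever your custom page lives.
ErrorDocument 404 /404.html
```

You can test the result yourself by visiting a made-up address on your own site and seeing which page comes back.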

Now that you understand these files, download the sheet below this video; it gives instructions on how to look at each of these files and how to make sense of what you see.

So have fun checking under the hood of your website!