We produced an article recently telling you 3 unusual reasons that might stop your website from ranking on Google, and really wanted to give you some things we’ve seen that you’d probably never think of.

But the reality is there are all kinds of things that can cause your website problems and harm your SEO and where you appear in the results, so here are 3 more “standard” type problems that you might encounter.

These are common issues that affect many websites.

  • robots.txt

Robots.txt is a file you can see if you go to yourwebsite.com/robots.txt. It instructs search engines like Google how to crawl your site, which bits to include in the results, and so on.

It can be set to give different search engines different instructions – you might tell Yahoo not to crawl your site, Google to crawl it all, and DuckDuckGo to only crawl certain sections (I don’t know why you’d do any of that, but in theory you can).

Of course, all search engines give you the disclaimer that while they might endeavour to follow your instructions, they might also completely ignore them.

In a cruel twist of fate, you can be sure that if you didn’t mean to tell Google not to crawl your site but inadvertently did – it will always follow your request.

That’s life.

Your robots file needs to “allow” the bots you want and “disallow” the ones you don’t want. If you want every bot to crawl and index your site (and why wouldn’t you?) then it needs to allow them all.

If you have it set to disallow then your pages will probably not show in Google at all, so make sure you check this and get it right – you’d be amazed how often people mess this up, and it causes them no end of trouble.
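If you want to double-check what a robots.txt file is actually telling the bots, you can test it yourself. Here’s a quick sketch using Python’s built-in robotparser module – the robots.txt content and the example.com URLs below are just made-up examples, not your actual file:

```python
from urllib import robotparser

# A made-up robots.txt: block everything under /private/, allow the rest
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ask whether a given bot may fetch a given URL under these rules
print(rp.can_fetch("Googlebot", "https://example.com/"))            # True
print(rp.can_fetch("Googlebot", "https://example.com/private/p1"))  # False
```

Swap in your own robots.txt content and URLs to see exactly which pages you’re (perhaps accidentally) blocking.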

  • sitemap.xml / sitemap.html

Your sitemap is a literal roadmap of the layout of your site, telling Google and the other search engines where the pages are and what their addresses (URLs) are. You can even add in things like how frequently you want them crawled.

I really wouldn’t worry about trying to tell Google how important your content is or how often it should look at it, just let them do their thing – that’s what they’ll do anyway after all.

Your sitemap might be generated automatically if you use WordPress etc, or you might need to generate it manually if you have an HTML (static) website.
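If you do end up creating one by hand, a minimal sitemap.xml looks something like this – the URLs and date below are just placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourwebsite.com/about/</loc>
  </url>
</urlset>
```

One `<url>` entry per page, with the full address in `<loc>` – that’s the core of it.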

Ordinarily, if you’ve created your sitemap and everything has been great, you’d never think it might cause a problem. But we have seen a number of cases where people eventually switched their website from HTTP to HTTPS and didn’t update their sitemap, so it pointed to dead pages and they lost all of their rankings.

If you are using WordPress you should be able to solve this by using a plugin or adapting your .htaccess file; if not, you can use hard-coded 301 redirects to force the old HTTP URLs to the new HTTPS ones.

A 301 redirect basically tells Google “this old page has permanently moved from http://whatever to https://whatever”.
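For reference, the usual way to force HTTPS across a whole site in a .htaccess file looks something like the snippet below. This assumes an Apache server with mod_rewrite enabled – check with your host before trying it, as setups vary:

```apache
# Force all HTTP requests over to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The `R=301` flag is what makes it a permanent redirect, so Google knows to pass the old page’s ranking signals to the new HTTPS address.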


Changing your .htaccess file is not something you want to try doing if you don’t understand what it is.

To try and explain it in layman’s terms, it’s a configuration file that tells the server what to do – handling things like redirects and rewrites – which in turn affects what search engines and browsers see when they visit your site.

I know that’s as clear as mud, but honestly there are so many different things, codes, and options you can control with the .htaccess file that you could spend the next 12 months learning it for 12 hours per day and still not know everything.

It can control the forcing of HTTP to HTTPS (or vice versa), redirect particular URLs to other pages, rewrite strings of text in addresses, control where files are served from – a whole host of things. In fact, pretty much anything you can imagine you might want to control can be done through a .htaccess file.
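As a small taste of the simpler end of that, redirecting one old page to its new home can be done with a single line – the page names here are hypothetical, so substitute your own:

```apache
# Permanently send visitors (and Google) from an old page to its replacement
Redirect 301 /old-page.html https://yourwebsite.com/new-page.html
```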

It is complex enough that it isn’t something you should try to change on a whim, so if you need someone to look at it, please get someone who knows what they are doing.

This is a typical example of code in that if you put one colon, semicolon, or full stop in the wrong place, it will break your entire website.

Never mind the complexity of the code itself, just getting the punctuation wrong will break the site.

So, imagine how the nuances of the code itself could affect things.

If you need help then get in touch, we’re always happy to help you and we’re web geeks, not salesmen!

SEO Experts are based on the West Yorkshire and North Yorkshire border. We work with businesses in Bradford, Leeds, Keighley, Skipton, Harrogate, Ilkley, Halifax, and Huddersfield, as well as nationwide.