How to ensure that your website is in full compliance with Google's standards?
This Friday we delve deeper into Google's quality guidelines so that we can stay on the safe side when improving the quality of our websites and blogs. The idea for today's Q&A post was sent to us by Lauren Witte, Associate Director of Marketing & Client Services at the Mesa, AZ law firm JacksonWhite. We thank her for the interesting suggestion, and we welcome all of our readers to take part in this open Q&A series and send us their questions with the subject line You Ask, We Answer. The most interesting questions will be selected, featured, and answered on this blog, so feel free to join our Friday discussion.
And now let’s see the original question that Lauren sent us:
Q: How can you evaluate your website to ensure that you are in full compliance with Google's standards?
A: The most certain way to check whether your website is in serious violation of Google's guidelines is to verify whether it has been penalized manually (check your Webmaster Tools account for any messages from Google) or algorithmically (check your website traffic and rankings and look for sudden, unexplained drops). We discussed this issue closely in last week's Q&A post – you might wish to skim it for further tips.
Among the most common troubles webmasters currently face is a bad inbound link profile. Acquiring spammy, unnatural links of any kind (including paid links) that aim to manipulate search engine rankings is what could make Google blacklist your website for good. Maintaining a natural backlink profile requires persistence, a good marketing plan, and an honest approach.
To be sure that your website is not violating any of Google's regulations, make it a habit to check your backlinks. You do not have to indulge in manipulative practices yourself to end up with links from low-quality websites. You should, however, stay alert to such a threat so that you can take the necessary preventive measures in a timely manner. Use Open Site Explorer, Majestic SEO, and Ahrefs, and most of all check the backlinks that show up in your Webmaster Tools. Sometimes competitive businesses become the target of negative PR or negative SEO practices – you do not wish to learn about this after it is too late, right?
Another good practice is to assess the accessibility of your website. Make it easy for Google to crawl your website:
- Have a proper design with a simple, logical URL structure: avoid numbers and substitute them with words where possible; use hyphens (-) instead of underscores (_); and make sure every page is linked from at least one static text link on your website.
- Create a robots.txt file to set restrictions for pages that you want to keep away from the crawling bots.
- Create and submit an XML sitemap, and ensure that every important, healthy page (one that does not return a 404 error) is crawled by the search engine and, accordingly, indexed.
- Go over your site page by page and be careful with your use of the "noindex" and "nofollow" directives, as well as the various combinations of the two (note that "noindex" belongs in a robots meta tag, not in a rel attribute; rel="nofollow" applies to individual links). Noindex pages that do not offer important user experience, such as "Registration completed" or "Thank You!" pages.
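To illustrate the robots.txt point above, a minimal file might look like this (the paths are hypothetical examples, not recommendations for any particular site):

```
# Applies to all crawlers
User-agent: *
# Keep utility pages away from the bots
Disallow: /admin/
Disallow: /thank-you/

# Point crawlers to the sitemap
Sitemap: https://www.mywebsite.com/sitemap.xml
```

Keep in mind that robots.txt only restricts crawling; it is not a security measure, and a disallowed URL can still appear in results if other sites link to it.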
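A bare-bones XML sitemap, following the sitemaps.org protocol, looks roughly like this (the URLs and date are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.mywebsite.com/</loc>
    <lastmod>2014-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.mywebsite.com/services/</loc>
  </url>
</urlset>
```

Submit the file through Webmaster Tools so you can see how many of the listed URLs actually get indexed.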
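The noindex/nofollow combinations mentioned above are set in the page's head section with a robots meta tag, for example:

```html
<!-- Keep the page out of the index, but let the bot follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Keep the page out of the index and do not follow its links -->
<meta name="robots" content="noindex, nofollow">
```

The first variant is usually the safer choice for "Thank You!"-style pages, since it still lets link equity flow through the page.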
Having covered all this, you should check your website's indexability and verify whether all of your pages are properly indexed in Google. Use the command "site:www.mywebsite.com" and check all of the listed pages. Repeat the search including the omitted results to get a better picture of how Google treats your website. This search will also show you whether your website has a lot of duplicate pages. If you find such pages, handle them as quickly as possible: permanently redirect them to their canonical (preferred) versions, ban them from being indexed via your robots.txt file, or use the "noindex" meta tag in the given page's head section.
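Besides a permanent redirect, you can also signal the preferred version of a duplicate page with a canonical link element in its head section (the URL below is a hypothetical example):

```html
<!-- Tells search engines which URL is the preferred version of this content -->
<link rel="canonical" href="https://www.mywebsite.com/preferred-page/">
```

This is handy when you cannot redirect, for example when the duplicates are print versions or URLs with tracking parameters.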
Finally, talking about website quality, we should definitely discuss the on-site content. Having unique site copy is somewhat clichéd advice. Nevertheless, there are still people who ignore it and try to fill their sites with "partially rewritten" copy. Even if you are not doing this yourself, you might order site content from random online freelancers, who for their part may prefer to spin content to save time and effort. If you have not written the copy or the blog post that you are about to publish on your website, pass it through Copyscape for a quick check-up. Duplicate pieces will show up immediately, and you will know whether the text is worth sharing with your readers, or whether accepting it would be just a waste of time and resources.
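Copyscape checks the wider web, but you can also run a rough local check for near-duplicate text yourself. Here is a minimal sketch using Python's standard library (the sample strings are made up for illustration):

```python
from difflib import SequenceMatcher


def similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 ratio of how similar two pieces of text are."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()


original = "Maintaining a natural backlink profile requires persistence."
spun = "Keeping a natural backlink profile needs persistence."
unrelated = "Completely different words about another topic entirely."

# A high ratio suggests the second text may be partially rewritten
# from the first; a low ratio suggests genuinely distinct copy.
print(f"spun vs original:      {similarity(original, spun):.2f}")
print(f"unrelated vs original: {similarity(original, unrelated):.2f}")
```

This is only a crude character-level comparison, not a substitute for a proper plagiarism checker, but it is a quick way to flag suspiciously similar drafts from the same freelancer.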
Being up to date with Google's standards can be a challenge, for Google is constantly changing its algorithms and thus its overall approach towards the online content it crawls. One tactic works for sure, though: having a website that is human-oriented rather than merely search-engine-optimized almost always ensures that you are in no danger of deindexing or any other penalty for trying to game the system.