SEO audits. What you really need to know.
Non-technical individuals are bound to get scared when creating website audits. Or scarred – it largely depends on how you look at the matter of software development.
The major problem with most sites out there is the people who put them together. The vast majority of them know little about Search Engine Optimization, and some couldn’t care less about your Google ranking. In other words, the first step in creating a search-engine-friendly website is to learn a bit of code yourself. I’m not referring to complicated languages that knock you off your feet. But knowing some basic things about HTML and PHP always comes in handy. And if you can’t use any other resource, just resort to the magnificent world wide web.
Unfortunately for website owners in general, and for those who’ve written a lot of content in particular, creating audits isn’t that simple. The more articles and/or pages you’ve created, the more time it’s going to take.
A website audit has to be thorough and extensive: each page must be analyzed from several viewpoints.
-
Usability and accessibility
We’ve approached this topic in one of our first articles about search engine optimization. For more information on it, read it here. Checking whether your site is indexed by search engines only takes a simple search. Use the site:yourdomain operator in Google and compare the number of pages that are actually indexed with the number of pages you thought were indexed.
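For example – example.com stands in for your own domain here – the query typed into Google looks like this:

site:example.com

Google will return only pages from that domain, and the result count gives you a rough estimate of how many of your pages are sitting in the index.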
Perform a search for your brand name combined with the names of the products or services you sell to see whether you show up. If you don’t, you may have suffered a penalty without knowing it.
Verify your site ownership if that’s possible – in Google Webmaster Tools, for instance. It’s among the simplest ways a search engine can tell that a site is legitimate. Tying a real-life person to the virtual presence of a website makes your content look more credible.
-
Keyword health
It’s always best to perform keyword research before jumping into writing content. But if you skipped this part, try to be as pragmatic as possible. And if you’ve been living with the feeling that it might be enough to target a single keyword, or the same keyword for all pages, you’re in for a big surprise. And boy, will it be unpleasant.
Keyword cannibalization takes place when multiple pages have been ‘optimized’ for the same term. Something tragic happens in this case: the pages end up competing against each other, so at best the strongest one comes up in search queries, while the others, which haven’t been gifted with the same amount of authority, are left out miserably.
-
Duplicate content
You’d think people don’t usually go around creating copies of their own pages. The fact of the matter is that they can and they do, mostly for A/B testing. But some do it without meaning to… so what happens then?
How do Google robots distinguish one piece of content from another? The first and most important of their criteria is the URL. So if you haven’t decided whether to give up on the www, you should do it now. It’s bad for your indexing status when both http://yourdomain.com and http://www.yourdomain.com serve the same pages, because they can be treated as duplicates.
So, how do you fix your duplicate content? If the only issue on your hands is the www dilemma, it can be solved in a jiffy using Google Webmaster Tools, by setting your preferred domain. If, on the other hand, you are the owner of some duplicate pieces of text, all you need to do is rewrite them as well as possible, so they stand out as being unique.
Using a 301 redirect from the page you don’t want indexed to the one you do is yet another viable option. It’s a status code which lets robots know that a page has been permanently moved. The only problem with 301s is that you cannot implement them without touching a bit of code. Confused? Webconfs explains how anyone can create 301 redirects in various programming languages.
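To make the idea concrete, here is a minimal sketch of a 301 redirect in PHP, assuming the old page is a PHP file and the destination URL is just a placeholder; adapt both to your own setup:

<?php
// Tell robots and browsers the page has moved for good (301),
// then point them at the page you do want indexed.
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.example.com/page-to-keep/");
exit;
?>

The same mechanism also settles the www dilemma: redirect every non-www URL to its www counterpart (or the other way around), and robots will stop seeing two copies of your site.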
-
Content. Title tags and meta tags. Images and alt tags.
How much content is enough content?
There’s a single answer to this question and it’s a tough one: no amount of content is too much content. As long as it’s original and relatable and has real potential to be shared online… you should never quit writing. Not good with words? The world doesn’t end today – use a content marketing team. And even when writing simple posts, there are organization features to always keep in mind, such as heading tags. Headings are extremely useful for robots, because it’s with their help that they better understand your text. Since robots aren’t exactly perfect at deciphering visual content such as images and video, try to use headings frequently and appropriately. If you’re still having trouble understanding which heading’s which, try this Media College set of explanations.
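As a rough illustration – the page, file names and wording below are all hypothetical – a sensible heading structure, with an image carrying the alt tag this section’s title mentions, could look like this:

<h1>SEO audits: what you really need to know</h1>
<h2>Keyword health</h2>
<p>Introductory text about keyword research…</p>
<h3>Keyword cannibalization</h3>
<!-- The alt text tells robots what the picture shows, since they can't "see" it -->
<img src="keyword-cannibalization.png" alt="Two pages competing for the same keyword in search results">

One h1 per page, followed by h2 and h3 for subtopics, keeps the hierarchy easy for both readers and robots to follow.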
In the case of titles, it’s quite important for SEO site audits to know that keywords should always come first – not the brand name. Using the name of your company excessively can give the impression you’re creating spammy content. The URLs of pages also have to have something in common with the titles of the articles and with their main keyword, and it’s recommended that you keep them short and simple. As a general rule, title tags should be no more than 70 characters.
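A hypothetical example – the keyword, brand and domain below are placeholders:

<!-- Main keyword first, brand name last, under 70 characters in total -->
<title>SEO Audit Checklist for Small Websites | Example Agency</title>
<!-- A short URL that echoes the title's main keyword: http://www.example.com/seo-audit-checklist/ -->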
To keep search engine robots from indexing one of your pages, or from following its links, add a meta robots tag with values such as noindex or nofollow. Meta descriptions also have to be checked for each page, as absolutely all pages must own one. In fact, search engines use these as the description of that particular page in the results, which means they affect click-through rate.
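In HTML terms, a sketch of both tags on a single page might look like this (the description text is just an example):

<!-- Keep this particular page out of the index and tell robots not to follow its links -->
<meta name="robots" content="noindex, nofollow">
<!-- A unique, page-specific description; search engines may show it as the snippet in results -->
<meta name="description" content="A step-by-step SEO audit checklist covering indexing, keywords, duplicate content and links.">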
-
Internal linking and subdomains
Internal links are used for navigation and for increasing the ranking of a website. The basics of internal linking are quite simple: as long as you don’t link to another page you own using nothing but a bare keyword, you’re covered. It’s more important that the anchor text of an internal link reads naturally and describes the target page than that it repeats an exact keyword.
People who’ve built their websites using Flash, Java applets or other plugins will fail to get their internal links indexed, because robots can’t read through content embedded in technologies like the ones mentioned above.
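A small sketch, with hypothetical URLs and anchor text:

<!-- Keyword-only anchor text, better avoided -->
<a href="http://www.example.com/seo-audit-checklist/">SEO audit</a>
<!-- Anchor text that tells readers and robots what the target page is actually about -->
Read <a href="http://www.example.com/seo-audit-checklist/">our step-by-step SEO audit checklist</a> before you start.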
Yet another thing to consider when constructing an SEO website audit is that the main pages of a website should not be given keyword-stuffed names. If you’re trying to instruct your readers to contact you, name that page “Contact”. It’s simplicity that Google likes the most.
Lastly, the number of subdomains should be restricted. If you ask me, I wouldn’t create any at all, given the rules that currently govern the search engine world. But if you ultimately have to use subdomains, try to stick to ten at most.
-
External linking
While internal linking can be controlled and accurately performed by almost any website owner, it’s the external linking business that’s a tricky one. Some years ago, to spread awareness about their business, entrepreneurs used to pay bloggers to write reviews of their products and/or mention their brand in an ‘apparently’ random post. Nowadays, Google is set on putting these practices to a miserable end, so paying for popularity won’t get you anywhere. Or maybe it will get you somewhere: outside of the Google playground.
But what if you want to know how many backlinks are recommending your website? Then, try these simple SEO audit tools: Authority Hacker, Ahrefs.com and Monitor Backlinks.
The concept of external linking is so popular nowadays that companies make up different names for it. For instance, Hubspot talks about inbound marketing. Call it whatever you like – the safest way to get quoted by someone else is to create outstanding content.
And is building backlinks from comments an easy way of letting people know you exist? Or is it a sure way of getting your site de-indexed? Some say you should stop wasting your time.
Why would people moderate and publish your comment anyway, if it simply looks like spam? What good is it for them? Try offering something in return for their attention. An e-book, a noteworthy and honest opinion, or at least a piece of advice they can relate to.
Since we’re talking about links, let’s see what Neil Patel has to say about broken ones: When Google is crawling a site and hits a broken link, the crawler immediately leaves the site. In other words, auditing your site might come in handy for optimizing your links, not only your content.
If you are thinking of starting your SEO audit soon, we really recommend checking out UpCity’s recent article: The Ultimate Guide to Developing an Actionable SEO Audit. It’s quite insightful and it will help you a lot.
So… SEO auditing is a long and boring road to walk, but it’s the only one to choose. Keep in touch with your own content more than with what others are writing. It’s the safest way to healthy site visibility.
*All images used in this article are courtesy of OSU Commons.