Common Onsite SEO Errors
Those without any knowledge of SEO will often overlook these errors on their website: meta tags, h1 tags, URL structure, and 301 redirects, just to name a few. However, with a bit of understanding these issues can be easily fixed, especially if your website is built on a popular CMS such as WordPress, Wix, Weebly, Squarespace, or Shopify.
Meta tags are the most common issue for websites looking to establish strong search visibility, because meta tags contain the exact keywords that users type into Google. The two main types of meta tags are the meta title and the meta description. Knowing how to change these two tags will greatly enhance your search visibility.
Best practices for meta titles:
Meta title should be unique and less than 60 characters
Meta title should contain one or two keyword phrases
Brand name should be appended at the end
Best practices for meta descriptions:
Meta description should be unique and less than 160 characters
Meta description should contain one or several keywords.
Meta description should be readable and entice users to read more.
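The character limits above are easy to enforce with a small script. Here is a minimal sketch (the sample tag strings and the "Acme Co" brand name are invented for illustration):

```python
# Check proposed meta tags against the limits above:
# 60 characters for titles, 160 for descriptions.

TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def check_meta(title, description):
    """Return a list of warnings for tags that exceed the limits."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"Description is {len(description)} chars (limit {DESCRIPTION_LIMIT})")
    return warnings

# Hypothetical tags for a fictional "Acme Co" store:
print(check_meta("Blue Widgets for Sale | Acme Co",
                 "Shop our full range of blue widgets with free shipping."))  # []
```

Running a check like this over every page's title and description catches over-limit tags before Google truncates them in the results.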
Proper h1-h6 HTML Tags
Every page should contain only one pair of <h1></h1> HTML tags. You can check whether a page has more than one set of h1 tags by right-clicking the page and selecting Inspect, or by viewing the page source.
We recommend the free SEO tool Screaming Frog to help your SEO team find meta tags and h1 tags that are duplicated, missing, or over the character limit, among other issues.
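The one-h1 rule can also be checked directly from a page's HTML. Here is a minimal sketch using Python's built-in html.parser (the sample HTML string is made up for illustration):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag names, so "h1" matches <H1> too.
        if tag == "h1":
            self.count += 1

def count_h1(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.count

# Hypothetical page with a duplicate h1 -- this violates the one-h1 rule:
page = "<html><body><h1>Main Heading</h1><h1>Duplicate</h1></body></html>"
print(count_h1(page))  # 2
```

Any page where this count is not exactly 1 is worth flagging for review.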
Having a proper, well-written URL structure is crucial for ranking well on Google. Follow these guidelines when creating your URL permalinks.
Keep URLs short, simple and limited to one keyword phrase
Example of Good URL:
Example of Bad URL:
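One way to keep permalinks short and keyword-focused is to generate the slug from the page title. Here is a rough sketch (the sample title is invented, and the five-word cap is an arbitrary choice, not a Google rule):

```python
import re

def slugify(title, max_words=5):
    """Turn a page title into a short, lowercase, hyphenated URL slug."""
    # Strip punctuation, lowercase, and keep only the first few words.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("10 Best Blue Widgets for Your Home Office!"))
# 10-best-blue-widgets-for
```

Prepending your blog path (for example /blog/) to the slug gives a short, readable permalink limited to one keyword phrase.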
No-Index Pages With Thin Content
Oftentimes, search engines will index an entire website, including hidden inner pages that offer no valuable content. To prevent this from happening, add a "noindex" robots meta tag, which tells search engine crawl bots not to index the current page.
Common page types that should be blocked from crawl bots:
Unused user account pages
Media/Image attachment pages
Empty product or blog pages
Backend admin dashboard pages
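To confirm that a thin page actually carries the directive, you can scan its HTML for the robots meta tag. Here is a rough sketch using Python's standard library (the sample page snippet is hypothetical):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detects a <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots" and "noindex" in a.get("content", ""):
            self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

# Hypothetical thin page that is correctly blocked from indexing:
page = '<head><meta name="robots" content="noindex, nofollow"></head>'
print(has_noindex(page))  # True
</n```

Running this over the page types listed above quickly shows which thin pages are still open to crawl bots.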
Sometimes "SEO professionals", including many top digital marketing agencies, will outsource backlink building to a third-party vendor that artificially generates links. Many of these backlinks come from low-quality websites that offer little to no value besides dragging down the authority of your website. It's very important to know the quality of the domains linking to you.
When a website has hundreds of thousands of spammy backlinks, it is very likely to be flagged, and the entire site can end up in the "Google sandbox", where there is practically zero search engine visibility.
Signs of spam/low quality backlinks
The referring domain is horribly designed and is unreadable to users.
The authority score of the backlink source domain is less than 40.
The backlinks lead back to a website with no content.
Setting up Google Search Console
Google Search Console is a free tool from Google that lets web administrators track clicks, traffic, and site information, and disavow backlinks from specific URLs or domains.
Create and Upload a Disavow File
You can create a disavow file with any plain-text editor, such as Notepad, following this format.
# Two pages to disavow
# One domain to disavow
Save the file as disavow.txt and upload it through Google's Disavow Links tool in Search Console.
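Following that format, a complete disavow.txt might look like the sketch below (the example.com URLs and domain are placeholders, not real spam sources):

```
# Two pages to disavow
http://spam.example.com/bad-page-one.html
http://spam.example.com/bad-page-two.html

# One domain to disavow
domain:low-quality-links.example.com
```

Lines beginning with # are comments, individual URLs are listed one per line, and the domain: prefix disavows every link from that domain.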
Over time, Google will begin blacklisting the URLs and domains listed, which should get your site out of the sandbox and reverse any drop in traffic caused by the spam links.
Image optimization is often the most overlooked part of any SEO strategy, mainly because people underestimate how effectively images can rank on search engines.
Oftentimes a properly optimized image with the right title and alt text can bring in as many qualified leads as a keyword, because so many people search through the image feature of search engines.
When uploading an image or any media file to your website, it's important to rename the file to something readable so search engine crawl bots can easily identify the keywords related to it.
Good image filename:
Bad image filename:
Using the "alt" attribute is very good practice, not only because it helps search engines index keywords, but also because it helps users with disabilities access information from your website. For example, an image's alt text is often read aloud by screen readers for blind visitors.
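The filename and alt-text advice above can be combined: a short, human-readable description of the image can drive both. Here is a minimal sketch (the chair description and /images/ path are invented for illustration):

```python
import re

def seo_filename(description, ext="jpg"):
    """Build a readable, keyword-bearing image filename from a description."""
    slug = "-".join(re.sub(r"[^a-z0-9\s]", "", description.lower()).split())
    return f"{slug}.{ext}"

description = "Red leather office chair"
name = seo_filename(description)
print(name)  # red-leather-office-chair.jpg

# The same description doubles as accessible alt text:
img_tag = f'<img src="/images/{name}" alt="{description}">'
print(img_tag)
```

Writing the description once and reusing it keeps the filename and alt text consistent, which benefits both crawl bots and screen-reader users.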
Checking and Reducing Page Load Time
Using Google's PageSpeed Insights tool, we can see how long your website takes to load for the average user on mobile and desktop.
A website that takes longer than 3 seconds to load is likely to incur significant penalties from search engines, which prefer to index fast, responsive websites that load in under 1 second.
The most effective and surest way to increase your page speed is to upgrade to faster, more powerful servers, which you can do by contacting your website's hosting provider.
Here are other simple methods for optimizing your site’s load time:
Disabling unnecessary plugins
Keeping all images under 100 KB
Serving a cached, static version of your pages
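As a rough complement to PageSpeed Insights, you can time how long your server takes to deliver a page's raw HTML. Here is a minimal sketch (it ignores rendering, images, and scripts, so it only approximates the full load time that PageSpeed Insights measures):

```python
import time
import urllib.request

def time_page_load(url, attempts=3):
    """Average wall-clock seconds to download a page's HTML.

    A rough proxy only -- it measures server response and transfer,
    not rendering or subresources like images and scripts."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        total += time.perf_counter() - start
    return total / attempts

# Example (requires network access; example.com is a placeholder):
# print(time_page_load("https://example.com"))
```

Comparing this number before and after the optimizations above gives a quick sanity check that the changes actually reduced load time.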