So far in the DigiF9 blog series we have covered the benefits a website can bring to your organisation, along with a quick setup guide, a security guide, and a user guide thrown in for good measure! This week we are going to talk about Technical SEO, the first in our three-part series on the topic of SEO (as they say, good things come in threes).
Maybe you’ve never heard of SEO before, maybe you’ve heard the buzzword but never really understood what it means. In this blog we will define exactly what it is and discuss various ways to improve your website’s ‘SEO Score’.
At DigiF9 we want to help all our customers take their website to the next level, offering a fully transparent, competitively priced, and fully optimised product to help them truly transform their operations. Interested? Get in touch at firstname.lastname@example.org today for a personalised quote, fully bespoke to your requirements.
Let’s get technical…
Technical SEO does exactly what it says on the tin – it is concerned with the technical requirements of modern search engines with respect to your website. It is all about ensuring that your website is optimised for the search engine algorithms, which can massively improve how well your site ranks in search results – after all, you want your site to appear higher in the results, and more frequently for relevant keywords.
There are a wide variety of ways to improve Technical SEO, so we will focus on some of the key areas to get things started, as well as how to measure your SEO score over time and continuously improve it. Rome wasn't built in a day, after all. You don't need a perfect SEO score to rank in the search results, and it is unlikely that you will transform your score overnight – but by building it up carefully over time, you can continue to grow your presence and ranking on search engines.
How Users Interact With Websites
Loading, loading, loading, loading, failed…
The most common set of requirements for a website are geared around this section – making sure visitors to your website have the smoothest user experience possible, so that they interact with your website, use your services, or purchase your products. Whilst this may seem like a separate topic to Technical SEO, it is actually quite significant – after all, search engines exist to provide a service to users searching for content, and so will reward websites offering the best possible user experience to those searchers.
Here are a few key ways to ensure a smooth and enriched user experience, which will in turn help boost your SEO score:
You can use the Google Search Console's Mobile Usability Report to identify pages on your site that are not optimised for mobile users, as well as the exact issue with each page, allowing you to address it quickly.
If you absolutely need a resource-heavy site, consider using features like lazy loading or a Content Delivery Network to cache your content and boost loading times.
Using a tool like Google PageSpeed Insights, you can measure the loading speed of your website and identify whether work is needed to improve it.
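As a quick illustration of the lazy-loading approach mentioned above, modern browsers support it natively via the loading attribute – a minimal sketch (the file names and video ID here are hypothetical):

```html
<!-- Above-the-fold content: load normally so it appears immediately -->
<img src="/images/hero-banner.jpg" alt="Hero banner" />

<!-- Below-the-fold media: only fetched as the visitor scrolls near it -->
<img src="/images/gallery-1.jpg" alt="Product gallery" loading="lazy" />
<iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy" title="Demo video"></iframe>
```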
Spiders are more afraid of you than you are of them…
Technical SEO is as much about your website structure as it is about speed, so we will now discuss how you can ensure your website structure is optimised for SEO. In a world with over 1 billion recorded websites, search engines face a mammoth task in classifying, categorising, and understanding the content on these sites in order to return the most relevant search results to their users. Google uses crawlers to go through existing websites, as well as new websites where the owner has provided a sitemap (more on this later).
Architecture
KISS – Keep It Simple Stupid
Your website structure represents how your website is organised and whilst we want to encourage creativity here, simplicity is key from both a technical and user perspective. A flat structure, where each page is only a few links / clicks away from another, allows both the visitor and the search engine crawler to easily navigate across your entire site. Being organised allows you to keep track of the content on your site, ensuring that content is kept up to date and all sections are linked correctly.
You can use tools like the Site Audit feature from Ahrefs to get a detailed overview of your website's structure, listed by directory for each of the different sections of your site. If you prefer a more interactive, visual view, you can use a network visualiser such as Visual Site Mapper to see the connections between different segments of your site.
For search engine crawlers, well-structured directories make it easier to understand and categorise the content on your website, and this is visible in the search results returned: sitelinks appear alongside the main search result, identifying the directory and category of that page within the website. Don't overdo the hierarchy – it needs to be logical and appropriate, so if you have a small site with limited subdomains, keep it simple.
Internal Links
This topic tends to fall more under on-page SEO; however, it is worth mentioning here because internal links (hyperlinks that point to other pages on your own website) help Google navigate and index all the pages on your website. The bigger your site, the more important internal links become.
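For illustration, an internal link is just a standard anchor pointing at another page on the same domain – descriptive anchor text helps both users and crawlers understand the target page (the paths here are made up):

```html
<!-- Descriptive anchor text tells crawlers what the target page is about -->
<a href="/services/web-design">our web design services</a>

<!-- Avoid vague anchor text like this, which tells crawlers nothing -->
<a href="/services/web-design">click here</a>
```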
Sitemaps
A sitemap helps search engines to find and understand your website, its pages, and their content. Sitemaps are not strictly crucial, because a search engine like Google will be able to find and index most, if not all, of your pages if they are well linked. However, providing one is considered best practice and makes sure that a search engine can find every page. This is especially the case if you have a large website, a new website, or a site full of media content.
Once you have created a sitemap, you should upload it to the Google Search Console which can help you identify any pages with broken links or indexing issues, and much more.
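A sitemap is typically an XML file, usually served at /sitemap.xml, listing the URLs you want indexed. A minimal sketch (the URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo</loc>
    <lastmod>2022-01-10</lastmod>
  </url>
</urlset>
```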
Schema Markup
Schema represents a method of organising content so that search engines can understand what certain elements on your web pages are. Schema markup allows you to spoon-feed search engines the key classifications for particular content on your page. JSON-LD is the preferred schema markup format for both Google and Bing, and Google provides a guide on implementing structured data. After implementing it, you can test your markup with the Google Structured Data Testing Tool.
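As a sketch of what this looks like in practice, JSON-LD markup is embedded in a script tag in the page's HTML – here describing a hypothetical local business (all names and details are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Web Studio",
  "url": "https://www.example.com",
  "telephone": "+44 161 000 0000",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Manchester",
    "addressCountry": "GB"
  }
}
</script>
```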
In addition to helping bots understand content on your site, schema markup can also enable additional features to accompany your pages, referred to as 'rich snippets'. Some examples include review star ratings, FAQ dropdowns, and event listings.
These sections allow you to add further credibility to your website, which in turn helps boost your SEO score. Google provides a really useful set of quality guidelines for websites, which should be followed to stay on the right side of search engine crawlers. In particular, marked-up reviews should not be written by the business itself – for the greatest success they should be genuine, unpaid reviews from actual customers.
Canonicalisation
When search engines crawl the same content on different web pages within a domain, it raises issues when indexing the site in search results, as the search engine is not sure which page is the primary source of that content. That is where the rel="canonical" tag comes into play: it identifies the preferred version of the content – aka where the master version is located. That way the source page is indexed instead, meaning that the primary source of your content will be displayed in search results.
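As a minimal sketch (URLs hypothetical), the canonical tag sits in the head of each variant of a page and points at the master version:

```html
<!-- On a filtered variant such as https://www.example.com/shop/widgets?sort=price -->
<head>
  <!-- Tell search engines the unfiltered page is the master version to index -->
  <link rel="canonical" href="https://www.example.com/shop/widgets" />
</head>
```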
The canonical tag should be used on every page of your site, as recommended by Google, to prevent search engines indexing multiple versions of a single page. If a canonical tag does not exist, the search engine will choose the version it deems more likely to be the original, so to avoid issues here, define it yourself and tell the search engines how to index your site properly.
Duplicate Content
Wait a second I’m sure I’ve seen that somewhere else…
As your website grows to include more pages, blogs, and features you may run into a slight challenge – repeating yourself. Whilst you can argue that you may want to drill the message home by repeating it continuously, search engine crawlers don’t share that view.
There are a number of tools that can assist you in identifying issues with your website content. Both Raven Tools' Site Auditor and Ahrefs' Site Audit tool allow you to identify pages within your site that contain duplicate content, or that are 'thin' (containing a low word count), both of which tend to perform worse in search results.
Both tools will provide you with a clear view of duplicate content on your website, but there is also the other scenario – duplicate content on other websites. Using a tool like Copyscape's Batch Search, you can upload a list of domains and identify content that appears around the web related to that domain. You can then search for a snippet of text returned in these results in a search engine such as Google, with the text in quotes: "text goes here". If your page is returned first in the results, then the search engine considers you to be the author of that content. You only need to worry about content on your site that is deemed to be copied from other sites, as other people copying your content will be their SEO problem.
The NOINDEX Tag
If you identify pages on your site with duplicated content, but you don’t want to alter them for any reason, you can make use of the “noindex” tag – instructing search engines not to index those pages in the search results. That way, human users can still interact and view that content, but search engine crawlers will stay away, meaning that those pages won’t impact your SEO score.
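For illustration, the noindex directive is a robots meta tag placed in the page's head; it is commonly combined with follow so that crawlers still follow the page's links even though the page itself is kept out of results:

```html
<head>
  <!-- Keep this page out of search results, but still follow its links -->
  <meta name="robots" content="noindex, follow" />
</head>
```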
Bonus SEO Tips
Dead Links
Sorry this is a dead end, please turn around…
Whilst less of a challenge on smaller domains, as your website grows to contain more links and subdomains you may encounter a scenario where one (or more) of your links doesn't actually go anywhere. This is frustrating for both site visitors and search engine crawlers, who will run into challenges navigating your site. You can identify broken links with most SEO tools, for example SEMrush, which will aid you in your dead-link discovery process. You can also use the Map Broker XML Sitemap Validator to check all of the pages on your website and identify any that are broken (HTTP 404) or redirecting (HTTP 301).
It is highly recommended to complete regular audits of your website's SEO score; an SEO audit checklist is a fantastic place to start, giving you a series of steps to follow to boost and maintain your site's SEO score over time.
Conclusion
Technical SEO is a tricky subject to master, given that the changes you make will take time to come to fruition. Tools or service providers offering instant SEO success are over-promising – this is a long-term project that should be viewed over a period of 6–12 months rather than days or weeks. That being said, this article outlines a number of key areas to get started with, both in terms of immediate actions and ongoing maintenance and review.
At DigiF9 we offer a portfolio of services in website and application development, as well as digital presence and branding. We provide completely bespoke, tailored solutions to all our customers, where we immerse ourselves within your environment to be part of your team. We are completely transparent with our customers, providing you with clear and realistic timelines, and keeping you updated every step of the way. Interested? Contact email@example.com for a personalised quote and let us help you transform your digital presence!
We are a growing team of developers and designers who help companies take the next step and transform their digital online presence.