Search Engine Optimisation

Technical SEO

We deliver expert technical SEO services that help your website get higher rankings.

Technical SEO guide

Technical SEO is often overlooked when building a blog, eCommerce platform, or professional website, but it's an important piece of the puzzle that shouldn't be ignored.

A site may look fantastic and boast incredible content, but unless its foundations are strong, it can all come crashing down very quickly.

From the fundamentals of technical SEO to crawling, rendering and indexing, let's take a look at the pieces that make the internals of a website tick:

  • JavaScript
  • 404 pages
  • Canonical tags
  • 301 redirects
  • URL structure
  • Structured data
  • Duplicate content
  • Thin content
  • XML sitemaps
  • Hreflang
  • Site architecture

The fundamentals of technical SEO

If you're not quite up to date with the dictionary definition, technical SEO can be described as the process of ensuring that a website meets the technical requirements of modern search engines. The goal is simple: to improve organic rankings, with a focus on website architecture, crawling, rendering, and indexing.

Regardless of how good your website's content and design are, if the technical SEO is all over the shop, you're not going to rank anywhere near where you should. Simply put, search engines must be able to find, crawl, render, and finally index every page on a website.

However, that's only the basics; there's much more that goes into technical SEO before a site can be considered complete. Security, mobile-first design, responsiveness, and countless other details should be considered for a top-notch site – that's when Google starts taking note. So what should you look into to get things started?

Site structure and navigation

The structure of any website is the foundation that sets a blog, eCommerce store or business site up for success or failure – it's the most important part of any technical SEO journey.

That's a bold claim, but it's one that can easily be backed up. Although crawling and indexing are often put on a pedestal as the holy grail, most issues with either stem from poor site structure. Get the basics right and the rest falls into place – so to speak.

So what's the best way to structure a website for the best technical SEO results? Nothing groundbreaking – use a flat, organised site structure. A flat structure simply means that every page on the site is only a few links away from every other, making it simple for search engines to crawl the whole site.

Site structure & technical SEO

A flat structure is vital, but it must be organised, too. Without organisation, sites have a tendency to create "orphan pages" – pages without any internal links pointing to them. There are a couple of ways to see how pages are linked together, but few are quite as strong as Visual Site Mapper, a free tool that draws a visual diagram of how a site's pages link to one another.
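
If you'd like to sanity-check this yourself, here's a rough sketch of the idea in Python. It compares a list of known URLs (exported from your CMS or sitemap) against the URLs your pages actually link to, and flags anything that nothing links to. The example.com URLs are placeholders, and it assumes the third-party requests and beautifulsoup4 packages are installed – it's a starting point, not a replacement for a proper crawler.

```python
"""Flag potential orphan pages: known URLs that no crawled page links to.

A minimal sketch -- assumes `requests` and `beautifulsoup4` are installed
and that example.com is a placeholder for your own domain.
"""
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"           # hypothetical homepage
KNOWN_URLS = {                                   # e.g. exported from your CMS
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/blog/old-post/",    # suspect orphan
}


def internal_links(page_url: str, html: str) -> set[str]:
    """Return absolute, same-host links found on one page."""
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(page_url).netloc
    links = set()
    for anchor in soup.find_all("a", href=True):
        url = urldefrag(urljoin(page_url, anchor["href"]))[0]
        if urlparse(url).netloc == host:
            links.add(url)
    return links


def crawl(start: str, limit: int = 200) -> set[str]:
    """Breadth-first crawl of internal links, up to `limit` pages."""
    seen, queue, linked = set(), [start], set()
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        found = internal_links(url, response.text)
        linked |= found
        queue.extend(found - seen)
    return linked


if __name__ == "__main__":
    linked_urls = crawl(START_URL)
    for orphan in sorted(KNOWN_URLS - linked_urls - {START_URL}):
        print("possible orphan page:", orphan)
```

Anything it prints deserves at least one internal link from a relevant page.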

In addition to a flat, organised structure, a consistent URL structure and breadcrumb navigation are great ways to solidify the architecture of a site. Breadcrumb navigation automatically adds internal links to category pages and subpages and, with Google now displaying breadcrumb-style URLs in the SERPs, there's never been a better time to adopt it.
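
Visible breadcrumbs are usually paired with BreadcrumbList structured data, which is what helps search engines understand the trail. Here's a minimal sketch that builds that JSON-LD with Python's standard json module – the page names and URLs are made up, so swap in your own before embedding the output in a <script type="application/ld+json"> tag.

```python
import json

# Minimal sketch of schema.org BreadcrumbList markup for a hypothetical
# "Home > Blog > Technical SEO" trail.
trail = [
    ("Home", "https://www.example.com/"),
    ("Blog", "https://www.example.com/blog/"),
    ("Technical SEO", "https://www.example.com/blog/technical-seo/"),
]

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

print(json.dumps(breadcrumbs, indent=2))
```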

But remember, it's not only about internal links and site structure – you have to consider your SEO link-building strategy too.

Crawling, rendering and indexing

Crawling, rendering and indexing are how search engines find, understand and store a website's pages – that makes them pretty important in the grand scheme of things.

Identifying indexing issues

So, if you've got indexing issues, now is the time to find them and fix them to stay on top of your technical SEO game. There are three main ways to spot indexing issues, starting with the Coverage Report found in Google Search Console. The report is fantastic for keeping site owners up to date with any indexing or rendering issues and should always be the place to start.

Screaming Frog, despite the peculiar name, is one of the best and most popular crawlers used globally. Once any issues picked up from Search Console have been solved, running the site through the software is a great idea. Finally, Ahrefs offers a great SEO site audit tool, giving clear visual representations of the overall SEO health of a site, including page loading, HTML issues and a percentage health score.
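
None of this replaces Search Console or a full Screaming Frog crawl, but for a quick spot-check a short script can read the same basic signals those tools report – HTTP status, the meta robots tag, the X-Robots-Tag header and the canonical tag. The sketch below assumes requests and beautifulsoup4 are installed, and the URLs are placeholders.

```python
"""Spot-check basic indexability signals for a few URLs.

A minimal sketch of the kind of checks crawling tools automate --
assumes `requests` and `beautifulsoup4`; the URLs are placeholders.
"""
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/technical-seo/",
]

for url in URLS:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", rel="canonical")

    print(url)
    print("  status:      ", response.status_code)
    print("  meta robots: ", robots_meta.get("content", "") if robots_meta else "(none)")
    print("  X-Robots-Tag:", response.headers.get("X-Robots-Tag", "(none)"))
    print("  canonical:   ", canonical["href"] if canonical else "(none)")
```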

Problems with deep pages

It's rare for sites to struggle with homepage indexing; instead, the real problems come with those deep pages further away from the homepage.

An organised, flat structure is the best way to prevent these issues from occurring in the first place, but if there's a specific deep page that would benefit from being indexed, there's a simple fix. Nothing is quite as powerful as an internal link to that page; a link in the footer will work perfectly and, if your homepage has good authority, will help that deep page get indexed quickly.
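
To find which pages count as "deep", you can measure click depth – how many links a crawler has to follow from the homepage to reach each page. The sketch below uses the same crawling approach as the orphan-page example earlier, again with placeholder URLs and the requests and beautifulsoup4 packages assumed; pages more than three clicks deep are worth a look.

```python
"""Report click depth from the homepage, to spot pages buried too deep.

A minimal sketch -- assumes `requests` and `beautifulsoup4`; example.com
is a placeholder for your own domain.
"""
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
MAX_PAGES = 200

depths = {START_URL: 0}
queue = deque([START_URL])

while queue and len(depths) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    soup = BeautifulSoup(html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        link = urldefrag(urljoin(url, anchor["href"]))[0]
        if urlparse(link).netloc == urlparse(START_URL).netloc and link not in depths:
            depths[link] = depths[url] + 1
            queue.append(link)

# Pages more than three clicks from the homepage are candidates for an
# extra internal link (for example, from the footer).
for page, depth in sorted(depths.items(), key=lambda item: item[1]):
    if depth > 3:
        print(depth, page)
```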

Using an XML sitemap

A Google representative was recently quoted as saying that XML sitemaps are the "second most important source" for finding URLs. That makes them a pretty big deal.

If you want to check that your sitemap is in good standing, head over to the "Sitemaps" section in Search Console. If things aren't looking good, it's time to act and address any problems.
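
Most CMSs and SEO plugins will generate a sitemap for you, but as a rough illustration of what the file contains, here's a minimal sketch that builds one with Python's standard library – the URLs and dates are placeholders.

```python
"""Build a basic XML sitemap with the standard library.

A minimal sketch -- most CMSs and SEO plugins generate this for you;
the URLs and dates below are placeholders.
"""
import xml.etree.ElementTree as ET

PAGES = [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/blog/", "2024-01-10"),
    ("https://www.example.com/blog/technical-seo/", "2024-01-08"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(ET.tostring(urlset, encoding="unicode"))
```

Once the file is live on your domain, submit it in the Search Console "Sitemaps" section mentioned above.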

Thin and duplicate content

Provided you're proactive with your approach to written content and make sure there's plenty of fresh, unique content on every page you publish, duplicate content isn't something you should be worried about.

But that doesn't mean duplicate content can't become a problem; it can occur on any site, especially when a CMS creates several versions of the same page across numerous URLs. The same story can be told about thin content: it may not be an issue for most sites, but it can drag a site down the search results. If there is an issue, it's worth fixing – so how do we go about it?

There are a couple of fantastic web tools for finding both thin and duplicate content: the Raven Tools Site Auditor and the Content Quality section in Ahrefs. The former scans a site looking for the culprits and tells you which pages should be updated for the best results. The latter is great for finding whether a site has very similar, or identical, content across several pages and gives a colour-coded visual ranging from unique content to bad duplicates.
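
If you'd rather get a rough feel for the problem before reaching for those tools, the idea behind them is straightforward: flag pages with very little text as potentially thin, and flag pairs of pages whose wording overlaps heavily as potential duplicates. Here's a minimal sketch of that – the page texts are placeholders, and a real audit would pull the text from crawled HTML.

```python
"""Flag potentially thin and near-duplicate pages.

A minimal sketch of the idea behind content audits -- the page texts
below are placeholders; a real audit would extract them from crawled HTML.
"""
from itertools import combinations

PAGES = {
    "/red-t-shirt/": "Classic cotton t-shirt in red. Machine washable. Free returns.",
    "/blue-t-shirt/": "Classic cotton t-shirt in blue. Machine washable. Free returns.",
    "/about/": "We are a small clothing brand founded in 2015 ...",
}

THIN_WORD_COUNT = 200      # rough threshold; tune for your site
DUPLICATE_OVERLAP = 0.8    # Jaccard similarity of word sets


def words(text: str) -> set[str]:
    return set(text.lower().split())


for path, text in PAGES.items():
    if len(text.split()) < THIN_WORD_COUNT:
        print("possibly thin:", path)

for (path_a, text_a), (path_b, text_b) in combinations(PAGES.items(), 2):
    a, b = words(text_a), words(text_b)
    overlap = len(a & b) / len(a | b)
    if overlap >= DUPLICATE_OVERLAP:
        print(f"possible duplicates ({overlap:.0%}): {path_a} and {path_b}")
```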

Pages that don't have fully unique content will feature on the majority of sites, and that isn't necessarily a bad thing. Issues start to arise, however, when these pages are indexed – which makes the solution nice and simple: add a "noindex" tag. This tag tells search engines that the page should not be indexed, preventing it from showing up in search and causing problems.
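
For reference, the "noindex" instruction usually takes one of two forms – a meta tag in the page's head, or an X-Robots-Tag HTTP header, which is handy for non-HTML files such as PDFs. A tiny sketch:

```python
# Two standard ways to keep a page out of the index -- a minimal sketch.

# 1. In the page's <head> (shown here as a template fragment):
NOINDEX_META = '<meta name="robots" content="noindex">'

# 2. As an HTTP response header, useful for non-HTML files such as PDFs:
NOINDEX_HEADER = {"X-Robots-Tag": "noindex"}

print(NOINDEX_META)
print(NOINDEX_HEADER)
```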

Although the majority of pages featuring duplicate content should be given the noindex treatment, there's an alternative in canonical URLs that can be used for pages with very similar content on them. This is particularly useful for eCommerce sites that sell a number of similar products – in different colours, for example.

If the site is set up with different URLs for each colour, this will be a minefield for duplicate content – unless the canonical tag is used. It lets Google know that the "Red T-shirt" page is the main version and that the blue, green and brown pages are simply variations of it.
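
On the page itself, the canonical tag is just a link element in the head, e.g. <link rel="canonical" href="https://www.example.com/t-shirt/red/">. Here's a minimal sketch that checks whether each colour-variant URL points its canonical at the main product page – the URLs are hypothetical, and it assumes requests and beautifulsoup4 are installed.

```python
"""Check that colour-variant URLs canonicalise to the main product page.

A minimal sketch -- assumes `requests` and `beautifulsoup4`; the URLs
are hypothetical.
"""
import requests
from bs4 import BeautifulSoup

MAIN_PRODUCT = "https://www.example.com/t-shirt/red/"
VARIANTS = [
    "https://www.example.com/t-shirt/blue/",
    "https://www.example.com/t-shirt/green/",
    "https://www.example.com/t-shirt/brown/",
]

for url in VARIANTS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    target = canonical["href"] if canonical else None
    status = "ok" if target == MAIN_PRODUCT else "check this page"
    print(f"{url} -> canonical: {target} ({status})")
```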

PageSpeed

PageSpeed isn't always the easiest metric to tackle, but it's one of the very few technical SEO factors that can make a direct impact on rankings.

There are a few ways that page load times can be reduced, and we're going to start with web page size to get the ball rolling. The bottom line is easy to comprehend: images can be compressed and you can cache the living daylights out of a site, but the page will still be slow to load if it's massive. If you're seeing giant loading times, it might be time to rethink the page structure.

Another great way to improve PageSpeed is to remove any unnecessary third-party scripts. This is a great general rule, especially with the average third-party script adding around 34ms to the load time, but some third-party scripts are worth keeping. Google Analytics, for example, probably isn't worth scrapping, but others that you've collected over the years might be – it's always worth checking.
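
A quick way to take stock is to list where a page's scripts are loaded from. The sketch below fetches a page, reports the raw HTML size and counts scripts per third-party host – it assumes requests and beautifulsoup4 are installed, uses a placeholder URL, and is no substitute for a proper Lighthouse or PageSpeed Insights run.

```python
"""List third-party script hosts on a page and report the raw HTML size.

A minimal sketch, not a replacement for a full performance audit --
assumes `requests` and `beautifulsoup4`; example.com is a placeholder.
"""
from collections import Counter
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"

response = requests.get(PAGE, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")
own_host = urlparse(PAGE).netloc

hosts = Counter(
    urlparse(script["src"]).netloc
    for script in soup.find_all("script", src=True)
    if urlparse(script["src"]).netloc not in ("", own_host)
)

print(f"HTML size: {len(response.content) / 1024:.0f} KB")
for host, count in hosts.most_common():
    print(f"{count} script(s) from {host}")
```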

The final tip is to test a page's load times both with and without a CDN. It can be surprising to find that sites without a CDN sometimes load faster than those with one. If you're looking to improve PageSpeed, it's worth testing!

Get A Technical SEO Audit

There's a lot more to technical SEO than meets the eye and, if you're looking to build a successful website, the tips above are a great place to start.

But that's not the full package; there's much more to technical SEO when it comes to fine-tuning a website so it operates at its optimum level. Get in touch and book your technical SEO audit today.

Get Started Today.

Simply fill out your details below and we'll get back to you shortly!