Guide to Technical SEO

Technical SEO is a crucial part of your overall search engine optimization efforts. The goal of technical SEO is to optimize your website according to search engines’ requirements and guidelines. It is a complex field with many factors to keep track of, so what exactly matters most? In this guide, we’ll walk you through what you need to know to get started!

Audun Braastad

Subject Matter Expert · 10 min read

What is technical SEO?

In short, technical SEO is the work done to ensure that your website is read and categorized (“indexed” in professional terms) by search engines in a correct and efficient way. Technical SEO involves optimizations of the website’s technical aspects that make it easier for search engines to access the important information you present and better understand what your content is about. This way, they will also index the content efficiently and present it in the search results when it is relevant.

In short, you use the technical aspects of your site to show Google and other search engines what your page is about and that it meets a solid technical standard. Both of these factors help search engines recognize your website as a high-quality site and rank it higher in the search results for the right queries. By making well-considered decisions and technical optimizations, you also ensure that your site is user-friendly for visitors. Strong technical SEO is a win-win!

Why work with technical SEO?

As already mentioned, technical SEO is crucial for streamlining how search engines read and index the content on your website. But why is it so important? Because it has major ripple effects that influence the traffic to your site and, ultimately, your revenue.

For a website to achieve optimal organic visibility and attract the ideal target audience, it first needs to be recognized by the search engine. If your content is not registered in Google’s index (and with other search engines), you will never reach the top of the search results. To become visible, you must first ensure that the search engines register that your content exists. Then they need to be able to understand what it is about and categorize it accordingly. When they identify aspects of the website that they consider valuable, they will rank it higher. Working with technical SEO is therefore an essential first step in the pursuit of strong rankings. Higher rankings mean that more people are exposed to your content, and it is proven that the top search results receive the most traffic.

Working with technical SEO is also about making your website user-friendly for visitors. By, for example, ensuring fast loading of design and images, you improve the user experience. This also has a major impact on whether people want to use your site and, consequently, how many sales you generate.

With increased traffic and a high-performing website (this is where site structure and also on-page SEO come into play) you will be able to appeal to more customers and increase your revenue!

How we work with technical SEO

Everyone working with websites should monitor how their site is performing from a technical standpoint. Still, there are certain situations where it makes sense to run a thorough check. This is often referred to as a technical SEO audit. We carry out such a check every time we onboard a new SEO client.

Imagine that Dag, the digital manager, is about to launch a new website for his online store. When launching the new site, it is crucial that he reviews it thoroughly. If the technical aspects are not properly in place, he will lose the visibility he is carrying over from the old online store. How can Dag avoid this? That’s exactly what we’ll walk through here!

Indexing

One thing that is important for Dag to keep track of is how Google indexes his website. Google’s index can be seen as Google’s own library, where they collect all the web pages they crawl and are aware of. In the index, these pages are categorized based on how Google interprets the content. As mentioned earlier, it is essential that the pages you want to achieve strong visibility with are included in Google’s library. If Google is not aware of Dag’s optimized pages, they will never be able to reach top positions in the search results!

Incorrect indexing – noindex tag

When Dag launches his new site, it is important to monitor Google’s index to ensure that the key pages are being crawled and indexed. Equally important is keeping an eye on pages that should not be indexed. This is best done in Google Search Console. Sometimes, when creating new subpages or categorizing pages on a website, auto-generated pages can appear that Dag definitely does not want Google to see. If, for example, empty auto-generated pages appear, it is important that he deletes or de-indexes them.

Incorrectly indexed pages can be removed from Google’s index by using a so-called noindex tag. By hiding the irrelevant and incorrect pages from Google, he prevents the search engine from spending a lot of time crawling pages that will undermine the pages he has actually worked on. Want to learn more about how indexing and the noindex tag work? Check out our explanation here.
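As an illustration, a page is kept out of the index with a robots meta tag in the page’s `<head>`:

```html
<!-- Placed in the <head> of the page that should be kept out of the index -->
<meta name="robots" content="noindex">
```

Note that the page must still be crawlable for Google to see the tag; once Google re-crawls the page, it will drop it from the index.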

It’s not only at the time of launch that Dag needs to keep an eye on this; it will be an ongoing task he has to stay on top of. Google scans websites regularly, so by monitoring over time, Dag will be in control if any issues suddenly appear.

Duplicate content

When Dag checks the index in Google Search Console, it is also useful to look for duplicate pages. It is important to avoid having multiple pages on the same website that cover exactly the same topic, and especially important that Dag does not have identical pages on his site. If two pages are identical, how is Google supposed to know which one to rank in the search results? The answer is often that both are indexed, and the two identical pages end up competing against each other for visibility. Instead of one of Dag’s pages achieving a top ranking, he becomes his own competitor. By deleting duplicate content when he sees it in the index, he avoids that mistake.

Canonical tag

One way to avoid duplicate content and internal competition now and in the future is to implement canonical tags. This can be used for all types of pages, but is especially useful for ecommerce sites. If Dag has filtered views in his online store, for example the option to sort products alphabetically, then canonical tags are essential. Otherwise, you often end up with two URLs showing the same products, just sorted differently. In that case, Google doesn’t understand that these are two versions of the same page. With a canonical tag in the HTML code, you show search engines which version should be prioritized. Everything you need to know about using canonical tags can be found here.
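In practice, the canonical tag is a single line in the `<head>` of the filtered or sorted page, pointing at the version you want search engines to prioritize (the domain here is a made-up example):

```html
<!-- On the alphabetically sorted variant of the product listing,
     pointing to the preferred, unfiltered URL -->
<link rel="canonical" href="https://www.example.com/products/">
```

With this in place, the sorted URL can still be visited by users, but its ranking signals are consolidated onto the canonical version.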

Sitemap

Another thing Dag should do in Google Search Console is upload a sitemap for his new online store. A sitemap is a file, usually in XML format, that lists the URLs and other important files on the website and shows how these pages are connected. By giving Google this kind of map of the site structure, the search engine can index the website much more efficiently. Read more about what a sitemap entails here!
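A minimal XML sitemap might look like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```

The file is typically placed at the root of the domain (e.g. /sitemap.xml) and then submitted under “Sitemaps” in Google Search Console.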

Robots.txt file

Robots.txt files are another tool Dag can use to ensure that Google indexes the new website correctly. By adding a robots.txt file in the code, you can provide guidance to Google in the indexing process. This allows you to automatically exclude the auto-generated categories by adding a directive stating that URLs which, for example, contain /author/ (these often appear when publishing blog posts with different authors) should not be added to Google’s index. Read more about how robots.txt files work here!
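A sketch of such a robots.txt file could look like this (the sitemap URL is a placeholder). One caveat worth knowing: robots.txt controls crawling rather than indexing directly, so for pages that must be kept out of the index entirely, the noindex tag is the safer tool.

```text
# robots.txt — placed at the root of the domain,
# e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /author/

# Optionally point crawlers to the sitemap as well
Sitemap: https://www.example.com/sitemap.xml
```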

“Crawl budget”

Having covered indexing, sitemaps and the robots.txt file in more detail, it should be clear that how Google reads your website is well worth working on. This is because Google only spends a certain amount of time scanning (or “crawling”, as it’s called) and indexing pages per website. How much time is allocated per site is one of Google’s many secrets. But be aware that it is limited, and that the search engine stops scanning when the time is up.

If Dag’s online store is large, with countless products and categories, not everything will be registered by Google at once. Without steering the crawl through tools like the noindex tag, robots.txt, and a sitemap, he will have zero control over whether the pages he has actually optimized get indexed. Instead, a lot of pages that are not relevant for organic visibility may end up in Google’s library. In that case, he can wait a long time before seeing high rankings for his key landing pages.

Page speed

It’s not just what the search engines find when they crawl your website that affects how your site is ranked. The speed of your website also influences how Google evaluates it. Pages that are fast, well-structured, and easy to navigate will be valued more by search engines than comparable pages that are slow and confusing. When Dag works on his new website, he will be able to take various actions to ensure good performance.

Page loading

Page loading speed also covers factors such as how much different elements move around during loading, and how long it takes before the user can interact with the website. These aspects are part of what is called “Core Web Vitals” – which you can read more about here. When working to improve these aspects, you need to make sure, among other things, that unnecessary scripts and other code are not being loaded. Here, different CMS platforms have their own strengths and weaknesses.
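As a small illustration (the file paths are made up), two common tweaks are loading scripts without blocking rendering and preloading critical assets:

```html
<!-- defer (or async) keeps the script from blocking page rendering -->
<script src="/js/app.js" defer></script>

<!-- Preloading a critical font can reduce layout shift during loading -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
```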

Image size

One of the simplest improvements you can make to increase your website speed is to ensure that you use compressed images that are not larger than necessary. This applies both to the number of pixels and the number of kilobytes. Many CMS solutions have built-in options for image sizes and compression. There is no definitive answer to how large images should be, but as a rule of thumb, smaller images should be under 100 kB and large images should be under 300–400 kB.
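Alongside compression, the markup itself can help (file names and dimensions below are placeholders): explicit width and height reserve space so the layout doesn’t shift, and lazy loading defers images below the fold.

```html
<!-- srcset lets the browser pick the smallest image that fits the screen;
     width/height prevent layout shift; loading="lazy" defers offscreen images -->
<img src="/images/product-800.jpg"
     srcset="/images/product-400.jpg 400w, /images/product-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="600"
     alt="Product photo"
     loading="lazy">
```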

Mobile friendliness

More and more people primarily use their mobile phones when searching for products or services. That’s why it is important to have a site that is optimized for mobile. It is crucial that images, text, and design adapt to smaller screens without compromising the user experience. To attract customers to his new online store, Dag must ensure that it is mobile-friendly.

But here the focus is on technical SEO, which fortunately goes hand in hand with user-friendliness in most areas. Google’s algorithms are designed so that sites that are not mobile-friendly will be penalized with lower rankings. Since Google wants to present relevant search results, they will not recognize sites that are not mobile-friendly as relevant for mobile users. That’s why it’s essential to have this in place!

To test whether the search engine recognizes the domain as mobile-friendly, Dag uses Google’s own “mobile-friendly test.” You can find the test tool here.

SSL certificate

Who would want to shop on Dag’s website if it appears unsafe? By showing users that the site is encrypted with an SSL certificate, you build trust with visitors. It is also a factor that search engines use when evaluating websites. Google has increased its focus on ensuring that the pages in its index are secure. With an SSL certificate, you ensure that communication on the website is encrypted. Read more about what an SSL certificate entails here.
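Once the certificate is installed, it is good practice to send all traffic to the HTTPS version. A minimal sketch of how this could look on an nginx server (the domain is a placeholder, and the exact setup depends on your hosting):

```nginx
# Redirect all plain-HTTP requests to the HTTPS version of the site
server {
    listen 80;
    server_name www.example.com;
    return 301 https://www.example.com$request_uri;
}
```

The permanent (301) redirect also tells search engines that the HTTPS URL is the one to index.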

Meta title and meta description on all pages

Another element included in our technical SEO work is reviewing the metadata associated with the site. We use the Screaming Frog tool to check the status of meta titles and descriptions on the various landing pages of a website.

Meta title

The meta title is added in the CMS on the website and is the title that appears in Google’s search results. The meta title should preferably be short and concise. The limit is around 60 characters. It is important that the meta title describes what the specific landing page is about. Search engines use the meta title to understand what the page is about, which is why it is an element that must be in place to ensure good technical SEO. Read about the requirements for the meta title here.
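In the HTML, the meta title ends up as the `<title>` element in the page’s `<head>` (the text below is a made-up example kept within roughly 60 characters):

```html
<title>Running Shoes for Men – Free Shipping | Dag's Store</title>
```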

Meta description

The meta description appears below the title in the search results and is the text that should make users interested in your page. It should expand on the description of the landing page and invite users in. The length of a meta description is a maximum of around 160 characters. Here you can learn more about what a meta description is.
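Like the title, the meta description lives in the page’s `<head>` (the text below is a made-up example within the ~160-character guideline):

```html
<meta name="description"
      content="Browse our selection of running shoes for men. Free shipping and 30-day returns. Find your perfect pair today!">
```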

Different technical SEO tools

Google Search Console

As mentioned earlier, we use Google Search Console extensively in our technical SEO work to get an overview of which URLs on a website are included in Google’s index and what types of indexing errors have been registered. But it can also be used for many other purposes.

Search Console is a tool launched by Google that gives anyone working with a website an overview of site traffic and the website’s organic performance. You also receive notifications from Google about issues and can monitor how Google indexes the site. Read more about this versatile tool here!

Screaming Frog

Another tool that is often used to map a website is Screaming Frog. Despite the strange name, it is very useful. Screaming Frog crawls websites in much the same way as Google’s own bots. Dag can simply crawl his website once it has been launched, and Screaming Frog will give him an overview of his landing pages. The free version allows him to scan 500 random pages on the site, which provides a solid overview. Among the results, Dag can check image file sizes, which pages are indexable, and the status of headings and metadata.

Learn more about the opportunities you have with Screaming Frog!

Webpagetest

Previously, we described the importance of having a fast website. To check the status of your site, you can use the Webpagetest tool. In Webpagetest, Dag can search for his domain and adjust the location he wants to test from. It is useful to see how the site performs on desktop and mobile, and how speed varies when you change location. You can test the loading time of different landing pages and identify what is affecting performance. Read more about the test here.

Audun Braastad

Subject Matter Expert

Kamilla Krane

Commercial Manager

Let’s have a chat

Tell us about your project, and we’ll have a no-obligation chat about how we can help.