The Beginner's Guide to Technical Search Engine Optimization

Since this is a beginner's guide, let's start with the basics.

What is technical SEO?

Technical SEO is the process of optimizing your website to help search engines like Google find, crawl, understand, and index your pages. The goal is to be found and to improve rankings.

How complicated is technical SEO?

It depends. The fundamentals aren't really difficult to master, but technical SEO can get complex and hard to understand. I'll keep things as simple as possible in this guide.

This chapter explains how you can ensure that search engines can crawl your content efficiently.

How crawling works

Crawlers grab the content of pages and use the links on those pages to find more pages. That's how they discover content across the web. There are a few systems in this process worth talking about.

URL sources

A crawler has to start somewhere. Generally, it uses all the URLs it finds via links on pages it has already crawled. A secondary system for finding more URLs is sitemaps, which are created by users or various systems that keep lists of pages.

Crawl queue

Any URLs that need to be crawled or re-crawled are prioritized and added to the crawl queue. This is basically an ordered list of URLs that Google wants to crawl.

Crawler

The system that fetches the content of pages.

Processing systems

These are various systems that handle canonicalization (which we'll talk about in a minute), send pages to the renderer (which loads the page the way a browser would), and process pages to find more URLs to crawl.

Renderer

The renderer loads a page the way a browser would, including the JavaScript and CSS files. This is done so that Google can see what most users will see.

Index

These are the saved pages that Google shows users.

Crawl controls

There are several ways that you can control what is crawled on your website. Here are some options.

Robots.txt

A robots.txt file tells search engines where they can and can't go on your website.
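As an illustration, here's a minimal robots.txt. It lives at the root of your domain (e.g., example.com/robots.txt), and the paths shown here are placeholders:

    # Block all crawlers from the admin area, allow everything else
    User-agent: *
    Disallow: /admin/

    # Point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml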

Just a quick note: Google may index pages that it can't crawl if links point to those pages. This can be confusing, but if you want to keep pages from being indexed, read this guide and flowchart, which can walk you through the process.

Crawl rate

There is a crawl-delay directive you can use in robots.txt that many crawlers support. It lets you set how often a crawler can crawl pages. Unfortunately, Google doesn't respect it. For Google, you'll need to change the crawl rate in Google Search Console as described here.
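For crawlers that do support it, a hypothetical rule looks like this; most crawlers that honor the directive treat the value as the number of seconds to wait between requests:

    User-agent: bingbot
    Crawl-delay: 10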

Access restrictions

If you want pages to be accessible to some users but not to search engines, you probably want one of these three options:

  • Some kind of login system;
  • HTTP authentication (a password is required for access; see the sketch below);
  • IP whitelisting (which only allows specific IP addresses to access the pages)

This type of setup is best for internal networks, member-only content, or staging, testing, or development sites. It allows a group of users to access the pages, but search engines can't access them and won't index the pages.
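As a sketch of the HTTP authentication option, assuming an Apache server (the file paths are hypothetical, and the .htpasswd file would be created with the htpasswd utility):

    # .htaccess — require a username/password for everything in this directory
    AuthType Basic
    AuthName "Restricted area"
    AuthUserFile /home/user/.htpasswd
    Require valid-user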

How to see crawl activity

For Google, the easiest way to see what they're crawling is the Crawl stats report in Google Search Console, which gives you more information about how they're crawling your website.

If you want to see all the crawl activity on your website, you'll need to access your server logs and possibly use a tool to analyze the data. This can get fairly advanced, but if your hosting has a control panel like cPanel, you should have access to raw logs and some aggregators like AWStats and Webalizer.

Crawl adjustments

Each website has a different crawl budget, which is a combination of how often Google wants to crawl a site and how much crawling your site allows. More popular pages and pages that change frequently will be crawled more often, while pages that don't seem popular or well linked will be crawled less often.

Typically, if crawlers begin to see signs of stress while crawling your site, they will slow down or even stop crawling until conditions improve.

After the pages are crawled, they are rendered and sent to the index. The index is the main list of pages that can be returned for search queries. Let's talk about the index.


This chapter explains how to make sure your pages get indexed and how to check how they're indexed.

Robots directives

A robots meta tag is an HTML snippet that tells search engines how to crawl or index a particular page. It's placed in the <head> section of a webpage and looks like this:
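    <meta name="robots" content="noindex" />

This particular example tells search engines not to index the page; other common values include nofollow, which tells them not to follow the links on the page.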

Canonicalization

When there are multiple versions of the same page, Google will select one to store in its index. This process is called canonicalization, and the URL selected as the canonical will be the one Google shows in search results. There are many different signals they use to select the canonical URL, including canonical tags, duplicate pages, internal links, redirects, and sitemap URLs.
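The best known of these signals is the canonical tag itself, an HTML snippet placed in the <head> of a page (the URL here is a placeholder):

    <link rel="canonical" href="https://www.example.com/sample-page/" />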

The easiest way to see how Google has indexed a page is to use the URL Inspection tool in Google Search Console. It will show you the Google-selected canonical URL.


One of the hardest things for SEOs is prioritization. There are many best practices out there, but some changes will affect your rankings and traffic more than others. Here are some of the projects I would recommend prioritizing.

Check indexing

Make sure pages you want people to find can be indexed in Google. The previous two chapters were all about crawling and indexing, and that wasn't a coincidence.

You can check the Indexability report in Site Audit to find pages that can't be indexed and the reasons why. It's free in Ahrefs Webmaster Tools.

Reclaim lost links

Websites tend to change their URLs over the years. In many cases, these legacy URLs have links from other websites. If they're not redirected to the current pages, those links are lost and no longer count towards your pages. It's not too late to set up these redirects, and you can quickly reclaim any lost value. Think of this as the fastest link building you'll ever do.

Site Explorer -> yourdomain.com -> Pages -> Best by links -> add a "404 not found" HTTP response filter. I usually sort this by referring domains.

This is what it looks like for 1800flowers.com.

Looking at the first URL on archive.org, I can see this was previously the Mother's Day page. Redirecting that one page to the current version would reclaim 225 links from 59 different websites, and there are plenty more opportunities like it.

You want to redirect old URLs to their current locations to reclaim that lost value.
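For example, on an Apache server you could add a 301 redirect in the .htaccess file; the paths here are hypothetical:

    Redirect 301 /old-mothers-day-page/ https://www.example.com/mothers-day/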

Add internal links

Internal links are links from one page on your website to another page on your website. They help your pages get found and help the pages rank better. We have a tool within Site Audit called Link opportunities that you can use to find these opportunities quickly.

Add schema markup

Schema markup is code that helps search engines understand your content better and powers many features that can help your website stand out in search results. Google has a search gallery that shows the various search features and the schema needed for your site to be eligible for them.
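As a minimal sketch, schema markup is usually added as JSON-LD (the format Google recommends) in the page's HTML; every value below is a placeholder:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The Beginner's Guide to Technical SEO",
      "author": {
        "@type": "Person",
        "name": "Author Name"
      }
    }
    </script>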


The projects we talk about in this chapter are all good things to focus on, but they may involve more work and less reward than the quick-win projects from the previous chapter. That doesn't mean you shouldn't do them; this is just to help you get an idea of how to prioritize different projects.

Page experience signals

These are less important factors, but still things that you want to look at for the benefit of your users. They cover aspects of the website that affect the user experience (UX).

Core Web Vitals

Core Web Vitals are the speed metrics that are part of Google's Page Experience signals used to measure user experience. The metrics measure visual load with Largest Contentful Paint (LCP), visual stability with Cumulative Layout Shift (CLS), and interactivity with First Input Delay (FID).

HTTPS

HTTPS protects the communication between your browser and the server from being intercepted and tampered with by attackers. This provides confidentiality, integrity, and authentication for the vast majority of today's web traffic. You want your pages to load over HTTPS and not HTTP.

Any website showing a lock icon in the address bar is using HTTPS.
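To make sure visitors actually end up on the HTTPS version, a common approach is a server-level redirect. A sketch for an Apache server, assuming mod_rewrite is enabled:

    # Redirect all HTTP requests to the HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]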

Mobile friendliness

In simple terms, this checks that webpages display properly and are easy for people on mobile devices to use.

How do you know how mobile friendly your website is? Check the Mobile Usability report in the Google Search Console.

This report will tell you if any of your pages have mobile usability issues.

Safe browsing

These are checks to make sure pages aren't deceptive, don't contain malware, and don't have any malicious downloads.

Interstitials

Interstitials block content from being seen. They're pop-ups that cover the main content, which users may have to interact with before they go away.

Hreflang – for multiple languages

Hreflang is an HTML attribute used to specify the language and geographical targeting of a webpage. If you have multiple versions of the same page in different languages, you can use the hreflang tag to tell search engines like Google about these variations. This helps them serve the correct version to their users.
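For example, a page with English and German versions might include these tags in its <head>; the URLs are placeholders, and x-default marks the fallback for users who match neither language:

    <link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
    <link rel="alternate" hreflang="de" href="https://example.com/de/page/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />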

General maintenance / website health

These tasks are unlikely to have a big impact on your rankings, but they are generally good things to fix for user experience.

Broken links

Broken links are links on your website that point to resources that do not exist. These can either be internal (i.e. to other pages of your domain) or external (i.e. to pages of other domains).

Site Audit allows you to quickly find broken links on your website in the link report. It's free in the Ahrefs Webmaster Tools.

Redirect chains

Redirect chains are a series of redirects that happen between the initial URL and the destination URL.

With Site Audit in the Redirects Report, you can quickly find redirect chains on your website. It's free in the Ahrefs Webmaster Tools.
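To illustrate with hypothetical Apache redirects: instead of leaving a chain in place, point each old URL straight at the final destination:

    # Chain: /a redirects to /b, which redirects to /c (two hops)
    Redirect 301 /a /b
    Redirect 301 /b /c

    # Better: both old URLs point directly at the final destination
    Redirect 301 /a /c
    Redirect 301 /b /c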


You can use these tools to improve the technical aspects of your website.

Google Search Console

Google Search Console (formerly Google Webmaster Tools) is a free service from Google that you can use to monitor the appearance of your website in search results and troubleshoot errors.

Use it to find and fix technical errors, submit sitemaps, see structured data issues, and more.

Bing and Yandex have their own versions, and so does Ahrefs. Ahrefs Webmaster Tools is a free tool you can use to improve your website's SEO performance. It lets you:

  • Monitor your website's SEO health
  • Check for 100+ SEO issues
  • View all your backlinks
  • See all the keywords you rank for
  • Find out how much traffic your pages receive
  • Find internal linking opportunities

It's our answer to the limitations of Google Search Console.

Google's mobile-friendly test

Google's mobile-friendly test checks how easily a visitor can use your page on a mobile device. It also identifies specific mobile usability issues, such as text that's too small to read, the use of incompatible plugins, and so on.

The mobile-friendly test shows what Google sees when crawling the page. You can also use the Rich Results Test to view the content that Google sees for desktop or mobile devices.

Chrome DevTools

Chrome DevTools is the webpage debugging tool built into Chrome. Use this option to troubleshoot page speed issues, improve web page rendering performance, and more.

From a technical SEO standpoint, it has endless uses.

Ahrefs toolbar

Ahrefs SEO Toolbar is a free extension for Chrome and Firefox that provides useful SEO information about the pages and websites you visit.

The free features are:

  • On-page SEO report
  • Redirect tracer with HTTP headers
  • Broken link checker
  • Link highlighter
  • SERP positions

As an Ahrefs user, you also get:

  • SEO metrics for every website and page you visit, as well as for Google search results
  • Keyword metrics, such as search volume and keyword difficulty, directly in the SERP
  • SERP results export

PageSpeed Insights

PageSpeed Insights analyzes the loading speed of your webpages. In addition to a performance score, it also shows actionable recommendations to make pages load faster.

Let's sum that up

All of this just scratches the surface of technical SEO. It should help you with the basics, and many of the sections include additional links to help you dig deeper. There are many more topics that weren't covered in this guide, so I've put together a list if you want to learn more.

  • Specific focus
  • Infrastructure related
  • Website related
  • Processes
  • Miscellaneous

Have fun exploring and learning. Message me on Twitter if you have any questions.
