Technical SEO

There are a few technical SEO components that impact organic search for your site and aren't related to words (i.e., on-page SEO factors).

These include:

  • Website architecture
  • Crawling
  • Rendering
  • Indexing

Let’s go over the basics.

(Note, the team at Hire a Writer is not staffed by technical SEO experts - for that, we refer everyone to Narrative SEO, a baller agency with mad technical SEO skills.)

Website architecture

Website architecture is just what it sounds like: is your website designed and built the right way?

Structured data

How well is your site organized? Structured data makes it easy for search engines to figure out what your site is all about. It's obvious: a fixed format that search engines understand. If you go too wacky and wild with either design or site mapping, you risk losing this technical SEO point.


Schema isn't strictly limited to technical SEO: it's a way of organizing associated groups, with associated properties, into types, which are then arranged in a hierarchy. Schema markup uses semantic vocabulary and tagging to coordinate information on your site, possibly improving search engine rankings. It really underlines the "technical" part of technical SEO, and has to be added into your HTML by a coder. Moz has a good article on schema structured data you can check out.
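To make that concrete, here's a hypothetical JSON-LD schema snippet for a blog article (the headline, name, and date are made up for illustration). A developer drops a block like this into the page's HTML so search engines can read it:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Basics",
  "author": { "@type": "Person", "name": "Jane Writer" },
  "datePublished": "2022-03-01"
}
</script>
```

The `@type` and property names come from the schema.org vocabulary; Google's Rich Results Test can validate a snippet like this before it ships.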

XML Sitemap

Your XML sitemap is a roadmap that all search engines can follow on your site. It uses neat categories, packaging related content nicely and keeping track of images, modifications, etc. If all of your internal links are in order and you happen to be Type A, you may not need this. But you might.
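For reference, a minimal XML sitemap looks something like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/technical-seo</loc>
    <lastmod>2022-03-01</lastmod>
  </url>
</urlset>
```

One `<url>` entry per page, usually generated automatically by your CMS and submitted via Google Search Console.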


Navigation

Is your website hella confusing? Well, that's a problem. Not just for user experience, but for technical SEO. That's right: it's all connected. You have to make sure your navigation 1) makes sense, 2) is easy to use, 3) makes sense. Yes. Twice.


Crawling

You probably know that search engines crawl a website, hopping from link to link and interpreting what's on your site. Crawlability is a vital metric in technical SEO. There are a few components to it, most of which need to be handled by a web developer.

Robots.txt File

The robots.txt file tells search engine crawlers which parts of your site they can and can't access. It's a tool with some risks (block the wrong path and you can hide whole sections of your site from Google), so be careful. You can look at the Yoast article on robots.txt if you want to learn more.
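As a sketch, a simple robots.txt might look like this (the paths here are hypothetical; have a developer review yours before it goes live):

```text
# Example robots.txt (illustrative paths only)
User-agent: *
Disallow: /admin/
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers, and the `Sitemap` line points them at your XML sitemap.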

Kind of related, but not the same thing, is the meta robots tag, which lets the search engine crawl a page but can signal noindex or nofollow. Again, ask a developer.
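A meta robots tag sits in the page's `<head>` rather than in a site-wide file. For example, to let crawlers fetch a page but keep it out of the index and stop link-following:

```html
<!-- Crawlable, but not indexed, and its links are not followed -->
<meta name="robots" content="noindex, nofollow">
```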

Dead Links

Dead links are a housekeeping issue for maintaining a website. They slow sites down and throw 404 errors and the like. Search engines don't like these errors, because they're a sign of bad health. Even hidden dead links are bad. If you have an old site, or inherit a big ole site from a client to work on, this is a high priority to take care of.
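A dead-link audit can be scripted. Here's a minimal Python sketch (the page HTML is made up for illustration): it pulls the links out of a page, and a small helper classifies HTTP status codes so that 4xx/5xx responses count as dead. In a real audit you'd fetch each link and feed its status code to `is_dead`.

```python
# Minimal dead-link audit sketch (illustrative only).
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list[str]:
    """Return every href found in the given HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def is_dead(status_code: int) -> bool:
    """Treat client and server errors (404, 410, 500, ...) as dead."""
    return status_code >= 400


page = '<a href="/about">About</a> <a href="/old-post">Old</a>'
print(extract_links(page))
```

Tools like Screaming Frog do this (and much more) for you, but the idea is the same: collect links, check status codes, fix or remove anything that errors.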


Rendering

Fetch and render: in case you're like, "jigga what?", rendering is the process of Googlebot getting your pages, or retrieving them. It runs all of your site's page code, analyzing the content to make sense of your site.

The process goes:

Original HTML → Rendered HTML

The original HTML is the server response, linking to your CSS, JavaScript, images, and everything else used to build the page.

Why does it matter? Mostly because if there are issues on either side, or changes happen in the code, your entire webpage or even site is at risk. Google can’t index your site if it can’t render your site.

Search Engine Journal covers this nicely if you want to read more.


Indexing

Indexing is one of those common, used-all-the-time terms in the world of SEO. It just means that the pages on your website are added to Google's search index. Maybe (hopefully) obviously, your pages can't carry a "noindex" meta tag if you want them indexed.

For the technical component of SEO, it's about making sure your site *IS* indexable. If you log in to Google Search Console (GSC), you'll immediately see if there are indexing issues on your site (look at the "Coverage Report"). You can also check with Screaming Frog.

These should obviously be fixed. Your site may not be indexing at all, certain pages may not be indexing, you may not be getting posts or pages indexed fast enough. A myriad of issues could impact this, and it’s 100% something you have to get to the bottom of if you want your site to be firing on all cylinders. 

Duplicate Content

We get asked this all of the time, maybe because we’re copywriters: “can I post this blog on a few different sites?” No, you can’t. It’s called duplicate content and it will be a problem for you. Google will probably rank the site with the highest street cred (domain authority) and deprioritize the others. It looks like you’re gaming the system.

A tactic in the world of black hat SEO is to use "spun content." Basically, you "spin" a piece of content by rewriting a hefty portion of it, then reposting it in a bunch of places. Also a no go. Well, that's actually more of a moral or integrity issue (because it's the wrong thing to do), but if there's enough similarity, it looks suspicious to Google and could cost you rankings.

Technical SEO for Everyone

Want an article that does a way better, more thorough job than this of explaining all things technical SEO? Backlinko is the way to go, IMO. Open a second Google tab to search all the terms as you go.
