Chapter 3

Technical Audits for SaaS Companies

With the abstract keyword research process out of the way, it’s time to look at and address your SaaS website’s technical structure. Technical issues are typically less prominent for our SaaS clients at SimpleTiger because building good software is part of their business. That said, no site is completely free of technical issues. We’ve worked with SaaS sites that didn’t have a ton of critical technical issues, but we always find something, and in most cases there are simple wins that, when addressed properly, spell immediate bumps in rankings and traffic.

We’ll want to get started by crawling the site and gathering data, so we know what’s not working properly and what needs to be done. This is a process that will happen over and over again during the course of building your site’s organic search traffic, so get ready to create a new habit.

Crawling Tools

Part of what makes the Technical Structure process of SEO so manageable is the tools you use for the job. There are a variety of tools you can use to crawl your SaaS website, but there are a few we’ve been using for years and love. Just like with my suggested Keyword Research Tools, I’ll show you my favorite crawling tools, explain why we love them, and share which ones you should use based on the size of your site and how intensive your technical audits really need to be.

First, you should certainly have Google Search Console set up and configured properly on your site. Make sure you verify the right www or non-www version of your site, and if it’s served over HTTPS (as it should be), make sure you verify that version properly too.
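
If you want to sanity-check the version question yourself, here’s a rough Python sketch, assuming the requests library and a placeholder domain, that confirms every protocol and www variant redirects to the one version you verified:

    # Minimal sketch: confirm every host/protocol variant resolves to one
    # canonical HTTPS version of the site. "example.com" is a placeholder.
    import requests

    CANONICAL = "https://www.example.com/"  # replace with your preferred version

    variants = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
        "https://www.example.com/",
    ]

    for url in variants:
        # Follow redirects and record where each variant finally lands.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        status = "OK" if resp.url == CANONICAL else "CHECK"
        print(f"{status}: {url} -> {resp.url} ({resp.status_code})")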

Next, pick one of the following tools (or even a few) and start a crawl of your site; a bare-bones crawl sketch follows this list if you’re curious what these tools do under the hood:

  • Screaming Frog - While not my go-to tool, Screaming Frog is easy to use and trustworthy. It’s great for smaller sites (the free version caps out at 500 URLs per crawl).
  • DeepCrawl - This is our de facto tool for crawling most of our SaaS client sites at SimpleTiger. DeepCrawl has a deep feature set, including functionality that lets us manage multiple client sites and crawl budgets and use advanced settings like crawling password-protected pages and ignoring robots.txt. These are all things that make life easier for an SEO agency like ours. DeepCrawl may be overkill for you, but we can personally vouch for it as a crawling tool that takes a ton of the guesswork out of identifying actual problems for someone without the experience of a seasoned SEO professional. At SimpleTiger, we have to be prepared to handle any type of client site, from 200-page sites to 2.5MM-page sites, all with various technical issues. DeepCrawl handles that every time.
  • Ahrefs - There are so many other crawling tools we could have suggested here if technical audits were all I was talking about, but we know you’re likely looking for a single software solution that solves more than just one piece of the SEO puzzle. Because we’ve already recommended Ahrefs so strongly in the Keyword Research Process (spoiler: we’ll suggest it again in the Offsite Strategy Process later in this guide), you should just get used to using it now and save yourself some money and time learning different tools. While Ahrefs’ crawler isn’t as deep or thorough as DeepCrawl’s, I’d say it’s a fantastic next option if extreme depth and power-user features aren’t as much of a concern for your technical audits.
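
If you’re curious what these tools are doing under the hood, here’s a bare-bones crawl sketch in Python (requests and BeautifulSoup assumed, with a placeholder start URL). It’s nowhere near a substitute for Screaming Frog, DeepCrawl, or Ahrefs, but it shows the core loop: fetch a page, record its status code, and queue its internal links.

    # Bare-bones same-site crawl sketch: fetch pages, record status codes,
    # follow internal links. The start URL is a placeholder; real crawlers
    # also respect robots.txt, rate limits, and far more edge cases.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "https://www.example.com/"
    MAX_PAGES = 200  # keep the sketch small

    host = urlparse(START).netloc
    seen, queue, results = {START}, deque([START]), {}

    while queue and len(results) < MAX_PAGES:
        url = queue.popleft()
        resp = requests.get(url, timeout=10)
        results[url] = resp.status_code
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # don't parse PDFs, images, etc.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

    # Surface anything broken (4xx/5xx) first.
    for url, status in sorted(results.items()):
        if status >= 400:
            print(status, url)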

High Priority Technical Issues

Regardless of the tool you use, you’ll need to know which issues are real concerns and which issues can be addressed down the road. There’s always a balance between cost and benefit when making technical implementations for SEO purposes, but some issues can be fixed easily and have a massive impact if they’re really blocking performance. I’m going to break down the top technical issues we see in our technical audits of SaaS websites. Here’s our top list of things to look out for in order of priority:

  • 404s - This is probably the deadliest technical issue your site can have: the Page Not Found error. If a page is throwing an obvious 404, fix it, but in some cases you may not even know your server is returning a 404 status because the page appears to load fine and everything looks like you’d expect. Your crawling tool will list out 404s, so analyze them closely to make sure they aren’t pages you expect to load properly. Google doesn’t want to see 404s on your site because disappearing pages mean a poor user experience. The most common recommendation is either to repair the page if it’s broken or create a 301 redirect to the most relevant location.
  • Load Time - Pages with a slow load time (more than a couple of seconds) are at a real disadvantage in Google; Google has said publicly that page speed is a ranking factor. This doesn’t mean your site has to load in less than two seconds, but Google would really like it to, so we recommend you try your best. This doesn’t just affect your rankings; it affects your conversions, because users are more likely to convert on sites that load quickly and are easy to navigate. Use a tool like Google PageSpeed Insights to analyze what’s making your site load slowly (no matter how fast it already is) so you can optimize it; there’s a sketch of pulling these numbers programmatically after this list.
  • Responsive - This sounds like a load time issue, but it’s really about whether your site renders appropriately on the user’s device. For example, if a user visits on a desktop and then on a phone, does your site adapt so that the user has a great experience regardless of the device? It should. Does the URL change depending on the device? It shouldn’t. Google stated years ago that if your site doesn’t render well on mobile devices, it won’t outrank other sites that do. You don’t want a separate version of your site either (one for mobile and one for desktop). You want one site that uses responsive HTML and CSS (relative units and media queries) so it loads on every device and scales to that device’s needs. You can check whether your site is mobile friendly with Google’s Mobile-Friendly Test tool.
  • Duplicate Content - Google is getting better at handling this issue, but it’s still something to take seriously. Duplicate content is when Google finds two or more URLs on a site with the same or similar content. This makes it difficult for Google to decide which page ought to target whatever keyword they’re both targeting, and sometimes Google will simply prefer neither instead of picking one for you. Both outcomes are undesirable, so Google honors a rel="canonical" link tag in the <head> section of your pages. This tag points from different versions of a page to the original or preferred version, and it instructs Google to give all of the credit and authority to that original page. Duplicate content can apply to title tags, on-page content, and the HTML code used on your site, which DeepCrawl will break out as necessary. Mainly, though, you should look out for pages that broadly duplicate each other’s page content and titles (a quick canonical check is sketched after this list).
  • Thin Content - Pages with very little content on them are at risk of not ranking well, if they’re indexed at all. Google defines thin content as content that has little to no value to a user, typically represented by pages with hardly any content on them. In many cases, these pages should either be fleshed out further or combined with other pages so the subject is covered with more depth.
  • JavaScript blocking navigation - This can be hard to detect with the crawling tools suggested above, but if you disable JavaScript using a web developer toolbar, you can try navigating the site using the main menu and any other menus to see whether JavaScript is in the way. If you’re unable to reach parts of your site with JavaScript disabled, you need to either render those links as plain HTML or make them navigable some other way with JavaScript turned off.
  • Missing Sitemap.xml File - Your site should have an XML sitemap file. It acts as a feed to Google of the URLs on the site that you want crawled, the priority of those pages, and how often they’re updated, so Google knows when to come back. This file can greatly help get your site fully indexed and frequently recrawled, ensuring any updates are reflected in Google fast. Without this file, you’re completely trusting Google to figure everything out on its own, which isn’t ideal. In fact, Google wants site owners to provide sitemap.xml files to make its job easier, so it provides an option to submit one in your Google Search Console account.
  • Missing Robots.txt File - The robots.txt file is the first place Google visits on your site to see which pages you do not want crawled or indexed. This helps Google because there may be a lot of pages you don’t want crawled for whatever reason (think duplicate pages, session IDs, or pages that don’t serve users but are generated by your server), and Google can then reserve its crawl budget for the parts of your site you do want crawled. Within this file, you can link to your XML sitemap file to show Google the URLs you do want crawled, which is a great way to get the ball rolling on a good crawl. Be careful how you use robots.txt, though, as it’s a very powerful tool, and with a single misplaced “/” you could block Google from crawling your entire site. Before tinkering with it, learn more about how to use robots.txt first (a quick robots.txt and sitemap check is sketched after this list).
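
Following up on the load time item above: if you’d rather pull PageSpeed Insights numbers programmatically than through the web UI, here’s a rough Python sketch against the public PageSpeed Insights v5 API (the site URL is a placeholder, and regular or heavier usage typically requires an API key):

    # Sketch: fetch a mobile performance score from the PageSpeed Insights v5 API.
    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://www.example.com/", "strategy": "mobile"}

    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    lighthouse = data["lighthouseResult"]

    score = lighthouse["categories"]["performance"]["score"]  # 0.0 to 1.0
    print(f"Mobile performance score: {score * 100:.0f}/100")

    # List the audits Lighthouse scored poorly, i.e. likely optimization targets.
    for audit in lighthouse["audits"].values():
        if audit.get("score") is not None and audit["score"] < 0.9:
            print("-", audit["title"])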
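
For the duplicate content item, here’s a quick Python check (requests and BeautifulSoup assumed, placeholder URLs) that reports which canonical URL each page variant declares, so you can confirm the duplicates all point back to one preferred version:

    # Sketch: print the rel="canonical" URL each page declares.
    import requests
    from bs4 import BeautifulSoup

    pages = [
        "https://www.example.com/pricing",
        "https://www.example.com/pricing?utm_source=newsletter",
    ]

    for url in pages:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        canonical = "(no canonical tag found)"
        for link in soup.find_all("link"):
            # rel is parsed as a list of values, e.g. ["canonical"]
            if "canonical" in (link.get("rel") or []):
                canonical = link.get("href")
        print(f"{url} -> canonical: {canonical}")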
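
And for the sitemap.xml and robots.txt items, this Python sketch (placeholder domain) confirms robots.txt resolves, pulls out any Sitemap: lines it declares, and counts the URLs each sitemap lists:

    # Sketch: check robots.txt, find declared sitemaps, count their URLs.
    import xml.etree.ElementTree as ET

    import requests

    SITE = "https://www.example.com"

    robots = requests.get(f"{SITE}/robots.txt", timeout=10)
    print("robots.txt status:", robots.status_code)

    # Collect "Sitemap: https://..." lines from robots.txt.
    sitemaps = [
        line.split(":", 1)[1].strip()
        for line in robots.text.splitlines()
        if line.lower().startswith("sitemap:")
    ]
    print("Sitemaps declared:", sitemaps or "none")

    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    for sitemap_url in sitemaps:
        root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
        locs = root.findall(f".//{NS}loc")
        print(f"{sitemap_url}: {len(locs)} URLs listed")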

Moving Content & Redesigning Sites

It’s time for a major caveat when it comes to moving content around or redesigning your site even if the content and platform are staying the same.

Google develops relationships with root domains (such as simpletiger.com) on a per-domain basis, meaning each domain is seen as an entirely different entity. If you ever move from one domain to another by renaming or rebranding your company, you’ll largely be starting over with building up the new domain. This can take a very long time and isn’t ideal if you can avoid it.

As for staying on the same domain but moving content around on that domain, you’ll still create some problems that you should know about before making the move. This applies to redesigns too. On many CMSs when you move a piece of content to a different menu item or dropdown, or you move it to a different folder or subdirectory on the site, you end up changing the URL to a new URL. For example, moving the “...domain.com/about-us” content to “...domain.com/about” is a URL change. The old URL “/about-us” is now a 404 and needs to be redirected to the new location “/about” which isn’t ideal. Google goes back and forth on whether a redirect causes a loss in link authority or not, but over time, your site’s main pages could easily acquire several hops of redirects which can turn into a mess and make future redesigns or platform moves a nightmare.
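
If you’ve already racked up a few URL changes, a quick way to spot redirect chains is to trace the hops each old URL takes before it lands somewhere final. Here’s a rough Python sketch (requests assumed, placeholder URLs):

    # Sketch: trace redirect hops for old URLs; chains of 2+ hops are worth
    # collapsing into a single 301 to the final destination.
    import requests

    old_urls = [
        "https://www.example.com/about-us",
        "https://www.example.com/old-pricing",
    ]

    for url in old_urls:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = [r.url for r in resp.history]  # each intermediate response
        chain = " -> ".join(hops + [resp.url])
        flag = "CHAIN" if len(hops) > 1 else "ok"
        print(f"{flag} ({len(hops)} hop(s)): {chain}")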

The most ideal course of action is to build your site using a well-optimized URL structure that includes your target keywords in the right location. Then, you should never change them. This can be difficult to achieve, but if you can maintain this sort of structure, then SEO will be much easier for you over time.

In the previous section, we suggested moving content around based on the menu and navigational structure as well as for keyword targeting purposes. This still applies and should be done with redirects pointing from old URLs to the new ones, but when possible, plan ahead for the long term and don’t move content.

Recrawling & Maintaining Proper Technical Structure

After you’ve implemented some technical best practices according to this guide and your findings in your SEO tools, it’s time to recrawl your site. This way, you can see if your changes are live and if any new issues popped up. You’ll also want to set up the crawling tool you’ve chosen to recrawl the site based on a schedule of your choosing. If you’re moving a lot of things around and changing a lot on the site, you’ll probably want a more frequent crawl cycle. If you rarely change the site or add content to it, then a more infrequent crawl cycle should also work. Over time, issues will inevitably pop up, and your goal isn’t to have zero issues on your site but to stay on top of the most critical ones and keep them in check, so they don’t limit your performance as you’re adding content and links later.

There’s so much more you can do to optimize your site than just address the issues we've listed here, but it’s easy to get stuck in the weeds and never make it into the world of high-impact SEO efforts, including content production and link building.

Let’s shift gears, and I’ll show you what you can build on a site with a strong technical foundation.
