Technical SEO is the infrastructure layer that determines whether search engines can find, render, understand, and rank your content. Without it, even the best content and strongest backlink profile will underperform. In 250+ projects across Australia and New Zealand, I have seen the same pattern repeatedly: businesses invest heavily in content and link building while sitting on a broken technical foundation. The content never reaches its potential because Google cannot efficiently crawl and index it.
This is not another generic checklist of tick boxes. Drawing on 15 years as a VETASSESS accredited Marketing Specialist and Australian Marketing Institute member, I have structured this guide around the technical SEO areas that actually move rankings, based on real audit findings from my consulting practice. Each section explains what to check, why it matters, and how to prioritise fixes when your development resources are limited.
What Is Technical SEO
Technical SEO refers to the process of optimising your website's infrastructure so that search engines can crawl, render, index, and rank your pages effectively. It sits alongside on-page SEO and off-page SEO as one of the three core disciplines, but it is foundational - the other two depend on it.
Think of it this way: on-page SEO is about making your content relevant. Off-page SEO is about building authority. Technical SEO is about removing the barriers that prevent Google from seeing and understanding your work in the first place.
Technical website optimisation covers everything from how fast your server responds to how your JavaScript renders, from whether Googlebot can access your pages to whether your structured data validates correctly.
In my experience, roughly 60 to 70 per cent of the Australian websites I audit have at least one critical technical issue that is actively suppressing their organic performance. The most common offenders are crawlability problems, indexation gaps, and poor Core Web Vitals - exactly the areas this guide prioritises.
When Technical SEO Should Come First
I recommend a technical audit before any content or link building investment. Here is why: if Google cannot crawl your site efficiently or your pages are not in the index, no amount of keyword research or digital PR will compensate. Across my client portfolio, businesses that address technical issues before scaling content see results 40 to 60 per cent faster than those who try to do everything simultaneously.
Crawlability: Helping Search Engines Find Your Content
Crawlability is the first gate. If Googlebot cannot access your pages, nothing else matters. This section covers the infrastructure that controls how search engines discover and navigate your site.
Robots.txt Configuration
Your robots.txt file is the first thing Googlebot reads when it visits your domain. It tells crawlers which areas of your site they can and cannot access.
What to check:
- Verify robots.txt exists at yourdomain.com/robots.txt
- Confirm no critical pages or directories are accidentally blocked
- Ensure your XML sitemap URL is referenced in the file
- Check that CSS and JavaScript files are not blocked (Google needs these to render pages)
- Validate syntax - a misplaced wildcard can block your entire site
Practitioner note: One of the most common issues I encounter in Australian site audits is leftover staging environment rules in production robots.txt files. After a site launch or migration, development teams often forget to update the file, leaving Disallow: / directives that block the entire site from crawling. I check this on every single audit, and I find it at least once a quarter.
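To make the above concrete, here is a minimal, healthy production robots.txt sketch. The paths are illustrative only - adjust them to your own CMS and site structure:

```text
# Allow all crawlers by default
User-agent: *
# Block low-value areas (example paths - substitute your own)
Disallow: /search/
Disallow: /cart/

# Never block CSS or JavaScript - Google needs them to render pages

# Reference the sitemap so crawlers can find it
Sitemap: https://yourdomain.com/sitemap.xml
```

Compare this against a leftover staging file containing a bare `Disallow: /` under `User-agent: *` - that single line blocks the entire site from crawling.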
XML Sitemaps
Your XML sitemap is a roadmap for search engines, listing the pages you want crawled and indexed.
What to check:
- Sitemap exists and is properly formatted (valid XML)
- Submitted to Google Search Console
- Includes only indexable, canonical URLs (no noindex pages, no redirects, no 404s)
- Updated automatically when content changes
- Stays under 50,000 URLs and 50MB per file (use sitemap index files for larger sites)
- Last modified dates are accurate (do not fake these - Google uses them to prioritise crawling)
Crawl Budget Management
Crawl budget is the number of pages Googlebot will crawl on your site within a given timeframe. For most small to mid-sized Australian websites (under 10,000 pages), crawl budget is rarely a bottleneck. But for larger sites - eCommerce stores, publishers, directories - it becomes critical.
What to check:
- Use Google Search Console's crawl stats report to monitor crawl rate and response times
- Identify and eliminate soft 404 pages (pages that return 200 status but show error content)
- Reduce redirect chains (each hop wastes crawl budget)
- Block low-value pages from crawling (faceted navigation, internal search results, tag archives)
- Monitor crawl errors and fix them systematically
When crawl budget actually matters: If your site has more than 10,000 pages, if you are adding hundreds of new pages frequently, or if your server response time regularly exceeds 500 milliseconds, crawl budget management should be a priority. For a 50-page brochure site, it is not something to lose sleep over.
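One common way to keep low-value faceted and search URLs out of the crawl queue is a handful of pattern rules in robots.txt. The parameter names below are illustrative - match them to your own URL patterns before deploying:

```text
User-agent: *
# Block faceted navigation and internal search parameters (names illustrative)
Disallow: /*?colour=
Disallow: /*?sort=
Disallow: /search?
```

Be careful with wildcard rules: test them in Google Search Console's robots.txt report before relying on them, as an over-broad pattern can block legitimate pages.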
URL Structure
Clean, logical URLs help both users and search engines understand your content hierarchy.
What to check:
- URLs are descriptive and human-readable (e.g., /blog/technical-seo-checklist not /page?id=4827)
- Hyphens used as word separators (not underscores)
- All lowercase
- No unnecessary parameters or session IDs
- URL depth reflects site hierarchy (keep important pages within three levels of the homepage)
- Consistent structure across the site
Indexation: Getting Your Pages Into Google's Index
Crawlability gets Google to your door. Indexation determines whether your pages actually enter Google's search index and become eligible to rank. These are two distinct processes, and issues in either one can suppress your visibility.
A healthy benchmark: more than 90 per cent of your published, intended-to-rank pages should be indexed. If you are below that threshold, something needs investigation.
Canonical Tags
Canonical tags tell Google which version of a page is the "primary" one when duplicate or near-duplicate content exists.
What to check:
- Every indexable page has a self-referencing canonical tag
- Canonical tags point to the correct preferred URL (check http vs https, www vs non-www, trailing slashes)
- Paginated pages canonicalise correctly (do not point all pages to page 1 unless you intend to)
- No conflicting signals (canonical says one thing, hreflang says another)
- Canonical tags are in the <head> section, not the <body> (a surprisingly common error in JavaScript-rendered sites)
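A correct self-referencing canonical looks like this (URL is a placeholder - it should match your preferred protocol, subdomain, and trailing-slash convention exactly):

```html
<head>
  <link rel="canonical" href="https://yourdomain.com/blog/technical-seo-checklist" />
</head>
```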
Meta Robots and Noindex Directives
Meta robots tags and X-Robots-Tag HTTP headers give you granular control over what gets indexed.
What to check:
- Noindex tags are applied only to pages you genuinely do not want in the index
- No accidental noindex on critical pages (check after deployments - I have seen this break entire sections)
- Nofollow is used appropriately (not on your own internal links)
- X-Robots-Tag headers align with meta robots directives (conflicting signals confuse crawlers)
- Check that noindex pages are not included in your XML sitemap (contradictory signals)
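For pages you genuinely want excluded, the standard page-level directive is:

```html
<!-- In the <head>: keeps the page out of the index but lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

The HTTP-header equivalent - useful for PDFs and other non-HTML files - is `X-Robots-Tag: noindex`. Whichever you use, make sure the two never contradict each other on the same URL.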
Index Coverage in Google Search Console
Google Search Console's indexing report is your primary diagnostic tool for understanding how Google sees your site.
What to check:
- Review "Not indexed" pages and categorise by reason
- Prioritise "Discovered - currently not indexed" (Google knows about these pages but has chosen not to index them - often a quality or crawl budget signal)
- Address "Crawled - currently not indexed" (Google fetched the page but deemed it unworthy of indexing - typically a content quality issue)
- Monitor "Page with redirect" and "Duplicate without user-selected canonical" errors
- Set up alerts for sudden drops in indexed page count
Practitioner insight: In 15 years of auditing websites, the "Discovered - currently not indexed" status is the one that most commonly correlates with real ranking issues. It often indicates that Google is struggling with crawl budget allocation or that it perceives the page content as low value. For Australian eCommerce sites with thousands of product pages, I typically find 20 to 40 per cent of products sitting in this status. The fix is usually a combination of improving internal linking to those pages and strengthening the content on them.
Duplicate Content Resolution
Duplicate content confuses search engines about which page to rank and dilutes your ranking signals.
What to check:
- Run a crawl with Screaming Frog or Sitebulb to identify duplicate title tags and content
- Check for www/non-www and http/https duplicates (consolidate with redirects)
- Identify thin or near-duplicate pages (particularly common in eCommerce with colour/size variants)
- Ensure URL parameters do not create duplicate versions of pages
- For Australian and New Zealand businesses operating .com.au and .co.nz domains, verify that cross-domain duplication is handled with hreflang or canonical tags
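For the .com.au/.co.nz scenario, a hreflang cluster on both versions of the page might look like this (URLs are illustrative):

```html
<!-- Include the same cluster on both the .com.au and .co.nz versions -->
<link rel="alternate" hreflang="en-AU" href="https://yourdomain.com.au/services/" />
<link rel="alternate" hreflang="en-NZ" href="https://yourdomain.co.nz/services/" />
<link rel="alternate" hreflang="x-default" href="https://yourdomain.com.au/services/" />
```

Hreflang must be reciprocal - each page in the cluster must reference all the others, including itself.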
Core Web Vitals and Page Performance
Core Web Vitals are Google's standardised metrics for measuring real-world user experience. Since March 2024, the three metrics have been Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS). These are confirmed ranking signals, and poor scores measurably impact both rankings and conversion rates.
I have written a comprehensive Core Web Vitals guide that covers each metric in depth. Here I will focus on what to check during a technical audit and the thresholds that matter.
Largest Contentful Paint (LCP)
LCP measures how long it takes for the largest visible content element to load. Target: under 2.5 seconds.
What to check:
- Identify the LCP element on key pages (usually a hero image or heading text)
- Check server response time (Time to First Byte should be under 200 milliseconds)
- Ensure images are optimised: use WebP or AVIF formats, serve responsive sizes, implement lazy loading for below-the-fold images only
- Verify that render-blocking CSS and JavaScript are minimised
- Check CDN configuration for Australian users (hosting location matters - a server in the US adds 150 to 300 milliseconds of latency for Australian visitors)
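A typical hero-image optimisation, sketched in HTML (file names and dimensions are placeholders):

```html
<!-- Preload the LCP image so the browser fetches it early; never lazy-load the LCP element -->
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">

<img src="/images/hero.webp"
     srcset="/images/hero-480.webp 480w, /images/hero-1200.webp 1200w"
     sizes="100vw" width="1200" height="600" alt="Hero banner">
```

The explicit width and height also help CLS, covered below.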
Interaction to Next Paint (INP)
INP replaced First Input Delay (FID) in March 2024. Unlike FID, which only measured the delay of the first interaction, INP evaluates the responsiveness of all user interactions throughout the entire session. Target: under 200 milliseconds.
What to check:
- Test across key pages, not just the homepage
- Identify heavy JavaScript execution that blocks the main thread
- Audit third-party scripts (analytics, chat widgets, ad platforms) - these are the most common INP offenders
- Check event handler performance for interactive elements (dropdowns, modals, filters)
- Use Chrome DevTools Performance panel or the Web Vitals extension for field and lab data
Practitioner note: Since INP replaced FID, I have seen a noticeable shift in which Australian sites pass Core Web Vitals. Sites that rely heavily on third-party scripts - particularly those with live chat widgets, tag managers loading 15+ scripts, and client-side filtering - frequently fail INP even when they previously passed FID. The fix is almost always script auditing and deferral, not a complete rebuild.
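The simplest deferral pattern is to stop third-party scripts from blocking the main thread during load (the script URL below is hypothetical):

```html
<!-- defer: download in parallel, execute only after HTML parsing completes -->
<script src="https://widget.example-chat.com/loader.js" defer></script>
```

For heavier widgets, a stricter approach is to load the script only after the first user interaction or during idle time via your tag manager - but test that approach carefully, as it changes when the widget becomes available.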
Cumulative Layout Shift (CLS)
CLS measures visual stability - how much the page layout shifts during loading. Target: under 0.1.
What to check:
- Set explicit dimensions (width and height) on all images and videos
- Reserve space for ad slots and dynamically loaded content
- Avoid inserting content above existing content after the page loads
- Use font-display: swap or font-display: optional to prevent layout shifts from web font loading
- Test on slow connections (3G simulation) where CLS issues are most visible
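The two highest-impact CLS fixes from the list above, sketched in HTML and CSS (file names are placeholders):

```html
<!-- Explicit dimensions let the browser reserve space before the image loads -->
<img src="/images/product.jpg" width="800" height="600" alt="Product photo">

<style>
  /* swap shows fallback text immediately, avoiding invisible text and late shifts */
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brandfont.woff2") format("woff2");
    font-display: swap;
  }
</style>
```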
Server Response and TTFB
Time to First Byte (TTFB) measures how quickly your server responds to requests. While not a Core Web Vital itself, poor TTFB directly impacts LCP.
What to check:
- Target TTFB under 200 milliseconds for Australian users
- Verify hosting location (Australian hosting or a CDN with Australian edge nodes)
- Check caching configuration (browser caching, server-side caching, CDN caching)
- Ensure Gzip or Brotli compression is enabled
- Monitor database query times if using a CMS like WordPress
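As a rough sketch of the compression and caching items, here is what they might look like in an nginx configuration. Directives vary by stack (Apache, LiteSpeed, and managed hosts differ), so treat this as illustrative:

```nginx
# Enable compression for text-based assets
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Long-lived browser caching for versioned static assets
location ~* \.(css|js|woff2|webp)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
```

The long `max-age` is only safe if your build pipeline fingerprints asset file names on each deploy.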
Schema Markup and Structured Data
Schema markup helps search engines understand the meaning and context of your content. It does not directly boost rankings, but it enables rich results (star ratings, FAQ dropdowns, breadcrumbs, how-to steps) that improve click-through rates.
I cover structured data implementation in detail in my schema markup guide. For this audit checklist, here are the key items.
Essential Schema Types
What to check:
- Organisation or LocalBusiness schema on the homepage
- Article or BlogPosting schema on blog content
- BreadcrumbList schema for navigation paths
- Product schema on eCommerce product pages (with price, availability, and review data)
- FAQ schema on pages with question-and-answer content (note the 2025 deprecation caveat below)
- Person schema for author pages (important for E-E-A-T signals)
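A minimal LocalBusiness example in JSON-LD - every value below is a placeholder, and the markup must match what is visible on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Marketing Consultancy",
  "url": "https://yourdomain.com.au",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "addressCountry": "AU"
  }
}
</script>
```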
Testing and Validation
What to check:
- Validate all schema using Google's Rich Results Test
- Check for errors and warnings in Google Search Console's Enhancements reports
- Ensure schema data matches visible on-page content (Google penalises mismatches)
- Test that JSON-LD is in the <head> or <body> and loads correctly
What Google Deprecated in 2025
This is critical context for 2026 audits. In mid-2025, Google deprecated several structured data types:
- HowTo schema no longer generates rich results in Google Search
- FAQPage schema was restricted to authoritative government and health sites for most queries
- Several older review-based markup types were retired
- Support is being removed from Search Console and its API from January 2026
What this means for your audit: If you previously relied on FAQ or HowTo rich results for click-through rate improvements, you need to adjust your strategy. The schema itself is not harmful to leave in place, but it will no longer generate visual enhancements in search results for most sites. Focus schema efforts on Article, Product, Organisation, and BreadcrumbList types where Google continues to support rich results.
JavaScript SEO
JavaScript powers modern web experiences, but it creates unique challenges for search engine crawling and indexing. Google can render JavaScript, but the process is resource-intensive and introduces delays.
Rendering and Googlebot
Googlebot uses a two-phase process: first it crawls the raw HTML, then it queues the page for rendering (executing JavaScript to see the final content). This rendering queue can introduce delays of hours to days.
What to check:
- View your pages with JavaScript disabled - is the critical content visible in the raw HTML?
- Compare the page source (View Source) with the rendered output (Inspect Element) to identify JavaScript-dependent content
- Check that Google can render your pages correctly using the URL Inspection tool in Search Console
- Ensure critical content (headings, body text, internal links) is not solely dependent on JavaScript execution
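The raw-versus-rendered comparison above can be partially automated. Here is a minimal Python sketch that diffs headings and link targets between the two HTML sources - it assumes you have already captured the raw HTML (e.g., via curl) and the rendered DOM (e.g., via a headless browser) as strings, and it is a rough heuristic rather than a full parity audit:

```python
from html.parser import HTMLParser


class ContentExtractor(HTMLParser):
    """Collects heading text and link targets from an HTML document."""

    def __init__(self):
        super().__init__()
        self._in_heading = False
        self.headings = set()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = False

    def handle_data(self, data):
        if self._in_heading and data.strip():
            self.headings.add(data.strip())


def js_dependent_content(raw_html: str, rendered_html: str) -> dict:
    """Return headings and links that appear only after JavaScript execution."""
    raw, rendered = ContentExtractor(), ContentExtractor()
    raw.feed(raw_html)
    rendered.feed(rendered_html)
    return {
        "headings": rendered.headings - raw.headings,
        "links": rendered.links - raw.links,
    }
```

Anything the function reports is content Google can only see after the page reaches the rendering queue - a candidate for moving into the initial server response.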
Key update for 2026: In December 2025, Google updated its JavaScript SEO documentation with two significant clarifications. First, pages returning non-200 HTTP status codes (404, 5xx) may be excluded from the rendering pipeline entirely, which means JavaScript-generated error handling must be paired with correct server-side status codes. Second, dynamic rendering is now officially a deprecated workaround.
Server-Side Rendering vs Client-Side
What to check:
- If using a JavaScript framework (React, Vue, Angular, Next.js, Nuxt), verify your rendering strategy
- Server-side rendering (SSR) or static site generation (SSG) is strongly preferred for SEO-critical content
- For eCommerce Product structured data, Google recommends placing markup in the initial HTML rather than generating it via JavaScript
- If using client-side rendering, verify that all critical content reaches the rendered DOM within a reasonable timeframe
Dynamic Rendering Deprecation
Dynamic rendering - serving pre-rendered HTML to search engines while serving JavaScript to users - was a common workaround. Google has officially deprecated this approach.
What to check:
- If you are currently using dynamic rendering (Rendertron, Prerender.io), plan a migration to SSR or SSG
- This is not an emergency (existing implementations still work), but it is a technical debt item that should be on your roadmap
- New projects should use SSR or SSG from the start
Mobile Optimisation and Mobile-First Indexing
Google completed the rollout of mobile-first indexing in July 2024. This means Google now uses the mobile version of your site for both mobile and desktop indexing and ranking. If your mobile experience is degraded, your desktop rankings suffer too.
In Australia, mobile accounts for over 60 per cent of all web traffic and an even higher percentage of search queries. For local businesses, that figure often exceeds 75 per cent.
What to check:
- Responsive design implemented and functioning across major breakpoints (320px, 375px, 768px, 1024px)
- Content parity between mobile and desktop (same text, images, structured data, and internal links)
- Tap targets are at least 48 by 48 CSS pixels with adequate spacing
- Font sizes are readable without zooming (minimum 16px for body text)
- No horizontal scrolling required at any viewport width
- Mobile page speed is optimised (test on real 4G connections, not just lab conditions)
- Forms are usable on mobile (appropriate input types, visible labels)
Site Architecture and Internal Linking
Site architecture determines how link equity flows through your site and how efficiently search engines can discover your content. A well-structured site makes every page easier to find, crawl, and rank.
Flat Architecture
What to check:
- Important pages are reachable within three clicks from the homepage
- Logical hierarchy: Homepage → Category → Subcategory → Individual pages
- No critical pages buried more than four levels deep
- Navigation menus surface your most important pages
Breadcrumbs and Navigation
What to check:
- Breadcrumb navigation implemented on all pages (except the homepage)
- Breadcrumbs reflect the actual site hierarchy
- BreadcrumbList schema markup matches the visible breadcrumbs
- Primary navigation is crawlable (not hidden behind JavaScript-only interactions)
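The matching BreadcrumbList markup for a typical blog post path looks like this (URLs and names are placeholders - they must mirror the visible breadcrumb trail):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://yourdomain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://yourdomain.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Checklist" }
  ]
}
</script>
```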
Orphan Pages
Orphan pages are pages with no internal links pointing to them. Search engines struggle to discover and prioritise orphan pages.
What to check:
- Run a crawl comparison: compare your sitemap URLs against internally linked URLs
- Any page in your sitemap that has zero internal links is effectively orphaned
- Fix by adding relevant contextual internal links from related pages
- For large sites, cross-reference crawl data with Google Search Console to find pages that are indexed but receive no internal link equity
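The crawl comparison above is straightforward to script. This Python sketch assumes you have the sitemap XML as a string and a set of internally linked URLs exported from your crawler (e.g., Screaming Frog's inlinks report):

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def orphan_pages(sitemap_xml: str, internally_linked: set) -> set:
    """Return URLs listed in the sitemap that receive no internal links."""
    root = ET.fromstring(sitemap_xml)
    sitemap_urls = {
        loc.text.strip()
        for loc in root.iter(f"{SITEMAP_NS}loc")
        if loc.text
    }
    return sitemap_urls - internally_linked
```

Normalise both URL sets the same way (trailing slashes, protocol, case) before comparing, or you will get false positives.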
HTTPS and Security
HTTPS has been a confirmed Google ranking signal since 2014. In 2026, it is table stakes - not a competitive advantage, but a baseline requirement.
What to check:
- HTTPS implemented site-wide with a valid SSL/TLS certificate
- HTTP URLs redirect to HTTPS via 301 redirects
- No mixed content (HTTP resources loading on HTTPS pages)
- HSTS (HTTP Strict Transport Security) header implemented
- Certificate expiry is monitored and auto-renewal is configured
- Security headers configured: Content-Security-Policy, X-Content-Type-Options, X-Frame-Options
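As a rough nginx sketch of the redirect and header items (domain is a placeholder; Apache and managed platforms use different syntax):

```nginx
# Redirect all HTTP traffic to HTTPS with a permanent 301
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    return 301 https://yourdomain.com$request_uri;
}

# Inside the HTTPS server block: HSTS and baseline security headers
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-Frame-Options "SAMEORIGIN" always;
```

Roll out HSTS cautiously - once browsers cache the header, any HTTPS misconfiguration locks users out until `max-age` expires, so start with a short max-age and increase it after verifying the setup.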
How to Prioritise Your Technical SEO Fixes
After running through this checklist, you will likely have a list of issues ranging from critical to minor. Here is the prioritisation framework I use across my consulting engagements:
Priority 1 - Fix immediately (blocking rankings):
- Accidental noindex on important pages
- Robots.txt blocking critical content
- HTTPS/SSL errors
- Severe indexation gaps (less than 70 per cent of pages indexed)
Priority 2 - Fix within 2 weeks (suppressing performance):
- Core Web Vitals failures (LCP, INP, or CLS above thresholds)
- Canonical tag errors
- Crawl errors on important pages
- Missing or broken XML sitemaps
Priority 3 - Fix within 1 to 2 months (optimisation):
- Schema markup implementation or fixes
- Site architecture improvements
- Orphan page resolution
- JavaScript rendering optimisation
- Internal linking enhancements
Priority 4 - Ongoing maintenance:
- Monthly Core Web Vitals monitoring
- Regular crawl error checks (fortnightly)
- Quarterly full technical audits
- Post-deployment checks after every site update
Practitioner insight: I tell every client the same thing - do not try to fix everything at once. A focused approach where you resolve Priority 1 items first, verify the fixes are working in Google Search Console, and then move to the next tier produces faster results than scattering development resources across 50 issues simultaneously. In my experience, resolving Priority 1 and 2 items alone typically accounts for 70 to 80 per cent of the ranking improvement from a technical SEO audit.
Tools for Technical SEO Audits
You do not need every tool on the market. Here is what I use across my practice:
- Google Search Console - free, essential. Index coverage, Core Web Vitals field data, crawl stats
- Screaming Frog SEO Spider - desktop crawler. The free version handles up to 500 URLs, which covers most small business sites
- PageSpeed Insights - free. Core Web Vitals lab and field data for individual URLs
- Ahrefs or SEMrush - paid. Site audit features, backlink analysis, and technical issue detection
- Chrome DevTools - free. Performance profiling, JavaScript debugging, network analysis
For a deeper look at the tools I use and recommend, see my technical SEO service page.
Frequently Asked Questions
How often should I run a technical SEO audit?
I recommend a comprehensive technical audit quarterly, with monthly spot checks on key metrics. At minimum, check Google Search Console weekly for new crawl errors, indexation drops, or Core Web Vitals regressions - this takes 15 minutes and catches problems before they compound. After any major site update, deployment, or CMS migration, run a full crawl immediately.
What is the most important technical SEO factor in 2026?
Indexation. If your pages are not in Google's index, they cannot rank for anything. I consistently see indexation issues as the highest-impact finding in my audits. After indexation, Core Web Vitals and crawlability round out the top three. Schema markup and JavaScript rendering are important but secondary to these fundamentals.
Can I handle technical SEO without developer access?
Partially. You can diagnose most issues with Google Search Console and a crawler like Screaming Frog. You can fix some items through CMS settings, plugins, or tag manager. But resolving server-side issues, implementing SSR, fixing Core Web Vitals at the code level, and deploying schema markup properly almost always requires developer involvement. If your development team treats SEO fixes as low priority, that is an organisational problem worth addressing directly with stakeholders.
How long does it take to fix technical SEO issues?
It depends on the severity and your development capacity. Priority 1 issues (accidental noindex, robots.txt blocks) can often be fixed in hours. Core Web Vitals improvements typically take 2 to 6 weeks of development work. Site architecture overhauls can take 2 to 3 months. The key variable is not the fix itself - it is getting development time allocated. Across my Australian client portfolio, the average time from audit delivery to full Priority 1 and 2 resolution is 4 to 8 weeks.
Does technical SEO directly affect search rankings?
Yes. Core Web Vitals are a confirmed ranking signal. HTTPS is a confirmed ranking signal. Mobile-friendliness is a confirmed ranking signal. Beyond confirmed signals, crawlability and indexation are prerequisites for ranking at all. A page that Google cannot crawl or has not indexed has zero ranking potential regardless of its content quality or backlink profile.
What is crawl budget and does my site need to worry about it?
Crawl budget is the number of URLs Googlebot will crawl on your site within a given period. For most Australian business websites with fewer than 10,000 pages, crawl budget is not a practical concern. It becomes important for large eCommerce stores, marketplaces, publishers, or any site with tens of thousands of pages. If your site is smaller, focus your energy on the other items in this checklist first.
Should I still use FAQ and HowTo schema markup in 2026?
Google deprecated FAQ and HowTo rich results for most websites in 2025. The schema markup itself is not harmful, and it still helps search engines understand your content structure. However, it will no longer generate the visual enhancements (expandable FAQ dropdowns, step-by-step how-to cards) in search results for most sites. I recommend keeping existing markup in place but not investing significant effort in adding it to new pages unless you are a government or authoritative health site.
What is the difference between technical SEO and on-page SEO?
Technical SEO focuses on website infrastructure - crawlability, indexation, page speed, rendering, and server configuration. On-page SEO focuses on content-level optimisation - title tags, headings, keyword usage, internal linking within content, and content quality. Both are essential, but technical SEO should be addressed first because it determines whether your on-page work can even be seen by search engines.
