SEO Validation in Monorepos: Turborepo and Nx Patterns
How to run SEO checks across monorepo apps with Turborepo or Nx. Pipeline configuration, per-app thresholds, and caching strategies.
You pushed a refactor of the @repo/ui package on Friday. Monday morning, organic traffic to the marketing site dropped 40%, and the documentation site fell out of the top 10 for three core keywords. The culprit: someone conditionally rendered the <Head> component in the shared layout, stripping the <title> and <meta name="description"> tags from 44 pages across two distinct Next.js apps. Unit tests passed. End-to-end tests verified the buttons clicked. But no one validated the <head> payload. In a monorepo, shared code means shared risk. A single bad commit in a UI library propagates instantly, nuking the SEO of five different applications simultaneously. You need a pipeline step that fails the build when SEO regressions occur, isolated per app but orchestrated globally.
Why do monorepos require isolated SEO validation?
Monorepos share UI components across multiple applications, meaning a single metadata regression in a shared package propagates instantly to all consuming apps.
When you share a Layout component or a utility function that generates Open Graph tags, static analysis tools like ESLint lose context. ESLint cannot execute the Next.js App Router's deep metadata merging logic across workspace boundaries. It cannot know that apps/marketing requires a strict 60-character title limit, while apps/internal-dashboard explicitly requires <meta name="robots" content="noindex">.
You must validate the compiled HTML output of each application independently, applying different rule sets to different apps.
# Running Indxel against a specific monorepo app catches compiled regressions
npx indxel check apps/marketing/.next/server/app
apps/marketing/app/pricing/page.tsx
1:1 error Missing <meta name="description"> description-presence
1:1 error Title exceeds 60 characters (72) title-length
1:1 error og:image returns HTTP 404 og-image-status
✖ 3 critical errors found. Build failed.

The CLI outputs warnings in the exact same format as ESLint — one line per issue, with the file path and rule ID. This maps directly to GitHub Actions annotations.
How do you configure Turborepo for SEO checks?
Define an seo-check task in your turbo.json pipeline that depends on the application build step to ensure you validate the final HTML output.
Validating SEO against a running development server in CI is slow and flaky. Validating static source code misses dynamically generated metadata. The correct approach is to run Indxel against the build artifacts. In a Next.js App Router project, this means parsing the generated .html files in .next/server/app.
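The principle is easy to sketch, without assuming anything about Indxel's internals: read each compiled HTML document and flag missing head tags. The auditHead function below is a hypothetical illustration of that idea, not part of the CLI.

```typescript
// Illustrative sketch: validate compiled HTML output, not source code.
// auditHead is a hypothetical helper, not part of the Indxel CLI.
function auditHead(html: string): string[] {
  const errors: string[] = [];
  // A non-empty <title> must survive compilation.
  if (!/<title>[^<]+<\/title>/i.test(html)) {
    errors.push('missing <title>');
  }
  // The description meta tag is the other critical head payload.
  if (!/<meta[^>]*name=["']description["']/i.test(html)) {
    errors.push('missing <meta name="description">');
  }
  return errors;
}

// A page whose shared layout conditionally dropped <Head> fails both checks.
console.log(auditHead('<html><head></head><body></body></html>'));
```

A real validator does far more (canonical resolution, og:image status checks), but the core contract is the same: the input is the build artifact, not the component source.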
Add the validation step to turbo.json:
{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": [".next/**", "!.next/cache/**"]
    },
    "seo-check": {
      "dependsOn": ["build"],
      "inputs": [".next/server/app/**/*.html", "seo.config.ts"],
      "outputs": []
    }
  }
}

Notice the inputs array. By targeting the compiled HTML and the local config file, Turborepo will cache the SEO validation result. If the underlying HTML hasn't changed, Turborepo skips the seo-check task entirely, dropping the step's pipeline time to zero.
To run the checks across your workspace:
# Run validation only for apps that changed
npx turbo run seo-check --filter=...[origin/main]

How do you enforce per-app SEO thresholds?
Use app-specific seo.config.ts files to set different validation rules, allowing strict 95/100 thresholds for marketing sites while disabling checks entirely for internal tools.
A monorepo holds applications with completely different business requirements. A B2B marketing site requires perfect JSON-LD schema and optimized Open Graph images. An authenticated user dashboard just needs a title tag and a noindex directive.
Drop an seo.config.ts at the root of each application directory. Indxel automatically resolves the closest configuration file.
// apps/marketing/seo.config.ts
import { defineConfig } from '@indxel/cli';

export default defineConfig({
  thresholds: {
    score: 95,
    critical: 0
  },
  rules: {
    'title-length': ['error', { min: 40, max: 60 }],
    'description-presence': 'error',
    'og-image-status': 'error',
    'canonical-resolution': 'error',
    'json-ld-validity': 'error',
    'noindex-directive': 'off'
  }
});

For the internal dashboard, invert the rules:
// apps/dashboard/seo.config.ts
import { defineConfig } from '@indxel/cli';

export default defineConfig({
  thresholds: {
    score: 0, // We don't care about the overall score
    critical: 0
  },
  rules: {
    // Fail the build if this app ever accidentally gets indexed
    'noindex-directive': ['error', { require: true }],
    'title-presence': 'warning',
    'description-presence': 'off',
    'og-image-status': 'off'
  }
});

Indxel enforces 15 specific rules covering everything from canonical URL resolution to og:image HTTP 200 checks. You configure them per app, ensuring developers aren't blocked by irrelevant SEO warnings on internal tools.
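One pattern that keeps these per-app files from drifting is a shared base ruleset that each app spreads and overrides. This is a minimal sketch of the idea, assuming a hypothetical packages/seo-config workspace package; the rule names mirror the ones used above.

```typescript
// Hypothetical shared base ruleset, e.g. packages/seo-config/index.ts.
// Each app spreads it into its own seo.config.ts and overrides only
// the rules that differ for that app.
const baseRules = {
  'title-length': ['error', { min: 40, max: 60 }],
  'description-presence': 'error',
  'og-image-status': 'error',
};

// What apps/dashboard would do: relax the checks that don't apply
// to an authenticated internal tool.
const dashboardRules = {
  ...baseRules,
  'description-presence': 'off',
  'og-image-status': 'off',
};

console.log(dashboardRules);
```

The spread runs at config-load time, so each app still presents Indxel with a single flat rule map; only the authoring is shared.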
What is the Nx configuration pattern for SEO validation?
Add a validate-seo target to your nx.json targetDefaults to automatically distribute and parallelize SEO checks across all affected apps in your workspace.
Nx wins over Turborepo for monorepos exceeding 20 applications because its task graph analysis strictly isolates validation to apps that actually consume the changed UI package. If you modify a Button component used only by the admin app, Nx will not run the marketing app build or SEO validation.
Update your nx.json:
{
  "targetDefaults": {
    "validate-seo": {
      "dependsOn": ["build"],
      "inputs": ["default", "^default", "{workspaceRoot}/indxel.config.ts"],
      "cache": true
    }
  }
}

Then, in each app's project.json:
{
  "name": "marketing",
  "targets": {
    "build": {
      "executor": "@nx/next:build",
      "outputs": ["{options.outputPath}"]
    },
    "validate-seo": {
      "executor": "nx:run-commands",
      "options": {
        "command": "indxel check dist/apps/marketing/.next/server/app"
      }
    }
  }
}

Run the affected command in CI:
npx nx affected -t validate-seo

Nx executes the checks in parallel across all affected apps. Because Indxel is a compiled Go binary under the hood, it does not suffer from Node.js single-thread bottlenecks. Running 5 apps in parallel will saturate your CI runner's CPU cores, completing the validation in seconds.
If you use Next.js App Router, beware of metadata object merging. If apps/marketing/app/layout.tsx defines an openGraph object, and a child route defines a new openGraph.title, Next.js shallow merges by default. The child route will lose the inherited openGraph.images array unless explicitly redefined. Indxel catches this compilation artifact; static analysis tools do not.
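The merge behavior is easy to reproduce with plain objects, since the App Router merges exported metadata shallowly, key by key. This is a sketch of the semantics, not Next.js's actual implementation:

```typescript
// Sketch of App Router metadata merging: each top-level key from the
// child route replaces the parent's key wholesale (shallow merge).
type Meta = { openGraph?: { title?: string; images?: string[] } };

const layoutMeta: Meta = {
  openGraph: { title: 'Acme', images: ['/og-default.png'] },
};
const pageMeta: Meta = {
  openGraph: { title: 'Pricing' }, // redefines openGraph, omits images
};

// Shallow merge: the page's openGraph object wins in full.
const merged: Meta = { ...layoutMeta, ...pageMeta };

console.log(merged.openGraph?.title);  // 'Pricing'
console.log(merged.openGraph?.images); // undefined: inherited images are lost
```

Source-level linting sees two individually valid metadata exports; only the merged, compiled output reveals the missing og:image.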
What is the real-world impact on CI build times?
Running Indxel locally against static Next.js build artifacts validates 500 pages across 4 apps in under 12 seconds, adding negligible overhead to your CI pipeline.
Developers skip SEO validation because traditional tools require spinning up headless browsers (Puppeteer/Playwright) or waiting for staging deployments. A standard Cypress test suite checking metadata across 50 pages takes 4 minutes.
Indxel parses the static HTML AST directly.
| Validation Method | Execution Time (500 pages) | Scope | Fails CI Build? |
|---|---|---|---|
| Indxel CLI (--diff) | < 1s | PR-affected routes only | Yes |
| Indxel CLI (Full) | 12s | Entire application | Yes |
| Playwright / Cypress | ~8-12 mins | Sampled URLs only | Yes |
| External Crawlers | Post-deployment | Entire application | No |
How do you integrate Indxel into GitHub Actions?
Run npx indxel check --ci --diff inside your workflow after the build step to annotate pull requests with specific line-level metadata errors.
The --diff flag is critical for monorepos. When a developer adds a new blog post to the docs app, they shouldn't wait for Indxel to re-validate 400 existing pages. The --diff flag compares the current branch against origin/main, identifies the specific routes modified, and restricts validation to those URLs. This drops execution time from 12 seconds to 800 milliseconds.
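The route-mapping step behind a diff mode can be sketched as a pure function over the paths git reports as changed, assuming App Router file conventions. The routeFromChangedFile helper here is hypothetical, not Indxel's actual logic:

```typescript
// Hypothetical sketch: map a changed file path (from `git diff --name-only`)
// to the route it compiles to, using App Router conventions:
// app/<segments>/page.tsx -> /<segments>; non-route files return null.
function routeFromChangedFile(file: string): string | null {
  const match = file.match(/app\/(?:(.+)\/)?page\.tsx$/);
  if (!match) return null;
  return '/' + (match[1] ?? '');
}

console.log(routeFromChangedFile('apps/docs/app/blog/hello/page.tsx')); // '/blog/hello'
console.log(routeFromChangedFile('packages/ui/src/Button.tsx'));        // null
```

Changes to shared packages (the Button.tsx case) map to no single route, which is exactly why those need the full per-app artifact validation instead.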
name: CI

on:
  pull_request:
    branches: ["main"]

jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0 # Required for --diff to work

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Build applications
        run: npx turbo run build --filter=...[origin/main]

      - name: Validate SEO
        run: npx turbo run seo-check --filter=...[origin/main]
        env:
          INDXEL_TOKEN: ${{ secrets.INDXEL_TOKEN }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Enables inline PR annotations

When a failure occurs, Indxel uses the GITHUB_TOKEN to post the error directly on the offending line in the pull request's "Files changed" tab. The developer sees "Missing canonical URL" exactly where they modified the metadata export.
Frequently Asked Questions
How do I handle shared canonical URL logic in a monorepo?
Configure your base URL via environment variables injected at build time, and write an Indxel rule to verify the domain matches the deployment target. Hardcoding https://example.com in a shared @repo/ui package breaks staging environments. Pass NEXT_PUBLIC_SITE_URL to your apps, use it in the metadata base, and configure Indxel's canonical-resolution rule to strictly expect that environment variable in the output.
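A minimal sketch of that pattern, assuming NEXT_PUBLIC_SITE_URL is injected per deployment target; the canonicalFor helper is illustrative, not a shipped API:

```typescript
// Illustrative helper: build canonical URLs from an injected base URL
// instead of hardcoding a domain in a shared package.
function canonicalFor(
  path: string,
  siteUrl: string | undefined = process.env.NEXT_PUBLIC_SITE_URL
): string {
  if (!siteUrl) {
    throw new Error('NEXT_PUBLIC_SITE_URL must be set at build time');
  }
  return new URL(path, siteUrl).toString();
}

console.log(canonicalFor('/pricing', 'https://staging.example.com'));
// 'https://staging.example.com/pricing'
```

Because the base comes from the environment, the same shared code emits staging canonicals on staging and production canonicals on production, and the validator can assert the domain matches the deployment target.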
Should I run SEO checks against preview URLs or build artifacts?
Check build artifacts (.next/server/app) for speed and deterministic caching. Validating against preview URLs introduces network latency and requires waiting for Vercel or Netlify to finish deploying. Artifact validation happens locally in the CI runner immediately after the build completes, allowing you to fail the pipeline before deployment even starts.
Why do my shared UI components fail validation in one app but pass in another?
Different apps provide different props to shared layouts. If @repo/ui/Layout expects a description prop, apps/marketing might pass it correctly while apps/docs forgets it. Enforce strict TypeScript interfaces for metadata props in your shared packages, but always rely on Indxel to validate the final compiled output at the app level to catch runtime omissions.
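The interface side of that advice can be sketched like this, assuming a hypothetical SeoProps contract in the shared package. TypeScript catches the omission at compile time; output-level validation catches whatever still slips through to the compiled HTML:

```typescript
// Hypothetical strict metadata contract for a shared layout component.
// Making description required means TypeScript flags any app that omits it.
interface SeoProps {
  title: string;
  description: string; // required, not optional
}

// Illustrative head builder a shared Layout might use internally.
function buildHead({ title, description }: SeoProps): string {
  return [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}">`,
  ].join('\n');
}

// apps/marketing passes both props; calling buildHead({ title: 'Pricing' })
// alone would be a compile-time type error.
console.log(buildHead({ title: 'Pricing', description: 'Plans and pricing.' }));
```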
To try this on your own monorepo, install the CLI and scaffold a starter configuration:

npm install -g @indxel/cli
npx indxel init