
Automating i18n in Your CI/CD Pipeline: From Git Push to Live Translations

Eray Gündoğmuş

"We support 12 languages." Great. But when a developer ships a new feature, how long until users in those 12 languages actually see translated content?

For most teams, the answer involves a lot of waiting. A developer adds a string, mentions it in Slack, a PM opens a spreadsheet, an email goes to a translation vendor, the vendor delivers files a week later, a developer copies JSON, opens a PR, it gets reviewed, merged, deployed. By then, the feature has already been live for English-speaking users for two weeks.

This post is about eliminating that wait entirely. We'll build an end-to-end automated i18n pipeline where the only manual step is a developer writing code — and optionally, a human translator reviewing AI output. Everything else happens automatically.

The Manual i18n Workflow (And Why It Breaks)

Here's the typical flow:

  1. Developer adds t('checkout.shipping.title', 'Shipping Address') to source code
  2. Developer tells PM "hey, there's a new string"
  3. PM exports translation keys to a spreadsheet
  4. PM emails the spreadsheet to the translation agency
  5. Agency translates, returns the spreadsheet
  6. Developer (or PM) manually copies translations into JSON locale files
  7. Developer opens a PR with the locale files
  8. PR gets reviewed, merged
  9. App redeploys with new translations

Each handoff introduces delay. Worse, each handoff is a potential failure point: keys get missed, formatting gets corrupted in spreadsheet copy-paste, JSON syntax errors slip through, and the feedback loop for catching mistakes is measured in days, not minutes.

The automation goal: zero human steps between "developer writes t('key')" and "user sees translated content."

The Automated Pipeline

Here's the architecture we're building:

git push
  └─> CI: extract new/changed/deleted keys
        └─> if changes detected: trigger translation API
              └─> AI first-pass translation
                    └─> (optional) human review queue
                          └─> quality gates: glossary, placeholders, length
                                └─> publish to CDN
                                      └─> live for all users (no redeploy needed)

No file copying. No PRs for locale files. No redeployments for translation updates. Let's build each step.

Step 1: Automated Key Extraction

The first CI job needs to figure out which translation keys changed between commits.

Using AST-based extraction

Tools like i18next-parser and @formatjs/cli do AST-based extraction: they parse your source code and pull out every static call to your translation function. Dynamically constructed keys (e.g. t(`errors.${code}`)) can't be discovered this way and need explicit annotations or a known-keys list.

# Install
npm install --save-dev i18next-parser

# Extract all keys from source
npx i18next-parser --config i18next-parser.config.js

A basic i18next-parser.config.js:

module.exports = {
  input: ['src/**/*.{ts,tsx,js,jsx,vue,svelte}'],
  output: '.i18n-extracted/$NAMESPACE.json',
  locales: ['en'],
  defaultNamespace: 'common',
  keySeparator: '.',
  namespaceSeparator: ':',
};
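To build intuition for what the parser produces, here is a toy sketch of the extraction idea. i18next-parser walks a real AST, but conceptually it finds t('key', 'default') calls and builds a key-to-default-value map; the regex below is for demonstration only and would miss anything the real tool's parser handles.

```typescript
// Toy illustration of key extraction. A real tool parses the AST;
// this regex only handles the simplest t('key', 'default') shape.
const sampleSource = `
  t('checkout.shipping.title', 'Shipping Address')
  t('checkout.shipping.cta', 'Continue to payment')
`;

function extractKeys(code: string): Record<string, string> {
  const keys: Record<string, string> = {};
  for (const m of code.matchAll(/\bt\(\s*'([^']+)'\s*,\s*'([^']*)'\s*\)/g)) {
    keys[m[1]] = m[2]; // key -> default source text
  }
  return keys;
}

console.log(extractKeys(sampleSource));
```

The output is exactly the shape of the extracted JSON file the rest of the pipeline consumes.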

Detecting changes in CI

The key insight: you don't want to re-translate everything on every push. You want to detect new, changed, and deleted keys since the last extraction.

Here's a shell script that diffs extracted keys between the current commit and the last known extraction snapshot (shown for new and deleted keys; changed default values can be diffed the same way):

#!/bin/bash
# scripts/detect-i18n-changes.sh

set -euo pipefail

SNAPSHOT_FILE=".i18n-snapshot/en.json"
EXTRACTED_FILE=".i18n-extracted/common.json"

if [ ! -f "$SNAPSHOT_FILE" ]; then
  echo "No snapshot found — treating all keys as new"
  echo "new_keys=$(jq -r 'keys | length' "$EXTRACTED_FILE")" >> "$GITHUB_OUTPUT"
  echo "deleted_keys=0" >> "$GITHUB_OUTPUT"
  echo "has_changes=true" >> "$GITHUB_OUTPUT"
  cp "$EXTRACTED_FILE" "$SNAPSHOT_FILE"
  exit 0
fi

# Find keys in extracted but not in snapshot (new keys)
NEW_KEYS=$(jq -n \
  --slurpfile current "$EXTRACTED_FILE" \
  --slurpfile snapshot "$SNAPSHOT_FILE" \
  '[$current[0] | keys[]] - [$snapshot[0] | keys[]]'
)

# Find keys in snapshot but not in extracted (deleted keys)
DELETED_KEYS=$(jq -n \
  --slurpfile current "$EXTRACTED_FILE" \
  --slurpfile snapshot "$SNAPSHOT_FILE" \
  '[$snapshot[0] | keys[]] - [$current[0] | keys[]]'
)

NEW_COUNT=$(echo "$NEW_KEYS" | jq 'length')
DELETED_COUNT=$(echo "$DELETED_KEYS" | jq 'length')

echo "New keys: $NEW_COUNT"
echo "Deleted keys: $DELETED_COUNT"

# Export for GitHub Actions
echo "new_keys=$NEW_COUNT" >> "$GITHUB_OUTPUT"
echo "deleted_keys=$DELETED_COUNT" >> "$GITHUB_OUTPUT"
echo "has_changes=$([ "$NEW_COUNT" -gt 0 ] || [ "$DELETED_COUNT" -gt 0 ] && echo true || echo false)" >> "$GITHUB_OUTPUT"

If has_changes is true, we proceed to trigger translation.
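If your CI image doesn't ship jq, the same new/deleted-key diff is a few lines of TypeScript. This is a sketch of the identical set arithmetic, not a drop-in replacement for the whole script:

```typescript
interface KeyDiff {
  added: string[];   // keys in current but not in snapshot
  deleted: string[]; // keys in snapshot but not in current
}

// Same logic as the two jq expressions above: set difference on key names.
function diffKeys(
  current: Record<string, unknown>,
  snapshot: Record<string, unknown>
): KeyDiff {
  const cur = new Set(Object.keys(current));
  const snap = new Set(Object.keys(snapshot));
  return {
    added: [...cur].filter(k => !snap.has(k)),
    deleted: [...snap].filter(k => !cur.has(k)),
  };
}
```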

Step 2: Translation Triggering

When new keys are detected, we need to notify a Translation Management System (TMS) or a translation API. This is a webhook call.

GitHub Action for translation triggering

# .github/workflows/i18n-sync.yml (partial)
- name: Trigger translation for new keys
  if: steps.detect-changes.outputs.has_changes == 'true'
  run: |
    curl -X POST "${{ secrets.TMS_WEBHOOK_URL }}" \
      -H "Authorization: Bearer ${{ secrets.TMS_API_KEY }}" \
      -H "Content-Type: application/json" \
      -d '{
        "project_id": "${{ vars.I18N_PROJECT_ID }}",
        "source_locale": "en",
        "target_locales": ["de", "fr", "es", "ja", "pt-BR"],
        "keys_file_url": "${{ steps.upload-keys.outputs.url }}"
      }'
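Webhook calls from CI can fail transiently (TMS maintenance, network blips). If you drive the sync from a Node script instead of curl, a small retry-with-backoff wrapper is cheap insurance. Everything here is illustrative, not a specific platform's API:

```typescript
// Retry an async operation with exponential backoff: 500ms, 1s, 2s, ...
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await new Promise(resolve =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt)
      );
    }
  }
  throw lastError; // all attempts exhausted
}
```

Wrap the sync call as `withRetry(() => postKeysToTms(payload))` so a single flaky request doesn't strand new keys untranslated.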

For teams using Better i18n, the platform handles this step natively — a push to your connected repository automatically detects key changes and queues them for translation without a separate webhook setup.

Step 3: AI Translation + Human Review

AI translation has become good enough that it's a viable first pass for most content. The workflow:

  1. AI translates immediately (zero wait)
  2. Translations are available to users right away with AI quality
  3. Human translators review asynchronously and approve/edit
  4. Approved translations replace AI versions via CDN update
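The lifecycle above can be modeled as a small state machine. This is a sketch of one reasonable schema, not any platform's actual data model: AI output ships immediately, review happens asynchronously, and an approval overwrites the AI value.

```typescript
type Status = 'ai_translated' | 'in_review' | 'approved';

interface Entry {
  key: string;
  locale: string;
  value: string;
  status: Status;
}

// Which transitions are legal from each state.
const allowed: Record<Status, Status[]> = {
  ai_translated: ['in_review', 'approved'],
  in_review: ['approved', 'ai_translated'], // reviewer can bounce it back
  approved: [],
};

function advance(entry: Entry, next: Status, newValue?: string): Entry {
  if (!allowed[entry.status].includes(next)) {
    throw new Error(`Invalid transition: ${entry.status} -> ${next}`);
  }
  return { ...entry, status: next, value: newValue ?? entry.value };
}
```

Because `ai_translated` entries are already served to users, the review queue never blocks delivery; approval just swaps the value behind the same key.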

Quality gates on AI output

Before AI translations are accepted, run automated checks:

// scripts/validate-translations.ts
import { readFileSync } from 'fs';

interface TranslationFile {
  [key: string]: string;
}

interface ValidationResult {
  key: string;
  locale: string;
  error: string;
}

// Matches simple {name} placeholders; ICU plural/select arguments need a
// real MessageFormat parser.
function extractPlaceholders(str: string): string[] {
  return [...str.matchAll(/\{(\w+)\}/g)].map(m => m[1]);
}

function validateTranslations(
  source: TranslationFile,
  target: TranslationFile,
  locale: string
): ValidationResult[] {
  const errors: ValidationResult[] = [];

  for (const [key, sourceValue] of Object.entries(source)) {
    const targetValue = target[key];

    // Missing key
    if (!targetValue) {
      errors.push({ key, locale, error: 'Missing translation' });
      continue;
    }

    // Placeholder mismatch
    const sourcePlaceholders = extractPlaceholders(sourceValue);
    const targetPlaceholders = extractPlaceholders(targetValue);
    const missingPlaceholders = sourcePlaceholders.filter(
      p => !targetPlaceholders.includes(p)
    );

    if (missingPlaceholders.length > 0) {
      errors.push({
        key,
        locale,
        error: `Missing placeholders: ${missingPlaceholders.join(', ')}`,
      });
    }

    // String length check (warn if translation is 3x longer than source)
    if (targetValue.length > sourceValue.length * 3) {
      errors.push({
        key,
        locale,
        error: `Suspiciously long translation (${targetValue.length} vs ${sourceValue.length} chars)`,
      });
    }
  }

  return errors;
}

// Run validation
const locales = ['de', 'fr', 'es', 'ja', 'pt-BR'];
const source: TranslationFile = JSON.parse(
  readFileSync('.i18n-extracted/common.json', 'utf-8')
);

let hasErrors = false;

for (const locale of locales) {
  const target: TranslationFile = JSON.parse(
    readFileSync(`.i18n-translations/${locale}/common.json`, 'utf-8')
  );

  const errors = validateTranslations(source, target, locale);

  if (errors.length > 0) {
    console.error(`\nValidation errors for ${locale}:`);
    errors.forEach(e => console.error(`  [${e.key}] ${e.error}`));
    hasErrors = true;
  }
}

if (hasErrors) {
  process.exit(1);
}

console.log('All translations validated successfully');

Step 4: CDN Publishing

This is where modern i18n platforms change the game. Traditional approaches put locale JSON files in your repository. When translations are updated, you need to merge a PR and redeploy. That's slow and couples your translation workflow to your deployment pipeline.

CDN-first delivery decouples them entirely:

  • Your app fetches translations from a CDN URL at runtime
  • When a translation is approved, it's pushed to the CDN
  • Users see updated translations on their next page load — no deploy needed

Compare the two approaches:

|                          | File-based            | CDN-based              |
| ------------------------ | --------------------- | ---------------------- |
| Update translations      | Merge PR + redeploy   | Push to CDN            |
| Time to live             | Minutes to hours      | Seconds                |
| Rollback                 | Git revert + redeploy | CDN cache invalidation |
| Separate translation PRs | Yes                   | No                     |
| Works with feature flags | Complicated           | Native                 |

With Better i18n's CDN delivery, translations are versioned and served from edge nodes globally. Your SDK fetches the right version for your app's current deployment, so you can even pin translations to specific app versions.
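A runtime loader along these lines can be sketched as follows. The URL shape, caching policy, and fallback behavior here are assumptions for illustration, not a specific SDK's API; the one non-negotiable design choice is that a CDN failure must never block rendering:

```typescript
type Messages = Record<string, string>;
type Fetcher = (url: string) => Promise<Messages>;

// Default fetcher hits the CDN; injectable so it can be faked in tests.
const cdnFetch: Fetcher = async url => {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return (await res.json()) as Messages;
};

async function loadMessages(
  locale: string,
  bundledFallback: Messages,
  fetcher: Fetcher = cdnFetch,
  baseUrl = 'https://cdn.example.com/i18n' // hypothetical CDN root
): Promise<Messages> {
  try {
    return await fetcher(`${baseUrl}/${locale}.json`);
  } catch {
    // Never block rendering on a CDN hiccup: fall back to the copy
    // shipped with the app.
    return bundledFallback;
  }
}
```

Shipping a bundled fallback copy of the source locale means the worst case is untranslated text, never a broken page.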

Step 5: Quality Gates in CI

Beyond validating AI translation output, you need CI gates that catch i18n issues before they reach production.

Missing key detection

Fail the build if any key exists in the source locale but not in target locales:

#!/bin/bash
# scripts/check-missing-keys.sh

set -euo pipefail

SOURCE_FILE=".i18n-extracted/common.json"
LOCALES=("de" "fr" "es" "ja" "pt-BR")
HAS_MISSING=false

SOURCE_KEYS=$(jq -r 'keys[]' "$SOURCE_FILE" | sort)

for locale in "${LOCALES[@]}"; do
  TARGET_FILE=".i18n-translations/${locale}/common.json"

  if [ ! -f "$TARGET_FILE" ]; then
    echo "ERROR: Missing translation file for ${locale}"
    HAS_MISSING=true
    continue
  fi

  TARGET_KEYS=$(jq -r 'keys[]' "$TARGET_FILE" | sort)

  MISSING=$(comm -23 <(echo "$SOURCE_KEYS") <(echo "$TARGET_KEYS"))

  if [ -n "$MISSING" ]; then
    echo "ERROR: Missing keys in ${locale}:"
    echo "$MISSING" | while read -r key; do
      echo "  - $key"
    done
    HAS_MISSING=true
  fi
done

if [ "$HAS_MISSING" = true ]; then
  echo ""
  echo "Build failed: missing translations detected"
  exit 1
fi

echo "All translation keys present in all locales"

Placeholder verification in CI

Add a dedicated step to verify {placeholder} variables survive translation:

- name: Verify placeholder integrity
  run: npx ts-node scripts/validate-translations.ts

Step 6: Monitoring in Production

CI gates catch problems before deploy. But you also need runtime monitoring to catch issues that slip through — keys added without translation coverage, locales falling behind as the app evolves.

Track translation coverage per locale

// api/i18n-coverage.ts
export async function getTranslationCoverage(): Promise<Record<string, number>> {
  const sourceKeys = await fetchSourceKeys(); // from your TMS or CDN
  const coverage: Record<string, number> = {};

  for (const locale of SUPPORTED_LOCALES) {
    const translatedKeys = await fetchTranslatedKeys(locale);
    const translatedSet = new Set(translatedKeys);
    const covered = sourceKeys.filter(k => translatedSet.has(k)).length;
    coverage[locale] = Math.round((covered / sourceKeys.length) * 100);
  }

  return coverage;
}

Alert on missing keys reaching users

Wrap your translation function to log missing keys:

// lib/t.ts
import { track } from './analytics';
// getTranslation and getCurrentLocale come from your i18n runtime;
// the import path here is illustrative.
import { getTranslation, getCurrentLocale } from './i18n-runtime';

export function t(key: string, fallback: string): string {
  const translation = getTranslation(key);

  if (!translation) {
    // Log missing key for monitoring
    track('i18n.missing_key', {
      key,
      locale: getCurrentLocale(),
      url: window.location.pathname,
    });
    return fallback;
  }

  return translation;
}

Send these events to your observability platform (Datadog, Grafana, etc.) and set up alerts when missing key rates spike.
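One practical detail: a single missing key on a busy page can fire the same event hundreds of times per session. A tiny deduper in front of the `track` call keeps event volume manageable (a sketch; per-session scope is an assumption that fits a browser page lifetime):

```typescript
// Report each missing key+locale pair at most once per session.
const reportedKeys = new Set<string>();

function shouldReport(key: string, locale: string): boolean {
  const id = `${locale}:${key}`;
  if (reportedKeys.has(id)) return false;
  reportedKeys.add(id);
  return true;
}
```

Inside the `t()` wrapper, guard the analytics call with `if (shouldReport(key, locale))` before tracking.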

Complete GitHub Actions Workflow

Here's the full workflow file that ties everything together:

# .github/workflows/i18n-pipeline.yml
name: i18n Automation Pipeline

on:
  push:
    branches: [main]
    paths:
      # GitHub path filters don't support brace expansion, so each
      # extension is listed separately
      - 'src/**/*.ts'
      - 'src/**/*.tsx'
      - 'src/**/*.js'
      - 'src/**/*.jsx'
      - 'src/**/*.vue'
      - 'src/**/*.svelte'

env:
  LOCALES: de,fr,es,ja,pt-BR
  SOURCE_LOCALE: en

jobs:
  extract-and-sync:
    name: Extract Keys and Sync Translations
    runs-on: ubuntu-latest
    permissions:
      contents: write # the snapshot-update step pushes a commit

    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 2 # Need previous commit for diff

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      # Step 1: Extract translation keys from source code
      - name: Extract i18n keys
        run: npx i18next-parser --config i18next-parser.config.js

      # Step 2: Detect changes vs last snapshot
      - name: Detect key changes
        id: detect-changes
        run: bash scripts/detect-i18n-changes.sh

      # Step 3: Upload new keys to translation API
      - name: Upload new keys for translation
        if: steps.detect-changes.outputs.has_changes == 'true'
        run: |
          curl -X POST "${{ secrets.BETTER_I18N_API_URL }}/api/sync" \
            -H "Authorization: Bearer ${{ secrets.BETTER_I18N_API_KEY }}" \
            -H "Content-Type: application/json" \
            -d @.i18n-extracted/common.json
        env:
          PROJECT_ID: ${{ vars.I18N_PROJECT_ID }}

      # Step 4: Wait briefly for AI translation (or poll until ready)
      - name: Wait for AI translations
        if: steps.detect-changes.outputs.has_changes == 'true'
        run: |
          # Poll translation status up to 5 minutes
          for i in $(seq 1 30); do
            STATUS=$(curl -s \
              -H "Authorization: Bearer ${{ secrets.BETTER_I18N_API_KEY }}" \
              "${{ secrets.BETTER_I18N_API_URL }}/api/status/${{ vars.I18N_PROJECT_ID }}" \
              | jq -r '.status'
            )

            if [ "$STATUS" = "ready" ]; then
              echo "Translations ready"
              break
            fi

            echo "Waiting for translations... ($i/30)"
            sleep 10
          done

          if [ "$STATUS" != "ready" ]; then
            echo "Timed out waiting for AI translations"
            exit 1
          fi

      # Step 5: Download translated files
      - name: Download translations
        run: |
          IFS=',' read -ra LOCALE_ARRAY <<< "$LOCALES"
          for locale in "${LOCALE_ARRAY[@]}"; do
            mkdir -p ".i18n-translations/${locale}"
            curl -s \
              -H "Authorization: Bearer ${{ secrets.BETTER_I18N_API_KEY }}" \
              "${{ secrets.BETTER_I18N_API_URL }}/api/export/${locale}" \
              -o ".i18n-translations/${locale}/common.json"
          done

      # Step 6: Validate translations
      - name: Validate placeholder integrity
        run: npx ts-node scripts/validate-translations.ts

      # Step 7: Check for missing keys
      - name: Check for missing keys
        run: bash scripts/check-missing-keys.sh

      # Step 8: Update snapshot for next run
      - name: Update i18n snapshot
        if: steps.detect-changes.outputs.has_changes == 'true'
        run: |
          cp .i18n-extracted/common.json .i18n-snapshot/en.json
          git config user.email "ci@yourorg.com"
          git config user.name "CI Bot"
          git add .i18n-snapshot/
          git diff --staged --quiet || git commit -m "chore: update i18n snapshot [skip ci]"
          # actions/checkout leaves a detached HEAD, so push explicitly to main
          git push origin HEAD:main

  quality-gate:
    name: i18n Quality Gate
    runs-on: ubuntu-latest
    needs: extract-and-sync

    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      # Compile api/i18n-coverage.ts so dist/ exists for the check below
      # (assumes a standard "build" script in package.json)
      - name: Build
        run: npm run build

      # Final check: no missing translations in any locale
      - name: Verify translation coverage
        run: |
          node -e "
            const { getTranslationCoverage } = require('./dist/api/i18n-coverage');
            getTranslationCoverage().then(coverage => {
              const failed = Object.entries(coverage)
                .filter(([, pct]) => pct < 95);
              if (failed.length > 0) {
                console.error('Coverage below 95%:', failed);
                process.exit(1);
              }
              console.log('Coverage OK:', coverage);
            });
          "

Conclusion

The goal is zero human steps between "developer writes t('key')" and "user sees translated content." With AST-based extraction detecting key changes automatically, AI translation providing an instant first pass, CDN delivery pushing translations live without redeployments, and quality gates catching placeholder mismatches and missing keys in CI — that goal is achievable today.

The teams that get this right treat i18n the same way they treat any other infrastructure: automated, monitored, and decoupled from individual deploys. Translations stop being a bottleneck and become just another part of the delivery pipeline.

Start with the key extraction script and the missing-key CI check — those two steps alone eliminate most of the manual pain. Then layer in the translation API integration and CDN delivery to complete the pipeline.

Better i18n is a developer-first localization platform built for modern frontend teams. Type-safe SDKs, Git-based workflows, CDN delivery, and AI translation with glossary enforcement — without locale files in your repo.