Fix: Duplicate Content Entry Slug in Astro

Error message:
Duplicate content entry slug.
Content Collections 2025-01-25

What Causes This Error?

This error occurs when two or more entries in the same content collection resolve to the same slug, whether the slug comes from a slug field in frontmatter or is derived from the filename. Each entry must have a unique slug so Astro can generate a unique URL for it.

The Problem

src/content/blog/
├── post-one.md      # slug: "my-post" (from frontmatter)
└── my-post.md       # slug: "my-post" (from filename)
                     # ❌ Duplicate slug!
---
# File 1: src/content/blog/article-a.md
slug: "featured-post"
---

---
# File 2: src/content/blog/article-b.md
slug: "featured-post"  # ❌ Duplicate!
---

The Fix

Use Unique Slugs

---
# File 1: src/content/blog/article-a.md
slug: "featured-post-2023"
---

---
# File 2: src/content/blog/article-b.md
slug: "featured-post-2024"
---

Or Use Unique Filenames

src/content/blog/
├── featured-post-january.md   # slug: "featured-post-january"
└── featured-post-february.md  # slug: "featured-post-february"

Common Scenarios

Filename vs Frontmatter Conflict

# File: src/content/blog/hello-world.md
# Default slug from filename: "hello-world"

---
# If another file has this in frontmatter:
slug: "hello-world"  # Conflicts with the filename-based slug
---

Solution: Choose One Method

---
# Option 1: Use meaningful filenames, no slug in frontmatter
# File: hello-world.md (slug is "hello-world")
title: "Hello World"
---

---
# Option 2: Use IDs as filenames, slug in frontmatter
# File: 001.md
title: "Hello World"
slug: "hello-world"
---

Nested Directories

src/content/blog/
├── 2023/
│   └── review.md      # slug: "2023/review"
└── 2024/
    └── review.md      # slug: "2024/review" ✅ Different slugs
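The tree above works because Astro derives the default slug from the entry's path relative to the collection root, not just the filename. A minimal sketch of that derivation (pathToSlug and its arguments are illustrative names, not Astro internals; assumes .md/.mdx files and POSIX-style separators):

```javascript
// Sketch: derive a default slug from a path inside a collection.
// Mirrors the "2023/review" vs "2024/review" behavior shown above.
function pathToSlug(filePath, collectionDir) {
  return filePath
    .slice(collectionDir.length)  // drop "src/content/blog"
    .replace(/^\/+/, '')          // drop the leading slash
    .replace(/\.(md|mdx)$/, '');  // drop the extension
}

console.log(pathToSlug('src/content/blog/2023/review.md', 'src/content/blog'));
// "2023/review"
console.log(pathToSlug('src/content/blog/2024/review.md', 'src/content/blog'));
// "2024/review"
```

Because the parent directory is part of the slug, two files named review.md in different year folders do not collide.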

Finding Duplicates

# List all explicit frontmatter slugs (misses filename-based slugs)
grep -r "^slug:" src/content/blog/ | sort

# Or check filenames
ls -la src/content/blog/

In Build Output

Error: Duplicate content entry slug "my-post" found in collection "blog".
Entry: src/content/blog/post-a.md
Entry: src/content/blog/post-b.md

Programmatic Slug Generation

// If generating content programmatically, ensure unique slugs
const posts = [
  { title: "Post 1", slug: "post-1" },
  { title: "Post 2", slug: "post-2" },
  // Ensure no duplicates before creating files
];

const slugs = new Set();
posts.forEach(post => {
  if (slugs.has(post.slug)) {
    throw new Error(`Duplicate slug: ${post.slug}`);
  }
  slugs.add(post.slug);
});
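Instead of throwing, a generator can also deduplicate automatically by appending a numeric suffix. A sketch of that approach (uniqueSlug is a hypothetical helper, not an Astro API):

```javascript
// Sketch: make a slug unique by appending -2, -3, ... on collision.
function uniqueSlug(base, used) {
  let slug = base;
  let n = 2;
  while (used.has(slug)) {
    slug = `${base}-${n++}`; // "post", "post-2", "post-3", ...
  }
  used.add(slug);
  return slug;
}

const used = new Set();
console.log(uniqueSlug('weekly-update', used)); // "weekly-update"
console.log(uniqueSlug('weekly-update', used)); // "weekly-update-2"
```

Throwing is the safer default for hand-written content, since a silent "-2" suffix can hide a copy-paste mistake; auto-suffixing fits better when slugs come from external data you cannot edit.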

Using Dates for Uniqueness

---
# Include date in slug for uniqueness
title: "Weekly Update"
slug: "2024-01-15-weekly-update"
---

---
title: "Weekly Update"
slug: "2024-01-22-weekly-update"
---
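Building the date prefix by hand is error-prone for recurring posts; a small helper can generate it from a title and a Date (datedSlug is a hypothetical helper for illustration; the date is formatted in UTC):

```javascript
// Sketch: build a date-prefixed slug like "2024-01-15-weekly-update".
function datedSlug(title, date) {
  const day = date.toISOString().slice(0, 10); // "YYYY-MM-DD" (UTC)
  const base = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-') // non-alphanumerics -> hyphens
    .replace(/^-+|-+$/g, '');    // trim stray hyphens
  return `${day}-${base}`;
}

console.log(datedSlug('Weekly Update', new Date('2024-01-15')));
// "2024-01-15-weekly-update"
```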

Category/Type Prefixes

---
# Use category prefix
title: "Getting Started"
slug: "tutorial/getting-started"
---

---
title: "Getting Started"
slug: "guide/getting-started"  # Different category = unique
---

Migration: Handling Existing Duplicates

// Script to find duplicate slugs across a collection
import { glob } from 'glob';
import matter from 'gray-matter';
import fs from 'node:fs';
import path from 'node:path';

const collectionDir = 'src/content/blog';
const files = await glob(`${collectionDir}/**/*.md`);
const slugMap = new Map();

for (const file of files) {
  const content = fs.readFileSync(file, 'utf-8');
  const { data } = matter(content);
  // Fall back to the default slug: the path relative to the
  // collection root, without the extension (e.g. "2023/review")
  const slug =
    data.slug ??
    path.relative(collectionDir, file).replace(/\.md$/, '');

  if (slugMap.has(slug)) {
    console.log(`Duplicate: ${slug}`);
    console.log(`  - ${slugMap.get(slug)}`);
    console.log(`  - ${file}`);
  } else {
    slugMap.set(slug, file);
  }
}
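Once duplicates are found, one way to repair them is to fold an entry's parent directories into a new frontmatter slug. A sketch of that idea (proposeSlug is a hypothetical helper, and the default collectionDir is an assumption; it does not write any files):

```javascript
// Sketch: propose a replacement slug for a duplicate by folding the
// parent directories into the slug ("2024/review.md" -> "2024-review").
import path from 'node:path';

function proposeSlug(file, collectionDir = 'src/content/blog') {
  const rel = path.relative(collectionDir, file).replace(/\.md$/, '');
  return rel.split(path.sep).join('-');
}

console.log(proposeSlug('src/content/blog/2024/review.md'));
// "2024-review"
```

You would then add the proposed value as a slug field in the duplicate entry's frontmatter, or rename the file accordingly.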

Quick Checklist

  • Each entry needs a unique slug
  • Check both filenames and frontmatter slugs
  • Use dates or categories for uniqueness
  • Nested folders create different slug paths
  • Run build to catch duplicates early