Whether you’re working with dynamic content, optimizing metadata, or ensuring your pages are properly indexed, this guide will walk you through the key steps to enhance your Next.js app's visibility in search results. Let’s dive into the best practices to make your app search engine-friendly!

1. Check Your robots.txt File

A robots.txt file tells search engines which parts of your site they can or cannot crawl.

Option 1: Static robots.txt file

Place the file in the public directory of your Next.js project. This is the simplest approach:

  • Files in the public directory are served at the root path of your website
  • The file would be accessible at https://mediagear.ca/robots.txt

Option 2: Dynamic robots.txt file (Next.js 13+ App Router)

Create a robots.js or robots.ts file in your src/app directory to generate the robots file dynamically.

What to Include in Your robots.txt File

For most websites that want good SEO, you should use a robots.txt file that allows search engines to crawl your entire site. Here's a recommended configuration:

# Allow all crawlers
User-agent: *
Allow: /

# XML Sitemap
Sitemap: https://mediagear.ca/sitemap.xml

The dynamic version

If you're using the dynamic approach (Option 2), the same rules look like this:

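// src/app/robots.ts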
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
    },
    sitemap: 'https://mediagear.ca/sitemap.xml',
  }
}

Explanation of robots.txt Contents

This robots.txt configuration:

  1. Allows all search engines to crawl your site:
  • User-agent: * targets all search engine bots
  • Allow: / gives permission to crawl the entire site
  2. References your sitemap:
  • Sitemap: https://mediagear.ca/sitemap.xml tells search engines where to find your XML sitemap
  • Note: You'll need to create a sitemap.xml file as well (covered in the next section)
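
The rules object also accepts a disallow field if you ever need to keep crawlers away from specific routes. Here's a minimal sketch, assuming a hypothetical /admin path you want excluded (swap in your own routes):

import { MetadataRoute } from 'next'

// Sketch only: '/admin' is a hypothetical path used for illustration.
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/admin', // ask crawlers to skip this path
    },
    sitemap: 'https://mediagear.ca/sitemap.xml',
  }
}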

2. Creating a Valid Sitemap for Your Next.js Site


Option 1: Static sitemap.xml file

Let's create a basic static sitemap.xml file in the public directory:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mediagear.ca/</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://mediagear.ca/about</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://mediagear.ca/ja2</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://mediagear.ca/zanpoyo</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://mediagear.ca/melhssn</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://mediagear.ca/changelog</loc>
    <lastmod>2023-03-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.6</priority>
  </url>
</urlset>

Option 2: Dynamic sitemap using Next.js App Router

Alternatively, you can use Next.js App Router's dynamic sitemap generation for more maintainable and automatically updated sitemaps:

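// src/app/sitemap.ts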
import { MetadataRoute } from 'next'

export default function sitemap(): MetadataRoute.Sitemap {
  // Base URL for the site
  const baseUrl = 'https://mediagear.ca'
  
  // Current date for lastModified
  const currentDate = new Date()
  
  // Static pages (typed so changeFrequency values satisfy the expected union type)
  const staticPages: MetadataRoute.Sitemap = [
    {
      url: baseUrl,
      lastModified: currentDate,
      changeFrequency: 'weekly',
      priority: 1.0,
    },
    {
      url: `${baseUrl}/about`,
      lastModified: currentDate,
      changeFrequency: 'monthly',
      priority: 0.8,
    },
    {
      url: `${baseUrl}/ja2`,
      lastModified: currentDate,
      changeFrequency: 'weekly',
      priority: 0.9,
    },
    {
      url: `${baseUrl}/zanpoyo`,
      lastModified: currentDate,
      changeFrequency: 'weekly',
      priority: 0.9,
    },
    {
      url: `${baseUrl}/melhssn`,
      lastModified: currentDate,
      changeFrequency: 'weekly',
      priority: 0.9,
    },
    {
      url: `${baseUrl}/changelog`,
      lastModified: currentDate,
      changeFrequency: 'monthly',
      priority: 0.6,
    },
  ]

  return staticPages
}
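
Because this file is ordinary TypeScript, you can also fold dynamically generated routes into the sitemap at build time. Here's a minimal sketch, assuming a hypothetical getPosts() helper that returns each dynamic page's slug and last update date; swap it for however your project actually loads that data:

import { MetadataRoute } from 'next'

// Sketch only: getPosts() is a hypothetical data-access helper, not part of Next.js.
async function getPosts(): Promise<{ slug: string; updatedAt: Date }[]> {
  // e.g. read from a CMS, a database, or the filesystem
  return []
}

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const baseUrl = 'https://mediagear.ca'

  // One sitemap entry per dynamic page, built from the fetched data
  const posts = await getPosts()
  const dynamicPages: MetadataRoute.Sitemap = posts.map((post) => ({
    url: `${baseUrl}/${post.slug}`,
    lastModified: post.updatedAt,
    changeFrequency: 'weekly' as const,
    priority: 0.7,
  }))

  // Static pages as before (trimmed to one entry here for brevity)
  const staticPages: MetadataRoute.Sitemap = [
    {
      url: baseUrl,
      lastModified: new Date(),
      changeFrequency: 'weekly',
      priority: 1.0,
    },
  ]

  return [...staticPages, ...dynamicPages]
}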

Ensuring Your Sitemap Is Valid

Now that you have created a sitemap, here's how to ensure it's valid and working properly:

1. Choose One Approach

You should choose either:

  • The static file (public/sitemap.xml) - Simple, but requires manual updates
  • The dynamic file (src/app/sitemap.ts) - Automatically generated with each build

Both options are shown above for comparison, but you should use only one in production.

2. Validate Your Sitemap

After deploying your site, validate your sitemap using these methods:

  1. Visual Inspection: Visit https://mediagear.ca/sitemap.xml in your browser to ensure it's accessible and properly formatted.
  2. Use Online Validators:
  • Google Search Console (adding your sitemap there also validates it)
  • XML-Sitemaps Validator
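
  3. Scriptable Check (optional): a small script can confirm the sitemap is reachable and looks like XML. Below is a minimal sketch, assuming Node 18+ (for the built-in fetch); the file name check-sitemap.ts is just a placeholder, and this only checks basic shape, not full conformance to the sitemap protocol.

// check-sitemap.ts (hypothetical file name)
async function checkSitemap(url: string): Promise<void> {
  const res = await fetch(url)
  if (!res.ok) {
    throw new Error(`Sitemap request failed with status ${res.status}`)
  }
  const body = await res.text()
  if (!body.trimStart().startsWith('<?xml')) {
    throw new Error('Response does not start with an XML declaration')
  }
  // Count <loc> entries as a rough sanity check
  const urlCount = (body.match(/<loc>/g) ?? []).length
  console.log(`Sitemap looks OK: ${urlCount} <loc> entries found`)
}

checkSitemap('https://mediagear.ca/sitemap.xml').catch((err) => console.error(err))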

3. Submit Your Sitemap to Search Engines

To ensure search engines know about your sitemap:

  1. Google Search Console:
  • Register your site with Google Search Console
  • Navigate to "Sitemaps" in the left menu
  • Enter sitemap.xml in the field and click "Submit"
  2. Bing Webmaster Tools:
  • Register with Bing Webmaster Tools
  • Go to "Sitemaps" and submit your sitemap URL

4. Keep Your Sitemap Updated

If using the static approach, update your sitemap.xml when:

  • You add new pages
  • You substantially update existing pages
  • The structure of your site changes

If using the dynamic approach, your sitemap will automatically update with each build.
