File System

The File System utilities offer a way to organize and query file-system data in renoun. They let you define a schema for file exports and query those exports through a simple API.

To get started with the File System API, instantiate the Directory class to target a set of files and directories from the file system. You can then use the getEntry, getDirectory, and getFile methods to query a specific descendant file or directory:

import { Directory } from 'renoun'

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})

Here we create a Directory instance that targets the posts directory relative to the working directory. We also specify a loader for the mdx file extension, which the bundler uses to load the file contents.
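
With the directory configured, the query methods mentioned above can be used directly. A brief sketch, where the 'build-a-blog' slug is a hypothetical post path:

```typescript
// Query a descendant entry without caring whether it is a file or directory:
const entry = await posts.getEntry('build-a-blog')

// Query a specific file, narrowing to the mdx extension:
const post = await posts.getFile('build-a-blog', 'mdx')
```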

Referencing protocol paths

The Directory and File constructors accept protocol-prefixed paths that resolve using internal resolvers before the adapters interact with the file system. The only supported protocol currently is workspace:, which resolves paths relative to the workspace root instead of the current working directory.

This is helpful when using renoun from a nested package (for example apps/site) but you need to reference files located in another workspace folder (like examples):

import { Directory } from 'renoun'

const examples = new Directory({
  path: 'workspace:examples',
})

If the current working directory is apps/site, the example above resolves to ../../examples internally while remaining ./examples when run from the workspace root.
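
The same protocol applies when targeting a single file with the File constructor. A minimal sketch, where the referenced file name is hypothetical:

```typescript
import { File } from 'renoun'

// Resolves relative to the workspace root regardless of which
// package the code runs from (file name is hypothetical):
const changelog = new File({ path: 'workspace:CHANGELOG.md' })
```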

Querying file system entries

Once a Directory instance is created, you can query a specific file and render its contents:

import { Directory } from 'renoun'

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})

export default async function Page({ slug }: { slug: string }) {
  const post = await posts.getFile(slug, 'mdx')
  const Content = await post.getExportValue('default')

  return <Content />
}

You can query the entries within the directory to help with generating navigation menus and index pages. For example, we can filter to only the mdx file extension and use the getEntries method to generate an index page that links to every post:

import { Directory, withSchema } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  filter: '*.mdx',
  loader: {
    mdx: withSchema<PostType>((path) => import(`./posts/${path}.mdx`)),
  },
})

export default async function Page() {
  const allPosts = await posts.getEntries()

  return (
    <>
      <h1>Blog</h1>
      <ul>
        {allPosts.map(async (post) => {
          const pathname = post.getPathname()
          const frontmatter = await post.getExportValue('frontmatter')

          return (
            <li key={pathname}>
              <a href={pathname}>{frontmatter.title}</a>
            </li>
          )
        })}
      </ul>
    </>
  )
}

File selection criteria

When querying files using getFile or getEntry, the file system follows a specific priority order to resolve ambiguous paths:

  1. Sibling files over directories: When both a file and a directory exist with the same base name (e.g., integrations.mdx and integrations/), the sibling file is preferred over the directory. This ensures that getEntry('integrations') returns integrations.mdx rather than the integrations/ directory.
  2. Base files over files with modifiers: When multiple files share the same base name but have different modifiers (e.g., Reference.tsx and Reference.examples.tsx), the base file without a modifier is preferred.
  3. Extension matching: When an extension is specified in the query (e.g., getFile('button', 'tsx')), only files matching that extension are considered.
  4. Directory representatives: If a directory is selected, the file system looks for a representative file within that directory in this order:
    • A file with the same name as the directory (e.g., Button/Button.tsx)
    • An index file (e.g., Button/index.tsx)
    • A readme file (e.g., Button/readme.md)

import { Directory } from 'renoun'

const directory = new Directory({ path: 'posts' })

// Example: When both integrations.mdx and integrations/ exist
const entry = await directory.getEntry('integrations')
// Returns: integrations.mdx (sibling file, not the directory)

// Example: When both Reference.tsx and Reference.examples.tsx exist
const file = await directory.getFile('Reference')
// Returns: Reference.tsx (base file without modifier)
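
The directory-representative rule works the same way when a query selects a directory. A sketch, assuming a hypothetical components directory where Button/ contains both Button.tsx and index.tsx:

```typescript
import { Directory } from 'renoun'

const components = new Directory({ path: 'components' })

// Example: When Button/ contains both Button.tsx and index.tsx
const button = await components.getFile('Button')
// Returns: Button/Button.tsx (the file named after the directory
// is checked before index.tsx and readme.md)
```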

Type checking file exports

To improve type safety, you can utilize the withSchema helper to specify the schema for the file’s exports:

import { Directory, withSchema } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: withSchema<PostType>((path) => import(`./posts/${path}.mdx`)),
  },
})

Now when we call JavaScriptFile#getExportValue and JavaScriptFileExport#getRuntimeValue we will have stronger type checking and autocomplete.
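
For instance, the frontmatter export resolves as PostType['frontmatter'], so its fields are fully typed (the slug below is a hypothetical example):

```typescript
const post = await posts.getFile('build-a-blog', 'mdx')

// `frontmatter` is typed as { title: string; date: Date }
const frontmatter = await post.getExportValue('frontmatter')

// Autocomplete and type checking apply to each field:
console.log(frontmatter.title.toUpperCase())
console.log(frontmatter.date.getFullYear())
```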

Working with globbed module maps

When using bundler utilities like import.meta.glob, you can return a loader map from the loader option to reuse the globbed modules:

/// <reference types="vite/client" />
import { Directory, withSchema } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  loader: () => {
    const mdxModules = import.meta.glob('./posts/**/*.mdx')
    return {
      mdx: withSchema<PostType>((path) => mdxModules[`./posts/${path}.mdx`]),
    }
  },
})

The loader factory executes once and returns the extension-to-loader map, so the globbed modules are reused across all files in the directory. If the glob returns loader functions themselves, they will automatically be called and awaited for you.

Schema Validation

You can also apply schema validation using libraries that implement the Standard Schema Spec, like Zod, Valibot, or ArkType, to ensure file exports conform to a specific schema:

import { Directory, withSchema } from 'renoun'
import { z } from 'zod'

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: withSchema(
      {
        frontmatter: z.object({
          title: z.string(),
          date: z.date(),
        }),
      },
      (path) => import(`./posts/${path}.mdx`)
    ),
  },
})
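
Because validation goes through the Standard Schema interface, other compliant libraries slot in the same way. A Valibot equivalent of the Zod schema above might look like:

```typescript
import { Directory, withSchema } from 'renoun'
import * as v from 'valibot'

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: withSchema(
      {
        // Mirrors the Zod schema: frontmatter must have a string
        // title and a Date value for date.
        frontmatter: v.object({
          title: v.string(),
          date: v.date(),
        }),
      },
      (path) => import(`./posts/${path}.mdx`)
    ),
  },
})
```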

Alternatively, you can define a schema yourself using both TypeScript types and custom validation functions:

import { Directory, withSchema } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: withSchema<PostType>(
      {
        frontmatter: (value) => {
          if (typeof value.title !== 'string') {
            throw new Error('Title is required')
          }

          if (!(value.date instanceof Date)) {
            throw new Error('Date is required')
          }

          return value
        },
      },
      (path) => import(`./posts/${path}.mdx`)
    ),
  },
})

The file system utilities are not limited to MDX files and can be used with any file type. By organizing content and source code into structured collections, you can easily generate static pages and manage complex routing and navigations.

Using GitHostFileSystem

The GitHostFileSystem adapter lets you mirror files from a remote Git provider into memory so they can be queried just like local entries. It accepts the repository coordinates and optional filters and then streams the tarball for the requested ref.

import { Directory, GitHostFileSystem } from 'renoun'

const repoFs = new GitHostFileSystem({
  repository: 'souporserious/renoun',
  ref: 'main',
})

const docs = new Directory({
  path: '.',
  fileSystem: repoFs,
})
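
Once mirrored, the remote directory supports the same query methods as a local one. For example, listing the top-level entries of the repository:

```typescript
const entries = await docs.getEntries()

for (const entry of entries) {
  // Prints the pathname of each entry fetched from the remote repository.
  console.log(entry.getPathname())
}
```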

Avoiding rate limits

Git providers enforce strict anonymous rate limits (for example, GitHub allows only 60 unauthenticated requests per hour). Passing an access token to GitHostFileSystem raises those limits dramatically and unlocks the more efficient metadata paths that renoun uses internally.

Provide the token via the token option, ideally by reading from an environment variable so it is not committed to source control:

import { GitHostFileSystem } from 'renoun'

const repoFs = new GitHostFileSystem({
  repository: 'souporserious/renoun',
  token: process.env.GITHUB_TOKEN,
})

When a token is supplied, renoun batches GraphQL blame queries and reuses cached metadata to keep requests to a minimum. Without a token the file system falls back to higher-volume REST sampling, so authenticated requests are the easiest way to prevent rate limiting during development and CI.

API Reference