File System

The File System utilities offer a way to organize and query file-system data in renoun. They let you define a schema for file exports and query those exports through a simple API.

To get started with the File System API, instantiate the Directory class to target a set of files and directories from the file system. You can then use the getEntry / getDirectory / getFile methods to query a specific descendant file or directory:

import { Directory } from 'renoun'

const posts = new Directory({
  path: 'posts',
  filter: '*.mdx',
  loader: (path) => import(`./posts/${path}.mdx`),
})

Here we create a Directory instance that targets the posts directory relative to the working directory, filters entries to MDX files, and provides a loader that uses the bundler to import each file's contents.

Repository-first workflows

Use Repository as the entry point for git-backed workflows. It selects the correct file-system adapter automatically (local/clone via GitFileSystem, remote API via GitVirtualFileSystem) and exposes factory helpers for Directory and File. Clone mode is lazy, so the repository is not cloned until you actually read from a directory or file.

Local repositories automatically use a worktree overlay when no explicit ref is provided. This means GitFileSystem prefers on-disk content (including uncommitted or untracked files) and falls back to git objects when a file is not present in the working tree. When a ref is explicitly set, or when a repository is remote/virtual, reads are git-only for deterministic results.

import { Repository } from 'renoun'

// Remote with clone (default)
const clonedRepo = new Repository({
  path: 'https://github.com/mrdoob/three.js',
  depth: 1,
})
const clonedDirectory = clonedRepo.getDirectory('src')

// Remote API (virtual)
const virtualRepo = new Repository({
  path: 'https://github.com/mrdoob/three.js',
  clone: false,
})
const virtualDirectory = virtualRepo.getDirectory('src')

// Local
const localRepo = new Repository()
const localDocs = localRepo.getDirectory('docs')
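
Because clone mode is lazy, the remote repository above is not fetched until the first read. A minimal sketch, assuming the three.js source layout (src/loaders/FileLoader.js exists on the default branch):

// The first read triggers the shallow clone
const fileLoader = await clonedDirectory.getFile('loaders/FileLoader', 'js')
const source = await fileLoader.text()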

Directories created without a repository will auto-detect git roots and attach an implicit Repository when possible:

import { Directory } from 'renoun'

// Auto-detects the closest git repo and enables git metadata.
const directory = new Directory({ path: 'src' })

Referencing path schemes

The Directory and File constructors accept path schemes that resolve through internal resolvers before the adapters interact with the file system. Currently, the only supported scheme is workspace:, which resolves paths relative to the workspace root instead of the current working directory.

This is helpful when using renoun from a nested package (for example apps/site) but you need to reference files located in another workspace directory (like examples):

import { Directory } from 'renoun'

const examples = new Directory({
  path: 'workspace:examples',
})

If the current working directory is apps/site, the example above resolves to ../../examples internally while remaining ./examples when run from the workspace root.
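
The File constructor accepts the same scheme. A minimal sketch, assuming File can be constructed from a path alone and that a README.mdx exists in the examples directory:

import { File } from 'renoun/file-system'

// Hypothetical file; resolves from the workspace root regardless of
// the package the code runs in.
const readme = new File({ path: 'workspace:examples/README.mdx' })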

Querying file system entries

Once a directory is defined, use the getFile method to load a specific file and render its content:

import { Directory } from 'renoun'

const posts = new Directory({
  path: 'posts',
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})

export default async function Page({ slug }: { slug: string }) {
  const post = await posts.getFile(slug, 'mdx')
  const Content = await post.getContent()

  return <Content />
}

You can query the entries within the directory to help with generating navigations and index pages. For example, we can filter to MDX files and use the getEntries method to generate an index page that links to every post:

import { Directory } from 'renoun'
import { z } from 'zod'

const posts = new Directory({
  path: 'posts',
  filter: '*.mdx',
  schema: {
    mdx: {
      frontmatter: z.object({
        title: z.string(),
        date: z.coerce.date(),
      }),
    },
  },
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})

export default async function Page() {
  const allPosts = await posts.getEntries()

  return (
    <>
      <h1>Blog</h1>
      <ul>
        {allPosts.map(async (post) => {
          const pathname = post.getPathname()
          const frontmatter = await post.getExportValue('frontmatter')

          return (
            <li key={pathname}>
              <a href={pathname}>{frontmatter.title}</a>
            </li>
          )
        })}
      </ul>
    </>
  )
}

File selection criteria

When querying files using getFile or getEntry, the file system follows a specific priority order to resolve ambiguous paths:

  1. Sibling files over directories: When both a file and a directory exist with the same base name (e.g., integrations.mdx and integrations/), the sibling file is preferred over the directory. This ensures that getEntry('integrations') returns integrations.mdx rather than the integrations/ directory.
  2. Base files over files with modifiers: When multiple files share the same base name but have different modifiers (e.g., Reference.tsx and Reference.examples.tsx), the base file without a modifier is preferred.
  3. Extension matching: When an extension is specified in the query (e.g., getFile('button', 'tsx')), only files matching that extension are considered.
  4. Directory representatives: If a directory is selected, the file system looks for a representative file within that directory in this order:
    • A file with the same name as the directory (e.g., Button/Button.tsx)
    • An index file (e.g., Button/index.tsx)
    • A readme file (e.g., Button/readme.md)

The examples below illustrate these rules:

import { Directory } from 'renoun'

const directory = new Directory({ path: 'posts' })

// When both integrations.mdx and integrations/ exist
// returns integrations.mdx (sibling file, not the directory)
const entry = await directory.getEntry('integrations')

// When both Reference.tsx and Reference.examples.tsx exist
// returns Reference.tsx (base file without modifier)
const file = await directory.getFile('Reference')
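
A similar sketch for extension matching and directory representatives, assuming a components directory that contains button.tsx, button.mdx, and a Button/ folder with Button/Button.tsx:

import { Directory } from 'renoun'

const components = new Directory({ path: 'components' })

// Only files matching the requested extension are considered
const buttonSource = await components.getFile('button', 'tsx')

// Selecting the directory resolves its representative file:
// Button/Button.tsx, then Button/index.tsx, then Button/readme.md
const representative = await components.getFile('Button')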

Type checking file exports

To improve type safety, you can define a schema for exports (runtime validation) and optionally type the loader (compile-time):

import { Directory } from 'renoun'
import { z } from 'zod'

const posts = new Directory({
  path: 'posts',
  schema: {
    mdx: {
      frontmatter: z.object({
        title: z.string(),
        date: z.coerce.date(),
      }),
    },
  },
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})
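
To add the compile-time side, you can annotate what the loader resolves to. A sketch, assuming a hypothetical PostType interface that mirrors the schema above:

import { Directory } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const typedPosts = new Directory({
  path: 'posts',
  loader: {
    // The return type annotation flows through to getExportValue('frontmatter')
    mdx: (path): Promise<PostType> => import(`./posts/${path}.mdx`),
  },
})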

Working with globbed module maps

When using bundler utilities like import.meta.glob, you can pass the resulting module map as the loader option so the globbed modules are reused:

/// <reference types="vite/client" />
import { Directory } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  loader: import.meta.glob<PostType>('./posts/*.mdx'),
})

The loader factory executes once and returns the extension-to-loader map, so the globbed modules are reused across all files in the directory. If the glob returns loader functions themselves, they will automatically be called and awaited for you.

Schema validation

You can also apply schema validation using libraries that follow the Standard Schema Spec like Zod, Valibot, or Arktype to ensure file exports conform to a specific schema:

import { Directory } from 'renoun'
import { z } from 'zod'

const posts = new Directory({
  path: 'posts',
  schema: {
    mdx: {
      frontmatter: z.object({
        title: z.string(),
        date: z.date(),
      }),
    },
  },
  loader: {
    mdx: (path) => import(`./posts/${path}.mdx`),
  },
})

Alternatively, you can define a schema yourself using both TypeScript types and custom validation functions:

import { Directory } from 'renoun'

interface PostType {
  frontmatter: {
    title: string
    date: Date
  }
}

const posts = new Directory({
  path: 'posts',
  schema: {
    mdx: {
      frontmatter: (value: any) => {
        if (typeof value.title !== 'string') {
          throw new Error('Title is required')
        }

        if (!(value.date instanceof Date)) {
          throw new Error('Date is required')
        }

        return value
      },
    },
  },
  loader: (path) => import(`./posts/${path}`),
})

The file system utilities are not limited to MDX files and can be used with any file type. By organizing content and source code into structured collections, you can generate static pages and manage complex routing and navigation with ease.
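
For example, the same API can target TypeScript sources. A minimal sketch, assuming a components directory of .tsx files:

import { Directory } from 'renoun'

const components = new Directory({
  path: 'components',
  filter: '*.tsx',
  loader: (path) => import(`./components/${path}.tsx`),
})

// One entry per component file, ready for navigation or index pages
const entries = await components.getEntries()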

Streaming

File instances expose the Web-streamable methods slice, stream, arrayBuffer, and text directly, without buffering file contents. Content type and size are inferred from file metadata (via the type and size accessors), so you can treat a File as a streaming Blob immediately.

Because these helpers mirror the standard Web implementations, they can be passed directly to APIs that expect a Blob or File while still streaming their contents in byte ranges. Below are common ways to use them:

If the underlying file system adapter cannot report a byte length, stream() falls back to the raw readable stream instead of throwing. In that case the size accessor and slice() remain unavailable because they rely on a known length, but you can still forward the stream directly.

Surface streaming views from renoun File instances

import { Directory, File, InMemoryFileSystem } from 'renoun/file-system'

const directory = new Directory({
  fileSystem: new InMemoryFileSystem({
    'video.mp4': new Uint8Array([1, 2, 3]),
  }),
})
const file = new File({ directory, path: 'video.mp4' })

// Work with slices directly on the File
const previewChunk = await file.slice(0, 1024 * 1024).arrayBuffer()
const type = file.type

Upload streaming files with fetch

Streaming files work with fetch (including in Node runtimes that support Web streams). Nothing is buffered before the request is sent:

import { Directory, File, InMemoryFileSystem } from 'renoun/file-system'

const directory = new Directory({
  fileSystem: new InMemoryFileSystem({
    'video.mp4': new Uint8Array([1, 2, 3]),
  }),
})
const file = await directory.getFile('video.mp4')

await fetch('https://example.com/upload', {
  method: 'PUT',
  body: file.slice(), // streamed as the body
  headers: { 'content-type': file.type },
})

Serve partial responses efficiently

You can return slices without reading the whole asset:

import { Directory, File } from 'renoun/file-system'

const directory = new Directory({
  path: 'workspace:public',
})

export async function GET(request: Request) {
  const rangeHeader = request.headers.get('range')
  const file = new File({ directory, path: 'video.mp4' })
  const { start = 0, end = file.size } = parseRange(rangeHeader)
  const slice = file.slice(start, end)

  return new Response(slice.stream(), {
    status: 206,
    headers: {
      'content-type': file.type,
      'content-range': `bytes ${start}-${end - 1}/${file.size}`,
    },
  })
}

function parseRange(header: string | null) {
  if (!header?.startsWith('bytes=')) return {}
  const [start, end] = header.replace('bytes=', '').split('-').map(Number)
  return { start, end: Number.isFinite(end) ? end + 1 : undefined }
}

Using GitVirtualFileSystem

The GitVirtualFileSystem adapter lets you mirror files from a remote Git provider into memory so they can be queried just like local entries. It accepts the repository coordinates and optional filters and then streams the tarball for the requested ref.

import { Directory, GitVirtualFileSystem } from 'renoun'

const fileSystem = new GitVirtualFileSystem({
  repository: 'souporserious/renoun',
  ref: 'main',
})

const directory = new Directory({
  fileSystem,
})

Avoiding rate limits

Git providers apply strict anonymous rate limits (for example, GitHub allows only 60 unauthenticated requests per hour). Passing an access token to GitVirtualFileSystem raises those limits dramatically and unlocks the more efficient metadata paths that renoun uses internally.

Provide the token via the token option, ideally by reading from an environment variable so it is not committed to source control:

import { GitVirtualFileSystem } from 'renoun'

const fileSystem = new GitVirtualFileSystem({
  repository: 'souporserious/renoun',
  token: process.env.GITHUB_TOKEN,
})

When a token is supplied, renoun batches GraphQL blame queries and reuses cached metadata to keep requests to a minimum. Without a token the file system falls back to higher-volume REST sampling, so authenticated requests are the easiest way to prevent rate limiting during development and CI.

Using GitFileSystem

The GitFileSystem adapter provides high-performance file access for local git repositories. It can clone remote repositories automatically and read files at any ref by streaming directly from git’s object store.

Basic usage with a local repo

import { Directory, GitFileSystem } from 'renoun'

const fileSystem = new GitFileSystem({
  repository: process.cwd(),
})
const directory = new Directory({
  fileSystem,
})

Clone a remote repository

import { Directory, GitFileSystem } from 'renoun'

const fileSystem = new GitFileSystem({
  repository: 'souporserious/renoun',
  ref: 'main',
})

By default, remote repositories are cloned into the cache directory.

Sparse checkout for monorepos

To limit the scope of the repository to a specific directory, use the sparse option to specify the directories to check out:

import { GitFileSystem } from 'renoun'

const fileSystem = new GitFileSystem({
  repository: 'https://github.com/vercel/next.js',
  sparse: ['packages/next'],
  ref: 'canary',
})

Performance benefits

  • Blob SHA caching (same content = cache hit)
  • Persistent git processes (no spawn overhead)
  • Read files at any git ref (branches, tags, commits)
  • Works in sparse checkouts

Comparison with GitVirtualFileSystem

Feature          GitFileSystem            GitVirtualFileSystem
File access      git CLI                  HTTP APIs
Rate limits      None                     API limits apply
Auth             SSH keys / git config    OAuth tokens
History depth    Full (if not shallow)    Limited by API

Worktree overlay (local repositories)

GitFileSystem reads file contents from git objects by default. For local repositories with no explicit ref, it uses a worktree overlay that prefers files on disk (including new/uncommitted files) and falls back to git objects when a file does not exist locally.

This keeps local development fast and intuitive while preserving accurate git history:

  • Content comes from disk when available.
  • Git metadata (commit dates/authors/history) still comes from the repository.
  • Untracked files are readable but have no git history.
  • If you set ref explicitly, GitFileSystem switches to git-only reads, as sketched below.
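
A sketch of pinning a ref on a local repository, assuming a main branch exists:

import { GitFileSystem } from 'renoun'

// An explicit ref disables the worktree overlay, so content comes from
// git objects even when the working tree has uncommitted edits.
const fileSystem = new GitFileSystem({
  repository: process.cwd(),
  ref: 'main',
})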

Migration guide

If you previously constructed GitFileSystem or GitVirtualFileSystem directly, the Repository-based equivalents look like this:

import { Directory, Repository } from 'renoun'

// Remote with clone (default)
const clonedRepo = new Repository({
  path: 'https://github.com/mrdoob/three.js',
  depth: 1,
})
const clonedDirectory = clonedRepo.getDirectory('src')

// Remote API
const virtualRepo = new Repository({
  path: 'https://github.com/mrdoob/three.js',
  clone: false,
})
const virtualDirectory = virtualRepo.getDirectory('src')

// Local (auto-detects)
const autoDirectory = new Directory({ path: 'src' })

// Explicit local
const localRepo = new Repository()
const localDirectory = localRepo.getDirectory('src')

Inspecting packages with Package

The file system utilities also include a Package helper that can discover workspace packages, resolve the closest node_modules installation, or fall back to repositories fetched through GitVirtualFileSystem. Once discovered, the package manifest is analyzed so you can introspect exports and imports without manually walking the file tree.

import { Package } from 'renoun'

const renounMdx = new Package({
  name: '@renoun/mdx',
  loader: {
    'remark/add-sections': () => import('@renoun/mdx/remark/add-sections'),
  },
})
const remarkAddSections = await renounMdx.getExport('remark/add-sections')
const defaultExport = await remarkAddSections.getExport('default')
const type = await defaultExport.getType()
const value = await defaultExport.getValue()

Each export entry tracks the manifest conditions, target file path, and helper methods for loading runtime values. The same API works for getImports() when you need to trace how a package consumes other modules, making it straightforward to build documentation or automated analysis around any published or workspace package.
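
A minimal sketch of the import side, assuming getImports returns entries with the same helpers as the export API shown above:

import { Package } from 'renoun'

const renounMdx = new Package({ name: '@renoun/mdx' })

// Each import entry is assumed to expose the same condition, path,
// and value-loading helpers as the export entries above.
const imports = await renounMdx.getImports()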

API Reference