nyxcore-systems

Bringing Docs to Life: Integrating Dynamic, Interactive Documentation into Our Project Pages

We just shipped a major improvement to our project detail pages, integrating dynamic documentation directly from GitHub, complete with Mermaid diagrams and LaTeX math rendering. Dive into the technical journey, challenges overcome, and the architecture behind a truly 'living' documentation system.

frontend, backend, documentation, markdown, Mermaid, LaTeX, developer-experience, trpc, Next.js, GitHub-API

Documentation is the lifeblood of any complex software project. Yet, it often lives in disparate places, becomes outdated, or lacks the interactivity needed to truly convey complex ideas. We've been feeling this pain, with critical intelligence scattered across various temporary files. Our latest mission? To centralize, modernize, and bring our project documentation to life directly within our application.

Today, I'm excited to share the journey of integrating a dynamic "Docs" tab into our project detail pages. This isn't just about rendering static Markdown; it's about creating a living, interactive documentation experience, complete with rich features like Mermaid diagrams and LaTeX math, all pulled directly from our linked GitHub repositories.

The Vision: Interactive Docs, Always Up-to-Date

Our goal was clear: provide a seamless experience for accessing project documentation. This meant:

  1. A Dedicated "Docs" Tab: A new section on each project's detail page to browse its documentation.
  2. GitHub Integration: Fetching documentation directly from a /docs/ folder within the project's linked GitHub repository.
  3. Rich Markdown Support: Going beyond basic Markdown to include:
    • Mermaid Diagrams: For flowcharts, sequence diagrams, and more, rendered as interactive SVGs.
    • LaTeX Math: For inline ($inline$) and block ($$block$$) mathematical expressions.
  4. Structured Content: Moving all our internal intelligence documents into a structured /docs/ folder within the repository, with individual files for each section.

This wasn't just a UI task; it was a full-stack endeavor touching our backend API, frontend rendering pipeline, and even our internal document generation processes.

Under the Hood: Building the Interactive Docs Engine

Let's break down the key technical pieces that made this possible.

Backend Brilliance: Bridging to GitHub with tRPC

At the heart of fetching our docs is our src/server/trpc/routers/projects.ts file. We extended our projects router with a new docs sub-router:

  • docs.list: This procedure leverages a new fetchRepoTree utility to scan the /docs/ folder of a linked GitHub repository. It provides a list of all available Markdown files, forming the basis for our navigation.
  • docs.get: Given a file path, this procedure uses fetchFileContent to retrieve the raw Markdown content. Crucially, it also extracts the title and a short summary from the Markdown, allowing us to display rich previews.

This setup ensures that our application is always pulling the latest documentation directly from the source of truth – GitHub.

Frontend Flourish: The Markdown Renderer and UI

The magic truly happens on the frontend, where we transform raw Markdown into a rich, interactive experience.

src/components/markdown-renderer.tsx: The Powerhouse

This component is where we brought our Markdown to life:

  • Math Rendering: We integrated remarkMath and rehypeKatex to parse and render LaTeX expressions. Now, $E=mc^2$ and $$\int_0^\infty e^{-x^2} dx = \frac{\sqrt{\pi}}{2}$$ render beautifully.
  • Mermaid Diagrams: A dynamically imported MermaidDiagram component handles code blocks marked with mermaid. It renders them as scalable vector graphics (SVGs) and respects our dark theme, ensuring visual consistency. Dynamic import keeps our bundle size lean.
  • Internal Link Rewriting (RepoLink): A subtle but powerful feature is our RepoLink component. It intelligently converts relative file-path links (e.g., src/server/some-file.ts) within the Markdown to their corresponding GitHub blob URLs when githubOwner and githubRepo props are provided. This allows documentation to link directly to relevant source code on GitHub.

To support these features, we added mermaid, remark-math, rehype-katex, and katex to our package.json dependencies.
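For a sense of how these pieces wire together, here is a rough configuration sketch. It assumes `react-markdown` as the base renderer (the post doesn't name one; `remarkMath` and `rehypeKatex` plug into any remark/rehype pipeline this way), and `./mermaid-diagram` with its `chart` prop is a hypothetical module shape:

```typescript
// Sketch only: react-markdown assumed as the base renderer.
import ReactMarkdown from "react-markdown";
import remarkMath from "remark-math";
import rehypeKatex from "rehype-katex";
import dynamic from "next/dynamic";
import "katex/dist/katex.min.css"; // KaTeX styles for rendered math

// Dynamically import the Mermaid renderer to keep it out of the main bundle.
const MermaidDiagram = dynamic(() => import("./mermaid-diagram"), { ssr: false });

export function MarkdownRenderer({ markdown }: { markdown: string }) {
  return (
    <ReactMarkdown
      remarkPlugins={[remarkMath]}
      rehypePlugins={[rehypeKatex]}
      components={{
        // Route ```mermaid fences to the lazily loaded diagram component.
        code({ className, children, ...props }) {
          if (className === "language-mermaid") {
            return <MermaidDiagram chart={String(children)} />;
          }
          return (
            <code className={className} {...props}>
              {children}
            </code>
          );
        },
      }}
    >
      {markdown}
    </ReactMarkdown>
  );
}
```

The `components` override is the usual extension point for intercepting fenced code blocks; the same hook is where a `RepoLink`-style `a` override would slot in.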

src/app/(dashboard)/dashboard/projects/[id]/page.tsx: The User Experience

Integrating this into our UI was straightforward thanks to our existing tabbed interface:

  • We added a BookOpen icon and a new DocsTab component.
  • A <TabsTrigger value="docs"> now appears alongside our other project tabs.
  • The DocsTab itself offers a three-state experience:
    1. A grid view listing all available documentation sections.
    2. A summary card view for a selected document (showing title and excerpt).
    3. The full, beautifully rendered document using our markdown-renderer.

This hierarchical view makes navigating large documentation sets intuitive and efficient.
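The three-state flow above can be modeled as a small state machine. This is a hypothetical reduction; the actual component presumably holds equivalent state in React hooks:

```typescript
// Hypothetical model of the DocsTab's three view states.
type DocsView =
  | { kind: "grid" } //                      1. grid of all sections
  | { kind: "summary"; path: string } //     2. summary card for one doc
  | { kind: "document"; path: string }; //   3. fully rendered document

// Selecting a card from the grid shows that document's summary.
function selectDoc(path: string): DocsView {
  return { kind: "summary", path };
}

// Opening from the summary shows the full document; otherwise no change.
function openDoc(view: DocsView): DocsView {
  return view.kind === "summary" ? { kind: "document", path: view.path } : view;
}

// "Back" always returns to the grid.
function backToGrid(): DocsView {
  return { kind: "grid" };
}
```

Modeling the views as a discriminated union makes illegal states (e.g. a rendered document with no selected path) unrepresentable.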

Docs Reorganization: A Structured Approach

To support this new system, we also undertook a significant internal effort to reorganize our intelligence documents. What were once large, monolithic temporary files (/tmp/nyxcore-doc-part{1,2,3}.md) have been meticulously split into individual, focused Markdown files within the /docs/ folder (e.g., 01-executive-summary.md through 11-real-world-examples.md).

Furthermore, we're leveraging background agents to automate the generation of new documentation sections (like 12-auto-fix.md, 13-refactor.md, 17-github-connector.md, etc.), ensuring our docs stay current with ongoing development. A few more sections are still being generated by these agents as we speak!

Lessons Learned: Navigating the Treacherous Waters

No development session is without its bumps. Here are a few critical challenges we encountered and how we navigated them:

1. The Pesky npm Cache Permissions

The Problem: During dependency installation, we repeatedly hit EACCES errors due to root-owned files in the default ~/.npm/_cacache/ directory. npm cache clean --force failed to resolve it.

The Solution: Instead of fighting with permissions in the user's home directory, we opted for a simple, reliable workaround: specify a temporary cache directory. Using npm install --cache /tmp/npm-cache-nyxcore allowed us to proceed without root privileges, creating a cache that's safe to delete after the fact. This taught us the value of quick, temporary environment fixes when deeper permission issues are at play.

2. TypeScript and Modern Regex Flags

The Problem: We wanted to use the /s (dotAll) flag in a regular expression within TypeScript (/^(.+?)(?:\n\n|\n#|$)/s) to efficiently capture multi-line content. However, TypeScript threw TS1501: This regular expression flag is only available when targeting 'es2018' or later. Our current target was older.

The Solution: Rather than altering the project's tsconfig.json just for one regex, we adopted the more verbose but universally compatible [\s\S] character class. Replacing . with [\s\S] achieves the same dotAll behavior without requiring a specific ES version target: /^([\s\S]+?)(?:\n\n|\n#|$)/. A good reminder that sometimes the "older" solution is the more portable one.
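To make the equivalence concrete, here is the pattern in context. `extractSummary` is a hypothetical name for where this regex lives in our codebase:

```typescript
// [\s\S] matches any character, including newlines, on every ES target,
// giving the same effect as `.` under the /s (dotAll) flag.
function extractSummary(markdown: string): string | null {
  // Capture everything up to the first blank line, next heading, or EOF.
  const match = /^([\s\S]+?)(?:\n\n|\n#|$)/.exec(markdown.trim());
  return match ? match[1] : null;
}
```

For example, `extractSummary("First paragraph\nspans lines.\n\n# Heading")` returns `"First paragraph\nspans lines."`, spanning the newline that a plain `.` would refuse to cross.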

3. Large Language Model (LLM) Prompt Limits

The Problem: When attempting to merge several large source files into a single documentation section using a background agent, we frequently hit "Prompt is too long" errors. LLMs have token limits, and feeding them too much context at once overwhelms them.

The Solution: We refined our agent strategy. Instead of a single agent trying to synthesize everything, we split the task into multiple parallel agents with narrower scopes (e.g., one for sections 12-16, another for 17-20). We even spun up a dedicated Bash agent to handle the initial file-splitting, ensuring each LLM agent received manageable chunks. This highlights the importance of modularity and careful prompt engineering when working with LLMs for content generation.
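The chunking idea generalizes to a simple greedy batcher. This is a sketch under the assumption that character count is a rough proxy for tokens; `maxChars` would be tuned to the model's context window, and the real agent orchestration is more involved:

```typescript
// Greedy batching: group files into batches whose combined size stays
// under a rough character budget (a proxy for the model's token limit).
interface DocFile {
  path: string;
  content: string;
}

function batchFiles(files: DocFile[], maxChars: number): DocFile[][] {
  const batches: DocFile[][] = [];
  let current: DocFile[] = [];
  let size = 0;
  for (const file of files) {
    // Flush the current batch before it would exceed the budget.
    // A single oversized file still gets its own batch.
    if (current.length > 0 && size + file.content.length > maxChars) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(file);
    size += file.content.length;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each resulting batch maps to one narrow-scope agent, which is essentially what our split into "sections 12-16" and "17-20" agents did by hand.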

What's Next?

While the core functionality is shipped, a few final steps remain:

  1. Waiting for our background agents to finish generating and writing the last few documentation sections (docs/14-code-analysis.md, docs/15-action-points.md, docs/16-discussion-service.md, and docs/20-sidebar-active-processes.md).
  2. Committing and pushing these final documentation files.
  3. Thorough end-to-end verification of the Docs tab on a project with a linked GitHub repo, ensuring Mermaid diagrams render correctly in dark mode and all links function as expected.
  4. Considering an optional docs/00-index.md file to serve as a README or table of contents for the documentation folder.

Conclusion

This integration marks a significant leap forward in how we manage and consume project documentation. By pulling content directly from GitHub, supporting advanced Markdown features, and creating an intuitive user interface, we're transforming our static documentation into a dynamic, living resource. This not only enhances the developer experience but also ensures our critical project intelligence is always accessible, up-to-date, and presented in the most effective way possible.