nyxcore-systems

Unlocking Deeper Insights: Building AutoFix and Structured Reports for Our Dev Platform

Dive into how we integrated AutoFix capabilities and revamped our reporting system to deliver structured, actionable insights directly within our developer platform.

TypeScript · React · tRPC · Prisma · Code Analysis · Developer Tools · UI/UX · Automated Refactoring

As developers, we're constantly striving for smarter ways to manage code health and gain actionable insights into our projects. Our internal developer platform is designed to be that central hub, but it needs to evolve with our needs. This past session, my focus was squarely on leveling up our platform's ability to surface automated code improvements and provide crystal-clear reports.

The goal was ambitious: introduce a dedicated AutoFix tab to give developers a rapid overview of automated code corrections, significantly enhance our existing Reports tab to cover all pipeline types, and, crucially, implement a structured finding format for more digestible and actionable reports.

After a focused session, I'm happy to report that all these changes are committed and pushed. Let's break down what went into it.

The AutoFix Tab: Your Proactive Code Health Dashboard

One of the biggest additions was the new AutoFixTab component. We're running automated code fixes, but without a dedicated view, their impact and pending actions weren't immediately visible. This tab changes that completely.

The AutoFixTab is designed to be a developer's first stop for understanding the health of their automated fixes. It provides:

  • Summary Badges: At a glance, you can see the total number of issues identified, a severity breakdown (critical, high, medium, low), and how many Pull Requests (PRs) have been created by the AutoFix pipeline. This offers an instant snapshot of the current state.
  • PR Action Items Card: This is perhaps the most critical part. It lists PRs that require review or merging, complete with direct links to GitHub. No more digging through notifications – the platform tells you exactly what needs your attention.
  • Expandable Run Cards: For a deeper dive, individual AutoFix runs are displayed in expandable cards. Each card shows the run date, the affected repository, its status (completed, failed, or running), and a detailed list of issues with their severity, fix status, and a link to the relevant PR if applicable.
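The summary badges boil down to a simple aggregation over findings. Here is a minimal sketch of that counting logic; the `Finding` shape and `summarize` helper are illustrative names, not the platform's actual code:

```typescript
// Hypothetical shapes mirroring what the AutoFix tab displays.
type Severity = 'critical' | 'high' | 'medium' | 'low';

interface Finding {
  severity: Severity;
  prUrl?: string; // present when the AutoFix pipeline opened a PR
}

interface Summary {
  total: number;
  bySeverity: Record<Severity, number>;
  prsCreated: number;
}

// Aggregate findings into the numbers shown on the summary badges.
function summarize(findings: Finding[]): Summary {
  const bySeverity: Record<Severity, number> = {
    critical: 0,
    high: 0,
    medium: 0,
    low: 0,
  };
  const prUrls = new Set<string>();
  for (const f of findings) {
    bySeverity[f.severity] += 1;
    if (f.prUrl) prUrls.add(f.prUrl);
  }
  return { total: findings.length, bySeverity, prsCreated: prUrls.size };
}
```

Deduplicating PR URLs with a `Set` keeps the "PRs created" badge honest when several findings land in the same pull request.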

To power this, I implemented a new tRPC query: autoFix.byProject. This query efficiently fetches up to 10 recent AutoFix runs that have identified issues, filtering them by repository.projectId. It returns crucial data points like severity, category, fix status, PR URLs, PR numbers, and file paths – everything the UI needs to render those actionable insights.

```typescript
// src/server/trpc/routers/auto-fix.ts
// A conceptual representation of the autoFix.byProject query's return type
interface AutoFixRunSummary {
  id: string;
  createdAt: Date;
  repository: {
    name: string;
    // ... other repo details
  };
  status: 'completed' | 'failed' | 'running';
  findings: Array<{
    severity: 'critical' | 'high' | 'medium' | 'low';
    category: string;
    fixStatus: 'fixed' | 'pending' | 'ignored';
    prUrl?: string; // Link to the created PR
    prNumber?: number;
    filePath: string;
  }>;
}

// The tRPC query would fetch and transform this data...
// trpc.autoFix.byProject.useQuery({ projectId: 'yourProjectId' });
```

The UI also leverages a severity color map and distinct labels/variants for fix statuses, ensuring a consistent and intuitive experience.
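That severity color map is essentially a lookup table keyed by severity, with a parallel table for fix-status labels and badge variants. A plausible sketch follows; the class names and variant values are assumptions for illustration, not the codebase's actual tokens:

```typescript
type Severity = 'critical' | 'high' | 'medium' | 'low';

// Illustrative badge classes; the real component may use different tokens.
const severityColor: Record<Severity, string> = {
  critical: 'bg-red-600 text-white',
  high: 'bg-orange-500 text-white',
  medium: 'bg-yellow-400 text-black',
  low: 'bg-gray-300 text-black',
};

type FixStatus = 'fixed' | 'pending' | 'ignored';

// Label and badge variant per fix status, as described in the post.
const fixStatusMeta: Record<
  FixStatus,
  { label: string; variant: 'default' | 'secondary' | 'outline' }
> = {
  fixed: { label: 'Fixed', variant: 'default' },
  pending: { label: 'Pending', variant: 'secondary' },
  ignored: { label: 'Ignored', variant: 'outline' },
};
```

Centralizing these maps keeps every badge render consistent and makes adding a new severity a one-line change.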

Elevating Reports: Beyond Just Data Dumps

Our ReportsTab was functional but siloed: it could generate reports for individual pipeline types, yet it offered no unified view. The goal here was to make it a central hub for all our automated insights.

I enhanced the ReportsTab to query all three of our primary pipelines via autoFix.byProject, refactor.byProject, and workflows.byProject. Now, a single tab displays completed runs from the AutoFix, Refactor, and Workflow pipelines, each with a "Generate Report" button that triggers our existing ReportGeneratorModal. This provides a much more cohesive user experience, allowing developers to generate comprehensive reports without switching contexts.
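Conceptually, the tab just merges completed runs from the three sources and sorts them newest-first before rendering. A hypothetical merge helper (the `RunRow` shape and function name are illustrative):

```typescript
interface RunRow {
  id: string;
  pipeline: 'AutoFix' | 'Refactor' | 'Workflow';
  createdAt: Date;
  status: string;
}

// Merge completed runs from all three pipelines, newest first.
function mergeRuns(
  autoFix: RunRow[],
  refactor: RunRow[],
  workflows: RunRow[],
): RunRow[] {
  return [...autoFix, ...refactor, ...workflows]
    .filter((r) => r.status === 'completed')
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
}
```

Keeping the merge pure makes it trivial to unit-test independently of the tRPC queries that feed it.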

The Power of Structure: Introducing FINDING_FORMAT

This was a critical step in making our reports truly actionable. Raw text descriptions of findings can be vague or inconsistent. By introducing a FINDING_FORMAT constant, we enforce a structured template for every finding generated by our AI-powered analysis.

```typescript
// src/server/services/report-generator.ts
const FINDING_FORMAT = `
### Finding: [A concise, descriptive title for the issue]
- **Category:** [e.g., Security, Performance, Maintainability, Bug, Code Style]
- **Severity:** [e.g., Critical, High, Medium, Low]
- **Description:** A clear and concise explanation of the issue, its potential impact, and why it's a problem in the context of best practices or project standards.
- **Solution:** Clear, actionable steps to resolve the finding. This should be specific enough for a developer to follow.
- **Reasoning:** Explain the underlying principles, security implications, performance bottlenecks, or best practices that inform the solution. Why is this fix important?
- **Code Example (Optional):**
\`\`\`
// [Relevant code snippet demonstrating the issue or the proposed solution, if applicable]
\`\`\`
`;
```

This format is conditionally injected into the system prompt of our report generation service whenever the featureLabel is "AutoFix" or "Refactor." This ensures that the AI adheres to a consistent structure, providing:

  • Clarity: Each section (Category, Severity, Description, Solution, Reasoning) has a defined purpose.
  • Consistency: All findings look the same, making them easier to parse.
  • Actionability: Solutions are explicitly called out, guiding developers directly to the fix.
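The conditional injection itself can be as simple as a string append guarded by the feature label. A minimal sketch under that assumption; `buildSystemPrompt` and `BASE_PROMPT` are hypothetical names, not the service's actual identifiers:

```typescript
// Hypothetical base prompt; the real service's prompt is more elaborate.
const BASE_PROMPT = 'You are a report generator for our developer platform.';

// Abbreviated stand-in for the FINDING_FORMAT template shown earlier.
const FINDING_FORMAT = `
### Finding: [A concise, descriptive title for the issue]
- **Category:** ...
- **Severity:** ...
`;

// Append the structured finding template only for pipelines that emit findings.
function buildSystemPrompt(featureLabel: string): string {
  const needsFindings =
    featureLabel === 'AutoFix' || featureLabel === 'Refactor';
  return needsFindings ? `${BASE_PROMPT}\n${FINDING_FORMAT}` : BASE_PROMPT;
}
```

Gating on the feature label keeps Workflow reports free of a finding template they would never populate.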

This builds on previous work where we added MERMAID_GUIDANCE (commit b2b1920) to ensure our reports also include visual diagrams, further enhancing their value.

Lessons Learned: A Quirky Database Reminder

Thankfully, this session was remarkably smooth with no major roadblocks. All changes were additive, and I even got a clean TypeScript compile on the first try – always a good feeling!

However, a recurring note from past experiences, and something to always keep in mind when working with custom database types: db:push can sometimes be a bit aggressive. Specifically, it has a tendency to drop the embedding vector(1536) column on our workflow_insights table if you're not careful. This isn't strictly a "pain" from this session as I didn't touch the schema, but it's a critical reminder: always restore or verify your schema after a db:push if you're working with such specialized column types. Better yet, consider using explicit migrations for these changes to prevent accidental data loss or schema drift.
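One way to keep that column visible to Prisma at all is to declare it with an Unsupported type, which Prisma preserves in the schema even though it cannot model the type natively. A sketch under that assumption; the model and field names here are guesses based on the table name, not our actual schema:

```prisma
model WorkflowInsight {
  id        String @id @default(cuid())
  // pgvector column that a careless push can drop; prefer explicit
  // SQL migrations over db:push when touching this table.
  embedding Unsupported("vector(1536)")?

  @@map("workflow_insights")
}
```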

What's Next? Refining the Experience

With the core features in place, the immediate next steps involve thorough QA:

  • Verifying the AutoFixTab correctly displays runs and PR action items for projects with linked repositories.
  • Ensuring report generation from the ReportsTab works for all pipeline types, and that mermaid charts render alongside the new structured finding format.
  • Confirming that PR action item links correctly navigate to GitHub.

Beyond QA, I'm already thinking about future enhancements, such as an "Impact Report" that provides a cross-pipeline summary, and a structured format for refactor reports that specifically addresses opportunities, descriptions, improvements, and code examples.

This session was a significant step towards making our developer platform an even more indispensable tool for maintaining code quality and fostering a proactive approach to development. The new AutoFix tab and enhanced, structured reports mean less time sifting through data and more time acting on insights.

```json
{
  "thingsDone": [
    "Added AutoFix tab to project details page",
    "Implemented autoFix.byProject tRPC query",
    "Built AutoFixTab React component with summary, PR action items, and run cards",
    "Enhanced ReportsTab to query and display runs from AutoFix, Refactor, and Workflow pipelines",
    "Introduced FINDING_FORMAT constant for structured report findings",
    "Conditionally injected FINDING_FORMAT into AI system prompt for AutoFix/Refactor reports",
    "Ensured TypeScript clean code and pushed all changes"
  ],
  "pains": [
    "Recurring issue: db:push can drop 'embedding vector(1536)' column on workflow_insights, requiring manual restore or careful migration planning."
  ],
  "successes": [
    "All changes were additive and integrated cleanly",
    "First-try clean TypeScript typecheck",
    "Achieved all session goals for AutoFix tab and enhanced reporting"
  ],
  "techStack": [
    "TypeScript",
    "React",
    "Next.js",
    "tRPC",
    "Prisma",
    "PostgreSQL",
    "GitHub"
  ]
}
```