nyxcore-systems
5 min read

Crafting AI Identity: Building a Dynamic Avatar Generator for Personas

We've replaced static AI persona portraits with a dynamic, deterministic, and skill-driven avatar generator, built with zero external dependencies, bringing a new level of visual identity to our AI agents.

AI · avatars · frontend · backend · TypeScript · Node.js · PNG · developer-experience · tRPC

In the world of AI, personas are more than just a collection of prompts and capabilities; they're the digital embodiment of an intelligent agent. For a long time, our AI personas relied on a static pool of portraits – functional, but ultimately generic. This week, we embarked on a mission to inject a vital dose of personality and determinism into our AI's visual identity, culminating in a brand-new, dynamic avatar generator.

The goal was ambitious: replace those static images with unique, 24x24 block-matrix avatars for each AI persona. These weren't just random images; they needed to be 300x300 transparent PNGs, deterministically generated, and visually driven by the persona's core skills and chosen theme. The result? A system that brings our AI agents to life with custom, reflective imagery.

Under the Hood: The Avatar Generator (src/server/services/avatar-generator.ts)

At the heart of this transformation lies our new avatar-generator.ts service. What makes it special? It's a completely self-contained, zero-dependency solution, a testament to what you can achieve with Node.js built-ins.

Here’s a peek at its core features:

  • Symmetric Silhouette: Each avatar starts as a 24x24 grid, forming a pleasingly symmetric humanoid silhouette – a head, neck, shoulders, torso, and a stable base. This provides a consistent, recognizable shape.
  • Depth-Based Coloring: We implemented a clever coloring system that applies different hues based on the "depth" of a pixel within the silhouette. This creates a nuanced, layered look, moving from edge colors to fill colors and finally to a core color.
  • Skill-Driven Palette: This is where our AI personas truly shine. We mapped over 20 different skills to specific RGB color values. When an avatar is generated, its primary colors are derived directly from the persona's skill set, making its visual identity a direct reflection of its capabilities.
  • Thematic Variations: To add another layer of customization, we introduced two distinct themes:
    • nyxcore: A more intricate design featuring a subtle glow ring, a gold sigil overlay (a diamond+cross pattern), and additional accent points.
    • minimal: A cleaner, more subdued aesthetic with fewer accents, focusing on the core silhouette and colors.
  • Personality Touches: Small details make a big difference. White, symmetric eyes are strategically placed at row 4, giving each avatar a hint of personality and focus.
  • Deterministic Output: Consistency is key. Using a seeded PRNG (mulberry32 algorithm), we ensure that an avatar generated with the same inputs (skills, name, theme, variant) will always produce the exact same image. This means no surprises, and a stable visual identity for each persona.
  • Zero-Dependency PNG Encoding: This was a significant win! Instead of pulling in heavy native libraries like sharp, we leveraged Node.js's built-in crypto and zlib modules. We manually encoded PNG chunks, resulting in lean, transparent PNGs (typically 1.5-2KB) that are written directly to public/images/personas/avatar-{hash}.png. This keeps our dependency tree light and our performance snappy.
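To make the determinism concrete, here is a minimal sketch of the seeded-PRNG and skill-palette ideas. The helper names (`hashSeed`, `SKILL_COLORS`, `pickPalette`) and the specific RGB values are illustrative, not the service's actual API; only the mulberry32 algorithm itself is the one named above.

```typescript
// mulberry32: a tiny seeded PRNG -- the same 32-bit seed always
// yields the same sequence of floats in [0, 1).
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = a;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Fold the persona's inputs (skills, name, theme, variant) into one
// 32-bit seed via FNV-1a, so identical inputs give identical avatars.
function hashSeed(...parts: (string | number)[]): number {
  let h = 2166136261; // FNV-1a offset basis
  for (const ch of parts.join("|")) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 16777619);
  }
  return h >>> 0;
}

// A tiny slice of a skill -> RGB map (hypothetical values).
const SKILL_COLORS: Record<string, [number, number, number]> = {
  typescript: [49, 120, 198],
  research: [142, 68, 173],
  writing: [39, 174, 96],
};

// Pick a primary color from the persona's known skills; the seeded
// RNG keeps the choice stable across regenerations.
function pickPalette(skills: string[], rng: () => number): [number, number, number] {
  const known = skills.filter((s) => s in SKILL_COLORS);
  if (known.length === 0) return [128, 128, 128]; // neutral fallback
  return SKILL_COLORS[known[Math.floor(rng() * known.length)]];
}
```

Because every random decision flows through the one seeded generator, "Regenerate" can simply bump the variant number fed into the seed to get a new but still reproducible avatar.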

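The zero-dependency PNG encoding described above boils down to three chunks (IHDR, IDAT, IEND), each CRC-checked, around a deflated stream of filter-prefixed scanlines. This is a hedged sketch of that idea, not the service's actual encoder; `encodePng` and `chunk` are illustrative names, and the CRC-32 table is hand-rolled since older Node versions don't expose one.

```typescript
import { deflateSync } from "node:zlib";

// CRC-32 table for the PNG polynomial (0xEDB88320).
const CRC_TABLE = new Uint32Array(256).map((_, n) => {
  let c = n;
  for (let k = 0; k < 8; k++) c = c & 1 ? 0xedb88320 ^ (c >>> 1) : c >>> 1;
  return c >>> 0;
});

function crc32(buf: Buffer): number {
  let c = 0xffffffff;
  for (const b of buf) c = CRC_TABLE[(c ^ b) & 0xff] ^ (c >>> 8);
  return (c ^ 0xffffffff) >>> 0;
}

// A PNG chunk: 4-byte big-endian length, 4-byte type, data,
// then a CRC computed over type + data.
function chunk(type: string, data: Buffer): Buffer {
  const len = Buffer.alloc(4);
  len.writeUInt32BE(data.length);
  const body = Buffer.concat([Buffer.from(type, "ascii"), data]);
  const crc = Buffer.alloc(4);
  crc.writeUInt32BE(crc32(body));
  return Buffer.concat([len, body, crc]);
}

// Encode raw RGBA pixels (width * height * 4 bytes) as a transparent PNG.
function encodePng(rgba: Buffer, width: number, height: number): Buffer {
  const ihdr = Buffer.alloc(13);
  ihdr.writeUInt32BE(width, 0);
  ihdr.writeUInt32BE(height, 4);
  ihdr[8] = 8; // bit depth
  ihdr[9] = 6; // color type 6 = RGBA (gives us transparency)
  // bytes 10-12 (compression, filter, interlace) stay 0

  // Each scanline gets a leading filter byte 0 ("None"), then deflate.
  const stride = width * 4;
  const raw = Buffer.alloc((stride + 1) * height);
  for (let y = 0; y < height; y++) {
    rgba.copy(raw, y * (stride + 1) + 1, y * stride, (y + 1) * stride);
  }
  return Buffer.concat([
    Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]), // PNG signature
    chunk("IHDR", ihdr),
    chunk("IDAT", deflateSync(raw)),
    chunk("IEND", Buffer.alloc(0)),
  ]);
}
```

With flat-color block avatars, the deflated IDAT compresses extremely well, which is why the files land in the 1.5-2KB range.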
Bringing It to Life: Integration Points

A powerful generator is only as good as its integration. We wired our new avatar system into three critical areas:

  1. generateAvatar tRPC Mutation: A new mutation was added to src/server/trpc/routers/personas.ts. It accepts skills, name, systemPrompt, theme, and a variant number. Crucially, if no explicit skills are provided, the system intelligently extracts specializations directly from the persona's systemPrompt, ensuring every persona gets a skill-colored avatar. The persona's name serves as a fallback seed for determinism.

  2. New Persona Creation UI (src/app/(dashboard)/dashboard/personas/new/page.tsx): The most visible change is here. The static portrait picker has been replaced with the dynamic generator. Users can now:

    • Toggle between nyxcore and minimal themes.
    • Click a "Regenerate" button, which increments a variant counter to deterministically produce a slightly different avatar.
    • See a skeleton loading state during generation, with graceful fallbacks if anything goes awry.
    • Have an avatar generated automatically when selecting a suggested persona: handleSelect now triggers generation based on the suggestion's systemPrompt.
  3. Executive Intelligence Dashboard (Overview): We confirmed that the Overview tab's implementation was already complete from a previous session, ensuring that new personas with generated avatars will be displayed correctly.
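The input-resolution behavior of the mutation (explicit skills win, otherwise extract from the systemPrompt, with the name as the deterministic fallback seed) can be sketched in isolation. `KNOWN_SKILLS`, `resolveAvatarInputs`, and the `AvatarInput` shape are illustrative stand-ins, not the router's real types or helpers.

```typescript
// Hypothetical subset of recognized skill keywords.
const KNOWN_SKILLS = ["typescript", "research", "writing", "design"];

interface AvatarInput {
  name: string;
  skills?: string[];
  systemPrompt?: string;
  theme?: "nyxcore" | "minimal";
  variant?: number;
}

function resolveAvatarInputs(input: AvatarInput) {
  // Explicit skills win; otherwise scan the prompt for known skill words.
  let skills = input.skills ?? [];
  if (skills.length === 0 && input.systemPrompt) {
    const prompt = input.systemPrompt.toLowerCase();
    skills = KNOWN_SKILLS.filter((s) => prompt.includes(s));
  }
  return {
    skills,
    // The persona's name backstops the seed when no skills are found,
    // so generation stays deterministic either way.
    seed: skills.length > 0 ? skills.join("|") : input.name,
    theme: input.theme ?? "nyxcore",
    variant: input.variant ?? 0,
  };
}
```

Plugging a resolver like this into the mutation keeps the tRPC layer thin: it validates input, resolves it, and hands the result to the generator service.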

Challenges & Lessons Learned

No development session is without its hurdles, and this one offered a couple of valuable lessons:

  • TypeScript and Set Iteration: We hit a TS2802 error when trying to iterate directly over a Set<string> using for (const key of sigil). This is a known project-wide constraint related to --downlevelIteration settings (documented in our CLAUDE.md Prisma Gotchas). The fix was straightforward but important to remember: convert the Set to an array (Array.from(getSigilCells())) and then use an index-based for loop. It's a good reminder to be aware of your project's specific TypeScript configuration and its implications.
  • The Dependency Dilemma: sharp vs. Built-in: Initially, we considered using sharp for PNG generation. While sharp is incredibly powerful, it's a native dependency, which adds complexity to installation, deployment, and overall project weight. We made a conscious decision to avoid it. Instead, we invested time in implementing a custom PNG encoder using Node.js's zlib.deflateSync and manual PNG chunking. This proved to be a fantastic decision, resulting in a perfectly functional, zero-dependency solution that keeps our bundle size minimal and our deployment frictionless. It reinforced the value of building lean and leveraging built-in capabilities when possible.
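The Set-iteration fix above looks like this in practice. `getSigilCells` is a stand-in for the real helper; the point is the Array.from conversion plus an index-based loop, which compiles cleanly without --downlevelIteration.

```typescript
// Without --downlevelIteration, `for (const cell of someSet)` trips
// TS2802 when targeting older ES versions. Converting the Set to an
// array first sidesteps the constraint.
function getSigilCells(): Set<string> {
  return new Set(["11,4", "12,4", "11,5", "12,5"]); // "x,y" grid keys
}

const cells = Array.from(getSigilCells());
const painted: string[] = [];
for (let i = 0; i < cells.length; i++) {
  painted.push(cells[i]); // paint the sigil cell at cells[i]
}
```

Since both Set and Array.from preserve insertion order, the loop visits cells in the same order every time, which matters for deterministic output.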

What's Next?

With the core generator and its primary integrations complete, our immediate next steps include:

  1. Committing all changes and pushing to the main branch.
  2. Considering regenerating avatars for our existing built-in personas in prisma/seed.ts to give them a fresh, dynamic look.
  3. Extending the avatar regeneration functionality to the persona detail and edit pages (/dashboard/personas/[id]), allowing users to update avatars for existing personas.
  4. Exploring the possibility of integrating the generator into the persona overview panel in our analytics, perhaps auto-generating avatars for any personas that might be missing an imageUrl.

This session has been incredibly rewarding, breathing new life into our AI personas. By giving them dynamic, skill-driven visual identities, we're not just improving aesthetics; we're enhancing the overall experience and connection users have with our intelligent agents.