Unlocking Visual Stories: Multi-Image Uploads and AI Workflow Mastery
A deep dive into a recent development sprint, covering the successful migration to multi-image uploads, the verification of an AI-powered implementation pipeline, and critical lessons learned from production mishaps.
Shipping new features and validating powerful AI pipelines are the moments that truly energize a development team. This past session was one of those times, culminating in two significant milestones: a long-awaited upgrade to multi-image uploads and the robust verification of our AI-driven implementation pipeline. We also collected some invaluable (and slightly painful) lessons along the way.
Embracing Richer Narratives: The Multi-Image Migration
For too long, our notes were limited to a single image – a snapshot when what we often needed was a full visual story. This session, we finally broke that barrier, migrating our image upload system to support multiple images per note.
The Technical Deep Dive:
- Schema Evolution: The core change began in our
prisma/schema.prisma. We introduced a newNoteImagemodel, creating a dedicatednote_imagestable. This table now elegantly manages image metadata, includingnoteId,memoryId,tenantId,imageKey(the S3 key), adescription, and a crucialsortOrderfor presentation.prismamodel NoteImage { id String @id @default(uuid()) noteId String memoryId String? // Optional, for context if needed tenantId String imageKey String description String? sortOrder Int @default(0) note ProjectNote @relation(fields: [noteId], references: [id], onDelete: Cascade) memory MemoryEntry? @relation(fields: [memoryId], references: [id], onDelete: Cascade) @@index([noteId]) @@index([tenantId]) } - Database Cleanup: With the new
NoteImagemodel, we could responsibly remove the redundantimageKeyandimageDescriptioncolumns from bothproject_notesandmemory_entriestables, streamlining our data model. - API & UI Overhaul:
- New tRPC mutations (
addImage,removeImage) were added to ourprojects.notesrouter, providing a clean API for managing image collections. - The frontend saw a significant upgrade, with a multi-file drag-and-drop UI implemented in
src/app/(dashboard)/dashboard/projects/[id]/page.tsx, making the user experience intuitive and powerful. - Data loading was updated in
load-notes-content.tsto query images via their new relation.
- New tRPC mutations (
- Strategic Streamlining: The Memory Hub page, which previously had image upload capabilities, was cleaned up. Images are now exclusively managed through Project Notes, simplifying the overall user flow.
This entire multi-image system, committed with 075d9f6, is now deployed to production with robust Row Level Security (RLS) policies in place, ensuring data integrity and security.
Precision Engineering: Verifying Our AI Implementation Pipeline
Beyond feature development, a critical part of this session was validating our AI-powered pipeline designed to convert high-level action points into detailed, actionable implementation plans.
The Pipeline in Action:
- Workflow Execution: We ran workflow
8ae18aa5-a026-448e-8b92-de706296d039, titled "feat: project-onboarding (13 actions)". This workflow is designed to break down a project onboarding feature into 13 distinct development actions. - AI Engine: Our engine of choice for this task was
google/gemini-2.5-pro. It processed all 16 steps of the workflow flawlessly, completing the entire sequence in approximately 18 minutes at a cost of just $0.82. - Validation: The crucial part: we verified that all 13 action points were matched with corresponding, comprehensive implementation plans. The output was impressive – 145,000 characters of detailed specifications, including file paths, code snippets, test instructions, and necessary commands. This output is ready for a developer to pick up and run with.
The PoC Demo Document:
To solidify this achievement and provide a clear blueprint, we created a comprehensive Proof-of-Concept (PoC) demo document: docs/reports/2026-03-15-image-to-implementation-pipeline-poc.md. This document includes:
- A full pipeline architecture diagram.
- A detailed traceability chain, linking back to database tables.
- An actual execution record with costs and durations.
- Mapping to ISO 27001 and DSGVO compliance standards.
- Clear reproduction steps for anyone to follow.
This PoC demonstrates the power of our AI-driven approach to streamline the journey from idea to implementation.
Navigating the Trenches: Lessons Learned
Not everything was smooth sailing. Production environments have a way of revealing the sharp edges of our development practices.
The Perilous db push --accept-data-loss on Production
Challenge: In a moment of oversight, I again ran npx prisma db push --accept-data-loss directly on our production environment. This command, while incredibly useful in development for rapid schema iteration, is a destructive force on production. It led to the accidental dropping of our pgvector embedding vector(1536) column on the workflow_insights table – for the second time!
Workaround & Fix: We quickly restored the column and its associated index using ALTER TABLE ... ADD COLUMN IF NOT EXISTS and CREATE INDEX IF NOT EXISTS.
Lesson Learned (CRITICAL): NEVER run db push on production. Always, always use the dedicated, safe migration scripts (e.g., ./scripts/db-migrate-safe.sh). This incident has highlighted the urgent need for a robust safety guard. We're looking into adding a pre-push Git hook or a shell alias that explicitly blocks this command on production environments.
Anthropic API Credit Crunch
Challenge: During the "Synthesis review" step of one of our workflows, our Anthropic API requests started failing with a 400 error due to insufficient credit balance.
Workaround: The system gracefully fell back to OpenAI o3, but unfortunately, this produced an empty output. The subsequent "Final Implementation Prompt" step was robust enough to compensate independently.
Lesson Learned: API credit monitoring and robust provider failover mechanisms are essential. We need better alerting for low credit balances and a more resilient retry/fallback strategy that ensures valid output, even if it means trying multiple providers.
Looking Ahead
With multi-image uploads deployed and our AI pipeline proven, we're set for an exciting phase of implementation. Our immediate focus will be:
- Topping up Anthropic credits to ensure our review steps run smoothly.
- Implementing a safety guard against accidental
db push --accept-data-losson production. - Diving into the 13 implementation plans generated by our AI, prioritizing critical items like "Fix Hanging Analysis," "Onboarding Wizard," and "Repository Scanning."
- Thoroughly testing multi-image upload on production with fresh content.
- Wiring up the Synthesis review step with improved retry logic and provider failover to guarantee output.
This session was a powerful reminder of the dynamism of software development – the thrill of shipping, the precision of automation, and the humbling lessons learned in the heat of battle. Onwards to the next challenge!