Kaiber Labs partnered with Yaeji and visual artist Weirdcore to produce a one-hour generative visual set for her third Boiler Room performance at the Brooklyn Mirage. The brief was to build the entire show around Woofa, Yaeji's signature yellow dog character, using AI-generated animation and post-processing. The core challenge was that the techniques we needed barely existed: character consistency, keyframe-driven animation, and aesthetic control across long-form generative video were all unsolved problems at the time.
We built custom pipelines from scratch, evaluating multiple video model approaches before landing on a keyframe-to-motion workflow using Midjourney, SDXL, and Luma DreamMachine. Each model carried different tradeoffs in aesthetic quality, output length, and motion consistency, and the final pipeline reflected weeks of testing and iteration. The output totaled 355 generated clips, roughly 26 minutes of AI-generated video composited into the hour-long set. A secondary deliverable, Woofamoji, used a separate generative pipeline to produce a custom iOS emoji set of Woofa in a fraction of the time traditional illustration would have required.
This project was formative: the production problems we solved here directly informed the design of Superstudio, Kaiber's generative canvas product, and made it clear that the gap between what AI models could do and what creators actually needed was a product problem. Designing this show shaped both the company's direction and my own trajectory from creative technologist to product manager.
Working in FigJam, we quickly noticed that our creative process sped up as we added more context to the canvas: we used it to whiteboard concepts, map workflows, and organize outputs. Reviewing the document as it sprawled, we found we could fluidly trace both our production pipelines and our creative thought processes from concept through execution.
Creative production is usually hindered by the UX friction of juggling a huge number of third-party tools; this single-canvas approach gave us clarity as we turned our applied research learnings into product insight. By combining the production space with the presentation space, a creative user or team could move seamlessly between ideation and making, quickly testing ideas on the fly and branching into one-off experiments without breaking the link to the original idea.
While shaping Kaiber’s product based on our production experience, I saw an opportunity for deeper integration between the applied research and product teams. Though the canvas product was still in its infancy, translating our production processes into curated, custom canvases and making them available to users looked like a natural flywheel connecting partnerships, research, product, and marketing.

Kaiber’s collaboration with Yaeji let us test the very first version of this structure, which would heavily influence how the company developed over time. Using the then-rudimentary canvas tool, we simplified and recreated our core workflows as presets on a curated canvas and pitched the integration directly to Yaeji and Weirdcore.

Translating creative production workflows into actionable product feedback on this project played a significant role in my transition from creative technologist to product manager at Kaiber. It illustrates a few of my core strengths on creative-tooling product teams: embodying the user from direct experience, translating creative workflows into distributable product systems, and coordinating discoveries across teams.


