Building the Canvas Above AI Models
Ethan Proia is the Founding Head of Design at FLORA, a creative AI platform that’s unifying 50+ generative models across text, image, video, and audio into one coherent workspace built for flow, control, and creative agency. Before FLORA, Ethan spent years exploring spatial computing, mixed reality, and human-computer interaction, building immersive experiences and investigating how humans interact with increasingly intelligent systems. His background spans installation art, interactive technology, and interface design for emerging paradigms. FLORA’s founding manifesto makes a strong claim: current AI creative tools are made by non-creatives for other non-creatives to feel creative. Ethan is designing something different: a tool that honors the history of creative software while building scaffolding on top of AI models, not just aggregating them. We brought him in to talk about FLORA’s design philosophy, the specific interface decisions that make 50+ models feel coherent instead of chaotic, and where creative tools are heading as generative AI becomes the dominant material designers work with.
At a glance
FLORA’s core abstraction is modality (text, image, video, audio) rather than individual models, because modalities never change but models constantly evolve.
The node-based canvas makes creative workflows visual and repeatable, turning the process itself into the deliverable, not just the output.
Every new primitive on the canvas has to fight for its life because complexity kills the beauty of node-based thinking for newcomers.
Context and intent are the twin engines of UX in the AI era: every design decision comes down to answering those two questions.
Ethan uses ChatGPT to dump months of FLORA context, and Cursor with Figma MCP to prototype directly in code, moving away from Figma as the source of truth.
FLORA isn’t just creative software; it’s positioning itself as a creative operating system with no allegiance to any form factor or surface area.
The role of designers is changing fundamentally: expect more work directly in code, higher-fidelity prototypes that don’t take forever, and a shared language with engineers.
When hiring designers, Ethan looks for proficiency across multiple creative tools, experience with current AI creative tools, and strong opinions about what works and what doesn’t.
Topics
Creative tools must respect established workflows while innovating where it matters
Ethan says a tool made by creatives for creatives needs to fundamentally acknowledge the history it’s coming from, and to know when to abide by those rules and when to break them. That philosophy is core to how he thinks through FLORA and how to expand it: always asking how they can pay homage to and develop what’s already been done, improve it where it needs improving, and augment it where it needs augmenting, specifically with new technologies. It needs to actually work, be scalable, be enjoyable, and be something you can build a personal relationship with. What strikes Ethan when talking to creatives is how personal people’s relationships with their tools are, which is interesting because tools are made to be adopted by lots of different people, yet there’s such an individualistic experience in using them. The way one person uses Figma is different from the way another uses it, and that extends all the way down. So FLORA needs to be universally approachable, understandable, and adoptable, but also something you can build a relationship with.
Models are building blocks, not the intelligence itself, and the abstraction should reflect that
Ethan thinks we shouldn’t be thinking about models as intelligent in their own right, because they’re not; they’re basically input-output machines that are very good and novel at what they do. He believes we should be using the models themselves as the tools we’re building a foundation on top of, where it’s less about the individual model and more about the structures and scaffolding on top of them. Right now everyone is model-focused: new models come out all the time and people post about which one can do what with better prompt adherence, but Ethan thinks the models are more fundamental than that. We should actually be building structures on top of the models, and that’s core to the philosophy of FLORA and this next generation of creative tools and how they’re integrating and building on these new abilities. Professional users care about models only insofar as they let them get to the creative output they want, so if FLORA could guarantee the same output without mentioning a model at all, Ethan bets the vast majority of creative professionals wouldn’t care. The reason model names and the hype around new model drops still matter is that they’re much more explicit about the kind of control they enable, but the point is always the output: what am I trying to do, what will it look like, what will it feel like, how do I get the result I want.
Modality became FLORA’s abstraction because it’s the only thing that never changes
When FLORA was figuring out the common language and substrate they’re building on top of, Ethan explains, they realized text-to-image models were easy to categorize because they all take text input and output an image; you could add complexity like aspect ratio or resolution, but it’s still manageable. Then multimodal models threw the whole paradigm out the window, because a model that can support both text and image, or some permutation of the inputs, makes that abstraction a lot messier. The conclusion they came to, and what Ethan thinks has contributed to FLORA’s success so far, is that you need to pick an abstraction that has nothing to do with the models. The abstraction they settled on was the modality. Ethan is obviously biased, but thinks that’s the correct abstraction to build on top of, because it’s never going to change: we will always have text, image, video, audio, and 3D models. Those atomic units are the building blocks, which is why they call them blocks on the canvas, and then you can build on top of that. They’re always stress-testing that foundation to make sure it’s compatible with new models that come out, and so far it’s held up. People respond really well to it because it’s more intuitive than nonsense model names for someone just coming into this.
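One way to picture this modality-first abstraction is a sketch in which blocks are typed only by modality, and each model is just an adapter that maps input modalities to an output modality. All names below are hypothetical, not FLORA’s actual API; the point is that swapping or adding models never changes the canvas schema.

```typescript
// Illustrative sketch: the canvas only knows modalities, never models.
type Modality = "text" | "image" | "video" | "audio";

interface Block {
  id: string;
  modality: Modality; // the stable abstraction
  payload: unknown;   // the generated content
}

// A model is an input→output machine over modalities.
interface ModelAdapter {
  name: string;         // vendor model name, hidden from the core UI
  accepts: Modality[];  // input modalities it can take
  produces: Modality;   // modality it emits
}

// The canvas asks only: "which adapters can turn these inputs
// into that output?" New models just register new adapters.
function candidates(
  adapters: ModelAdapter[],
  inputs: Modality[],
  output: Modality
): ModelAdapter[] {
  return adapters.filter(
    (a) =>
      a.produces === output &&
      inputs.every((m) => a.accepts.includes(m))
  );
}
```

Under this framing, a multimodal model that takes both text and image is no longer a special case breaking the taxonomy; it’s just an adapter with a longer `accepts` list.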
Node-based canvases encourage divergence and convergence while making the creative process visible
Ethan loves node-based canvases because they’re inherently spatial, and we are spatial creatures who think in dimensionality and relativity. That’s why infinite canvases have become so popular in design tools: you can place things and organize them. A node-based canvas has all those benefits, and then the fact that you’re actually connecting your train of thought together makes it very easy to follow. It encourages the theme of the double diamond, divergence and convergence, which is very visual when you’re looking at a node-based canvas because you’re quite literally connecting your thoughts together. What FLORA enables specifically, and what they’re excited about, is letting you codify and visualize the process, so the creative process becomes the material you’re working with, the deliverable, not just the output. You could generate a poster of whatever, but what if FLORA could give you the creative process and make it repeatable and scalable? That’s the new deliverable: the process. The paradigm has persisted even though it’s been kind of niche, and Ethan is convinced there’s a reason for that: all the benefits he described. But the reality still stands that it’s confusing for people who are used to traditional interfaces. By simplifying, reducing, and abstracting away the other complexity, he thinks they can let the beauty of that way of making shine through.
Every new primitive on the canvas has to fight for its life to prevent overwhelming complexity
Ethan is always trying to make sure every new primitive they consider introducing to the canvas has to fight for its life. He hates what he’s seen in past node-based tools, and he says this from a place of love because he’s done a lot of work in TouchDesigner, Max/MSP, and Pure Data: if you need a place to put something, the answer is just make another node, put another node on the canvas. That’s cool for people who understand and are in the universe of the software, but for a new person coming in, that’s chaos; you’re telling them there are 400 nodes that all do different things, and they have to know what they do, how to connect them, and in what combinations. There’s been a really beautiful emergent community from that in tools like Blender’s geometry nodes, Unity’s Shader Graph, and Unreal Engine’s Blueprints, where not knowing what’s going on necessitates coming together, making, sharing, and knowledge sharing, which is lovely. But Ethan thinks that can exist without having so many primitives on the canvas. When he says primitives, he means the basic atomic units you’re stitching together to build something bigger, in their case a creative process or workflow.
Nodes create causal relationships and “noodles” are where the context-sharing actually happens
The nodes inherently have a causal relationship and there’s a chronology to them, Ethan explains, which is different from laying things out in Figma, where you might paste horizontally for one iteration and vertically for something else. The nodes imply this thing and then this thing: you start with text, then get an image, then the image turns into a video, and that video turns into something else. The question becomes what actually connects them together, what they call noodles internally, and what’s actually happening in the noodle. Philosophically, what’s happening is the transformation of data; the actual transmission happens in that noodle, so it’s a lot more than just a conceptual connection. Ethan is trying to ground it in physicality and tactility: they’re really connecting these things together, taking this thought and connecting it to this thought and then branching it out to whatever else. That is actually really intuitive for people; it’s not that getting there is hard, and once they get it, people have an aha moment. Ethan has sat down with hundreds of people and shown them how FLORA works, mostly people who have never used any node-based tool before, and when you show someone for the first time that you can take this and connect it and there’s a causal element, they understand that the output of this is going to affect that, which fits with how creatives think generally, because you don’t necessarily know where you’re going.
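The causal, chronological reading of a node canvas can be sketched as a graph walk: a block only runs once every block feeding it through a noodle has produced its output, which is exactly what makes the chronology explicit. This is a minimal illustrative sketch, not FLORA’s implementation; all names are invented.

```typescript
// Hypothetical sketch: blocks connected by "noodles" (directed edges)
// that carry an upstream block's output downstream.
interface CanvasNode {
  id: string;
  // run receives upstream outputs and produces this node's output
  run: (inputs: string[]) => string;
}

interface Noodle {
  from: string; // upstream block id
  to: string;   // downstream block id
}

// Execute the graph in causal order: a block runs only after
// everything feeding it has produced an output.
function execute(nodes: CanvasNode[], noodles: Noodle[]): Map<string, string> {
  const outputs = new Map<string, string>();
  const pending = new Set(nodes.map((n) => n.id));
  const byId = new Map<string, CanvasNode>();
  for (const n of nodes) byId.set(n.id, n);

  while (pending.size > 0) {
    let progressed = false;
    for (const id of [...pending]) {
      const upstream = noodles.filter((e) => e.to === id);
      if (upstream.every((e) => outputs.has(e.from))) {
        const inputs = upstream.map((e) => outputs.get(e.from)!);
        outputs.set(id, byId.get(id)!.run(inputs));
        pending.delete(id);
        progressed = true;
      }
    }
    if (!progressed) throw new Error("cycle detected: noodles must be acyclic");
  }
  return outputs;
}
```

A text → image → video chain then falls out naturally: each downstream block receives its upstream result through the noodle, so changing the text block propagates through the whole chain.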
Context and intent are the twin engines of UX in the AI era
There are two main pillars Ethan thinks about when designing new tools or aspects of FLORA: context and intent. If you can answer both of those questions, you can figure out what to design. The noodles represent context in a sense, sometimes literally but also figuratively, and there’s causality to it, so the deeper question is what kind of context: is it context for the user, for the model, for the AI, or for a collaborator? There are lots of different types, and they actually joked about banning the word context at FLORA because they would lean on it too much when whiteboarding things out; one day Weber asked what is context, what does that actually mean, so they’ve had to break it down into smaller pieces. The primary way they represent it is with a thumbnail in the subsequent block: you have your source block connected to a subsequent block, and you’ll see the previous block as a thumbnail. But context changes depending on the model you’re using; an image can be input to another model as a general reference, a style reference, or a character reference, so something they’re still working on is how to visualize not only context but the subtypes of context. Ethan wants to bring the @-mention paradigm from productivity software into creative software, and while he would have been hesitant if it hadn’t been so proliferated by productivity tools, it’s everywhere now, and that’s context: you mention someone or something, a direct and powerful mechanism using language and a symbol everyone understands.
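The subtypes-of-context problem can be pictured by tagging each connection with a kind, so the same image can flow downstream as a general, style, or character reference and the UI can render each subtype distinctly (as a thumbnail or an @-mention). The names and kinds below are illustrative assumptions, not FLORA’s actual schema.

```typescript
// Hypothetical: a connection carries not just data but the *kind*
// of context the downstream model should treat it as.
type ContextKind = "general" | "style" | "character";

interface ContextEdge {
  from: string;      // source block id
  to: string;        // downstream block id
  kind: ContextKind; // how the downstream model interprets the input
}

// Group a block's incoming context by kind, so the UI can show
// distinct affordances per subtype instead of one generic noodle.
function incomingByKind(
  edges: ContextEdge[],
  blockId: string
): Record<ContextKind, string[]> {
  const grouped: Record<ContextKind, string[]> = {
    general: [],
    style: [],
    character: [],
  };
  for (const e of edges) {
    if (e.to === blockId) grouped[e.kind].push(e.from);
  }
  return grouped;
}
```

The design choice here mirrors the @-mention idea: the kind lives on the connection itself, so adding a new context subtype later means extending one union type rather than inventing a new primitive on the canvas.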
[Demo] Prototyping an audio block in a vibe-coded FLORA twin environment
Ethan showed a version of FLORA that’s not in production, a kind of FLORA twin environment he’s working in and hopes to eventually get his team working in as well. He’s on a longer-term project of moving away from Figma as the source of truth for design and making the code the source of truth instead. Designers should be in there with the front-end team and understand the codebase, not necessarily in its entirety, but in how the design system is actually being translated into code, because too often there’s a disconnect between the design team and the front-end team. This is all made in Cursor using Figma MCP, and what he demoed is a new audio block, the next modality they’re working on after text, image, and video. He showed how the audio block works with text-to-audio generation, how it’s fully scrubbable and interactive with media controls, and how it expands dynamically. Part of the challenge of having as few primitives as possible is that they have to think through in detail how everything connects, because if you have so few primitives but still want to unlock all the complexity of something with 400 primitives, where does that complexity go? A lot of that comes down to context and how you represent different kinds of context.
[Demo] Using ChatGPT and Cursor to build with months of accumulated FLORA context
Ethan’s workflow for starting a project involves the ChatGPT Mac app mapped to Option-Space: he locks himself in a room and records himself talking for 10 minutes about everything they’ve been thinking about, conversations he had with the team, and why he thinks certain approaches are good or bad, as if he were just talking to somebody about it. ChatGPT transcribes that, and then he asks for a PRD, a product requirements document. After many months of use, GPT knows a lot about FLORA and talks about it the way the team does, so it feels like talking to another coworker, like meeting a friend again where they don’t have to start from scratch every time; that is crucial, because if they did, this whole approach wouldn’t work. It’s able to read between the lines and interpolate: when he says block and noodle, it knows what he means, not just from an interface perspective but conceptually, because he’s written up and fed in the philosophical context. He takes the PRD and throws it into Cursor, and Cursor has context from Figma through Figma MCP, so GPT handles the conceptual meta-context about FLORA while Cursor has the visual context: the components from their design system, the rest of the codebase where other features exist, and markdown files and readmes with additional context. He also asks ChatGPT to help write an overview for how the Cursor agents should perform, being extremely granular, because particularly with design you need to be extremely granular.
FLORA is thinking of itself as a creative operating system, not just software
When asked about the ideal form of codifying creative workflow, Ethan says right now, yeah, it’s a node-based canvas, but in the future he doesn’t know, because they have no allegiance to any form factor or surface area. If it turns out that mixed reality is the best place to explore these things, then let’s do it, though he has more of a proclivity toward hybrid experiences. What he’s imagining is a multimodal, cross-platform experience, all feeding back to a shared network, a shared creative system, with other kinds of hardware as satellites to that system. If he has smart glasses and is out in the world and can snap a photo of something or record a note, that’s all context he can feed back into this creative system he’s building. So it’s something that is omnipresent, everywhere and anywhere, accessible in any way he wants: pulling down a creative system from the cloud with infinite access points, whether grounded in software like an actual desktop interface or reached through a wearable; everything just needs to feed into this creative system. They’re a team of mostly creative technologists, so they love prototyping, playing, and exploring, and if FLORA needs to make its way into projection mapping, it will, but they have no allegiance to anything. They need to be as malleable and adaptable as creatives are and as creativity is. Ethan is clear: FLORA is not the last tool they’ll make. FLORA is a much larger idea than just node-based creative software; it’s the overarching idea of building a fundamental creative system that can power anything.
Designers need to take more ownership of code and bring higher fidelity prototypes to design reviews
Ethan thinks the role of designers is going to change a lot and also not change at all. Change a lot in the sense that the tools and methods will change, with designers working a lot more with code, and he thinks designers shouldn’t be afraid of code but can start to think about it as an abstraction. There’s an acknowledgement that ultimately that’s where your design lives, in the code, so you have to get more comfortable with it, interface with it, and spend more time with it. It’s not scary; it’s actually the thing that’s going to get you closer to the exact right animation curve. Designers need to take more control and ownership over that part of the process. There’s going to be an expectation, there’s already an expectation, that designers come to meetings and design reviews with much higher-fidelity prototypes that don’t take forever to make. It’s a pain to make a Figma prototype: you have to wire everything up, it’s still wrong, and auto-animate doesn’t always do what you want, so why inject another abstraction? Just go direct to code. The engineers are going to write it better, of course, but if you can give them a foundation, this is what it should feel like, you can speak a common language, and that’s something that’s going to become a lot more prevalent and prominent.
When hiring designers, Ethan looks for breadth across creative tools and strong opinions on AI tools
One of the things FLORA looks for when hiring is proficiency in multiple creative tools, not just one. That philosophy has been fundamental to them because the tools used by the diversity of ICPs they serve, filmmakers, photographers, fashion designers, graphic designers, architects, are very different. Ethan is always looking for designers, and really anyone who joins the team, with a breadth of understanding of different kinds of software. Ideally it’s not three different kinds of design software: have you played with TouchDesigner and Unreal Engine, have you done any work in CAD? Not that you have to have mastery over these things, but are you familiar with the patterns that are shared and unique across different kinds of creative software, because there’s so much richness there. Having that additional experience is always useful, so they’re looking for folks who aren’t afraid to play and try stuff, dig into new things as they come out, and experiment; you don’t have to be a master, because every tool has a range of different people and ways of using it. In the same vein, specifically in the past 6 to 12 months, they really look for folks who have experience with the current landscape of AI creative tools and who have an opinion. They don’t have to agree; that’s the whole point. Ethan wants to hire people he doesn’t agree with, because that’s where the good ideas come from. But he’d like to see that you have a stance: if you’ve used everything available right now, what do you like, what do you not like, and why, and what from that could they apply to FLORA, because they’re always learning and trying to expand on that.
Thanks for reading. Stay in the loop on new episodes and upcoming events by subscribing.