Is prompt-based AI design missing direct manipulation?
HCI Today summarized the key points:
- It addresses the problem that today’s AI design tools rely only on text prompts, losing the advantages of direct editing.
- The author says they want a UI tool where AI handles the underlying system while preserving direct-manipulation editing like in Figma.
- In reality, the translation cost between design and implementation is still high, and there is often a need to revisit and fix the generated code or mockups.
- In the comments, many people say the most useful approach is to create a draft with prompts and then fine-tune it through direct manipulation.
- Some tools are already trying to move in this direction, but the seamless handoff between the screen and development is still not fully mature.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article is worth reading because it revisits why AI design tools still tend to stay ‘prompt-first,’ and it brings renewed attention to the value of direct manipulation, an idea long discussed in HCI. For UX practitioners, it helps pinpoint friction in the handoff from generation to editing to development. For researchers, it offers starting points for studying multimodal interaction, human–AI collaboration, and the cognitive costs of switching across tools. In particular, it raises the practical question of how far AI should intervene during the UI editing stage.
CIT's Commentary
From a CIT perspective, this issue is not simply about building a ‘more convenient AI editor’; it is about what the interaction model for design work should be. Prompts are strong for early divergence, but during fine-tuning they often fail to support users’ spatial and visual thinking. The key question, then, is how to combine direct manipulation with AI automation. We view this shift as reducing the ‘translation costs between tools’: the ideal direction is not a structure where natural language, canvas manipulation, and code compete, but one where they flow smoothly into each other depending on the situation. In practice, however, such a tool must also ensure design-system consistency, version tracking, and implementability, so it requires rethinking the entire workflow rather than simply improving WYSIWYG editing.
Questions to Consider While Reading
- Q. In AI-based UI editing, in which stages of work are direct manipulation and prompt input each more appropriate?
- Q. What interaction design is needed to reliably synchronize changes on a design canvas into a development-ready state?
- Q. What constraints and feedback structures are most effective for generative AI to provide editing flexibility without breaking design-system consistency?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.