Foodies was my 2018 interaction design capstone. Seven years later, I used it as my first vibecoding experiment, feeding an existing design into Claude to see how quickly a working prototype could come out the other side.
INTERACTIVE PROTOTYPE
Click through the prototype above; it's fully interactive.
01 · Why I Used Vibecoding for This Project
Foodies was my interaction design capstone project in 2018, a group dining app that helps people find a restaurant that works for everyone's dietary needs, budget, and location. The full design process is documented in the original Medium article.
In 2025, I wanted to understand vibecoding properly, not in theory, but in practice. Rather than starting from scratch, I chose a project I already knew inside out. Foodies was the perfect test case: I had the user research, the flows, the screen designs. The question was simple: if I feed all of that into Claude, how quickly does a working prototype come out?
Using a project I'd already designed meant I could evaluate the output critically. I knew what the interactions should feel like, so I could judge where Claude got it right and where it needed redirecting.
The experiment: feed the design context into Claude (the problem, the user journeys, the key screens) and see what comes back. How much does it capture? What gets lost? What requires the most correction?
02 · What Vibecoding Means in My Design Practice
Vibecoding, to me, is using AI as a rapid prototyping partner: describing what a product should do, how it should feel, and what it should show, then letting the AI generate a first version to react to. It's not about replacing design thinking. It's about compressing the gap between "I have a design" and "I have something I can click through."
The designer's existing knowledge is what makes it work. You need to know what good looks like to direct the output, correct the gaps, and decide when to push back.
Vibecoding is fast. Describing screens and interactions in conversation, then iterating on the output, is significantly quicker than building prototypes manually, especially for multi-screen flows.
The AI generates the scaffolding. The designer makes it right. Every interaction in this prototype was steered, corrected, or adjusted based on knowing what the original design intended.
The better you describe the design (the user problem, the flow logic, the edge cases), the better the output. Vibecoding rewards designers who think clearly about their own work.
03 · How I Used Claude in the Foodies Project
I started by giving Claude the context it needed: the core user problem (a group trying to agree on a restaurant that works for everyone's dietary needs), the key user journeys from the original design, and the screen structure: splash, home, event creation flow, results, restaurant detail, dietary preferences, and profile.
From there, the process was iterative. Claude generated the prototype code screen by screen. I reacted to each output, corrected where the interactions didn't match the original design intent, and pushed it further. The 4-step event creation flow (friends, budget, location, review) required the most direction to get right.
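To give a sense of the flow logic that needed the most direction, the 4-step event creation flow can be thought of as a simple state machine. This is a minimal sketch of that idea, not the prototype's actual code; the step names come from the design, but the class and method names are my own illustration:

```python
# Hypothetical sketch of the 4-step event creation flow as a state machine.
# Step names are from the design; the structure is illustrative only.
STEPS = ["friends", "budget", "location", "review"]

class EventCreationFlow:
    def __init__(self):
        self.index = 0   # start on the "friends" step
        self.data = {}   # answers collected so far, keyed by step name

    @property
    def step(self):
        return STEPS[self.index]

    def next(self, answer):
        """Save the current step's answer and advance (stops at review)."""
        self.data[self.step] = answer
        if self.index < len(STEPS) - 1:
            self.index += 1
        return self.step

    def back(self):
        """Return to the previous step without discarding saved answers."""
        if self.index > 0:
            self.index -= 1
        return self.step
```

The point of modelling it this way is that going back never loses what the user already entered, which is exactly the kind of interaction detail that needed explicit correction.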
The overall structure, visual hierarchy, and component logic translated quickly. The dietary filtering concept and the match percentage system, both core to the original design, came through with minimal correction.
Interaction details required the most iteration: the step-by-step flow logic, the group dietary aggregation mechanic, and the transitions between states. These are the parts where knowing the original design deeply made the difference.
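To make the aggregation mechanic concrete, here is a minimal sketch of how combining a group's dietary needs into a restaurant match percentage could work. The logic and function names are my own illustration of the concept, not the prototype's actual implementation:

```python
# Hypothetical sketch: aggregate the group's dietary needs, then score each
# restaurant by the share of those needs it covers. Illustrative only.
def group_needs(members):
    """Union of every member's dietary needs, e.g. {"vegan", "gluten-free"}."""
    needs = set()
    for member in members:
        needs |= set(member["diet"])
    return needs

def match_percentage(restaurant_tags, needs):
    """Percentage of the group's needs this restaurant satisfies."""
    if not needs:
        return 100  # no restrictions: everything matches
    covered = needs & set(restaurant_tags)
    return round(100 * len(covered) / len(needs))
```

For example, a group with one vegan and one gluten-free member yields two needs; a restaurant tagged only `vegan` would score 50.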
The prototype above is the result. Click through to see the full 4-step event creation flow.
04 · What I Learned
Using a project I already knew deeply was the right call for a first experiment. It meant I could tell immediately when Claude was on track and when it wasn't, and direct it accordingly. Vibecoding without design knowledge to draw on would produce something much harder to evaluate.
Getting from "here's the design" to "here's a clickable prototype" was significantly quicker than building it manually. The speed is real, especially for multi-screen flows that would take hours to wire up in a traditional prototyping tool.
The quality of the output tracked directly to the quality of the input. The screens and interactions I described most precisely came out best. Vibecoding rewards having thought deeply about the design beforehand; it doesn't replace that thinking.
Vibecoding earned a place in my process, not as a replacement for UX craft, but as a rapid prototyping method for designs that already exist on paper. The next step is testing it on a 0 to 1 project, where the design is still being discovered.