
Vibe Modeling: Turning Prompts into Parametric Prints
Author(s): Nicolas CHOURROUT
Originally published on Towards AI.
TL;DR: I built a "vibe-modeling" workflow: ask an AI for a part, tweak it if needed, and download a ready-to-print 3D file. Below I show the pipeline, dive into real print examples, and talk about why this is thrilling for beginners yet still a second-string tool for seasoned CAD users.

What is "Vibe Modeling"?
Think of vibe modeling as the CAD twin of vibe coding: you suggest the idea ("sleek bowl", "iPhone stand") and let the LLM handle the math. From there you can keep chatting with the bot to iterate, or nudge the generated sliders to fine-tune dimensions, staying in the creative flow instead of spending hours in Fusion 360 until the design feels right.
The Chat To STL App
I built a basic Streamlit application to demonstrate the concept.
Type your description in the chat and o4-mini does what LLMs do best: write code. In this case the code is plain OpenSCAD (a free, script-based CAD language), so every line the model emits translates directly into geometry. Peek at the script in a collapsible panel or watch it materialise instantly in an interactive 3D viewport beside the chat. Refine the design by sending a follow-up prompt or by nudging the auto-generated sliders (wall thickness, diameters, hole count…). When it looks right, grab a ready-to-print STL or 3MF with one click.
Full repository (bring your own OpenAI key): github.com/nchourrout/chat-to-stl
Edit: you can now also try it online on Hugging Face 🤗
Example 1: Catch-all bowl 🥣
Prompt: "A catch-all bowl with 'have a good day' engraved on the bottom."

The model nailed it on the very first run: no slider tweaks, no re-prompting. I sliced the STL, sent it to my Bambu Lab A1 3D printer, and 45 minutes later pulled the dish off the plate.
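For a sense of how little code a prompt like this needs, here is a hedged Python sketch that emits OpenSCAD in the same spirit as what the model wrote: a hollowed lower hemisphere with text carved into the base. This is not the model's actual output; the dimensions and construction are my own guesses.

```python
def bowl_scad(diameter: float = 120, wall: float = 2.5,
              message: str = "have a good day") -> str:
    """Emit an illustrative OpenSCAD script for a catch-all bowl:
    outer sphere minus inner sphere, upper half cut away, with the
    message engraved (mirrored) into the underside."""
    return f"""\
$fn = 96;  // smooth curves for printing
difference() {{
    sphere(d={diameter});                       // outer shell
    sphere(d={diameter - 2 * wall});            // hollow it out
    translate([0, 0, {diameter / 2}])           // slice off the top half
        cube({diameter + 2}, center=true);
    translate([0, 0, {-diameter / 2}])          // sink text into the base
        mirror([1, 0, 0])                       // readable from below
        linear_extrude(2, center=true)
            text("{message}", size=10, halign="center", valign="center");
}}
"""
```

A dozen lines of constructive solid geometry is exactly the kind of task an LLM handles reliably on the first try.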
Example 2: Swiss-cheese door stop 🧀
Initial prompt: "door stop, wedge shaped like a swiss cheese with holes"

The first render nailed the wedge shape, but the "holes" were all clumped together. I re-prompted the bot to subtract random-sized spheres scattered through the body; one pass later the stop sported true Emmental-style holes and still wedged the door solid.
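The "subtract scattered spheres" fix is easy to picture in code. Below is an illustrative Python sketch (my own reconstruction, not the model's script) that emits a triangular wedge and punches reproducible random holes through it:

```python
import random

def cheese_wedge_scad(length: float = 120, height: float = 80,
                      width: float = 90, n_holes: int = 12,
                      seed: int = 42) -> str:
    """Emit OpenSCAD for a doorstop wedge with random-sized spheres
    subtracted to fake Emmental holes."""
    rng = random.Random(seed)  # fixed seed -> reproducible geometry
    holes = []
    for _ in range(n_holes):
        x = rng.uniform(10, length - 10)
        y = rng.uniform(10, width - 10)
        z = rng.uniform(5, height * (1 - x / length))  # stay under the slope
        r = rng.uniform(4, 10)
        holes.append(f"    translate([{x:.1f}, {y:.1f}, {z:.1f}]) sphere(r={r:.1f});")
    return "\n".join([
        "$fn = 48;",
        "difference() {",
        "    // wedge: right-triangle profile extruded across the width",
        f"    translate([0, {width}, 0]) rotate([90, 0, 0])",
        f"        linear_extrude({width})",
        f"            polygon([[0, 0], [{length}, 0], [0, {height}]]);",
        *holes,
        "}",
    ])
```

Seeding the random generator matters for a parametric workflow: the same inputs always reproduce the same cheese, so slider tweaks don't reshuffle the holes.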
Example 3: Pushing the limits with a backpack hook for an IKEA desk 🎒
Initial prompt: "S-shaped backpack hook I can mount on a table top"


Getting this right took roughly ten iterations 😅. I had to spell out the S-shaped profile, tell the model to extrude that outline, and then ask for precise fillets where I wanted the curves. Once the render finally looked good, the AI surfaced sliders for the dimensions, and a few nudges later the clamp fit my IKEA Linnmon tabletop like a glove. Printed in PLA (though PETG would be even better), it now holds a fully loaded backpack. The final hook feels plenty sturdy, yet the iterative back-and-forth highlighted that knocking out a quick sketch in Fusion 360 would have been faster.
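The recipe I ended up dictating (profile, extrude, fillet) looks roughly like this in code. The outline below is a made-up placeholder, not the profile from my actual prints; the point is the structure: a 2-D polygon, corners rounded with `offset()`, then extruded.

```python
def hook_scad(thickness: float = 20, fillet: float = 3) -> str:
    """Emit OpenSCAD for an S-ish hook: polygon profile -> rounded
    corners via offset() -> linear_extrude. Illustrative only."""
    # crude S/Z outline in mm: clamp lip on top, hook curve below
    profile = [[0, 0], [40, 0], [40, 8], [8, 8], [8, 60],
               [48, 60], [48, 68], [0, 68]]
    pts = ", ".join(f"[{x}, {y}]" for x, y in profile)
    return f"""\
linear_extrude({thickness})
    // shrink-grow-shrink: rounds convex and concave corners alike
    offset(r=-{fillet}) offset(r={2 * fillet}) offset(r=-{fillet})
        polygon([{pts}]);
"""
```

Having to spell out each of these steps in prose is exactly the "ten iterations" tax: the model needed the same decomposition a CAD user would sketch in seconds.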
Bonus Example: A star-shaped box ⭐
Aren't there already AI models for 3D?
Projects like OpenAI's Shap-E, Google's DreamFusion, and NVIDIA's GET3D can turn text into 3D meshes, but those outputs are built for screens, not manufacturing. They produce dense meshes or neural-field representations with no dimensions to edit, so they are hard to customise or repair. Vibe modeling takes a different route: the AI writes parametric OpenSCAD solids that are easily editable and can be sliced cleanly for real-world, functional prints.
Where it falls short
- Simple shapes only. Right now the AI shines with basic objects. Even just asking it for a mug can lead to a completely misplaced handle.
- Needs CAD-style language. Prompts work best if you speak in sketch-and-extrude terms ("draw a 50 × 20 mm rectangle, extrude 5 mm, fillet 3 mm").
- No self-critique. Currently the model can't see what the rendered 3D object looks like. However, I'm working on a version that feeds the render preview image back so the AI can spot errors on its own.
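To show why sketch-and-extrude phrasing works so well, here is roughly how a "rectangle, extrude, fillet" prompt maps onto OpenSCAD, one phrase per operation (emitted from Python here for illustration; in the app the model writes the script directly):

```python
def plate_scad(w: float = 50, d: float = 20, h: float = 5,
               fillet: float = 3) -> str:
    """'draw a 50 x 20 mm rectangle, extrude 5 mm, fillet 3 mm'
    rendered as OpenSCAD -- each phrase becomes one operation."""
    return f"""\
linear_extrude({h})                             // "extrude 5 mm"
    offset(r={fillet}) offset(delta=-{fillet})  // "fillet 3 mm" (round corners)
        square([{w}, {d}], center=true);        // "draw a 50 x 20 mm rectangle"
"""
```

When the prompt already decomposes the part this way, the LLM's job shrinks to transcription, which is why such prompts rarely need re-runs.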
Who should use this?
- Novices & weekend makers: skip the CAD learning cliff and vibe-model a quick fix in minutes.
- Educators: teach parametric thinking without a complex UI.
- Not for power users: for intricate sketches or assemblies, Fusion 360/SolidWorks is still the way to go… but for how long?
If you use the app, tag me; I can't wait to see what you design with it!
Need custom AI tooling? We build tailored solutions at flowful.ai.
Published via Towards AI