# Recipe App
A Streamlit app that takes pantry ingredients and cooking preferences, then calls a local Ollama LLM to generate three tailored recipe ideas with steps and pairings.
Pantry Pairing Recipe App is a single-file Streamlit application that generates personalized recipe suggestions using a locally running LLM via Ollama. Users describe what’s in their pantry, their time budget, meal type, effort level, dietary needs, and flavor mood — and the app prompts the model to return three structured recipe cards complete with ingredient breakdowns, step-by-step instructions, and pairing suggestions. All inference runs locally; no cloud API keys are required.
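As a rough sketch of the request flow, the app could assemble a prompt from the user's inputs and POST it to Ollama's `/api/generate` endpoint. The function and field names below are illustrative assumptions, not the app's actual code:

```python
# Hypothetical sketch: assembling a request body for the local Ollama HTTP API.
# build_payload and its parameters are assumptions for illustration.
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(pantry: list[str], meal_type: str, time_budget: str,
                  model: str = "llama4") -> dict:
    """Assemble the JSON body for Ollama's /api/generate endpoint."""
    prompt = (
        f"You are a recipe assistant. Pantry: {', '.join(pantry)}. "
        f"Meal type: {meal_type}. Time budget: {time_budget}. "
        "Return three recipe cards in Markdown, each with a time estimate, "
        "matched pantry ingredients, extra items needed, numbered steps, "
        "and a drink/side/vibe pairing."
    )
    # stream=False asks Ollama for a single complete response
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_payload(["eggs", "spinach", "feta"], "dinner", "30 minutes")
# The app would then send this with requests.post(OLLAMA_URL, json=payload).
```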
## Purpose
Test whether a minimal local-LLM stack (Streamlit + Ollama) can deliver a genuinely useful everyday cooking tool, while keeping the prototype simple enough to run anywhere with a single command.
## Highlights
- Configurable Ollama model selection in the UI — swap `gemma4`, `llama4`, or any pulled model without touching code
- Structured Markdown output per recipe: time estimate, pantry ingredient match, extra items needed, numbered steps, fun drink/side/vibe pairings
- Optional favorites saved to a local JSON file
- Graceful error message and instructions when Ollama is not running
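The "Ollama not running" check could be as simple as pinging the server before generating. This is a minimal sketch under that assumption; the helper name and messages are illustrative:

```python
# Minimal availability check for the local Ollama server (names are illustrative).
import requests

OLLAMA_BASE = "http://localhost:11434"

def ollama_available(timeout: float = 2.0) -> bool:
    """Return True if the local Ollama server responds, False otherwise."""
    try:
        return requests.get(OLLAMA_BASE, timeout=timeout).ok
    except requests.RequestException:  # connection refused, timeout, etc.
        return False

# In the Streamlit app this would gate generation, roughly:
# if not ollama_available():
#     st.error("Ollama is not running. Start it with `ollama serve`, "
#              "pull a model, and retry.")
```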
## Technical notes
| Component | Detail |
|---|---|
| Frontend | Streamlit |
| LLM runtime | Ollama (local HTTP API at `localhost:11434`) |
| Data validation | Pydantic |
| HTTP client | requests |
| Package manager | uv |
| Entry point | `uv run streamlit run app.py` |
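Since the table lists Pydantic for data validation, one plausible shape for a validated recipe card looks like the sketch below. All field names are assumptions, not the app's actual schema:

```python
# Hedged sketch of the Pydantic validation layer; field names are assumptions.
from pydantic import BaseModel, Field

class RecipeCard(BaseModel):
    title: str
    time_estimate_minutes: int = Field(gt=0)  # must be a positive estimate
    pantry_matches: list[str]   # ingredients already on hand
    extra_items: list[str]      # ingredients to buy
    steps: list[str]            # numbered instructions
    pairing: str                # drink / side / vibe suggestion

card = RecipeCard(
    title="Spinach Feta Scramble",
    time_estimate_minutes=15,
    pantry_matches=["eggs", "spinach", "feta"],
    extra_items=["chili flakes"],
    steps=["Whisk eggs.", "Wilt spinach.", "Fold in feta and serve."],
    pairing="Toast and iced tea",
)
```

Validating the model's output this way lets the app reject malformed responses before rendering them as recipe cards.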