<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://kwoolford.github.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://kwoolford.github.io/" rel="alternate" type="text/html" /><updated>2026-04-29T21:20:54+00:00</updated><id>https://kwoolford.github.io/feed.xml</id><title type="html">Kyle Woolford</title><subtitle>Personal site for fun</subtitle><author><name>Kyle Woolford</name></author><entry><title type="html">Creature Creator</title><link href="https://kwoolford.github.io/project/creature-creator/" rel="alternate" type="text/html" title="Creature Creator" /><published>2026-04-25T00:00:00+00:00</published><updated>2026-04-25T00:00:00+00:00</updated><id>https://kwoolford.github.io/project/creature-creator</id><content type="html" xml:base="https://kwoolford.github.io/project/creature-creator/"><![CDATA[<p>Creature Creator (working title: Morphlings Arena) is a small Python 3D game prototype built with the Ursina engine. The project drops a set of randomly generated creatures — each assembled from primitive geometric shapes — into an enclosed arena where they wander, chase rivals, fight, and periodically morph their appearance and stats. The aim is a self-contained playable demo, intentionally limited in scope to keep iteration fast.</p>

<h2 id="purpose">Purpose</h2>

<p>Explore procedural creature design and lightweight 3D combat mechanics using pure Python, without external 3D assets or a heavy engine. Ursina sits on top of Panda3D and keeps the setup to a single <code class="language-plaintext highlighter-rouge">uv add</code> command.</p>

<h2 id="highlights">Highlights</h2>

<ul>
  <li>Creatures built entirely from Ursina primitives — spheres, cubes, cones — with randomized body parts (eyes, horns, tails, wings)</li>
  <li>Per-creature randomized stats: color, size, speed, health, attack damage, aggression range</li>
  <li>Periodic morphing every few seconds that visibly changes a creature’s scale, color, and stats mid-match</li>
  <li>In-scene health bars; creatures are removed when health reaches zero</li>
  <li>Player controls: <code class="language-plaintext highlighter-rouge">Space</code> spawn, <code class="language-plaintext highlighter-rouge">R</code> reset arena, <code class="language-plaintext highlighter-rouge">P</code> pause, <code class="language-plaintext highlighter-rouge">Esc</code> quit</li>
</ul>
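<p>The randomized-stats and morph mechanics above can be sketched in plain Python. This is an illustrative sketch only — the class, field names, and ranges are hypothetical stand-ins, not the project's actual code, and it deliberately leaves out the Ursina entities:</p>

```python
import random
from dataclasses import dataclass

# Hypothetical sketch of per-creature randomized stats and the periodic
# morph mechanic; names and numeric ranges are illustrative.

@dataclass
class CreatureStats:
    size: float
    speed: float
    health: float
    attack: float
    aggro_range: float

def random_stats(rng: random.Random) -> CreatureStats:
    # Each creature rolls independent stats at spawn time.
    return CreatureStats(
        size=rng.uniform(0.5, 2.0),
        speed=rng.uniform(1.0, 5.0),
        health=rng.uniform(50, 150),
        attack=rng.uniform(5, 20),
        aggro_range=rng.uniform(3, 10),
    )

def morph(stats: CreatureStats, rng: random.Random) -> CreatureStats:
    # Periodic morph: jitter every stat by up to +/-30%, clamped so a
    # creature never morphs down to zero size or speed.
    jitter = lambda v: max(v * rng.uniform(0.7, 1.3), 0.1)
    return CreatureStats(*(jitter(v) for v in vars(stats).values()))

rng = random.Random(42)
creature = random_stats(rng)
creature = morph(creature, rng)
```

<p>In the prototype itself the morphed stats would also drive a visible change — rescaling the entity and recoloring its primitives — on the same timer.</p>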

<h2 id="technical-notes">Technical notes</h2>

<table>
  <thead>
    <tr>
      <th>Item</th>
      <th>Detail</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>Language</td>
      <td>Python 3.12+</td>
    </tr>
    <tr>
      <td>Engine</td>
      <td>Ursina 8.3+ (Panda3D)</td>
    </tr>
    <tr>
      <td>Package manager</td>
      <td>uv</td>
    </tr>
    <tr>
      <td>Entry point</td>
      <td><code class="language-plaintext highlighter-rouge">uv run python main.py</code></td>
    </tr>
  </tbody>
</table>

<p><em>Needs more detail in the README.</em></p>]]></content><author><name>Kyle Woolford</name></author><category term="project" /><category term="python" /><category term="game" /><category term="ursina" /><category term="3d" /><summary type="html"><![CDATA[Creature Creator (working title: Morphlings Arena) is a small Python 3D game prototype built with the Ursina engine. The project drops a set of randomly generated creatures — each assembled from primitive geometric shapes — into an enclosed arena where they wander, chase rivals, fight, and periodically morph their appearance and stats. The aim is a self-contained playable demo, intentionally limited in scope to keep iteration fast.]]></summary></entry><entry><title type="html">Recipe App</title><link href="https://kwoolford.github.io/project/recipe-app/" rel="alternate" type="text/html" title="Recipe App" /><published>2026-04-25T00:00:00+00:00</published><updated>2026-04-25T00:00:00+00:00</updated><id>https://kwoolford.github.io/project/recipe-app</id><content type="html" xml:base="https://kwoolford.github.io/project/recipe-app/"><![CDATA[<p>Recipe App (Pantry Pairing Recipe App) is a single-file Streamlit application that generates personalized recipe suggestions using a locally running LLM via Ollama. Users describe what’s in their pantry, their time budget, meal type, effort level, dietary needs, and flavor mood — and the app prompts the model to return three structured recipe cards complete with ingredient breakdowns, step-by-step instructions, and pairing suggestions. All inference runs locally; no cloud API keys are required.</p>

<h2 id="purpose">Purpose</h2>

<p>Test whether a minimal local-LLM stack (Streamlit + Ollama) can deliver a genuinely useful everyday cooking tool, while keeping the prototype simple enough to run anywhere with a single command.</p>

<h2 id="highlights">Highlights</h2>

<ul>
  <li>Configurable Ollama model selection in the UI — swap in <code class="language-plaintext highlighter-rouge">gemma4</code>, <code class="language-plaintext highlighter-rouge">llama4</code>, or any pulled model without touching code</li>
  <li>Structured Markdown output per recipe: time estimate, pantry ingredient match, extra items needed, numbered steps, fun drink/side/vibe pairings</li>
  <li>Optional favorites saved to a local JSON file</li>
  <li>Graceful error message and instructions when Ollama is not running</li>
</ul>
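<p>The structured-output half of the pipeline can be sketched as a Pydantic model validating one recipe card parsed from the model's response. The field names and sample payload here are hypothetical, not the app's actual schema:</p>

```python
from pydantic import BaseModel

# Illustrative sketch of validating one recipe card with Pydantic;
# the field names below are hypothetical stand-ins for the app's schema.

class RecipeCard(BaseModel):
    title: str
    time_minutes: int
    pantry_matches: list[str]
    extra_items: list[str]
    steps: list[str]
    pairing: str

sample = {
    "title": "Pantry Fried Rice",
    "time_minutes": 20,
    "pantry_matches": ["rice", "eggs", "soy sauce"],
    "extra_items": ["scallions"],
    "steps": ["Cook rice.", "Scramble eggs.", "Combine and season."],
    "pairing": "Iced green tea",
}

card = RecipeCard(**sample)  # raises ValidationError on a malformed card
```

<p>In the app, the raw text to parse comes from a <code class="language-plaintext highlighter-rouge">requests</code> POST to the local Ollama HTTP endpoint; validation failures surface as a retry prompt rather than a crash.</p>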

<h2 id="technical-notes">Technical notes</h2>

<table>
  <thead>
    <tr>
      <th>Component</th>
      <th>Detail</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>Frontend</td>
      <td>Streamlit</td>
    </tr>
    <tr>
      <td>LLM runtime</td>
      <td>Ollama (local HTTP <code class="language-plaintext highlighter-rouge">localhost:11434</code>)</td>
    </tr>
    <tr>
      <td>Data validation</td>
      <td>Pydantic</td>
    </tr>
    <tr>
      <td>HTTP client</td>
      <td>requests</td>
    </tr>
    <tr>
      <td>Package manager</td>
      <td>uv</td>
    </tr>
    <tr>
      <td>Entry point</td>
      <td><code class="language-plaintext highlighter-rouge">uv run streamlit run app.py</code></td>
    </tr>
  </tbody>
</table>]]></content><author><name>Kyle Woolford</name></author><category term="project" /><category term="python" /><category term="streamlit" /><category term="llm" /><category term="ollama" /><summary type="html"><![CDATA[Recipe App (Pantry Pairing Recipe App) is a single-file Streamlit application that generates personalized recipe suggestions using a locally running LLM via Ollama. Users describe what’s in their pantry, their time budget, meal type, effort level, dietary needs, and flavor mood — and the app prompts the model to return three structured recipe cards complete with ingredient breakdowns, step-by-step instructions, and pairing suggestions. All inference runs locally; no cloud API keys are required.]]></summary></entry><entry><title type="html">Beat Saber Automapper</title><link href="https://kwoolford.github.io/project/beatsaber-automapper/" rel="alternate" type="text/html" title="Beat Saber Automapper" /><published>2026-04-18T00:00:00+00:00</published><updated>2026-04-18T00:00:00+00:00</updated><id>https://kwoolford.github.io/project/beatsaber-automapper</id><content type="html" xml:base="https://kwoolford.github.io/project/beatsaber-automapper/"><![CDATA[<p>An open-source AI system that takes an audio file and produces a playable Beat Saber level — notes, arcs, chains, bombs, obstacles, and a synchronized light show — packaged as a v3-format <code class="language-plaintext highlighter-rouge">.zip</code>. The pipeline is a three-stage transformer: a shared audio encoder feeds an onset detector, a note-sequence decoder, and a lighting decoder, each conditioned on difficulty, genre, and per-frame song-structure features.</p>

<h2 id="purpose">Purpose</h2>

<p>The goal is to replicate what good human mappers do — density planning, swing-direction flow, lighting that tracks song energy — from audio alone, with a model small enough to train overnight on a single RTX 5090. Existing automappers either target the obsolete v1 format, skip lighting entirely, or produce maps that feel mechanical. This project targets v3, includes lighting, and is trained on a curated high-rating slice of BeatSaver.</p>

<h2 id="highlights">Highlights</h2>

<ul>
  <li><strong>Style-cohort training (V5).</strong> Replaced a single averaged model with per-mapper style cohorts — 18 mappers in 9 style buckets — each trained independently.</li>
  <li><strong>Auto-researcher harness.</strong> A YAML queue of training specs runs overnight; each result is scored on a playability + style-closeness composite and written to a leaderboard.</li>
  <li><strong>Rich conditioning.</strong> Every stage receives difficulty + genre + song-structure embeddings (RMS, onset strength, band energies, section id, section progress) so soft sections slow down and drops get dense.</li>
  <li><strong>Token-level note generation.</strong> Stage 2 uses an autoregressive transformer over a 183-token vocabulary, with beam search or nucleus sampling and an ergonomics loss that penalizes parity violations.</li>
</ul>
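<p>The nucleus-sampling decode mentioned above works as follows: keep the smallest set of highest-probability tokens whose cumulative mass reaches <em>p</em>, renormalize, and sample within that set. A minimal sketch (the tiny distribution and <em>p</em> value are stand-ins, not the real 183-token vocabulary):</p>

```python
import random

# Minimal nucleus (top-p) sampling over a token distribution.

def nucleus_sample(probs: list[float], p: float, rng: random.Random) -> int:
    # Sort token ids by probability, descending.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Keep the smallest prefix whose cumulative mass reaches p.
    kept, total = [], 0.0
    for i in order:
        kept.append(i)
        total += probs[i]
        if total >= p:
            break
    # Renormalize within the nucleus and sample.
    weights = [probs[i] / total for i in kept]
    return rng.choices(kept, weights=weights, k=1)[0]

rng = random.Random(0)
probs = [0.5, 0.3, 0.1, 0.05, 0.05]
token = nucleus_sample(probs, p=0.9, rng=rng)  # drawn from tokens {0, 1, 2}
```

<p>Beam search is the deterministic alternative the decoder also supports; nucleus sampling trades a little playability consistency for variety between generations.</p>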

<h2 id="technical-notes">Technical notes</h2>

<table>
  <thead>
    <tr>
      <th>Stage</th>
      <th>Task</th>
      <th>Arch</th>
      <th>Loss</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>1</td>
      <td>onset detection</td>
      <td>6-block TCN + 2L transformer</td>
      <td>BCE on fuzzy labels</td>
    </tr>
    <tr>
      <td>2</td>
      <td>note sequence</td>
      <td>8L transformer decoder</td>
      <td>CE + flow + ergo</td>
    </tr>
    <tr>
      <td>3</td>
      <td>lighting events</td>
      <td>4L transformer decoder</td>
      <td>CE + label smoothing</td>
    </tr>
  </tbody>
</table>

<p>All stages share one audio encoder: 4-layer CNN frontend → sinusoidal positional encoding → 6-layer transformer (d_model=512). Trained end-to-end on Beat Saber map data — no pretrained speech weights, because what matters here is low-level rhythmic structure, not semantics.</p>
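<p>The sinusoidal positional encoding in that encoder follows the standard transformer formulation — sine on even dimensions, cosine on odd, with geometrically spaced wavelengths. A NumPy sketch (d_model=512 matches the text; the rest is the generic construction, not the project's exact code):</p>

```python
import numpy as np

# Standard sinusoidal positional encoding, as inserted between the
# CNN frontend and the shared transformer.

def sinusoidal_pe(n_frames: int, d_model: int = 512) -> np.ndarray:
    pos = np.arange(n_frames)[:, None]            # (T, 1)
    i = np.arange(0, d_model, 2)[None, :]         # (1, d_model/2)
    angle = pos / (10000.0 ** (i / d_model))      # (T, d_model/2)
    pe = np.zeros((n_frames, d_model))
    pe[:, 0::2] = np.sin(angle)                   # even dims
    pe[:, 1::2] = np.cos(angle)                   # odd dims
    return pe

pe = sinusoidal_pe(1024)  # one row per audio frame
```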

<blockquote>
  <p><strong>Note.</strong> Training on an RTX 5090 (sm_120) requires a PyTorch nightly build with <code class="language-plaintext highlighter-rouge">cu128</code>; stable wheels don’t yet compile for Blackwell.</p>
</blockquote>

<h2 id="links">Links</h2>

<ul>
  <li>Repo: <a href="https://github.com/Kwoolford/beatsaber_automapper">github.com/Kwoolford/beatsaber_automapper</a></li>
  <li>Previewer: <a href="https://allpoland.github.io/ArcViewer/">ArcViewer</a> — drag in a generated <code class="language-plaintext highlighter-rouge">.zip</code></li>
  <li>Map format spec: <a href="https://bsmg.wiki/mapping/map-format.html">BSMG Wiki</a></li>
</ul>]]></content><author><name>Kyle Woolford</name></author><category term="project" /><category term="python" /><category term="ml" /><category term="pytorch" /><category term="audio" /><summary type="html"><![CDATA[An open-source AI system that takes an audio file and produces a playable Beat Saber level — notes, arcs, chains, bombs, obstacles, and a synchronized light show — packaged as a v3-format .zip. The pipeline is a three-stage transformer: a shared audio encoder feeds an onset detector, a note-sequence decoder, and a lighting decoder, each conditioned on difficulty, genre, and per-frame song-structure features.]]></summary></entry><entry><title type="html">Welcome</title><link href="https://kwoolford.github.io/note/welcome/" rel="alternate" type="text/html" title="Welcome" /><published>2026-04-18T00:00:00+00:00</published><updated>2026-04-18T00:00:00+00:00</updated><id>https://kwoolford.github.io/note/welcome</id><content type="html" xml:base="https://kwoolford.github.io/note/welcome/"><![CDATA[<p>This site is a running log of the projects I’m working on and miscellaneous things I want to keep track of.</p>

<p>Most entries will be short digests auto-generated from project READMEs; some will be notes I drop in by hand. The homepage surfaces featured projects at the top, with the full archive filterable below. Each project page is meant to read like a notebook entry — a single-sentence summary, a stack row, a status, then sections for purpose, highlights, and technical notes.</p>

<p>If something is missing or broken, the source is on <a href="https://github.com/Kwoolford/Kwoolford.github.io">GitHub</a>.</p>]]></content><author><name>Kyle Woolford</name></author><category term="note" /><category term="meta" /><summary type="html"><![CDATA[This site is a running log of the projects I’m working on and miscellaneous things I want to keep track of.]]></summary></entry></feed>