Why this matters
Programming is the discipline that AI has changed first and fastest. Even students with no programming background can now write working code by describing what they want in plain English. For creative coding — graphics, sound, generative art, interactive installations — the change is twofold: the model helps you write the code, and it can generate the content the code uses.
This chapter is for everyone, programmers or not. We use programming as a creative medium and as a way to understand what is going on inside the tools we have been using.
AI coding assistants in 2026
By 2026 most professional development is done with an AI assistant in the editor. The dominant patterns are:
- Tab-complete — the assistant suggests the next line or block as you type. Originated with GitHub Copilot in 2021.
- Chat in the editor — a sidebar that can see your code and respond in plain language. Tools: Cursor, Windsurf, VS Code with Copilot Chat.
- Inline edit — select code, press a shortcut, describe the change.
- Agentic — describe a task, the assistant plans, edits multiple files, runs the code, fixes errors, and reports back. Tools: Claude Code, Codex CLI, Cursor’s agent mode.
For a beginner the most useful pattern is chat in the editor with explain, fix, and refactor commands. You write a draft (or paste an example), the assistant explains, you ask for changes, you iterate.
We teach you to read code more than to write it. Reading is what lets you supervise an AI.
A worked example: p5.js + an assistant
p5.js is a JavaScript library descended from Processing, designed for visual sketches. You can use it directly in your browser at the p5.js Web Editor without installing anything.
A canonical “hello world” sketch:

```javascript
function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(20);
  noStroke();
  fill(255);
  circle(mouseX, mouseY, 40);
}
```

A useful prompt to an AI assistant, even if you have never seen JavaScript before:
“Modify this p5.js sketch so that instead of a single circle, twenty circles follow the mouse with a trailing delay, and their colour cycles through hues over time.”
A solid assistant will produce something like:

```javascript
let circles = [];
const N = 20;

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100, 1);
  for (let i = 0; i < N; i++) circles.push({ x: 200, y: 200 });
}

function draw() {
  background(20);
  noStroke();
  // Each circle takes the previous position of the one ahead of it,
  // which produces the trailing delay.
  for (let i = N - 1; i > 0; i--) {
    circles[i].x = circles[i - 1].x;
    circles[i].y = circles[i - 1].y;
  }
  circles[0].x = mouseX;
  circles[0].y = mouseY;
  for (let i = 0; i < N; i++) {
    // Hue cycles with the frame count; each circle is offset along the cycle.
    fill((frameCount + i * 18) % 360, 70, 90);
    circle(circles[i].x, circles[i].y, 40 - i * 1.5);
  }
}
```

Run it; play with it; ask the assistant to “make the trail spring instead of linear”, “add a glow effect”, “double the number of circles”, “make it react to audio input”. The whole loop is show me, tweak, repeat.
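One plausible answer to the “make the trail spring instead of linear” request, as a hedged sketch: each circle accelerates toward the circle ahead of it instead of copying its position. The `springStep` helper and the `STIFFNESS` and `DAMPING` constants are our own illustrative choices, not the only reasonable ones.

```javascript
// "Springy trail" variant: circles accelerate toward the one ahead of them.
// STIFFNESS and DAMPING are illustrative values; tweak them and watch.
const N = 20;
const STIFFNESS = 0.1;
const DAMPING = 0.8;
let circles = [];

// Pure update rule along one axis: returns the new position and velocity.
function springStep(pos, vel, target) {
  const v = (vel + (target - pos) * STIFFNESS) * DAMPING;
  return { pos: pos + v, vel: v };
}

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100, 1);
  for (let i = 0; i < N; i++) circles.push({ x: 200, y: 200, vx: 0, vy: 0 });
}

function draw() {
  background(20);
  noStroke();
  circles[0].x = mouseX;
  circles[0].y = mouseY;
  for (let i = 1; i < N; i++) {
    const sx = springStep(circles[i].x, circles[i].vx, circles[i - 1].x);
    const sy = springStep(circles[i].y, circles[i].vy, circles[i - 1].y);
    circles[i].x = sx.pos; circles[i].vx = sx.vel;
    circles[i].y = sy.pos; circles[i].vy = sy.vel;
  }
  for (let i = 0; i < N; i++) {
    fill((frameCount + i * 18) % 360, 70, 90);
    circle(circles[i].x, circles[i].y, 40 - i * 1.5);
  }
}
```

Because `springStep` is a pure function, it is also the kind of thing you can ask the assistant to test in isolation.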
How to talk to a coding assistant
Some habits that pay off:
- Tell it what kind of code you want. “Vanilla JavaScript, no frameworks.” “Python with numpy.” “p5.js running in the browser.” Without that, the assistant defaults to whatever was most common in its training data.
- Show it the smallest possible failing example. Don’t paste the whole project.
- Ask for explanations. “Explain this function line by line as if I have never seen JavaScript.”
- Ask for tests. “Write three small tests for this function.” Helps catch the mistakes the assistant cannot see.
- Re-anchor often. Long conversations drift. Start a new chat for a new task.
- Verify by running the code. The model is wrong more often in code than in prose — but it is also instantly checkable.
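To make “ask for tests” concrete: suppose the assistant had pulled the hue-cycling expression from the trail sketch into a helper (the name `trailHue` is ours, purely illustrative). The three small tests it returns might look like this:

```javascript
// Illustrative helper: the hue of circle i at a given frame,
// extracted from fill((frameCount + i * 18) % 360, 70, 90).
function trailHue(frameCount, i) {
  return (frameCount + i * 18) % 360;
}

// Three small tests. Trivial, but they pin down the wrap-around behaviour.
console.assert(trailHue(0, 0) === 0, "first circle starts at hue 0");
console.assert(trailHue(0, 20) === 0, "20 circles span exactly one hue cycle");
console.assert(trailHue(350, 1) === 8, "hue wraps around past 360");
```

Tests this small feel silly to write by hand; the point is that the assistant writes them, and they catch the off-by-one and wrap-around mistakes it cannot see in its own output.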
Generative graphics, sound, and interactivity
Beyond writing code, AI can also generate the assets that code uses:
- Sprites and characters for a game (chapter 9).
- Backgrounds and skies for an interactive piece.
- Sound effects for buttons and events (chapter 6).
- Voices for NPCs and tutorials.
A common pipeline:
- Sketch the idea on paper.
- Generate placeholder assets with image and audio tools.
- Wire them together in code with an AI assistant.
- Iterate on each piece.
This is the modern equivalent of cardboard-prototyping a board game — fast, scrappy, generative.
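The “wire them together in code” step often amounts to a few lines of p5.js loading. A hedged sketch: the file names below are placeholders for whatever your image and audio tools produced, and `loadSound` assumes the p5.sound addon is included.

```javascript
// Hypothetical wiring step: load generated placeholder assets in p5.js.
// File names are placeholders; loadSound requires the p5.sound addon.
let bg, clickSfx;

function preload() {
  bg = loadImage("assets/background.png"); // from an image model
  clickSfx = loadSound("assets/click.mp3"); // from an audio model
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  image(bg, 0, 0, width, height);
}

function mousePressed() {
  clickSfx.play();
}
```

Swapping in better assets later means overwriting two files, not touching the code — which is what makes the scrappy-prototype loop fast.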
Building a tiny AI-powered web tool
By chapter 8 you should be able to build something like:
- a web page where the user types a sentence and a generated image appears,
- a sketch that listens to the microphone and reacts in colour,
- a button that produces an AI-generated story riff,
- a small dashboard that classifies your selected files into categories using a local model.
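The microphone idea is smaller than it sounds. A minimal sketch using p5.sound’s `AudioIn`, with the level-to-hue mapping pulled into a pure function (`levelToHue` and its 0.3 ceiling are our own illustrative choices):

```javascript
// Minimal "reacts in colour" sketch (requires the p5.sound addon).
let mic;

// Pure mapping: mic level (roughly 0..0.3 for speech) to a hue in 0..360.
// The 0.3 ceiling is an illustrative assumption, not a p5 constant.
function levelToHue(level) {
  const clamped = Math.min(level, 0.3);
  return (clamped / 0.3) * 360;
}

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100);
  mic = new p5.AudioIn();
  mic.start(); // the browser will ask for microphone permission
}

function draw() {
  const level = mic.getLevel(); // amplitude, 0..1
  background(levelToHue(level), 80, 90);
}
```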
A clean stack for prototyping in 2026:
- Frontend: HTML + a single JavaScript file, often using Vite or Bun.
- Model calls: either browser-side using transformers.js or WebLLM, or backend-side via an API call.
- Hosting: Vercel, Netlify, Hugging Face Spaces.
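To make the “backend-side via an API call” option concrete, a hedged front-end sketch. The `/api/generate` endpoint and the payload fields are invented for illustration; your own backend defines the real contract.

```javascript
// Hypothetical helper: build the JSON body for a text-to-image request.
// The field names are invented for illustration.
function buildImageRequest(promptText, { width = 512, height = 512 } = {}) {
  return { prompt: promptText.trim(), width, height };
}

// In the browser you would then send it with fetch:
// const res = await fetch("/api/generate", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildImageRequest(input.value)),
// });
// const { imageUrl } = await res.json();
```

Keeping the request-building logic in a plain function like this makes it easy to unit-test, even though the network call itself only runs in the browser.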
All of this can be assembled with an AI assistant in an afternoon.
This week’s lab: Reflect, Explore, Create
Reflect (≈ 30 min, in lab + your weekly log)
Pick one prompt and write 150–300 words in your weekly log:
- The assistant suggested a function. You ran it. It worked. Five minutes later you cannot remember what it does. What does that mean for your skill, and for the long-term reliability of your project?
- What are three things a coding assistant is worse at than a beginner? (Hint: novelty, debugging across abstraction, judging which library to use.)
- Find one bug the assistant introduced in your sketch. How would you have found it without the assistant?
Read code you did not write. Take the assistant’s longest function from the Create activity below and explain it back in your own words — in your log, line by line. (Use the assistant to check your explanation, but write the first pass yourself.) This habit is the most important coding skill in 2026 and counts as a Reflect activity, not a Create one.
Explore (≈ 30 min, in lab)
Pick the same small p5.js feature (e.g., “make the background fade slowly”) and prompt two different AI coding assistants for an implementation — for example, ChatGPT and Cursor’s inline edit, or Claude and GitHub Copilot Chat. Compare:
- Which one wrote idiomatic p5.js?
- Which one chose a sensible variable name?
- Which one explained itself when you asked “why did you write it this way?”
Two short paragraphs in your log are enough.
Create (≈ 60 min, in lab + carry-over to your portfolio) — a mouse-reactive sketch
- Open the p5.js Web Editor.
- With your AI assistant of choice, build a mouse-reactive sketch that meets at least two of:
  - colour changes with position or speed,
  - shapes leave trails or echoes,
  - sound plays on click,
  - the canvas is responsive to window size.
- Iterate at least three times: ask for changes, run, ask for more.
- Save the sketch publicly and put the link in your weekly log.
Optional advanced track — a generative pipeline. Combine an image model (chapter 5), an audio model (chapter 6), and a tiny script that wires them together. For example: generate four images of “a forest in different seasons” and a 20-second ambient track for each. Display all four with their soundscapes on a simple HTML page, and commit the page to your portfolio.
Going further
- The p5.js learning materials (Processing Foundation, 2024) — beautifully designed, free.
- Daniel Shiffman, The Nature of Code (Shiffman, 2024) — free book on generative graphics.
- Cursor’s docs (Anysphere, 2024) and Claude Code’s documentation (Anthropic, 2024) — for AI-assisted programming.
- The Coding Train YouTube channel (Shiffman, 2024) — long-form creative-coding lectures by Daniel Shiffman.
- Hugging Face Spaces (Hugging Face, 2024) — to see what one-page AI demos look like.
- Processing Foundation. (2024). p5.js — Learn. Processing Foundation. https://p5js.org/learn/
- Shiffman, D. (2024). The Nature of Code (2nd ed.). Self-published. https://natureofcode.com/
- Anysphere. (2024). Cursor Documentation. Cursor. https://cursor.com/docs
- Anthropic. (2024). Claude Code Documentation. Anthropic. https://docs.anthropic.com/claude-code
- Shiffman, D. (2024). The Coding Train — Generative Art and Creative Coding. YouTube. https://www.youtube.com/@TheCodingTrain
- Hugging Face. (2024). Spaces — Hosted Machine Learning Demos. Hugging Face. https://huggingface.co/spaces