netbusinesslabs
AI·Mar 28, 2026·7 min read

288 SKU configurations in 11 days: anatomy of an AI-augmented catalogue

How we delivered Kindred Studio's full seating collection — 12 SKUs × 6 fabrics × 4 woods — in less than two weeks.

Kindred came to us with a problem most furniture brands now face: their range had quietly grown to the point where physical photography could no longer keep up. 12 SKUs × 6 fabrics × 4 woods = 288 valid combinations. They needed all of them, photoreal, on the website. Here's the pipeline we ran.
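The grid itself is trivial to enumerate — a minimal sketch of the combination count (the SKU, fabric and wood codes here are placeholders, not Kindred's real identifiers):

```python
from itertools import product

# Placeholder identifiers for illustration only.
skus = [f"chair-{i:02d}" for i in range(1, 13)]      # 12 seating SKUs
fabrics = [f"fabric-{i}" for i in range(1, 7)]       # 6 fabric options
woods = [f"wood-{i}" for i in range(1, 5)]           # 4 wood finishes

# Every (SKU, fabric, wood) triple is one catalogue frame.
frames = list(product(skus, fabrics, woods))
print(len(frames))  # 288
```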

Day 1–4 — Photoreal masters

Twelve SKUs, modelled from CAD, lit identically in a studio scene that mirrored Kindred's existing brand photography. One render per SKU, 4K, fully ray-traced. This is the part that's still entirely human craft — a senior 3D artist at the wheel for every shot.

Day 5–6 — Brand fine-tune

We fine-tuned a Stable Diffusion XL model on Kindred's existing catalogue plus the 12 masters we had just rendered. The model learned what 'a Kindred chair' looks like — the studio aesthetic, the lighting bias, the material vocabulary.
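For orientation, a brand fine-tune of this shape can be run as a LoRA fine-tune with the stock `diffusers` example script. Every path and hyperparameter below is an illustrative assumption, not our actual configuration:

```shell
# Illustrative sketch only — dataset path, rank and step count are assumptions.
accelerate launch train_text_to_image_lora_sdxl.py \
  --pretrained_model_name_or_path="stabilityai/stable-diffusion-xl-base-1.0" \
  --train_data_dir="./brand_catalogue" \
  --caption_column="text" \
  --resolution=1024 \
  --rank=16 \
  --max_train_steps=2000 \
  --output_dir="./brand-sdxl-lora"
```

The point of a low-rank fine-tune here is that it biases the model toward the brand's look without disturbing its general priors — exactly the narrow job the pipeline needs.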

Day 7–10 — Variation generation

For each master, we ran 24 variation passes (6 fabrics × 4 woods) through the fine-tuned model. The model preserved geometry, lighting and brand feel; it varied only material and finish. 288 outputs in 4 days of compute.
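We're not publishing the generation code, but the bookkeeping is simple. A sketch of the per-master pass list — the fabric and wood names are placeholders, and the real passes were image-to-image runs conditioned on each master render, not text-only prompts:

```python
from itertools import product

# Placeholder material vocabularies for illustration.
fabrics = ["boucle", "linen", "wool", "velvet", "canvas", "leather"]
woods = ["oak", "walnut", "ash", "beech"]

def variation_prompts(sku: str) -> list[str]:
    """One prompt per fabric x wood pass, anchored to the SKU's master render."""
    return [
        f"{sku}, {fabric} upholstery, {wood} frame, studio lighting"
        for fabric, wood in product(fabrics, woods)
    ]

prompts = variation_prompts("chair-01")
print(len(prompts))  # 24 passes per master; 12 masters -> 288 outputs
```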

Day 11 — Human QA

Every single one of the 288 frames was reviewed by a senior artist. About 4% needed regeneration. Zero AI artefacts shipped. The catalogue went live the next morning.

Why this isn't 'AI slop'

The model never had to invent a chair. It started from a fully ray-traced master. Its job was small and specific: change the fabric, change the wood. That's the fundamental difference between AI-augmented production and AI-only production. One scales craft. The other replaces it — badly.