Phase 2 simulation

Polynomial Playground.

Drop a few dots. Slide the polynomial degree from 1 to 15. The curve morphs in real time — and somewhere around degree 10, you watch overfitting happen with your own eyes.

(Interactive simulation appears here.)

What you’re seeing

A polynomial of degree d has d+1 coefficients. The more coefficients, the more “wiggles” the curve can have. Ordinary least squares finds the coefficients that minimize the sum of squared vertical distances between the curve and your dots.
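A minimal sketch of that fit using NumPy's polyfit, which solves the ordinary least squares problem directly (the dots here are made-up noisy samples, not the simulation's data):

```python
import numpy as np

# A handful of "dots": x positions plus noisy y values (made-up data).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 8)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, size=x.shape)

d = 3                                # polynomial degree
coeffs = np.polyfit(x, y, deg=d)     # ordinary least squares fit
print(len(coeffs))                   # a degree-d polynomial has d+1 coefficients
```

Evaluating the fitted curve at any x is then just np.polyval(coeffs, x).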

When the degree is low, the curve is too rigid to capture real patterns. When the degree is too high, the curve wiggles to pass through every dot perfectly — including all the noise. Training error goes to zero, but the curve has memorized the data instead of learning the underlying shape.
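You can watch the training error collapse numerically. With 8 dots, a degree-7 polynomial has 8 coefficients and can pass through every point exactly, so its training error is zero up to floating-point noise (again a toy setup, not the simulation's internals):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 8)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)

train_mse = {}
for d in (1, 3, 7):
    coeffs = np.polyfit(x, y, deg=d)            # least squares fit at degree d
    train_mse[d] = np.mean((y - np.polyval(coeffs, x)) ** 2)
    print(d, train_mse[d])
```

Training error only ever goes down as the degree rises, because each higher-degree model contains every lower-degree curve as a special case.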

The sweet spot — the degree that captures the pattern without chasing the noise — is what most of machine learning is about. This same principle shows up in every model you’ll meet: linear regression, neural nets, transformers. Same lesson, different scale.
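One standard way to find that sweet spot is to score each degree on dots the fit never saw. A sketch under the same toy assumptions, using a held-out validation set (the degree range and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample(n):
    # Noisy samples from a hidden "true" curve (a sine, chosen for illustration).
    x = rng.uniform(0, 1, n)
    return x, np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)

x_train, y_train = sample(20)
x_val, y_val = sample(20)        # held out: never used for fitting

val_mse = {}
for d in range(1, 10):
    coeffs = np.polyfit(x_train, y_train, deg=d)
    val_mse[d] = np.mean((y_val - np.polyval(coeffs, x_val)) ** 2)

best = min(val_mse, key=val_mse.get)   # degree with lowest held-out error
print(best)
```

Unlike training error, validation error turns back up once the curve starts chasing noise, which is exactly the U-shape the slider lets you feel by hand.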