
Neural Network Learns the Mandelbrot Set Using Fourier Features to Overcome Spectral Bias

Teaching a neural network to learn the Mandelbrot set reveals deep insights into how neural networks represent complex functions. The Mandelbrot set, a fractal defined by the iteration z_{n+1} = z_n^2 + c starting from z_0 = 0, produces an infinitely detailed boundary that challenges even the most advanced models. While the set is mathematically deterministic, its extreme complexity makes it a powerful testbed for understanding neural network limitations and capabilities.

The key challenge is that the boundary of the Mandelbrot set is highly discontinuous and exhibits infinite detail at every scale. A naive binary classification approach, labeling points as inside or outside, leads to unstable training because the outcome is sensitive to tiny changes in c. To overcome this, the problem is reformulated as a regression task using a smooth escape-time function. This function maps each point in the complex plane to a continuous value between 0 and 1, derived from the iteration count at which the sequence escapes, with logarithmic scaling to improve the distribution and prevent abrupt jumps.

Data collection is optimized by biasing the sampling toward the boundary, where the most complex structure resides. A mix of uniform and boundary-focused samples ensures the model learns both the global shape and the fine details. The dataset is then used to train a deep residual MLP, which takes raw (x, y) coordinates as input and predicts the smooth escape value.

The baseline model, despite its depth and capacity, fails to capture the fine details near the boundary. The output appears blurred, with thin filaments and intricate patterns missing. This is not due to insufficient data, model size, or training time, but to a fundamental issue known as spectral bias: neural networks tend to learn the low-frequency components of a function first and struggle with high-frequency variations, precisely the kind of structure found in fractals.
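The article does not include the project's code, but the smooth escape-time target it describes can be sketched in a few lines. This is a minimal illustration, not the repository's implementation: the function name, the escape radius of 2, the iteration cap, and the normalization by max_iter are all assumptions; the log-log smoothing term is the standard fractional-iteration formula.

```python
import numpy as np

def smooth_escape_time(c, max_iter=100, radius=2.0):
    """Map a complex point c to a smooth value in [0, 1].

    Points that never escape within max_iter iterations map to 1.0.
    Escaping points get a fractional iteration count (the standard
    log-log smoothing), normalized by max_iter so nearby points on
    either side of an integer escape step get nearby targets.
    Names and normalization are illustrative assumptions, not the
    article's actual code.
    """
    z = 0.0 + 0.0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > radius:
            # Fractional iteration count: removes the integer "staircase"
            # in the raw escape count, giving a continuous regression target.
            nu = n + 1 - np.log2(np.log(abs(z)) / np.log(radius))
            return float(min(max(nu / max_iter, 0.0), 1.0))
    return 1.0
```

A training set along these lines would pair sampled (x, y) coordinates with `smooth_escape_time(x + 1j * y)`, concentrating samples near the boundary as the article describes.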
The solution lies in the input encoding. With multi-scale Gaussian Fourier features, the input coordinates are transformed into a higher-dimensional space through sinusoidal projections across multiple frequency bands. This approach, inspired by the 2020 paper by Tancik et al., allows the network to represent high-frequency patterns far more easily. The Fourier features act as a built-in basis for fine-scale structure, effectively shifting the burden of learning complex oscillations from the network to the input representation.

When the same deep residual architecture is trained on multi-scale Fourier features, the results are striking. The model no longer plateaus after learning the global shape. Instead, it refines the image progressively, first capturing large-scale features and then adding finer details. The final output shows sharp, well-defined filaments and the full complexity of the Mandelbrot boundary, something the vanilla model could not achieve.

This experiment demonstrates that the bottleneck is not the model's capacity or the data, but the input representation. By encoding coordinates in frequency space, the network gains the ability to approximate functions that are otherwise intractable. The principle extends far beyond fractals, with applications in computer graphics, physics-informed neural networks, and signal processing, where high-resolution, high-frequency data is common.

All visualizations in this work were generated directly from the model's outputs, with no use of external renderers. The full code, including training, rendering, and animation generation, is available in the associated GitHub repository. The results underscore a key takeaway: the right input representation can transform a model's performance, turning a blurry approximation into a detailed, accurate rendering of one of the most complex mathematical objects in existence.
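The Gaussian Fourier feature mapping from Tancik et al. (2020) is simple enough to sketch. The encoding below follows the paper's formulation, gamma(v) = [cos(2*pi*Bv), sin(2*pi*Bv)] with B drawn from a Gaussian; the multi-scale variant stacking several scales sigma, the choice of sigmas, the number of frequencies m, and the function names are assumptions for illustration, not the project's exact hyperparameters.

```python
import numpy as np

def fourier_features(coords, B):
    """Encode points with random Gaussian Fourier features.

    coords: (N, 2) array of (x, y) inputs.
    B:      (2, m) frequency matrix, typically sigma * randn.
    Returns an (N, 2m) array [cos(2*pi*coords @ B), sin(2*pi*coords @ B)],
    the mapping from Tancik et al. (2020).
    """
    proj = 2.0 * np.pi * coords @ B
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

def multiscale_features(coords, sigmas=(1.0, 10.0, 100.0), m=64, seed=0):
    """Stack feature banks drawn at several frequency scales.

    Low sigmas cover the global shape of the set; high sigmas give the
    network a basis for the fine filaments near the boundary. The
    specific sigmas and m here are illustrative, not the article's.
    """
    rng = np.random.default_rng(seed)
    banks = [fourier_features(coords, sigma * rng.standard_normal((2, m)))
             for sigma in sigmas]
    return np.concatenate(banks, axis=-1)
```

The MLP then consumes the (N, 2m * len(sigmas)) feature matrix in place of the raw (N, 2) coordinates; the frequency matrices are sampled once and kept fixed during training.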
