An image briefly flashed on the retina triggers a cascade of processes in the brain that unfold over time. Some processing streams focus on texture, others on spatial location or large-scale form. Temporal evolution along different pathways may occur at different rates. Even within a single processing stream, distinct locations in space can evoke neural responses at slightly different times; a static image never exists in the brain. The images presented here make visible snapshots of these temporally evolving internal forms.
The first stage of the algorithm resembles the filtering that occurs in the retina, thalamus, and primary layers of the cortex. The next step derives from the observation that different neurons respond to distinct parts of an image at slightly different times. When these temporal variations are translated into spatial dislocations, uniform fields are distorted, planes acquire curvature, and fractures arise. (A physical analogy is the underwater light caustics created by waves on the surface of a swimming pool.) The process is related to recent algorithms for time-frequency analysis.
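The two stages described above can be sketched in code, though only as a minimal illustration rather than the actual algorithm: the difference-of-Gaussians filter standing in for retinal filtering, the mapping from filter response to "latency" to displacement, and every parameter value here are assumptions.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = gaussian_kernel(sigma, int(3 * sigma))
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, tmp)

def dislocate(img, gain=4.0):
    # Stage 1: center-surround (difference-of-Gaussians) filtering,
    # a rough stand-in for retinal/thalamic bandpass filtering.
    dog = blur(img, 1.0) - blur(img, 3.0)
    # Stage 2: treat the local filter response as a response latency
    # and convert it into a horizontal spatial displacement, so that
    # uniform regions stay put while contrasty regions are dislocated.
    h, w = img.shape
    ys, xs = np.indices((h, w))
    shifted = np.clip(xs + (gain * dog).astype(int), 0, w - 1)
    return img[ys, shifted]

# Example: a random grayscale field, warped by its own filter response.
field = np.random.rand(32, 32)
warped = dislocate(field)
```

In a uniform field the difference-of-Gaussians response is zero away from the borders, so those pixels are not displaced at all; distortions and fractures appear only where the filter responds, which is the behavior the text describes.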