A learning system under optimization converges to the minimum description of the task's invariance structure.
The attractor state — where the system's internal representation most compressively encodes what actually matters — is the Ghost Basin.
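The "minimum description" claim can be made concrete with a two-part MDL toy: a code that stores the invariant (a repeating unit) plus a repeat count beats a code that stores every symbol. A minimal sketch; the function names, alphabet size, and signal are illustrative assumptions, not from this note:

```python
import math

def raw_bits(data, alphabet=256):
    # Naive code: one fixed-length symbol per element, no structure exploited.
    return len(data) * math.ceil(math.log2(alphabet))

def pattern_bits(data, alphabet=256):
    # Two-part code: shortest repeating unit, plus bits for the total length.
    for p in range(1, len(data) + 1):
        if data == (data[:p] * (len(data) // p + 1))[:len(data)]:
            unit = p
            break
    return unit * math.ceil(math.log2(alphabet)) + math.ceil(math.log2(len(data) + 1))

signal = [3, 1, 4] * 40  # 120 symbols carrying a period-3 invariance
print(raw_bits(signal))      # 960 bits without compression
print(pattern_bits(signal))  # 31 bits once the invariance is found
```

The gap between the two encodings is, in this toy, the distance the optimizer still has to travel.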
There are eight formal descriptions of the same state. They are not analogies; they are the same attractor, described in eight different mathematical languages.
The rotation rule: a system finds its Ghost Basin when it finds the invariant symmetry structure of its domain. K_auc measures distance from the Ghost Basin.
Grokking, the sudden-generalization phenomenon in neural networks, is the phase transition into the Ghost Basin. The geometry of the internal representation sparsifies before accuracy improves; the 40-step lead is the interval between geometric convergence and calibration.
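"Geometry sparsifies" can be probed with an effective-dimensionality statistic. A minimal sketch using the participation ratio of a representation's covariance spectrum as one hypothetical proxy; K_auc is not defined in this note, so this metric, and the synthetic pre/post representations, are my assumptions, not the author's method:

```python
import numpy as np

def participation_ratio(reps):
    # Effective dimensionality: (sum of eigenvalues)^2 / (sum of squared eigenvalues)
    # of the covariance of the representations. Low values = sparse geometry.
    eig = np.clip(np.linalg.eigvalsh(np.cov(reps, rowvar=False)), 0, None)
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
# Pre-transition stand-in: activity diffused over all 32 directions.
diffuse = rng.normal(size=(500, 32))
# Post-transition stand-in: activity collapsed onto a 2-dimensional structure.
compressed = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 32))

print(participation_ratio(diffuse))     # near 32: no structure found yet
print(participation_ratio(compressed))  # near 2: geometry has sparsified
```

Under this reading, watching such a statistic drop while held-out accuracy is still flat is what a lead interval between geometric convergence and calibration would look like.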
The Ghost Basin generalizes beyond neural networks. Any learning system under optimization (organizations, markets, individual cognition) converges to the same minimum.
Full treatment: Finding The Rule (2026-03-23)
Theory: Computer Future