Experiments

V12: Attention-Based Lenia

Addition: State-dependent interaction topology (evolvable attention kernels).
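
To make the mechanism concrete, here is a minimal sketch of one update step of an attention-based Lenia variant. All names and parameters (`attention_lenia_step`, `key_w`, `query_w`, the Gaussian growth function) are illustrative assumptions, not the actual V12 implementation: the point is only that the neighborhood weights are computed from the current state via a softmax, so the interaction topology is state-dependent rather than a fixed convolution kernel.

```python
import numpy as np

def attention_lenia_step(state, key_w=1.0, query_w=1.0, radius=3, dt=0.1):
    """One toy attention-based Lenia update (hypothetical sketch).

    Each cell attends over its local neighborhood: weights are a softmax
    of query-key scores derived from the current state, so the effective
    kernel changes as the pattern evolves.
    """
    h, w = state.shape
    new = np.empty_like(state)
    for i in range(h):
        for j in range(w):
            # Gather the neighborhood with wrap-around (toroidal grid).
            ii = np.arange(i - radius, i + radius + 1) % h
            jj = np.arange(j - radius, j + radius + 1) % w
            neigh = state[np.ix_(ii, jj)].ravel()
            # State-dependent attention scores: scalar "query" from the
            # center cell, scalar "keys" from the neighbors.
            scores = (query_w * state[i, j]) * (key_w * neigh)
            attn = np.exp(scores - scores.max())  # numerically stable softmax
            attn /= attn.sum()
            # Attention-weighted neighborhood activation, passed through a
            # Lenia-style Gaussian growth function, then clipped to [0, 1].
            u = attn @ neigh
            growth = 2.0 * np.exp(-((u - 0.5) ** 2) / 0.02) - 1.0
            new[i, j] = np.clip(state[i, j] + dt * growth, 0.0, 1.0)
    return new
```

In an evolutionary setting, `key_w` and `query_w` (or richer per-channel projections) would be the evolvable parameters, replacing the evolvable kernel shape of convolutional Lenia.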

Result: Φ (integrated information) increased in 42% of cycles (vs. 3% for convolution). The +2.0pp shift is the largest single-intervention effect observed. But robustness stabilizes near 1.0.

Implication: Attention is necessary but not sufficient. The system reaches the integration threshold without crossing it.

Source code