**Learning adaptive behavior requires dynamic environments**
CPGs (central pattern generators) / networks of sine-driven actuation largely ignore sensory input; they can be learned in isolation and don't need to adapt
Dynamic environments are necessary for reactive, adaptive behavior
Phases of learned adaptive behavior
- responding to environmental differences (e.g. obstacles)
- undulating landscape
- responding to sensory data generated by other actors
- predicting or modeling the behavior of others as a monolith
- predicting or modeling the behavior of others as distinct groups
- predicting or modeling the behavior of others as individuals
Interactive
- Watch creatures evolve in an environment
- change parameters of the environment
- energy density, clumpiness, randomness
- interaction with others, transfer energy (fight)
- islands connected by tunnels vs open plains
- gravity
- fluid effects
- see effect on population statistics
- save checkpoints, restart on different timelines
- see list of creatures, select to focus on one
- Take over a creature to train it
- random mapping between finger joints/endpoints and creature joints/endpoints
- start training, DAgger-style behavioral cloning
- learned weights mapped back to genome
- individual reinserted into population
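A sketch of the takeover flow, under stated assumptions: `random_joint_mapping` pairs tracked hand joints with creature joints, and `dagger_train` is a minimal DAgger loop where the human acts as the expert. The `policy.fit` / `rollout` interfaces are hypothetical, and this is Python for brevity rather than the eventual Swift:

```python
import random

def random_joint_mapping(finger_joints, creature_joints, seed=None):
    """Randomly assign a driving finger joint/endpoint to each creature
    joint (one finger joint may drive several creature joints)."""
    rng = random.Random(seed)
    return {cj: rng.choice(finger_joints) for cj in creature_joints}

def dagger_train(expert_action, policy, rollout, iters=5):
    """Minimal DAgger loop: roll out the current policy, have the expert
    (the human driving the creature via hand tracking) relabel the
    visited states, aggregate, and refit. `policy.fit(dataset)` and
    `rollout(policy)` are assumed interfaces, not a real API."""
    dataset = []
    for _ in range(iters):
        states = rollout(policy)                            # states visited under current policy
        dataset += [(s, expert_action(s)) for s in states]  # expert relabels them
        policy.fit(dataset)                                 # supervised refit on the aggregate
    return policy
```

The trained weights would then be written back into the genome before the individual is reinserted into the population.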
- Generative modeling of creatures (sketchy ideas)
- self avoiding curves
- aggregation of particles
- reaction-diffusion
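Of these, aggregation of particles is easiest to sketch concretely: a toy diffusion-limited aggregation on a grid, where walkers freeze when they touch the growing cluster (illustrative only, not a committed design):

```python
import random

def dla_aggregate(n_particles=50, size=31, seed=0):
    """Diffusion-limited aggregation: particles random-walk on a grid
    until they touch the cluster, then freeze onto it. The frozen cell
    set could later be read off as a rough body silhouette."""
    rng = random.Random(seed)
    stuck = {(size // 2, size // 2)}              # seed cell at the center
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        x, y = rng.randrange(size), rng.randrange(size)
        while True:
            if any((x + dx, y + dy) in stuck for dx, dy in moves):
                stuck.add((x, y))                 # touched the cluster: freeze
                break
            dx, dy = rng.choice(moves)
            x = min(max(x + dx, 0), size - 1)     # clamp the walk to the grid
            y = min(max(y + dy, 0), size - 1)
    return stuck
```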
# RealityKit tests
VisionOS has RealityKit, which has a physics simulator.
- Issues so far:
- No motors on joints, but force/torque can be applied directly. Kinda fun to have to be a bit lower level
- Little/poor documentation, though there is some and developer support seems responsive
- Benefits
- Direct tie to renderer and events, easy to handle interactions
- New and shiny-ish
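Re the missing joint motors: a motor can be emulated with a PD controller that applies a clamped torque about the joint axis every physics tick. Sketched in Python for brevity (gains and the single-hinge integrator are illustrative; the real version would be Swift applying torque through RealityKit):

```python
def pd_torque(target_angle, angle, angular_velocity,
              kp=20.0, kd=2.0, max_torque=10.0):
    """One tick of a PD 'motor': torque toward the target angle,
    damped by angular velocity, clamped to a torque limit."""
    torque = kp * (target_angle - angle) - kd * angular_velocity
    return max(-max_torque, min(max_torque, torque))

def simulate(target, steps=400, dt=0.01, inertia=1.0):
    """Semi-implicit Euler on a single hinge driven by pd_torque;
    returns the final angle (should settle near the target)."""
    angle, vel = 0.0, 0.0
    for _ in range(steps):
        vel += pd_torque(target, angle, vel) / inertia * dt
        angle += vel * dt
    return angle
```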
## Abstractions needed
- World
- Find a place for newly created creatures
- Populate the terrain (kinematic collision surfaces)
- Manage populations of plants, animals and the creatures
- **Handle collisions** for eating / mating / fighting interactions, e.g., add energy to creature that "eats" a plant
- Create new plants/animals as background resources
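The eating/fighting energy transfer could live in one collision dispatcher. A hedged sketch with made-up rules, entities as plain dicts for brevity:

```python
def handle_contact(a, b, fight_fraction=0.25):
    """Dispatch a collision between two entities (dicts with 'kind' and
    'energy'). Eating moves the plant's energy into the creature; a
    fight transfers a fraction from the weaker to the stronger.
    Illustrative rules, not a committed design."""
    kinds = {a["kind"], b["kind"]}
    if kinds == {"creature", "plant"}:
        creature, plant = (a, b) if a["kind"] == "creature" else (b, a)
        creature["energy"] += plant["energy"]     # creature eats the plant
        plant["energy"] = 0.0                     # plant is consumed
    elif kinds == {"creature"}:
        weak, strong = sorted((a, b), key=lambda e: e["energy"])
        taken = fight_fraction * weak["energy"]   # fight: strong drains weak
        weak["energy"] -= taken
        strong["energy"] += taken
```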
- Creature - run-time instance
- Interacts with World to get sensory input
- Activates the neural net to decide on actions
- Activates muscles to interact with World
- Accepts World notifications that it has been injured or changed by the environment / other creatures
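The Creature responsibilities above reduce to a per-tick sense → think → act loop. A minimal Python sketch where `world.sense`, the `brain` callable, and `world.apply_muscle` are assumed stand-ins for RealityKit queries, the neural net, and torque application:

```python
class Creature:
    """Run-time creature: holds a brain (callable) and an energy store,
    and runs one sense -> think -> act cycle per physics tick."""
    def __init__(self, brain, energy=10.0):
        self.brain = brain
        self.energy = energy

    def step(self, world):
        inputs = world.sense(self)        # contact / photo sensor readings
        actions = self.brain(inputs)      # neural net picks muscle outputs
        for joint, torque in enumerate(actions):
            world.apply_muscle(self, joint, torque)

    def notify(self, event, amount):
        """Callback for World notifications (injury, energy changes)."""
        if event == "injured":
            self.energy -= amount
```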
- Creator
- Creates a Creature from a Genotype
- Assembles the physical body and joints
- Connects the neural net for control
- Genotype
- Holds the definition of a particular creature's morphology and controller
- Mutates self based on a mutation rate
- Combines with another genotype to create a third
- Compares similarity between self and other genotypes
- Directed Graph of Nodes and Connections
- Node
- Describes a rigid part
- dimensions - size of part - vec3
- recursive limit
- joint type (rigid, revolute, twist, universal, bend-twist, twist-bend, spherical)
- joint limits (two floats per DoF)
- array of Effectors
- array of Sensors (contact, photo)
- array of hidden neurons
- internal connections between:
- constant values (out)
- periodic generators (out)
- sensors (out)
- neurons (k in, j out, depending on type)
- effectors (in)
- in connection slots to parent, child, and brain
- out connection slots to parent, child and brain
- Attachment
- placement of child part relative to parent
- position (of child on parent)
- orientation (of child)
- scale factor of child size
- reflection of placement (negative scaling)
- terminal-only flag (connection used only when the recursive limit of the parent is reached)
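The Genotype operations (mutate, combine, compare) can be prototyped over a flat node list before the full directed graph exists. A Python sketch under simplifying assumptions: effectors/sensors/neurons are elided, and crossover/similarity align nodes by index:

```python
import random
from dataclasses import dataclass, field, replace

@dataclass
class Node:
    """One rigid part in the genotype graph (subset of the fields above)."""
    dims: tuple            # (x, y, z) part size
    joint_type: str        # e.g. "revolute", "spherical"
    recursive_limit: int = 1
    children: list = field(default_factory=list)  # indices of child nodes

@dataclass
class Genotype:
    nodes: list
    JOINTS = ("rigid", "revolute", "twist", "universal",
              "bend-twist", "twist-bend", "spherical")

    def mutate(self, rate=0.1, rng=None):
        """Return a mutated copy: jitter each node's dims and re-roll its
        joint type with probability `rate`."""
        rng = rng or random.Random()
        new = []
        for n in self.nodes:
            dims = tuple(d * (1 + rng.gauss(0, rate)) for d in n.dims)
            jt = rng.choice(self.JOINTS) if rng.random() < rate else n.joint_type
            new.append(replace(n, dims=dims, joint_type=jt))
        return Genotype(new)

    def crossover(self, other, rng=None):
        """Combine two genotypes into a third, picking each node position
        from one parent at random (index alignment assumed)."""
        rng = rng or random.Random()
        return Genotype([rng.choice(pair) for pair in zip(self.nodes, other.nodes)])

    def similarity(self, other):
        """Crude similarity: fraction of index-aligned nodes that share
        a joint type."""
        pairs = list(zip(self.nodes, other.nodes))
        if not pairs:
            return 1.0
        return sum(a.joint_type == b.joint_type for a, b in pairs) / len(pairs)
```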