Active Inference Ontology ~ Correct & Incorrect Examples
Each term below is listed with correct and incorrect usage examples.

Core
Accuracy
Correct: A lower-resolution camera or fMRI has lower @Accuracy and thus lesser capacity to map fine-scale features of stimuli.
Incorrect: The true distribution was 10 +/- 1 and the person guessed 10 +/- 5, thus they had higher @Accuracy. I was sure the green car was really blue; this inaccuracy was caused by the dim light.
Action
Correct: Ants are continually involved in @Action in real life (@Realism) and/or in a @Model (@Instrumentalism).
Incorrect: I love this band’s first song, can’t wait till the @Action really begins!
Action Planning
Correct: The robot assessed its current and target location, then engaged in @Action Planning to decide under time pressure how to navigate.
Incorrect: The @Agent used @Action Planning algorithms for @Variational Free Energy @Inference on @Sensory input.
Active Inference
Correct: It is fun and rewarding to learn and apply @Active Inference as a framework for @Perception, @Cognition, and @Action; it is a @Process Theory compatible with the @Free Energy Principle.
Incorrect: @Active Inference is a style of reinforcement learning @Artificial @intelligence that uses @frequentist statistics only.
Active States
Correct: The @Active States of the computer program are the statistical outputs that it presents, while the @Sensory input @Sense States are the incoming statistical dependencies.
Incorrect: Canada and Kenya have many people involved in dance; really, you could say these two are @Active States.
Affordance
Correct: “In this T-maze model, there are 3 @Affordances for movement at the junction (Left, Right, Down).”
Incorrect: “He didn’t have the @Affordance to do that, but he did it anyway.”
Agency
Correct: An @Agent uses their @Agency to sculpt their @environment, which makes it more predictable, thereby minimizing the agent’s @Variational Free Energy and maintaining its @Markov Blanket in the face of random fluctuations.
Incorrect: I got a new picture taken at the passport @Agency.
Agent
Correct: “The ant nestmate is the @Agent in the Active InferAnts model.”
Incorrect: “Calcium carbonate is the anti-caking @Agent in this cookie.”
Ambiguity
Correct: The noisy readings from the thermometer resulted in high @Ambiguity given the @Sensory input to the @Agent.
Incorrect: I am curious about what is inside the box; there is @Ambiguity about what it might be.
Attention
Correct: The @Model instantly updated to the new @Sensory Data because it was paying maximal @Attention to the @Stimulus.
Incorrect: The words were impactful even though the child was not paying @Attention at all.
Bayesian Inference
Correct: Many of the key ideas of @Bayesian Inference existed before Rev. Bayes, and in some cases reflect recent contributions from computational research.
Incorrect: The frequentist t-test is a classical example of @Bayesian Inference.
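The update behind Bayesian inference can be made concrete with a minimal sketch. The coin hypotheses and all numbers below are invented for illustration, not taken from the ontology:

```python
# Bayesian inference: update a prior belief over hypotheses into a
# posterior after seeing data, via Bayes' theorem.
prior = {"fair": 0.5, "biased": 0.5}       # P(hypothesis)
likelihood = {"fair": 0.5, "biased": 0.9}  # P(heads | hypothesis)

unnormalized = {h: prior[h] * likelihood[h] for h in prior}
evidence = sum(unnormalized.values())       # P(heads), the model evidence
posterior = {h: unnormalized[h] / evidence for h in unnormalized}
# Seeing heads shifts belief toward the "biased" hypothesis.
```

The frequentist t-test in the incorrect example has no step like this: it never forms a prior or a posterior over hypotheses.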
Behavior
Correct: I expect you to be on your best @Behavior!
Incorrect: The @Behavior of this hammer is being determined entirely by my @Internal States without being intermediated by @Blanket States.
Belief
Correct: The @Prior or @Empirical prior in @Bayesian Inference can be a @Belief on @Hidden States.
Incorrect: A @Belief in @Bayesian Inference requires @Subjective feeling states or conscious awareness.
Belief updating
Correct: The incoming @Sensory Data resulted in @Belief updating.
Incorrect: I promise that if you vote for me, even if I change my mind due to new @Information coming in, I will never undergo @Belief updating, so you know exactly what you get with me!
Blanket State
Correct: I don’t care whether it is an @Action or @Sense State, as long as it is a @Blanket State!
Incorrect: My child created a fortress from the bedding; it belongs to the @Blanket State.
Cognition
Correct: BacillisAxelis345 engaged in @Cognition to decide whether to eat or to escape.
Incorrect: Because the @Agent is exhibiting intelligent @Behavior, this means …
Complexity
Correct: @Model @Complexity can refer to the number of predictors, independent variables, or features that a model needs to take into account in order to make accurate predictions.
Incorrect: @Complexity is the opposite of concavity.
Data
Correct: These three readings from the thermometer constitute @Data!
Incorrect: An anecdote is not a piece of @Data.
Ensemble
Correct: The ant colony, from a @Behavioral or @Collective behavior perspective, is an @Ensemble of nestmates.
Incorrect: After the band played their final song, the crowd called out “@Ensemble!”, prompting them to play another song.
Epistemic value
Correct: When the environment contains @Uncertainty, I would like to undertake the action of @Foraging for its own sake, to see if I can gather more relevant @Observations to explain what is going on around me. Such relevant @Observations have @Epistemic value for me as an @Agent.
Incorrect: In the case of temperature homeostasis, taking @Actions that move the body temperature in an adaptive direction is of purely @Epistemic value.
Evidence
Correct: Every photon is like a piece of @Evidence on the retina.
Incorrect: Just because you have litigated the case with @Data doesn’t mean that you provided any @Evidence.
Expectation
Correct: At timestep 1, the @Agent made a prediction about @Expected Free Energy through time; this was an @Expectation about the future.
Incorrect: Although the scientist made a @Prediction about the future using a @Model, it was not an @Expectation since they were not waiting for it to be realized.
Expected Free Energy
Correct: The deep affective @Inference @Agent used @Expected Free Energy calculations as a basis for @Policy selection.
Incorrect: The amount of calories that the @Living system has at @Non-Equilibrium Steady State is the @Expected Free Energy.
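One common way @Expected Free Energy is scored for a candidate policy is as risk plus ambiguity. A hedged sketch, with invented outcome distributions and a stipulated ambiguity term (none of these numbers come from the ontology):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

predicted_outcomes = [0.7, 0.3]  # Q(o | policy): outcomes expected under the policy
preferred_outcomes = [0.9, 0.1]  # P(o): the agent's Preferences over outcomes

# Risk: divergence between predicted and preferred outcomes.
risk = kl_divergence(predicted_outcomes, preferred_outcomes)
# Ambiguity: expected entropy of the likelihood (stipulated here for brevity).
ambiguity = 0.2
expected_free_energy = risk + ambiguity
```

This also shows why the incorrect example fails: @Expected Free Energy scores policies over future outcomes; it is not a stock of calories.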
External State
Correct: In my @Generative Model of temperature, the true temperature in the room is modeled as an @External State (@Hidden State), so we will never know it.
Incorrect: That country is outside of our borders; it is an @External State.
Free Energy
Correct: @Free Energy can refer to various formulations and decompositions that share some important key features.
Incorrect: You can get the electricity at no cost at all; it is @Free Energy.
Free Energy Principle
Correct: As a principle, the @Free Energy Principle cannot be falsified.
Incorrect: There is not enough empirical evidence to support the @Free Energy Principle.
Friston Blanket
Correct: A @Friston Blanket is a type of @Markov Blanket where @Sense States are associated with incoming @Sensory Data and @Action states are associated with outgoing @Behavior of the @Agent.
Incorrect: Karl walked into the room and cast a chilly @Friston Blanket over the audience.
Generalized Free Energy
Correct: As written in the literature: “Crucially, this means the @Generalized Free Energy reduces to the @Variational Free Energy for outcomes that had been observed in the past. … Outcomes in the @Generalized Free Energy formulation are represented explicitly as @Beliefs. This means that the @Prior over @Sensory outcomes is incorporated explicitly in the generative model.”
Incorrect: The speaker was talking about @Variational Free Energy in @Generalized coordinates in a very broad and vague fashion; this is identical to @Generalized Free Energy.
Generative Model
Correct: @Partially Observed Markov Decision Processes are commonly used for computational modeling of @Generative Models.
Incorrect: The subject of the photography session made art as well; they are a @Generative Model. But if they hadn’t made the art, they would have been only a @Model, not a @Generative Model.
Generative Process
Correct: The @Generative Process generates @Sensory Data.
Incorrect: The @Agent uses its @Generative Process to determine future @Hidden States of the @environment.
Hierarchical Model
Correct: A @Nested @Model is a @Hierarchical Model, for example the @Hierarchically Mechanistic Mind.
Incorrect: The @Hierarchical Model stated that water boils at a higher temperature at higher altitudes.
Inference
Correct: The researcher made a @Model of @Active Vision where the @Agent was doing @Inference on @Action (@Action Planning, @Action Prediction) as well as @Perception (@perceptual inference).
Incorrect: The ball rolled downhill using @Inference on @Policy selection, sometimes veering more to the left and other times more to the right.
Information
Correct: There is more maximum @Information in 1 terabyte than in 1 gigabyte.
Incorrect: I listened to that podcast; the file checksum was OK, but the @Information was modified relative to the reference version.
Internal State
Correct: When the @Generative Process describes a forest outside, and the @Blanket States describe the bark of the tree, the @Internal States describe the core of the tree.
Incorrect: The @State space of the internet is equivalent to the total number of computers connected to it at any given time.
Learning
Correct: The software agent engaged in @Belief updating on internal @parameters; this is technically @Learning.
Incorrect: Every day we change, but from a @Bayesian Inference perspective it is only @Learning if the @Belief updating is adaptive.
Living system
Correct: A body is a @Living system.
Incorrect: A @Living system is a mathematical model used to predict weather patterns.
Markov Blanket
Correct: @Markov Blankets are features of Maps (e.g. @Bayesian @Graphical @Models), not of Territories (e.g. ant colonies or brains).
Incorrect: In the textile industry, a @Markov Blanket is a blanket made using a specific knitting technique.
Markov Decision Process
Correct: Whether it is fully observable or not, one common type of model used in control theory is a @Markov Decision Process.
Incorrect: We will let Captain Markov make the call here, since it is a @Markov Decision Process.
Multi-scale system
Correct: The @Generative Model of counties within states within countries was a @Multi-scale system.
Incorrect: In culinary arts, a @Multi-scale system is a method for measuring ingredients in both metric and imperial units.
Niche
Correct: Every ant lives in their ecological @Niche.
Incorrect: That band’s kind of music honestly is just too @Niche for me.
Non-Equilibrium Steady State
Correct: Blood sugar levels are dynamic and fluctuating; however, they revisit characteristic states repeatedly, which can be described by a @Non-Equilibrium Steady State.
Incorrect: The ball just lay still on the floor; we can describe this as exhibiting a @Non-Equilibrium Steady State.
Observation
Correct: One @Observation can make all the difference.
Incorrect: @Observations are not required in order to do empirical parametrization of a statistical model.
Particle
Correct: @Internal States and @Blanket States together constitute the @Particle.
Incorrect: The house was made of @Particle board.
Perception
Correct: Visual @Perception gives us many demonstrations of the characteristics of our @Generative Model: for example saccades, the blind spot, and blink suppression.
Incorrect: @Active Inference models @Action and @Perception as literally the same thing for each @Agent.
Policy
Correct: The sequence of @Actions that an @Agent plans to take is a @Policy.
Incorrect: What insurance @Policy do you recommend for my house?
Policy selection
Correct: When the @Agent decided to go one way instead of the other, it was due to an internal process of @Policy selection, specifically @Action and Planning as Divergence Minimization.
Incorrect: We went shopping for insurance and the package was already determined for me; this is a case of @Stochastic @Policy selection.
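In many Active Inference implementations, @Policy selection is modeled as a softmax over negative @Expected Free Energy, so lower-EFE policies are chosen more often. A minimal sketch with invented EFE values (the policy names and numbers are assumptions for illustration):

```python
import math

efe = {"go_left": 2.0, "go_right": 1.0}  # expected free energy per policy (invented)
temperature = 1.0                         # governs how deterministic selection is

# Softmax over negative EFE: exponentiate, then normalize to probabilities.
weights = {p: math.exp(-g / temperature) for p, g in efe.items()}
total = sum(weights.values())
policy_probs = {p: w / total for p, w in weights.items()}
# "go_right" has lower expected free energy, so it is selected more often.
```

Note the selection is @Stochastic in a precise sense here: policies are sampled from a distribution, unlike the pre-determined insurance package in the incorrect example.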
Posterior
Correct: The @Posterior distribution reflects our degrees of @Belief about @Latent causes after we see @Sensory Data.
Incorrect: The @Posterior distribution can always be trivially obtained by solving Bayes’ theorem.
Pragmatic Value
Correct: The @Agent had @Preferences for there to be more beans in the jar, so adding more beans was of @Pragmatic Value for them.
Incorrect: If a @Policy reduces the divergence between your @Preferences and @Observations, it is defined as having negative @Pragmatic Value.
Prediction
Correct: After successfully @Learning the structure of the @Generative Process, the @Agent can make a @Prediction about the future state of this @Generative Process and the associated @Sensory Data it will generate.
Incorrect: A magic 8-ball can make a @Prediction that will reliably be true.
Preference
Correct: The @Generative Model of the bacteria underwent @parameter fitting (@Belief updating / @Learning) on @Action, guided by a @Preference for medium but not high/low sugar concentration.
Incorrect: All @Action results in @Agents that realize their @Preferences in terms of @Sensory outcomes.
Prior
Correct: @Active Vision uses @Priors on @Sensory input.
Incorrect: We arrested someone with no @Priors.
Recognition Model
Correct: After constructing a @Generative Model, an @Agent can invert this model to obtain the @Recognition Model, which allows for the prediction of the @Hidden States (causes) that generated some new @Sensory Data.
Incorrect: That billboard has a picture of one of the most @Recognition Models in the world.
Representation
Correct: The stereotypical neural pattern induced by a @Stimulus is considered a @Representation, at least by those who subscribe to @Representationalism.
Incorrect: No Taxation without @Representation!
Salience
Correct: @Salience is related to how relevant a given @Stimulus appears to be.
Incorrect: I can smell dinner cooking, and already my mouth is @Salience.
Sense State
Correct: In this model, the @Observations coming in from the thermometers are considered @Sense States.
Incorrect: Most models don’t show it, but actually all @Active Inference @Generative Models include a “sixth @Sense State”.
State
Correct: A @Blanket State is a type of @State that partitions @Internal States from @External States.
Incorrect: California is the @State with the best honey on the West Coast.
State space
Correct: The @State space of a @Generative Model can be a @Continuous state space or a @Discrete state space.
Incorrect: In politics, the @State space refers to the total area governed by a particular state.
Stationarity
Correct: A common assumption of many time series algorithms is that the data exhibit @Stationarity.
Incorrect: I am exhibiting @Stationarity when I stop walking.
Surprise
Correct: Surprisal cannot be minimized directly because it involves calculating the @Evidence term in Bayes’ Rule, which generally involves an intractable integral over all possible states an organism can be in.
Incorrect: When they walk into the room, yell “@Surprise!”
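The intractability claim can be seen even in a toy discrete model: surprisal requires marginalizing the evidence over every hidden state, which is only feasible below because there are two states. All numbers are invented for illustration:

```python
import math

prior = [0.5, 0.5]          # P(s) over two hidden states
likelihood = [[0.8, 0.2],   # P(o | s): rows are states, columns are outcomes
              [0.1, 0.9]]

observed = 0  # index of the outcome that occurred
# Model evidence P(o) sums over all hidden states -- the step that becomes
# intractable when the state space is large or continuous.
evidence = sum(prior[s] * likelihood[s][observed] for s in range(len(prior)))
surprisal = -math.log(evidence)
```

With a realistic organism-scale state space, this sum becomes an integral no one can compute directly, which is why bounds like @Variational Free Energy are minimized instead.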
System
Correct: George Mobus argues that @Systems have some fundamental properties, such as structure and function.
Incorrect: A @System is the scientific term for a large group of wolves.
Temporal Depth
Correct: In deep temporal models, @Temporal Depth occurs because the number of @Counterfactual possibilities one must account for increases as more future states are modeled.
Incorrect: The longer the queue, the greater the @Temporal Depth.
Uncertainty
Correct: A measure of unpredictability or expected @Surprise (cf. @Entropy). The @Uncertainty about a @Random variable is often quantified with its @Variance (inverse @Precision).
Incorrect: I felt a lot of @Uncertainty after that job interview.
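The variance/precision relationship in the definition can be sketched numerically. The sample values below are invented for illustration:

```python
# Precision is the inverse of variance: tighter samples mean higher
# precision, i.e. lower Uncertainty about the Random variable.
samples = [9.8, 10.1, 10.0, 9.9, 10.2]
mean = sum(samples) / len(samples)
variance = sum((x - mean) ** 2 for x in samples) / len(samples)
precision = 1.0 / variance
```

Spreading the same samples further from the mean would raise the variance and lower the precision, raising the @Uncertainty in the technical sense used here.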
Variational Free Energy
Correct: @Variational Free Energy is a tractable way to compute an upper bound on the @Surprisal of a @Generative Model given @Data.
Incorrect: Our electric bill fluctuates so much each month, though on average it is zero; this is known as @Variational Free Energy.
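The upper-bound claim can be checked directly in a two-state toy model. A hedged sketch with invented distributions: variational free energy, written as complexity minus accuracy, is never below the surprisal computed from the exact evidence.

```python
import math

prior = [0.5, 0.5]       # P(s)
likelihood = [0.8, 0.1]  # P(o | s) for the outcome actually observed
q = [0.9, 0.1]           # approximate posterior Q(s)

# F = KL[Q(s) || P(s)] - E_Q[log P(o | s)]  (complexity minus accuracy)
complexity = sum(qs * math.log(qs / ps) for qs, ps in zip(q, prior) if qs > 0)
accuracy = sum(qs * math.log(l) for qs, l in zip(q, likelihood))
free_energy = complexity - accuracy

# Exact surprisal, tractable here only because there are two states.
surprisal = -math.log(sum(p * l for p, l in zip(prior, likelihood)))
# free_energy >= surprisal, with equality when Q equals the true posterior.
```

Minimizing F over Q tightens the bound; this is what makes F a tractable stand-in for the intractable @Surprisal.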
Outcome
Correct: The @Generative Process produces outcomes when we sample from it.
Incorrect: I was worried about the @Outcome of my decision.
Habit
Supplement
Abstract Action
Abstract action prediction
Abstract Bayesian Inference
Abstract epistemic value
Abstract External State
Abstract Generative Model
Abstract Hidden State
Abstract Internal State
Abstract Sensory State
Abstract System
Abstract Accuracy
abstractCounterpart
Action and Planning as Divergence Minimization
Action at a distance
Action Integral
Action Prediction
Correct: The @Generative Model over the next few timesteps with respect to @Active States is the @Action Prediction.
Incorrect: The @Agent inferred what @Affordances it had; this process is known as @Action Planning or @Action Prediction.
Active Vision
Agency based model
Agency free model
Algorithm
Alignment
analogy
Appraisal theories of emotion
Attenuation of response
Augmented reality
Autopoiesis
Correct: In the right niche, cells can be considered to exhibit @Autopoiesis at the @System level.
Incorrect: The pile of sand quickly dissipated in the wind; however, I still think it is my favorite example of long-range @Autopoiesis.
Bayes-optimal control
Bayesian
Bayesian Brain
Bayesian surprise
Bethe approximation
Bottom-up attentional control
chaos
Circular causality
Cognitive Science
Cognitive System
Cognitivism
Collective behavior
Conceptual metaphor
Conditional Probability
Confidence
Congruence
Connectionism
Control (states)
Control theory
Counterfactual
Cue
Culture
Cybernetics
Decision-making
Density
Deontic Action
Development
Dissipation
Distribution
divergence
Kullback-Leibler Divergence
Domain
Domain-generality
Domain-specificity
Dynamic Causal Modelling
Dynamic expectation maximization
Dynamicism
Ecology
EcoEvoDevo
Embedded Embodied Encultured Enactive Inference
Embodied Cybernetic Complexity
Embodied Belief
Emotion
Empirical prior
Enactivism
Entropy
Ergodicity
Estimator
Event-related potential
Evolution
Expectation maximization
Expected Utility Theory
Experience of body ownership
Explaining Away
Explanation
Extended Cognition
Factor graph
Falsification
Far-from-equilibrium
Fokker-Planck Equation
Foraging
Forney
Friction
Friston's Law
functional magnetic resonance imaging (fMRI)
Gaussian distribution
Generalized coordinates
Generalized Synchrony
Generative density
Generative modelling
Gestalt
Goal-driven selection
Gradient Descent
Graphical
Group Renormalization Theory
Guidance signal
Habit learning/formation
Hamilton's Principle of Least Action
Helmholtz machine
Hidden State
Hierarchically Mechanistic Mind
Homeostasis
Homeostatic system
Hyperprior
Hypothesis
Information bottleneck
Information Geometry
Instrumentalism
Interface
Interoception
Interoceptive sensitivity
Interpretation
Inverse problem
Latent cause
Lateral geniculate nucleus
Likelihood
Marginal approximation
Markovian Monism
Marr's Levels of Description
Material science
Mean
Mean field approximation
Memory
Message Passing
Mismatch negativity
Mode
Model
Model accuracy
Model Inversion
Morphogenesis
Multisensory integration
Narrative
I wrote a @Narrative about my time in graduate school.
Network
Neuronal Ensemble
Niche construction
Noisy signal
Non-linear dynamical systems
Novelty
Candy and ice cream are kinds of @Novelty foods.
Optimal control
overfitting
Partition
Policy posterior
Policy prior
Precision
Prediction error
Prediction error minimization
Predictive Coding
Predictive Processing
Principle
Probability distribution
Process Theory
Friston: “The distinction is between a @State [theory] and @Process Theory; i.e., the difference between a normative principle that things may or may not conform to, and a @Process Theory or hypothesis about how that principle is realized.”
Random variable
Realism
Receptive field
Recognition density
Regime of Attention
Representationalism
Reservoir Computing
Reward
Risk
Sample space
Selection bias
Selection history
Self-organization
Selfhood
Semi-Markovian
Sense of agency
Sensorimotor
Sensory attenuation
Sensory Data
Sensory input
Sensory outcome
Sentience
Shared Generative Model
Signal
Simulation
solenoidal
Sophisticated Inference
spike-timing dependent plasticity
Statistical manifold
Stigmergy
Stochastic
Subjective feeling states
Surprisal
Swarm
Symbol
Synergetics
Teams
Theory
Thermodynamic system
Thermostatistics
Thing
Thinking Through Other Minds
Top-down attentional control
Umwelt
Unidirectionality
Update
Variance
Variational
Variational Niche Construction
Von Economo neurons
Weak mixing
Working memory
World States
Active learning
Attracting set
Bayesian belief updating
Bayesian mechanics
Coarse graining
Continuous state space
Discrete state space
Epistemic foraging
Hermeneutics
Least action
link
Partially Observed Markov Decision Process
Renormalization
Variational message passing
Variational principle
Lagrangian
Path integral
Sensory observation
changing mind (cognition)
changing world (action)
High road
Correct: The @High road to the @Free Energy Principle starts by talking about random @Non-linear dynamical systems in general, without a specific focus on biological organisms with brains.
Incorrect: You take the @High road and I’ll take the low road.
Low road
Correct: The @Low road to the @Free Energy Principle starts by looking at how biological organisms perceive their @environment and take actions within it to develop a notion of how they can successfully predict the next state they will be in (@Perception as @Hypothesis testing).
Incorrect: Be careful, the @Low road can be dangerous.
normative
active processes
changing the mind
changing the world
model evidence
predictive machine
Bayesian Model Selection
Blanket index
categorical
constraint
fitness
flow
Gauge theory
Helmholtz Free Energy
maximum caliber
path
phenotype
population
Quantum
Quantum-like
Qubit
Sensory State
Sensory states
sufficient statistic
time
Active Blockference
Analytical Philosophy
Category Theory
Continental Philosophy
DeSci
Exteroception
Filter
Helmholtz Decomposition
Proprioception
Quantum mechanics
Statistical Parametric Mapping
T-Maze
Effective
Effectivity
Nested
Path of Least Action
Precariousness
affect
Deflationary
Inflationary
Co-category
Functor
Lens
Monad
anticipation
matching
tracking
category
colimit
morphism
Allostasis
Artificial
frequentist
intelligence
Natural
sparsity
tensor network
Bayes Theorem
Credibility
degrees of freedom
interaction
tensor
[7] is a single-dimensional @tensor in 1 dimension.
unit of adaptive behavior
Gain
Gain is related to @Precision and @Variance.
Entailed
