# Formulas
Deferred computation for easier model building.
The key insight here is that we build up a stored computation graph composed of Layers and Arrays, and then we pass data to the stored computation graph, which knows how to resolve itself.
This means we have to define the set of operations we want to support for Layers and Arrays in advance.
Abstractly, at the end of the day we want an Array, so everything needs to be resolvable to arrays.
You can think of this in two parts: instance creation builds the deferred computation graph, and the call method accepts actual data and resolves the deferred computation to a real JAX Array.
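A minimal, self-contained toy makes the two phases concrete (illustrative only, not the blayers implementation; `Ref` here stands in for the `f.x1`-style references):

```python
# Toy sketch of the two-phase pattern: building an expression creates a
# deferred graph; calling it with data resolves the graph to a value.
class Deferred:
    def __add__(self, other):
        return BinaryOp(self, other, lambda a, b: a + b)


class Ref(Deferred):
    """Deferred reference to a named entry in the data dict."""
    def __init__(self, name):
        self.name = name

    def __call__(self, data):
        return data[self.name]


class BinaryOp(Deferred):
    """Stores the op and its operands now; applies the op only when called."""
    def __init__(self, left, right, op):
        self.left, self.right, self.op = left, right, op

    def __call__(self, data):
        return self.op(self.left(data), self.right(data))


expr = Ref("x1") + Ref("x2")           # phase 1: build the deferred graph
print(expr({"x1": 2.0, "x2": 3.0}))    # phase 2: resolve with real data -> 5.0
```

The same object serves as both graph builder (via operator overloads) and resolver (via `__call__`), which is the pattern the classes below follow.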
Let's focus on a specific example:

`a(f.x1 + f.x2) * a(f.x1 | f.x2)`

Here `a` wraps its argument in an AdaptiveLayer, `+` builds a Sum, `|` builds a Concat, and `*` builds a Prod, so this expression builds the following deferred tree:
```
Prod(
    AdaptiveLayer(
        Sum(f.x1, f.x2)
    ),
    AdaptiveLayer(
        Concat(f.x1, f.x2)
    )
)
```
Calling the graph with data (`deferred.__call__`) then turns each deferred node into a concrete value: deferred → now.
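Resolution proceeds bottom-up: each node first resolves its children against the data, then applies its own operation. A closure-based sketch of the tree above, with the layer stubbed out (purely illustrative; `adaptive` is a stand-in for AdaptiveLayer, not the real class):

```python
# Each combinator returns a "deferred" closure; calling it with a data
# dict resolves children first, then applies the node's own operation.
def ref(name):
    return lambda data: data[name]

def sum_(left, right):
    return lambda data: left(data) + right(data)

def concat(left, right):
    return lambda data: [left(data), right(data)]

def adaptive(inner):
    # Stub: a real AdaptiveLayer would apply learned parameters here.
    # We just collapse a concatenated input to a scalar by summing.
    def call(data):
        val = inner(data)
        return sum(val) if isinstance(val, list) else val
    return call

def prod(left, right):
    return lambda data: left(data) * right(data)

# Mirrors Prod(AdaptiveLayer(Sum(...)), AdaptiveLayer(Concat(...)))
graph = prod(adaptive(sum_(ref("x1"), ref("x2"))),
             adaptive(concat(ref("x1"), ref("x2"))))
print(graph({"x1": 1.0, "x2": 2.0}))  # -> 9.0  (3.0 * 3.0)
```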
- class blayers.experimental.syntax.DeferredBinaryOp(left_deferred, right_deferred, op, symbol)
  Bases: Deferred
  Defers and then calls op(left_now, right_now)
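A plausible shape for `DeferredBinaryOp`, sketched from its signature and docstring alone (not the library source):

```python
import operator


class Deferred:
    """Base: nodes are built now and resolved later, when called with data."""


class DeferredBinaryOp(Deferred):
    """Defers op; calling resolves both sides, then applies op(left_now, right_now)."""
    def __init__(self, left_deferred, right_deferred, op, symbol):
        self.left_deferred = left_deferred
        self.right_deferred = right_deferred
        self.op = op          # e.g. operator.add
        self.symbol = symbol  # e.g. "+", useful for display/debugging

    def __call__(self, data):
        left_now = self.left_deferred(data)
        right_now = self.right_deferred(data)
        return self.op(left_now, right_now)


node = DeferredBinaryOp(lambda d: d["x1"], lambda d: d["x2"],
                        operator.add, "+")
print(node({"x1": 4, "x2": 6}))  # -> 10
```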
- class blayers.experimental.syntax.Sum(left, right)
  Bases: DeferredBinaryOp
- class blayers.experimental.syntax.Prod(left, right)
  Bases: DeferredBinaryOp
- class blayers.experimental.syntax.Concat(left, right)
  Bases: DeferredBinaryOp
- class blayers.experimental.syntax.DeferredManyOp(op, symbol, *args)
  Bases: Deferred
  Defers and then calls op on all of the resolved args
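By analogy with the binary case, a variadic sketch (illustrative; the resolution logic and the name `args_now` are assumptions, not the library source):

```python
class DeferredManyOp:
    """Defers op over any number of deferred args: op(*args_now)."""
    def __init__(self, op, symbol, *args):
        self.op = op
        self.symbol = symbol
        self.args = args

    def __call__(self, data):
        # Resolve every deferred child, then apply op to all of them at once.
        args_now = [arg(data) for arg in self.args]
        return self.op(*args_now)


node = DeferredManyOp(lambda *xs: list(xs), "|",
                      lambda d: d["x1"], lambda d: d["x2"], lambda d: d["x3"])
print(node({"x1": 1, "x2": 2, "x3": 3}))  # -> [1, 2, 3]
```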
- class blayers.experimental.syntax.ConcatMany(*args)
  Bases: DeferredManyOp
- class blayers.experimental.syntax.DeferredLayer(layer_instance, *args, **kwargs)
  Bases: object
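From its signature, `DeferredLayer` appears to pair a layer instance with deferred inputs; a hedged sketch (the resolution logic here is an assumption, not taken from the source):

```python
class DeferredLayer:
    """Wraps a layer instance; calling resolves inputs, then applies the layer."""
    def __init__(self, layer_instance, *args, **kwargs):
        self.layer_instance = layer_instance
        self.args = args
        self.kwargs = kwargs

    def __call__(self, data):
        # Resolve every deferred input before invoking the real layer.
        args_now = [a(data) for a in self.args]
        return self.layer_instance(*args_now, **self.kwargs)


double = lambda x: 2 * x               # stand-in "layer" for illustration
node = DeferredLayer(double, lambda d: d["x1"])
print(node({"x1": 5}))                 # -> 10
```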
- blayers.experimental.syntax.bl(formula, data, num_steps=20000)
  Parameters:
    - formula (Formula)
    - data (dict[str, Array])
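An interface-level toy of `bl` (the body is invented for illustration; the real function presumably also runs `num_steps` optimization steps to fit the formula's layers to the data):

```python
def bl(formula, data, num_steps=20000):
    """Toy stand-in: resolve the deferred formula against the data.

    The real bl presumably also fits any layers inside the formula over
    num_steps optimization steps; that part is omitted here.
    """
    return formula(data)


# A deferred formula is anything callable on the data dict.
formula = lambda d: d["x1"] + d["x2"]
print(bl(formula, {"x1": 1.0, "x2": 2.0}))  # -> 3.0
```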
- blayers.experimental.syntax.cat
  alias of ConcatMany
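Given the alias, `cat` lets a formula concatenate any number of terms at once rather than chaining binary `|`. A self-contained sketch of that usage (local stand-in class, not the real `ConcatMany`):

```python
class ConcatMany:
    """Stand-in variadic concat: resolves each arg, then joins the results."""
    def __init__(self, *args):
        self.args = args

    def __call__(self, data):
        return [a(data) for a in self.args]


cat = ConcatMany  # the alias: `cat(...)` reads better inside formulas

node = cat(lambda d: d["x1"], lambda d: d["x2"], lambda d: d["x3"])
print(node({"x1": 1, "x2": 2, "x3": 3}))  # -> [1, 2, 3]
```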