Layers¤

Low-level model building blocks.

Note

  • Linear supports Random Weight Factorization (RWF) and optional complex parameters.

phydrax.nn.Linear ¤

Affine layer with optional activation.

Computes

\[ y=\phi(Wx+b), \]

where \(\phi\) is the activation (or the identity if none is given). If Random Weight Factorization (RWF) is enabled, the parameters are stored as an unscaled weight matrix \(V\) and per-output log-scales \(s\), and the layer applies

\[ y=\phi\!\left(\operatorname{diag}(e^s)\,Vx + b\right). \]

If enforce_positive_weights=True, the weights are constrained to be positive via \(W=\operatorname{softplus}(W_\text{raw})\) before use.
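The RWF forward pass above can be sketched in plain NumPy to make the parameterization concrete. This is an illustrative re-derivation of the documented formula, not phydrax's actual implementation; the function name `rwf_linear` and the choice to apply the positivity constraint to the already-scaled weights are assumptions for the sketch.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + e^x)
    return np.log1p(np.exp(x))

def rwf_linear(x, V, s, b, activation=None, enforce_positive_weights=False):
    """Sketch of y = phi(diag(e^s) V x + b), per the RWF formula above."""
    W = np.exp(s)[:, None] * V  # effective weight matrix: diag(e^s) V
    if enforce_positive_weights:
        # Assumption: softplus is applied to the effective weights.
        W = softplus(W)
    y = W @ x + b
    return activation(y) if activation is not None else y

# Example: e^s = [1, 2] rescales the rows of V.
V = np.array([[1.0, 2.0],
              [3.0, 4.0]])
s = np.array([0.0, np.log(2.0)])
b = np.zeros(2)
x = np.array([1.0, 1.0])

y = rwf_linear(x, V, s, b)  # effective W = [[1, 2], [6, 8]]
```

Note that \(\operatorname{diag}(e^s)\,V\) is just a per-row rescaling of \(V\), so RWF adds only one extra scalar per output while leaving the expressible set of weight matrices unchanged.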

__init__(*, in_size: typing.Union[int, collections.abc.Sequence[int], typing.Literal['scalar']], out_size: typing.Union[int, collections.abc.Sequence[int], typing.Literal['scalar']], activation: collections.abc.Callable | None = None, initializer: str = 'glorot_normal', rwf: bool | tuple[float, float] = True, use_random_weight_factorization: bool | None = None, use_bias: bool = True, bias_init_lim: float = 1.0, enforce_positive_weights: bool = False, key: Key[Array, ''] = jr.key(0)) ¤
__call__(x: Array, /, *, key: Key[Array, ''] = jr.key(0)) -> Array ¤