Releases: google-deepmind/dm-haiku
Haiku 0.0.16
What's Changed
- Fix `inspect.stack()` error caused by the `no_flax` shim module. 58b4130.
- Various changes to remain compatible with the latest JAX release.
Full Changelog: v0.0.15...v0.0.16
Haiku 0.0.15
- Allow using Haiku without Flax in Python >=3.13 (#843).
- Various changes to remain compatible with the latest JAX release.
Full Changelog: v0.0.14...v0.0.15
Haiku 0.0.14
Haiku 0.0.13
Haiku 0.0.12
Haiku 0.0.11
- `hk.layer_stack` now allows transparent application (no prefix on module names).
- `hk.MultiHeadAttention` allows the bias initializer to be configured or biases to be removed.
- `hk.DepthwiseConvND` now supports `dilation`.
- `hk.dropout` supports `broadcast_dims`.
- `hk.BatchApply` avoids an unnecessary host-to-device copy during tracing.
- `hk.experimental.profiler_name_scopes` has been removed; profiler name scopes are now on by default.
- Added `hk.map`, mirroring `jax.lax.map`.
Haiku 0.0.10
- Added `hk.mixed_precision.push_policy`.
- Added `hk.experimental.{get_params,get_initial_state,get_current_state}`.
- Added `hk.experimental.{maybe_get_rng_sequence_state,maybe_replace_rng_sequence_state}`.
- `hk.switch` now supports multiple operands.
- `hk.get_parameter` now supports `init=None`.
- `hk.MethodContext` now includes `orig_class`.
- `hk.GetterContext` now includes `lifted_prefix_name`.
- `hk.layer_stack` now allows parameter reuse.
- Haiku is now compatible with `jax.enable_custom_prng`.
- `TruncatedNormal` now exports its lower and upper bounds.
- Haiku init/apply functions now return `dict` rather than `Mapping`.
- `hk.dropout` now supports `broadcast_dims`.
Haiku 0.0.9
What's Changed
- Support `vmap` where `in_axes` is a list rather than a tuple in 307cf7d
- Pass pmap axis specs optionally to `make_model_info` in d0ba451
- Remove use of `jax_experimental_name_stack` flag in dbc0b1f
- Add `param_axis` argument to `RMSNorm` to allow setting scale param shape in a4998a0
- Add documentation and error messages for `w_init` and `w_init_scale` to avoid confusion in #541
- Fix `hk.while_loop` carrying state when reserving variable sizes of rng keys. by @copybara-service in #551
- Add ensemble example to `hk.lift` documentation. by @copybara-service in #556
Full Changelog: v0.0.8...v0.0.9
Haiku 0.0.8
- Added `experimental.force_name`.
- Added ability to simulate a method name in `experimental.name_scope`.
- Added a config option for PRNG key block size.
- Added `unroll` parameter to `dynamic_unroll`.
- Removed use of deprecated `jax.tree_*` functions.
- Many improvements to our examples.
- Improved error messages in `vmap`.
- Support `jax_experimental_name_stack` in `jaxpr_info`.
- `transform_and_run` now supports a map on PRNG keys.
- `remat` now uses the new JAX remat implementation.
- Scale parameter is now optional in `RMSNorm`.
Haiku 0.0.7
- Bug fix: modules with leading zeros (e.g. `linear_007`) are now correctly handled. 7632aff
- Breaking change: `hk.vmap(..)` now requires `split_rng` to be passed.
- Breaking change: `hk.jit` was removed from the public API.
- Feature: we always add profiler name scopes to Haiku modules with the latest version of JAX.
- Added a tutorial on parameter sharing.
- Added `hk.ModuleProtocol` and `hk.SupportsCall`.
- Added `cross_replica_axis` to `VectorQuantiser`.
- Added `allow_reuse` argument to `hk.lift`.
- Added `fan_in_axes` to the `VarianceScaling` initialiser.
- Added `hk.custom_setter(..)` to intercept `hk.set_state(..)`.
- Added `hk.Deferred`.
- Added `hk.experimental.transparent_lift(..)` and `hk.experimental.transparent_lift_with_state(..)`.
- Added `hk.experimental.fast_eval_shape(..)`.
- Added `hk.experimental.current_name()`.
- Added `hk.experimental.DO_NOT_STORE`. 2a6c034
- Added config APIs.
- Added support for the new `jax.named_call` implementation.
- The `HAIKU_FLATMAPPING` env var is no longer used.
- `hk.dropout(..)` now supports dynamic `rate`.
- `hk.without_apply_rng(..)` now supports multi-transformed functions.