Release history

0.3.1 (November 16, 2020)

Changed

- Raise a validation error if ``hidden_to_memory`` or ``input_to_hidden`` are True when ``hidden_cell=None``. (#26)

Fixed

0.3.0 (November 6, 2020)

Changed

- Renamed module from ``lmu`` to ``keras_lmu`` (so it will now be imported via ``import keras_lmu``), renamed package from ``lmu`` to ``keras-lmu`` (so it will now be installed via ``pip install keras-lmu``), and changed any references to “NengoLMU” to “KerasLMU” (since this implementation is based on the Keras framework rather than Nengo). In the future the ``lmu`` namespace will be used as a meta-package to encapsulate LMU implementations in different frameworks. (#24)
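
For reference, the rename amounts to the following change in the install command and the import statement:

.. code-block:: python

   # Shell: pip install keras-lmu    (previously: pip install lmu)
   import keras_lmu                  # previously: import lmu
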
0.2.0 (November 2, 2020)

Added

- Added documentation for package description, installation, usage, API, examples, and project information. (#20)
- Added LMU FFT cell variant and auto-switching LMU class. (#21)
- LMUs can now be used with any Keras RNN cell (e.g. LSTMs or GRUs) through the ``hidden_cell`` parameter. This can take an RNN cell (like ``tf.keras.layers.SimpleRNNCell`` or ``tf.keras.layers.LSTMCell``), a feedforward layer (like ``tf.keras.layers.Dense``), or ``None`` (to create a memory-only LMU). The output of the LMU memory component will be fed to the ``hidden_cell`` (see the usage sketch after this list). (#22)
- Added ``hidden_to_memory``, ``memory_to_memory``, and ``input_to_hidden`` parameters to ``LMUCell``, which can be used to enable/disable connections between components of the LMU. They default to disabled. (#22)
- LMUs can now be used with multi-dimensional memory components. This is controlled through a new ``memory_d`` parameter of ``LMUCell``. (#22)
- Added ``dropout`` parameter to ``LMUCell`` (which applies dropout to the input) and ``recurrent_dropout`` parameter (which applies dropout to the ``memory_to_memory`` connection, if it is enabled). Note that dropout can be added in the hidden component through the ``hidden_cell`` object. (#22)
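
As a rough illustration of how these additions fit together, the sketch below constructs an ``LMUCell`` with the new parameters and wraps it in a standard Keras RNN layer. It uses the post-0.3.0 ``keras_lmu`` module name; the ``order`` and ``theta`` arguments (the LMU's polynomial order and window length, which predate this release) and the surrounding Keras boilerplate are illustrative assumptions rather than part of this changelog.

.. code-block:: python

   import tensorflow as tf
   import keras_lmu  # named ``lmu`` prior to the 0.3.0 rename above

   # Sketch of the parameters added in 0.2.0; ``order`` and ``theta`` are the
   # pre-existing LMU memory arguments and are assumed unchanged here.
   cell = keras_lmu.LMUCell(
       memory_d=4,                                # multi-dimensional memory component
       order=64,
       theta=128,
       hidden_cell=tf.keras.layers.LSTMCell(32),  # any RNN cell, a Dense layer, or None
       hidden_to_memory=False,                    # internal connections default to disabled
       memory_to_memory=False,
       input_to_hidden=False,
       dropout=0.1,                               # applied to the input
       recurrent_dropout=0.0,                     # applied to memory_to_memory, if enabled
   )

   # LMUCell behaves like any other Keras RNN cell, so it can be wrapped in
   # tf.keras.layers.RNN and used inside a normal Keras model.
   model = tf.keras.Sequential(
       [
           tf.keras.Input(shape=(None, 8)),       # (timesteps, input features)
           tf.keras.layers.RNN(cell),
           tf.keras.layers.Dense(10),
       ]
   )

Passing ``hidden_cell=None`` instead would give the memory-only LMU described above (and, as of 0.3.1, requires ``hidden_to_memory`` and ``input_to_hidden`` to remain False).
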
Changed

- Renamed ``lmu.lmu`` module to ``lmu.layers``. (#22)
- Combined the ``*_encoders_initializer`` parameters of ``LMUCell`` into a single ``kernel_initializer`` parameter. (#22)
- Combined the ``*_kernel_initializer`` parameters of ``LMUCell`` into a single ``recurrent_kernel_initializer`` parameter. (#22)
Removed

- Removed the ``Legendre``, ``InputScaled``, ``LMUCellODE``, and ``LMUCellGating`` classes. (#22)
- Removed the ``method``, ``realizer``, and ``factory`` arguments from ``LMUCell`` (they will take on the same default values as before, they just cannot be changed). (#22)
- Removed the ``trainable_*`` arguments from ``LMUCell``. This functionality is largely redundant with the new functionality added for enabling/disabling internal LMU connections. These were primarily used previously for e.g. setting a connection to zero and then disabling learning, which can now be done more efficiently by disabling the connection entirely. (#22)
- Removed the ``units`` and ``hidden_activation`` parameters of ``LMUCell`` (these are now specified directly in the ``hidden_cell``). (#22)
- Removed the dependency on ``nengolib``. (#22)
- Dropped support for Python 3.5, which reached its end of life in September 2020. (#22)

0.1.0 (June 22, 2020)

Initial release of KerasLMU 0.1.0! Supports Python 3.5+.
The API is considered unstable; parts are likely to change in the future.
Thanks to all of the contributors for making this possible!