Keraflow
Deep Learning for Python.
Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research.
As its official description states, Keras serves as an excellent front-end for graph-based deep learning frameworks. However, from a software engineering point of view, its API design and complicated internal mechanisms make it hard to understand and cumbersome to extend. Therefore, I reimplemented the core of Keras to provide:
1. Unified `initial_weights`, `regularizers`, and `constraints` arguments for layers' `__init__`.
2. No need to implement `get_config` for layers' serialization.

For more details about the differences between Keraflow and Keras, please refer to Differences from Keras.

For the full API reference, read the online documentation. It is strongly recommended to read the Tutorials first to learn the basics of building neural network models with Keraflow.
Keraflow uses the following dependencies:
When using the Theano backend:
When using the TensorFlow backend:
To install Keraflow, `cd` to the Keraflow folder and run the install command:
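The command itself is elided here; assuming a standard setuptools layout, it is typically:

```sh
python setup.py install
```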
You can also install Keraflow from PyPI:
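Presumably the usual pip invocation, with the package name assumed to be `keraflow`:

```sh
pip install keraflow
```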
To make Keraflow compatible with both Python 2 and Python 3, we use pyenv to build virtual environments. The shell script `dev_scripts/install_dependency.sh` quickly sets up the testing environments.
Note: the script does not add pyenv to the PATH in your shell config file (e.g. ~/.zshrc). You will need to manually copy and paste the following into your shell config file so that pyenv is on the PATH the next time you log in:
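The snippet is not reproduced in this copy; the standard pyenv initialization lines look like this (adjust for your shell):

```sh
export PATH="$HOME/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
```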
To quickly modify and run tests, run:
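The original command is elided; an editable install is the usual choice for this workflow (an assumption, not necessarily the project's exact command):

```sh
pip install -e .
```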
Then run the tests via `dev_scripts/run_test.sh`:
```sh
dev_scripts/run_test.sh
```
The script checks pep8 and runs the tests under both Python 2 and Python 3. You could also run the tests manually:
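For example, with pytest (the test-suite path is an assumption here):

```sh
python2 -m pytest tests/
python3 -m pytest tests/
```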
Since `dev_scripts/run_test.sh` checks pep8, if you fail the syntax check, you could use autopep8 to fix it:
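For instance (the target path is an assumption):

```sh
autopep8 --in-place --recursive keraflow/
```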
It is highly recommended to avoid these errors in the first place by using editor plugins. If you use vim (or neovim), I recommend installing flake8 and adopting the settings in this gist. Note that flake8 is required:
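It can be installed via pip:

```sh
pip install flake8
```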
For documentation, use `@` for special commands (`param`, `package`, etc.).

Differences from Keras

Two main things make Keras complicated:
1. Tensor linkage: Keras keeps track of it by maintaining `inbound_nodes` and `outbound_nodes` lists in each layer. However, maintaining a list to keep track of linkage is usually a brain-killing job. Keraflow instead adopts a simpler mechanism that keeps the linkage flat rather than nested (`a+b+c+d` is more amiable than `(a+(b+(c+d)))`).
2. Layers' `__init__` signatures.
Check the constructor of the Dense layer:
Keras
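For reference, the Keras 1.x-era `Dense` constructor looked roughly like this (reconstructed from memory; treat the argument list as approximate):

```python
def __init__(self, output_dim, init='glorot_uniform', activation='linear',
             weights=None, W_regularizer=None, b_regularizer=None,
             activity_regularizer=None, W_constraint=None, b_constraint=None,
             bias=True, input_dim=None, **kwargs):
    ...
```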
Keraflow
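Keraflow's counterpart drops those arguments; a sketch of the slimmed-down signature (assumed, not copied from the source):

```python
def __init__(self, output_dim, init='glorot_uniform', activation='linear',
             bias=True, **kwargs):
    ...
```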
The initial weights, regularizers, and constraints arguments disappear since Keraflow takes care of them in the Layer class. The input dimension argument also disappears since Keraflow forces users to specify an Input layer, along with its shape, for all models.
When creating a customized layer, users no longer need to write initialization code for the layer's regularizers and constraints. Special care for the input dimension is also unnecessary.
One additional merit of abstracting the `initial_weights`, `regularizers`, and `constraints` initialization process into the Layer class is that Keraflow easily (without adding too much code) enables users to initialize those of a layer with a dictionary:
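A sketch of what this could look like (the `keraflow.layers` import path, the 'W'/'b' parameter-name keys, and the string aliases for regularizers/constraints are all assumptions):

```python
import numpy as np
from keraflow.layers import Dense  # import path assumed

my_W = np.random.rand(100, 64)  # example kernel values
my_b = np.zeros(64)             # example bias values

dense = Dense(64,
              initial_weights={'W': my_W, 'b': my_b},  # per-parameter weights
              regularizers={'W': 'l2'},                # regularizer by name
              constraints={'W': 'maxnorm'})            # constraint by name
```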
`get_config` for serialization

Every layer in Keras has a `get_config` method, which is needed for serializing models. Though implementing it is not strictly necessary for customized layers, it would be good to save developers the time of implementing it just to serialize their models. Keraflow takes care of this: every layer that fulfills some simple constraints is naturally serializable.
Layer embedding

Currently in Keras, when writing your own layer, even if you want to perform an operation similar to that of the `Dense` layer, you still need to define trainable parameters for it (write the initialization code and add them to the layer's trainable parameter list). In Keraflow, you could simply write (in the `output` function, the counterpart of `get_output_for` in Keras):
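A minimal sketch (the embedding helper's name, `embed`, is an assumption; see the Layer Embedding documentation for the authoritative form):

```python
def output(self, x):
    # Delegate the computation to an embedded Dense layer; its weights
    # become trainable parameters of the host layer automatically.
    return self.embed(Dense(64))(x)
```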
Everything is done!! The parameters of the embedded `Dense` layer are automatically added as parameters of your layer and are updated during training. For more information, see Layer Embedding.