[Update] Residual Networks implementation on Keras

Here is my implementation of residual networks in Keras (on Theano). I’ll keep readme.md up to date, so the most current information will be there.

14 Apr 2016: I updated the implementation. example.py now works properly, shows reasonable performance, and has been rewritten with the Keras 1.0 API (which is AWESOME). The structure also follows the authors’ new paper, Identity Mappings in Deep Residual Networks (variant (b) in the figure below).

[Figure: residual unit variants from Identity Mappings in Deep Residual Networks]
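The pre-activation ordering (BN → ReLU → weight, then an identity shortcut added back) can be sketched in plain NumPy. This is a simplified illustration, not the repo’s Keras code: the batch norm here omits the learned scale and shift, and the "weight" layers are plain matrix multiplies standing in for convolutions.

```python
import numpy as np

def bn(x, eps=1e-5):
    # simplified batch norm over the batch axis (no learned gamma/beta)
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def relu(x):
    return np.maximum(x, 0.0)

def residual_unit(x, w1, w2):
    # pre-activation residual unit: BN -> ReLU -> weight, twice,
    # then add the untouched identity shortcut
    h = relu(bn(x)) @ w1
    h = relu(bn(h)) @ w2
    return x + h

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))        # batch of 8, 16 features
w1 = rng.standard_normal((16, 16)) * 0.1
w2 = rng.standard_normal((16, 16)) * 0.1
y = residual_unit(x, w1, w2)
print(y.shape)  # (8, 16)
```

Note that the shortcut path is left completely clean: if both weight matrices were zero, the unit would reduce exactly to the identity, which is the point of the identity-mapping formulation.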



8 thoughts on “[Update] Residual Networks implementation on Keras”

  1. Thank you for your work; I’m very interested in implementing this model. Before I start, are there any tips for training it? Do you train it from scratch, or start with a smaller model? What kind of accuracy have you attained with this model? Assuming you have a GPU, what kind of training times do you see (or expect)?


    1. Hi, so far I only have the benchmark on MNIST, which is in the readme.md of the repo. Perhaps I could extend it to CIFAR, as it’s quite handy to get in Keras, but I’m not sure I could provide a model trained on a huge dataset such as ImageNet. Also, with my GPU (a Tesla with 12 GB of memory), the authors’ model (resnet-151) would not fit.
      If there’s a PR that prepares all the necessary pieces, I’m willing to run it on my GPU and share more information/trained weights. I should have some free GPU time soon.

