Keras resources

This is a directory of tutorials and open-source code repositories for working with Keras, the Python deep learning library.

If you have a high-quality tutorial or project to add, please open a PR.

Official starter resources

Tutorials

Books based on Keras

Code examples

Working with text

Working with images

Creative visual applications

Reinforcement learning

  • DQN
  • FlappyBird DQN
  • async-RL: TensorFlow + Keras + OpenAI Gym implementation of 1-step Q-learning from "Asynchronous Methods for Deep Reinforcement Learning"
  • keras-rl: A library for state-of-the-art reinforcement learning. Integrates with OpenAI Gym and implements DQN, double DQN, Continuous DQN, and DDPG.
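As a minimal illustration of the DQN idea behind these projects, here is a sketch of a Q-network in Keras; the sizes are placeholders (e.g. CartPole: 4-dimensional state, 2 actions) and nothing here is taken from any of the repositories above:

```python
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Dense

# Placeholder sizes (e.g. CartPole), not from any repo above
state_dim, n_actions = 4, 2

# A DQN maps a state to one Q-value per action; the greedy policy is the argmax
q_net = Sequential([
    Input(shape=(state_dim,)),
    Dense(32, activation="relu"),
    Dense(32, activation="relu"),
    Dense(n_actions),  # linear output: unbounded Q-value estimates
])
q_net.compile(optimizer="adam", loss="mse")

state = np.random.rand(1, state_dim).astype("float32")
action = int(np.argmax(q_net.predict(state, verbose=0), axis=-1)[0])
print(q_net.output_shape)  # (None, 2)
```

In a full DQN the network is trained with MSE toward bootstrapped targets computed from a separate, periodically synced target network.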

Miscellaneous architecture blueprints

Third-party libraries

  • Elephas: Distributed Deep Learning with Keras & Spark
  • Hyperas: Hyperparameter optimization
  • Hera: in-browser metrics dashboard for Keras models
  • Kerlym: reinforcement learning with Keras and OpenAI Gym
  • Qlearning4K: reinforcement learning add-on for Keras
  • seq2seq: Sequence to Sequence Learning with Keras
  • Seya: Keras extras
  • Keras Language Modeling: Language modeling tools for Keras
  • Recurrent Shop: Framework for building complex recurrent neural networks with Keras
  • Keras.js: Run trained Keras models in the browser, with GPU support
  • keras-vis: Neural network visualization toolkit for Keras

Projects built with Keras

Comments

  • added Mask R-CNN to Working with images

    Mar 6, 2018

    Implementation of Mask R-CNN based on arXiv:1703.06870 by Kaiming He, Georgia Gkioxari, Piotr Dollár, Ross Girshick

  • Update README.md

    Apr 12, 2018

    Add link to blog on one-shot learning in Keras

  • Update README.md

    Apr 13, 2018

    Adding Attention based Language Translation in Keras

  • Updated README.md with the project that I made using Keras.

    Jun 13, 2018

    https://github.com/Gogul09/flower-recognition

  • added Talos hyperparameter optimization

    Jul 4, 2018

    disclaimer: I'm the author of the package

  • Link broken

    Aug 24, 2018

    Stateful LSTM, Siamese Network link is broken.

  • Added multimodal autoencoders repository link

    Nov 15, 2018

  • Added links to model converters.

    Dec 2, 2018

    Hello. I added links to these model converters:

    • pytorch2keras - Convert PyTorch models to Keras (with TensorFlow backend) format
    • gluon2keras - Convert Gluon models to Keras (with TensorFlow backend) format
  • Update README.md

    Oct 12, 2019

  • Research

    May 17, 2020

    Working

  • Added caption_generator: image caption generation

    May 2, 2017

    Added the project to generate image captions built using Keras.

  • NMT-Keras: Neural Machine Translation

    May 4, 2017

  • Add Practical Keras Spot AWS tutorial

    Jun 26, 2017

  • added keras tutorials from ml4a

    Jul 28, 2016

    added ml4a-guides to the tutorials; all of the neural network guides use Keras, only the RL guides don't.

    added image t-SNE to the images section, which shows how to extract fc7 activations from a collection of images, then apply t-SNE to them with sklearn. To do: generate the actual image with Pillow, and show how to assign t-SNE points to a grid.
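    The fc7 + t-SNE pipeline described above can be sketched roughly as follows. This is a minimal sketch, not the ml4a code: weights=None keeps it runnable offline and random arrays stand in for real images; in practice, use weights="imagenet" and properly preprocessed images.

```python
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.models import Model
from sklearn.manifold import TSNE

# weights=None keeps this sketch offline; use weights="imagenet" for real features
base = VGG16(weights=None, include_top=True)
# Keras names VGG16's fully connected layers "fc1"/"fc2"; "fc2" is fc7 in Caffe naming
fc7 = Model(inputs=base.input, outputs=base.get_layer("fc2").output)

images = np.random.rand(8, 224, 224, 3).astype("float32")  # stand-in for an image collection
feats = fc7.predict(images, verbose=0)                     # (8, 4096) fc7 activations
xy = TSNE(n_components=2, perplexity=5.0, init="random").fit_transform(feats)
print(xy.shape)  # (8, 2): one 2-D point per image
```

    Note that t-SNE's perplexity must be smaller than the number of images.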

  • Update README.md

    Jul 29, 2016

    Added link to codebase for doing monolingual and multilingual image description with Keras.

  • Adding tutorials?

    Nov 28, 2016

    Custom layer tutorial: https://keunwoochoi.wordpress.com/2016/11/18/for-beginners-writing-a-custom-keras-layer/ Callback tutorial: https://keunwoochoi.wordpress.com/2016/07/16/keras-callbacks/

    These are very simple ones, please add them if you think they're suitable.
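    To give a flavor of what the callback tutorial covers, a minimal custom callback might look like the sketch below (LossLogger is an illustrative name, not taken from the tutorial):

```python
import numpy as np
import tensorflow as tf

# Minimal custom callback: records the training loss after each epoch
# (LossLogger is a made-up name, not from the tutorial above)
class LossLogger(tf.keras.callbacks.Callback):
    def __init__(self):
        super().__init__()
        self.losses = []

    def on_epoch_end(self, epoch, logs=None):
        self.losses.append(logs["loss"])

model = tf.keras.Sequential([tf.keras.Input(shape=(4,)), tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

cb = LossLogger()
model.fit(np.random.rand(32, 4), np.random.rand(32, 1),
          epochs=2, verbose=0, callbacks=[cb])
print(len(cb.losses))  # 2: one recorded loss per epoch
```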

  • Update README.md

    Nov 6, 2017

    Added Functional API tutorial (http://www.puzzlr.org/the-keras-functional-api-five-simple-examples/)
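    For context, the Functional API builds models by calling layers on tensors; a minimal example (layer sizes here are arbitrary, not taken from the tutorial):

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

# Functional API: layers are called on tensors, and Model ties inputs to outputs
inputs = Input(shape=(8,))
hidden = Dense(16, activation="relu")(inputs)
outputs = Dense(1)(hidden)

model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="mse")
print(model.output_shape)  # (None, 1)
```

    Unlike Sequential, this style also supports multiple inputs, multiple outputs, and shared layers.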

  • Add tutorial to convolutional denoising autoencoder

    Sep 15, 2017

    A beginner's tutorial on using convolutional denoising autoencoders to build a CBIR (content-based image retrieval) system with Keras
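    A minimal convolutional autoencoder of the kind such a tutorial describes might look like this sketch (layer sizes are illustrative, not the tutorial's; for denoising, fit on noisy inputs against clean targets, and use the bottleneck activations as CBIR descriptors):

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Conv2D, MaxPooling2D, UpSampling2D

# Illustrative sizes, not the tutorial's exact architecture
inp = Input(shape=(28, 28, 1))

# Encoder: the 14x14x16 bottleneck doubles as a compact image descriptor for CBIR
x = Conv2D(16, 3, activation="relu", padding="same")(inp)
encoded = MaxPooling2D(2, padding="same")(x)

# Decoder: upsample back to the input resolution
x = Conv2D(16, 3, activation="relu", padding="same")(encoded)
x = UpSampling2D(2)(x)
out = Conv2D(1, 3, activation="sigmoid", padding="same")(x)

autoencoder = Model(inp, out)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
# For denoising, train with noisy inputs and clean targets:
# autoencoder.fit(x_noisy, x_clean, ...)
print(autoencoder.output_shape)  # (None, 28, 28, 1)
```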

  • MultiHeadAttention

    Nov 19, 2021

    I would like to match the results of the self_attention() function on page 339 of the Keras book Deep Learning with Python, second edition, with those of the MultiHeadAttention() example just below it. I wrote an example with the same input and got different results. Can somebody explain why?

    import numpy as np
    from scipy.special import softmax
    from tensorflow.keras.layers import MultiHeadAttention
    
    
    def self_attention(input_sequence):
        output = np.zeros(shape=input_sequence.shape)
        # The output will consist of contextual embeddings of the same shape
        for i, pivot_vector in enumerate(input_sequence):
            scores = np.zeros(shape=(len(input_sequence),))
            for j, vector in enumerate(input_sequence):
                scores[j] = np.dot(pivot_vector, vector.T)  # Q K^T
            scores /= np.sqrt(input_sequence.shape[1])  # sqrt(d_k)
            scores = softmax(scores)  # softmax(Q K^T / sqrt(d_k))
            print(i, scores)
            new_pivot_representation = np.zeros(shape=pivot_vector.shape)
            for j, vector in enumerate(input_sequence):
                new_pivot_representation += vector * scores[j]
            output[i] = new_pivot_representation
        return output
    
    
    test_input_sequence = np.array([[[1.0, 0.0, 0.0, 1.0],
                                     [0.0, 1.0, 0.0, 0.0],
                                     [0.0, 1.0, 1.0, 1.0]]])
    
    test_input_sequence.shape
    # (1, 3, 4)
    
    self_attention(test_input_sequence[0])
    """
    returns
    [[0.50648039 0.49351961 0.30719589 0.81367628]
     [0.23269654 0.76730346 0.38365173 0.61634827]
     [0.21194156 0.78805844 0.57611688 0.78805844]]
     
    the attention scores being:
    [0.50648039 0.18632372 0.30719589]
    [0.23269654 0.38365173 0.38365173]
    [0.21194156 0.21194156 0.57611688]
    """
    att_layer = MultiHeadAttention(num_heads=1,
                                   key_dim=4,
                                   use_bias=False,
                                   attention_axes=(1,))
    
    att_layer(test_input_sequence,
              test_input_sequence,
              test_input_sequence,
              return_attention_scores=True)
    
    """
    returns 
    array([[[-0.46123487,  0.36683324, -0.47130704, -0.00722525],
            [-0.49571565,  0.37488416, -0.52883905, -0.02713571],
            [-0.4566634 ,  0.38055322, -0.45884743, -0.00156384]]],
          dtype=float32)
          
    and the attention scores
    array([[[[0.31446996, 0.36904442, 0.3164856 ],
             [0.34567958, 0.2852166 , 0.36910382],
             [0.2934979 , 0.3996053 , 0.30689687]]]], dtype=float32))
    """
    
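    A likely explanation (my reading, not confirmed elsewhere in the thread): MultiHeadAttention passes the queries, keys, values, and output through learned dense projections, which are randomly initialized in an untrained layer, whereas self_attention() uses the raw inputs directly as Q, K, and V. Dropping the projections reproduces the book's numbers exactly; the sketch below is a vectorized equivalent of self_attention():

```python
import numpy as np
from scipy.special import softmax

def scaled_dot_product_attention(x):
    # Plain attention with no learned projections: Q = K = V = x
    scores = softmax(x @ x.T / np.sqrt(x.shape[-1]), axis=-1)
    return scores @ x

x = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 1.0]])
print(scaled_dot_product_attention(x)[0])
# [0.50648039 0.49351961 0.30719589 0.81367628] -- matches self_attention above
```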
  • Added second "Densely Connected Convolutional Networks"

    Nov 8, 2016

    Added a second DenseNet repository, which contains weights for the DenseNet-40 model (widening factor = 12) which has been trained on CIFAR 10.
