Simple Keras model that tackles the Stanford Natural Language Inference (SNLI) corpus using summation and/or recurrent neural networks

Keras SNLI baseline example. This repository contains a simple Keras baseline to train a variety of neural networks on the Stanford Natural Language Inference (SNLI) corpus. The aim is to determine whether a premise sentence entails, contradicts, or is neutral with respect to a hypothesis sentence.
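A minimal sketch of the summation variant of such a baseline, assuming the tf.keras functional API; the vocabulary size, sequence length, and layer widths below are illustrative, not taken from the repository:

```python
# Hypothetical sketch: a summation-based SNLI baseline in Keras.
# Each sentence is encoded by averaging its word embeddings; the two
# sentence vectors are concatenated and classified into three labels
# (entailment / neutral / contradiction).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB, MAXLEN, DIM = 5000, 42, 64  # illustrative sizes

def build_snli_baseline():
    premise = layers.Input(shape=(MAXLEN,), dtype="int32")
    hypothesis = layers.Input(shape=(MAXLEN,), dtype="int32")
    embed = layers.Embedding(VOCAB, DIM)      # embedding shared by both inputs
    encode = layers.GlobalAveragePooling1D()  # "summation" sentence encoder
    p = encode(embed(premise))
    h = encode(embed(hypothesis))
    merged = layers.concatenate([p, h])
    x = layers.Dense(200, activation="relu")(merged)
    out = layers.Dense(3, activation="softmax")(x)  # three SNLI labels
    model = Model([premise, hypothesis], out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model

model = build_snli_baseline()
# Dummy batch of two premise/hypothesis pairs, just to show the shapes.
preds = model.predict([np.ones((2, MAXLEN), dtype="int32"),
                       np.ones((2, MAXLEN), dtype="int32")], verbose=0)
print(preds.shape)  # one 3-way probability distribution per pair
```

Swapping the GlobalAveragePooling1D encoder for an LSTM or GRU layer gives the recurrent variants mentioned above.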

Related Repos

yg211 This project includes a natural language inference (NLI) model, developed by fine-tuning transformer models on the SNLI, MultiNLI, and HANS datasets.

farizrahman4u Recurrent Shop A framework for building complex recurrent neural networks with Keras. The ability to easily iterate over different neural network architectures is key to doing machine learning research.

jimfleming Recurrent Entity Networks This repository contains an independent TensorFlow implementation of recurrent entity networks from "Tracking the World State with Recurrent Entity Networks".

mirceamironenco Bayesian Recurrent Neural Networks This is a replication of the paper "Bayesian Recurrent Neural Networks" by Meire Fortunato, Charles Blundell, and Oriol Vinyals. Status: basic model replicated.

madvn CTRNN Python package that implements Continuous Time Recurrent Neural Networks (CTRNNs). See Beer, R.D. (1995), "On the dynamics of small continuous-time recurrent neural networks," Adaptive Behavior 3:469-509, for a study of CTRNN dynamics.

lvapeab NMT-Keras Neural Machine Translation with Keras. Includes an attentional recurrent neural network NMT model and a Transformer NMT model.

sina-al pynlp A Pythonic wrapper for Stanford CoreNLP. This library provides a Python interface to Stanford CoreNLP built over corenlp_protobuf.

stanfordnlp Stanford CoreNLP Stanford CoreNLP provides a set of natural language analysis tools written in Java. It can take raw human language text as input and give the base forms of words, their parts of speech, and whether they are names of companies, people, etc.

karpathy char-rnn This code implements a multi-layer Recurrent Neural Network (RNN, LSTM, and GRU) for training/sampling from character-level language models. In other words, the model takes one text file as input and trains a recurrent neural network that learns to predict the next character in a sequence.

facebookresearch This repository contains source code for the TaBERT model, a pre-trained language model for learning joint representations of natural language utterances and (semi-)structured tables for semantic parsing. TaBERT is pre-trained on a massive corpus of 26M Web tables and their associated natural language context, and can be used as a drop-in replacement for a semantic parser's original encoder to compute representations for utterances and table schemas (columns).

lrjconan Efficient Graph Generation with Graph Recurrent Attention Networks, Deep Generative Model of Graphs, Graph Neural Networks, NeurIPS 2019

kjw0612 Awesome Recurrent Neural Networks A curated list of resources dedicated to recurrent neural networks (closely related to deep learning). Maintainers: Myungsub Choi, Taeksoo Kim, Jiwon Kim.

jind11 TextFooler A model for natural language attack on text classification and inference. This is the source code for the paper: Jin, Di, et al., "Is BERT Really Robust? Natural Language Attack on Text Classification and Entailment."

danieldjohnson Biaxial Recurrent Neural Network for Music Composition This code implements a recurrent neural network trained to generate classical music. The model uses LSTM layers and draws inspiration from convolutional neural networks.

pronobis LibSPN Keras is a library for constructing and training Sum-Product Networks. By leveraging the Keras framework with a TensorFlow backend, it offers both ease-of-use and scalability. Whereas the previously available libspn focused on scalability, libspn-keras offers scalability and a straightforward Keras-compatible interface.

iwantooxxoox Keras-OpenFace Keras-OpenFace is a project converting OpenFace from its original Torch implementation to a Keras version. If you are only interested in inference, you can simply load the pre-trained Keras OpenFace model.

jayparks Neural Machine Translation using Quasi-RNN A PyTorch implementation of neural machine translation using "Quasi-Recurrent Neural Networks" (ICLR 2017). Requirements: NumPy >= 1.11.1, PyTorch >= 0.2.0.

jfsantos Keras tutorial This is a basic Keras tutorial, teaching the basics of feedforward, convolutional, and recurrent neural networks. There are also sections on regularization and on using the Keras backend to write portable code.

microsoft Multi-Task Deep Neural Networks for Natural Language Understanding MT-DNN is an open-source natural language understanding (NLU) toolkit that makes it easy for researchers and developers to train customized deep learning models.