tamnun is a Python framework for machine and deep learning algorithms and methods, with a focus on Natural Language Processing and Transfer Learning. The aim of tamnun is to provide easy-to-use interfaces for building powerful models based on recent state-of-the-art (SOTA) methods.
For more about tamnun, feel free to read the introduction to TamnunML on Medium.
tamnun depends on several other machine learning and deep learning frameworks, such as keras. To install tamnun and all of its dependencies, run:
```bash
$ git clone https://github.com/hiredscorelabs/tamnun-ml
$ cd tamnun-ml
$ python setup.py install
```
Or using PyPI:
```bash
pip install tamnun
```
Jump in and try out an example:
```bash
$ cd examples
$ python finetune_bert.py
```
Or take a look at the Jupyter notebooks here.
BERT stands for Bidirectional Encoder Representations from Transformers, a language model trained by Google and introduced in their paper. Here we use the excellent PyTorch-Pretrained-BERT library and wrap it to provide an easy-to-use scikit-learn interface for BERT fine-tuning. At the moment, the tamnun BERT classifier supports binary and multi-class classification. To fine-tune BERT on a specific task:
```python
from tamnun.bert import BertClassifier, BertVectorizer
from sklearn.pipeline import make_pipeline

clf = make_pipeline(BertVectorizer(), BertClassifier(num_of_classes=2)).fit(train_X, train_y)
predicted = clf.predict(test_X)
```
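Since the pipeline follows the standard scikit-learn interface, you can evaluate the fine-tuned classifier with the usual scikit-learn utilities. A minimal sketch, assuming `test_y` holds the true integer labels for `test_X` (these names are illustrative, not part of tamnun):

```python
from sklearn.metrics import accuracy_score, classification_report

# predict() returns one label per input text, like any scikit-learn classifier
print(accuracy_score(test_y, predicted))
print(classification_report(test_y, predicted))
```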
Please see this notebook for a full code example.
Fitting (almost) any PyTorch Module using just one line
You can use the TorchEstimator object to fit any PyTorch module with just one line:
```python
from torch import nn
from tamnun.core import TorchEstimator

module = nn.Linear(128, 2)
clf = TorchEstimator(module, task_type='classification').fit(train_X, train_y)
```
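The same one-liner is not limited to built-in layers. Below is a hedged sketch of fitting a small custom network, assuming `train_X` is shaped `(n_samples, 128)` and `train_y` holds integer class labels; the architecture itself is illustrative, not part of tamnun:

```python
from torch import nn
from tamnun.core import TorchEstimator

# A small custom feed-forward network mapping 128 input features to 2 class scores
net = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

clf = TorchEstimator(net, task_type='classification').fit(train_X, train_y)
```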
Distiller Transfer Learning
This module distills a very large model (such as BERT) into a much smaller one. Inspired by this paper.
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from tamnun.bert import BertClassifier, BertVectorizer
from tamnun.transfer import Distiller

# Teacher: a (truncated) BERT classifier
bert_clf = make_pipeline(BertVectorizer(do_truncate=True, max_len=3), BertClassifier(num_of_classes=2))

# Student: a much smaller bag-of-ngrams linear model
distilled_clf = make_pipeline(CountVectorizer(ngram_range=(1, 3)), LinearRegression())

distiller = Distiller(teacher_model=bert_clf, teacher_predict_func=bert_clf.decision_function,
                      student_model=distilled_clf).fit(train_texts, train_y, unlabeled_X=unlabeled_texts)

predicted_logits = distiller.transform(test_texts)
```
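As the variable name suggests, `transform` returns the student's raw logits rather than hard labels. A minimal sketch for turning them into class predictions, assuming `predicted_logits` comes back as a NumPy array of shape `(n_samples, num_of_classes)`:

```python
import numpy as np

# Pick the highest-scoring class for each example
predicted_classes = np.argmax(predicted_logits, axis=1)
```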
For a full BERT distillation example, see this notebook.
You can ask questions and join the development discussion on GitHub Issues.
Apache License 2.0 (same as TensorFlow)