Neural Arithmetic Logic Units (NALU)
Basic PyTorch implementation of NAC/NALU from *Neural Arithmetic Logic Units* by Trask et al.
Installation
pip install NALU
Usage
from nalu.core import NaluCell, NacCell
from nalu.layers import NaluLayer
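A minimal end-to-end sketch of using these classes. The constructor signatures shown here are assumptions for illustration, not verified against the repo:

```python
import torch
from nalu.core import NaluCell, NacCell
from nalu.layers import NaluLayer

# Assumed signatures (not confirmed from the repo):
# NacCell(in_dim, out_dim), NaluCell(in_dim, out_dim),
# NaluLayer(in_dim, out_dim, n_layers, hidden_dim).
nac = NacCell(2, 1)
nalu = NaluCell(2, 1)
layer = NaluLayer(2, 1, 2, 8)

x = torch.randn(64, 2)   # batch of 64 two-feature inputs
y_nac = nac(x)           # expected shape: (64, 1)
y_nalu = nalu(x)         # expected shape: (64, 1)
y_layer = layer(x)       # expected shape: (64, 1)
```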
Known issues
The NacCell class has a bug that reduces it to a plain linear cell: as currently written, W_ and M_ are never updated, so the tanh and sigmoid have no effect.
I go into more detail in this notebook: https://github.com/vrxacs/NALU/blob/master/fauxNAC.ipynb
https://github.com/bharathgs/NALU/blob/16a4c9965ad371b48d086bfd780f0479ce311d98/nalu/core/nalu_cell.py#L22
Shouldn't W be computed as in the paper: W = tanh(Ŵ) * σ(M̂)?
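A minimal sketch of a NAC cell that applies that constraint on every forward pass, so that gradients flow through the tanh and sigmoid. This is an illustrative reimplementation, not the repo's code; the class name FixedNacCell and the in_dim/out_dim arguments are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedNacCell(nn.Module):
    """NAC cell where W = tanh(W_hat) * sigmoid(M_hat) is recomputed
    from the learnable parameters on every forward pass, instead of
    being baked in once at construction time."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W_hat = nn.Parameter(torch.Tensor(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.Tensor(out_dim, in_dim))
        nn.init.xavier_uniform_(self.W_hat)
        nn.init.xavier_uniform_(self.M_hat)

    def forward(self, x):
        # The tanh/sigmoid product pushes effective weights toward
        # {-1, 0, 1}, biasing the cell toward addition/subtraction.
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        return F.linear(x, W)
```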
As of PyTorch 0.4.1, torch.nn.functional.sigmoid and torch.nn.functional.tanh are deprecated in favor of torch.sigmoid and torch.tanh.
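For example, the call sites would change like this:

```python
import torch

t = torch.randn(3)

# Before (deprecated since PyTorch 0.4.1):
#   torch.nn.functional.sigmoid(t)
#   torch.nn.functional.tanh(t)

# After:
s = torch.sigmoid(t)
h = torch.tanh(t)
```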