Commit 2330cbd

Merge pull request #30 from bhavnicksm/main
Update README; Add ToC + Update supported
2 parents 979b418 + 7d54bc8

3 files changed: +65 −26 lines

README.md

Lines changed: 53 additions & 19 deletions
@@ -2,7 +2,6 @@
 
 # Nadir
 
-
 ![PyPI - Downloads](https://img.shields.io/pypi/dm/nadir)
 ![GitHub commit activity](https://img.shields.io/github/commit-activity/m/Dawn-Of-Eve/nadir)
 ![GitHub Repo stars](https://img.shields.io/github/stars/Dawn-Of-Eve/nadir?style=social)
@@ -12,34 +11,33 @@
 
 PyTorch is a popular machine learning framework that provides a flexible and efficient way of building and training deep neural networks. This library, Nadir, is built on top of PyTorch to provide high-performing general-purpose optimisation algorithms.
 
-:warning: ***Currently in Developement Beta version with every update having breaking changes; user discreation and caution advised!*** :warning:
-
-## Supported Optimisers
-
-| Optimiser | Paper |
-|:---------: |:-----: |
-| **SGD** | https://paperswithcode.com/method/sgd |
-| **Momentum** | https://paperswithcode.com/method/sgd-with-momentum |
-| **Adagrad** | https://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf |
-| **RMSProp** | https://paperswithcode.com/method/rmsprop |
-| **Adam** | https://arxiv.org/abs/1412.6980v9 |
-| **Adamax** | https://arxiv.org/abs/1412.6980v9 |
-| **Adadelta** | https://arxiv.org/abs/1212.5701v1 |
-
+# Table of Contents
 
+- [Nadir](#nadir)
+- [Table of Contents](#table-of-contents)
+- [Installation](#installation)
+- [Simple Usage](#simple-usage)
+- [Supported Optimisers](#supported-optimisers)
+- [Acknowledgements](#acknowledgements)
+- [Citation](#citation)
 
 
-## Installation
 
-Nadir is on the PyPi packaging Index! :partying_face:
+# Installation
 
-Simply run the following command on your terminal and start using Nadir now!
+You can either choose to install from the PyPI index, in the following manner:
 
 ```bash
 $ pip install nadir
 ```
+or install from source, in the following manner:
 
-## Usage
+```bash
+$ pip install git+https://github.com/Dawn-Of-Eve/nadir.git
+```
+**Note:** Installing from source might lead to a breaking package. It is recommended that you install from PyPI itself.
+
+# Simple Usage
 
 ```python
 import nadir as nd
@@ -51,5 +49,41 @@ model = ...
 config = nd.SGDConfig(lr=learning_rate)
 optimizer = nd.SGD(model.parameters(), config)
 
+# Call the optimizer step
+optimizer.step()
 ```
 
+# Supported Optimisers
+
+| Optimiser | Paper |
+|:---------: |:-----: |
+| **SGD** | https://paperswithcode.com/method/sgd |
+| **Momentum** | https://paperswithcode.com/method/sgd-with-momentum |
+| **Adagrad** | https://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf |
+| **RMSProp** | https://paperswithcode.com/method/rmsprop |
+| **Adam** | https://arxiv.org/abs/1412.6980v9 |
+| **Adamax** | https://arxiv.org/abs/1412.6980v9 |
+| **AdamW** | https://arxiv.org/abs/1711.05101v3 |
+| **Adadelta** | https://arxiv.org/abs/1212.5701v1 |
+| **AMSGrad** | https://arxiv.org/abs/1904.09237v1 |
+| **RAdam** | https://arxiv.org/abs/1908.03265v4 |
+| **Lion** | https://arxiv.org/abs/2302.06675 |
+
+# Acknowledgements
+
+We would like to thank all the amazing contributors of this project who spent so much effort making this repository awesome! :heart:
+
+
+# Citation
+
+You can use the _Cite this repository_ button provided by GitHub or use the following bibtex:
+
+```bibtex
+@software{MinhasNadir,
+  title = {{Nadir: A Library for Bleeding-Edge Optimizers in PyTorch}},
+  author = {Minhas, Bhavnick and Kalathukunnel, Apsal},
+  year = 2023,
+  month = 3,
+  version = {0.0.2}
+}
+```
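The "Simple Usage" snippet added above drives everything through a config object handed to the optimiser (`nd.SGDConfig` → `nd.SGD` → `optimizer.step()`). A library-free sketch of that config-plus-optimiser pattern, using toy stand-ins rather than nadir's actual classes (the parameter representation here is invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class SGDConfig:
    """Hyperparameter bundle, mirroring the nd.SGDConfig idea."""
    lr: float = 0.01


class SGD:
    """Toy optimiser: params is a list of dicts holding 'value' and 'grad'."""

    def __init__(self, params, config):
        self.params = params
        self.config = config

    def step(self):
        # Vanilla SGD update: value <- value - lr * grad
        for p in self.params:
            p["value"] -= self.config.lr * p["grad"]


params = [{"value": 1.0, "grad": 0.5}]
optimizer = SGD(params, SGDConfig(lr=0.1))
optimizer.step()
# value is now 1.0 - 0.1 * 0.5 = 0.95
```

The config object keeps hyperparameters out of the optimiser's constructor signature, which is presumably why every nadir optimiser in this diff ships with a matching `*Config` class.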

src/nadir/__init__.py

Lines changed: 9 additions & 2 deletions
@@ -21,12 +21,14 @@
 from .adam import Adam, AdamConfig
 from .adamax import Adamax, AdamaxConfig
 from .base import BaseOptimizer, BaseConfig
+from .lion import Lion, LionConfig
 from .momentum import Momentum, MomentumConfig
 from .rmsprop import RMSProp, RMSPropConfig
+from .radam import Radam, RadamConfig
 from .sgd import SGD, SGDConfig
 
 
-__version__ = "0.0.2"
+__version__ = "0.0.3"
 
 __all__ = ('Adadelta',
            'AdadeltaConfig',
@@ -35,12 +37,17 @@
            'Adam',
            'AdamConfig',
            'Adamax',
-           'AdamaxConfig',
+           'AdamaxConfig',
+           'Adam'
            'BaseOptimizer',
            'BaseConfig',
+           'Lion',
+           'LionConfig',
            'Momentum',
            'MomentumConfig',
            'RMSProp',
            'RMSPropConfig',
+           'Radam',
+           'RadamConfig',
            'SGD',
            'SGDConfig')
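Note a pitfall in the hunk above: the added `'Adam'` entry has no trailing comma, so Python's implicit string-literal concatenation silently fuses it with the next element of `__all__` instead of adding a separate one. A minimal demonstration of the behaviour:

```python
# Mimics the edited __all__ tuple: 'Adam' is missing its trailing comma,
# so it concatenates with 'BaseOptimizer' into a single bogus entry.
exports = ('AdamaxConfig',
           'Adam'
           'BaseOptimizer',
           'BaseConfig')

# Three entries, not four -- and one of them is 'AdamBaseOptimizer'.
assert exports == ('AdamaxConfig', 'AdamBaseOptimizer', 'BaseConfig')
```

Because `__all__` is only consulted by `from nadir import *`, the fused entry fails quietly until a star-import reports that `AdamBaseOptimizer` does not exist.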

tests/mnist.py

Lines changed: 3 additions & 5 deletions
@@ -58,7 +58,7 @@
 
 # writing the logging args as a namespace obj
 largs = argparse.Namespace()
-largs.run_name : str = 'Nadir-Adadelta 2'
+largs.run_name : str = 'Nadir-Lion'
 largs.run_seed : str = args.seed
 
 
@@ -94,7 +94,7 @@ def forward(self, x):
         return output
 
 
-def train(args, model, device, train_loader, optimizer, epoch):
+def train(args, model, device, train_loader, oAdadeltaptimizer, epoch):
     model.train()
     for (data, target) in (pbar := tqdm(train_loader)):
        data, target = data.to(device), target.to(device)
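The hunk above renames `train`'s `optimizer` parameter to `oAdadeltaptimizer`, which looks like a stray find-and-replace. The danger with such a rename is that it only surfaces when the body (which presumably still uses the old name) actually runs. A hypothetical sketch of the failure mode, not the real `train()` body:

```python
def train(train_loader, oAdadeltaptimizer):
    """Parameter was renamed, but the body still uses the old name."""
    for batch in train_loader:
        optimizer.step()  # NameError: 'optimizer' is not defined


class DummyOpt:
    """Stand-in optimiser so the call can be demonstrated without torch."""
    def step(self):
        pass


try:
    train([0], DummyOpt())
    error = None
except NameError as exc:
    error = str(exc)
```

Defining the function raises nothing; the `NameError` appears only on the first call, which is why this kind of slip survives until the test actually runs.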
@@ -212,13 +212,11 @@ def mnist_tester(model, optimizer=None, args = None):
     run.name = f'{largs.run_name}'
     run.config.update(args)
     run.config.update(largs)
-
 
-
     # Initialising the optimiser
     model = MNISTestNet().to(args.device)
     # config = nd.AdadeltaConfig(lr = args.learning_rate, beta_1=args.betas[0], beta_2=args.betas[1])
-    optimizer = nd.Adadelta(model.parameters())
+    optimizer = nd.Lion(model.parameters())
     # config = AutoConfig(args.params..)
     # optimizer = args.optimizer(config)
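The test harness swaps the optimiser under test from `nd.Adadelta` to the newly added `nd.Lion`. For reference, a scalar sketch of the Lion update rule as described in arXiv:2302.06675, under illustrative names and defaults (this is not nadir's actual implementation):

```python
def sign(x):
    """Sign of x as -1, 0, or +1 (bool arithmetic keeps it dependency-free)."""
    return (x > 0) - (x < 0)


def lion_step(p, g, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update on a scalar parameter p with gradient g and momentum m."""
    c = beta1 * m + (1 - beta1) * g    # interpolate momentum and gradient
    p = p - lr * (sign(c) + wd * p)    # sign update (+ decoupled weight decay)
    m = beta2 * m + (1 - beta2) * g    # momentum is refreshed after the step
    return p, m


# With m=0 and g=0.5, the step moves p by exactly lr in the -sign(g) direction.
p, m = lion_step(p=1.0, g=0.5, m=0.0, lr=0.1)
```

Because the step size is `lr * sign(c)` rather than `lr * c`, every parameter moves by the same magnitude per step, which is why Lion is typically run with a much smaller learning rate than Adam-family optimisers.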
224222
