AnyGrad is a simple tensor library that makes it easy to perform forward and backward passes. It pairs a high-performance C++ backend with a user-friendly Python frontend, and the backend can be swapped out easily.
Note: version 0.0.1 does not yet support any engine. In the future, integrations with engines like `numpy`, `pytorch`, etc. will be added, and you will be able to use them for anything from tensor operations to high-level transformer training.
Install the library from PyPI:
```bash
pip install anygrad
```

If you'd like to work on the code:

```bash
git clone https://github.com/Ruhaan838/AnyGrad.git
./setup.sh
```

Create tensors by importing the library and instantiating `Tensor`. By default, gradients are not tracked unless you enable them:

```python
import anygrad
# A tensor that does not calculate gradients
a = anygrad.Tensor([1, 2, 3])
# A tensor with gradient tracking enabled
b = anygrad.Tensor([2, 3, 4], requires_grad=True)
# A tensor with a specific data type (float64)
c = anygrad.Tensor([2, 3, 4], dtype=anygrad.float64)
```

Other supported dtypes:
```python
anygrad.int32
anygrad.int64
anygrad.bool
```
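All of these can be passed through the same `dtype=` keyword shown above. A minimal sketch (whether `anygrad.bool` accepts integer inputs and coerces them is an assumption):

```python
import anygrad

i32 = anygrad.Tensor([1, 2, 3], dtype=anygrad.int32)
i64 = anygrad.Tensor([1, 2, 3], dtype=anygrad.int64)
mask = anygrad.Tensor([1, 0, 1], dtype=anygrad.bool)  # assumed to coerce to booleans
```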
Perform calculations on tensors element by element:

```python
d = a + b # addition
d = a * d # multiplication
d = d / 10 # division
e = d - 10 # subtraction
```
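Gradient tracking propagates through these operations: since `b` requires gradients, results such as `d` and `e` participate in the autograd graph too. A small sketch (reading `requires_grad` back as an attribute on results is an assumption, mirroring other autograd libraries):

```python
import anygrad

a = anygrad.Tensor([1, 2, 3])                      # not tracked
b = anygrad.Tensor([2, 3, 4], requires_grad=True)  # tracked

d = a + b
# the result is part of the graph (assumed attribute)
print(d.requires_grad)  # True
```

You can multiply matrices in two ways: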
```python
# Using the @ operator:
a = anygrad.ones((1, 2, 3), requires_grad=True)
b = anygrad.ones((2, 3, 4), requires_grad=True)
c = a @ b # tensor of shape (2, 2, 4)
# Or using the function:
c = anygrad.matmul(a, b)
```
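Note that the leading batch dimensions broadcast NumPy-style, which is why a `(1, 2, 3)` tensor multiplied with a `(2, 3, 4)` tensor yields shape `(2, 2, 4)`: the size-1 batch dimension of `a` is repeated across `b`'s batch of 2. A minimal sketch of this behaviour (the `.shape` attribute is an assumption; only `anygrad.ones` and `@` appear in the examples above):

```python
import anygrad

a = anygrad.ones((1, 2, 3), requires_grad=True)  # batch of 1
b = anygrad.ones((2, 3, 4), requires_grad=True)  # batch of 2

c = a @ b        # one (2, 3) x (3, 4) matmul per broadcast batch entry
print(c.shape)   # (2, 2, 4) -- assumed attribute
```

AnyGrad automatically computes gradients, which you can access after running the backward pass:

```python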
a = anygrad.Tensor([1, 2, 3], requires_grad=True)
b = anygrad.Tensor([2, 3, 4], requires_grad=True)
c = a * b
result = c.sum()
result.backward()
print(a.grad) # d(sum(a*b))/da = b, i.e. [2, 3, 4]
print(b.grad) # d(sum(a*b))/db = a, i.e. [1, 2, 3]
```
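Putting the pieces together, here is a slightly larger sketch that differentiates through a batched matrix multiplication. It assumes only the API shown above (`anygrad.ones`, `@`, `sum()`, `backward()`, `.grad`):

```python
import anygrad

# differentiate through a batched matmul
x = anygrad.ones((2, 3, 4), requires_grad=True)
w = anygrad.ones((2, 4, 5), requires_grad=True)

out = (x @ w).sum()   # scalar loss
out.backward()        # populate .grad on x and w

# with all-ones inputs, every entry of x.grad is 5.0 (w's last dim)
# and every entry of w.grad is 3.0 (x's row dim)
print(x.grad)
print(w.grad)
```

Contributions are welcome! Whether you want to improve performance or enhance the documentation, please open an issue or submit a pull request.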
This project is licensed under the terms outlined in the LICENSE file.