
Add custom logsigmoid grad for PyTorch executor #1555

Open
mruberry opened this issue Dec 13, 2024 · 0 comments
Labels: autograd, operators, thunderfx (for things that could be applicable to the dynamo+thunder frontend)

Comments

mruberry (Collaborator) commented Dec 13, 2024

See #1520, where some details of PyTorch's logsigmoid grad implementation are discussed and supported.

I think we can avoid creating the extra CUDA buffer tensor (even though it's empty) by implementing a thunder grad formula that doesn't use it, and then adding a special grad formula for logsigmoid, used when the PyTorch executor runs the operation, that does include the buffer tensor.
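For context on why no buffer is needed mathematically: the derivative of logsigmoid is expressible purely in terms of the input, since d/dx log(sigmoid(x)) = sigmoid(-x). Below is a minimal, framework-free sketch of such a buffer-free grad formula, using plain Python floats for illustration only; an actual thunder grad rule would be registered against tensor proxies, and the function names here are hypothetical:

```python
import math

def logsigmoid(x: float) -> float:
    """Numerically stable log(sigmoid(x))."""
    if x >= 0:
        return -math.log1p(math.exp(-x))
    return x - math.log1p(math.exp(x))

def logsigmoid_grad(x: float, grad_out: float = 1.0) -> float:
    """Backward of logsigmoid: d/dx log(sigmoid(x)) = sigmoid(-x).

    No auxiliary buffer tensor is required; the gradient depends only
    on the forward input (computed in a numerically stable way).
    """
    if x >= 0:
        e = math.exp(-x)            # e <= 1, no overflow
        return grad_out * (e / (1.0 + e))
    return grad_out * (1.0 / (1.0 + math.exp(x)))

# Finite-difference sanity check of the analytic gradient.
eps = 1e-6
for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    fd = (logsigmoid(x + eps) - logsigmoid(x - eps)) / (2 * eps)
    assert abs(logsigmoid_grad(x) - fd) < 1e-5
```

A PyTorch-executor-specific rule could then still lower to `torch.ops.aten.log_sigmoid_backward`, which carries the buffer argument, while the generic thunder formula stays buffer-free.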

cc @apaz-cli

@mruberry mruberry mentioned this issue Dec 13, 2024
4 tasks
@tfogal tfogal added the thunderfx for things that could be applicable to the dynamo+thunder frontend label Dec 13, 2024