Proper Testing & better use of Lux layers #58
Conversation
Can some of the adjoints here be dropped now that NonlinearSolve.jl directly supports the adjoints? (@frankschae)
The only difference from the upstreamed version is (Initially, it made sense to have this here so that I could iterate faster, but right now there is no other blocker for upstreaming.)
That could be done. We've been discussing canonicalization here: SciML/DifferentialEquations.jl#881
Codecov Report
@@            Coverage Diff             @@
##             main      #58      +/-   ##
==========================================
+ Coverage   85.52%   90.65%   +5.12%
==========================================
  Files          14       12       -2
  Lines         456      503      +47
==========================================
+ Hits          390      456      +66
+ Misses         66       47      -19
# TODO(@avik-pal): Move to Lux.jl or maybe CRC?
function CRC.rrule(::Type{T}, args...) where {T <: NamedTuple}
CRC I would think?
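For context, a constructor `rrule` like the one under discussion typically just forwards the tangent of each field back to the corresponding positional argument. Below is a minimal hedged sketch of that idea, assuming `CRC` is `ChainRulesCore` and that the pullback receives a structural tangent for the `NamedTuple`; it is not the actual implementation from this PR, and defining it on `Type{T}` outside CRC is the kind of piracy that motivates upstreaming:

```julia
using ChainRulesCore
const CRC = ChainRulesCore

# Sketch: reverse rule for NamedTuple construction. The forward pass wraps
# the positional args into the NamedTuple; the pullback unwraps the tangent
# of each field and returns it for the matching argument.
function CRC.rrule(::Type{T}, args...) where {T <: NamedTuple}
    y = T(args)
    function NamedTuple_pullback(Δ)
        # Δ is assumed to be a Tangent{<:NamedTuple}; `backing` exposes one
        # tangent per field, in field order.
        return (NoTangent(), values(CRC.backing(Δ))...)
    end
    return y, NamedTuple_pullback
end
```

If an equivalent rule already ships with ChainRulesCore or Lux, the local copy here can simply be deleted, which is the point of the comment above.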
Yeah, I think it's fine to merge and then try upstreaming some of those pieces incrementally.
Supersedes #56 and #54