
Proper Testing & better use of Lux layers #58

Merged
merged 2 commits from ap/expt into main on Jul 18, 2022
Conversation

Member

@avik-pal avik-pal commented Jul 16, 2022

Supersedes #56 and #54

@ChrisRackauckas
Member

Can some of the adjoints here be dropped now that NonlinearSolve.jl directly supports the adjoints?

(@frankschae )

@avik-pal
Member Author

The only differences from the upstreamed version are jacobian-free backprop, which is a trivial check, and that this version works with ps as a NamedTuple using fmap. Would it be fine to add a dep on Functors?
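For context, the jacobian-free backprop mentioned here can be sketched in a few lines of plain Julia. Everything below (the toy map, the function names) is an illustrative assumption, not the package's API: for a fixed point z* = f(z*, x), the exact adjoint solves v = (∂f/∂z*)' v + ḡ for the cotangent v, while the jacobian-free variant simply drops the Jacobian term and takes v ≈ ḡ, which is why checking for it is trivial.

```julia
# Toy contraction whose fixed point we solve for (illustrative only).
f(z, x) = tanh.(z .+ x) ./ 2

# Naive fixed-point iteration; hypothetical helper, not the package's solver.
function fixed_point(f, x; iters = 200)
    z = zero(x)
    for _ in 1:iters
        z = f(z, x)   # contraction, so this converges geometrically
    end
    return z
end

x = [0.3, -0.1]
zstar = fixed_point(f, x)       # zstar satisfies zstar ≈ f(zstar, x)

ḡ = ones(length(zstar))         # incoming cotangent ∂L/∂z*
# Exact adjoint: solve (I - ∂f/∂z*)' v = ḡ.
# Jacobian-free backprop: skip that linear solve entirely.
v_jacfree = ḡ
```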

(Initially it made sense to have this here so that I could iterate faster, but right now there is no other blocker for upstreaming.)
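The fmap-over-NamedTuple behavior being discussed can be illustrated with a minimal sketch. The parameter names below are made up for the example; the point is that Functors.fmap applies a function to every array leaf of a NamedTuple while preserving its structure, which is what lets layers accept ps as a plain NamedTuple:

```julia
using Functors  # the dependency under discussion

# A hypothetical NamedTuple of layer parameters (names are illustrative).
ps = (weight = [1.0 2.0; 3.0 4.0], bias = [0.5, 0.5])

# fmap walks the tree and transforms each array leaf, keeping the
# NamedTuple structure intact.
ps_scaled = fmap(x -> 2 .* x, ps)   # (weight = [2.0 4.0; 6.0 8.0], bias = [1.0, 1.0])
ps_zeroed = fmap(zero, ps)          # same shapes, all-zero leaves
```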

@ChrisRackauckas
Member

That could be done. We've been discussing canonicalization here: SciML/DifferentialEquations.jl#881

@avik-pal avik-pal force-pushed the ap/expt branch 8 times, most recently from 3db3f49 to 75ac2a7 Compare July 17, 2022 03:41
@codecov

codecov bot commented Jul 17, 2022

Codecov Report

Merging #58 (7718e1e) into main (58a444f) will increase coverage by 5.12%.
The diff coverage is 91.63%.

@@            Coverage Diff             @@
##             main      #58      +/-   ##
==========================================
+ Coverage   85.52%   90.65%   +5.12%     
==========================================
  Files          14       12       -2     
  Lines         456      503      +47     
==========================================
+ Hits          390      456      +66     
+ Misses         66       47      -19     
Impacted Files Coverage Δ
src/operator.jl 80.00% <0.00%> (-20.00%) ⬇️
src/solvers/termination.jl 64.93% <70.00%> (+6.03%) ⬆️
src/adjoint.jl 79.31% <76.47%> (-1.78%) ⬇️
src/layers/jacobian_stabilization.jl 90.90% <90.90%> (+90.90%) ⬆️
src/layers/core.jl 94.28% <93.33%> (+37.14%) ⬆️
src/layers/mdeq.jl 97.84% <96.72%> (-2.16%) ⬇️
src/layers/deq.jl 100.00% <100.00%> (+4.00%) ⬆️
src/solve.jl 100.00% <100.00%> (+8.00%) ⬆️
src/solvers/discrete/broyden.jl 100.00% <100.00%> (+5.00%) ⬆️
src/solvers/discrete/limited_memory_broyden.jl 100.00% <100.00%> (ø)
... and 7 more


@avik-pal avik-pal force-pushed the ap/expt branch 7 times, most recently from 3f2f8bc to 6eadf85 Compare July 18, 2022 08:02
@avik-pal avik-pal changed the title from "[Extremely WIP] Proper Testing & better use of Lux layers" to "Proper Testing & better use of Lux layers" Jul 18, 2022
@avik-pal avik-pal requested a review from ChrisRackauckas July 18, 2022 08:36
Comment on lines +10 to +11
# TODO(@avik-pal): Move to Lux.jl or maybe CRC?
function CRC.rrule(::Type{T}, args...) where {T <: NamedTuple}
Member


CRC, I would think?

@ChrisRackauckas
Member

Yeah, I think it's fine to merge and then try upstreaming those pieces gradually.

@ChrisRackauckas ChrisRackauckas merged commit 4c42243 into main Jul 18, 2022
@ChrisRackauckas ChrisRackauckas deleted the ap/expt branch July 18, 2022 13:19