Mooncake.jl

The goal of the Mooncake.jl project is to produce a reverse-mode AD package written entirely in Julia, which improves on both ReverseDiff.jl and Zygote.jl in several ways and is competitive with Enzyme.jl. Please refer to the docs for more information.

Getting Started

Check that you're running a version of Julia that Mooncake.jl supports. See the SUPPORT_POLICY.md file for more info.

There are several ways to interact with Mooncake.jl. The one we recommend is via DifferentiationInterface.jl. For example, it can be used as follows to compute the gradient of a function mapping a Vector{Float64} to a Float64.

using DifferentiationInterface
import Mooncake

f(x) = sum(cos, x)

# Select Mooncake as the AD backend.
backend = AutoMooncake(; config=nothing)
x = ones(1_000)

# Prepare once (this is where the expensive work happens), then differentiate.
prep = prepare_gradient(f, backend, x)
gradient(f, prep, backend, x)

You should expect prepare_gradient to take a little while to run, but gradient itself to be fast.
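The prep object can be reused, so the preparation cost is paid once and amortised over subsequent calls. A minimal sketch (the gradient-descent loop and step size are purely illustrative, not part of either package):

# Reuse the prepared object across many gradient evaluations, for example in a
# simple gradient-descent loop (illustrative only).
for _ in 1:100
    g = gradient(f, prep, backend, x)
    x .-= 0.1 .* g
end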

We are committed to maintaining support for DifferentiationInterface, which is why we recommend it. If you would rather interact with Mooncake.jl more directly, consider Mooncake.value_and_gradient!!; see its docstring for more information.
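As a rough illustration only, direct usage might look something like the following. This sketch assumes the build_rrule-based workflow; the docstring of Mooncake.value_and_gradient!! is the authoritative reference, and the exact API may differ between versions.

import Mooncake

f(x) = sum(cos, x)
x = ones(1_000)

# Build a rule for f at arguments like x (done once), then evaluate the value
# and gradient. NB: sketch only; consult the docstring of
# Mooncake.value_and_gradient!! for the exact, version-appropriate API.
rule = Mooncake.build_rrule(f, x)
y, (df, dx) = Mooncake.value_and_gradient!!(rule, f, x)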