fabrique provides foundation components for ML research in the LLM/VLM space, including:
- model implementations
- fine-tuning routines and examples
- multi-GPU execution
- interoperability with the broader ecosystem
fabrique is written in JAX/Flax NNX and follows their philosophy.
You can install the latest released version of fabrique from PyPI:

pip install fabrique

Alternatively, you can add the development version of fabrique directly to your project and use the existing code as a reference for your own models:
cd /path/to/your/project
# clone the repository
mkdir lib
git clone https://github.com/ridcl/fabrique lib/fabrique
# or even add it as a submodule
# git submodule add [email protected]:ridcl/fabrique.git lib/fabrique
# set up PYTHONPATH to include fabrique as a package
export PYTHONPATH=${PYTHONPATH}:lib/fabrique/src
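If you prefer not to set PYTHONPATH in the shell, the same effect can be achieved from Python by putting the checkout on sys.path before importing. This is a minimal sketch assuming the lib/fabrique/src layout created by the commands above:

```python
import sys
from pathlib import Path

# Source directory of the cloned checkout (assumed layout from above)
FABRIQUE_SRC = Path("lib/fabrique/src")

# Prepend so the checkout takes priority over any installed version,
# avoiding duplicate entries on repeated runs
src = str(FABRIQUE_SRC.resolve())
if src not in sys.path:
    sys.path.insert(0, src)

# From here on, `import fabrique` resolves against the checkout.
```

This keeps the path tweak local to one entry-point script instead of the whole shell environment.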
As of now, fabrique focuses on Gemma as its primary LLM/VLM implementation. Previously, fabrique also supported Llama 3, Phi 3/4, and Qwen 2.5; you can find these older implementations in legacy/src/fabrique/models.