
shark-ai: SHARK Modeling and Serving Libraries


SHARK Users

If you're looking to use SHARK, check out our User Guide. For developers, read on.

Sub-projects

shortfin

The shortfin sub-project is SHARK's high-performance inference library and serving engine. A sketch of querying a running server follows the link below.

  • API documentation for shortfin is available on readthedocs.
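As an illustrative sketch of what talking to a running shortfin LLM server can look like, a plain HTTP client is enough. The endpoint path, port, and payload field names below are assumptions made for illustration; docs/shortfin/llm/user/llama_serving.md documents the actual interface.

```python
# Minimal sketch of querying a shortfin LLM server over HTTP.
# Assumptions (see docs/shortfin/llm/user/llama_serving.md for the real
# interface): the server listens on localhost:8000 and accepts a JSON
# body with a "text" prompt field on a /generate route.
import requests


def generate(prompt: str, base_url: str = "http://localhost:8000") -> str:
    payload = {
        "text": prompt,           # assumed field name for the prompt
        "sampling_params": {      # assumed knob for generation length
            "max_completion_tokens": 64,
        },
    }
    resp = requests.post(f"{base_url}/generate", json=payload, timeout=120)
    resp.raise_for_status()
    return resp.text


if __name__ == "__main__":
    print(generate("Name the capital of the United States."))
```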

sharktank

The SHARK Tank sub-project contains a collection of model recipes and conversion tools to produce inference-optimized programs.
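To give a rough idea of how a conversion flow might be driven, the sketch below shells out to an export script. It is purely illustrative: the module path sharktank.examples.export_paged_llm_v1 and its flag names are assumptions here, so consult the sharktank documentation for the real entry points and options.

```python
# Illustrative sketch of a sharktank export step driven from Python.
# The module path and flag names are assumptions; consult the sharktank
# documentation for the actual export entry points and options.
import subprocess
import sys


def export_llm(gguf_path: str, mlir_out: str, config_out: str) -> None:
    """Export a GGUF llama checkpoint to MLIR plus a config for compilation."""
    cmd = [
        sys.executable, "-m",
        "sharktank.examples.export_paged_llm_v1",  # assumed module path
        f"--gguf-file={gguf_path}",      # assumed flag: input checkpoint
        f"--output-mlir={mlir_out}",     # assumed flag: exported program
        f"--output-config={config_out}", # assumed flag: model/server config
    ]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    export_llm("llama3_8b_f16.gguf", "llama3_8b_f16.mlir", "config.json")
```

The exported program is then typically compiled with IREE before being served by one of the shortfin apps.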

sharktuner

The SHARK Tuner sub-project assists with tuning program performance by searching for optimal parameter configurations to use during model compilation. Check out the readme for more details.

sharkfuser

The SHARK Fuser sub-project is home to Fusili, a C++ graph API and frontend to the IREE compiler and runtime stack for JIT compilation and execution of training and inference graphs. It exposes cuDNN-like primitives backed by IREE code-generated kernels. Check out the readme for more details.

Support matrix

Models

| Model name | Model recipes | Serving apps | Guide |
|------------|---------------|--------------|-------|
| SDXL | sharktank/sharktank/models/punet/ | shortfin/python/shortfin_apps/sd/ | shortfin/python/shortfin_apps/sd/README.md |
| llama | sharktank/sharktank/models/llama/ | shortfin/python/shortfin_apps/llm/ | docs/shortfin/llm/user/llama_serving.md |
| Flux | sharktank/sharktank/models/flux/ | shortfin/python/shortfin_apps/flux/ | shortfin/python/shortfin_apps/flux/README.md |
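As a companion to the table above, the sketch below shows what a client for the SDXL serving app could look like. The port, route, and request/response fields are assumptions made for illustration; shortfin/python/shortfin_apps/sd/README.md describes the actual request schema.

```python
# Illustrative client for a running shortfin SDXL server.
# Route, port, and field names are assumptions; see
# shortfin/python/shortfin_apps/sd/README.md for the real request schema.
import base64
import requests


def text_to_image(prompt: str, out_path: str = "out.png",
                  base_url: str = "http://localhost:8000") -> None:
    payload = {
        "prompt": [prompt],       # assumed field name
        "neg_prompt": [""],       # assumed field name
        "steps": [20],            # assumed sampling-steps knob
        "guidance_scale": [7.5],  # assumed CFG knob
    }
    resp = requests.post(f"{base_url}/generate", json=payload, timeout=600)
    resp.raise_for_status()
    # Assume the response is JSON with base64-encoded PNGs under "images".
    images = resp.json().get("images", [])
    if images:
        with open(out_path, "wb") as f:
            f.write(base64.b64decode(images[0]))


if __name__ == "__main__":
    text_to_image("a photo of a shark breaching at sunset")
```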

SHARK Developers

If you're looking to develop SHARK, check out our Developer Guide.