Serving SDXL + LCM LoRAs with BentoML

Latent Consistency Models (LCM) introduce a method to speed up image generation with models like Stable Diffusion (SD) and Stable Diffusion XL (SDXL). By applying LCM LoRAs to SD-based models, you can generate images in just 2 to 8 inference steps, significantly reducing computation time.
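For context, here is a minimal sketch of how an LCM LoRA is typically applied to an SDXL pipeline with Hugging Face diffusers. This is an illustration, not the code from service.py in this repo; the model and LoRA IDs are the commonly published ones.

```python
def build_lcm_pipeline(device="cuda"):
    """Load SDXL and apply an LCM LoRA; returns a diffusers pipeline."""
    import torch
    from diffusers import DiffusionPipeline, LCMScheduler

    pipe = DiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16,
        variant="fp16",
    )
    # LCM requires its own scheduler in place of the default one
    pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)
    # Apply the LCM LoRA weights on top of the base model
    pipe.load_lora_weights("latent-consistency/lcm-lora-sdxl")
    return pipe.to(device)

# Usage (requires a GPU and downloads the model weights):
# pipe = build_lcm_pipeline()
# image = pipe(
#     "a photo of a cat",
#     num_inference_steps=4,   # 2-8 steps suffice with LCM
#     guidance_scale=1.0,      # LCM works best with low guidance
# ).images[0]
```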

This is a BentoML example project demonstrating how to build a REST API server for SDXL using LCM LoRAs. See here for a full list of BentoML example projects.

Prerequisites

If you want to test the Service locally, we recommend an Nvidia GPU with at least 12 GB of VRAM.

Install dependencies

git clone https://github.com/bentoml/BentoDiffusion.git
cd BentoDiffusion/lcm

# Python 3.11 is recommended
pip install -r requirements.txt

Run the BentoML Service

We have defined a BentoML Service in service.py. Run bentoml serve in your project directory to start the Service.

bentoml serve .

The server is now active at http://localhost:3000. You can interact with it using the Swagger UI or in other ways, for example:

CURL

curl -X 'POST' \
  'http://localhost:3000/txt2img' \
  -H 'accept: image/*' \
  -H 'Content-Type: application/json' \
  -d '{
  "prompt": "close-up photography of old man standing in the rain at night, in a street lit by lamps, leica 35mm summilux"
}' -o out.jpg
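The same request can be made from Python with only the standard library. A minimal sketch, assuming the Service is running on localhost:3000:

```python
import json
from urllib import request

def build_payload(prompt: str) -> bytes:
    # Same JSON body as the curl example above
    return json.dumps({"prompt": prompt}).encode("utf-8")

def txt2img(prompt: str, host: str = "http://localhost:3000") -> bytes:
    req = request.Request(
        f"{host}/txt2img",
        data=build_payload(prompt),
        headers={"Content-Type": "application/json", "accept": "image/*"},
    )
    with request.urlopen(req) as resp:
        return resp.read()  # raw image bytes

# Usage:
# with open("out.jpg", "wb") as f:
#     f.write(txt2img("close-up photography of old man standing in the rain"))
```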

Python client

import bentoml
from pathlib import Path

with bentoml.SyncHTTPClient("http://localhost:3000") as client:
    result_path = client.txt2img(
        guidance_scale=1,
        num_inference_steps=4,
        prompt="close-up photography of old man standing in the rain at night, in a street lit by lamps, leica 35mm summilux",
    )

    destination_path = Path("/path/to/save/image.png")
    result_path.rename(destination_path)

For detailed explanations of the Service code, see Stable Diffusion XL with LCM LoRAs.

Deploy to BentoCloud

After the Service is ready, you can deploy the application to BentoCloud for better management and scalability. Sign up for a BentoCloud account if you don't have one.

Make sure you have logged in to BentoCloud, then run the following command to deploy it.

bentoml deploy .

Once the application is up and running on BentoCloud, you can access it via the exposed URL.

Note: For custom deployment in your own infrastructure, use BentoML to generate an OCI-compliant image.