Agent Service is a powerful offering within Azure AI Foundry that allows you to develop intelligent AI agents. These agents can be customised to answer questions, autonomously perform sets of tasks, and interact with users naturally and intuitively.
This repo contains the source code for a Streamlit-based UI Demo Kit showcasing various capabilities of the Agent Service, including:
- Solving complex problems with the Code Interpreter (which builds and runs sandboxed Python code);
- Grounding model outputs (completions) with real-time Bing Search results;
- with more to come...
Note
The Streamlit app can be run locally on your computer and requires access to AI models deployed in Azure AI Foundry. Alternatively, you can deploy a pre-built app using the provided Docker image.
- Part 1: Configuring solution environment
- Part 2: Web app - User Guide
- Part 3: Web app - Docker image option
- Part 4: Demo videos on YouTube
- Copy the connection string from your AI Foundry Project settings, as shown in the image below:
- Set the environment variable for the copied Project connection string:
- Windows: Add AZURE_FOUNDRY_PROJECT_CONNSTRING as a system variable with the copied string as its value;
- macOS/Linux: Set the variable in your terminal:
export AZURE_FOUNDRY_PROJECT_CONNSTRING="your_connection_string"
- Add other environment variables to enable specific UI Demo Kit capabilities:
| Environment Variable | Description | Scenario |
| --- | --- | --- |
| AZURE_FOUNDRY_GPT_MODEL | Deployment name of the Azure OpenAI GPT model | * |
| AZURE_FOUNDRY_BING_SEARCH | Connection name of the Bing Search resource, as described here | Grounding with Bing Search |
- Install the required Python packages using the pip command and the provided requirements.txt file:
pip install -r requirements.txt
Note
Local installation utilises the DefaultAzureCredential class. Depending on your environment, the UI Demo Kit will search for available Azure identities in the order described here.
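Before launching the app, the environment variables from Part 1 can be sanity-checked with a few lines of standard-library Python. This is a hypothetical helper, not part of the repo's code; the variable names match the table above:

```python
import os

# Variables required by the UI Demo Kit (see the table in Part 1);
# AZURE_FOUNDRY_BING_SEARCH is only needed for the Bing Search scenario.
REQUIRED = ("AZURE_FOUNDRY_PROJECT_CONNSTRING", "AZURE_FOUNDRY_GPT_MODEL")
OPTIONAL = ("AZURE_FOUNDRY_BING_SEARCH",)

def missing_settings() -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not os.getenv(name)]

if __name__ == "__main__":
    missing = missing_settings()
    if missing:
        print("Missing required settings: " + ", ".join(missing))
    else:
        print("Environment looks ready.")
```

Running this before `streamlit run` can save a round trip when a variable was set in a different shell session.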
- To launch the web app, run the following command from the root folder of this repo:
streamlit run AgentService_Streamlit_v1.py
- If everything was installed correctly as per Part 1's instructions, you should be able to access the demo solution's web page locally at http://localhost:8501.
- The UX is intentionally minimalistic. Here's how to use it:
- Choose a Capability: Begin by choosing the desired capability from the left-side navigation panel.
- Enter Your Prompt: Each scenario comes with a default prompt. You can modify it in the provided text area.
- Run the Agent: Click the "Run" button. The underlying Agent Service will utilise relevant tools, with the run status reflected in the progress bar.
- View the Output: Depending on the selected scenario, the UI Demo Kit may produce its output in text, code and image formats.
Note
As a Generative AI solution, the Agent Service is inherently non-deterministic. Therefore, it’s normal to receive slightly different outputs in the UI Demo Kit for the same prompts.
This repo includes a companion Docker image on GitHub Container Registry (GHCR), containing a pre-built web app with all dependencies. It allows you to launch the UI Demo Kit as a container without delving into the specifics of its code.
There are two ways to utilise the provided Docker image:
- Create a new Azure Web App and set the source container to:
    - _Image Source_: Other container registries
    - _Access type_: Public
    - _Registry server URL_: https://ghcr.io
    - _Image and tag_: lazauk/uidemokit-agentservice:latest
    - _Port_: 8501
- Enable Managed Identity for your newly created Web App:
- In Azure AI Foundry, assign your Web App's managed identity the Azure AI Developer role:
- Add new application settings for each environment variable described in Part 1 above.
- If you prefer to customise the web app, you can use the provided Docker image as a base for your own Dockerfile. Begin your Dockerfile with the following line:
FROM ghcr.io/lazauk/uidemokit-agentservice:latest
- The main script (AgentService_Streamlit_v1.py) is located in the /app working directory of the container.
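A derived Dockerfile might then look like the sketch below; the `COPY` line assumes you keep the original script name and replace it with your modified copy:

```dockerfile
# Sketch: extend the pre-built UI Demo Kit image.
FROM ghcr.io/lazauk/uidemokit-agentservice:latest

# The main script lives in /app, the container's working directory;
# overwrite it with your customised version (hypothetical local file).
COPY AgentService_Streamlit_v1.py /app/
```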
Warning
If deploying the Docker container locally or on another cloud platform, you will need to configure a mechanism to pass credentials for a service principal authorised to access your Azure AI Foundry resources. This is not required when deploying to Azure Web App with Managed Identity.
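One common option for such deployments (an assumption here, not a method prescribed by this repo) is to supply service-principal credentials through the environment variables that azure-identity's EnvironmentCredential reads; DefaultAzureCredential tries this credential as part of its chain. A minimal stdlib sketch to confirm they are configured:

```python
import os

# azure-identity's EnvironmentCredential (one of the credentials tried by
# DefaultAzureCredential) reads a service principal's details from these
# documented environment variables:
SP_VARS = ("AZURE_TENANT_ID", "AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET")

def service_principal_configured() -> bool:
    """True when all service-principal variables are set and non-empty."""
    return all(bool(os.getenv(v)) for v in SP_VARS)
```

When running the container locally, these can be passed with Docker's `-e` flags, e.g. `docker run -e AZURE_TENANT_ID=... -e AZURE_CLIENT_ID=... -e AZURE_CLIENT_SECRET=... -p 8501:8501 ghcr.io/lazauk/uidemokit-agentservice:latest` (adapt to your setup).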
This is a playlist of short videos demonstrating this solution in action.