Describe the bug
When the `StartTranscriptionByTimer` sample (from `samples/ingestion/ingestion-client/StartTranscriptionByTimer`) has been running on Azure Functions for some time, the following error occurs:
Sample Function Logs
7/8/2025, 10:56:00 AM Information Executing 'Functions.StartTranscriptionByTimer' (Reason='Timer fired at 2025-07-08T10:56:00.0067277+00:00', Id=eed2a349-6156-411d-a066-94670e1547a7)
7/8/2025, 10:56:00 AM Information Trigger Details: ScheduleStatus: {"Last":"2025-07-08T10:54:00.0078379+00:00","Next":"2025-07-08T10:56:00+00:00","LastUpdated":"2025-07-08T10:54:00.0078379+00:00"}
7/8/2025, 10:56:00 AM Information C# Isolated Timer trigger function v4 executed at: 7/8/2025 10:56:00 AM. Next occurrence on 7/8/2025 10:56:00 AM.
7/8/2025, 10:56:00 AM Information Pulling messages from queue...
7/8/2025, 10:56:00 AM Information start_transcription_queue-52d611e6-0928-4d63-8f6e-cb1ab489ae40: ReceiveBatchAsync start. MessageCount = 1000
7/8/2025, 10:56:00 AM Information Creating receive link for Identifier: start_transcription_queue-52d611e6-0928-4d63-8f6e-cb1ab489ae40.
7/8/2025, 10:56:00 AM Error start_transcription_queue-52d611e6-0928-4d63-8f6e-cb1ab489ae40: ReceiveBatchAsync Exception: Azure.Messaging.ServiceBus.ServiceBusException: Cannot allocate more handles. The maximum number of handles is 4999. (QuotaExceeded)...
7/8/2025, 10:56:00 AM Error Executed 'Functions.StartTranscriptionByTimer' (Failed, Id=eed2a349-6156-411d-a066-94670e1547a7, Duration=12ms)
7/8/2025, 10:56:00 AM Error Result: Failure Exception: Azure.Messaging.ServiceBus.ServiceBusException: Cannot allocate more handles. The maximum number of handles is 4999. (QuotaExceeded)...
This error causes the function to fail repeatedly and blocks further processing until the function app is restarted.
To Reproduce
Steps to reproduce the behavior:
- Deploy the sample to Azure Functions (.NET isolated process).
- Set a timer trigger to run every 2–3 minutes (see the schedule sketch after this list).
- Allow the function to process messages from Service Bus continuously.
- After several hours (or days, depending on message volume), the function starts to throw `QuotaExceeded` exceptions and stops being able to receive messages.
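For reference, the step-2 timer schedule corresponds to an NCRONTAB expression on the timer trigger; a minimal sketch for the isolated worker (schedule value and signature are illustrative, not copied from the sample):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;

public class StartTranscriptionByTimer
{
    // "0 */2 * * * *" fires every 2 minutes; "0 */3 * * * *" would fire every 3 minutes.
    [Function("StartTranscriptionByTimer")]
    public Task Run([TimerTrigger("0 */2 * * * *")] TimerInfo timerInfo)
    {
        // Message pulling/processing happens here in the real sample.
        return Task.CompletedTask;
    }
}
```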
Expected behavior
The function should be able to continuously process messages without exceeding the Service Bus handle quota, as long as message volume is within expected limits.
Actual behavior
- The function leaks Service Bus handles/connections, eventually exhausting the allowed quota.
- The only workaround is to restart the Function App, which temporarily resolves the issue.
- Metrics in Azure Portal (Active Connections) confirm that the number of connections keeps increasing with each execution.
Root Cause (Analysis)
- The `ServiceBusReceiver` instance is created in the constructor and is never disposed after use (see the sketch below).
- In the Azure Functions isolated worker, new class instances may be created frequently, causing multiple `ServiceBusReceiver`/connection objects to accumulate without cleanup.
- Eventually, this leads to `QuotaExceeded` for Service Bus handles.
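To make the leak concrete, here is a simplified sketch of the pattern described above (class layout, names, and the queue name are illustrative, not copied verbatim from the sample):

```csharp
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;

public class StartTranscriptionByTimer
{
    private readonly ServiceBusReceiver messageReceiver;

    public StartTranscriptionByTimer(ServiceBusClient serviceBusClient)
    {
        // Every new instance of this class opens another AMQP receive link
        // (a Service Bus handle)...
        this.messageReceiver = serviceBusClient.CreateReceiver("start_transcription_queue");
        // ...and nothing ever calls DisposeAsync()/CloseAsync() on it.
    }

    [Function("StartTranscriptionByTimer")]
    public async Task Run([TimerTrigger("0 */2 * * * *")] TimerInfo timerInfo)
    {
        // If the class is constructed per invocation, links accumulate with every
        // execution until the 4999-handle limit is reached (QuotaExceeded).
        var messages = await this.messageReceiver.ReceiveMessagesAsync(maxMessages: 1000);
        // ... process messages ...
    }
}
```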
Proposed Solution
- Refactor the sample code to ensure the `ServiceBusReceiver` (and related Service Bus client objects) are disposed after use.
- Use `await using` or explicitly call `DisposeAsync()` after message processing completes in each execution.
- Consider creating the receiver within the function method instead of the class constructor, unless its lifecycle can be reliably managed (see the sketch below).
Additional context
- See this Azure troubleshooting guide
- Metrics screenshot: Service Bus metrics (image not included)
Environment:
- Azure Functions .NET Isolated Worker deployment
- Sample: `StartTranscriptionByTimer`
- Service Bus SDK: `Azure.Messaging.ServiceBus`