Bug Report: @fastify/compress causes premature close for large payloads in Fastify v5 #350

AniketUndalekar1997 opened this issue Mar 12, 2025 · 21 comments


@AniketUndalekar1997

AniketUndalekar1997 commented Mar 12, 2025

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

5.2.0

Plugin version

8.0.3

Node.js version

20.8.1

Operating system

macOS

Operating system version

macOS Sonoma (14.1.1)

Description

After upgrading Fastify from v3 to v5, we encountered an issue where the @fastify/compress plugin prematurely closes the response when handling large payloads. This results in a "premature close" error before the response is fully sent to the client.

error:
{"type":"Error","message":"premature close","stack":"Error: premature close\n at onclosenexttick (/Users/4911462/Documents/testBmService/baymax-service/node_modules/end-of-stream/index.js:54:86)\n at process.processTicksAndRejections (node:internal/process/task_queues:77:11)"},"msg":"premature close"}

Actual Behavior
The response is never fully sent, and Fastify logs a "premature close" error.

Additional Context
Fastify version: 5.2.0
@fastify/compress version: 8.0.2/8.0.1
Node.js version: 20.x
Deployed environment: Kubernetes pods (though reproducible locally)

Link to code that reproduces the bug

Example: myHandler.js
const myHandler = {
    async getAllLiveData(request, response) {
        // ...rest of the code goes here
        const res = await serviceLayer.getDataMethod(this, );
        response.header('Cache-Control', `s-maxage=------`);
        response.type('application/json');
        if (res) {
            res.config = getConfig();
            res.config.enableSDKLogs = shouldAllowSDKLogging(vault.conf, requestDetails);
            setNodeCacheIfEnabled(this, res, request, 'v2Cache');
            response.compress(res);
        } else {
            response.status(204);
        }
    },
};

Compress plugin initialization:
const fp = require('fastify-plugin');
const compress = require('@fastify/compress');

module.exports = fp((fastify) => {
    fastify.register(compress, { encodings: ['gzip'], global: false });
});

Potential Workarounds
We temporarily removed @fastify/compress and directly returned the response. However, this is not ideal for production as compression is essential for reducing payload sizes.

Open Questions

  • Is this a known issue in Fastify v5?
  • Are there alternative configurations for handling large payloads without hitting this limitation?
  • Should we handle compression manually instead of using @fastify/compress?


Expected Behavior

Fastify should successfully compress and send large responses without prematurely closing the connection.

@mcollina
Member

Thanks for reporting!

Can you provide steps to reproduce? We often need a reproducible example, e.g. some code that allows someone else to recreate your problem just by copying and pasting it. If it involves more than a couple of different files, create a new repository on GitHub and add a link to it.

@AniketUndalekar1997
Author

AniketUndalekar1997 commented Mar 12, 2025

@mcollina Sure! Since our code is private, I’ll share a code snippet to reproduce the issue.

Code to Reproduce the Fastify Compression Error

// yourRoute.js
async function routes(fastify) {
    fastify.route({
        method: 'GET',
        url: '/api/endpoint/to/reproduce/compression/error',
        preHandler: [],
        handler: yourHandler.reproduceFastifyCompressErrorHandler,
        schema: {
            // ...add your schema here
        },
    });
}
// yourHandler.js
const yourHandler = {
    async reproduceFastifyCompressErrorHandler(request, response) {
        const res = mockedResponse;
        response.header('Cache-Control', `s-maxage=-------`);
        response.type('application/json');
        if (res) {
            response.compress(res);
        } else {
            response.status(204);
        }
    },
};

To reproduce, use a large mocked response file (>150KB). You can download an example here: https://chatgpt.com/share/67d1ea17-d1cc-800d-8e84-60ff36dd4247
Alternatively, you can generate a large JSON response on your own.

Compression Plugin Configuration (compress.js):

const fp = require('fastify-plugin');
const compress = require('@fastify/compress');

module.exports = fp((fastify) => {
    fastify.register(compress, { encodings: ['gzip'], global: false });
});

@mcollina
Member

Sorry, that's exactly what a reproducible example is not. Please combine all of the above into something I can execute to reproduce the problem, and verify that it actually shows the problem.

@AniketUndalekar1997
Author

Hey @mcollina
I've created a repository with:
https://github.com/AniketUndalekar1997/fastify-compression-issue

  • A large JSON file where the response is failing.
  • A sample JSON file where the response works as expected.

I've added instructions in the README to run it locally. Let me know if you need anything else.
Hope this is enough to address the issue.
Happy to contribute further if needed!

@mcollina
Member

The README seems empty

@AniketUndalekar1997
Author

@Eomm
I followed the above solution as per your suggestion but still see the same error.
Did you test it properly after adding the return statement?

@mcollina
I've added simple steps to run the service locally.
The required Node version is 20.8.1 or above.
Commands to run:

  • yarn
  • yarn start

@mcollina
Member

How do you trigger the error?

@AniketUndalekar1997
Author

@mcollina
Run the service locally and hit this endpoint via Postman: /api/reproduce/compression-error

I'm still encountering the "premature close" error for large response payloads, while it works fine for smaller ones.

I'm using Node v20.8.1.

Attaching a screenshot for reference. Let me know if you're able to reproduce the issue on your machine.

(screenshots attached)

@mcollina
Member

what headers are you setting?

@AniketUndalekar1997
Author

AniketUndalekar1997 commented Mar 22, 2025

@mcollina
(screenshot attached)


@mcollina
Member

Confirmed, we can use the following to reproduce:

curl -v -H 'Accept-Encoding: gzip' http://127.0.0.1:3000/api/reproduce/compression-error > /dev/null

@mcollina
Member

mcollina commented Mar 24, 2025

The following will allow you to bypass the problem because you are using the global handler setup:

diff --git a/handler.js b/handler.js
index 38e05b5..d71e71c 100644
--- a/handler.js
+++ b/handler.js
@@ -13,7 +13,7 @@ const yourHandler = {
         reply.type('application/json');
 
         if (res) {
-            return reply.compress(res); // Use compression
+            return res; // Use compression
         } else {
             return reply.status(204).send();
         }

I don't really know why calling compress directly is not working; I'm a bit puzzled and will need to investigate further. Overall, I think this module is overly complex and needs a cleanup anyway; the bug is likely lurking somewhere in there.

@AniketUndalekar1997
Author

@mcollina
Thanks! For the time being, we'll follow the above approach.

@AniketUndalekar1997
Author

AniketUndalekar1997 commented Apr 7, 2025

@mcollina @Eomm
Can you try to solve the above issue ASAP? We have some critical services running on Fastify, and we took on this migration task because we have deadlines to resolve all the major Snyk vulnerabilities.
Also, the above approach didn't work well; we're still seeing a few errors with the compression logic.

@mcollina
Member

mcollina commented Apr 7, 2025

We would 100% welcome a PR for this contribution. I cannot take this on; I am currently focusing on supporting Platformatic customers.

@simoneb
Contributor

simoneb commented Apr 7, 2025

> @mcollina @Eomm can you guys try to solve the above issue ASAP, we have some critical services running on Fastify and We took this migration task as we've deadlines to resolve all the major Snyk vulnerabilities. Also above approach didn't worked well, we're still seeing few errors with compression logic.

Can you please share what didn't work in the previously suggested workaround?

@jsumners
Member

jsumners commented Apr 7, 2025

> @mcollina @Eomm can you guys try to solve the above issue ASAP, we have some critical services running on Fastify and We took this migration task as we've deadlines to resolve all the major Snyk vulnerabilities. Also above approach didn't worked well, we're still seeing few errors with compression logic.

This is an open source, community-supported project. If there is something you feel you need solved on a schedule you are able to influence, you have two options:

  1. Contribute the work yourself.
  2. Hire a consultancy to do the work on your behalf. I believe Nearform would be able to accommodate you.

Otherwise, you will find that demanding people give up their personal time to solve an issue for you will not be effective.

@simoneb
Contributor

simoneb commented Apr 7, 2025

As a workaround, you may want to try to do the compression manually:

diff --git a/handler.js b/handler.js
index 38e05b5..b42035b 100644
--- a/handler.js
+++ b/handler.js
@@ -1,5 +1,6 @@
 const fs = require('fs');
 const path = require('path');
+const zlib = require('zlib');

 const yourHandler = {
     async reproduceFastifyCompressErrorHandler(request, reply) {
@@ -12,11 +13,8 @@ const yourHandler = {
         reply.header('Cache-Control', 's-maxage=600');
         reply.type('application/json');

-        if (res) {
-            return reply.compress(res); // Use compression
-        } else {
-            return reply.status(204).send();
-        }
+        const compressed = zlib.gzipSync(res);
+        reply.header('Content-Encoding', 'gzip').send(compressed);
     },
 };

@AniketUndalekar1997
Author

AniketUndalekar1997 commented Apr 7, 2025

@simoneb
Thanks for suggesting manual compression. I tried it, but sadly I was getting this error when hitting the endpoint multiple times:

"msg":"Reply was already sent, did you forget to "return reply"

Sample code is this

Handler.js
const yourHandler = {
    async reproduceFastifyCompressErrorHandler(request, reply) {
        // ...other code goes here...
        const compressed = zlib.gzipSync(JSON.stringify(liveFFs));
        return reply.header('Content-Encoding', 'gzip').send(compressed);
    },
};

@simoneb
Contributor

simoneb commented Apr 8, 2025

@AniketUndalekar1997 everything worked fine here; see the autocannon run below against the app. Maybe there's something else in your app causing the misbehavior.

> npx autocannon http://127.0.0.1:3000/api/reproduce/compression-error
Running 10s test @ http://127.0.0.1:3000/api/reproduce/compression-error
10 connections


┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼───────┤
│ Latency │ 1 ms │ 5 ms │ 13 ms │ 16 ms │ 5.67 ms │ 2.92 ms │ 32 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev  │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Req/Sec   │ 1457    │ 1457    │ 1564    │ 1986    │ 1620,7  │ 149,82 │ 1457    │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Bytes/Sec │ 1.06 MB │ 1.06 MB │ 1.14 MB │ 1.45 MB │ 1.18 MB │ 109 kB │ 1.06 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 10

16k requests in 10.02s, 11.8 MB read
