documentation/3-streams.md
26 additions & 12 deletions
````diff
@@ -124,27 +124,41 @@ An object representing how much data have been downloaded.
 An object representing how much data have been uploaded.
 
-**Note:**
-> - When a chunk is greater than `highWaterMark`, the progress won't be emitted. The body needs to be split into chunks.
+> **Note:** To get granular upload progress instead of just 0% and 100%, split the body into chunks using the [`chunk-data`](https://github.com/sindresorhus/chunk-data) package:
+> - `chunk(buffer, chunkSize)` - For buffers/typed arrays
+> - `chunkFromAsync(iterable, chunkSize)` - For iterables (Node.js streams are async iterables)
 
 ```js
 import got from 'got';
+import {chunk, chunkFromAsync} from 'chunk-data';
+import {Readable} from 'node:stream';
 
-const body = Buffer.alloc(1024 * 1024); // 1MB
+// Chunk a buffer
 
-function* chunkify(buffer, chunkSize = 64 * 1024) {
-	for (let pos = 0; pos < buffer.byteLength; pos += chunkSize) {
````
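As context for the change above, here is a minimal sketch of how a chunked body might be fed to a Got stream so that `uploadProgress` fires per chunk. It follows the piping pattern already shown in the existing stream docs; the URL, the 64 KiB chunk size, and the assumption that `chunk()` returns a sync iterable of buffer slices are illustrative, not taken from the updated documentation:

```js
import {pipeline} from 'node:stream/promises';
import {Readable, PassThrough} from 'node:stream';
import got from 'got';
import {chunk} from 'chunk-data';

const body = Buffer.alloc(1024 * 1024); // 1MB of zeroes, purely for demonstration

const uploadStream = got.stream.post('https://httpbin.org/anything'); // Placeholder URL

uploadStream.on('uploadProgress', progress => {
	// Fires once per written chunk now, rather than only at 0% and 100%.
	// `total` (and a meaningful `percent`) may be undefined when the body is
	// piped without a known `content-length`.
	console.log(progress);
});

await pipeline(
	Readable.from(chunk(body, 64 * 1024)), // Assumed: `chunk()` yields buffer slices as a sync iterable
	uploadStream,
	new PassThrough() // Consume the response so the request can finish
);
```

For a body that is already a Node.js stream (an async iterable), `chunkFromAsync(stream, chunkSize)` would presumably take the place of the `chunk()` call above.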