I create a CRC32 hash of a file with `await createCRC32();`. After hashing many files, I get an "out of memory" error. I suspect a bug, because I found nothing in the documentation about destroying or freeing the hash instance.
I use the following code to compute the checksum of several thousand files:
```ts
async calculateChecksum(file: File /* UppyFile for reproduction */): Promise<string> {
  const CHUNK_SIZE = 10 * 1024 ** 2; // 10 MB
  const blob = file.data;
  let offset = 0;
  const crc32 = await createCRC32();
  crc32.init();
  while (offset < blob.size) {
    const chunk = blob.slice(offset, offset + CHUNK_SIZE);
    const buffer = await chunk.arrayBuffer();
    crc32.update(new Uint8Array(buffer));
    offset += CHUNK_SIZE;
  }
  return crc32.digest();
}
```
When I use a custom implementation of `createCRC32` instead, like this:
```ts
function createCRC32(): Promise<{
  init: () => void;
  update: (data: Uint8Array) => void;
  digest: () => string;
}> {
  const CRC32_TABLE = new Uint32Array(256);
  for (let i = 0; i < 256; i++) {
    let c = i;
    for (let j = 0; j < 8; j++) {
      c = c & 1 ? 0xEDB88320 ^ (c >>> 1) : c >>> 1;
    }
    CRC32_TABLE[i] = c;
  }
  let crc = 0xFFFFFFFF;
  return Promise.resolve({
    init: () => {
      crc = 0xFFFFFFFF;
    },
    update: (data: Uint8Array) => {
      for (let i = 0; i < data.length; i++) {
        const byte = data[i];
        crc = CRC32_TABLE[(crc ^ byte) & 0xFF] ^ (crc >>> 8);
      }
    },
    digest: () => {
      return ((crc ^ 0xFFFFFFFF) >>> 0).toString(16).padStart(8, '0');
    },
  });
}
```
everything works fine — memory usage stays stable even with thousands of files.
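As a sanity check, the table-driven CRC-32 above can be verified against the standard check value: the CRC-32 of the ASCII string `"123456789"` is `0xCBF43926`. Here is a compact, self-contained version of the same algorithm (the helper name `crc32` is just for illustration):

```ts
// Table-driven CRC-32 (reflected polynomial 0xEDB88320, init and final
// XOR 0xFFFFFFFF), same algorithm as the fallback implementation above.
function crc32(data: Uint8Array): string {
  const table = new Uint32Array(256);
  for (let i = 0; i < 256; i++) {
    let c = i;
    for (let j = 0; j < 8; j++) c = c & 1 ? 0xedb88320 ^ (c >>> 1) : c >>> 1;
    table[i] = c;
  }
  let crc = 0xffffffff;
  for (const byte of data) crc = table[(crc ^ byte) & 0xff] ^ (crc >>> 8);
  return ((crc ^ 0xffffffff) >>> 0).toString(16).padStart(8, "0");
}

console.log(crc32(new TextEncoder().encode("123456789"))); // "cbf43926"
```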
However, when I use `createCRC32` from the "hash-wasm" package:

```ts
import { createCRC32 } from "hash-wasm";
```

I quickly run into an "out of memory" error.
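A workaround I would expect to help (a sketch, not verified against hash-wasm internals): create the hasher once and reset it with `init()` for each file, rather than calling `createCRC32()` per file, since each call allocates a fresh WebAssembly instance that may not be reclaimed promptly. The `Hasher` interface and `checksumWith` helper below are hypothetical names that mirror the shape of the object hash-wasm returns:

```ts
// Minimal shape of the object returned by createCRC32() (init/update/digest).
interface Hasher {
  init(): void;
  update(data: Uint8Array): void;
  digest(): string;
}

const CHUNK_SIZE = 10 * 1024 ** 2; // 10 MB

// Compute a checksum with a caller-supplied, reusable hasher.
async function checksumWith(hasher: Hasher, blob: Blob): Promise<string> {
  hasher.init(); // reset state instead of allocating a new instance
  for (let offset = 0; offset < blob.size; offset += CHUNK_SIZE) {
    const chunk = blob.slice(offset, offset + CHUNK_SIZE);
    hasher.update(new Uint8Array(await chunk.arrayBuffer()));
  }
  return hasher.digest();
}
```

Usage with hash-wasm would then look like: `const hasher = await createCRC32();` once up front, followed by `await checksumWith(hasher, file.data)` per file, so only a single WASM instance is ever allocated.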