FAQ
Promise is an ES6 addition. Because fflate aims to support ancient browsers without the bloat of a polyfill, Promises weren't an option. However, if you like using them or want a clean async/await experience, it's trivially easy to extend fflate with Promise or thenable support yourself:
```js
// You can use this or Node.js's util.promisify
const promisify = (func) => {
  return (...args) => {
    return new Promise((resolve, reject) => {
      func(...args, (err, res) => err
        ? reject(err)
        : resolve(res)
      );
    });
  };
};
```
```js
import { zip as zipCb, strToU8 } from 'fflate';

const zip = promisify(zipCb);

async function makeZip() {
  const toZip = {
    dir: {
      'hello.txt': strToU8('Hello world!')
    },
    'example.json': strToU8(JSON.stringify({
      wasThis: 'a good example?'
    }))
  };
  // zipped is a Uint8Array
  const zipped = await zip(toZip);
  return zipped;
}
```
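As a quick sanity check, the same promisify helper works with any error-first callback function, so you can verify the pattern without fflate at all. The `doubleCb` function below is purely illustrative, not part of any real API:

```javascript
// Same promisify helper as above, in compact form
const promisify = (func) => (...args) =>
  new Promise((resolve, reject) =>
    func(...args, (err, res) => (err ? reject(err) : resolve(res)))
  );

// A toy error-first callback API, shaped like fflate's zip/gzip callbacks
const doubleCb = (n, cb) => setTimeout(() => cb(null, n * 2), 0);
const double = promisify(doubleCb);

double(21).then((v) => console.log(v)); // prints 42
```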
fflate accepts only Uint8Array objects, but often you'll end up with a FileList from a file upload.
```js
// This is a file input DOM object. If you're using a
// framework like React, this may come from, for
// example, a ref object created by `React.useRef`.
const fileInput = document.querySelector(
  'input[type="file"]'
);

// This is a FileList object
const uploadedFiles = fileInput.files;

// This is a File array
const files = Array.prototype.slice.call(
  uploadedFiles
);
```
In order to maximize performance, browsers do not make it possible to immediately access the raw file data from JavaScript. You must read it first either with file.arrayBuffer() (only supported by modern browsers and not automatically polyfilled) or FileReader. If you can use arrayBuffer() (i.e. you don't need to support older browsers), your life is easy:
```js
const myFile = files[0];
const u8File = new Uint8Array(await myFile.arrayBuffer());

// Now the single file compression methods work
fflate.gzip(u8File, (err, dat) => ...);

// Alternatively, to create a zippable object:
const zippable = {};
for (const file of files) {
  zippable[
    // If you want the ZIP to include directories, use
    // file.webkitRelativePath
    file.webkitRelativePath || file.name
  ] = new Uint8Array(await file.arrayBuffer());
}
fflate.zip(zippable, (err, dat) => ...);
```
You can reduce memory usage and handle multi-gigabyte files by using the streaming API:
```js
const myFile = files[0];
const fileStream = myFile.stream();
const reader = fileStream.getReader();

// This can be any fflate stream, just tweak the ReadableStream logic
const fflateStream = new fflate.AsyncGzip();

const outStream = new ReadableStream({
  start(controller) {
    fflateStream.ondata = (err, chunk, final) => {
      if (err) controller.error(err);
      else {
        controller.enqueue(chunk);
        if (final) controller.close();
      }
    };
  }
});

const read = async () => {
  const { done, value } = await reader.read();
  if (done) {
    // An empty final chunk flushes the stream
    fflateStream.push(new Uint8Array(), true);
    return;
  }
  fflateStream.push(value);
  read();
};
read();
```
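To actually consume the compressed output: outStream is a standard ReadableStream, so in runtimes that support it (modern browsers, Node 18+) you can hand it to the Response constructor to collect everything into a Blob. This is a sketch; the `collect` name is an assumption, not part of fflate:

```javascript
// Illustrative helper (the name `collect` is an assumption, not a
// fflate export): gather a ReadableStream of Uint8Array chunks into
// a single Blob via the standard Response constructor.
async function collect(stream) {
  return await new Response(stream).blob();
}

// Usage with the example above:
// const compressedBlob = await collect(outStream);
```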
If you want to use FileReader but don't know how, take a look at how this was done in the demo or this discussion.
What about the Compression Streams API?
This API is only available in Chromium-based web browsers as of late 2022. However, I've made a polyfill based on fflate if you'd like to use it anyway.
It is important to note that CompressionStream and DecompressionStream are slightly slower than fflate in many cases due to overheads caused by Web Streams. You can avoid these overheads by using a native Web Stream source (e.g. File.prototype.stream()) instead of manually pushing chunks. Even in this case, the fflate-based polyfill is nearly as fast as the native implementation.
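For example, here's a minimal sketch (assuming a runtime with the Compression Streams API, i.e. a Chromium-based browser or Node 18+) that pipes a Blob's native stream straight through CompressionStream, so no manual chunk pushing is involved; the `gzipBlob` name is purely illustrative:

```javascript
// Sketch: gzip a Blob/File using the native Compression Streams API.
// `gzipBlob` is an illustrative name, not a real fflate export.
async function gzipBlob(blob) {
  // blob.stream() is a native Web Stream source, so the chunking
  // overhead described above is avoided
  const compressed = blob
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  return await new Response(compressed).blob();
}
```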
If you see require('worker_threads') in any code bundled for the browser, your bundler probably didn't resolve the browser field of package.json. You can enable it (e.g. for Rollup) or you can manually import the ESM version at fflate/browser. If for some reason you're getting the browser variant and want to force the Node version, import fflate/node. The API for both imports is identical to the original API described in the docs.
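For Rollup specifically, one way to enable this is a config along these lines (a sketch assuming @rollup/plugin-node-resolve, whose `browser` option makes the bundler honor the browser field; the input/output paths are placeholders):

```js
// rollup.config.js — sketch; paths are placeholders
import resolve from '@rollup/plugin-node-resolve';

export default {
  input: 'src/index.js',
  output: { file: 'dist/bundle.js', format: 'esm' },
  // browser: true resolves the "browser" field of package.json,
  // so fflate's browser build is picked instead of the Node one
  plugins: [resolve({ browser: true })]
};
```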
If you don't understand something here, don't hesitate to open an issue, create a discussion, or email me directly.