
reduce memory consumption #47

Open
wants to merge 13 commits into main

Conversation

romainmenke
Member

@romainmenke romainmenke commented Jun 30, 2024

See: #37

The current implementation of reading files and creating polyfill bundles doesn't really make sense.

It is both streaming and cached, and these are not properties you can or should combine.
Either you make something streaming because you are concerned about memory usage, or you keep a cache and serve directly from memory.

So the current implementation is needlessly complex while also being slow and using too much memory.

The slowness mostly comes from frequent cache misses and from an implementation that is far more complex than what is actually going on.


This change:

  • makes it a real streaming implementation
  • reduces the number of filesystem operations by combining the meta.json files for all polyfills and lazily loading the result into memory
  • reduces the number of async operations, especially waiting on reads that are actually synchronous
  • pre-computes the dependency graph during the build process and stores the toposort order in meta.json
  • removes as many needless conversions from streams to strings and back as possible
  • makes the fs functions user configurable
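
The build-time toposort mentioned above can be sketched roughly like this: a depth-first traversal run once during the build, so the runtime never has to re-walk the dependency graph per request. The function name, graph shape, and example features are illustrative, not the PR's actual code:

```javascript
// Build-time sketch (illustrative, not the PR's actual implementation):
// compute a topological order for polyfill dependencies once, then the
// runtime can bundle features in this order without any graph walking.

function toposort(graph) {
	// graph: Map<featureName, string[]> of direct dependencies
	const order = [];
	const state = new Map(); // 0 = unvisited, 1 = visiting, 2 = done

	function visit(name) {
		const s = state.get(name) || 0;
		if (s === 2) return;
		if (s === 1) throw new Error(`dependency cycle at ${name}`);
		state.set(name, 1);
		for (const dep of graph.get(name) || []) visit(dep);
		state.set(name, 2);
		order.push(name); // dependencies always come before dependents
	}

	for (const name of graph.keys()) visit(name);
	return order;
}

// Example: Symbol.iterator depends on Symbol
const order = toposort(new Map([
	['Symbol.iterator', ['Symbol']],
	['Symbol', []],
]));
// order: ['Symbol', 'Symbol.iterator']
```

Storing this order in the combined meta.json turns bundling into a simple ordered lookup at request time.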

Configurable fs functions:

```javascript
/**
 * Set the file system implementation to use.
 * The default implementation uses the Node.js `fs` module.
 *
 * @param {Function} readFile - A function that reads a file and returns a promise.
 * @param {Function} createReadStream - A function that creates a readable stream.
 */
function setFs(readFile, createReadStream) {
	fsImpl = {
		readFile,
		createReadStream
	};
}
```

This is an escape hatch for users who want caching or other optimizations.

I don't want to add an explicit cache mechanism, as I still believe that an in-memory cache implemented in JavaScript is dangerous when running this as a hosted service.

It is too easy to request different polyfills one by one, thrashing both the LRU cache layer and the garbage collector, eventually starving the host of both memory and CPU.
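
To illustrate the thrashing scenario with a toy LRU (illustrative code, not part of this PR or the removed cache): a client cycling through distinct bundle URLs gets a 0% hit rate while the service still pays for every eviction and every short-lived allocation:

```javascript
// Toy LRU built on Map's insertion-order iteration (illustrative only).
class LRU {
	constructor(capacity) {
		this.capacity = capacity;
		this.map = new Map();
	}
	get(key) {
		if (!this.map.has(key)) return undefined;
		const value = this.map.get(key);
		this.map.delete(key);
		this.map.set(key, value); // move to most-recently-used position
		return value;
	}
	set(key, value) {
		if (this.map.has(key)) this.map.delete(key);
		this.map.set(key, value);
		if (this.map.size > this.capacity) {
			// evict least-recently-used (first key in insertion order)
			this.map.delete(this.map.keys().next().value);
		}
	}
}

// 1000 distinct bundle requests against a 100-entry cache: every lookup
// misses, every insert evicts, nothing is ever served from memory, and
// the garbage collector has to clean up each evicted buffer.
const cache = new LRU(100);
let hits = 0;
for (let i = 0; i < 1000; i++) {
	if (cache.get(`bundle-${i}`) !== undefined) hits++;
	cache.set(`bundle-${i}`, Buffer.alloc(1024));
}
// hits === 0
```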

Either you put an edge cache in front of the hosted service (making an in-memory cache less relevant), or you use something like SQLite to optimize filesystem reads.


I don't consider the removal of the LRU cache a breaking change because this was an internal implementation detail. But I am open to other opinions here :)

@romainmenke romainmenke marked this pull request as draft June 30, 2024 18:17
Comment on lines -117 to -130
```javascript
describe('sources.listPolyfills()', () => {
	it('returns a promise which resolves with an array containing names for each polyfilled feature', () => {
		const sources = require('../../../lib/sources');
		return sources.listPolyfills().then((result) => {
			assert.ok(Array.isArray(result));
			assert.ok(result.length > 0);
			assert.deepStrictEqual(result[0], 'AbortController');
			assert.equal(
				result.filter(x => (typeof x === 'string')).length,
				result.length
			);
		});
	});
});
```
Member Author


This test existed twice.

@romainmenke romainmenke marked this pull request as ready for review July 1, 2024 21:48