Cancelling and exception handling async iteration of ReadableStreams #1255
I'm looking at this for the MDN documentation (tracked in mdn/content#23678).

Reading a stream looks pretty straightforward; you get a stream and read it in chunks using `for await (const chunk of mystream) {}`. My questions:

1. What is the "right way" to cancel a stream read during async iteration from a button press? If you're writing your own underlying source, you can add a listener that closes the stream on button click using `controller.close()`. But if you're using `fetch()` you have no control over the underlying source - you get a `ReadableStream` back from the response. If you test a flag inside the `for` loop you could use it to `return`, cancelling the operation. But this would have to await at least one more chunk, which could in theory take some time to arrive. What is the "general recommendation"? Is it that you abort the underlying source (if possible) and, if not, perhaps you wrap your stream in another custom stream?
2. How are you supposed to handle errors from the source - e.g. a TypeError or a network error or something? I tried putting try/catch around the `logChunks()` call and various other places, but I don't seem to be able to catch them.
3. What is the recommended way to handle the case of a browser that does not support this feature? Is there a polyfill on the Internet that we should point users to?
4. Tracking bugs seem to indicate this is not yet in Safari or Chrome. Do you happen to know if Deno/Node.js support this, and if so, whether it is compatible with the streams spec?

Comments
The easiest way would be to exit the loop early when you want to stop. You can use an `AbortController` to trigger that from a button press:

```js
const controller = new AbortController();
button.addEventListener('click', () => controller.abort());

logChunks(mystream, { signal: controller.signal });

async function logChunks(readableXX, { signal }) {
  let bytes = 0;
  for await (const chunk of readableXX) {
    if (signal.aborted) throw signal.reason;
    bytes += chunk.length;
    logConsumer(`Chunk: ${chunk}. Read ${bytes} characters.`);
  }
}
```
Hmm, that's odd. An error should cause the `for await` loop to throw. Could you provide a minimal reproduction case for this?
Yes, there are web-streams-polyfill and sd-streams.
Indeed, there are no browsers that ship an implementation yet. But both Node.js and Deno already have full support and are fully compliant. 🙂
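For illustration, a minimal sketch of that support (assumed environment: recent Node.js or Deno, where `ReadableStream` is a global and the file runs as an ES module so top-level `await` works):

```js
// Async iteration over a web ReadableStream, as shipped in Node.js and Deno.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

for await (const chunk of stream) {
  console.log(chunk); // "hello", then "world"
}
```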
@MattiasBuelens Thank you! For 1., the problem with using `AbortController` is that it is the same as the suggestion I made in the bullet. So yes, you can throw in the loop on abort or some other signal, or you can just call `return` to silently exit. The problem is that you only get to do this after a new chunk of data has arrived - right?

I was thinking it would be better to call your `fetch()` with the `AbortController` - that would abort the fetch that is supplying your stream, propagating the abort reason from the underlying source. So as a general recommendation I was thinking "abort or cancel the underlying source if mechanisms exist; otherwise you will have to wait for the next loop iteration and call `break`/`return`" (as you indicate).

For 2, I will get back to you on Friday (on another job today).
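A sketch of that approach (the URL and chunk handling are illustrative; this assumes an environment where `response.body` is async-iterable): aborting the fetch tears down the underlying source, so the pending read rejects immediately instead of waiting for the next chunk.

```js
const controller = new AbortController();
button.addEventListener("click", () => controller.abort());

try {
  // Pass the signal to fetch() so aborting cancels the underlying source.
  const response = await fetch("/data", { signal: controller.signal });
  for await (const chunk of response.body) {
    console.log(`Read ${chunk.length} bytes`);
  }
} catch (err) {
  // On abort, the in-flight read rejects with an "AbortError".
  if (err.name === "AbortError") console.log("Download cancelled");
  else throw err;
}
```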
That's a good point. As currently specified, all calls to `next()` and `return()` on the async iterator run one at a time: `return()` waits for any pending `next()` to settle before it cancels the stream. So unfortunately, it's not possible to cancel a pending read when async-iterating a `ReadableStream`.
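A sketch of what that means in practice (assuming an environment where `values()` is available, e.g. Node.js; the never-settling `pull()` stands in for a slow source):

```js
// A source whose pull() never settles, so the first read stays pending forever.
const stream = new ReadableStream({
  pull() {
    return new Promise(() => {}); // never resolves
  },
});

const it = stream.values();
const read = it.next(); // pending read, waiting on pull()
it.return();            // queued behind the pending next(), per the current spec
// Neither promise ever settles, and the stream is never cancelled.
```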
Indeed, if you can pass an `AbortSignal` to the underlying source (as with `fetch()`), that's the better option.

However, that does make it harder to compose streams. If you have a pipe chain, ideally you want to consume and cancel it from the end of the chain, and then have it propagate up to the start. But now you have to also keep track of an `AbortController` for the start of the chain.

I'm wondering if we should change the specification for this. Should `return()` cancel the stream immediately, even while a read is pending? Or maybe we could add a `{ signal }` option to `values()`:

```js
async function logChunks(readableXX, { signal }) {
  for await (const chunk of readableXX.values({ signal })) { // <<<
    bytes += chunk.length;
    logConsumer(`Chunk: ${chunk}. Read ${bytes} characters.`);
  }
}
```
@domenic What are your thoughts on this? Should it be possible to cancel the stream immediately through its async iterator, and have those pending reads become rejected? (Or I guess they should actually become resolved...) Was there a reason why we needed all `next()` and `return()` calls to run sequentially?
Thanks @MattiasBuelens - it is really helpful that you have confirmed the current behaviour. I'm interested to see the further result of this discussion on whether the API needs to change. W.r.t. the other part of my post about catching errors, you are right that putting the try/catch around the `for await` loop ought to catch them - I'll put together a minimal reproduction case.
I am pretty sure we designed Web IDL's async iterator machinery this way, to make it follow JavaScript async generators. That is, if you call `return()` on an async generator while a `next()` is still pending, the generator finishes the pending `next()` first and only then processes the `return()`.
The async iterator protocol itself would allow us to do something different here, but I'm unsure if we should depart from async generator behavior...
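A sketch of those generator semantics (the 5-second timer just stands in for a slow underlying source):

```js
async function* gen() {
  try {
    await new Promise((resolve) => setTimeout(resolve, 5000)); // slow "pull"
    yield 1;
  } finally {
    console.log("cleanup runs only after the pending next() settles");
  }
}

const it = gen();
const n = it.next();   // suspends the generator on the 5s await
const r = it.return(); // queued: does not interrupt the await
// ~5s later: n -> { value: 1, done: false },
// then r -> { value: undefined, done: true }
```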
This might be a silly question, but are you `await`-ing `logChunks()`?

```js
try {
  await logChunks();
} catch (error) {
  console.error("Oh no!");
}
```

Because if you don't `await` it, any error from the stream rejects the returned promise asynchronously, after the `try/catch` block has already exited.
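For contrast, a sketch of the failing pattern (assuming this is what the original code did):

```js
try {
  logChunks(); // no await: the promise is dropped
} catch (error) {
  // Never reached for errors inside logChunks(); they surface later
  // as an unhandled promise rejection instead.
  console.error("Oh no!");
}
```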
Right, of course, I should have compared with async generators. 😅 I suppose we can't depart from those semantics. How do we feel about adding a `{ signal }` option to `values()` then?

However, I'm a bit worried about how that would compose. Would the proposed `ReadableStream.from()` also need a way to pass the controller along, so that cancelling the wrapping stream aborts the inner iteration?

```js
const readable1 = new ReadableStream({ /* ... */ });
const controller = new AbortController();
const iterator = readable1.values({ signal: controller.signal });
const readable2 = ReadableStream.from(iterator, { controller });
// readable2 behaves like readable1, and cancelling readable2 immediately cancels readable1
```
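For context: the one-argument `ReadableStream.from(asyncIterable)` has since been added to the spec and ships in Node.js, Deno, and Firefox; the `{ controller }` second argument above remains hypothetical. A minimal sketch of the standard form:

```js
async function* numbers() {
  yield 1;
  yield 2;
  yield 3;
}

// Standard one-argument form: wraps any async iterable in a ReadableStream.
const stream = ReadableStream.from(numbers());
for await (const n of stream) {
  console.log(n); // 1, 2, 3
}
```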
@MattiasBuelens I'm an idiot - as you say, not awaiting the `logChunks()` call was the problem. The rest of the question I presume is for @domenic.
I mean, we definitely can. The async iterator protocol is very bare-bones, with lots of details left up to the specific iterator. The async generator case is one specific instantiation of that protocol, and perhaps is the one that has behavior people expect / the language designers intended? So I think it's a good starting place, and we should be hesitant to depart from it. But, we could depart, if we think there's a good reason and it wouldn't surprise people too much... Regarding aborting in-flight operations, note that writable streams already expose a signal on their controller for this; ...
That's true, but I've just had a chance to refactor some code that was using async iterables and generators to use streams, and I don't think it makes sense to add an abort signal to stream constructors. Calling `cancel()` on the stream already gives the underlying source a chance to clean up.

However, it might be common enough that cancelling a readable stream should abort an in-flight `pull()` that providing a signal on the controller would be a convenience and would improve symmetry with writable streams. There are other asymmetries between readable and writable streams, the nuance of which I might not fully appreciate yet, and I don't know if the signal is part of that.

If we did want to add a signal to the readable controller, does it then become important that the signal be a way to abort an in-flight pull, and should cancellation therefore wait for that? I think the spec would maybe want to break cancellation into two phases, like it does with writable streams: in the first phase, an in-flight pull would get a chance to resolve; the second phase would be as cancellation is now. That seems like a more invasive change that deliberately breaks the asymmetry, and then I'm left wondering why ...
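For reference on the symmetry point: `WritableStreamDefaultController` already exposes a `signal` that fires when the stream is aborted, which lets a sink cancel an in-flight write. A minimal sketch (the upload URL is illustrative):

```js
const ws = new WritableStream({
  async write(chunk, controller) {
    // controller.signal aborts the in-flight request when the stream is
    // aborted, instead of letting the write run to completion.
    await fetch("https://example.com/upload", {
      method: "POST",
      body: chunk,
      signal: controller.signal,
    });
  },
});
```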
@domenic it's basically unsafe to use async iteration on anything that acquires resources, unless you do something like wrap it with a signal handler:

```ts
export function abortableAsyncIterable<T, TReturn, TNext>(
  iter: AsyncIterable<T, TReturn, TNext>,
  signal: AbortSignal
): AsyncIterable<T, TReturn, TNext> {
  // A promise that rejects when the signal aborts; racing it against next()
  // lets the consumer bail out without waiting for the next chunk.
  const abortedPromise = new Promise<IteratorResult<T, TReturn>>(
    (resolve, reject) => {
      if (signal.aborted) {
        reject(new DOMException('aborted', 'AbortError'))
      }
      signal.addEventListener('abort', () =>
        reject(new DOMException('aborted', 'AbortError'))
      )
    }
  )
  // Prevent an unhandled rejection if iteration finishes before any abort.
  abortedPromise.catch(() => {})
  return {
    [Symbol.asyncIterator]: () => {
      const inner = iter[Symbol.asyncIterator]()
      const { return: _return, throw: _throw } = inner
      return {
        next: (...args) => Promise.race([inner.next(...args), abortedPromise]),
        return: _return ? (...args) => _return.apply(inner, args) : undefined,
        throw: _throw ? (...args) => _throw.apply(inner, args) : undefined,
      }
    },
  }
}
```

And even with this, it's still up to the async iterable to handle `return()` and actually release its resources. That's a lot of responsibility on both the consumer and producer sides... I'm pretty sure that among the places the webapps I work on do async iteration, the majority of them need something like this. I think this should be a builtin function or even special syntax like:

```
for await (const elem of asyncIterable until signal) {
  ...
}
```

Because it's easy to forget that we'll leak resources without doing this, and I'm sure a lot of JS developers don't even realize it.
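A usage sketch for the wrapper above (`mystream` and `button` are carried over from earlier examples):

```js
const controller = new AbortController();
button.addEventListener('click', () => controller.abort());

try {
  for await (const chunk of abortableAsyncIterable(mystream, controller.signal)) {
    console.log(`Chunk: ${chunk}`);
  }
} catch (err) {
  if (err.name === 'AbortError') {
    // The race rejected; note the inner iterator's return() is NOT called
    // here, which is exactly the producer-side cleanup problem noted above.
    console.log('iteration aborted');
  } else {
    throw err;
  }
}
```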
I almost think it would be better for everyone if there were no signal-less async iteration at all. But maybe I'm in the minority for wrapping event streams that can wait indefinitely without timing out in async iterables.