Optimistic navigation when using defer
#7915
Closed
alfredsgenkins started this conversation in Proposals
Replies: 2 comments 9 replies
-
I have created a simple graphic to showcase the problem I am referring to. In case someone is looking into making this possible, here is how I solved it (a bit hacky):

```ts
const criticalData = '{"promise":"__deferred_promise:promise"}';

const proxyFetch = (): void => {
  const originalFetch: typeof fetch = window.fetch;

  // eslint-disable-next-line sonarjs/cognitive-complexity
  window.fetch = (...args: Parameters<typeof fetch>): Promise<Response> => {
    const [resource] = args;
    const url: string = resource.toString();

    if (url.includes("_data")) {
      let controller: ReadableStreamDefaultController<Uint8Array> | undefined;

      const stream = new ReadableStream<Uint8Array>({
        start(c) {
          controller = c;

          // Immediately enqueue the criticalData without waiting for the original fetch
          controller.enqueue(new TextEncoder().encode(`${criticalData}\n\n`));

          // We don't await the original fetch, thus making the response available immediately
          originalFetch(...args)
            .then((response) => {
              const reader = response.body!.getReader();

              const push = (): void => {
                reader
                  .read()
                  .then(({ done, value }) => {
                    if (done) {
                      controller?.close();
                      return;
                    }

                    // Process the chunk, skipping criticalData
                    const decoder = new TextDecoder();
                    const textChunk = decoder.decode(value, { stream: true });

                    if (textChunk.includes(criticalData)) {
                      // Skip the critical data we already enqueued;
                      // push only the second part of the chunk
                      const [, secondPart] = textChunk.split("\n\n");

                      if (secondPart) {
                        controller?.enqueue(new TextEncoder().encode(secondPart));
                      }

                      push();
                      return;
                    }

                    controller?.enqueue(value);
                    push();
                  })
                  .catch((err) => {
                    controller?.error(err);
                    reader.cancel();
                  });
              };

              push();
            })
            .catch((err) => {
              controller?.error(err);
            });
        },
      });

      return Promise.resolve(
        new Response(stream, {
          headers: new Headers({
            "Content-Type": "text/remix-deferred; charset=utf-8",
          }),
        })
      );
    }

    return originalFetch(...args);
  };
};

proxyFetch();
```

This way, awaiting the initial response from the server is skipped, and Remix thinks we got the reply instantly.
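The core trick above can be demonstrated without Remix or `window.fetch`: wrap a source stream so a known "critical" first chunk is emitted synchronously, then drop the duplicate copy of that chunk when the real source delivers it. This is a minimal sketch; `prependCritical` and `demo` are hypothetical names (not Remix APIs), and it assumes the critical chunk arrives from the source in one piece, as the proxy above does.

```typescript
const CRITICAL = '{"promise":"__deferred_promise:promise"}';

function prependCritical(source: ReadableStream<Uint8Array>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  return new ReadableStream<Uint8Array>({
    async start(controller) {
      // Emit the critical data immediately, before the source produces anything.
      controller.enqueue(encoder.encode(`${CRITICAL}\n\n`));

      const reader = source.getReader();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;

        const text = decoder.decode(value, { stream: true });
        if (text.includes(CRITICAL)) {
          // Drop the duplicate critical chunk; forward only what follows it.
          const [, rest] = text.split("\n\n");
          if (rest) controller.enqueue(encoder.encode(rest));
        } else {
          controller.enqueue(value);
        }
      }
      controller.close();
    },
  });
}

// Usage: a fake "server" stream that sends the critical data followed by a deferred chunk.
async function demo(): Promise<string> {
  const encoder = new TextEncoder();
  const server = new ReadableStream<Uint8Array>({
    start(c) {
      c.enqueue(encoder.encode(`${CRITICAL}\n\ndeferred-result`));
      c.close();
    },
  });

  let out = "";
  const reader = prependCritical(server).getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += decoder.decode(value, { stream: true });
  }
  return out;
}
```

Because the critical chunk is enqueued before the source is even read, a consumer sees it with zero network latency; the deferred payload streams through unchanged afterwards.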
-
I think this should be a default behaviour.
-
Hey!
I would like to propose optimistic navigation, i.e. navigating before the loader has returned any result. Currently, when defer is used, navigation happens only once the server has responded with the first critical data. On a slow network connection, where most of the waiting comes from reaching the server, this could save a lot of time. This matters because, in a real application, I could already have enough data to render an optimistic UI for the next page, passing it, for example, through history state.
Consider the following site (currently fully CSR): https://ui.scandiweb.com. I have implemented very smooth transitions to the PLP and PDP regardless of network speed, utilizing history state. I can assure the user that navigation will take the same time no matter the connection speed. Once I migrated to Remix, despite switching to defer dynamically (based on whether it is an initial page request), I still have to wait for the server to respond before the page switches and the prepared history data can be reused.
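The "render from history state first, reconcile with the loader later" pattern described above can be modeled in a few lines. This is a hedged, framework-free sketch: `optimisticNavigate` and `OptimisticPage` are hypothetical names for illustration, not Remix or React Router APIs.

```typescript
interface Product {
  name: string;
  price: number;
  description?: string;
}

interface OptimisticPage<T> {
  optimistic: T;     // rendered immediately, like data carried in history state
  full: Promise<T>;  // resolves when the real loader responds
}

function optimisticNavigate<T>(cached: T, loader: () => Promise<T>): OptimisticPage<T> {
  // Kick off the loader but do not await it: the caller can paint right away
  // from the cached data and swap in the full result when it arrives.
  return { optimistic: cached, full: loader() };
}

// Usage: navigating from a product listing (PLP) to a product page (PDP),
// reusing the card data the listing already rendered.
const page = optimisticNavigate<Product>(
  { name: "Sneaker", price: 49 },
  async () => ({ name: "Sneaker", price: 49, description: "Full details" })
);
```

The point of the proposal is that Remix's router could do this internally when `defer` is used: switch routes synchronously and treat even the critical data as pending.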
Is there any way around such behavior now? Can we add it?
Otherwise, I would be forced to write a custom rendering solution based on React Router, without actually using loaders. Which are VERY CONVENIENT.
Basically, I want the router to behave as follows:
I kinda expect such behavior when switching to use the `defer` loader.