
feat: add batch request content #354

Open · rkodev wants to merge 13 commits into main
Conversation

rkodev (Contributor) commented on Feb 14, 2025

Closes #5

Add support for batching

rkodev marked this pull request as ready for review on February 19, 2025 08:37
rkodev requested a review from a team as a code owner on February 19, 2025 08:37
method: string;
url: string;
headers?: Record<string, string> | null;
body?: Record<string, any> | null;
Member commented:
What if the body is a stream?
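One way to accommodate that would be to widen the body type so raw bytes (for example, a buffered stream) can be carried alongside JSON-like objects. This is only a hypothetical sketch prompted by the question, not the shape this PR merges, and the interface name is made up:

// Hypothetical variant of the batch request item.
export interface BatchItemSketch {
  method: string;
  url: string;
  headers?: Record<string, string> | null;
  // Structured JSON payload, or raw bytes for binary/stream content.
  // A ReadableStream would need to be buffered into an ArrayBuffer before
  // the batch payload is serialized.
  body?: Record<string, any> | ArrayBuffer | null;
}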

Member commented:
@rkodev please avoid resolving conversations until a consensus has been reached. It makes progress much harder to track on the reviewers' side.

const method = requestInformation.httpMethod?.toString();

return {
id: createGuid(),
Member commented:
The user should still have the ability to pass in their own id, in case they want to use it to retrieve the response.
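A minimal sketch of what that could look like, assuming an optional caller-supplied id with a generated fallback; the helper and interface names here are illustrative, not the PR's:

interface BatchItem {
  id: string;
  method: string;
  url: string;
  headers?: Record<string, string> | null;
  body?: Record<string, any> | null;
}

// Stand-in declaration for the PR's GUID helper.
declare function createGuid(): string;

function toBatchItem(request: { method: string; url: string }, requestId?: string): BatchItem {
  return {
    // A caller-supplied id wins so the caller can later retrieve the matching
    // response; otherwise generate one.
    id: requestId ?? createGuid(),
    method: request.method,
    url: request.url,
  };
}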

rkodev marked this pull request as draft on February 20, 2025 14:25
rkodev marked this pull request as ready for review on February 21, 2025 06:18
* @param {string} requestId - The request id value
* @returns The Response object instance for the particular request
*/
public getResponseById(requestId: string): BatchResponse | undefined {
Member commented:
I believe we are missing overloads that take in a ParsableFactory or type parameter to deserialize the response body to the desired object type.

https://github.com/microsoftgraph/msgraph-sdk-design/blob/main/content/BatchResponseContent.md#requirements
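Something along the following lines could satisfy that requirement. This is only a sketch that assumes kiota's ParsableFactory and JsonParseNodeFactory APIs and a JSON payload; the class and method names are illustrative:

import { Parsable, ParsableFactory } from "@microsoft/kiota-abstractions";
import { JsonParseNodeFactory } from "@microsoft/kiota-serialization-json";

interface BatchResponse {
  id: string;
  headers?: Record<string, string> | null;
  body?: ArrayBuffer | null;
}

class BatchResponseContentSketch {
  constructor(private readonly responses: Map<string, BatchResponse>) {}

  public getResponseById(requestId: string): BatchResponse | undefined {
    return this.responses.get(requestId);
  }

  // Overload-style companion: deserialize the raw body into the requested
  // model type using the kiota-generated factory for that type.
  public getResponseByIdAsModel<T extends Parsable>(requestId: string, factory: ParsableFactory<T>): T | undefined {
    const body = this.getResponseById(requestId)?.body;
    if (!body || body.byteLength === 0) return undefined;
    // Assumes a JSON body; a real implementation would dispatch on the
    // response's Content-Type header and honor errorMappings for 4xx/5xx.
    const parseNode = new JsonParseNodeFactory().getRootParseNode("application/json", body);
    return parseNode.getObjectValue(factory);
  }
}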

*/
private readonly errorMappings: ErrorMappings | undefined;

constructor(requestAdapter: RequestAdapter, errorMappings?: ErrorMappings) {
Member commented:
missing doc comment for the constructor
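For instance, something like the following; the parameter descriptions are inferred from the types rather than taken from the PR:

/**
 * Instantiates the batch content handler.
 * @param requestAdapter - The request adapter used to send the batch request and deserialize its responses.
 * @param errorMappings - Optional mappings from HTTP status codes to error model factories.
 */
constructor(requestAdapter: RequestAdapter, errorMappings?: ErrorMappings) {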

* To get the iterator for the responses
* @returns The Iterable generator for the response objects
*/
public *getResponsesIterator(): IterableIterator<[string, BatchResponse]> {
Member commented:
Suggested change:
- public *getResponsesIterator(): IterableIterator<[string, BatchResponse]> {
+ public getResponsesIterator(): IterableIterator<[string, BatchResponse]> {

Is this a typo, or new syntax I don't know about?
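For context: the leading * is standard ES2015 generator-method syntax rather than a typo, so applying the suggestion would break any yield statements in the method body. A minimal standalone illustration, not code from the PR:

class Demo {
  private readonly entries = new Map<string, number>([
    ["a", 1],
    ["b", 2],
  ]);

  // The * marks this as a generator method: it returns an IterableIterator
  // and may use yield in its body.
  public *pairs(): IterableIterator<[string, number]> {
    for (const entry of this.entries.entries()) {
      yield entry;
    }
  }
}

for (const [key, value] of new Demo().pairs()) {
  console.log(key, value);
}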

rkodev (Contributor, Author) replied on Feb 21, 2025.
export interface BatchResponse {
id: string;
headers?: Record<string, string> | null;
body?: ArrayBuffer | null;
Member commented:
Once they get a batch response (item?), if the payload is structured/a Parsable, how can they get from the body to the deserialized payload?

rkodev (Contributor, Author) commented on Feb 21, 2025:
I believe this will be resolved by implementing #354 (comment)?
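For illustration of the caller's side once such an overload exists; everything below is assumed, including the generated model and its factory:

import { Parsable, ParsableFactory } from "@microsoft/kiota-abstractions";

// Assumed kiota-generated model and factory, named here only for illustration.
interface Message extends Parsable {
  subject?: string;
}
declare const createMessageFromDiscriminatorValue: ParsableFactory<Message>;

declare const batchResponseContent: {
  getResponseByIdAsModel<T extends Parsable>(requestId: string, factory: ParsableFactory<T>): T | undefined;
};

// Retrieve one response from the batch and deserialize its body to a typed model.
const message = batchResponseContent.getResponseByIdAsModel("request-1", createMessageFromDiscriminatorValue);
console.log(message?.subject);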
