
[SPARK-48922][SQL][3.5] Avoid redundant array transform of identical expression for map type #50265


@wForget wForget commented Mar 13, 2025

What changes were proposed in this pull request?

Backports #50245 to 3.5

Similar to #47843, this patch avoids `ArrayTransform` in the `resolveMapType` function when the resolution expression is identical to the input parameter.

Why are the changes needed?

My previous PR, #47381, was not merged, but I still believe it is a worthwhile optimization, so I have reopened it.

During an upgrade from Spark 3.1.1 to 3.5.0, I found a performance regression in inserts of map type columns.

Extra conversion expressions are added to the Project node before the insert, and they do not always appear to be necessary:

```
map_from_arrays(transform(map_keys(map#516), lambdafunction(lambda key#652, lambda key#652, false)), transform(map_values(map#516), lambdafunction(lambda value#654, lambda value#654, false))) AS map#656
```
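Both transforms in the plan above are identity mappings: each lambda simply returns its own parameter. A minimal sketch of the guard this kind of patch adds — simplified names, not the actual diff; `transformIfNeeded` is a hypothetical helper, and the real change lives in Spark's `TableOutputResolver` — might look like:

```scala
import org.apache.spark.sql.catalyst.expressions.{
  ArrayTransform, Expression, LambdaFunction, NamedLambdaVariable}

// Sketch: wrap the input array in ArrayTransform only when the resolved
// lambda body actually differs from the lambda parameter. If resolution
// returned the parameter unchanged, the lambda is an identity (x => x)
// and the input array can be reused as-is.
def transformIfNeeded(
    input: Expression,
    param: NamedLambdaVariable,
    resolvedBody: Expression): Expression = {
  if (resolvedBody.semanticEquals(param)) {
    input // identity lambda: skip the redundant ArrayTransform
  } else {
    ArrayTransform(input, LambdaFunction(resolvedBody, Seq(param)))
  }
}
```

Applied to both the key and value arrays, this check lets the `map_from_arrays(transform(...), transform(...))` wrapper collapse back to the original map column when no real conversion is required.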

Does this PR introduce any user-facing change?

No

How was this patch tested?

Added a unit test.

Was this patch authored or co-authored using generative AI tooling?

No

Closes #50245 from wForget/SPARK-48922.

Authored-by: wforget [email protected]
Signed-off-by: beliefer [email protected]

(cherry picked from commit 1be108e)

@github-actions github-actions bot added the SQL label Mar 13, 2025
@beliefer
Contributor

LGTM if tests passed.

@kazuyukitanimura
Contributor

LGTM. Thank you @wForget

beliefer pushed a commit that referenced this pull request Mar 13, 2025
Closes #50265 from wForget/SPARK-48922-3.5.

Authored-by: wforget <[email protected]>
Signed-off-by: beliefer <[email protected]>
@beliefer
Contributor

@wForget @viirya Thanks
Merged into branch-3.5.

@beliefer beliefer closed this Mar 13, 2025