
Adding Support for Non-Parametric Layernorm #1332

Closed
wants to merge 6 commits

Conversation

@aflah02 (Contributor) commented on Jan 4, 2025

Hi,
Based on our Discord discussion, I've added non-parametric layernorm as described in the OLMo paper.

    if neox_args.layernorm_fusion:
        raise ValueError("layernorm_fusion is not supported for non_parametric_layernorm")
    else:
        norm = LayerNorm(elementwise_affine=False, bias=False)
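For context, "non-parametric layernorm" here means standard LayerNorm with the learnable gain and bias removed, so only the normalization itself is applied. A minimal functional sketch of the idea (the tensor shape and eps value are illustrative, not taken from this PR):

    import torch
    import torch.nn.functional as F

    def non_parametric_layer_norm(x: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
        # Normalize over the last dimension with no learnable weight or bias;
        # weight and bias default to None in F.layer_norm.
        return F.layer_norm(x, normalized_shape=x.shape[-1:], eps=eps)

    x = torch.randn(2, 16, 512)  # (batch, seq, hidden) -- illustrative shape
    y = non_parametric_layer_norm(x)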
A reviewer (Contributor) commented:
This leads to an error, because torch.nn.LayerNorm is being initialized without its required normalized_shape argument.
Also, get_norm should not return an initialized object.
#1338 resolves this by creating a separate NonParametricLayernorm class.
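A hedged sketch of the shape such a fix might take. The class name follows the comment above, but the constructor and the config fields (neox_args.layernorm_epsilon, neox_args.hidden_size) are assumptions for illustration, not the actual code in #1338:

    import torch

    class NonParametricLayerNorm(torch.nn.LayerNorm):
        # Hypothetical sketch: a LayerNorm subclass with elementwise_affine=False,
        # so it carries no learnable weight or bias parameters.
        def __init__(self, normalized_shape, eps=1e-5):
            super().__init__(normalized_shape, eps=eps, elementwise_affine=False)

    # get_norm can then return the class itself rather than an instance,
    # and the caller instantiates it with the model's hidden size, e.g.:
    #   norm, eps = NonParametricLayerNorm, neox_args.layernorm_epsilon  # assumed field
    #   layer = norm(neox_args.hidden_size, eps=eps)                     # assumed field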

@aflah02 (Author) replied:

Thanks! I'll close this. I totally forgot that you need to return a non-initialized object.

@aflah02 closed this on Feb 17, 2025