Forcing different representations to use Fock and L2_norm when a component is not Gaussian #559

Open

apchytr wants to merge 11 commits into develop

Conversation

@apchytr apchytr (Collaborator) commented Feb 24, 2025

Context: Issues like MemoryError #481 are currently present in develop. They occur when attempting to compute the L2_norm of a state whose c is non-scalar. The solution presented here is to force the inner product between the state and its dual to be computed in the Fock representation whenever the polynomial dimension is greater than zero. In addition, issues have been raised about forcing to_bargmann when two different representations meet.

Description of the Change: Introduced the hidden method _compute_L2_norm. It implements the solution described above and provides a single source for the L2_norm calculation. Added #481 as a test case. Representation now raises an error when two different ansatze meet; CircuitComponent now handles changing the representation and uses to_fock instead.

Benefits: Resolves the normalization and purity calculation issues.
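
For orientation, a minimal sketch of the approach described above (not the merged code; the names `representation.ansatz`, `polynomial_shape`, `to_fock`, and `dual` are assumptions based on MrMustard's public API):

```python
# Hedged sketch of _compute_L2_norm: route the state/dual inner product
# through Fock whenever the ansatz carries a polynomial (non-scalar c) part.
def _compute_L2_norm(self) -> float:
    # polynomial_shape (assumed name) reports the rank of the polynomial
    # part of c; rank 0 means c is a scalar and Bargmann is safe.
    poly_dim, _ = self.representation.ansatz.polynomial_shape
    if poly_dim > 0:
        # Contracting in Bargmann with a non-scalar c can exhaust memory
        # (see #481), so convert to Fock before taking the inner product.
        fock_state = self.to_fock()
        return math.real(fock_state >> fock_state.dual)
    return math.real(self >> self.dual)
```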

@apchytr apchytr added the no changelog label (Pull request does not require a CHANGELOG entry) Feb 24, 2025
@apchytr apchytr requested a review from ziofil February 24, 2025 14:50
codecov bot commented Feb 24, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 90.28%. Comparing base (22937d1) to head (a85fe5e).

Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #559      +/-   ##
===========================================
+ Coverage    90.17%   90.28%   +0.10%     
===========================================
  Files          102      102              
  Lines         6180     6186       +6     
===========================================
+ Hits          5573     5585      +12     
+ Misses         607      601       -6     
Files with missing lines Coverage Δ
mrmustard/lab_dev/circuit_components.py 97.70% <100.00%> (+0.82%) ⬆️
mrmustard/lab_dev/states/base.py 96.77% <100.00%> (+0.10%) ⬆️
mrmustard/physics/representations.py 98.85% <100.00%> (-1.15%) ⬇️

... and 3 files with indirect coverage changes


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data


@apchytr apchytr changed the title from "Forcing L2_norm to use Fock when a component is not Gaussian" to "Forcing different representations to use Fock and L2_norm when a component is not Gaussian" Mar 5, 2025
```diff
@@ -169,13 +169,14 @@ def _validate_probs(self, probs: Sequence[float], atol: float) -> Sequence[float]:
             atol: The absolute tolerance to validate with.
         """
         atol = atol or settings.ATOL
+        probs = math.abs(probs)
```
A collaborator commented:

is this to go from complex to float?

@apchytr apchytr (Collaborator, Author) replied Mar 6, 2025

It is yeah. math.real works as well. Is there a preference for which to use?
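
For context, a small NumPy sketch (using `np` directly in place of MrMustard's `math` backend wrapper) of why either call works here when the imaginary parts are numerically zero:

```python
import numpy as np

# Probabilities produced by complex-valued contractions typically come
# back with complex dtype and negligible imaginary parts.
probs = np.array([0.2 + 0.0j, 0.8 + 1e-18j])

# Both recover real floats; they differ only when the imaginary part is
# genuinely nonzero (abs folds it into the magnitude, real discards it).
print(np.abs(probs))   # [0.2 0.8]
print(np.real(probs))  # [0.2 0.8]
```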

ziofil previously approved these changes Mar 5, 2025

@ziofil ziofil (Collaborator) left a comment:

great!

```diff
@@ -122,6 +125,29 @@ def test_normalize(self, coeff):
         normalized = state.normalize()
         assert np.isclose(normalized.probability, 1.0)

+    def test_normalize_poly_dim(self):
+        # https://github.com/XanaduAI/MrMustard/issues/481
+        with settings(AUTOSHAPE_PROBABILITY=0.99999999):
```
@apchytr apchytr (Collaborator, Author) commented:

@ziofil If this becomes a consistent enough issue, should we just set the global AUTOSHAPE_PROBABILITY higher?
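
For reference, a sketch of the two options being weighed (the context-manager form follows the diff above; treating `settings.AUTOSHAPE_PROBABILITY` as directly assignable is an assumption about the settings API):

```python
from mrmustard import settings

# Per-test override, as in test_normalize_poly_dim: raise the probability
# threshold used by autoshape only inside this block.
with settings(AUTOSHAPE_PROBABILITY=0.99999999):
    ...  # build and normalize the state

# Global alternative raised in the comment: set it once for the session.
settings.AUTOSHAPE_PROBABILITY = 0.99999999
```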

Labels: no changelog
Projects: None yet
2 participants