Forcing different representations to use Fock and L2_norm when a component is not Gaussian #559
base: develop
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files:

@@            Coverage Diff            @@
##           develop     #559      +/-  ##
===========================================
+ Coverage    90.17%   90.28%   +0.10%
===========================================
  Files          102      102
  Lines         6180     6186       +6
===========================================
+ Hits          5573     5585      +12
+ Misses         607      601       -6

... and 3 files with indirect coverage changes.
mrmustard/lab_dev/samplers.py
Outdated
@@ -169,13 +169,14 @@ def _validate_probs(self, probs: Sequence[float], atol: float) -> Sequence[float]:
            atol: The absolute tolerance to validate with.
        """
        atol = atol or settings.ATOL
        probs = math.abs(probs)
is this to go from complex to float?
It is, yeah. math.real works as well. Is there a preference for which to use?
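For illustration, a minimal sketch (using NumPy as a stand-in for the math backend) of how the two options behave on probabilities that have picked up a small imaginary part:

```python
import numpy as np

# Probabilities with numerical noise in the imaginary part.
probs = np.array([0.25 + 1e-12j, 0.75 - 1e-12j])

# abs -> magnitude: always non-negative, so any sign information is lost.
abs_probs = np.abs(probs)    # ~[0.25, 0.75]

# real -> real part: drops the imaginary part but keeps the sign.
real_probs = np.real(probs)  # [0.25, 0.75]
```

Both yield float arrays; they only differ when a probability comes out slightly negative due to numerical error.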
great!
@@ -122,6 +125,29 @@ def test_normalize(self, coeff):
        normalized = state.normalize()
        assert np.isclose(normalized.probability, 1.0)

    def test_normalize_poly_dim(self):
        # https://github.com/XanaduAI/MrMustard/issues/481
        with settings(AUTOSHAPE_PROBABILITY=0.99999999):
@ziofil If this becomes a consistent enough issue, should we just set the global AUTOSHAPE_PROBABILITY higher?
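For reference, a sketch of the two ways the threshold could be applied, assuming the settings object accepts direct attribute assignment in addition to the context-manager form used in the test above:

```python
from mrmustard import settings

# Scoped override: only the enclosed computation uses the tighter threshold
# (this is the form used in test_normalize_poly_dim).
with settings(AUTOSHAPE_PROBABILITY=0.99999999):
    ...  # normalize the state / compute the L2_norm here

# Global override: every subsequent autoshape computation uses it, at the
# cost of larger Fock shapes (and more memory) everywhere.
settings.AUTOSHAPE_PROBABILITY = 0.99999999
```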
Context: Issues like MemoryError #481 are currently present in develop. These occur when attempting to compute the L2_norm of a state with a non-scalar c. The solution presented here is to force the inner product between the state and its dual to occur in Fock when the dimension of the polynomial is greater than zero. In addition, some issues have been raised from forcing to_bargmann when two different representations meet.

Description of the Change: Introduced the _compute_L2_norm hidden method. This method implements the solution described above and provides a single source for the calculation of the L2_norm. Added #481 as a test. Representation now raises an error when two different ansatze meet. CircuitComponent now handles changing the representation and uses to_fock instead.

Benefits: Normalization and purity calculation issues are resolved.
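For concreteness, a rough sketch of the forced-Fock L2_norm computation described above. This is not the actual implementation: the polynomial_shape attribute and the `>>` contraction syntax for the inner product are assumptions used only for illustration.

```python
from mrmustard import math


def _compute_L2_norm(self) -> float:
    """Hypothetical sketch of a state method: inner product of the state with its dual.

    When the ansatz carries a non-scalar polynomial part (poly_dim > 0),
    the contraction is forced to happen in Fock to avoid the MemoryError
    reported in #481; otherwise it proceeds in the current representation.
    """
    poly_dim, _ = self.representation.ansatz.polynomial_shape  # assumed attribute
    if poly_dim > 0:
        fock_state = self.to_fock()
        return math.real(fock_state >> fock_state.dual)
    return math.real(self >> self.dual)
```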