Implemented normalisation for weights with sum larger than one. Also … #185

Merged — 5 commits into main from issue/146/mixmod-weight-normalisation-bug on Oct 31, 2023

Conversation

BStoelzner (Collaborator)

This fixes the normalisation issue in #146
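
For context, here is a minimal sketch of the idea behind the fix (illustrative only: the real change lives in src/qp/mixmod_pdf.py, and the helper name below is hypothetical, not code from this PR):

```python
import numpy as np

def normalize_weights(weights):
    # Hypothetical illustration of the fix: rescale each row of mixture
    # weights so it sums to one before the mixture is evaluated.
    weights = np.atleast_2d(weights).astype(float)
    return weights / weights.sum(axis=1, keepdims=True)

# [1, 1, 1] becomes [1/3, 1/3, 1/3] rather than behaving like [1, 0, 0]
print(normalize_weights([1.0, 1.0, 1.0]))
```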

codecov bot commented Aug 3, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (a38c210) 100.00% compared to head (0e2a811) 100.00%.

Additional details and impacted files
@@            Coverage Diff            @@
##              main      #185   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           33        33           
  Lines         2279      2282    +3     
=========================================
+ Hits          2279      2282    +3     
Files Coverage Δ
src/qp/mixmod_pdf.py 100.00% <100.00%> (ø)


@aimalz aimalz self-requested a review August 18, 2023 20:40
@aimalz (Collaborator) left a comment

This is good! It just needs one little unit test to be complete.

@BStoelzner (Collaborator, Author)

@aimalz I've added a unit test, but forgot to let you know. Feel free to take a look :)

@BStoelzner (Collaborator, Author)

The conflict should be resolved now.

@eacharles force-pushed the issue/146/mixmod-weight-normalisation-bug branch from dfc14ae to 090c8fa on October 24, 2023 18:07
@aimalz (Collaborator) left a comment

Note the to-do item for an additional test to catch the original way the bug cropped up. I can write it later this week and add it to this PR, or we can make an issue to come back to it some other time.

Collaborator

This is great! There is an even simpler test that I think is more essential. Recall that the behavior that first alerted us to the bug was that weights of [1,1,1] led to .pdf() and .cdf() results identical to those for weights of [1,0,0]. The most basic unit test for this fix would show that this is indeed fixed: when weights that sum to more than 1 are provided, check that the .cdf() or .pdf() of the ensemble is not equal to what it would be for just the first element of the mixture model.
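
A rough sketch of such a test, assuming the usual qp.Ensemble construction for the mixture-model parameterization (the test name and grid values are illustrative, not the exact code added in this PR):

```python
import numpy as np
import qp

def test_unnormalised_weights_not_equal_to_first_component():
    # Three-component Gaussian mixture with weights that sum to 3.
    means = np.array([[-1.0, 0.0, 1.0]])
    stds = np.array([[0.5, 0.5, 0.5]])
    grid = np.linspace(-3.0, 3.0, 50)

    ens_all = qp.Ensemble(
        qp.mixmod,
        data=dict(means=means, stds=stds, weights=np.array([[1.0, 1.0, 1.0]])),
    )
    ens_first = qp.Ensemble(
        qp.mixmod,
        data=dict(means=means, stds=stds, weights=np.array([[1.0, 0.0, 0.0]])),
    )

    # Before the fix these agreed exactly; after normalisation they must differ.
    assert not np.allclose(ens_all.pdf(grid), ens_first.pdf(grid))
    assert not np.allclose(ens_all.cdf(grid), ens_first.cdf(grid))
```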

Collaborator

Ack, sorry, I meant this comment as a suggestion that could be addressed now or in the future, but I failed to change the review status to approved to actually indicate that. My bad!

@eacharles eacharles merged commit 8a7f69e into main Oct 31, 2023
6 checks passed
@eacharles eacharles deleted the issue/146/mixmod-weight-normalisation-bug branch May 28, 2024 16:55