Implemented normalisation for weights with sum larger than one. Also … #185

Merged
5 commits merged on Oct 31, 2023
5 changes: 4 additions & 1 deletion src/qp/mixmod_pdf.py
@@ -49,7 +49,7 @@ def __init__(self, means, stds, weights, *args, **kwargs):
stds: array_like
The standard deviations of the Gaussians
weights : array_like
The weights to attach to the Gaussians
The weights to attach to the Gaussians. The weights should sum to one; if they do not, they are interpreted as relative weights and normalised.
"""
self._scipy_version_warning()
self._means = reshape_to_pdf_size(means, -1)
@@ -58,6 +58,9 @@ def __init__(self, means, stds, weights, *args, **kwargs):
kwargs['shape'] = means.shape[:-1]
self._ncomps = means.shape[-1]
super().__init__(*args, **kwargs)
if np.any(self._weights < 0):
    raise ValueError('All weights must be non-negative')
self._weights = self._weights / self._weights.sum(axis=1)[:, None]
self._addobjdata('weights', self._weights)
self._addobjdata('stds', self._stds)
self._addobjdata('means', self._means)
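The validation and normalisation added by this patch can be sketched as a standalone helper (a hypothetical, simplified version of the same logic, outside the qp class):

```python
import numpy as np

def normalise_weights(weights):
    """Validate and normalise mixture weights row-wise.

    Mirrors the check added in the patch: negative weights are rejected,
    and each row of weights is rescaled so it sums to one, so inputs like
    [1, 1, 1] are treated as relative weights.
    """
    weights = np.asarray(weights, dtype=float)
    if np.any(weights < 0):
        raise ValueError('All weights must be non-negative')
    # Divide each row by its own sum; broadcasting keeps the 2D shape.
    return weights / weights.sum(axis=1)[:, None]

# Relative weights [1, 1, 1] become [1/3, 1/3, 1/3]:
print(normalise_weights([[1, 1, 1]]))
```

Note the check is `< 0`, so a zero weight (a component switched off) remains valid; only strictly negative weights raise.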
8 changes: 8 additions & 0 deletions tests/qp/test_ensemble.py
Collaborator:
This is great! There is an even simpler test that I think is more essential. Recall that the behaviour that first alerted us to the bug was that weights [1,1,1] led to `.pdf()` and `.cdf()` results identical to those for weights [1,0,0]. The most basic unit test for this fix would show that it is indeed fixed: when weights that sum to more than one are provided, check that the `.cdf()` or `.pdf()` of the ensemble is not equal to what it would be for just the first element of the mixture model.
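The suggested check can be sketched in plain NumPy with a hand-rolled Gaussian mixture pdf (a standalone illustration of the idea, not the qp API; the means and sigmas reuse the values from the test below):

```python
import numpy as np

def norm_pdf(x, mean, std):
    # Density of a single Gaussian component.
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

def mixmod_pdf(x, means, stds, weights):
    # Normalise the weights first, as this PR now does.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * norm_pdf(x, m, s) for wi, m, s in zip(w, means, stds))

grid = np.linspace(-1.0, 4.0, 200)
means = [0.5, 1.1, 2.9]
stds = [0.15, 0.13, 0.14]

# With the fix, relative weights [1, 1, 1] must NOT collapse to the
# pdf of the first component alone (the pre-fix buggy behaviour):
full = mixmod_pdf(grid, means, stds, [1, 1, 1])
first_only = norm_pdf(grid, means[0], stds[0])
assert not np.allclose(full, first_only)
```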

Collaborator:
Ack, sorry, I meant this comment as a suggestion that could be addressed now or in the future, but I failed to change the status of the review to approved to actually indicate that, my bad!

@@ -213,5 +213,13 @@ def test_iterator(self):
test_vals = ens_i.pdf(test_grid)
assert np.allclose(check_vals, test_vals)

def test_mixmod_with_negative_weights(self):
"""Verify that an exception is raised when setting up a mixture model with negative weights"""
        means = np.array([0.5, 1.1, 2.9])
        sigmas = np.array([0.15, 0.13, 0.14])
        weights = np.array([1, 0.5, -0.25])
with self.assertRaises(ValueError):
_ = qp.mixmod(weights=weights, means=means, stds=sigmas)

if __name__ == '__main__':
unittest.main()