[BUG]: CMRR Behaviour is Weird #700
> Which version did you test it on?

Anki 24.06.03.

> Do you still encounter this problem in the latest beta?

I cannot check that. Can you with my deck?
I can't reproduce the big drop, but I can reproduce something similar.

I tried that range for another one of my decks. In this deck, try changing w[6] from …
Nope, not on the latest beta. It's just consistently 0.89. (attachment: aaaa.mp4)
Does this issue still exist? How can I reproduce it? |
I'm not sure. I could only reproduce this at some specific ranges of w[6].
OK. Since nobody could reproduce this bug, I'm closing this issue.
@L-M-Sherlock Can you reopen this? Deck: (attached). If you import the attached deck, you can reproduce this case by changing w[6] between … Actually, yesterday I was seeing this happening at a slightly different range of w[6] (…).
This concerns how the MRR value changes as w[6] is changed. I think this is weird behaviour and demands an explanation.

In one of my presets, when w[6] is .8885, the MRR value is shown as .87. If you decrease that to .8884, the MRR value drops to .78, which is a drop of 9 percentage points. This looked a bit weird, because as I was playing around with w[6] it had barely done anything to CMRR; I had been slowly decreasing w[6] from .9000, so I'd have expected to see gradual change.

From .8884, decreasing w[6] further does nothing to MRR until it's reduced to .8879, where MRR moves back to .87. Note that the RMSE/log-loss values stay exactly the same as I decrease w[6] from .8885 (log loss = .3869, RMSE = 6.52%). I have experimented with different values, but haven't found any pattern here.

Here is the deck: export.zip (rename export.zip → export.apkg).
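For anyone who wants to probe this, here is a minimal sweep sketch. `compute_cmrr` is a hypothetical placeholder for whatever routine actually computes CMRR for the deck (its toy body below just mimics the step I'm describing, so the script runs as-is), and the weight vector is a placeholder, not my preset's real parameters:

```python
import numpy as np

# Placeholder FSRS parameter vector; substitute the preset's real weights.
weights = [0.4, 0.6, 2.4, 5.8, 4.93, 0.94, 0.8885, 0.01, 1.49,
           0.14, 0.94, 2.18, 0.05, 0.34, 1.26, 0.29, 2.61]

def compute_cmrr(w):
    # Hypothetical hook: a real test would call the optimizer's CMRR
    # routine on the imported deck here. This toy body only reproduces
    # the reported step behaviour so the sweep below is runnable.
    return 0.87 if w[6] >= 0.8885 or w[6] <= 0.8879 else 0.78

# Step w[6] in 0.0001 increments across the range where the jump appears.
for w6 in np.arange(0.8878, 0.8887, 0.0001):
    w = list(weights)
    w[6] = round(float(w6), 4)
    print(f"w[6]={w[6]:.4f} -> MRR={compute_cmrr(w):.4f}")
```

With the real CMRR routine wired in, a sweep like this would show whether the jump is a genuine discontinuity in the underlying optimisation or just an artifact of how the result is discretised and reported.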
Also reproduced this with another one of my decks, although the change in MRR is less drastic (it drops from .83 to .76 as w[6] is increased from .9997 to .9998). Log-loss/RMSE values stay the same.

On a tangent: since CMRR outputs a workload:knowledge value, can we test this (or have we already?) on the 20k dataset? You could run CMRR using half of the reviews and check how accurate the simulation is for the rest of the collection.