
[BUG]: CMRR Behaviour is Weird #700

Open
brishtibheja opened this issue Oct 25, 2024 · 11 comments
Labels: bug (Something isn't working)

Comments

@brishtibheja

This concerns how the MRR value changes as w[6] is changed. I think this is weird behaviour that demands an explanation.

In one of my presets, when w[6] is .8885 the MRR value is shown as .87. If you decrease that to .8884, the MRR drops to .78, a sudden drop of 9 percentage points. This looked odd because I had been slowly decreasing w[6] from .9000 and it had barely affected CMRR at all; I'd have expected a gradual change.

From .8884, decreasing w[6] further does nothing to MRR until it reaches .8879, where MRR moves back to .87. Note that the RMSE/log-loss values stay exactly the same as I decrease w[6] from .8885 (log loss = .3869, RMSE = 6.52%). I have experimented with different values, but haven't found any pattern here.
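
For reference, this is roughly the sweep I did by hand. A minimal sketch; `compute_cmrr` is a hypothetical stand-in for whatever Anki runs internally when you press the CMRR button, since that lives inside the scheduler:

```python
# Sweep w[6] over a narrow range and flag any sudden jump in the
# computed MRR. compute_cmrr is a hypothetical stand-in: it takes the
# full FSRS weight vector and returns the minimum recommended
# retention (MRR) for the preset.

def sweep_w6(weights, compute_cmrr, lo=0.8879, hi=0.8890, step=0.0001):
    """Vary w[6] across [lo, hi] and report any sudden jump in MRR."""
    w = list(weights)
    prev = None
    steps = round((hi - lo) / step)
    for i in range(steps + 1):
        w[6] = round(lo + i * step, 4)
        mrr = compute_cmrr(w)  # hypothetical call into the scheduler
        if prev is not None and abs(mrr - prev) > 0.01:
            print(f"jump at w[6]={w[6]}: MRR {prev:.2f} -> {mrr:.2f}")
        prev = mrr
```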

Here is the deck: export.zip
(rename export.zip to export.apkg)


I also reproduced this with another one of my decks, although the change in MRR is less drastic (it drops from .83 to .76 as w[6] is increased from .9997 to .9998). The log-loss/RMSE values stay the same.

On a tangent: since CMRR outputs a workload:knowledge value, can we test this (or have we already?) on the 20k dataset? You could run CMRR using half of the reviews and check how accurate the simulation is for the rest of the collection, along the lines of the sketch below.
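
A minimal sketch of how that half-split check could work; `optimize_weights` and `simulate_review_count` are hypothetical stand-ins for the FSRS optimizer and the CMRR simulator, since I don't know the exact APIs. The point is the protocol, not the signatures:

```python
# Fit FSRS on the first (chronological) half of a collection's reviews,
# simulate the second half, and compare the simulated workload against
# what actually happened.

def holdout_test(reviews, optimize_weights, simulate_review_count):
    """reviews: list of dicts, each with a datetime 'timestamp' key."""
    reviews = sorted(reviews, key=lambda r: r["timestamp"])
    half = len(reviews) // 2
    train, test = reviews[:half], reviews[half:]

    weights = optimize_weights(train)  # fit FSRS on the first half only
    horizon = (test[-1]["timestamp"] - test[0]["timestamp"]).days
    predicted = simulate_review_count(weights, days=horizon)

    actual = len(test)  # workload actually incurred in the held-out period
    print(f"simulated reviews over {horizon} days: {predicted}, actual: {actual}")
```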

brishtibheja added the bug label Oct 25, 2024
@L-M-Sherlock
Member

Which version did you test it on?

@brishtibheja
Author

Anki version 24.06.03.

@Expertium
Collaborator

Do you still encounter this problem in the latest beta?

@brishtibheja
Author

I can't check that. Can you try with my deck?

@Expertium
Collaborator

I can't reproduce the big drop, but I can reproduce something similar.
(365 days)
w[6]=0.9999, MRR=0.87
w[6]=0.9998, MRR=0.89
w[6]=0.9990, MRR=0.87

@brishtibheja
Author

I tried that range on another one of my decks. For this deck, try changing w[6] from .8890 to .8880. The big drop happens at .8884.

@Expertium
Collaborator

Nope, not on the latest beta. It's just consistently 0.89.

aaaa.mp4

@L-M-Sherlock
Member

Does this issue still exist? How can I reproduce it?

@brishtibheja
Author

I'm not sure. Earlier, I could only reproduce this in certain narrow ranges of w[6].

@L-M-Sherlock
Member

OK. Since nobody can reproduce this bug, I'm closing this issue.

L-M-Sherlock closed this as not planned (won't fix, can't repro, duplicate, stale) Dec 18, 2024
@brishtibheja
Author

@L-M-Sherlock Can you reopen this?

deck:
cmrr.zip

If you import the attached deck, you can reproduce this by changing w[6] between 1.2077 and 1.2076; CMRR then shows a sudden change in its calculated value too. Do it in the preset named Kanken.

Actually, yesterday I was seeing this happen at a slightly different range of w[6] (1.2074 to 1.2075), and even though nothing else has changed, today it's different. If you can't look into this today, you might need to change your system date to reproduce it.

L-M-Sherlock reopened this Dec 23, 2024