
Adding performance metrics to AG Assistant #173

Open

rob7112 opened this issue Dec 16, 2024 · 0 comments

At this point in time, AGA does not return any performance metrics after training/testing. It only returns the predicted labels for the test set, so the user has to compute the metrics separately (e.g., accuracy, precision, and recall for classification; MSE and MAE for regression).
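For context, this is roughly the workaround today (a minimal sketch; the file names and the `label` column are placeholders for illustration, not AGA's actual output schema):

```python
# Compute metrics manually from the predicted labels AGA writes out.
# "test.csv", "predictions.csv", and the "label" column are placeholders.
import pandas as pd
from sklearn.metrics import accuracy_score, precision_score, recall_score

test = pd.read_csv("test.csv")           # ground-truth labels
preds = pd.read_csv("predictions.csv")   # AGA output: predicted labels

y_true = test["label"]
y_pred = preds["label"]

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))
print("recall   :", recall_score(y_true, y_pred, average="macro"))
```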

It would be very helpful to provide those metrics as part of the output as well. Specifically, my recommendation is to return a CSV file with the training/validation/testing metrics as a leaderboard, similar to what AG Tabular produces with TabularPredictor.leaderboard(), but expanded with auxiliary metrics (accuracy, precision, recall, etc.) like those returned by TabularPredictor.evaluate(auxiliary_metrics=...). A sketch of what this could look like follows below.
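For illustration, here is a minimal sketch of how AGA could assemble such a CSV on top of the existing AG Tabular API. It assumes a fitted predictor and labeled test data are in scope inside AGA's run, and the helper name and metric choices are hypothetical examples:

```python
# Hypothetical AGA helper: export a leaderboard CSV expanded with
# auxiliary metrics, built on the existing TabularPredictor API.
from autogluon.tabular import TabularPredictor

def export_leaderboard_csv(predictor: TabularPredictor, test_data, path="leaderboard.csv"):
    # Per-model validation/test scores, expanded with extra metrics.
    lb = predictor.leaderboard(
        test_data,
        extra_metrics=["accuracy", "precision_macro", "recall_macro"],
    )
    lb.to_csv(path, index=False)
    # Aggregate metrics for the best model, including auxiliary ones.
    scores = predictor.evaluate(test_data, auxiliary_metrics=True)
    return lb, scores
```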
