AGA currently does not return any performance metrics after training/testing; it only returns the predicted labels for the test set, so the user has to compute metrics separately (e.g., accuracy, precision, and recall for classification; MSE and MAE for regression).
It would be very helpful to provide those metrics as part of the output. Specifically, my recommendation is to return a CSV file with the training/validation/testing metrics in a leaderboard, similar to what AG tabular produces with TabularPredictor.leaderboard(), but expanded with auxiliary metrics (accuracy, precision, recall, etc.) like those returned by TabularPredictor.evaluate(auxiliary_metrics=...).
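To illustrate what is being asked for, here is a minimal sketch of the workaround users currently need: computing classification metrics by hand from the predicted labels and writing them into a small leaderboard-style CSV. The labels and the `classification_metrics` helper are hypothetical stand-ins, not part of AGA's API; the CSV layout is just one possible shape for the proposed output.

```python
import csv
import io

def classification_metrics(y_true, y_pred, positive=1):
    """Compute accuracy, precision, and recall for a binary task."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return {
        "accuracy": correct / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

# Hypothetical labels standing in for AGA's predicted test labels.
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]
metrics = classification_metrics(y_true, y_pred)

# Write a one-row, leaderboard-style CSV of the kind this issue proposes.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["split", *metrics])
writer.writeheader()
writer.writerow({"split": "test", **metrics})
print(buf.getvalue())
```

If AGA returned such a CSV per split (train/validation/test), users would no longer need this boilerplate at all.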