When I run an nvbench-based benchmark, it prints a nice markdown report at the end, summarizing the run benchmarks, times, bandwidths, etc. All of this data is also stored in `baseline.json`. I archive the JSON file for later use, e.g., comparing it with other results; the terminal output is discarded. However, the comparison script `nvbench_compare.py` later only shows the runtimes of the baseline and comparison runs, not the bandwidth and %SOL. The data should be in the JSON file, though.

Is there a way to have the comparison script print the bandwidth as well?

Is there a tool/script to print the markdown report given a JSON file?

Both features would make it easy to print the achieved bandwidth and %SOL for results stored as JSON. Otherwise, I would always need to capture and archive the markdown report as well, which is annoying and seems redundant.
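As a workaround while no official tool exists, the summaries can be pulled out of the JSON file directly. The sketch below is a minimal, hypothetical extractor: the schema (a top-level `benchmarks` list with `states` and tagged `summaries`, and tag names like `bw/global` and `bw/percent_sol`) is an assumption for illustration, not the actual NVBench output format, so the keys would need to be adjusted to whatever `baseline.json` really contains.

```python
import json

def collect_summaries(doc, wanted_tags):
    """Walk an nvbench-style result document and collect selected
    per-state summary values (e.g. bandwidth, %SOL).
    NOTE: the JSON layout assumed here is hypothetical."""
    rows = []
    for bench in doc.get("benchmarks", []):
        for state in bench.get("states", []):
            row = {"benchmark": bench.get("name"), "state": state.get("name")}
            for summary in state.get("summaries", []):
                tag = summary.get("tag")
                if tag in wanted_tags:
                    row[tag] = summary.get("value")
            rows.append(row)
    return rows

# Synthetic example document following the assumed schema; in practice
# this would be json.load(open("baseline.json")).
sample = json.loads("""
{
  "benchmarks": [
    {
      "name": "copy_kernel",
      "states": [
        {
          "name": "Device=0",
          "summaries": [
            {"tag": "bw/global", "value": 812.5},
            {"tag": "bw/percent_sol", "value": 87.3},
            {"tag": "time/gpu_mean", "value": 0.00123}
          ]
        }
      ]
    }
  ]
}
""")

rows = collect_summaries(sample, {"bw/global", "bw/percent_sol"})
for row in rows:
    print(row)
```

The same loop could be extended to render a small markdown table per benchmark, approximating the report that currently only appears on the terminal.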