api documentation for tuning tools
wenzhangliu committed Jan 11, 2025
1 parent 6327ebf commit a751b8a
Showing 2 changed files with 2 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -62,6 +62,7 @@ Paper link: [https://arxiv.org/pdf/2312.16248.pdf](https://arxiv.org/pdf/2312.16248.pdf)
- :key: High compatibility for different users. (PyTorch, TensorFlow2, MindSpore, CPU, GPU, Linux, Windows, MacOS, etc.)
- :zap: Fast running speed with parallel environments.
- :computer: Distributed training with multi-GPUs.
- 🎛️ Support for automatic hyperparameter tuning.
- :chart_with_upwards_trend: Good visualization effect with [tensorboard](https://www.tensorflow.org/tensorboard) or [wandb](https://wandb.ai/site) tool.

## Currently Included Algorithms
1 change: 1 addition & 0 deletions docs/source/index.rst
@@ -99,6 +99,7 @@ Here are its key features:
- **Broad Compatibility**: Supports PyTorch, TensorFlow, MindSpore, and runs efficiently on CPU, GPU, and across Linux, Windows, and macOS.
- **High Performance**: Delivers fast execution speeds, leveraging vectorized environments for efficiency.
- **Distributed Training**: Enables multi-GPU training for scaling up experiments.
- **Hyperparameter Tuning**: Supports automatic tuning tools that guide hyperparameter configuration.
- **Enhanced Visualization**: Provides intuitive and comprehensive visualization with tools like TensorBoard and Weights & Biases (wandb).

List of Algorithms
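Neither hunk shows the tuning API itself, only the feature bullets announcing it. As a rough illustration of what automatic hyperparameter tuning does — wrap a training run in an objective function and search the configuration space for the best score — here is a minimal random-search sketch in plain Python. The names (`objective`, `random_search`) and the toy scoring function are hypothetical stand-ins, not XuanCe's actual interface.

```python
import math
import random

def objective(lr: float, batch_size: int) -> float:
    """Toy stand-in for a full training run: returns a score to maximize.
    (Hypothetical; a real objective would train an agent and report return.)
    This one peaks near lr=1e-3 and batch_size=64."""
    return -abs(math.log10(lr) + 3) - abs(batch_size - 64) / 64

def random_search(n_trials: int = 50, seed: int = 0):
    """Sample configurations at random and keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {
            # log-uniform sample of the learning rate over [1e-5, 1e-1]
            "lr": 10 ** rng.uniform(-5, -1),
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        score = objective(**cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search()
print(best_cfg, best_score)
```

Real tuning tools (e.g. Optuna-style libraries) replace the random sampler with smarter strategies such as Bayesian optimization and pruning of unpromising trials, but the wrap-and-search structure is the same.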
