How to Use Trained Model for Inference in Tortoise TTS? #846
Unanswered · MLOpsWizard asked this question in Q&A
I trained a model on German podcasts using the DL-Art-School repository, and now I have the following files:
- a `.pth` checkpoint
- a `model.state` file

I want to use this trained model with the Tortoise TTS pipeline for inference. Could anyone guide me on how to integrate and test my custom-trained model? Are there any specific configurations or code changes required to load my model in Tortoise?
Thanks in advance for your help!
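For reference, one approach sometimes suggested in the community is to build a stock `TextToSpeech` instance and load the fine-tuned weights over its autoregressive (GPT) model. This is a hedged sketch, not an official recipe: the helper name `synthesize_with_custom_model` is hypothetical, and it assumes the `.pth` file holds the autoregressive weights, that your installed Tortoise version exposes that model as `tts.autoregressive`, and that checkpoint keys may carry a DataParallel-style `module.` prefix from training.

```python
def strip_module_prefix(state_dict, prefix="module."):
    """Remove a wrapper prefix (e.g. from DataParallel training) from keys."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

def synthesize_with_custom_model(ar_checkpoint, text, voice_wav_paths,
                                 out_path="out.wav"):
    # Imports deferred so the key-renaming helper above can be reused
    # without tortoise installed.
    import torch
    import torchaudio
    from tortoise.api import TextToSpeech
    from tortoise.utils.audio import load_audio

    tts = TextToSpeech()

    # Load the fine-tuned weights into the stock autoregressive model.
    sd = torch.load(ar_checkpoint, map_location="cpu")
    if isinstance(sd, dict) and "model" in sd:
        # Assumption: some DLAS saves nest the weights under a "model" key.
        sd = sd["model"]
    tts.autoregressive.load_state_dict(strip_module_prefix(sd), strict=False)

    # Condition on a few reference clips of the target (German) speaker.
    voice_samples = [load_audio(p, 22050) for p in voice_wav_paths]
    gen = tts.tts_with_preset(text, voice_samples=voice_samples, preset="fast")

    torchaudio.save(out_path, gen.squeeze(0).cpu(), 24000)
    return out_path
```

Note that `model.state` is typically DL-Art-School's training-resume state (optimizer, step counters), so for inference you would usually only need the `.pth` weights; `strict=False` tolerates any training-only keys that don't exist in the inference model.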