
RuntimeError: CUDA failed with error out of memory after fine-tuning #470

Answered by kouohhashi
kouohhashi asked this question in Q&A

Thanks, my GPU memory size was 12 GB, but I think I found the problem.
In a nutshell, the way I converted the Whisper model into a faster-whisper model was wrong.

What I did was:

  1. Convert the OpenAI model into a Hugging Face model with python convert_openai_to_hf.py.
  2. Copy the Hugging Face JSON files, because step 1 did not create all of them.
  3. Convert the Hugging Face model into a faster-whisper model with ct2-transformers-converter (see the sketch after this list).
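
For reference, step 3 boils down to something like the sketch below, written against CTranslate2's Python converter API rather than the CLI. The ./whisper-hf and ./whisper-ct2 paths are placeholders for the fine-tuned Hugging Face checkpoint and the output directory, and the copied files may need adjusting to whatever your checkpoint actually contains.

```python
# Sketch of step 3: convert the Hugging Face checkpoint to a CTranslate2/faster-whisper model.
# Paths and file names are placeholders; adjust them to your own setup.
from ctranslate2.converters import TransformersConverter

converter = TransformersConverter(
    "./whisper-hf",  # directory holding the Hugging Face model and its config.json
    copy_files=["tokenizer.json", "preprocessor_config.json"],  # files faster-whisper expects next to the weights
)
converter.convert(
    "./whisper-ct2",         # output directory for the faster-whisper model
    quantization="float16",  # optional: halves the GPU memory used by the weights
    force=True,              # overwrite the output directory if it already exists
)
```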

The problem was that Hugging Face's config.json is not the same as faster-whisper's config.json, or I may simply have copied the wrong one.

So I copied config.json from a working example and it worked.
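
As a quick sanity check that the converted model loads and runs without blowing up GPU memory, something like the sketch below works. ./whisper-ct2 and audio.wav are placeholder names; the API shown is faster-whisper's standard WhisperModel interface.

```python
# Sanity check: load the converted model on the GPU and transcribe a short file.
# "./whisper-ct2" and "audio.wav" are placeholders.
from faster_whisper import WhisperModel

model = WhisperModel("./whisper-ct2", device="cuda", compute_type="float16")
segments, info = model.transcribe("audio.wav")
for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```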

Thanks anyway,
