Instructions to use suno/bark with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use suno/bark with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-to-speech", model="suno/bark")
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForTextToWaveform

processor = AutoProcessor.from_pretrained("suno/bark")
model = AutoModelForTextToWaveform.from_pretrained("suno/bark")
```

- Notebooks
- Google Colab
- Kaggle
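The pipeline snippet above can be extended into an end-to-end sketch that synthesizes speech and writes it to a WAV file. This assumes `transformers`, `torch`, and `scipy` are installed; the prompt text and output filename are illustrative, and the first call downloads the suno/bark weights.

```python
# Sketch: run the text-to-speech pipeline and save the result as a WAV file.
import numpy as np
import scipy.io.wavfile
from transformers import pipeline

pipe = pipeline("text-to-speech", model="suno/bark")

# The pipeline returns a dict holding the raw waveform and its sampling rate.
output = pipe("Hello, this is a test of the Bark text-to-speech model.")

audio = np.squeeze(output["audio"])  # drop any leading channel dimension
scipy.io.wavfile.write("bark_out.wav", rate=output["sampling_rate"], data=audio)
```

The returned `sampling_rate` should be used when saving or playing the audio, since Bark does not generate at the common 44.1 kHz default.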
Error using model in HF Pipeline | Colab
#52
by priyamarwaha - opened
Hi,
The pipeline example doesn't currently work. Running the following code:

```python
from transformers import pipeline

hindi_narrator = pipeline("text-to-speech", model="suno/bark")
```

returns an error:

```
TypeError: transformers.generation.utils.GenerationMixin.generate() got multiple values for keyword argument 'generation_config'
```
Please note - I ran this in Colab.
Thanks,
Priya