Instructions for using mlx-community/S3TokenizerV3 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- MLX
How to use mlx-community/S3TokenizerV3 with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir S3TokenizerV3 mlx-community/S3TokenizerV3
```
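The same download can also be done from Python with the `huggingface_hub` library's `snapshot_download` helper. This is a minimal sketch; the `download_s3_tokenizer` wrapper function is a hypothetical convenience, not part of any official API.

```python
from huggingface_hub import snapshot_download


def download_s3_tokenizer(local_dir: str = "S3TokenizerV3") -> str:
    """Fetch the tokenizer files from the Hub and return the local path.

    Hypothetical helper: wraps huggingface_hub.snapshot_download with the
    repo id from this model card.
    """
    return snapshot_download(
        repo_id="mlx-community/S3TokenizerV3",
        local_dir=local_dir,
    )


if __name__ == "__main__":
    # Downloads the files on first run; subsequent runs reuse the cache.
    print(download_s3_tokenizer())
```

This is equivalent to the CLI command above and is convenient when the download needs to happen inside a larger Python workflow.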
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- LM Studio
mlx-community/S3TokenizerV3
S3TokenizerV3 (Supervised Semantic Speech Tokenizer) converted to MLX format from FunAudioLLM/Fun-CosyVoice3-0.5B-2512.
This tokenizer is automatically downloaded when using CosyVoice 3 with mlx-audio-plus version 0.1.4.
Model tree for mlx-community/S3TokenizerV3
- Base model: FunAudioLLM/Fun-CosyVoice3-0.5B-2512