Prerequisites
- If you haven't already, install the Nexa SDK by following the installation guide.
- MLX models run only on macOS with Apple silicon. A device with at least 16 GB of RAM is recommended.
- Below are the MLX-compatible model types you can experiment with right away.
LLM - Language Models
Language models in MLX format. Try out this quick example:
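A minimal sketch of such a run, assuming the `nexa infer` command from the installation guide; the model identifier is a placeholder, so substitute any MLX LLM from the supported model list below:

```bash
# Download (if needed) and load an MLX language model, then start an
# interactive chat session in the terminal.
# <org>/<mlx-llm-model> is a placeholder for a model from the supported list.
nexa infer <org>/<mlx-llm-model>
```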
Once the model loads, type or paste multi-line text directly into the CLI to chat with the model.
LMM - Multimodal Models
Multimodal models in MLX format that accept vision and/or audio inputs in addition to text. Try out this quick example:
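As above, a minimal sketch assuming the `nexa infer` command; the identifier is a placeholder for any multimodal MLX model from the supported list below:

```bash
# Load a multimodal MLX model and start an interactive session.
# <org>/<mlx-lmm-model> is a placeholder for a model from the supported list.
nexa infer <org>/<mlx-lmm-model>
```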
Drag photos or audio clips directly into the CLI; you can even drop multiple images at once.
Supported Model List
We curated a list of top, high-quality models in MLX format. Many MLX models in the Hugging Face mlx-community have quality issues and may not run locally, so we recommend using models from our collection for best results.
- Create an account at sdk.nexa.ai
- Generate a token: Go to Deployment → Create Token
- Activate your SDK: Run the following command in your terminal to set your license:
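A sketch of this step, assuming the license is set through the CLI's `config` subcommand; the exact command name is an assumption, so check the installation guide if it differs:

```bash
# Set the license token generated at sdk.nexa.ai (assumed command);
# replace <YOUR_TOKEN> with the token created under Deployment → Create Token.
nexa config set license "<YOUR_TOKEN>"
```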
Request New Models
Want a specific model? Submit an issue on the nexa-sdk GitHub or request it in our Discord/Slack community!