
MLX Local AI — LLM, Image Gen, STT, Embeddings Native on Apple Silicon

VERIFIED

by community

(0 reviews)
63,393 installs
Updated Mar 2026

Description

MLX-powered local AI — run LLMs, Stable Diffusion, speech-to-text, and embeddings natively on Apple Silicon via MLX. Ollama uses MLX for LLM inference, mflux...
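As a sketch of what running one of these MLX tools locally looks like, the commands below install the mlx-lm package and generate text from a quantized model on-device. This assumes an Apple Silicon Mac with Python installed; the model name is an illustrative example from the mlx-community organization on Hugging Face, not something this listing specifies.

```shell
# Sketch only — assumes an Apple Silicon Mac with Python 3.9+ available.
# Install mlx-lm, the MLX-based LLM inference package.
pip install mlx-lm

# Generate text locally; the model is downloaded from Hugging Face on first run.
# "mlx-community/Mistral-7B-Instruct-v0.3-4bit" is an illustrative model name.
python -m mlx_lm.generate \
  --model mlx-community/Mistral-7B-Instruct-v0.3-4bit \
  --prompt "Summarize MLX in one sentence."
```

Because inference runs entirely on the local GPU via MLX, no API key or network access is needed after the initial model download.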

Security Analysis

⚠️ Warning: 62/100

Open Source

Code is publicly available for audit.

Community Verified

Reviewed by the ClawHub community.

User Reviews

No ratings yet

No reviews yet.

Community Signal

ClawHub Score: 2.69 / 5.00
📥 Installs: 63,393
🔄 Last Update: Mar 31, 2026
🟢 Actively maintained (0d ago)
