CUDA Ollama
VERIFIED
by community
(0 reviews)
58,875 installs
Updated Apr 2026
Description
CUDA Ollama: route Ollama LLM inference across NVIDIA GPUs with automatic CUDA load balancing. CUDA Ollama cluster for RTX 4090, RTX 4080, A100, L40S, H100...
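The listing does not document how the load balancing is implemented. A minimal sketch of the general approach, assuming one `ollama serve` process per GPU (pinned with the standard `CUDA_VISIBLE_DEVICES` environment variable) and simple round-robin routing across their endpoints; the port numbers and GPU assignments below are illustrative, not taken from this skill:

```python
from itertools import cycle


class RoundRobinRouter:
    """Cycle inference requests across Ollama endpoints, one per GPU.

    Assumes each endpoint is an `ollama serve` instance started with a
    distinct CUDA_VISIBLE_DEVICES value and OLLAMA_HOST port (hypothetical
    deployment layout, not documented by the skill itself).
    """

    def __init__(self, endpoints):
        self._endpoints = list(endpoints)
        self._cycle = cycle(self._endpoints)

    def next_endpoint(self):
        # Return the next endpoint in strict rotation.
        return next(self._cycle)


# Example: two instances, e.g. GPU 0 and GPU 1 on consecutive ports.
router = RoundRobinRouter([
    "http://localhost:11434",  # CUDA_VISIBLE_DEVICES=0
    "http://localhost:11435",  # CUDA_VISIBLE_DEVICES=1
])
```

A real balancer would also weight by GPU memory or queue depth; round-robin is just the simplest baseline consistent with the description.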
Security Analysis
⚠️ 60/100
Open Source
Code is publicly available for audit.
Community Verified
Reviewed by the ClawHub community.
User Reviews
No ratings yet
Community Signal
ClawHub Score: 2.50 / 5.00
Installs: 58,875
Last Update: Apr 3, 2026
Actively maintained (0d ago)