Hi, I currently have a spare GeForce GTX 1060 lying around collecting dust. I'm planning to use it with Ollama [https://ollama.com/] to self-host my own AI model, or maybe even for AI training. The problem is, none of my home lab devices has a compatible connection to the GPU's GPIO. My current setup includes:
- Beelink MINI S12 Intel Alder Lake N100
- Raspberry Pi 5
- Le Potato AML-S905X-CC
- Pi Picos

I'd like to hear recommendations or experiences with external GPU docks that I could use to connect the GPU to my home lab setup. Thanks.
What do you mean, connect to the GPU's GPIO?