I am running a 4x GPU rig at home (similar to a mining rig), doing everything from LLMs to content creation. I have learned a lot. Having an AI rig today is much like having an early PC in the '80s: you don't appreciate the possible uses until you have it in your hands.
All you need is a used GPU slapped onto any disused DDR4 mobo. New 5060s, the 16GB models, can do basically everything now.
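To put the 16GB claim in perspective, here is a rough back-of-envelope check of whether a quantized model fits on one card. The numbers are assumptions, not from the thread: roughly 0.5 bytes per weight at 4-bit quantization, plus a flat 2 GB allowance for KV cache and runtime overhead.

```python
# Back-of-envelope sketch: does a quantized model fit in a given VRAM budget?
# Assumed figures: ~0.5 bytes/weight at 4-bit, ~2 GB overhead for KV cache etc.

def fits_in_vram(n_params_billion: float, vram_gb: float = 16.0,
                 bytes_per_weight: float = 0.5, overhead_gb: float = 2.0) -> bool:
    """True if quantized weights plus overhead fit within vram_gb."""
    weights_gb = n_params_billion * bytes_per_weight  # 1e9 params * bytes = GB
    return weights_gb + overhead_gb <= vram_gb

# A 13B model at 4-bit: ~13 * 0.5 + 2 = 8.5 GB, fits in 16 GB.
print(fits_in_vram(13))   # True
# A 70B model at 4-bit: ~70 * 0.5 + 2 = 37 GB, does not fit.
print(fits_in_vram(70))   # False
```

By this estimate, models up to roughly 27B parameters squeeze into 16 GB at 4-bit, which is why a single 16GB 5060 covers most local-LLM use cases.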
Can you be a bit more specific? Which GPUs, and how do you wire them "together"? NVLink?
A couple of 5060s and a couple of 3060s. They are wired via PCIe risers to an older mobo with an AMD CPU. (I wanted to avoid long 3-fan cards.) It looks like a mining rig, but with thicker PCIe risers. Many LLM tools easily leverage multiple GPUs. It draws 800W at full load and idles below 50W.
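The way most of those LLM tools handle mismatched cards is to shard the model proportionally to each GPU's VRAM (llama.cpp exposes this as the `--tensor-split` option, for example). A minimal sketch of that split, assuming two 16 GB 5060s and two 12 GB 3060s as in the rig above:

```python
# Sketch of proportional model sharding across mismatched GPUs,
# in the spirit of llama.cpp's --tensor-split. VRAM sizes are assumed:
# two 16 GB 5060s and two 12 GB 3060s.

def tensor_split(vram_gb: list[float]) -> list[float]:
    """Return per-GPU fractions of the model, proportional to VRAM."""
    total = sum(vram_gb)
    return [v / total for v in vram_gb]

rig = [16, 16, 12, 12]                   # GB per card (assumed)
fractions = tensor_split(rig)
print([round(f, 3) for f in fractions])  # [0.286, 0.286, 0.214, 0.214]
```

Because the split follows VRAM rather than compute, the faster 5060s simply hold a bigger slice of the layers; no NVLink is needed, since layers communicate over plain PCIe.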
Would you please share a link to your chassis and risers? I have the PCIe lanes, but I have not yet found a reasonable way to attach more than 3 GPUs directly to a host, both for physical-space and power reasons. External PCIe switch cases are not reasonably available to mortals :/
I have three 3090 cards. Are you saying you run them together using specialized hardware, or can I somehow combine them using software over Ethernet?