Which AI hardware to use
This article provides information to help you build your own AI server. At Ozeki we have built several AI servers and gained a lot of experience along the way. Read the following articles, because they will save you time, effort, cost and frustration: you can learn from our mistakes. In this article we have also featured AI servers built by our customers, our users and independent enthusiasts. All of these AI servers can be operated by the Ozeki AI Server software.
AI Server #1 - Dual Nvidia GPUs - Air cooled
Learn how to build an air-cooled AI server with two Nvidia GPUs for AI chat, AI phone, AI email and other AI services for organizations with fewer than 100 users. This system is based on a desktop architecture, and its components are easy to acquire in all parts of the world. (Format: Tower)
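Once the machine is assembled, it is worth confirming that both GPUs are visible to your AI workload before installing any serving software. Below is a minimal sketch, assuming the NVIDIA driver and PyTorch are installed; the library choice is ours for illustration, not a requirement of the build.

```python
# check_gpus.py - verify that the two Nvidia GPUs are detected
# (assumes the CUDA driver and PyTorch are installed)
import torch

def report_gpus() -> None:
    if not torch.cuda.is_available():
        raise SystemExit("CUDA is not available - check the driver installation")
    count = torch.cuda.device_count()
    print(f"Detected {count} GPU(s)")
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        # total_memory is reported in bytes; convert to GiB for readability
        print(f"  GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB VRAM")

if __name__ == "__main__":
    report_gpus()  # expect a count of 2 for this dual-GPU build
```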
How to build a dual GPU AI server.
AI Server #2 - Quad Nvidia GPU - Water cooled
Learn how to build a water-cooled AI server with four Nvidia GPUs for AI chat, AI phone, AI email and deepfake AI video tasks. This system is based on a small office server architecture including an AMD Ryzen Threadripper server CPU and a corresponding server motherboard. (Format: Tower)
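To decide whether a given model fits across the four cards, a rough rule of thumb is parameter count times bytes per parameter, plus headroom for the KV cache and activations. The sketch below is a back-of-the-envelope estimate only; the 24 GB per-card figure and 20% overhead are illustrative assumptions, not measurements of this build.

```python
# vram_estimate.py - rough check of whether a model fits in the combined VRAM of a quad-GPU build
GPUS = 4
VRAM_PER_GPU_GB = 24          # assumed per-card VRAM; substitute your actual GPUs
OVERHEAD = 1.20               # ~20% headroom for KV cache, activations and CUDA context

def fits(params_billion: float, bytes_per_param: int) -> bool:
    needed_gb = params_billion * bytes_per_param * OVERHEAD
    return needed_gb <= GPUS * VRAM_PER_GPU_GB

# Example: a 70B-parameter model with 8-bit weights (1 byte/parameter)
print(fits(70, 1))   # True  -> ~84 GB needed vs 96 GB available
# The same model with 16-bit weights (2 bytes/parameter) would not fit
print(fits(70, 2))   # False -> ~168 GB needed vs 96 GB available
```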
How to build a quad GPU AI server.
AI Server #3 - Octa Nvidia GPU - Air cooled
This mid-range deep learning AI server is a 4U rackmount system that supports up to 8 dual-slot PCIe NVIDIA GPU accelerators, two host CPUs from the AMD EPYC processor series with up to 64 cores each, and 4TB of 8-channel DDR4 memory. (Format: 4U rack)
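With eight GPUs split across two EPYC sockets, it matters which cards share a PCIe root and NUMA node, because cross-socket traffic slows multi-GPU jobs. A quick way to inspect this is the interconnect matrix printed by nvidia-smi; the sketch below simply wraps that command and assumes the NVIDIA driver is installed so nvidia-smi is on the PATH.

```python
# topology_check.py - print the GPU/CPU interconnect matrix on an 8-GPU chassis
import subprocess

def show_topology() -> None:
    # "nvidia-smi topo -m" prints how each GPU pair is connected (PIX, NODE, SYS, NVLink, ...)
    # and which NUMA node / CPU affinity each GPU has.
    result = subprocess.run(["nvidia-smi", "topo", "-m"],
                            capture_output=True, text=True, check=True)
    print(result.stdout)

if __name__ == "__main__":
    show_topology()
```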
How to build an Octa GPU AI server.
AI Server #4 - Nvidia DGX H100 - AI Appliance
The NVIDIA DGX H100 is a cutting-edge AI appliance designed to meet the demands of high-performance computing (HPC) and artificial intelligence (AI) workloads. This server is designed to operate in dedicated server rooms and server hosting facilities, as it requires a properly sized power source and a temperature-controlled environment. (Format: 8U rack)
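The note about a properly sized power source is not an afterthought: a fully loaded DGX H100 can draw on the order of 10 kW. The sketch below is a back-of-the-envelope current calculation; the 10.2 kW figure is NVIDIA's published maximum system power, while the 208 V feed and 80% continuous-load derating are assumptions about a typical server room that you should adjust for your facility.

```python
# power_sizing.py - back-of-the-envelope current draw for a DGX H100 class appliance
MAX_DRAW_W = 10_200        # NVIDIA's published maximum system power for the DGX H100
SUPPLY_VOLTAGE_V = 208     # assumed server-room feed; adjust for your facility
DERATING = 0.80            # breakers are typically sized so continuous load stays under 80%

amps = MAX_DRAW_W / SUPPLY_VOLTAGE_V
required_breaker_amps = amps / DERATING
print(f"Continuous draw: {amps:.0f} A at {SUPPLY_VOLTAGE_V} V")
print(f"Minimum breaker rating: {required_breaker_amps:.0f} A (before splitting across PDUs/PSUs)")
```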
Learn more about the Nvidia DGX H100 AI Appliance
AI Server #5 - Nvidia DGX B200 - AI Appliance
The sixth-generation DGX data-center AI appliance is built around the Blackwell architecture and the flagship B200 accelerator, providing unprecedented training and inference performance in a single system. The DGX B200 includes 400Gb/s ConnectX-7 SmartNICs and BlueField DPUs for connecting to external storage. This server appliance is designed to operate in dedicated server rooms and server hosting facilities, as it requires a properly sized power source and a temperature-controlled environment. (Format: 10U rack)
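The 400Gb/s storage links matter in practice because model weights and datasets usually live on external storage. The sketch below estimates how long loading a checkpoint over such a link takes; the 1 TB checkpoint size and 70% effective link utilization are illustrative assumptions, not measured figures.

```python
# transfer_time.py - rough estimate of checkpoint load time over a 400 Gb/s storage link
LINK_GBPS = 400            # line rate of the ConnectX-7 links quoted for the DGX B200
EFFICIENCY = 0.70          # assumed effective utilization after protocol overhead
CHECKPOINT_TB = 1.0        # illustrative checkpoint size

effective_gbytes_per_s = LINK_GBPS * EFFICIENCY / 8   # convert bits/s to bytes/s
seconds = CHECKPOINT_TB * 1000 / effective_gbytes_per_s
print(f"~{effective_gbytes_per_s:.0f} GB/s effective, "
      f"~{seconds:.0f} s to load a {CHECKPOINT_TB:.0f} TB checkpoint")
```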
Learn more about the Nvidia DGX B200 AI Appliance