eRacks Adds Arc Pro B50 and NVIDIA Blackwell GPU Options to AILSA - 2U AI Rack Server under $8K

Up to 96GB of GPU memory in a standard 2U rack unit - private AI with no per-user fees, all data on-premise, starting at $5,995

FREMONT, Calif. - Californer -- eRacks Open Source Systems today announces expanded GPU options for its AILSA 2U AI server, adding the Intel Arc Pro B50 16GB low-profile card as an immediately available configuration and the NVIDIA RTX PRO 4000 Blackwell SFF 24GB as a configure-to-order upgrade. eRacks/AILSA now supports up to four low-profile GPUs in a single 2U chassis, delivering up to 96GB of GPU memory for on-premise AI inference without cloud fees.

eRacks/AILSA starts at $5,995 and accepts up to four low-profile GPUs mounted upright, each with full PCIe 5.0 x8 bandwidth. The Intel Arc Pro B50 provides 16GB of GDDR6 memory per card at $349 to $399; four cards deliver 64GB for enterprise inference under $8,000 total. The NVIDIA RTX PRO 4000 Blackwell SFF adds 24GB of GDDR7 ECC memory per card at 70W TDP, enabling 96GB configurations on a configure-to-order basis.

"Most 2U GPU servers mount cards sideways on PCIe risers, which limits both bandwidth and card count," said Joseph Wolff, founder of eRacks Open Source Systems. "AILSA uses low-profile cards upright with native PCIe routing. Four cards, full bandwidth, fits in two rack units. Nobody else is doing this."

The economics are direct. A 30-person company using ChatGPT Team pays $30 per user per month - $900 per month, or $10,800 per year, every year, with all data transiting OpenAI's infrastructure. eRacks/AILSA at $5,995 covers the same inference workload with all data remaining on-premise, no per-seat fees, and no vendor dependency. At that rate the hardware pays for itself in roughly seven months.
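The payback arithmetic above can be checked with a short sketch. The per-seat price, headcount, and server cost are the figures cited in this release; your own numbers will differ:

```python
# Payback estimate: on-premise server vs. per-seat cloud AI subscription.
# Figures are from this release (illustrative, not a quote).
SEATS = 30                # employees on the cloud plan
PRICE_PER_SEAT = 30       # USD per user per month (ChatGPT Team tier cited)
SERVER_COST = 5_995       # eRacks/AILSA base configuration, USD

monthly_cloud_cost = SEATS * PRICE_PER_SEAT        # 900 USD/month
annual_cloud_cost = monthly_cloud_cost * 12        # 10,800 USD/year
payback_months = SERVER_COST / monthly_cloud_cost  # about 6.7 months

print(f"Cloud spend: ${monthly_cloud_cost}/month (${annual_cloud_cost}/year)")
print(f"Server pays for itself in {payback_months:.1f} months")
```

The estimate ignores power, cooling, and admin time on the server side, and model-quality differences on the cloud side; it is a first-order comparison only.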

All eRacks/AILSA systems ship with Ubuntu 26.04 LTS, Ollama, and Open WebUI pre-installed. Staff access AI through a browser from day one. Llama 3.3 70B, Mistral Small 3, Qwen 2.5 32B, and DeepSeek-R1 run natively on 64GB configurations.
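A rough rule of thumb shows why models of this size fit in a 64GB configuration. The constants below are common community estimates for 4-bit quantized inference (roughly 0.55 GB per billion parameters plus runtime overhead), not vendor specifications; actual usage varies with runtime, context length, and quantization scheme:

```python
# Rough VRAM-fit check for quantized LLMs.
# Rule-of-thumb constants (assumptions, not vendor figures):
GB_PER_B_PARAMS_Q4 = 0.55  # ~0.55 GB per billion params at 4-bit quantization
OVERHEAD = 1.20            # ~20% extra for KV cache and runtime overhead

def fits(params_billion: float, vram_gb: float) -> bool:
    """Estimate whether a Q4-quantized model likely fits in the given VRAM."""
    return params_billion * GB_PER_B_PARAMS_Q4 * OVERHEAD <= vram_gb

# Models named in this release against a 64GB (4x Arc Pro B50) configuration:
for name, size_b in [("Llama 3.3 70B", 70), ("Qwen 2.5 32B", 32)]:
    print(f"{name}: fits in 64GB -> {fits(size_b, 64.0)}")
```

By this estimate a 70B model needs on the order of 46GB, leaving headroom within 64GB; multi-GPU splitting across the four cards is handled by the inference runtime.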

eRacks has built custom Linux servers since 1999. All systems are configured to order and tested before shipping. The NVIDIA RTX PRO 4000 Blackwell SFF is available now; contact eRacks for lead times and pricing.

eRacks/AILSA is available at https://eracks.com/products/ai-rackmount-servers/AILSA/

About eRacks Open Source Systems

eRacks Open Source Systems is an open-source server and storage specialist founded in 1999 and headquartered in Fremont, CA. The company builds rackmount servers, NAS, HPC clusters, and AI inference servers configured to customer requirements, running Linux and open-source software. eRacks serves businesses, research institutions, and government agencies worldwide.

Media Contact

Joseph Wolff
eRacks Open Source Systems
joe@eracks.com
https://eracks.com



Source: eRacks Open Source Systems
Filed Under: Computers
