Up to 96GB of GPU memory in a standard 2U rack unit - private AI with no per-user fees, all data on-premise, starting at $5,995
FREMONT, Calif. - Californer -- eRacks Open Source Systems today announces expanded GPU options for its AILSA 2U AI server, adding the Intel Arc Pro B50 16GB low-profile card as an immediately available configuration and the NVIDIA RTX PRO 4000 Blackwell SFF 24GB as a configure-to-order upgrade. eRacks/AILSA now supports up to four low-profile GPUs in a single 2U chassis, delivering up to 96GB of GPU memory for on-premise AI inference without cloud fees.
eRacks/AILSA starts at $5,995 and accepts up to four low-profile GPUs mounted upright, with full PCIe 5.0 x8 bandwidth to each card. The Intel Arc Pro B50 provides 16GB of GDDR6 memory per card at $349 to $399. Four cards deliver 64GB of GPU memory for a total system price under $8,000. The NVIDIA RTX PRO 4000 Blackwell SFF adds 24GB of GDDR7 ECC memory per card at 70W TDP, enabling 96GB configurations on a configure-to-order basis.
"Most 2U GPU servers mount cards sideways on PCIe risers, which limits both bandwidth and card count," said Joseph Wolff, founder of eRacks Open Source Systems. "AILSA uses low-profile cards upright with native PCIe routing. Four cards, full bandwidth, fits in two rack units. Nobody else is doing this."
The economics are straightforward. A 30-person company using ChatGPT Team pays $30 per user per month - $10,800 per year, every year, with all data transiting OpenAI's infrastructure. eRacks/AILSA at $5,995 handles the same inference workload with all data remaining on-premise, no per-seat fees, and no vendor dependency. The hardware pays for itself in under 12 months.
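The break-even arithmetic behind that claim can be checked directly; the figures below are taken from the release itself ($30 per seat per month, 30 users, $5,995 base price):

```python
# Back-of-envelope comparison: per-seat cloud subscription vs. one-time server cost.
seats = 30
cloud_per_seat_monthly = 30   # USD, ChatGPT Team pricing cited in the release
server_price = 5_995          # USD, eRacks/AILSA base configuration

# Annual cloud spend for the whole team.
annual_cloud_cost = seats * cloud_per_seat_monthly * 12
print(annual_cloud_cost)      # 10800

# Months until cumulative cloud spend matches the one-time server price.
monthly_cloud_cost = seats * cloud_per_seat_monthly
breakeven_months = server_price / monthly_cloud_cost
print(round(breakeven_months, 1))   # 6.7
```

At $900 per month in subscription fees, the $5,995 server is paid back in roughly seven months, consistent with the under-12-months claim above.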
All eRacks/AILSA systems ship with Ubuntu 26.04 LTS, Ollama, and Open WebUI pre-installed. Staff access AI through a browser from day one. Llama 3.3 70B, Mistral Small 3, Qwen 2.5 32B, and DeepSeek-R1 run natively on 64GB configurations.
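As a rough sanity check on why those models fit in a 64GB configuration, weight footprint can be estimated as parameter count times bytes per weight. The 4-bit quantization level and the 1.2x overhead factor for KV cache and runtime are common community rules of thumb, not figures from the release:

```python
# Rough VRAM estimate for quantized LLM weights.
# bits_per_weight and the overhead factor are illustrative assumptions.
def est_vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

models = {
    "Llama 3.3 70B (4-bit)": est_vram_gb(70, 4),
    "Qwen 2.5 32B (4-bit)": est_vram_gb(32, 4),
    "Mistral Small 3 24B (4-bit)": est_vram_gb(24, 4),
}
for name, gb in models.items():
    verdict = "fits" if gb <= 64 else "exceeds"
    print(f"{name}: ~{gb:.0f} GB -> {verdict} 64GB")
```

By this estimate a 4-bit Llama 3.3 70B needs on the order of 42GB, leaving headroom within 64GB; the smaller models fit comfortably.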
eRacks has built custom Linux servers since 1999. All systems are configured to order and tested before shipping. The NVIDIA RTX PRO 4000 Blackwell SFF is available now; contact eRacks for lead times and pricing.
eRacks/AILSA is available at https://eracks.com/products/ai-rackmount-servers/AILSA/
About eRacks Open Source Systems
eRacks Open Source Systems is an open-source server and storage specialist founded in 1999 and headquartered in Fremont, CA. The company builds rackmount servers, NAS, HPC clusters, and AI inference servers configured to customer requirements, running Linux and open-source software. eRacks serves businesses, research institutions, and government agencies worldwide.
Media Contact
Joseph Wolff
eRacks Open Source Systems
joe@eracks.com
https://eracks.com
Source: eRacks Open Source Systems
Filed Under: Computers