AI Inference server MAXER-2100
GPU | 2U | NVIDIA® RTX 4080 SUPER


Characteristics

Type
GPU
Mounting
2U
Processor
NVIDIA® RTX 4080 SUPER, Intel® Core™ i9-13900
Applications
industrial, business
Network
2.5 GbE, Gigabit Ethernet
Memory
DDR5 SO-DIMM
Operating system
Linux® Ubuntu™, Windows 10 IoT, Windows 11 Pro
Other characteristics
AI Inference
RAM capacity
128 GB

Description

AI Inference Server | 2U Rackmount | 12th/13th Gen CPU | RTX 4080 SUPER | Integrated CPU and GPU

- High performance for AI inference: supports NVIDIA RTX 4090, RTX 4080 SUPER, etc.
- 12th/13th Gen Intel® Core™ LGA1700 socket processors
- High CPU computing performance: built-in i9-13900, up to i9-13900K supported
- 2U rack mount, front-access I/O design
- M.2 2242/2280 M-Key x 2
- M.2 3042/3052/2242 B-Key + Micro SIM slot
- M.2 2230 E-Key x 1
- Onboard TPM 2.0

Catalogs

MAXER-2100 (2 pages)