
Utahhardware

Hardware Overview, "Emulab Classic"

Test Nodes

  • 160 d430 PC nodes (pc701-pc860) consisting of:
    • Dell PowerEdge R430
    • Two 2.4 GHz 64-bit 8-Core Xeon E5-2630v3 processors, 8.0 GT/s, 20 MB cache, VT-x, VT-d and EPT support
    • 64 GB 2133 MT/s DDR4 RAM (8 x 8GB modules)
    • 2-4 Intel i350 GbE NICs for experimental use
    • 2-4 Intel X710 10 GbE NICs for experimental use
    • 200 GB 6Gbps SATA SSD, 2 x 1 TB 7200 RPM 6 Gbps SATA
    Important Notes about the d430s

  • 16 d820 PC nodes (pc601-pc616) consisting of:
    • Dell PowerEdge R820
    • Four 2.2 GHz 64-bit 8-Core Xeon "Sandy Bridge" processors, 7.20 GT/s bus speed, 16 MB cache, VT-x and VT-d support
    • 128 GB 1333 MHz DDR3 RAM (8 x 16GB modules)
    • 4 Broadcom NetXtreme BCM5720C GbE NICs built into the motherboard (only one in use)
    • 2 dual-port Intel X520 PRO/10GbE PCI-Express NICs
    • 250GB 7200 rpm SATA disk, 6 x 600GB 10000 rpm SAS disks
    Important Notes about the d820s

  • 160 d710 PC nodes (pc401-pc560) consisting of:
    • Dell PowerEdge R710
    • 2.4 GHz 64-bit Quad Core Xeon E5530 "Nehalem" processor, 5.86 GT/s bus speed, 8 MB L3 cache, VT support
    • 12 GB 1066 MHz DDR3 RAM (6 x 2GB modules)
    • 4 Broadcom NetXtreme II BCM5709 rev C GbE NICs built into the motherboard
    • 2 Broadcom NetXtreme II BCM5709 rev C GbE NICs in dual-port PCIe x4 expansion card (one NIC is the control net)
    • 2 x 250GB 7200 rpm SATA disks
    Important Notes about the d710s

  • 160 pc3000 PC nodes (pc201-pc360) consisting of:
    • Dell PowerEdge 2850
    • 3.0 GHz 64-bit Xeon processors, 800 MHz FSB
    • 2GB 400 MHz DDR2 RAM
    • Multiple PCI-X 64/133 and 64/100 buses
    • 6 Intel 10/100/1000 NICs spread across the buses (one NIC is the control net)
    • 2 x 146GB 10,000 RPM SCSI disks
    • CD-ROM drive (GCR-82840N)
    Important Notes about the pc3000s

  • 60 pc2400w wireless PC nodes (pcwf105-201), consisting of:
    • Based on the Dell OptiPlex 745.
    • 2.4GHz Intel Core 2 Duo "Conroe" processors.
    • 1GB DDR-400MHz (PC3200) SDRAM.
    • 2 Netgear WAG311 802.11a/b/g (Atheros) wifi cards.
    • 2 Ethernet ports.
    • 2 Western Digital WDXL80 120GB 7200RPM Serial ATA hard drives.
    • DVD drive.

 

Servers

  • a users, file, and serial line server (users.emulab.net), consisting of:
    • Dell PowerEdge 2850 server
    • Dual 3.0GHz Intel Xeon processors, 800 MHz FSB
    • 4GB 400 MHz DDR2 RAM
    • Dell M1000 with 7TB of disk space
  • a DB, web, DNS and operations server, consisting of:
    • Dell PowerEdge 2850 server
    • Dual 3.0GHz Intel Xeon processors, 800 MHz FSB
    • 4GB 400 MHz DDR2 RAM
  • a motley collection of serial line servers.

 

Switches and Routers

  • 7 Cisco 6500 series high-end switches. Five serve as the testbed backplane ("programmable patch panel").
    The other two, a 6506 and a 6509, provide "control" interfaces for the test nodes; the 6509 hosts an MSFC router card and functions as the core router for the testbed, regulating access to the testbed servers and the outside world.
  • 5 HP 5400zl switches connecting the d710 nodes. Three are part of the testbed backplane, one provides the control network, and the fifth is a "hub" switch interconnecting the other experimental switches via multiple 10Gb ports.
  • 1 Arista 7504 10Gb switch serving the d820 nodes, connected to the experiment network fabric via a 4x10Gb interconnect to the 5412zl "hub" switch.

 

Power Controllers

  • 10 APC MasterSwitch AP9210 8 port power controllers.
    (The AP9210 is discontinued; replaced in the product line by the AP9211.)
  • 12 APC MasterSwitch AP7960 24 port power controllers.
  • 7 BayTech RPC27 20-port remote power controllers.
  • 24 BayTech RPC14 1U 8-port remote power controllers.

Racks

  • 13 Wrightline "Tech I" racks: 44U, 34" deep, 2 are 24" wide with cable management; the rest are 19" wide. (Aug 2001: these appear to have been discontinued or renamed.)
  • 12 Dell server racks.

 

Layout

All PC nodes have at least four Ethernet ports connected to the testbed backplane. All of the more than 2000 ports can be connected in arbitrary ways by setting up VLANs on the switches via remote configuration tools. The current PC and switch topology is shown here. One additional Ethernet port on each PC is connected to the core router, so each PC has a full-duplex 100Mbps or 1000Mbps connection to the servers. These connections are used for tasks such as dumping data off the nodes without interfering with the experimental interfaces. The only impact on the node is processor and disk use, and bandwidth on the PCI bus.
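The "programmable patch panel" amounts to assigning switch ports to VLANs so that only ports in the same VLAN can exchange traffic. As a rough illustration only (this is generic Cisco IOS syntax with invented VLAN and port numbers, not Emulab's actual configuration tooling, which drives the switches remotely), wiring two experimental node ports into a private LAN looks like:

```
! Create a VLAN for one experimental link/LAN
vlan 100
 name expt-lan0
!
! Place two node-facing ports into that VLAN.
! Traffic on these ports is now isolated from all other VLANs.
interface GigabitEthernet2/1
 switchport mode access
 switchport access vlan 100
!
interface GigabitEthernet2/7
 switchport mode access
 switchport access vlan 100
```

Tearing down an experiment is the reverse: the ports are moved out of the VLAN, returning them to the pool for the next arbitrary topology.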