Dan Wheeler edited this page Jun 7, 2022 · 9 revisions

Overview

The Turing cluster was purchased with funds from the ITS FY2017 Budget in May of 2017.  Much like the Biomath cluster, it is optimized for highly parallel jobs requiring very fast Infiniband networking between nodes.  Anyone at Colgate is allowed to use the Turing cluster; please send an email to ITSHelp to get an account.

Turing consists of 20 compute nodes, a master (scheduler) node, and a storage node, all connected via an FDR Infiniband fabric.

The job scheduler software, PBS Pro, handles the submission and running of interactive and non-interactive (batch) jobs on the cluster.
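As a sketch of what a non-interactive submission looks like, here is a minimal PBS Pro batch script. The job name, resource requests, and program name are all placeholders, not Turing-specific settings; check with ITS (or `qstat -Q`) for the queues actually configured on the cluster.

```shell
#!/bin/bash
# Minimal PBS Pro batch script; all names below are illustrative.
#PBS -N example_job          # job name
#PBS -l select=2:ncpus=32    # request 2 chunks of 32 cores (one full node each)
#PBS -l walltime=01:00:00    # 1 hour wall-clock limit
#PBS -j oe                   # merge stdout and stderr into one output file

cd $PBS_O_WORKDIR            # run from the directory qsub was called in
mpirun ./my_parallel_program # placeholder MPI program
```

Submit the script with `qsub job.sh` and monitor it with `qstat`.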

Hardware Specs

Network

The master and all compute nodes are directly attached to an FDR Infiniband fabric.  The fabric supports transport speeds of about 54 Gbit/s with latencies around 0.7 microseconds.  The fabric is configured to support IPoIB (IP over Infiniband), which allows standard TCP/IP programs and code to run unchanged over the Infiniband network.  This makes writing code for the fabric very easy, with minimal protocol overhead.
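Because of IPoIB, ordinary networking tools work over the fabric with no changes. A quick sketch, assuming the IPoIB interface is named `ib0` and using a placeholder hostname for a neighboring node's Infiniband address:

```shell
# Show the IPoIB interface and its IP address
# (the interface name ib0 is an assumption):
ip addr show ib0

# Any standard TCP/IP tool then runs over the fabric, e.g.
# pinging another node's IPoIB address (hostname is a placeholder):
ping -c 3 n02-ib
```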

The storage node and the Infiniband switch both support EDR Infiniband, which has speeds of about 96 Gbit/s and a latency of 0.5 microseconds.

A separate 1 Gbit/s Ethernet network is used for general maintenance only.

The master node and storage node are both directly connected to the Colgate campus network via standard Ethernet.

Master Node

| Hostname | Description | Processor | Processor Generation | Total Cores | Speed (GHz) | Cache (MB) | RAM (GB) |
|---|---|---|---|---|---|---|---|
| Turing.colgate.edu | Dell PowerEdge R530 | Xeon E5-2620 v4 | Broadwell | 8 | 2.1 | 20 | 64 |

 

Compute Nodes

| Hostname | Description | Processor | Processor Generation | Total Cores | Speed (GHz) | Cache (MB) | RAM (GB) |
|---|---|---|---|---|---|---|---|
| n01 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n02 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n03 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n04 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n05 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n06 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n07 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n08 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n09 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n10 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n11 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n12 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n13 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n14 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n15 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n16 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n17 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n18 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n19 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |
| n20 | Dell PowerEdge R430 | Dual [Xeon E5-2683 v4](https://ark.intel.com/products/91766/Intel-Xeon-Processor-E5-2683-v4-40M-Cache-2_10-GHz) | Broadwell | 32 | 2.1 | 40 | 128 |

 

Compute Totals

| Total Compute CPU Cores | Total Compute RAM (GB) |
|---|---|
| 640 | 2560 |

 

Storage

| Hostname | Mount | Free Space (TB) |
|---|---|---|
| turing-storage.colgate.edu | /home | 15 |

Storage is housed on a Dell PowerEdge R730.  The CPU is a Xeon E5-2623 v3 (3.0 GHz) with 128 GB of RAM.  The storage array is composed of sixteen 1.2 TB (10,000 RPM) SAS hard drives in a ZFS RAID-Z2 configuration.  This is similar to a RAID-6 configuration, but with the additional features of the ZFS file system.

Access

Files stored on the Turing cluster can be accessed via ssh, sftp, and scp.  User home directories on Turing can also be reached via Samba from a Mac, PC, or Linux workstation.

| Protocol | Address |
|---|---|
| sftp/scp | turing-storage.colgate.edu |
| Samba (from PC) | \\turing-storage.colgate.edu\ |
| Samba (from Mac) | smb://turing-storage.colgate.edu/ |
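For example, copying a file into your Turing home directory with scp looks like this; the username and file name are placeholders:

```shell
# Copy a local file into your home directory on Turing
# ("results.dat" and "yourusername" are placeholders):
scp results.dat yourusername@turing-storage.colgate.edu:~/

# Or start an interactive sftp session instead:
sftp yourusername@turing-storage.colgate.edu
```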

Please see Network Drives on Turing Cluster for more information about using samba.

 

Backups

The /home partition on storage.cluster is snapshotted daily at midnight.  Snapshots are retained for two weeks.

The /shared partition on storage.cluster is snapshotted weekly, on Sunday at midnight.

These snapshots are visible under /home/.zfs/snapshot/GMT-(date)/.  Simple command-line Unix tools can be used to copy files from a snapshot back to a writable location in order to recover them.
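A recovery sketch using those tools; the snapshot name, username, and file path below are hypothetical placeholders:

```shell
# List the available snapshots (names encode the snapshot date):
ls /home/.zfs/snapshot/

# Copy a file out of a (read-only) snapshot back into your home
# directory; the snapshot name and file path are placeholders:
cp /home/.zfs/snapshot/GMT-2022.06.01-00.00.00/yourusername/thesis.tex ~/thesis.tex
```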
