Introduction

The Royal Holloway physics HPC cluster provides 220 cores across XXX CPUs. The cores are distributed between 11 compute nodes, each with 20 cores and XXX of memory. The compute nodes are connected by an InfiniBand interconnect with a bandwidth of 56 Gbit/s.
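
Since jobs on the cluster are scheduled with SLURM (see the guides below), a minimal batch script sized to this node geometry might look like the following sketch. The partition name and the job command are placeholders for illustration only; consult the SLURM guide for the cluster's actual partition and account settings.

```shell
#!/bin/bash
# Minimal SLURM batch script sketch for one full 20-core compute node.
# "compute" is a hypothetical partition name -- replace with the real one.
#SBATCH --job-name=example
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=20   # one task per core on a 20-core node
#SBATCH --time=01:00:00
#SBATCH --partition=compute

# Placeholder workload: print the allocated node's hostname from each task.
srun hostname
```

Submit with `sbatch script.sh` and monitor the queue with `squeue -u $USER`.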

Guides and notes

  • Infiniband and network
  • Configuration
  • SLURM
  • Modules
  • EasyBuild
  • xCAT tool for node administration
  • Accelerator Physics
  • Log book

  • Hpc farm 20160927
  • Hpc farm 20161012 (installation)
-- StewartBoogert - 27 Sep 2016


    Topic revision: r4 - 23 Jan 2017 - AntonioPerezFernandez1
     