Chemistry Packages at CHPC


Chemistry Packages at CHPC
Anita M. Orendt
Center for High Performance Computing
anita.orendt@utah.edu
Fall 2012

Purpose of Presentation
- Identify the computational chemistry software and related tools currently available at CHPC
- Present a brief overview of these packages
- Show how to access these packages on CHPC systems
- Next presentation: Gaussian09 & GaussView, 18 October 2012

Brief Overview of CHPC Resources
- Computational clusters: sanddunearch, updraft, and ember
- Home directory, NFS-mounted on all clusters:
  /uufs/chpc.utah.edu/common/home/<uNID>
  Generally not backed up (there are exceptions)
- Scratch systems:
  /scratch/serial: all clusters, 15 TB, NFS
  /scratch/general: updraft only, 3.5 TB, NFS
  /scratch/ibrix/chpc_gen: updraft and ember, 55 TB
  /scratch/local on the compute nodes (60 GB sanddunearch, 200 GB updraft, 400 GB ember)
- Applications:
  /uufs/arches/sys/pkg
  /uufs/chpc.utah.edu/sys/pkg
  /uufs/cluster.arches/sys/pkg, where cluster = sanddunearch, updraft, or ember

[Slide: diagram of the CHPC clusters and file systems]
- Sanddunearch: 156 nodes / 624 cores, Infiniband and GigE
- Updraft: 256 nodes / 2048 cores, Infiniband and GigE; 85 general-usage nodes plus owner nodes
- Ember: about 382 nodes / 4584 cores, Infiniband and GigE; 67 general-usage nodes; 12 GPU nodes (6 CHPC)
- Turret Arch and Telluride: meteorology clusters
- Storage: NFS-mounted home directories; /scratch/serial (all clusters, NFS); /scratch/general (updraft, NFS); /scratch/ibrix/chpc_gen (ember and updraft, IBRIX)

Getting Started at CHPC
- Account application is an online process:
  https://www.chpc.utah.edu/apps/profile/account_request.php
- Your username is your uNID, with the password administered by campus
- Interactive nodes: two CHPC-owned nodes per cluster (cluster.chpc.utah.edu), with round-robin access to divide the load
- CHPC environment scripts are placed in your account when it is created:
  www.chpc.utah.edu/docs/manuals/getting_started/codes/chpc.tcshrc
  www.chpc.utah.edu/docs/manuals/getting_started/codes/chpc.bashrc
- Getting started guide: www.chpc.utah.edu/docs/manuals/getting_started
- Problem reporting: http://jira.chpc.utah.edu or email issues@chpc.utah.edu

Interactive Node Usage
- Interactive nodes are for preparing and testing input files, analyzing results, compilation, debugging, data transfer, etc.
- No running of jobs: 15 minutes maximum CPU time
- No jobs of any length that negatively impact other users' ability to get work done (e.g., heavy CPU, memory, and/or I/O usage)
- Policy: https://wiki.chpc.utah.edu/display/policy/2.1.1+Cluster+Interactive+Node+Policy

Batch System
- All use of the compute nodes goes through a batch system, with Torque (PBS) for resource management and Maui (Moab) for scheduling
- Typical PBS directives:
  #PBS -S /bin/csh
  #PBS -A account
  #PBS -l walltime=24:00:00,nodes=2:ppn=8
- qsub -I gives interactive use of a compute node (add -X if X11 forwarding is needed)
- Walltime limits: 72 hours on sanddunearch, 24 hours on updraft, 72 hours on ember (long QOS by request)
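
A complete script is just those directives followed by the commands to run; a minimal sketch, assuming csh, with the job name, account name, file names, and application launch line as placeholders:

    #PBS -S /bin/csh
    #PBS -N chemjob                 # job name (placeholder)
    #PBS -A myaccount               # replace with your allocation account
    #PBS -l walltime=24:00:00,nodes=2:ppn=8

    # run from the directory the job was submitted from
    cd $PBS_O_WORKDIR

    # stage files to scratch, run, and copy results back
    set SCR = /scratch/serial/$USER/$PBS_JOBID
    mkdir -p $SCR
    cp myjob.in $SCR
    cd $SCR
    # ... launch your application here (e.g., via mpirun) ...
    cp myjob.out $PBS_O_WORKDIR

Submit it with qsub script.pbs and monitor it with showq or qstat.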

Allocation Procedure
- The allocation process is now handled within Jira:
  https://jira.chpc.utah.edu/secure/CreateIssue.jspa?pid=10130&issuetype=12
- Allocations are made on a calendar-quarter basis; the next requests are due about Dec 1 for Jan-Mar 2013
- You can request up to 4 quarters at a time
- Allocations for the different clusters are kept separate
- Sanddunearch is no longer allocated; it is all freecycle
- One allocation per research group, accessible by all members of the group

Out of Allocation Options
- If your group did not request an allocation, you can get a quick allocation (limited to 5000 units) on one of the clusters:
  https://jira.chpc.utah.edu/secure/CreateIssue.jspa?pid=10130&issuetype=13
- You can run in freecycle mode on sanddunearch; once a job starts, it runs until it finishes or reaches its requested walltime
- You can run as smithp-guest on the smithp nodes on ember, or in freecycle on the CHPC nodes on either ember or updraft (these jobs are preemptable)

Security Policies (1)
- No clear-text passwords: use ssh and scp
- Do not share your account under any circumstances
- Do not leave your terminal unattended while logged into your account
- Do not introduce classified or sensitive work onto CHPC systems
- Use a good password and protect it; see gate.acs.utah.edu for tips on good passwords

Security Policies (2)
- Do not try to break passwords, tamper with files, look into anyone else's directory, etc.; your privileges do not extend beyond your own directory
- Do not distribute or copy privileged data or software
- Report suspicions to CHPC (security@chpc.utah.edu)
- See http://www.chpc.utah.edu/docs/policies/security.html for more details

Access to Interactive Nodes
From a Windows machine you need an ssh client:
- PuTTY: http://www.chiark.greenend.org.uk/~sgtatham/putty/
- Xshell 4: http://www.netsarang.com/products/xsh_overview.html
For X windowing you also need a tool that can display graphical interfaces such as GaussView, ECCE, and AutoDockTools:
- Xming (use the Mesa version): http://www.straightrunning.com/XmingNotes
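
From a Linux or Mac machine no extra tools are needed; a typical session, with u0123456 standing in for your own uNID:

    # log into an ember interactive node with X11 forwarding enabled
    ssh -X u0123456@ember.chpc.utah.edu

    # copy a local input file to your CHPC home directory
    scp input.com u0123456@ember.chpc.utah.edu: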

Default Login Scripts
- CHPC maintains default login scripts that set up the environment needed for the batch commands and many of the programs to work:
  http://www.chpc.utah.edu/docs/manuals/getting_started/code/chpc.tcshrc
  http://www.chpc.utah.edu/docs/manuals/getting_started/code/chpc.bashrc
- They are placed in your home directory as .tcshrc or .bashrc
- You can comment out the setups for packages you do not use
- In the default scripts the chemistry package setups are commented out; remove the # at the start of a line to enable one
- Customize by creating a .aliases file, which is sourced at the end of the CHPC script; do not edit the .tcshrc/.bashrc file itself, as you will occasionally need to pick up a new version
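
To illustrate, enabling a package is a one-character change in the stock script, while personal settings live in ~/.aliases; a sketch (the package setup line shown is hypothetical, since the real lines are already present in the stock script, just commented out):

    # In ~/.tcshrc, remove the leading '#' from a package setup line, e.g.:
    #   source /uufs/chpc.utah.edu/sys/pkg/gaussian09/etc/g09.csh   (hypothetical path)

    # In ~/.aliases, sourced at the end of the CHPC script, add your own settings:
    alias ll 'ls -l'
    setenv EDITOR vim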

Location of Installations
CHPC uses a three-layer system for installations:
- /uufs/chpc.utah.edu/sys/pkg
  Accessible on all clusters and on CHPC-supported Linux desktops
- /uufs/arches/sys/pkg
  Accessible on all clusters, but not on desktops
- /uufs/cluster.arches/sys/pkg
  Single-cluster access, for builds customized to some feature (usually the MPI) of the given cluster

Software - General Information
- Available on the CHPC web pages/wiki; from www.chpc.utah.edu go to Software Documentation → More (for the complete list), or directly:
  http://www.chpc.utah.edu/docs/manuals/software
- Documentation is available for most packages, covering licensing restrictions, example batch scripts, and where to get more information on a specific package
- Also has useful information on running jobs (scaling, issues, etc.) for some applications

Types of Chemistry Packages
- Molecular mechanics/dynamics
- Electronic structure calculations
- Plane-wave pseudopotential packages (allow for periodic boundary conditions)
- Support packages
- Integrated modeling platforms

Molecular Mechanics/Dynamics
- Amber
- Gromacs
- NAMD
- LAMMPS
- CHARMM (licensed by research group)
Note: Gaussian and NWChem also have MM capabilities.

Quantum (Mainly) Packages
- Gaussian03/09
- NWChem
- GAMESS
- Molpro (serial only)
- Dalton (serial only)
- Orca
- Psi
- SIESTA (licensed by individual research group, but others can be added to the license at no charge)
- CFOUR

Plane Wave Pseudopotential Packages (good for periodic systems)
- Quantum Espresso
- CPMD
- VASP (licensed by group)
- BigDFT (wavelet basis set)

Integrated Molecular Platforms
- Schrodinger (owned by individual groups)

Support Packages
- GaussView
- Molden
- VMD
- NBOView
- Babel (OpenBabel)
- Dock
- AutoDock (and AutoDock Vina)
- Cambridge Structural Database
- ECCE: special case, installed on carbon.chpc.utah.edu; talk to me first if you want to use it

GaussView
- Molecular builder and viewer for Gaussian input/output files
- CHPC provides a campus license for the Linux version: for Gaussian03 the standard is version 4, for Gaussian09 it is version 5
- The Chemistry Department has a campus license for GV5 for Windows; groups can buy into it
- Start it with gv &, provided you have uncommented the Gaussian setup in the standard .tcshrc
- DO NOT submit jobs from within GaussView; instead, create and save the input file and use the batch system
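
Since jobs go through the batch system, the working part of a Gaussian batch script reduces to very little; a sketch, assuming the Gaussian09 setup is active in your .tcshrc and myjob.com is the input saved from GaussView:

    # run Gaussian09 on the saved input, writing the log file alongside it
    cd $PBS_O_WORKDIR
    g09 < myjob.com > myjob.log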

NBOView
- Package to view NBO orbitals
- Works only with G03 (not G09)
- For information see http://www.chem.wisc.edu/~nbo5/v_manual.htm
- To use at CHPC, add /uufs/chpc.utah.edu/sys/pkg/nboview/nboview1.1 to your path, then run nboview
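
In tcsh that amounts to the following (put the path line in ~/.aliases to make it permanent):

    # prepend the NBOView directory to the search path, then launch
    set path = (/uufs/chpc.utah.edu/sys/pkg/nboview/nboview1.1 $path)
    nboview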

Molden
- Another program for viewing molecular/electronic structures; version 4.7 is installed
- Works with GAMESS, Gaussian, and Molpro output
- Supports plots of electron density, MOs, etc.
- More information at http://www.cmbi.ru.nl/molden/molden.html
- To use at CHPC, make sure your path includes /uufs/chpc.utah.edu/sys/pkg/molden/std, then run molden &

VMD
- Visualization, mainly for MM/MD; the latest version (1.9.1) is installed
- Reads a number of different file formats
- Information at http://www.ks.uiuc.edu/Research/vmd
- Can also be installed on your own desktop (Windows/Mac/Linux versions available)
- Run /uufs/chpc.utah.edu/sys/pkg/vmd/vmd-std/bin/vmd (or uncomment the setup in .tcshrc and then just run vmd)
- Can be used for 3D viewing on CHPC's vis wall

ECCE
- Extensible Computational Chemistry Environment, developed at EMSL at PNNL; see ecce.pnl.gov for information
- A set of modules to manage computational chemistry jobs (Gaussian, NWChem) from start (including building the system) to finish (analyzing the output), all from your desktop
- Installed on carbon.chpc.utah.edu
- User accounts need setup; see me if you are interested, as carbon is not always kept running

OpenBabel
- Tool to interconvert structure files between the many formats used in molecular modeling
- See openbabel.org for more information
- To run:
  source /uufs/chpc.utah.edu/sys/openbabel/etc/babel.csh (or uncomment the line in your .tcshrc)
  babel -i<input-type> <infile> -o<output-type> <outfile>
- babel -H shows the usage, options, and supported input/output file types
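
For example, converting a PDB structure to XYZ format might look like this (file names are placeholders; run babel -H to confirm the format codes in the installed version):

    # set up the environment (tcsh), then convert
    source /uufs/chpc.utah.edu/sys/openbabel/etc/babel.csh
    babel -ipdb ligand.pdb -oxyz ligand.xyz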

Dock/AutoDock
- Programs to examine the binding of a small molecule within the active site of a receptor, usually a macromolecule
- Dock version 6.5 is installed; information at http://dock.compbio.ucsf.edu
  source /uufs/chpc.utah.edu/sys/pkg/dock/etc/dock.csh
  dock6.mpi to start (needs arguments)
- AutoDock version 4.2 is installed; information at http://autodock.scripps.edu
  source /uufs/chpc.utah.edu/sys/pkg/autodock/etc/autodock.csh
  autodock4 (with the proper arguments) or autogrid4
  autodocktools, a GUI interface, is installed; start it with adt &
- AutoDock Vina: multicore performance and enhanced accuracy; start it with vina &
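
A typical AutoDock sequence computes the affinity grids first and then runs the docking; a sketch, with the parameter files (normally prepared in AutoDockTools) as placeholders:

    source /uufs/chpc.utah.edu/sys/pkg/autodock/etc/autodock.csh
    autogrid4 -p receptor.gpf -l receptor.glg   # grid maps
    autodock4 -p ligand.dpf -l ligand.dlg       # docking run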

Cambridge Structural Database
- Moved from the library to CHPC in summer 2006; additions to the database are made every 3-4 months, and the package is updated annually
- See www.ccdc.cam.ac.uk for information
- A CHPC account is needed to use it
- From a PC you need Xterm/X-windowing software (PuTTY/Xming work well) to start a session on any of the interactive nodes
- source /uufs/chpc.utah.edu/sys/pkg/CSD/std/cambridge/etc/csd.csh (or uncomment the line in your .tcshrc)
- cq & starts ConQuest (the search engine); mercury & starts the crystal structure viewer
- The first time you use it on a given computer you will be asked to confirm licensing; provide the site/license codes (840/7E92A7)

Amber
- Molecular mechanics/dynamics package
- Current version is Amber 12 (Amber 11 is still active)
- Basic information on getting started: http://www.chpc.utah.edu/docs/manuals/software/amber.html
- For more information see http://ambermd.org; the Amber mail reflector archive and mailing list are very useful
- Different parallel builds are available for each cluster to make use of the different interconnects
- A GPU version is available; initial tests show impressive performance gains
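
Inside a batch script, a representative parallel production run looks roughly like this (a sketch; the file names are placeholders and the mpirun invocation depends on the cluster's MPI build):

    # 16-process MD run with the MPI build of pmemd
    mpirun -np 16 pmemd.MPI -O -i md.in -o md.out \
           -p prmtop -c inpcrd -r restrt -x mdcrd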

Other MM/MD Packages
All have builds optimized for the individual clusters:
- Gromacs: http://www.gromacs.org/ ; version 4.5.4 installed, with both single- and double-precision builds
- LAMMPS: http://lammps.sandia.gov/
- NAMD: http://www.ks.uiuc.edu/Research/namd/ ; version 2.6 on sanddunearch (2.7 is current)
- NWChem also has MM/MD capabilities, and Gaussian09 has limited MM capabilities

CHARMM
- Licensed per research group and for a specific version
- If interested, obtain a license and we can do an installation such that only your group has access
- See www.charmm.org for details

Gaussian03/09
- Commercial electronic structure package; see http://www.gaussian.com for information and the User's Guide
- The last installed revision of G03 is E.01, updated to include the latest NBO5:
  /uufs/chpc.utah.edu/sys/pkg/gaussian03/std
- Revision C.01 of G09 is installed:
  /uufs/chpc.utah.edu/sys/pkg/gaussian09/AMD64
  /uufs/chpc.utah.edu/sys/pkg/gaussian09/EM64T
- For information on accessing the CHPC installation:
  http://www.chpc.utah.edu/docs/manuals/software/g03.html

NWChem
- Package developed at PNNL to work on massively parallel systems; http://www.nwchem-sw.org
- Goal: computational chemistry solutions that are scalable with respect to both chemical system size and MPP hardware size
- Has quantum mechanics, molecular mechanics/dynamics, quantum molecular dynamics, and plane waves for periodic systems
- Version 6.1 (with Python) is installed; 5.1.1 is still available in the chpc.utah.edu branch
  /uufs/chpc.utah.edu/sys/pkg/nwchem/nwchem-6.1/bin/LINUX64
  /uufs/ember.arches/sys/pkg/nwchem/nwchem-6.1/bin/LINUX64 (not tested)
  /uufs/updraft.arches/sys/pkg/nwchem/nwchem-6.1/bin/LINUX64 (not tested)
- To run, source the appropriate etc/nwchem.csh; you must have a $HOME/pdir directory if running in parallel
- More information and an example batch script at http://www.chpc.utah.edu/docs/manuals/software/nwchem.html
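
Putting those pieces together, the run portion of an NWChem batch job might look like this (a sketch; the setup-script path and process count are illustrative, and myjob.nw is a placeholder input file):

    # environment setup (path illustrative; source the build you need)
    source /uufs/chpc.utah.edu/sys/pkg/nwchem/nwchem-6.1/etc/nwchem.csh
    mkdir -p $HOME/pdir                  # required for parallel runs
    mpirun -np 16 nwchem myjob.nw > myjob.out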

GAMESS
- General Atomic and Molecular Electronic Structure System; another option for most ab initio quantum calculations
- The 22 February 2006 version is installed; it can be updated if needed
  /uufs/chpc.utah.edu/sys/pkg/GAMESS/gamess
- http://www.msg.ameslab.gov/GAMESS for information on usage and capabilities
- Can run in parallel or serial
- For information on accessing the CHPC installation see http://www.chpc.utah.edu/docs/manuals/software/gamess.html
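
GAMESS jobs are conventionally launched through its rungms wrapper; a heavily hedged sketch (the wrapper location, version string, and processor-count arguments depend on the local installation, so check the CHPC documentation page above for the exact form):

    # run myjob.inp on 4 processors via the rungms wrapper (arguments illustrative)
    /uufs/chpc.utah.edu/sys/pkg/GAMESS/gamess/rungms myjob 00 4 > myjob.log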

Other Quantum Codes
Mainly codes that are specialized in terms of functionality:
- Dalton (serial build only): focus on property calculations at the HF, DFT, MCSCF, and CC levels of theory; version 2.0 installed, Dalton2011 recently released; http://dirac.chem.sdu.dk/daltonprogram.org/
- Molpro (serial build only): emphasis on highly accurate calculation methods; version 2006.1 installed, new 2012.1 available; http://www.molpro.net/
- Psi: http://www.psicode.org/
- Orca: http://www.thch.uni-bonn.de/tc/orca/
- CFOUR: http://www.cfour.de/

Plane Wave Codes
Plane-wave codes for the study of systems under periodic boundary conditions (PBC):
- NWChem has some functionality
- VASP: http://cms.mpi.univie.ac.at/vasp/ ; ab initio QM/MD; licensed per research group (one group has an old version)
- Quantum Espresso: http://www.quantum-espresso.org/ ; understands crystal space groups; has a GIPAW module for NMR calculations; J-ICE and XCrySDen can be used for viewing
- CPMD: http://www.cpmd.org/
- BigDFT: http://inac.cea.fr/L_Sim/BigDFT/

Schrodinger Suite
- Commercial package geared toward chemical modeling and simulation in the pharmaceutical field, specifically drug discovery; http://www.schrodinger.com/
- Interface: Maestro (free for academia)
- Calculation codes include Jaguar (quantum), MacroModel (MM), and QSite (QM/MM); interfaces with Desmond for MD (installed)
- Tools for structure-based drug design: docking, ligand design, binding affinities, screening libraries
- Year-to-year licensing; token-based, so the number of concurrent uses is limited
- Purchased by two groups in Medicinal Chemistry; if you are interested in using it, come talk to me

Finally...
- Let us know if some other package does something our current packages do not; we can look into the possibility of getting it. Factors: cost, hardware/OS requirements, licensing issues, usage needs
- Any questions? Contact me: anita.orendt@utah.edu, phone 801-231-2762, office 422 INSCC