HP-SEE: Firefly Scientific Software
Jelena Randjelovic, PhD Student, Faculty of Pharmacy, UOB
The HP-SEE initiative is co-funded by the European Commission under the FP7 Research Infrastructures contract no

Firefly  ab initio and DFT software  SCF  DFT  CI, CIS  MP2, MP3, MP4  TDDFT  solvation models  no support for CC (coupled cluster) and FMO methods  32 bit mpich/openmpi libraries needed to run Firefly on 64 bit systems  Obtainable at: Tuning and Optimization of HPC application – Belgrade, Serbia - 01 June 20122

Example PBS submission script

#!/bin/bash
#PBS -q hpsee
#PBS -l nodes=2:ppn=8
#PBS -l walltime=200:00:00
#PBS -e ${PBS_JOBID}.err
#PBS -o ${PBS_JOBID}.out

cd $PBS_O_WORKDIR
cat $PBS_NODEFILE
${MPI_MPICH_MPIEXEC} /opt/exp_soft/hpsee/firefly/firefly -ex /opt/exp_soft/hpsee/firefly/
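A minimal usage sketch, assuming the script above is saved as firefly.pbs (a hypothetical name) on a PBS/Torque cluster:

qsub firefly.pbs      # submit the job; prints the assigned job ID
qstat -u $USER        # monitor your queued and running jobs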

Some Command Line Options
- -ex: specifies the path from which run-time extension files are copied, e.g. -ex /opt/exp_soft/hpsee/firefly/
- -b: specifies an external basis set library file
- -i, -o: override the default input/output file names (the default input is a file named "input")
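For illustration, a sketch of calling Firefly directly with explicit input and output files; water.inp and water.out are hypothetical names, while the binary and -ex paths are those used in the PBS script above:

/opt/exp_soft/hpsee/firefly/firefly -ex /opt/exp_soft/hpsee/firefly/ -i water.inp -o water.out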

Example Input File

 $contrl scftyp=RHF runtyp=ENERGY mplevl=2 units=ANGS icharg=0 mult=1 $end
-$contrl exetyp=check $end
 $system MWORDS=256 $end
 $basis gbasis=N31 ngauss=6 ndfunc=1 nffunc=0 npfunc=1 diffsp=.TRUE. $end
 $guess guess=huckel $end
 $data
Water MP2/6-31+G(d,p) single point energy calculation
Cnv 2

O
H
 $end
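As a runnable illustration of the same single-point calculation, here is the input with an assumed standard water geometry (O-H about 0.96 A, H-O-H about 104.5 degrees); the coordinates are illustrative and were not part of the original slide. With Cnv 2 symmetry only the symmetry-unique atoms are listed, each preceded by its nuclear charge:

! RHF/MP2 single point on water; coordinates are an assumed standard geometry
 $contrl scftyp=RHF runtyp=ENERGY mplevl=2 units=ANGS icharg=0 mult=1 $end
 $system MWORDS=256 $end
 $basis gbasis=N31 ngauss=6 ndfunc=1 nffunc=0 npfunc=1 diffsp=.TRUE. $end
 $guess guess=huckel $end
 $data
Water MP2/6-31+G(d,p) single point energy calculation
Cnv 2

O 8.0   0.0000000   0.0000000   0.1173000
H 1.0   0.0000000   0.7572000  -0.4692000
 $end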

More Examples

 $contrl scftyp=rhf runtyp=optimize coord=zmt $end
 $system memory= $end
 $basis gbasis=sto ngauss=3 $end
 $guess guess=huckel $end
 $pcm solvnt=water $end
 $data
a water molecule solvated by PCM water
Cnv 2

O
H 1 rOH
H 1 rOH 2 aHOH

rOH=0.95
aHOH=104.5
 $end
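When the optimization finishes, the converged structure can be extracted from the output file. A minimal sketch, assuming the output is named water_pcm.out (hypothetical) and that, as in GAMESS-style outputs, the final structure follows the line "EQUILIBRIUM GEOMETRY LOCATED":

grep -c "EQUILIBRIUM GEOMETRY LOCATED" water_pcm.out     # 1 if the geometry converged
grep -A 12 "EQUILIBRIUM GEOMETRY LOCATED" water_pcm.out  # print the final coordinate block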

More Examples

! HESSIAN JOB TRIAL
 $CONTRL SCFTYP=RHF DFTTYP=B3LYP1 runtyp=hessian $END
 $CONTRL COORD=CART UNITS=ANGS $END
 $SYSTEM MEMORY= $END
 $BASIS extfil=.t. GBASIS=SVP $END
 $DATA
cisOH_closeTS
C1
C
N
C
O
Pd
O
P
H
H
 $END
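Because this input sets extfil=.t. with GBASIS=SVP, the SVP basis has to be read from an external basis set library file supplied with the -b option shown earlier. A hedged sketch of the corresponding command line, where basis_set.lib, hessian.inp and hessian.out are hypothetical names and the Firefly paths are those from the PBS script:

${MPI_MPICH_MPIEXEC} /opt/exp_soft/hpsee/firefly/firefly -ex /opt/exp_soft/hpsee/firefly/ -b basis_set.lib -i hessian.inp -o hessian.out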

More Examples

Final energies (Eh) at the DFT/def2-SVP level, with their difference in kJ/mol (numerical values shown on the original slide).
