ENDyne: Software for Dynamics of Electrons and Nuclei in Molecules
Developed by Dr. Yngve Öhrn and Dr. Erik Deumens, University of Florida
Presented by Jerry Perez, Texas Tech University
SURA, February 22-23, Washington D.C.

ENDyne Agenda
- What is the application?
- Why is it important to grid-enable?
- Description of the grid-enabling that's been done so far
- Discussion/detailing of next steps toward SURAgrid deployment
- Steps to be undertaken on Day 2 and plans beyond

What is the application?
ENDyne is an application that implements the Electron Nuclear Dynamics (END) theory for studying the interaction between molecular geometry and electronic structure in a time-dependent and self-consistent way. The theory is somewhat unfamiliar to most people and the software is not very user friendly. For that reason we do not make the code available to a general audience at this time. However, anyone interested in the code can come and study with us for about a month and take the code with them at the end. Please send email for more information.

Why is it important to grid-enable?
The code we use is ENDyne 2.7, written by Dr. Erik Deumens, University of Florida. A new code called CSTechG, featuring novel Coherent States applications, novel Dynamic Field Theory implementations, and our Compute Grid implementations, is under development in the Morales group. ENDyne scales nicely on grids, and a typical study requires many runs.

Why is it important to grid-enable?
- Calculate electron transfer processes in large molecules of biological interest
- Simulate gas-phase molecular collisions
- Nanotechnology and quantum computing

Description of the grid-enabling that's been done so far
- Grid enabling has been done at the campus-grid level.
- Grid enabling has been tested using Condor.
- Grid enabling has been tested using Globus.
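The slides don't show the test commands themselves. For reference, a deployment of this kind is typically smoke-tested with a trivial job before the real binary; a minimal sketch, assuming a GT2-style gatekeeper on antaeus.hpcc.ttu.edu (the TTU host named later in this deck) and a Condor pool on campus:

    # Check that the campus Condor pool is visible and the queue is reachable
    condor_status
    condor_q

    # Check Globus job submission with a trivial remote command
    grid-proxy-init
    globus-job-run antaeus.hpcc.ttu.edu/jobmanager-fork /bin/date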

Grid Enabling Applications for the Grid: ENDYNE
[Figure: collision geometry for an H+ projectile (Hydrogen 3) approaching a C2H2 target (Carbon 1, Carbon 2, Hydrogen 1, Hydrogen 2) at separation d = 15 a.u., with impact parameter b, momentum p, and orientation angles shown relative to the x, y, z axes.]

How does ENDYNE run on a cluster?
A batch script was written for the ENDYNE users and stored at the location given by an environment variable called DYNROOT$. The script that ran single-processor jobs was called "endyne". The script called "run" submits multiple ENDYNE jobs to the queue by reading multiple input files from the directory it is run in.
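The deck doesn't reproduce the "run" script itself. A minimal sketch of what it might look like, assuming a PBS-style batch system (the qstat command on the following slide suggests one) and the endynejob wrapper described there; the loop body and variable handling are illustrative:

    #!/bin/sh
    # run: submit one ENDYNE job per input file in the current directory.
    # The slides write the variable as DYNROOT$; in a POSIX shell it
    # would be referenced as $DYNROOT.
    for input in *.in; do
        # endynejob is the per-job wrapper that runs endyne on one input
        # file and handles its output files
        qsub -v INPUT="$input" "$DYNROOT/endynejob"
    done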

How does ENDYNE run on a cluster?
Before any jobs are submitted, the files must be prepared for running:
1. Prepare the input file endyne_H+HF_opt_pvdz.in.
2. Optimize it with the command: endyne endyne_H+HF_opt_pvdz.in > inin
3. Collect the zoca parameters using the collect.pe program: ./collect.pe > outout
4. Prepare the template endyne_H+HF_run_pvdz.tmpl.in and insert the zoca parameters from the outout file.
5. Modify the "run" file to point to the path of the input files.
6. Modify the endynejob file to handle the output files.
7. Type ./run to submit the jobs.
8. Type qstat to check on the jobs.
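Step 4 is the one hand-editing step in the sequence. A sketch of how it could be scripted, assuming the template marks the insertion point with a placeholder line; the placeholder name ZOCA_PARAMS is hypothetical, not from the slides:

    # Splice the zoca parameters collected in "outout" into the run
    # template, replacing the hypothetical placeholder line ZOCA_PARAMS.
    sed -e '/ZOCA_PARAMS/r outout' -e '/ZOCA_PARAMS/d' \
        endyne_H+HF_run_pvdz.tmpl.in > endyne_H+HF_run_pvdz.in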

Grid enabling ENDYNE on a Cycle-Scavenging Grid
- Had to remove environment dependencies (DYNROOT$).
- Had to recompile ENDYNE for uniprocessor execution.
- Test the recompiled program locally before moving it onto the grid!
- Had to register the libraries and necessary input files with the grid.
- Had to teach researchers how to use the TechGrid campus-wide grid. Approximately 3 hours of instruction, plus documentation, was needed to get them on their feet.
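Condor was the cycle-scavenging layer tested earlier. A minimal sketch of a submit description for the statically recompiled, uniprocessor binary, relying on Condor's file transfer to carry the registered input files along; the file names are illustrative, not from the slides:

    # Submit the static endyne binary to the Condor pool, letting Condor
    # transfer the input file and bring the output back on exit.
    cat > endyne.sub <<'EOF'
    universe    = vanilla
    executable  = endyne
    arguments   = endyne_H+HF_run_pvdz.in
    transfer_input_files    = endyne_H+HF_run_pvdz.in
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    output = endyne.$(Cluster).$(Process).out
    error  = endyne.$(Cluster).$(Process).err
    log    = endyne.log
    queue
    EOF
    condor_submit endyne.sub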

Grid enabling ENDYNE on a Globus Grid
If all sites wish to use ENDYNE, one way is to create an environment variable called DYNROOT$ at each site. Another way is to do away with the environment dependencies and recompile for static execution, then copy the binary between sites with GridFTP:

    globus-url-copy -vb -p 20 -dbg \
        gsiftp://antaeus.hpcc.ttu.edu:2811/home/addepall/GRID/endyne \
        gsiftp://buda.tacc.utexas.edu:2811/home/addepall/endyne
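Once the static binary is staged with GridFTP, it can be launched at the remote site through a Globus job manager. A minimal sketch; only the host names come from the slide, while the jobmanager-pbs contact and the input file path are assumptions:

    # Obtain a proxy credential, then run the staged binary remotely
    grid-proxy-init
    globus-job-run buda.tacc.utexas.edu/jobmanager-pbs \
        /home/addepall/endyne endyne_H+HF_run_pvdz.in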

Discussion/detailing of next steps toward SURAgrid deployment
- Create accounts for ENDyne researchers on other SURAgrid machines. Jobs are ready to run.
- Which machines in SURA are cross-certified?
- How many CPUs are available?

Steps to be undertaken on Day 2 and plans beyond
- Create certificates for Dr. Maiti and Dr. Yan of Texas Tech.
- Create accounts.
- Transfer input files.
- Create scripts and instructions for new grid users.
- ENDyne creates about 1.5 GB of output per run, and we need to run thousands!