COMPUTING METHODS

Calculated with HF radar surface currents: hourly back-projections of water particles, out through 40 days in the past, from the Pt. Buchon SMCA/SMR; repeated daily with surface currents measured from October 2006 through October 2007.

With the scope of this initial investigation limited to the single MPA at Pt. Buchon, one desktop-class workstation running MATLAB® was able to complete the necessary calculations on over a year's worth of data in 12 hours.

Example: if the currents at a given location flowed south at a speed of 10 cm s⁻¹ over the course of an hour (1 hour = 3,600 seconds), then the waters at that location would have been 360 m to the north the previous hour (3,600 s × 10 cm s⁻¹ = 36,000 cm = 360 m). Continuing to "step backwards" each hour in this fashion yielded the trajectories of source waters and their transit times.

Back-projection and Prediction with High-Frequency (HF) Radar Ocean Surface Currents: Computing Methodology to Expand Product Coverage

Brian Zelenke (Center for Coastal Marine Sciences, California Polytechnic State University, San Luis Obispo, California 93407-0401 U.S.A.; bzelenke@calpoly.edu)
Mark A. Moline, Ph.D. (Center for Coastal Marine Sciences, California Polytechnic State University, San Luis Obispo, California 93407-0401 U.S.A.; mmoline@calpoly.edu)

ABSTRACT

In consideration of such investigations as evaluating the connectivity between marine protected areas (MPAs) and projecting the trajectories of water parcels, it has been necessary to consider multi-year datasets of ocean surface current vectors measured with a network of CODAR Ocean Sensors, Ltd. SeaSonde® HF-radar stations. By examining several years of measured ocean surface current data, seasonal and annual regimes can begin to be characterized, and increased predictive accuracy can be obtained through comparison of the multiple realizations of the surface current field available for each time period of interest.
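The hourly back-stepping arithmetic in the COMPUTING METHODS example above can be sketched in code. The poster's calculations were performed in MATLAB; the following is only an illustrative Python sketch (the function and variable names are ours, and a locally flat coordinate frame is assumed):

```python
# Illustrative sketch (not the authors' MATLAB code): one hourly
# "back-step" of a water particle using a measured current vector.
DT = 3600.0  # one hour, in seconds

def back_step(x_m, y_m, u_cms, v_cms, dt=DT):
    """Return the position (metres, local frame) the water occupied one
    time-step earlier, given eastward/northward current speed in cm/s.
    Moving backwards in time reverses the displacement."""
    dx = u_cms / 100.0 * dt  # cm/s -> m/s, then metres over dt
    dy = v_cms / 100.0 * dt
    return x_m - dx, y_m - dy

# Poster example: currents flowing south at 10 cm/s for one hour mean
# the water was 360 m to the north an hour earlier.
x_prev, y_prev = back_step(0.0, 0.0, 0.0, -10.0)
print(y_prev)  # 360.0 (metres north of the current position)
```

Chaining 960 such steps (40 days of hours) per grid point reproduces the trajectory lengths quoted in the poster.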
The computational requirements of developing products from datasets such as these at the local scale can be manageable with single instances of a program running on individual workstations. However, expanding HF-radar products that rely on long-term datasets to the regional or national scale involves a volume of data that benefits considerably, in both speed and precision, from higher-performance computing techniques. Specifically, grid computing, parallelization, and server virtualization can be used individually and in combination to increase the scope available to HF-radar data products while more efficiently utilizing the computational resources available. Here we consider two HF-radar products under development (a matrix of the connectivity between oceanic central California MPAs, and water-parcel trajectory tracking) and the computing methods available to expand their coverage.

INTRODUCTION: MPA Connectivity

Where does the water in the MPAs come from, and how long does it take to get there (e.g., from other MPAs)?

Trajectories of a 1 km resolution grid of water particles were back-projected from the Pt. Buchon MPA each hour, out through 40 days in the past, from 365 starting days, producing a map of where surface waters travel over a 40-day period to reach the Pt. Buchon MPA, and a visualization of the length of time the waters travel along these paths. By comparing the travel times of those back-projected track-points that fell within other central California MPAs, the connection time between the Pt. Buchon MPA and adjacent MPAs was calculated. Repeating these calculations for the other central California MPAs would yield a connectivity matrix between all the MPAs in the region. The development of this product is intended to address a challenge central to the design and management of the MPA network: accurately understanding the larval transport that links the MPAs and forms the foundation of the ecosystem.

Figure 1. Central California MPAs.
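The connection-time comparison described above (testing which back-projected track-points fall within another MPA, and how long the water took to get there) can be illustrated with a simplified sketch. This is not the authors' MATLAB code: the neighbouring MPA is idealised as a lon/lat bounding box, and all coordinates and times below are invented for illustration:

```python
# Simplified sketch of the connectivity calculation: for each track-point,
# test membership in a neighbouring MPA (idealised as a lon/lat box) and
# keep the shortest travel time at which the track enters it.

def connection_time(track, mpa_box):
    """track: list of (lon, lat, hours_back) tuples.
    mpa_box: (lon_min, lon_max, lat_min, lat_max).
    Returns the minimum travel time (hours) at which the track is inside
    the box, or None if it never enters."""
    lon0, lon1, lat0, lat1 = mpa_box
    hits = [h for lon, lat, h in track
            if lon0 <= lon <= lon1 and lat0 <= lat <= lat1]
    return min(hits) if hits else None

# Toy example: a track that drifts into the neighbouring MPA after 30 hours.
track = [(-121.0, 35.2, 0), (-121.3, 35.4, 30), (-121.4, 35.5, 60)]
box = (-121.35, -121.25, 35.35, 35.45)
print(connection_time(track, box))  # 30
```

Tabulating such minima over all trajectories and all MPA pairs fills in the connectivity matrix; the real calculation would use polygon boundaries rather than boxes.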
Figure 2. Shown is an aerial view of the Point Buchon region off the San Luis Obispo County, California coast, with the border of the Pt. Buchon MPA (SMCA/SMR) outlined in yellow. Inset within the Pt. Buchon MPA boundary are red dots at each of the 45 grid points, spaced 1 km apart, from which the paths of the surface waters into the MPA were back-projected.

4th Radiowave Operators Working Group (ROWG) Meeting. Poster available for download at ftp://marine.calpoly.edu/zelenke

Figure 3. A map of 15,768,000 hourly back-projections of water particles, out through 40 days (960 hours) in the past, into the Pt. Buchon MPA; repeated daily for each of the 45 1-km grid points with surface currents measured from October 2006 to October 2007 (45 trajectories × 960 time-steps × 365 start-days = 15,768,000 back-projected points). These water-particle track-points are color-coded to show the travel time (up through 40 days) the waters took to reach the Pt. Buchon MPA.

Table 1. Percentage of the 16,425 trajectories (45 trajectories × 365 days) back-projected through 40 days intersecting the Pt. Buchon MPA, and the average connection time.

FUTURE WORK

Expanding the product to cover all California MPAs. To simultaneously run the same calculations for multiple MPAs while increasing the resolution and time-scale of the measurements:
- invoke parallelized functions within MATLAB to utilize multiple processors/cores
- distribute jobs for each MPA across a grid of workstations

These computing-architecture changes will allow simultaneous, coordinated model calculations over all MPA regions.
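The job-distribution pattern proposed in FUTURE WORK (one independent back-projection job per MPA, fanned out across workers) might look like the following. The poster envisions MATLAB parallel functions on a grid of workstations; this Python sketch only illustrates the dispatch pattern, and the MPA list beyond Pt. Buchon and the placeholder job body are our assumptions:

```python
# Illustrative only: fan independent per-MPA jobs out to a pool of workers.
# The real computation is CPU-bound and would run on separate processes,
# MATLAB workers, or grid nodes; a thread pool keeps the sketch runnable.
from concurrent.futures import ThreadPoolExecutor

MPAS = ["Pt. Buchon", "Piedras Blancas", "Point Lobos"]  # illustrative list

def run_backprojection(mpa_name):
    # Placeholder for the 40-day, 365-start-day back-projection of one MPA.
    return mpa_name, "done"

with ThreadPoolExecutor(max_workers=len(MPAS)) as pool:
    results = dict(pool.map(run_backprojection, MPAS))

print(results)
```

Because each MPA's back-projection is independent of the others, this is an embarrassingly parallel workload, which is why a workstation grid scales it cleanly.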
INTRODUCTION: Trajectory Projection

What is the likely path along which the currents would take a floating object from its last known position in the ocean?

On February 16, 2008, 27-year-old San Luis Obispo man Chris Partridge drowned after being swept off the Limekiln State Park beach by a wave, after which his body washed ashore at a different location. An investigation was undertaken to develop a data product using ocean surface current measurements to help predict the location of a person lost at sea. The product developed was used to calculate the likely path the currents would have taken the floating body from its last known position in the ocean. Application of such a product to search-and-rescue (SAR) planning reduces the size of search areas, allowing SAR resources to be deployed more efficiently and with a greater chance of success.

Timeline:
- Saturday, February 16, 2008, midnight (PST): San Luis Obispo man Chris Partridge, 27, swept off the Limekiln State Park beach by a rogue wave.
- Saturday, February 16, 2008, midnight to afternoon: Search conducted by Monterey County Search & Rescue, the Big Sur Fire Brigade, and the Coast Guard. By early afternoon, all agencies had ended search-and-recovery efforts.
- Sunday, February 17, 2008, evening: Body of Chris Partridge found along the northern Limekiln State Park shoreline.
- Monday, February 18, 2008, afternoon: Dr. Mark Moline (Cal Poly) reads an article reporting that Chris Partridge was swept offshore Saturday (the discovery of the body was not made public until Tuesday, February 19, 2008). Dr. Moline asks Brian Zelenke (Cal Poly) to calculate a track based on SCM measurements along which to search for the body, and sends Brian Zelenke's predictions to Aric R. Johnson (USCG). The USCG confirms that they have already found the body, at the location of the eddy shown in the drift track.
Figure 4. Location of Limekiln State Park beach. (Map labels: Limekiln State Park beach; body found; predicted path of a surface drifter from SCM.) This image and its data (latitude, longitude, and time) were sent to the USCG. Red dots represent the position predicted each hour (February 16, 2008, 12:00 A.M. PST through February 18, 2008, 5:00 A.M. PST). The prospective search path starts at the green dot and continues to the large red dot at the end. The drift track intentionally does not stop near shore, in case the body had remained afloat rather than washing ashore as it did. If the body was still floating, continuing the track would have allowed the USCG to find the nearest red dot in time and continue searching around and along the track from there.

COMPUTING METHODS

Calculated with HF radar surface currents: the trajectory along which the currents would have transported a floating object within a known parcel of water.

With MATLAB identified as the best platform to speed code development, the particle-tracking model was written to project the trajectory through each hourly set of vectors measured after the report of the victim's last known position. The U and V components were separately linearly interpolated to the latitude and longitude at each hourly time-step to provide an estimate of the velocity and position along the particle track. Spatial and, if necessary in the case of coverage loss, temporal linear interpolation of the vector components eliminated the need for model gridding and for more computationally time-consuming methods of mode-based gap filling. Additionally, linear interpolation provided a straightforward method of extrapolating the surface current field to locations outside areas of data coverage (e.g., nearshore waters).

Figure 5. Path projected (red line) from the green dot to the large red dot: the start at the Limekiln State Park beach (green) and the end ashore (red) of an object set adrift at the same place and time as Chris Partridge.
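The interpolation step described in the COMPUTING METHODS above can be sketched as follows. This is a hedged illustration, not the authors' MATLAB model: the U and V components are linearly interpolated (here, bilinearly from four surrounding grid values) to the particle position, which is then advanced one hour. All names and the metric coordinate frame are our assumptions:

```python
# Sketch of one hourly trajectory step: interpolate U and V to the
# particle position, then advance by velocity * dt.

def lerp(a, b, t):
    """Linear interpolation between a and b, with t in [0, 1]."""
    return a + (b - a) * t

def bilinear(f00, f10, f01, f11, tx, ty):
    """Bilinear interpolation of four grid-corner values."""
    return lerp(lerp(f00, f10, tx), lerp(f01, f11, tx), ty)

DT = 3600.0  # one hour, in seconds

def step_forward(x, y, u_corners, v_corners, tx, ty, dt=DT):
    """x, y in metres (local frame); *_corners = (f00, f10, f01, f11)
    current components in m/s at the four surrounding grid points;
    tx, ty = fractional position of the particle within the grid cell."""
    u = bilinear(*u_corners, tx, ty)
    v = bilinear(*v_corners, tx, ty)
    return x + u * dt, y + v * dt

# Particle midway across a cell in a uniform 0.1 m/s eastward flow:
x1, y1 = step_forward(0.0, 0.0, (0.1, 0.1, 0.1, 0.1),
                      (0.0, 0.0, 0.0, 0.0), 0.5, 0.5)
print(x1, y1)  # 360.0 0.0
```

Because linear interpolation extends naturally beyond the convex hull of the data when the weights run outside [0, 1], the same formula can extrapolate the current field into nearshore gaps, as the poster notes.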
Projected path from the large red dot to the yellow dot: subsequent projections continued in time from the point of landfall, in case the drifter did not remain washed ashore. The projected drift track indicated landfall north of the Park after one tidal cycle (yellow dot), corresponding to the same time and place at which Chris Partridge's body was recovered by the U.S. Coast Guard.

FUTURE WORK

Developing the product to track multiple spill and SAR incidents at the same time, in disparate regions. To enhance spill response and SAR planning, the MATLAB code must incorporate real-time data calculation with redundant fault tolerance. Solutions have been identified to provide these operations while maintaining the extensibility of the MATLAB platform:
- run multiple instances of MATLAB on blade servers in the University's high-performance grid-computing environment
- virtualize servers

By adapting the model's supporting computing infrastructure to grid computing, the necessary real-time execution speed can be achieved without additional model coding. Further, virtualizing the supporting servers will allow for the needed fault tolerance while better utilizing existing, underused computer resources.