Dave Shafer, ITS Systems & Platforms
June 25, 2010
Central File Storage
- Base Entitlement: Individuals 1-5 GB; Departments 1 GB per FTE
- Recovery Objective: 4 hours; asynchronous replication to an identical system at a remote data center
- Uptime: 99.97%; less than 1 hour of downtime in the past year
- Participation: 89% (33K of 37K eligible users)
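The participation and uptime figures above can be sanity-checked with quick arithmetic. This is an illustrative sketch, not part of the deck; the 8,760-hour year is the only added assumption.

```python
# Quick arithmetic behind the Central File Storage figures (illustrative).

enrolled, eligible = 33_000, 37_000
print(f"Participation: {enrolled / eligible:.1%}")   # ~89.2%, matching the quoted 89%

hours_per_year = 365 * 24                            # 8,760 hours
# Downtime budget implied by 99.97% uptime:
budget_hours = (1 - 0.9997) * hours_per_year
print(f"99.97% uptime allows {budget_hours:.1f} h/year; observed was < 1 h")
```

The observed downtime (< 1 hour) comfortably beats the budget implied by the 99.97% figure (about 2.6 hours per year).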
The Effect
- “One price fits all” network file storage
- Lower-cost network file storage
- Lowest-cost bare server storage
Storage Tiers

High Performance Enterprise SAN
  Target Users: Centrally managed servers
  Funding Model: Centrally funded
  Monthly Price per Gigabyte: N/A

Central File Storage
  Target Users: End users or distributed servers
  Funding Model: Base entitlement, pay for more
  Monthly Price per Gigabyte: $0.12 – $0.25

Low Cost SAN
  Target Users: Distributed servers
  Funding Model: Pay for use
  Monthly Price per Gigabyte: $0.03 – $
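To make the pricing concrete, a hypothetical monthly bill can be computed from the per-gigabyte rates above. The 500 GB workload is invented for illustration, and since the Low Cost SAN's upper bound is truncated on the slide, only its $0.03 floor is used.

```python
# Hypothetical monthly cost at the published per-GB rates (workload size invented).

rates = {
    "Central File Storage": (0.12, 0.25),  # $/GB/month, low to high
    "Low Cost SAN": (0.03, 0.03),          # slide's upper bound is truncated; floor only
}

gigabytes = 500
for tier, (low, high) in rates.items():
    print(f"{tier}: ${gigabytes * low:.2f} to ${gigabytes * high:.2f} per month")
```

At these rates, 500 GB runs $60 to $125 per month on Central File Storage versus roughly $15 on the Low Cost SAN floor, which illustrates why tier placement matters.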
Moving Forward
- Interviewed partners, peers
- Categorized storage
- Analyzing data
- Storage Services Roadmap

11 peer institutions; 17 campus units; 2.7 petabytes of campus storage
Peer Institutions
1. Indiana
2. Iowa State
3. Michigan
4. Michigan State
5. Minnesota
6. Northern Iowa
7. Northwestern
8. Ohio State
9. Penn State
10. Purdue
11. Wisconsin
Campus Units
1. College of Education
2. College of Engineering
3. College of Law
4. College of Liberal Arts and Sciences
5. College of Nursing
6. College of Pharmacy
7. College of Public Health
8. Department of Biology
9. Department of Psychology
10. Division of Mathematical Sciences
11. Graduate College
12. International Programs
13. Office of the Vice-President for Research
14. College of Business
15. Information Technology Services
16. Institute for Clinical and Translational Science
17. University of Iowa Libraries
Campus Storage by Provider

Storage by University Mission

Storage by Access Type
Issues
- Research storage: fragmented; poor data protection
- Backup storage: inefficient; no central service
- Archival storage: few options; shifts to online or backup storage
- Cost
Common Solutions Group Survey
1. Carnegie Mellon
2. Chicago
3. Colorado
4. Columbia
5. Cornell
6. Duke
7. Georgetown
8. Illinois
9. Iowa
10. Michigan State
11. Minnesota
12. NYU
13. Princeton
14. UC Berkeley
15. UC San Diego
16. USC
17. Virginia
18. Virginia Tech
19. Washington
20. Wisconsin
21. Yale
CSG: Central Storage Services (Iowa)
CSG: Biggest Storage Challenge
- Data growth: “Bootstrapping a funding plan for central storage services where growth is explosive.”
- Perception of cost: “As Central IT, our biggest challenge is fighting the misconception that storage is ‘cheap.’”
- Research storage: “Well by volume it is research data storage and particularly data generated by instruments like telescopes and sequencers.”
Possible Next Steps
- Identify and promote research solutions
- Build central backup service
- Build, promote archival solutions
Research Storage
- Low cost
- Commodity server hardware
- Inexpensive disks
- High capacity
- Scale-out capability
- Service broker vs. provider
- Partner with existing campus providers
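The low-cost, scale-out approach above can be sized with back-of-the-envelope math. Every number in this sketch (server count, disks per server, disk size, replication factor) is an assumption for illustration, not from the deck.

```python
# Back-of-the-envelope sizing for scale-out research storage (all figures assumed).

servers = 10
disks_per_server = 12
tb_per_disk = 2            # commodity disks circa 2010
replicas = 2               # keep two copies of each block for data protection

raw_tb = servers * disks_per_server * tb_per_disk
usable_tb = raw_tb // replicas
print(f"Raw: {raw_tb} TB; usable with {replicas}x replication: {usable_tb} TB")
```

The point of the sketch is the tradeoff: commodity hardware makes raw terabytes cheap, but protection schemes like replication halve (or worse) the usable capacity.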
Central Backup Service
- Focus on distributed servers
- Engage potential users
- Define requirements
- Identify possible solutions
Archival/Repository Solutions
Next Steps
- Census: share data with the campus
- Roadmap: propose future directions
- Projects: initiate projects according to roadmap
- Services: roll out new services
Feedback
- What other data do we need?
- What changes should ITS make to existing services?
- What new services should ITS offer?
Contact Information
Dave Shafer
Enterprise Storage Manager
ITS Systems & Platforms