1 CFI GROUP WORLDWIDE
Ann Arbor, Atlanta, Beijing, London, Madrid, Milan, Paris, Shanghai, Stockholm
Representative offices: Buenos Aires, Kuala Lumpur, Porto Alegre
NASA Earth Observing System Data and Information System Customer Satisfaction Results
Results briefings: November 23, 2009; December 1, 2009; December 8, 2009
2 © CFI Group 2009
Today's Discussion
–Background
–Overview / Key Results
–Detailed Analysis
–Summary
3 Background
4 Project Background: Objectives
Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
–Alaska Satellite Facility Distributed Active Archive Center
–Crustal Dynamics Data Information System
–Global Hydrology Resource Center
–Goddard Earth Sciences Data and Information Services Center
–Land Processes Distributed Active Archive Center
–MODAPS Level-1 Atmospheres Archive and Distribution System
–NASA Langley Atmospheric Science Data Center
–National Snow and Ice Data Center Distributed Active Archive Center
–Oak Ridge National Laboratory Distributed Active Archive Center
–Ocean Biology Processing Group
–Physical Oceanography Distributed Active Archive Center, Jet Propulsion Laboratory (JPL)
–Socioeconomic Data and Applications Center
Assess the trends in satisfaction with NASA EOSDIS, specifically in the following key areas:
–Product Search
–Product Selection and Order
–Delivery
–Product Quality
–Product Documentation
–Customer Support
Identify the key areas that NASA can leverage across the Data Centers to continuously improve its service to its users.
5 Project Background: Measurement Timetable
–Finalized questionnaire: July 30, 2009
–Data collection via web: September 3 – October 13, 2009 (invitations sent over the first two weeks of September; reminders over the last two weeks of September)
–Topline results: October 23, 2009
–Results briefings: November 23, December 1, and December 8, 2009
6 Project Background: Data Collection
3,842 responses were received, and all 3,842 were used for modeling.
Respondents who answered for more than one data center: two centers, 108; three centers, 15; four centers, 2.
7 Project Background: Respondent Information
For which specific areas do you need or use Earth science data and services?
Demographics (where comparable) remain fairly consistent with 2008.
* Multi-select question; the question's wording was changed slightly in 2009; modeling was asked as a separate question prior to 2008.
8 Project Background: Respondent Information
Demographics (where comparable) remain fairly consistent with 2008.
* Questionnaire was modified in 2009.
9 Overview: Key Results
10 NASA EOSDIS Customer Satisfaction Results
The ACSI model attributes:
–Overall satisfaction: How satisfied are you with the data products and services provided by [DAAC]?
–Expectations: To what extent have the data products and services provided by [DAAC] fallen short of or met your expectations?
–Ideal: How close does [DAAC] come to the ideal organization?
[Chart: ACSI and attribute scores by year, 2004–2009, with margins of error and sample sizes (N=1,016 in 2004 through N=3,842 in 2009). The 2009 aggregate ACSI is 77 (+/- 0.4, N=3,842).]
11 NASA EOSDIS Benchmarks: strong performance continues
–NASA EOSDIS – Aggregate 2009: 77
–ACSI (Overall) Q3 2009: 76
–News & Information Sites (Public Sector) Q2 2009: 73
–Federal Government (Overall) 2008: 69
ACSI (Overall) is updated on a quarterly basis, with specific industries/sectors measured annually. Federal Government (Overall) is updated annually, with data collection done in Q3. Quarterly scores are based on a calendar timeframe: Q1, January through March; Q2, April through June; Q3, July through September; Q4, October through December.
12 NASA EOSDIS Model: Product Search, Selection/Order, Documentation, and Customer Support are most critical
–Scores: the performance of each component on a 0-to-100 scale. Component scores are the weighted average of the corresponding survey questions.
–Impacts: the change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 0.9-point improvement in Satisfaction.
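As an illustration of how the impact numbers are read, the sketch below collects the impact values quoted throughout this deck (Satisfaction points per 5-point component gain) and scales them linearly. The impact values come from the presentation itself; the helper function is our own illustration, not CFI Group's tooling.

```python
# Impact values quoted in this deck: points of Satisfaction gained
# per 5-point gain in the corresponding component score.
IMPACTS = {
    "Product Search": 0.9,
    "Product Selection and Order": 1.2,
    "Product Documentation": 1.0,
    "Customer Support": 1.6,
    "Product Quality": 0.4,
    "Delivery": 0.5,
}

def projected_gain(component: str, score_change: float) -> float:
    """Predicted change in Satisfaction for a given component-score change,
    scaling the deck's per-5-point impact linearly (our simplification)."""
    return IMPACTS[component] * (score_change / 5.0)

# e.g. a 5-point gain in Product Search projects a 0.9-point Satisfaction gain
```

This reading treats impacts as locally linear, which is how the deck's own example ("a 5-point gain in Product Search would yield a 0.9-point improvement") uses them.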
13 NASA EOSDIS 2006–2009: significant changes from 2008
[Chart: component scores for 2006–2009 with margins of error from +/- 0.4 to +/- 0.8; markers flag significant differences vs. 2008.]
14 Areas of Opportunity for NASA EOSDIS: consistent year over year
Top improvement priorities:
–Product Search (75)
–Product Selection and Order (76)
–Product Documentation (77)
15 Detailed Analysis
16 Score Comparison: higher satisfaction inside the USA
Respondents outside the USA have a slightly lower overall Satisfaction score with EOSDIS (76 outside vs. 77 inside). Compared to last year, the score increased for respondents within the USA and decreased for respondents outside the USA. 71% of respondents were outside the USA in 2009, vs. 68% in 2008.
17 CSI by Data Center: only two data centers show statistically significant changes
[Chart: CSI by data center with margins of error from +/- 0.7 to +/- 4.7; markers flag significant differences vs. 2008.]
18 Product Search: key driver of satisfaction (impact = 0.9)
–40% used the data center's or data-specific specialized search, online holdings, or datapool (18% in 2008)
–29% used WIST/EDG to search for data and products (41% in 2008)
–19% selected an Internet search tool (12% in 2008)
[Markers flag significant differences vs. 2008.]
19 Product Search Score Comparison, by method for most recent search
How did you search for the data products or services you were seeking?
[Chart: scores by search method, with usage shares of 40%, 29%, 19%, 4%, 3%, and 1%, margins of error from +/- 0.8 to +/- 5.1, and markers flagging significant differences vs. 2008.]
20 Product Search: scores by Data Center; variation in the trends
[Chart: Product Search scores by data center with margins of error from +/- 0.8 to +/- 4.2; markers flag significant differences vs. 2008.]
21 Product Selection and Order: also a top opportunity for continuous improvement (impact = 1.2)
94% said they are finding what they want in terms of type, format, time series, etc.
Did you use a sub-setting tool?
–32% said no
–43% said yes, by geographic area
–4% said yes, by geophysical parameter
–18% said yes, by both geographic area and geophysical parameter
–3% said yes, by band
–1% said yes, by channel
22 Product Selection and Order: scores by Data Center; variation in the trends
[Chart: scores by data center with margins of error from +/- 0.9 to +/- 4.3; markers flag significant differences vs. 2008.]
23 Product Documentation: data product description most sought after (impact = 1.0)
What documentation did you use or were you looking for?
–Data product description: 45%
–Product format: 16%
–Science algorithm: 14%
–Instrument specifications: 7%
–Tools: 6%
–Science applications: 6%
–Production code: 1%
Was the documentation…
–Delivered with the data (18% vs. 16% in 2008)
–Available online (73% vs. 70% in 2008)
–Not found (9% vs. 14% in 2008)
CSI for those whose documentation was not found is 69, vs. 78 for those who had it delivered with the data and 78 for those who found it online.
24 Product Documentation: scores by Data Center
[Chart: scores by data center with margins of error from +/- 0.9 to +/- 3.4; markers flag significant differences vs. 2008.]
25 Customer Support: maintain strong performance (impact = 1.6)
Did you request assistance from the Data Center's user services staff during the past year? 72% said no. Of those who said yes, 84% used email, 3% used the phone, and 12% used both phone and email. 89% (91% in 2008) were able to get help on first request; these respondents continue to have a significantly higher CSI (81) than those who did not (65).
26 Product Quality: preferences in line with actual, for the most part
In 2008, 74% said products were provided in HDF-EOS and HDF, and 41% said those were their preferred formats.
Format data products were provided (multiple responses allowed; N=3,842):
–HDF-EOS/HDF: 58%
–GeoTIFF: 31%
–ASCII: 15%
–JPEG, GIF, PNG, TIFF: 13%
–Binary: 10%
–NetCDF: 9%
–GIS: 6%
–Don't know: 3%
–KML, KMZ: 3%
–CEOS: 2%
–Other format: 2%
–OGC Web services: 1%
Format preferred (multiple responses allowed; N=3,842):
–HDF-EOS/HDF: 44%
–GeoTIFF: 44%
–ASCII: 20%
–GIS: 18%
–JPEG, GIF, PNG, TIFF: 17%
–NetCDF: 15%
–Binary: 13%
–KML, KMZ: 9%
–Other preferred format: 3%
–OGC Web services: 3%
–CEOS: 2%
27 Product Quality: improves significantly this year (impact = 0.4)
[Chart: Product Quality scores; markers flag significant differences vs. 2008.]
28 Delivery: remains stable (impact = 0.5)
–Over half said their data came from MODIS (same in 2008)
–27% said ASTER (18% in 2008)
* Multi-select question.
29 Delivery: methods for receiving data
How long did it take to receive your data products?
–Immediate retrieval: 20% (CSI = 80)
–Less than 1 hour: 18% (CSI = 78)
–Less than a day: 27% (CSI = 77)
–1–3 days: 27% (CSI = 76)
–4–7 days: 5% (CSI = 74)
–More than 7 days: 2% (CSI = 69)
73% said FTP was their preferred method in 2008.
30 Customers over multiple years
For those answering the survey over multiple years, scores have seen some movement. No significant differences were seen between 2008 and 2009 for those who have answered the survey over the last four years.
31 Customers over the past two years
For those answering the survey in both 2008 and 2009, there are many statistically significant score increases.
32 Customers over the past three years
For those answering the survey in 2007, 2008, and 2009, Product Documentation saw a statistically significant score increase from 2008 to 2009.
33 Summary
34 Summary
–NASA EOSDIS has made significant improvements versus last year in multiple areas (Product Documentation, Product Quality, and Customer Support).
–Product Selection/Order saw a small but significant decrease this year.
–Product Search and Product Selection and Order continue to be the top opportunities for improvement.
–Product Documentation continues the trend with a higher impact this year and also remains a top opportunity.
–Customer Support continues to have high impact for those who require it. It is imperative to maintain the strong level of service, and to ensure those providing it realize how it affects satisfaction.
35 Appendix
36 The Math Behind the Numbers
[Slide shows a path diagram of the latent-variable model: exogenous components x1…x6 and outcomes y1…y3 linked by loadings and path coefficients; the notation did not survive extraction.]
A discussion for a later date, or following this presentation for those who are interested.
37 A Note About Score Calculation
Attributes (questions on the survey) are typically answered on a 1–10 scale:
–Social science research shows 7–10 response categories are optimal.
–Customers are familiar with a 10-point scale.
Before being reported, scores are transformed from the 1–10 scale to a 0–100 scale:
–The transformation is strictly algebraic.
–The 0–100 scale simplifies reporting: often there is no need to report many, if any, decimal places.
–The 0–100 scale is useful as a management tool.
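The deck does not reproduce the transformation formula itself. A reasonable sketch of what "strictly algebraic" means here is the standard linear mapping that sends 1 to 0 and 10 to 100; the assumption that CFI Group uses exactly this form is ours.

```python
def to_percent_scale(raw: float) -> float:
    """Map a 1-10 survey response onto a 0-100 reporting scale.

    Assumed form (not confirmed by the deck): the usual linear
    transform where 1 -> 0 and 10 -> 100.
    """
    if not 1 <= raw <= 10:
        raise ValueError("survey responses are on a 1-10 scale")
    return (raw - 1) * 100 / 9

# to_percent_scale(1) -> 0.0, to_percent_scale(10) -> 100.0
```

Under this mapping the midpoint response of 5.5 lands at exactly 50, which is why 0–100 scores read naturally as percentages of the scale's range.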
38 Deriving Impacts
Remember high school algebra? The general formula for a line is y = mx + b. The basic idea is that x is a "cause" and y is an "effect", and m, the slope of the line, summarizes the relationship between x and y. CFI Group uses a sophisticated variation of an advanced statistical tool, Partial Least Squares (PLS) regression, to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction).
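The core idea that impacts are slopes can be sketched without PLS. The toy example below fits an ordinary least squares regression to synthetic data, then reports each slope rescaled to the deck's convention of Satisfaction points per 5-point component gain. This is illustrative only: CFI Group's actual method is a proprietary PLS variant, and the component names, coefficients, and data here are invented.

```python
import numpy as np

# Synthetic data: two hypothetical quality components driving satisfaction.
rng = np.random.default_rng(0)
n = 500
search = rng.uniform(60, 90, n)      # invented component scores, 0-100 scale
support = rng.uniform(60, 90, n)
satisfaction = 10 + 0.18 * search + 0.32 * support + rng.normal(0, 2, n)

# Ordinary least squares: column of ones for the intercept, then components.
X = np.column_stack([np.ones(n), search, support])
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)

# Rescale slopes to the deck's convention: change in satisfaction
# per 5-point change in a component score.
impacts = {"Product Search": 5 * coef[1], "Customer Support": 5 * coef[2]}
```

With enough data the recovered slopes approach the true coefficients (0.18 and 0.32), so the reported impacts land near 0.9 and 1.6. PLS is preferred over plain OLS in practice because survey components are highly correlated with one another, which makes OLS slope estimates unstable.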