Dissemination and Communication Introductory course


Dissemination and Communication Introductory course: Disseminating Strategy, Focusing on user needs. Madrid, 19-21 October 2016. THE CONTRACTOR IS ACTING UNDER A FRAMEWORK CONTRACT CONCLUDED WITH THE COMMISSION.

Measuring Dissemination & Satisfaction: our example!

User's Requests Channels: “priority” to requests made through the website.

User's Requests Channels: other traditional channels, e.g. e-mail to info@ine.pt.

User's Requests Channels: tracing the request. Portal technology, 100% integrated on the website.

User requests: 3,506 dissemination requests in total for 2016 (Jan-Jul), excluding phone requests.

Visits/quarter

YouTube Channel: Statistics Portugal started its YouTube channel on 1 April 2015 with the following purposes: disseminating statistical information; improving statistical literacy; providing institutional information. Use is easy to measure with YouTube Analytics (an integrated tool).

A Request “Life Cycle”: an automatic request code is assigned; the customer provides explanations; the request is distributed to a dissemination expert and, where needed, to production experts; the answer goes back to the customer; finally, customer satisfaction is assessed. Answer time is measured at INE across the whole chain, with several time stops.
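A hedged sketch of that life cycle: the slide implies every request carries an automatic code and a timestamp at each stage, so per-stage answer times can be derived. The `Request` class, stage names and method names below are illustrative assumptions, not the actual INE system.

```python
from datetime import datetime

# Hypothetical model of the request "life cycle": one timestamp per stage,
# so the duration of any stage pair can be measured afterwards.
STAGES = ["received", "distributed", "dissemination_expert",
          "production_experts", "answered", "satisfaction_assessed"]

class Request:
    def __init__(self, code: str):
        self.code = code                         # automatic request code
        self.timestamps: dict[str, datetime] = {}

    def mark(self, stage: str, when: datetime) -> None:
        """Record the moment a request enters a stage."""
        assert stage in STAGES
        self.timestamps[stage] = when

    def stage_minutes(self, start_stage: str, end_stage: str) -> float:
        """Elapsed wall-clock minutes between two recorded stages."""
        delta = self.timestamps[end_stage] - self.timestamps[start_stage]
        return delta.total_seconds() / 60
```

In the real system these raw durations would still be filtered by the 09:00-17:00 counting rules described on the time-measurement slide.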

Evaluation: answer time and customer satisfaction.

1. Measuring Time Performance. Average time (95% of requests), 2015: Statistics Portugal total, 0.504 working days; dissemination experts, 0.07 working days. This relies on sophisticated time counting.

1. Measuring Time Performance: sophisticated time counting. All minutes are counted and converted into equivalent working days of work; waiting for customer explanations stops the watch. Minutes are counted between 09:00 and 17:00, with a one-hour discount for lunch time, and user requests arriving after 17:00 are counted from the following working day.
Distribution: x minutes
Dissemination expert: ∑(y1, y2, …, yn) minutes
Production experts: ∑(z1, z2, …, zn) minutes
Total INE: x + ∑(y1, y2, …, yn) + ∑(z1, z2, …, zn) minutes
Dissemination experts only: ∑(y1, y2, …, yn) minutes
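The counting rules above can be sketched as follows, assuming Monday-Friday working days and a 13:00-14:00 lunch slot (neither detail is stated on the slide); the minute-by-minute loop is illustrative, not the actual implementation:

```python
from datetime import datetime, time, timedelta

DAY_START, DAY_END = time(9, 0), time(17, 0)
LUNCH_START, LUNCH_END = time(13, 0), time(14, 0)   # assumed lunch slot
MINUTES_PER_WORKING_DAY = 7 * 60                    # 8h day minus 1h lunch

def counted_minutes(start: datetime, end: datetime) -> int:
    """Count billable minutes between two timestamps, minute by minute."""
    minutes = 0
    t = start
    while t < end:
        if (t.weekday() < 5                          # Mon-Fri (assumption)
                and DAY_START <= t.time() < DAY_END
                and not (LUNCH_START <= t.time() < LUNCH_END)):
            minutes += 1
        t += timedelta(minutes=1)
    return minutes

def working_days(minutes: int) -> float:
    """Convert counted minutes into equivalent working days."""
    return minutes / MINUTES_PER_WORKING_DAY
```

A full 09:00-17:00 day yields 420 counted minutes, i.e. exactly one working day, and a request arriving after 17:00 accumulates nothing until 09:00 the next working day, matching the rule on the slide.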

2. Assess Customer Satisfaction. In 2010 we introduced an additional, automatic procedure to assess customers’ satisfaction with the service provided. The survey is sent in an automatically generated e-mail 24 hours after the final contact/reply. The analysis of the results is led by the quality management service, a unit separate from the dissemination service.
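A minimal sketch of that trigger, assuming the mailer works from a queued job record: the survey goes out 24 hours after the final reply and each user gets a unique survey ID (mentioned on a later slide). The function name and dict layout are assumptions; the real mailing system is not shown.

```python
import uuid
from datetime import datetime, timedelta

SURVEY_DELAY = timedelta(hours=24)   # from the slide: sent 24h after reply

def schedule_survey(request_code: str, final_reply: datetime) -> dict:
    """Build the record a mailer job would pick up 24 hours later."""
    return {
        "request": request_code,
        "survey_id": uuid.uuid4().hex,          # unique ID for each user
        "send_at": final_reply + SURVEY_DELAY,  # when the e-mail goes out
    }
```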

2. Assess Customer Satisfaction: extreme balance responses (EBR). The results are analysed using the “extreme balance responses” (EBR) indicator, which gives more weight to extreme answers than to middle evaluations, since the latter tend to express less marked dissatisfaction/disagreement or satisfaction/agreement. The weights are:

EBR = F1·(-1) + F2·(-0.5) + F3·(-0.25) + F4·(0.25) + F5·(0.5) + F6·(1)

where Fi is the relative frequency of each observed value for category i = 1, …, 6. Results are thus expressed on a metric scale between -1 and +1, where values close to -1 mean full dissatisfaction/disagreement and values close to +1 mean full satisfaction/agreement.
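The EBR formula above is straightforward to compute from raw answer counts; the function and variable names in this sketch are illustrative, not from the original system:

```python
# Weights from the slide: categories 1..6 get -1, -0.5, -0.25, 0.25, 0.5, 1.
EBR_WEIGHTS = [-1.0, -0.5, -0.25, 0.25, 0.5, 1.0]

def ebr(counts):
    """Compute EBR from raw answer counts for categories 1..6.

    Relative frequencies F1..F6 are weighted so that extreme answers
    count more than middle ones; the result lies between -1 and +1.
    """
    total = sum(counts)
    if total == 0:
        raise ValueError("no responses")
    freqs = [c / total for c in counts]          # F1..F6
    return sum(w * f for w, f in zip(EBR_WEIGHTS, freqs))

# Example: all answers in the top category give the maximum score.
print(ebr([0, 0, 0, 0, 0, 10]))  # -> 1.0
```

Note that the weights sum to zero, so a uniform spread of answers yields an EBR of exactly 0.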

2. Assess Customer Satisfaction: on-line survey with a unique ID for each user. Response rate: 32.4% in 2015 and 33.1% in 2016 (Jan-Jun); 70.3% of survey responses were received within 24 hours (Jan-Jun 2016).

2. Assess Customer Satisfaction: the average EBR was 0.71 in 2015 and 0.76 in 2016 (Jan-Jun).

2. Assess Customer Satisfaction: satisfaction level (EBR) by user segment, 2016 (Jan-Jun). Segments: Public Administration, Education, Enterprises, Foreigners, Private, Not Identified; EBR values: 0.84, 0.73, 0.76, 0.78.
In 2016 (Jan-Jun), 190 comments and 22 suggestions were received.
Positive comments (149): fast service; adequacy of the response to the request; quality of service.
Negative comments (41): speed of service; adequacy of the response to the request (e.g. answers that merely refer back to the Portal); no information or lack of information.
Suggestions (22): information needs; access to information; more geographically disaggregated information.

Thanks for your attention