Presentation transcript:

1 EVALUATION 2009 ROUNDTABLE DISCUSSION
Knowledge Management as a Guerilla Campaign: Leading the Horse to Water
Stephen H. Axelrad, Ph.D., Booz Allen Hamilton
Thomas E. Ward, II, Ph.D., US Army Command and General Staff College
Orlando, FL, November 14, 2009

2 Agenda
• Roundtable Rules of Engagement
• Your Experiences with Knowledge Management
• Setting the Context for Knowledge Management
• Evaluation and Knowledge Management – Points of Convergence
• Knowledge Management – An Opportunity for Evaluators to Demonstrate Value
• Knowledge Management Case Examples
• Continuing the Work

3 Roundtable Rules of Engagement
• Purpose
  – Not intended as another primer on knowledge management
  – Part brainstorming, part problem-solving
  – The role of evaluators in helping organizations understand the impact of knowledge management initiatives
• Our role
  – To frame the intersection of evaluation and KM
  – To challenge assumptions
  – To facilitate a discussion
  – To synthesize observations and insights into action plans
• Your role
  – To reflect on your KM experiences and insights in both evaluator and non-evaluator roles
  – To provide insights on what works and what is missing from most KM interventions
  – To contribute approaches or solutions for understanding the effectiveness of KM
  – To identify some action steps that you can bring back to your organizations

4 Your Experiences with Knowledge Management
• What is your understanding of KM?
• How is knowledge accumulated and institutionalized in organizations?
• How does knowledge flow through organizations?
• Why have so many diverse organizations attempted to establish KM policies, programs, and resources?
• What roles or stances have you adopted with past KM interventions in your organizations?
  – Observer/Bystander? Planner/Collaborator? Evaluator? Facilitator? Critic/Skeptic? Data/Knowledge Manager?
• What types of KM interventions have you participated in or witnessed at your organizations?
  – Intranets/extranets, lessons learned repositories, clearinghouses, communities of practice, learning communities, knowledge sharing forums, brown bag sessions

5 Setting the Context for Knowledge Management
• Frequent reasons why organizations engage in KM interventions
  – Encourage collaboration across functional and geographic silos
  – Empower frontline supervisors and employees to engage in program- or organization-wide improvement efforts
  – Identify and reduce cost inefficiencies and waste
  – Disseminate knowledge from organizational veterans to newcomers
  – Institutionalize lessons learned from successes and failures
• Common tools employed in KM interventions
  – Information technology: intranets, extranets, etc.
  – Social networking: listservs, wikis, blogs, etc.
  – Organizational structures: communities of practice, task forces, cross-functional teams, etc.
• Intended outcomes of KM interventions
  – Development of an institutional memory
  – Increased employee engagement (i.e., motivation, satisfaction, commitment)
  – Creation of an organizational culture/climate centered on knowledge sharing and collaboration
  – Improvements in organizational flexibility, adaptability, and innovation

6 Evaluation and Knowledge Management – Points of Convergence
Shared by both evaluation and knowledge management:
• Goal to increase the use of data and evidence for decision making
• Understanding of organizations as dynamic systems
• Appreciation of the “soft” side of organizational effectiveness, such as informal learning and social networking
• Recognition of the complementary utility of top-down (importance & accountability) and bottom-up (innovation & creativity) approaches

7 Knowledge Management – An Opportunity for Evaluators to Demonstrate Value
• Needs Assessment
  – Develop a shared understanding of both implicit and explicit needs of intended KM beneficiaries
  – Recognize new or emerging needs resulting from a KM intervention
• Theories of Change
  – Develop testable hypotheses for understanding the nature and magnitude of changes
  – Encourage thinking about the intended and unintended consequences
• Outcome Measurement
  – Facilitate understanding of what success is
  – Help KM planners and technologists distinguish between outputs and outcomes
  – Identify and develop indicators for effectiveness as well as indicators for performance
• Knowledge Utilization
  – Connect improvements in knowledge with improvements in decision making
  – Develop quality and relevance standards for the acquisition and distribution of knowledge
  – Guide organizational decision makers on ethical uses of knowledge

8 Knowledge Management Case Examples – BP’s Lexpertise & Schlumberger’s Lexpertise
• Needs Assessment
  – Do you think one was conducted for either or both organizations?
  – Whose needs were being served and/or ignored?
• Theories of Change
  – What motivated these organizations to establish KM practices?
  – What changes were these organizations hoping to achieve?
• Outcome Measurement
  – Do you think these organizations had specific, measurable outcomes in advance?
  – Were the KM outcomes aligned to the organization’s success?
• Knowledge Utilization
  – What were good examples?
  – What practices would you have recommended?

9 Continuing the Work
• Attend the Think Tank session later in the day
  – Title: Examining the Evaluability of Knowledge Management Initiatives
  – Time: 11:50 AM to 12:45 PM
  – Location: Wekiwa 7
  – Description: Digs deeply into the question of KM and evaluation, with an examination of potential measurement criteria, focusing on the difference between “Measures of Performance” (a single-loop learning measure) and “Measures of Effectiveness” (a double-loop learning measure). How do we do this right and set up an organization for clear, consistent, and meaningful evaluation of its program? It can be done, but measuring the right things and crafting a picture of the results can be exceedingly challenging. Early identification of evaluation criteria, covering both performance and effectiveness, that fit the organization’s context is key.
• Keep in touch – share your observations, successes, and challenges in becoming involved with KM
  – Stephen Axelrad: axelrad_stephen@bah.com or saxelrad@hotmail.com
  – Thomas E. Ward, II: thomas.wardii@us.army.mil or tewardii@aol.com
• Join our burgeoning KM community of practice within the Business and Industry TIG
  – Leave your contact information if you are interested
  – Open to those who work in government, non-profit, and independent consulting settings

