2 LP, Excel, and Merit – Oh My! (w/apologies to Frank Baum)
CIT Research/Teaching Seminar Series (Oct 4, 2007)
John Seydel

3 No, It’s Not About Getting Back to Kansas!

4 Here’s the Problem
- Developing merit evaluations of multiple faculty members
  - Some are good all around
  - Each is good at something
- Which “somethings” should be considered more/less important? How much more/less important?
- Why not borrow from economics: the concept of Pareto efficiency?
  - Identify the efficient set of faculty members
  - Avoid answering the “importance” question
- We can use LP (linear programming), with some help from Excel, to address this
- Hence: “LP, Excel, and Merit”

5 What’s LP?
- Consider a production planning problem: Liva’s Lumber (refer to handout)
  - 3 products
  - 3 constraints
  - 1 objective (maximize weekly profit)
- Summary table:

  Product:     CDX      AC       Form     Available
  How much:    ?        ?        ?
  Profit:      $5.00    $7.00    $6.00
  Cutting:     2        3        10       54,000
  Gluing:      4        7        4        24,000
  Finishing:   2        3        7        18,000

- Modelling: LP model and Excel model

6 Now, the Merit Problem
- Typical merit criteria: teaching, research, service
- Consider the teaching criterion
  - Our CoB evaluations have 35 dimensions associated with the teaching criterion
- What do we do with all those?
  - There are too many to weight
  - So we just average them; i.e., we treat them as if they’re all equally important!
  - “Follows syllabus” is as important as “explains clearly”
- Let’s consider a smaller example (Table 1)

7 Aggregation of Results
- Humans want a single performance measure
- Typical schemes
  - Simple average (see Table 2)
  - Focus on the “overall effectiveness” question (e.g., #8)
  - Also, weighted average
    - Weights determined by whom (committee, administrator, statute, ... )?
    - Illustrated by MBO
- So, what’s wrong with a simple average?
  - Obscures individual strengths and weaknesses
  - Artificially values minor differences

8 DEA to the Rescue (?)
- We want to evaluate the outcomes of behaviors (decisions) where
  - Multiple criteria are to be considered for the outcomes
  - No generally acceptable set of weights exists (and no one is willing to determine such)
- This is where DEA (data envelopment analysis) can be useful
  - Consider each instructor to be a DMU (decision-making unit)
  - Apply the concept of economic efficiency...

9 Efficient Set Concept
- The set of entities (DMUs) for which no other entity performs as well or better on all criteria
  - Graphically: the convex hull
- Consider a concept from finance: the efficient portfolio
  - Risk
  - Return
- Any such entity’s weighted multicriteria score will be the same as the others’ scores, if they all get to choose their own weights
  - These entities are called efficient decision-making units
- Consider a simple example (subset from Table 2)...

10 Bicriterion Performance Comparison

                    Criterion             Simple Avg         Weighted Avg
  Instructor   Impartial   Prepared      Value    Rank      Value    Rank
  OBA            4.77        3.78         4.28      1        4.08      2
  GJB            3.02        2.83         2.93      6        2.89      5
  IAB            2.01        2.20         2.11      7        2.14      7
  OVB            3.39        4.24         3.82      3        3.99      3
  BFH            3.74        2.35         3.05      5        2.77      6
  DEI            2.68        3.58         3.13      4        3.31      4
  OLK            4.58        3.96         4.27      2        4.15      1
  Weight:        0.30        0.70

11 Graphically Identifying the Efficient Frontier
[Scatter plot of the seven instructors (OBA, OLK, OVB, BFH, IAB, GJB, DEI) on the two criteria, with the efficient frontier highlighted]

12 Some Basic Definitions
- Efficiency = Output / Input
- Maximum possible efficiency is defined as 100% (i.e., 1.00)
- Output for an instructor is her/his weighted average evaluation score
- Input for all instructors is theoretically the same (100% of time available)
- This leads to a model (recall the LP model for Liva’s Lumber)...

13 Efficiency Model
- Choose a set of criterion weights for a given instructor so as to
  Maximize: the instructor’s Output/Input
  Subject to:
  - Each instructor’s Output/Input ≤ 1 (including the given instructor)
  - Weight values are positive
- Which is the same as
  Maximize: the instructor’s Weighted Average Score
  Subject to:
  - Each instructor’s Weighted Average Score ≤ 1
  - Weight values are positive
  since each instructor’s input is defined to be 1.00
- Note, however, that the “weighted average” is now scaled to the 0.00 – 1.00 interval
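
A compact way to state the model above (the notation here is introduced for this write-up, not taken from the slides): let y_ij be instructor i's score on criterion j, w_j the weight on criterion j, and i0 the instructor being evaluated. In LaTeX:

\begin{aligned}
\max_{w}\quad & \sum_{j} w_j\, y_{i_0 j} && \text{(the evaluated instructor's scaled weighted score)}\\
\text{s.t.}\quad & \sum_{j} w_j\, y_{ij} \le 1 && \text{for every instructor } i,\\
& w_j > 0 && \text{for every criterion } j.
\end{aligned}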

14 An Example DEA Output Model for Evaluating Faculty Teaching
- Let w1 and w2 be the weights to assign to impartiality and preparedness, respectively
- Then, for instructor GJB (for example), the objective is to
  Maximize: 3.02w1 + 2.83w2 (GJB score)
  ST:  4.77w1 + 3.78w2 ≤ 1.00 (OBA)
       3.02w1 + 2.83w2 ≤ 1.00 (GJB)
       2.01w1 + 2.20w2 ≤ 1.00 (IAB)
       ...
       4.58w1 + 3.96w2 ≤ 1.00 (OLK)
       w1, w2 > 0.00
- We can use Excel to model and solve this, but we need to reformulate and solve for every instructor
- That’s where macro programming comes in...
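
For readers who want to check the numbers outside Excel, here is a minimal sketch of the GJB model above in Python with scipy.optimize.linprog. The presentation itself builds this model in Excel and solves it with Solver; the library choice and variable names here are mine.

# Sketch: the DEA model for instructor GJB, solved with SciPy instead of Excel Solver.
from scipy.optimize import linprog

# Impartial/prepared scores from the bicriterion table
scores = {
    "OBA": (4.77, 3.78), "GJB": (3.02, 2.83), "IAB": (2.01, 2.20),
    "OVB": (3.39, 4.24), "BFH": (3.74, 2.35), "DEI": (2.68, 3.58),
    "OLK": (4.58, 3.96),
}

target = "GJB"
c = [-scores[target][0], -scores[target][1]]   # negate: linprog minimizes
A_ub = [list(s) for s in scores.values()]      # one "weighted score <= 1" row per instructor
b_ub = [1.00] * len(scores)
# A small lower bound stands in for the "weights strictly positive" requirement
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(1e-6, None)] * 2)

print("weights (w1, w2):", res.x)
print("GJB efficiency:", -res.fun)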

15 Now, Let’s Apply This to the Data
- Consider the model for QVA
- Then note the summary table
- Things of interest
  - Size of the efficient set
  - Rank reversals
  - Comparison with the simple average approach (Figure 1)

16 Where To From Here?
- Constraining the weights
- Ranking the “efficient” instructors
- Expanding across the other criteria in the merit evaluations
- Other DEA applications (decision support)
  - Comparing e-commerce platforms
  - Vendor selection
  - Other... ?
- Go looking for more “Lions and tigers and bears (oh my)!”

17 Appendix

18 The LP Model for Liva’s Lumber
We can model this mathematically. Let
  x1 = number of sheets of CDX to produce weekly
  x2 = number of sheets of form plywood to produce weekly
  x3 = number of sheets of AC to produce weekly
The objective is to
  Maximize: 5x1 + 7x2 + 6x3 (weekly profit)
  ST:  2x1 + 3x2 + 10x3 ≤ 54,000 (Cutting)
       4x1 + 7x2 + 4x3 ≤ 24,000 (Gluing)
       2x1 + 3x2 + 7x3 ≤ 36,000 (Finishing)
Solving is “simply” a matter of determining the best combination of x1, x2, and x3
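
As a quick check on the appendix model, here is a minimal sketch that solves the same LP with scipy.optimize.linprog. The presentation itself solves it in Excel with Solver; scipy is used here only for illustration.

# Sketch: the Liva's Lumber LP above, solved with SciPy.
from scipy.optimize import linprog

c = [-5, -7, -6]            # maximize 5x1 + 7x2 + 6x3 -> minimize the negation
A_ub = [
    [2, 3, 10],             # cutting time per sheet of CDX, form, AC
    [4, 7, 4],              # gluing
    [2, 3, 7],              # finishing
]
b_ub = [54000, 24000, 36000]   # weekly capacity of each resource

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("sheets of CDX, form, AC:", res.x)
print("weekly profit:", -res.fun)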

19 Enter Excel
- Create a spreadsheet table like the summary table
- Add a few formulae
  - Total profit
  - Total amount of each resource consumed
- Solve by trial and error... ?
- Better: use the Solver tool
  - Find the optimal solution quickly
  - Tinker with parameters and re-solve
- Even better: use Solver with a macro button (a non-Excel sketch of the same automation follows this list)
  - Record a macro
  - Call the subroutine from the button’s Click event
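
For completeness, here is a rough sketch of what that macro automates: re-solving the DEA model once per instructor. Python and scipy stand in for the workbook, Solver, and VBA here, and the scores come from the bicriterion table; none of this code is from the presentation itself.

# Sketch: re-solve the DEA model for every instructor (what the Solver macro automates).
from scipy.optimize import linprog

scores = {
    "OBA": (4.77, 3.78), "GJB": (3.02, 2.83), "IAB": (2.01, 2.20),
    "OVB": (3.39, 4.24), "BFH": (3.74, 2.35), "DEI": (2.68, 3.58),
    "OLK": (4.58, 3.96),
}
A_ub = [list(s) for s in scores.values()]   # every instructor's weighted score <= 1
b_ub = [1.00] * len(scores)

for name, (impartial, prepared) in scores.items():
    res = linprog([-impartial, -prepared], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(1e-6, None)] * 2)
    print(f"{name}: efficiency = {-res.fun:.3f}")   # 1.000 marks an efficient instructor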

20 Table 1: Example Evaluation Items

21 Table 2: Example Departmental Summary

22 DEA Model for Instructor QVL

23 Results Across Instructors

24 Figure 1: DEA vs. Simple Averaging

