APPLYING LEARNING CLASSIFIER SYSTEMS for the MULTIPROCESSOR SCHEDULING PROBLEM
Franciszek Seredynski, Damian Kurdej
Polish Academy of Sciences and Polish-Japanese Institute of Information Technology
Motivations
New scheduling algorithms are proposed nearly every day. In the light of
– the NP-hardness of the scheduling problem, and
– the No Free Lunch theorem concerning metaheuristics,
this situation may last forever, at least until quantum computers appear.
Can we use the knowledge gained from experience with already known scheduling algorithms (a hyperheuristics approach)?
We will use GA-based Learning Classifier Systems (LCS) to extract such knowledge and use it in the scheduling process.
Outline
– Multiprocessor Scheduling Problem
– The idea of LCS
– The concept of LCS-based scheduling
– Experimental results
– Conclusions
Multiprocessor Scheduling Problem
Multiprocessor system: an undirected, unweighted graph G_s = (V_s, E_s), called a system graph.
Parallel program: a weighted, directed, acyclic graph G_p = (V_p, E_p), called a precedence task graph or a program graph.
The purpose of scheduling is to distribute the tasks among the processors in such a way that the precedence constraints are preserved and the response time T (the total execution time) is minimized:
T = f(allocation, scheduling_policy = const)
[Figure: example of a precedence task graph (a) with tasks t1-t4 and weighted edges, and a system graph (b) with processors P1-P4.]
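To make the two graphs concrete, here is a minimal sketch of one possible representation (the concrete weights and the processor topology are illustrative assumptions, not values read from the figure):

```python
# Program graph: task -> (computation time, {successor: communication time}).
# Weights below are illustrative, not the ones from the example figure.
program_graph = {
    "t1": (5, {"t2": 1, "t3": 2}),
    "t2": (3, {"t4": 3}),
    "t3": (5, {"t4": 5}),
    "t4": (5, {}),
}

# System graph: processor -> set of directly connected processors
# (undirected, unweighted); the topology is an assumption.
system_graph = {
    "P1": {"P2", "P3"},
    "P2": {"P1", "P4"},
    "P3": {"P1", "P4"},
    "P4": {"P2", "P3"},
}
```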
Problem formulation
– Given a set of program graph instances
– Given a multiprocessor system
– Given a number of scheduling algorithms (heuristics) solving instances of the scheduling problem with some efficiency
Is it possible to train an LCS to match a given instance of the scheduling problem with the scheduling algorithm, from the given set, that is best for it (i.e., that minimizes the total execution time)?
Idea of a GA-based Learning Classifier System (LCS)
[Diagram: the Learning Classifier System, composed of an evaluation system, a decision system and a system for discovery of new rules, interacts with the environment:
– the environment sends a state (message), e.g. 10100, to the LCS;
– the LCS sends an action, e.g. "turn right", to the environment;
– the environment returns a reward, e.g. 120.]
Classifier (rule) in classical LCS
The structure of a classifier:
– Condition part C
– Action A
– Strength S
The strength S is used:
– when a classifier is selected from the set of classifiers to perform an action,
– when the GA creates new rules.
Example: #011 : 01 : 43  (C : A : S)
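A minimal sketch of such a classifier, assuming the usual ternary-alphabet matching of LCS (names and the tiny example are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Classifier:
    condition: str   # ternary string over {'0', '1', '#'}, e.g. "#011"
    action: str      # e.g. "01"
    strength: float  # used for action selection and by the GA

    def matches(self, message: str) -> bool:
        # '#' is a wildcard bit that matches both 0 and 1
        return all(c == '#' or c == m for c, m in zip(self.condition, message))

# The rule from the slide: #011 : 01 : 43
rule = Classifier(condition="#011", action="01", strength=43.0)
print(rule.matches("0011"))  # True
print(rule.matches("0111"))  # False (second bit differs)
```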
Classifier in XCS
– C: condition part
– a: action
– p: expected reward (prediction)
– ε: prediction error
– f: fitness
– exp: experience of the classifier
– ts: the most recent time step at which the GA was applied to this classifier
– as: estimated size of the action sets [A] in which the classifier appears
– num: numerosity of the classifier
Example: 010##0##### : 0 with p = 1000, ε = 2.504, f = 0.77, ts = 499, exp = 19924, as = 146.76, num = 109
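A hedged sketch of this record as a data structure (field names follow the slide; default values are illustrative):

```python
from dataclasses import dataclass

@dataclass
class XCSClassifier:
    condition: str        # ternary condition string, e.g. "010##0#####"
    action: int           # advocated action, e.g. 0
    p: float              # predicted reward
    epsilon: float        # prediction error
    f: float              # fitness
    exp: int = 0          # experience (number of updates)
    ts: int = 0           # time step of the last GA application
    as_size: float = 1.0  # estimated action-set size
    num: int = 1          # numerosity

# The example classifier from the slide
cl = XCSClassifier("010##0#####", 0, p=1000.0, epsilon=2.504, f=0.77,
                   exp=19924, ts=499, as_size=146.76, num=109)
```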
XCS performance cycle
[Diagram: (1) the detector reads the environment state, e.g. 0011; (2) classifiers from the population [P] whose conditions match it form the match set [M], with covering applied if nothing matches; (3) a prediction array PA holds one fitness-weighted prediction per action (e.g. 01 → 37.49, 11 → 12.75); (4) an action is selected and the matching classifiers advocating it form the action set [A]; (5) the effector executes the action in the environment; (6) the received reward reinforces the previous action set [A]_-1; the GA and subsumption operate on the action sets. Example classifiers are shown in the C : a : p : ε : f format, e.g. #011 : 01 : 43 : .01 : 99.]
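A minimal, single-step sketch of this cycle (using the XCSClassifier fields above; covering, reinforcement, GA and subsumption are omitted, so it only illustrates the path from matching to action selection):

```python
import random

def xcs_step(population, message, explore_prob=0.5):
    # 1. Match set [M]: classifiers whose condition matches the message.
    match_set = [cl for cl in population
                 if all(c == '#' or c == m for c, m in zip(cl.condition, message))]
    if not match_set:
        raise ValueError("no matching classifier; covering would be invoked here")

    # 2. Prediction array: fitness-weighted mean prediction per action.
    prediction = {}
    for action in {cl.action for cl in match_set}:
        advocates = [cl for cl in match_set if cl.action == action]
        prediction[action] = (sum(cl.p * cl.f for cl in advocates)
                              / sum(cl.f for cl in advocates))

    # 3. Action selection: explore at random or exploit the best prediction.
    if random.random() < explore_prob:
        action = random.choice(list(prediction))
    else:
        action = max(prediction, key=prediction.get)

    # 4. Action set [A]: matching classifiers that advocate the chosen action.
    action_set = [cl for cl in match_set if cl.action == action]
    return action, action_set
```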
Features of XCS
– Creates a population of classifiers
– Processes messages received from the environment
– Applies a GA to evolve classifiers
– Sends actions to the environment
– Learns, generalizes and modifies the set of classifiers
Our problem
– Given 200 program graph instances created on the basis of a 15-task tree graph: the training set
– Each instance is a tree with different random task and communication weights
– A two-processor system is considered
– Given 5 scheduling heuristics
Can we train the LCS to select, for each instance of the scheduling problem, the scheduling heuristic that provides the best possible solution?
Set of list scheduling algorithms
– ISH (Insertion Scheduling Heuristic)
– MCP (Modified Critical Path)
– STF (Shortest Time First)
– LTF (Longest Time First)
– our own list algorithm: the priority of a task depends on the size of its subgraph
We know how each algorithm performs (its response time) on the set of scheduling instances.
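To show how such heuristics differ mainly in their priority function, here is a hedged sketch of generic list scheduling on identical processors; communication delays and processor topology are ignored, so it is a simplification of what ISH or MCP actually do:

```python
def list_schedule(tasks, deps, weights, n_procs=2, priority=None):
    """Generic list scheduling sketch.

    tasks    -- iterable of task ids
    deps     -- dict: task -> set of predecessor tasks
    weights  -- dict: task -> computation time
    priority -- key used to pick the next ready task, e.g.
                STF: weights.get, LTF: lambda t: -weights[t]
    """
    priority = priority or weights.get        # default: Shortest Time First
    finish = {}                                # task -> finish time
    proc_free = [0.0] * n_procs                # earliest free time of each processor
    schedule, remaining = [], set(tasks)
    while remaining:
        # Ready tasks: all predecessors already finished.
        ready = [t for t in remaining if deps.get(t, set()).issubset(finish)]
        task = min(ready, key=priority)
        proc = min(range(n_procs), key=lambda i: proc_free[i])
        start = max([proc_free[proc]] + [finish[d] for d in deps.get(task, set())])
        finish[task] = start + weights[task]
        proc_free[proc] = finish[task]
        schedule.append((task, proc, start))
        remaining.remove(task)
    return schedule, max(finish.values())      # schedule and response time T
```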
XCS-based scheduling system
1. XCS receives information about an instance of the scheduling problem (a program graph plus a system graph).
2. XCS selects the best heuristic from the set of available heuristics.
3. The program graph and the system graph become the input data of the selected scheduling algorithm.
4. The scheduling algorithm delivers a solution (a Gantt diagram).
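Put together, one pass of this pipeline in normal operation can be sketched as below; all callables are passed in by the caller, since the slides do not fix an API (this is an assumed interface, not the authors' implementation):

```python
def schedule_with_xcs(xcs_select, make_signature, heuristics,
                      program_graph, system_graph):
    # 1. Describe the instance by its signature (see the signature slides below).
    signature = make_signature(program_graph)
    # 2. The trained XCS maps the signature to the index of a heuristic.
    choice = xcs_select(signature)
    # 3.-4. Run the chosen heuristic on the instance and return its schedule,
    #       which can then be rendered as a Gantt diagram.
    return heuristics[choice](program_graph, system_graph)
```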
Program graph signature: the basic information about the program graph
The LCS receives from the environment a signature of the program graph. The signature encodes some static properties of the program graph:
– comm/comp: the average communication-to-computation time ratio of the program graph (3 bits)
– information about the distribution of tasks with given computational requirements (12 bits)
– information about the distribution of communication time requirements between tasks (12 bits)
– information about the critical path, based on evaluation of comp/comm (16 bits)
Length of the signature: 43 bits.
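The slides do not give the exact bin boundaries or bit layouts, so the following assembly of the 43-bit string is only an illustrative sketch under assumed thresholds:

```python
def encode_scalar(value, thresholds):
    # Threshold binning into a fixed-width binary field; with 7 thresholds
    # the bin index is 0..7, i.e. a 3-bit field. The thresholds are assumptions.
    bin_index = sum(value > t for t in thresholds)
    width = len(thresholds).bit_length()
    return format(bin_index, f"0{width}b")

def make_signature(comm_comp_ratio, task_hist_bits, comm_hist_bits, cp_bits):
    # 3 + 12 + 12 + 16 = 43 bits, in the order listed on the slide.
    sig = (encode_scalar(comm_comp_ratio, [0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
           + task_hist_bits + comm_hist_bits + cp_bits)
    assert len(sig) == 43
    return sig
```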
Distribution of tasks with given computational requirements / distribution of communication time requirements between tasks
Coding the information about the critical path based on evaluation of comp/comm
Computing the ratios along the critical path: ratios[0] = 1/4, ratios[1] = 5/3, ratios[2] = 1/3.
Normalization: ratios[0] = 3/27, coded as 01; ratios[1] = 20/27, coded as 11; ratios[2] = 4/27, coded as 01.
Coded signal for the critical path: 0111010000000000
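This worked example can be reproduced as follows. The slide does not state the 2-bit quantization rule; min(ceil(4x), 3) is one rule consistent with the example values and is used here as an assumption:

```python
import math
from fractions import Fraction

def code_critical_path(ratios, total_bits=16):
    total = sum(ratios)
    normalized = [r / total for r in ratios]          # 1/4, 5/3, 1/3 -> 3/27, 20/27, 4/27
    # Assumed 2-bit quantization: 3/27 -> 01, 20/27 -> 11, 4/27 -> 01.
    bits = "".join(format(min(math.ceil(4 * x), 3), "02b") for x in normalized)
    return bits.ljust(total_bits, "0")                # pad the 16-bit field with zeros

ratios = [Fraction(1, 4), Fraction(5, 3), Fraction(1, 3)]
print(code_critical_path(ratios))                     # 0111010000000000
```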
Training the LCS: number of correctly matched scheduling algorithms to instances as a function of the number of training cycles
Training the LCS: size of the rule population as a function of the number of cycles
Training: summary of experiments
– The untrained system correctly matched heuristics to scheduling instances in 40-50% of cases.
– The system was able to learn to match heuristics to instances correctly in 100% of cases.
– This means that information about the matching process was extracted during learning.
– The classifiers contain this information, and generalization of the rules was observed during the learning process.
– Learning is costly, but the gained information can then be used for scheduling.
LCS-based scheduling system: normal operation mode
The testing set is obtained by modifying instances (program graphs) from the training set:
– all computation and communication weights were scaled by 10;
– next, the weights of k tasks or communications were changed by a constant d.
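A minimal sketch of this instance perturbation; whether d is added or subtracted, and how the modified weights are chosen, is not specified on the slide, so random choice and addition are assumptions:

```python
import random

def make_test_instance(task_weights, comm_weights, k=1, d=1, scale=10):
    # Scale every computation and communication weight by `scale`.
    task_weights = {t: w * scale for t, w in task_weights.items()}
    comm_weights = {e: w * scale for e, w in comm_weights.items()}
    # Change k randomly chosen weights (tasks or communications) by d.
    keys = [("task", t) for t in task_weights] + [("comm", e) for e in comm_weights]
    for kind, key in random.sample(keys, k):
        if kind == "task":
            task_weights[key] += d
        else:
            comm_weights[key] += d
    return task_weights, comm_weights
```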
Experiment: k=1, d=1
Experiment: k=2, d=2
Experiment: k=3, d=3
Normal operation mode: summary of experiments
Fraction of correctly matched heuristics to scheduling instances, for k modified weights (tasks or communications) and a difference d between the initial weight value and the value after modification:
– k = 1, d = 1: 90%
– k = 2, d = 2: 80%
– k = 3, d = 3: 75%
Conclusions
– An LCS has been proposed to learn the optimal matching of scheduling algorithms to instances.
– Instances were represented by specially designed signatures.
– During the learning process, knowledge about the matching was extracted in the form of LCS rules and then generalized.
– Creating the signatures is one of the most crucial issues in the proposed approach.
– Performance of the system also depends on many parameters of the LCS.
– We believe that the encouraging experimental results open new possibilities for developing hyperheuristics.