1
Gradual Relaxation Techniques with Applications to Behavioral Synthesis
Zhiru Zhang, Yiping Fan, Miodrag Potkonjak, Jason Cong
Department of Computer Science, University of California, Los Angeles
Partially supported by NSF under award CCR-0096383
2
Outline
Motivations & objectives
Gradual relaxation techniques
–Driver example: time-constrained scheduling
Other driver examples
–Maximum Independent Set (MIS)
–Soft real-time scheduling
Experimental results
Conclusions
3
Motivations & Objectives
Motivations:
–Many synthesis problems are computationally intractable
  SAT, scheduling, graph coloring, …
–Lack of a systematic way to develop effective heuristics
Objectives:
–Development of a new general heuristic paradigm: gradual relaxation
–Applications to a wide range of synthesis problems
4
Gradual Relaxation Techniques
Most constrained principle
Minimal freedom reduction
Negative thinking
Compounding variables
Simultaneous step consideration
Calibration
Probabilistic modeling
5
Driver Example: Time-Constrained Scheduling (1)
Problem: time-constrained scheduling
–Given:
  (1) A CDFG G(V, E)
  (2) A time constraint T
–Objective: schedule the operations of V into T cycles so that the resource usage is minimized and all precedence constraints in G are satisfied
6
Driver Example: Time-Constrained Scheduling (2)
Related work
–M. C. McFarland, A. C. Parker, and R. Camposano, Proc. of IEEE, 1990
–G. De Micheli, 1994
–E. A. Lee and D. G. Messerschmitt, Proc. of IEEE, 1987, SDF scheduling
Classical approach – Force-Directed Scheduling (FDS)
–P. G. Paulin and J. P. Knight, DAC 1987
–Exploit schedule freedom (slack) to minimize the hardware resources
–Iterative approach: schedule one operation per iteration
7
Driver Example: Time-Constrained Scheduling (3)
Determine ASAP & ALAP schedules
Determine the time frame of each operation
–Length of box: possible execution cycles
–Width of box: probability of assignment
–Uniform distribution, area assigned = 1
Create distribution graphs (DGs)
–Sum of probabilities of each Op type: DG(i) = Σ_Op Prob(Op, i)
–Indicates concurrency of similar Ops
[Figure: ASAP and ALAP schedules of the example CDFG, the resulting time frames over C-steps 1–4 (widths 1, 1/2, 1/3), and the DGs for Multiply and for Add/Sub/Comp]
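Below is a minimal Python sketch of how the time frames and distribution graphs could be built, assuming unit-latency operations and a CDFG given as a predecessor map; the data layout, names, and tiny example DAG are illustrative assumptions, not the authors' code.

```python
# Minimal sketch: ASAP/ALAP time frames and a distribution graph for FDS.
from collections import defaultdict

def time_frames(preds, T):
    """Return {op: (asap, alap)} for unit-latency operations under T cycles."""
    succs = defaultdict(list)
    for op, ps in preds.items():
        for p in ps:
            succs[p].append(op)
    asap, alap = {}, {}
    def asap_of(op):
        if op not in asap:
            asap[op] = 1 + max((asap_of(p) for p in preds[op]), default=0)
        return asap[op]
    def alap_of(op):
        if op not in alap:
            alap[op] = min((alap_of(s) for s in succs[op]), default=T + 1) - 1
        return alap[op]
    return {op: (asap_of(op), alap_of(op)) for op in preds}

def distribution_graph(frames, op_type, which):
    """DG(i) = sum over ops of the given type of Prob(op, i), uniform over each frame."""
    dg = defaultdict(float)
    for op, (lo, hi) in frames.items():
        if op_type[op] == which:
            for i in range(lo, hi + 1):
                dg[i] += 1.0 / (hi - lo + 1)
    return dict(dg)

# Example: two chained multiplies and one independent add, time constraint T = 3
preds = {"m1": [], "m2": ["m1"], "a1": []}
op_type = {"m1": "*", "m2": "*", "a1": "+"}
frames = time_frames(preds, 3)
print(frames)                                    # {'m1': (1, 2), 'm2': (2, 3), 'a1': (1, 3)}
print(distribution_graph(frames, op_type, "*"))  # multiply DG per control step
```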
8
Most Constrained Principle
Principle:
–First resolve the most constrained components
–Minimally impact the difficulty of still unresolved constraints
Related work:
–General technique
  Bitner and Reingold, 1975; Brelaz, 1979, for graph coloring
  Pearl, 1984, intelligent search
–Slack-based heuristics
  Davis, Tindell, and Burns, 1993; Goldwasser, 2003
–Force-directed scheduling
  Paulin and Knight, 1989
9
Most Constrained Principle: Time-Constrained Scheduling
Operation Op, at control step i, targeting control step t
–Force(Op, i, t) = DG(i) * x(Op, i, t)
–x(Op, i, t): the change in Prob(Op, i) when Op is scheduled to t
The self force of operation Op w.r.t. control step t
–SelfForce(Op, t) = Σ_{i ∈ time frame of Op} Force(Op, i, t)
[Figure: time frames of the example operations over C-steps 1–4, before and after one scheduling decision, illustrating the force computation]
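The force computation can be sketched as follows, assuming uniform probability over the current frame and taking x(Op, i, t) as the probability change when Op is pinned to step t; the frame and DG values in the example are illustrative assumptions.

```python
def self_force(op, t, frames, dg):
    """SelfForce(op, t) = sum over the op's frame of DG(i) * x(op, i, t)."""
    lo, hi = frames[op]
    width = hi - lo + 1
    total = 0.0
    for i in range(lo, hi + 1):
        before = 1.0 / width             # Prob(op, i) with the current frame
        after = 1.0 if i == t else 0.0   # Prob(op, i) once op is fixed to step t
        total += dg.get(i, 0.0) * (after - before)   # Force(op, i, t)
    return total

# Example values consistent with the frames above and a uniform multiply DG
frames = {"m1": (1, 2), "m2": (2, 3), "a1": (1, 3)}
dg = {1: 0.5, 2: 1.0, 3: 0.5}
print(self_force("m1", 1, frames, dg))   # negative force: fixing m1 to step 1 balances the DG
```

Traditional FDS picks the (op, t) pair with the minimum self force and commits it, as the next slides contrast with the negative-thinking variant.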
10
Minimal Freedom Reduction / Negative Thinking (1)
Minimal freedom reduction – the key to a good heuristic:
–Avoid the greedy behavior of optimization
–Make a small, gradual, atomic decision
–Evaluate its individual impact before committing to large decisions
Negative thinking – a way to realize minimal freedom reduction
–Traditional heuristics resolve a specific component of the solution
–Negative thinking determines what will not be considered as a component of the solution
11
Minimal Freedom Reduction / Negative Thinking (2)
Similar ideas:
–Improved force-directed scheduling: W. F. J. Verhaegh, P. E. R. Lippens, E. H. L. Aarts, J. H. M. Korst, J. L. van Meerbergen, and A. van der Werf, IEEE Trans. on Computer-Aided Design of Integrated Circuits and Systems, 1995
  Gradually shrink operations' time frames
–Standard-cell global routing: J. Cong and Patrick H. Madden, ISPD 1997
  Iterative deletion method – from the complete routing graph, delete edges one by one to get an optimum routing tree
12
Negative Thinking: Time-Constrained Scheduling
Traditional FDS:
–Select the minimum-force pair (Op, t), schedule Op to t
Negative-thinking FDS:
–Select the maximum-force pair (Op, t), remove t from Op's time frame
[Figure: time frames and the DGs for Multiply and Add/Sub/Comp over C-steps 1–4, before and after one negative-thinking step]
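A minimal sketch of one negative-thinking step, assuming forces have already been computed (e.g. with the self_force helper above) for the first and last control steps of each unlocked operation's frame; the data layout and example values are illustrative assumptions.

```python
def negative_thinking_step(frames, forces):
    """frames: {op: (first, last)} control-step windows.
    forces: {(op, t): force} for the end points of each unlocked frame.
    Remove the single worst (maximum-force) option and return the frames."""
    (op, t), _ = max(forces.items(), key=lambda kv: kv[1])
    lo, hi = frames[op]
    assert lo < hi and t in (lo, hi), "only end points of unlocked frames are removable"
    frames[op] = (lo + 1, hi) if t == lo else (lo, hi - 1)
    return frames

frames = {"m1": (1, 2), "m2": (2, 3)}
forces = {("m1", 1): -0.25, ("m1", 2): 0.4, ("m2", 2): 0.3, ("m2", 3): -0.1}
print(negative_thinking_step(frames, forces))   # m1 loses step 2 and is now locked to step 1
```

Each step only removes one option instead of committing a full assignment, which is the minimal freedom reduction the previous slides motivate.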
13
Compounding Variables / Simultaneous Steps Consideration (1)
Compounding variables
–For problems where variables can only be assigned binary values
–Combine several variables together
Simultaneous steps consideration
–Consider a small negative decision on a set of variables simultaneously
Example: a SAT instance (see the sketch below)
–Compound x1 and x2: there are 4 assignment options
–Evaluate the impact of each option on the constraints
–Negative thinking: remove one option, keep the other three promising options
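The sketch below illustrates compounding two SAT variables: the four joint assignments are scored against the clauses and the single worst option is discarded. The clause encoding and the scoring rule are illustrative assumptions, not the paper's exact procedure.

```python
from itertools import product

def clauses_killed(clauses, assignment):
    """Count clauses made unsatisfiable by a partial assignment {var: bool}
    (every literal of the clause is over assigned vars and evaluates to false)."""
    killed = 0
    for clause in clauses:
        lits = [(abs(l), l > 0) for l in clause]
        if all(v in assignment for v, _ in lits) and \
           not any(assignment[v] == want for v, want in lits):
            killed += 1
    return killed

clauses = [[1, 2], [-1, 2], [1, -2], [-1, -2, 3]]        # e.g. [1, -2] means x1 OR NOT x2
options = list(product([False, True], repeat=2))          # 4 joint options for (x1, x2)
scores = {(a, b): clauses_killed(clauses, {1: a, 2: b}) for a, b in options}
worst = max(scores, key=scores.get)                        # the option that hurts the most
options.remove(worst)                                      # negative thinking: keep the other 3
print(scores, "-> removed", worst)
```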
14
Calibration
Heuristics conduct the optimization
–Keep the options for important variables
–Discard the options for unimportant variables
Example:
–In resource-minimization scheduling:
  Multipliers are much more expensive than adders
  Preserve maximum slack for the multiplications
  Lower the priority of minimizing the required adders
[Figure: example time frames over C-steps 1–4, showing the multiplications keeping their slack while the cheaper operations are resolved first]
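One way to realize this calibration is to weight each operation's force by the cost of the resource it needs, so that cheap adders lose freedom before expensive multipliers; the cost values and the weighting rule below are assumptions for illustration only.

```python
# Hypothetical per-type resource costs; a multiplier is assumed far costlier than an adder.
RESOURCE_COST = {"*": 8.0, "+": 1.0, "-": 1.0, "<": 1.0}

def calibrated_force(raw_force, op_kind):
    # Scaling the force down for expensive ops keeps their options alive longer,
    # i.e. multiplications retain maximum slack while the cheap ops are resolved first.
    return raw_force / RESOURCE_COST[op_kind]

print(calibrated_force(0.4, "*"), calibrated_force(0.4, "+"))   # 0.05 vs 0.4
```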
15
Probabilistic Modeling
Options of every variable are non-uniformly distributed
Probabilistic modeling
–A non-uniform function of all constraints imposed on a particular variable
–Example: prob(1,1) = 0.6, prob(1,2) = 0.3, prob(1,3) = 0.1
[Figure: possible placements of tasks 1–3 over c-steps 1–5 and the resulting non-uniform probabilities]
16
When is Gradual Relaxation Most Effective?
Minimal freedom reduction / negative thinking
–A large number of variables have significant slack
–Variables have complex interactions among a large number of constraints
Compounding variables / simultaneous steps consideration
–Each variable has a small set of potential values
Calibration
–The final solution only involves relatively few types of resources
Probabilistic modeling
–Effective for large and complex instances
17
Driver Example: Maximum Independent Set (1)
Problem: Maximum Independent Set
–Given: G(V, E)
–Objective: find a maximum-size independent set V′ ⊆ V, such that for all u ∈ V′ and v ∈ V′, (u, v) ∉ E
Related work
–A popular generic NP-complete problem
  M. R. Garey and D. S. Johnson, 1979
–Useful for efficient graph coloring
  D. Kirovski and M. Potkonjak, DAC 1998
18
Driver Example: Maximum Independent Set (2)
Reasoning:
–In practice, the MIS size is much smaller than the total graph size
A smaller decision (see the sketch below):
–Select the most constrained vertex not to be in the MIS
–Simple heuristic: h1(v) = number of neighbors of v
–Look-forward heuristic: h2(v) = Σ_{u ∈ Neighbors(v)} 1 / (number of neighbors of u)
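A minimal sketch of a negative-thinking MIS heuristic built around h1 and h2: repeatedly exclude the most constrained vertex and keep vertices that end up with no remaining neighbors. The two heuristics follow the slide; the surrounding loop and graph encoding are illustrative assumptions.

```python
def mis_negative(graph, look_forward=False):
    """graph: {vertex: set(neighbors)}; returns an independent set."""
    g = {v: set(ns) for v, ns in graph.items()}
    mis = set()
    while g:
        # vertices with no remaining neighbors can safely join the independent set
        for v in [v for v, ns in g.items() if not ns]:
            mis.add(v)
            del g[v]
        if not g:
            break
        if look_forward:   # h2(v) = sum over neighbors u of 1 / deg(u)
            score = lambda v: sum(1.0 / len(g[u]) for u in g[v])
        else:              # h1(v) = number of neighbors of v
            score = lambda v: len(g[v])
        worst = max(g, key=score)          # most constrained vertex: exclude it from the MIS
        for u in g[worst]:
            g[u].discard(worst)
        del g[worst]
    return mis

# A 4-cycle plus a pendant vertex: {a, c, e} is a maximum independent set
graph = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"a", "c", "e"}, "e": {"d"}}
print(mis_negative(graph, look_forward=True))
```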
19
Driver Example: Soft Real-Time Scheduling (1)
Problem: soft real-time scheduling
–Given:
  (1) A set of non-preemptive tasks τ = {τ1, τ2, …, τn}, where each task τi = (ai, di, ei) is characterized by an arrival time ai, a deadline di, and an execution time ei
  (2) A single processor P
  (3) A timing constraint T
–Objective: schedule a subset of the tasks in τ on processor P within the available time T so that the number of tasks scheduled is maximized
20
Driver Example: Soft Real-Time Scheduling (2)
Multimedia applications
–B. Kao and H. Garcia-Molina, 1994
–B. Adelberg, H. Garcia-Molina, and B. Kao, 1994
Video and WWW servers
–M. Jones, D. Rosu, M.-C. Rosu, 1997
Formal definition
–P. D'Argenio, J.-P. Katoen, and E. Brinksma, 1999
CAD and embedded systems
–D. Ziegenbein, J. Uerpmann, and R. Ernst, ICCAD 2000
–D. Verkest, P. Yang, C. Wong, and P. Marchal, ICCAD 2001
–K. Richter, D. Ziegenbein, M. Jersak, and R. Ernst, DAC 2002
21
Driver Example: Soft Real-Time Scheduling (3)
Two-phase heuristic:
–Conflict minimization
  Gradually shrink the time frame for every task
–Legalization
Probabilistic modeling:
–Trapezoid-shaped task probability distribution prob(τi, t), with breakpoints at si, si + ei, ci − ei, and ci
[Figure: trapezoid-shaped probability prob(τi, t) over the task's time frame]
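A sketch of the trapezoid-shaped probability, assuming a non-preemptive task whose start time is uniformly distributed over its frame; the closed form below is an assumption that reproduces the breakpoints si, si + ei, ci − ei, ci named on the slide, not the authors' exact formula.

```python
def trapezoid_prob(t, s, c, e):
    """Probability that a task with frame [s, c) and execution time e occupies slot t,
    assuming its start slot is uniform over [s, c - e]."""
    starts = c - e - s + 1                               # number of feasible start slots
    if t < s or t > c - 1:
        return 0.0
    covering = min(t, c - e) - max(s, t - e + 1) + 1     # start slots that cover slot t
    return max(covering, 0) / starts

s, c, e = 2, 9, 3                                        # arrival 2, cutoff 9, execution time 3
print([round(trapezoid_prob(t, s, c, e), 2) for t in range(1, 11)])
# ramps up, plateaus, ramps down: [0.0, 0.2, 0.4, 0.6, 0.6, 0.6, 0.4, 0.2, 0.0, 0.0]
```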
22
Driver Example: Soft Real-Time Scheduling (4)
Objective:
–Minimize the number of conflicts
Repeat until all tasks are locked:
–Update the distribution graph
–Compute forces for every task at the start and cutoff time slots
–Select the maximum-force pair (T, t), remove time slot t from task T's time frame
[Figure: task probability per time slot before and after one freedom-reduction step]
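A sketch of the conflict-minimization loop, reusing the trapezoid_prob helper from the sketch above; the force here is simplified to the distribution-graph value at the slot being trimmed, which is an illustrative assumption rather than the authors' exact force definition.

```python
from collections import defaultdict

def minimize_conflicts(tasks):
    """tasks: list of (name, s, c, e); shrink frames until each just fits its execution time."""
    frames = {name: [s, c] for name, s, c, e in tasks}
    exe = {name: e for name, s, c, e in tasks}
    while True:
        unlocked = [n for n in frames if frames[n][1] - frames[n][0] > exe[n]]
        if not unlocked:
            return frames
        # distribution graph: expected number of tasks wanting each time slot
        dg = defaultdict(float)
        for n, (s, c) in frames.items():
            for t in range(s, c):
                dg[t] += trapezoid_prob(t, s, c, exe[n])   # assumes the helper defined above
        # force of trimming one slot from the start or the cutoff of each unlocked frame
        forces = {}
        for n in unlocked:
            s, c = frames[n]
            forces[(n, "start")] = dg[s]
            forces[(n, "cutoff")] = dg[c - 1]
        n, end = max(forces, key=forces.get)               # remove the most contended slot
        if end == "start":
            frames[n][0] += 1
        else:
            frames[n][1] -= 1

tasks = [("t1", 0, 6, 2), ("t2", 0, 6, 3), ("t3", 2, 8, 2)]
print(minimize_conflicts(tasks))
```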
23
Experimental Results: Maximum Independent Set
Apply to DIMACS benchmark graphs from the Clique problem challenge
Compare to a state-of-the-art iterative algorithm
–MIS algorithm used in D. Kirovski and M. Potkonjak, DAC 1998
–Similar quality
–Much faster: 50X using h1, 30X using h2
Look-forward heuristic outperforms the simple version
24
Experimental Results: Time-Constrained Scheduling (1)
Scheduling results comparison under critical-path time constraint
25
Experimental Results: Time-Constrained Scheduling (2)
Scheduling results comparison under a time constraint of 1.5x the critical path length
26
Experimental Results: Soft Real-Time Scheduling
Soft real-time scheduling results
27
Conclusions
Development of gradual relaxation techniques
–Most constrained principle
–Minimal freedom reduction
–Negative thinking
–Compounding variables
–Simultaneous step consideration
–Calibration
–Probabilistic modeling
Applications to:
–Maximum independent set
–Time-constrained scheduling
–Soft real-time scheduling