The Vision of Self-Aware Performance Models
Johannes Grohmann, Simon Eismann, Samuel Kounev. International Conference on Software Architecture (ICSA), Seattle.
Motivation

A self-aware performance model could answer the questions users typically ask during capacity planning and system design analysis:

- "What's the appropriate model granularity?" → "Model the database as a black box, as it is never a bottleneck."
- "What's an appropriate solver for this model?" → "Use mean value analysis, as I do not contain any forks."
- "How accurate is this performance model?" → "My prediction accuracy is currently 97%."
- "How to adapt this model if the system evolves?" → "Recalibrate the service demand of component X."
Overview
Query-based Model Tailoring
The required model granularity changes depending on the requested metric and the performed adaptations.

Idea: dynamically adapt (tailor) the model to fit the scenario:
- Provide multiple component descriptions
- Dynamically select the appropriate modeling granularity
- Model unrelated system parts as black boxes

Benefits:
- Improves simulation time
- Scales better with system size
- Maintains accuracy
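The tailoring idea above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: component names, granularity levels, and the selection rule are all assumptions made up for this example.

```python
# Sketch of query-based model tailoring: each component offers several
# descriptions at different granularities, and the query decides which one
# is used. Unrelated components are modeled as black boxes.

BLACK_BOX = "black-box"

# Illustrative components, each with a coarse and a fine description.
component_descriptions = {
    "frontend": {"coarse": "frontend-coarse", "fine": "frontend-fine"},
    "database": {"coarse": "database-coarse", "fine": "database-fine"},
}

def tailor_model(query_components, granularity="coarse"):
    """Select a description for every component the query touches;
    model all unrelated system parts as black boxes."""
    tailored = {}
    for name, descriptions in component_descriptions.items():
        if name in query_components:
            tailored[name] = descriptions[granularity]
        else:
            tailored[name] = BLACK_BOX
    return tailored

# A query about the frontend does not need a detailed database model:
model = tailor_model({"frontend"}, granularity="fine")
print(model)  # {'frontend': 'frontend-fine', 'database': 'black-box'}
```

Because the unrelated parts collapse to black boxes, the tailored model stays small as the system grows, which is where the simulation-time and scalability benefits come from.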
Query-tailored Model Solution
Brosig et al. [1] showed:
- Significant time-to-result and accuracy differences between different simulation-based solvers
- Time-to-result and accuracy depend on model properties

Idea:
- Predict accuracy based on information loss
- Predict time-to-result based on historic information
- Select the best-suited solver based on these predictions

[1] Brosig, Fabian, et al. "Quantitative Evaluation of Model-Driven Performance Analysis and Simulation of Component-Based Architectures." IEEE Transactions on Software Engineering 41.2 (2015).
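The solver-selection idea can be illustrated with a minimal sketch. All solver names, accuracy values, and historic timings below are invented for the example; a real implementation would derive predicted accuracy from the model's information loss and time-to-result from measured runs.

```python
# Sketch of query-tailored solver selection: predict time-to-result from
# historic runs and pick the fastest solver whose predicted accuracy meets
# the query's requirement.

from statistics import mean

# Historic time-to-result measurements per solver (seconds, illustrative).
history = {
    "mean-value-analysis": [0.2, 0.3, 0.25],
    "simulation": [40.0, 55.0, 47.0],
}

# Predicted accuracy per solver, e.g. derived from the information each
# solver discards when transforming the input model (information loss).
predicted_accuracy = {
    "mean-value-analysis": 0.90,
    "simulation": 0.98,
}

def select_solver(required_accuracy):
    """Cheapest solver predicted to be accurate enough, or None."""
    candidates = [s for s, acc in predicted_accuracy.items()
                  if acc >= required_accuracy]
    if not candidates:
        return None
    return min(candidates, key=lambda s: mean(history[s]))

print(select_solver(0.85))  # mean-value-analysis (fast and accurate enough)
print(select_solver(0.95))  # simulation (only solver meeting the target)
```

The design choice here mirrors the slide: accuracy acts as a constraint from the query, and time-to-result is the objective to minimize among the remaining candidates.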
Model Validation

Self-reflective parameter analysis:
- Compare model variable descriptions with monitoring data
- Model confidence is an aggregation of variable confidences
- Detects inaccuracies proactively
- Cannot detect structural inaccuracies

Historic prediction accuracy analysis:
- Compare performance predictions with monitoring data
- Includes structural inaccuracies
- Hard to pinpoint the source of an inaccuracy
- Can only validate accuracy for previously deployed system states

Combining both approaches allows for holistic model validation.
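The self-reflective parameter analysis can be sketched as follows. The confidence metric (one minus the relative error) and the mean aggregation are assumptions chosen for illustration; the parameter names and monitoring values are made up.

```python
# Sketch of self-reflective parameter analysis: compare each model
# variable's value with fresh monitoring data and aggregate the
# per-variable confidences into an overall model confidence.

from statistics import mean

def variable_confidence(modeled_value, observations):
    """Confidence in one variable: 1 minus the relative error between
    the modeled value and the observed mean, clamped to [0, 1]."""
    observed = mean(observations)
    rel_error = abs(modeled_value - observed) / observed
    return max(0.0, 1.0 - rel_error)

def model_confidence(model_params, monitoring_data):
    """Overall confidence as the mean of variable confidences; other
    aggregations (e.g. the minimum) are equally plausible."""
    return mean(variable_confidence(model_params[v], monitoring_data[v])
                for v in model_params)

params = {"service_demand_X": 10.0, "arrival_rate": 5.0}
monitoring = {"service_demand_X": [9.0, 11.0, 10.0],
              "arrival_rate": [4.0, 4.0, 4.0]}
print(round(model_confidence(params, monitoring), 2))  # 0.88
```

Note how this detects a drifting parameter proactively, before any prediction is made, but says nothing about structural mismatches; that is why the slide pairs it with historic prediction accuracy analysis.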
Model Recalibration

Inaccurate parameterization:
- Relearn the parameter with additional monitoring data
- Choose a different learning approach (meta-learning)

Structural inaccuracies:
- Re-extract the model from monitoring data
- Add a black box causing, e.g., additional response time or utilization
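The parameterization branch of recalibration can be sketched like this. The mean estimator, the drift tolerance, and the parameter name are illustrative assumptions; a real approach might use regression or meta-learning to choose among several estimators.

```python
# Sketch of model recalibration for an inaccurately parameterized
# variable: when validation flags a parameter, relearn its value from
# additional monitoring data.

from statistics import mean

model_params = {"service_demand_X": 10.0}

def recalibrate(params, name, observations, tolerance=0.1):
    """Relearn `name` from monitoring data if it drifted beyond the
    relative tolerance; otherwise keep the current value."""
    estimate = mean(observations)
    if abs(params[name] - estimate) / estimate > tolerance:
        params[name] = estimate  # relearned parameter value
    return params

# Monitoring shows the service demand of component X has grown:
recalibrate(model_params, "service_demand_X", [14.0, 16.0, 15.0])
print(model_params)  # {'service_demand_X': 15.0}
```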
Conclusion

Self-aware performance models as a solution to common modeling problems. Four concrete examples of self-awareness in performance models:
- The model dynamically adapts to each request (Query-based Model Tailoring)
- The model solving process adapts to the input model (Query-tailored Model Solution)
- The model learns about its prediction accuracy (Model Validation)
- The model repairs itself in case of system evolution (Model Recalibration)
Thank you for your attention!