Optimizing Linear Algebra Code and Symbolic Differentiation with LAGO and SC
Amir Shaikhha, Mohammed ElSeidy, Daniel Espino, and Christoph Koch

Developing Libraries: Evolution of Library Design

Current workflow limitations (native libraries):
- For every new architecture, the library must be manually specialized.
- The library developer has to learn the new hardware architecture, rewrite the library, and optimize and tune it.
- The library user has to learn a new library, learn a new programming language or programming model (e.g. CUDA), and rewrite the applications.

Domain-Specific Languages (DSLs):
- A language for a particular domain, with concise syntax (e.g. MATLAB, Maple, SQL).
- Independent of heterogeneous architectures: CPUs, GPUs, FPGAs, supercomputers.
- Leverages domain-specific optimizations: algebraic rewrites, lambda calculus, etc.

With a DSL framework, the user instead has to learn only that DSL, writes the application once, reuses it for different platforms, and benefits from domain-specific optimizations.

Systems Compiler (SC)1:
- Performs automated specialization: given the DSL semantics/rules (the library semantics) and a platform specification, SC applies compilation techniques and code generation to produce a tuned library for each target platform (C++, MPI, ...).
- Instantiations include DBLAB2 and LAGO.

LAGO: Matrix Algebra
- A modular framework built from extensible transformation rules applied to a LAGO program.
- The DSL expresses matrix manipulation operations for scientific computations, machine learning, and graph algorithms.
- Generates optimized code for the underlying processing substrate (Octave, Spark, etc.).
- Leverages symbolic computation: general-purpose compilers cannot reason symbolically, while computer algebra systems (CAS) such as Mathematica can, but are not meant for performance (very slow). In LAGO, performance optimization is a first-class citizen.
- The LAGO synthesizer combines equivalence rules, meta-information inference, and meta-information-dependent rewrite rules.
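The rule-based rewriting at the heart of such a synthesizer can be illustrated with a minimal sketch (plain Python, purely illustrative and not LAGO's actual implementation): expressions are trees, and simplification rules such as X*I ⇒ X and X*0 ⇒ 0 are applied bottom-up until no rule fires.

```python
# Minimal sketch of rule-based algebraic simplification (illustrative only,
# not LAGO's implementation). Expressions are nested tuples ("*", a, b) or
# ("+", a, b); "I" is the identity and "0" the zero matrix.

def simplify(expr):
    """Apply simplification rules bottom-up over the expression tree."""
    if not isinstance(expr, tuple):
        return expr
    op, a, b = expr
    a, b = simplify(a), simplify(b)
    if op == "*" and b == "I":      # rule: X * I => X
        return a
    if op == "*" and a == "I":      # rule: I * X => X
        return b
    if op == "*" and "0" in (a, b): # rule: X * 0 => 0, 0 * X => 0
        return "0"
    if op == "+" and b == "0":      # rule: X + 0 => X
        return a
    if op == "+" and a == "0":      # rule: 0 + X => X
        return b
    return (op, a, b)

# A*(B*I) + C*0 simplifies to A*B
expr = ("+", ("*", "A", ("*", "B", "I")), ("*", "C", "0"))
print(simplify(expr))  # ('*', 'A', 'B')
```

A real synthesizer would also search over equivalence rules (e.g. A(B + A⁻¹) ⇔ AB + I) guided by a cost model, rather than applying a fixed rule order.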
Modular components of the synthesizer: meta-information inference, simplification rules, compiler optimizations, a search algorithm, a cost model, and the transformation rules themselves.

Algebraic Rewrites
- Equivalence rules: A(B + A⁻¹) ⇔ AB + AA⁻¹ ⇔ AB + I
- Simplification rules: X*I ⇒ X, X*0 ⇒ 0
- Meta-information inference rules: A = Aᵀ ⇒ A is symmetric, e.g. XᵀX, X⋀Xᵀ
- Meta-information-dependent rewrite rules: factored low-rank representations, e.g. the sum of two rank-k factored matrices U₁V₁ᵀ + U₂V₂ᵀ ⇔ [U₁ U₂][V₁ V₂]ᵀ, of rank at most 2k
- Compiler optimizations: common subexpression elimination (CSE), hoisting, constant folding

LAGO hits a sweet spot between symbolic manipulation and compiler optimizations, producing optimized code via the Systems Compiler (SC).

Derivation & Symbolic Computation

Symbolic Differentiation
- Many optimization problems require differentiating expressions formulated over matrices, vectors, and scalars.
- Challenge: manual derivation, optimization, and coding is hard, messy, and error-prone.
- Approach: the developer defines a set of differentiation reduction rules, e.g. the product rule ∂(u*v, x) → u*∂(v, x) + ∂(u, x)*v; LAGO then automatically derives, optimizes, and generates code.
- Example: non-negative matrix factorization, Mₙₓₙ ⇒ Uₙₓₖ Vₖₓₙ.
- [Figure: PDFs and derived update rules for the Gaussian, Poisson, and exponential distributions; figure content not recoverable from the transcript.]

Symbolic Incremental Computation (Δ)
- Data undergoes continuous dynamic change.
- Challenge: after slight changes in the input data, the user has to re-evaluate analysis programs to reflect the incremental changes; re-evaluation is expensive and does not reuse pre-processed results.
- Approach: the developer defines a set of incremental delta (Δ) evaluation rules, e.g. the product expansion (A + ΔA)B → AB + ΔA·B; the Δs are represented in factored form as outer products. LAGO automatically derives, optimizes, and generates code.
- Example: linear regression via ordinary least squares. As data points change or increase, the regression coefficients change; full re-evaluation costs O(n³), while the incremental update costs O(n²).

Evaluation
- Hardware: 2 × 6-core 2.66 GHz Intel Xeon, 64 GB DDR3 RAM, Mac OS X Lion.
- [Evaluation plots not recoverable from the transcript.]

References
[1, 2] A. Shaikhha, Y. Klonatos, L. Parreaux, L. Brown, M. Dashti, and C. Koch, "How to Architect a Query Compiler", SIGMOD '16.
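The idea of developer-supplied differentiation reduction rules (e.g. the product rule ∂(u*v, x) → u*∂(v, x) + ∂(u, x)*v, as above) can be sketched with a tiny recursive engine over expression trees. This is an illustrative plain-Python sketch of the technique, not LAGO's implementation; the resulting derivative tree would then be fed to the simplification rules.

```python
# Minimal sketch of rule-based symbolic differentiation (illustrative only).
# Expressions are nested tuples ("+", u, v) / ("*", u, v) or leaf symbols.

def d(expr, x):
    """Differentiate expr with respect to the symbol x via reduction rules."""
    if expr == x:                   # d(x, x) -> 1
        return 1
    if not isinstance(expr, tuple): # any other symbol or constant -> 0
        return 0
    op, u, v = expr
    if op == "+":                   # sum rule: d(u+v) -> d(u) + d(v)
        return ("+", d(u, x), d(v, x))
    if op == "*":                   # product rule: d(u*v) -> u*d(v) + d(u)*v
        return ("+", ("*", u, d(v, x)), ("*", d(u, x), v))
    raise ValueError(f"no differentiation rule for operator {op!r}")

# d(x*x)/dx yields the un-simplified tree x*1 + 1*x,
# which simplification rules would reduce further.
print(d(("*", "x", "x"), "x"))  # ('+', ('*', 'x', 1), ('*', 1, 'x'))
```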
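The O(n³)-versus-O(n²) gap claimed for incremental ordinary least squares can be illustrated with a minimal NumPy sketch. It uses the classical Sherman–Morrison rank-1 update of the inverse Gram matrix when one data point arrives; this is an illustration of the asymptotics under that standard identity, not LAGO-generated code, and the variable names are invented for the example.

```python
import numpy as np

def sherman_morrison_update(G_inv, x):
    """Rank-1 update (G + x x^T)^{-1} from G^{-1} in O(d^2) (G symmetric)."""
    Gx = G_inv @ x
    return G_inv - np.outer(Gx, Gx) / (1.0 + x @ Gx)

rng = np.random.default_rng(0)
d, n = 5, 100
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Pre-processed state: inverse Gram matrix and X^T y.
G_inv = np.linalg.inv(X.T @ X)
Xty = X.T @ y

# A new data point arrives (the delta is a factored/outer-product form).
x_new = rng.standard_normal(d)
y_new = rng.standard_normal()

# Incremental path: O(d^2) update instead of re-solving from scratch.
G_inv = sherman_morrison_update(G_inv, x_new)
Xty = Xty + y_new * x_new
beta_incremental = G_inv @ Xty

# Full re-evaluation path, for comparison (cubic in the problem size).
X_full = np.vstack([X, x_new])
y_full = np.append(y, y_new)
beta_full, *_ = np.linalg.lstsq(X_full, y_full, rcond=None)

assert np.allclose(beta_incremental, beta_full)
```

Both paths produce the same regression coefficients; only the cost of reacting to the change differs, which is precisely what the factored Δ representation buys.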