Messaging and data integration models are used in Phase I of the MCAS project to connect a set of message-enabled services into a workflow for efficient refactoring of existing informational portals.

Project goal
Our goal is to factor out presentation and business analysis logic from available monitoring solutions into a standalone model supporting common standards in data manipulation and presentation. We have prototyped several services that rely on common techniques for aggregating data and information displays. In particular, we have chosen portlet technology to compose troubleshooting and metric-analysis dashboards, and Mule ESB to drive the data integration model. We have used the JSR 168 portlet specification and the JBoss Portal engine for the integration of displays, and Mule ESB to transform external data for consumption by the portlet code.

Mission need
The complexity of Grid workflow activities and of their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent in independently maintained and operated software components. Because such an information "pond" is disorganized, it becomes a difficult target for business intelligence analysis, i.e. troubleshooting, incident investigation, and trend spotting.

Metric Correlation and Analysis Service (MCAS)
Mission: deliver a software solution to help with the adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events generated by disjoint middleware.

Data integration model
The data integration model relies on Mule ESB for data source access, inter-component message exchange, and data transformation scheduling. The primary benefit of the Mule ESB integration platform is its ability to manage data and execution flow in a way that is agnostic to transport or interface. In particular, Mule ESB offers (see the sketch after this list):
- components to translate data formats, or to templatise such translations
- options to manage synchronicity, ranging from fully synchronous to SEDA-based (Staged Event-Driven Architecture) solutions
- transports that adapt out of the box to different protocols (TCP, UDP, SMTP, FTP, JDBC, etc.)
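To make the transport-agnostic claim concrete, the following is a minimal sketch of dispatching a request through Mule's client API. It assumes a Mule 2.x-style MuleClient; the configuration file name, endpoint URI, and payload below are hypothetical placeholders, not taken from the MCAS deployment.

    // Minimal sketch of transport-agnostic dispatch through Mule's client API.
    // Assumes a Mule 2.x-style deployment; the config file name and endpoint
    // URI are hypothetical placeholders, not the actual MCAS configuration.
    import org.mule.api.MuleMessage;
    import org.mule.module.client.MuleClient;

    public class EfficiencyFeed {
        public static void main(String[] args) throws Exception {
            // Start a local Mule context from a configuration file (hypothetical name).
            MuleClient client = new MuleClient("mcas-mule-config.xml");

            // The payload is opaque to the messaging layer: no schema is imposed.
            String query = "ds(D0ProductionEfficiency)";

            // Synchronous send; the reply carries the transformed document.
            MuleMessage reply = client.send("tcp://localhost:6666/efficiency", query, null);
            System.out.println(reply.getPayloadAsString());
        }
    }

The point of the sketch is that switching the endpoint from tcp:// to, say, jms:// or file:// changes only the URI string; the sending code and the opaque payload stay the same.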
Example
The data transformation engine understands a template-like language. The following template splits, rescales, and redraws D0 production efficiency data:

    ds(D0ProductionEfficiency)
    ec=eff_code;
    ef=eff_fini;
    RRD(
        CDEF:ec_adj=ec,0,100,LIMIT
        CDEF:ef_adj=ef,0,100,LIMIT
        LINE2:ec_adj#FF0000:eff_code(x100)
        LINE2:ef_adj#0000FF:eff_fini(x100)
    )
    imgsize(600,300)

Figure 2 depicts one of the implemented scenarios, which uses data transformations and an RRD processing engine to split, rescale, and redraw D0 production efficiency data. The data transformation engine is built by setting up independent "models", which drive the message interactions between Mule ESB message endpoints. This particular schema is designed to understand a template-like language and has only one data source (production efficiency) endpoint. The embedded RRD template enables transformations over split data streams. The result of the transformation, a new document with an image, is sent to a portlet instance specifically configured to interact with this data integration model.

Figure 2. An example of a data integration workflow (transformation pipeline: a model gateway converts a REST query into a Mule message with processing instructions; a splitter separates the D0 production efficiency stream into "exit code" and "files produced" series; an RRD processing engine renders the result, which reaches the content management display through a Mule endpoint).

Figure 1. High-level view of control and data flow inside the system (diagram: portlets P1-P4 composed in a CMS portal; a data integration layer comprising a data access layer, with database, URL, and graph readers for the different data sources, and a data transformation layer, with HTML->XML, text->XML, and DB->XML converters, a rules engine, a data aggregator, and a request parser; arrows distinguish control flow from data/information flow).

Messaging
Message exchange is an effective way to decouple the contexts of two programs. Messaging is a soft pattern that does not presuppose a fixed database schema or file format behind it. It can encapsulate data transport and synchronicity semantics. Most importantly, with the help of the Mule message-enabled infrastructure, we can use opaque payloads and model the data access and aggregation workflow without fixing the specifics of the type structure of the data sources.

Summary
The data integration layer defines all URL endpoints that provide content for the display. These endpoints may be simple proxies to existing web pages, or they may hide rules for transforming and aggregating data retrieved from other sources. The purpose of this machinery is to provide data in the content form preferred by the user interface when it is not otherwise available in that form.

The JBoss Portal content management system builds compositions of independent interface elements: portlets (P1, P2, P3, P4). Each portlet can be independently designed to present a unique perspective on particular system aspects. The composition of such elements is a dashboard of indicators specifically put together to comprehensively reflect the state of the entire system. Each portlet is rendered using information provided by the data integration layer.

The data integration layer accesses a set of data sources and uses a collection of rules to transform and aggregate the retrieved content. It generates digest content with only those details that are relevant for rendering the portlet itself. The result is returned synchronously to the requesting party, either a portlet or a parent data integration rule set, as in the sketch below.
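To illustrate this synchronous portlet/data-integration interaction, here is a minimal JSR 168-style portlet sketch; the class name and endpoint URL are hypothetical, and a real MCAS portlet would render richer content.

    // Minimal JSR 168-style portlet sketch: fetch digest content synchronously
    // from a data integration endpoint and render it. The endpoint URL is a
    // hypothetical placeholder, not the actual MCAS configuration.
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.URL;
    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.RenderRequest;
    import javax.portlet.RenderResponse;

    public class EfficiencyDashboardPortlet extends GenericPortlet {
        // Hypothetical data integration endpoint serving pre-digested content.
        private static final String ENDPOINT =
            "http://localhost:8080/mcas/integration/efficiency";

        @Override
        protected void doView(RenderRequest request, RenderResponse response)
                throws PortletException, IOException {
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();

            // Synchronous request: the portlet blocks until the data integration
            // layer returns the digest (already transformed and aggregated).
            BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(ENDPOINT).openStream()));
            String line;
            while ((line = in.readLine()) != null) {
                out.println(line);
            }
            in.close();
        }
    }

Because the digest arrives already transformed and aggregated, the portlet itself stays free of business analysis logic, which is precisely the separation the project goal calls for.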
Future work
One of the major obstacles to reusing the results of this work is the complexity of the technologies adopted. To overcome this, we now focus on understanding common data integration patterns, in order to isolate reusable service components. The role of these components can be adapted to a different data model by rearranging the workflow or by reconfiguring the parameters of the service.

Work supported by the U.S. Department of Energy under contract No. DE-AC02-07CH11359

Andrew Baranovski, Gabriele Garzoglio, Ted Hesselroth, Tanya Levshina, Parag Mhashilkar
{abaranov, garzoglio, tdh, tlevshin, parag}@fnal.gov
Computing Division, Fermilab, Batavia, IL