Data Warehouse Administration Console (DAC) is a centralized framework for the setup, configuration, administration and loading of the Oracle Business Analytics Warehouse. It is one of the key components of the Oracle Business Intelligence Applications architecture.
While working on various BI Apps projects, I have come across many questions from customers regarding DAC. What is DAC? What does it do? Why not use an ETL monitoring and control tool instead of DAC? Is DAC dispensable?
This post is an attempt to discuss DAC in enough detail to answer some of the most frequently raised questions about it.
What is DAC?
DAC is a centralized framework for the setup, configuration, administration and loading of the Oracle Business Analytics Warehouse. It has an easy-to-use interface for deploying, defining, administering and monitoring data warehouse processes.
In simple terms, DAC is a metadata-driven “ETL orchestration tool” that supports application configuration, execution, recovery and monitoring.
DAC allows
• Pin-point deployment
• Load balancing / parallel loading
• Reduced load windows
• Fine-grained failure recovery (see the sketch after this list)
• Index management
• Database statistics collection
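The fine-grained failure recovery above rests on persisted run status: when an ETL run fails, only the tasks that did not complete are re-run on restart. Here is a minimal, hypothetical Python sketch of that idea; the status file, task names and helper functions are invented for illustration (DAC itself keeps this state in its repository tables).

```python
import json
import os

STATUS_FILE = "run_status.json"   # hypothetical; DAC records run state in its repository


def load_status():
    """Read the task status left behind by a previous (possibly failed) run."""
    return json.load(open(STATUS_FILE)) if os.path.exists(STATUS_FILE) else {}


def save_status(status):
    json.dump(status, open(STATUS_FILE, "w"))


def run_with_recovery(task_names, run_task):
    """Run tasks in order, skipping those already marked COMPLETED in a prior run."""
    status = load_status()
    for name in task_names:
        if status.get(name) == "COMPLETED":
            continue                      # already done before the failure
        try:
            run_task(name)
            status[name] = "COMPLETED"
        except Exception:
            status[name] = "FAILED"
            save_status(status)
            raise                         # stop here; a restart resumes from this task
        save_status(status)


# Illustrative task names only.
run_with_recovery(["SDE_ORA_GLJournals", "SIL_GLOtherFact"],
                  run_task=lambda name: print("running", name))
```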
ETL Architecture and DAC
DAC is an integral part of the BI Applications architecture; the ETL overview with DAC is shown below.
Frequently asked Questions by Customers
The questions customers typically raise about the DAC component are:
· Can Informatica's native features replace DAC?
· Is there any ETL requirement that cannot be achieved by the ETL tool alone?
· Does customization of the OOTB ETL tasks also require customization of DAC? Isn't that an overhead?
· Does it support multi-source data extract and load, e.g. Siebel and an Oracle source system?
· Does it support multiple-instance data extract and load, e.g. two Oracle E-Business Suite instances?
· Can we have multiple execution plans running at once?
· Is it scalable? Is it HA compatible?
Why DAC?
DAC provides application-specific capabilities that are not prebuilt into ETL platforms; it complements the ETL platform rather than replacing it. DAC is designed to leverage a range of Informatica features, and it supports executing PL/SQL blocks, SQL files and external programs to enhance the extract and load process.
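To make this concrete, here is a minimal, hypothetical Python sketch of what such an auxiliary task might do; the connection details, file names and helper functions are invented and are not DAC APIs (in practice these tasks are registered through the DAC client rather than coded by hand).

```python
import subprocess

import cx_Oracle  # assumes the Oracle Python driver is available


def run_sql_file(conn, path):
    """Run each statement in a plain SQL file (naive ';' split; not for PL/SQL blocks)."""
    with open(path) as f:
        statements = [s.strip() for s in f.read().split(";") if s.strip()]
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    conn.commit()


def run_external_program(command):
    """Run an external program and fail the task if it exits with a non-zero code."""
    result = subprocess.run(command, shell=True)
    if result.returncode != 0:
        raise RuntimeError(f"external task failed: {command}")


if __name__ == "__main__":
    conn = cx_Oracle.connect("dwh_user", "dwh_password", "dwhost/ORCLPDB")  # hypothetical credentials
    run_sql_file(conn, "post_load_aggregates.sql")              # hypothetical SQL file
    run_external_program("/etl/bin/refresh_currency_rates.sh")  # hypothetical external program
```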
DAC features
• Dynamic multi-threaded and load balanced ETL
• Intelligent Task Queue Engine based on user-defined and computed scores (see the sketch after this list)
• ETL Application Aware Dynamic generation of Execution Plans
• Smart Index Management for ETL and Query Performance
• Restart at any point of Failure
• Automatic Full and Refresh Mode awareness
• Simple Handles for ETL from Multiple Sources
• Analysis and reporting tools for isolating ETL bottlenecks
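As a rough illustration of the score-driven, multi-threaded dispatching listed above, the following hypothetical Python sketch sorts tasks by score and hands them to a fixed worker pool; the task names, scores and structure are invented and do not reflect DAC internals.

```python
from concurrent.futures import ThreadPoolExecutor


def run_task(task):
    # In DAC this step would launch an Informatica workflow, a SQL file or an external program.
    print(f"running {task['name']} (score {task['score']})")


def dispatch(tasks, workers=4):
    """Submit the highest-scored tasks first, running up to `workers` of them in parallel."""
    ordered = sorted(tasks, key=lambda t: t["score"], reverse=True)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for task in ordered:
            pool.submit(run_task, task)


dispatch([
    {"name": "SDE_ORA_GLJournals", "score": 90},   # illustrative task names and scores
    {"name": "SIL_GLRevenueFact",  "score": 70},
    {"name": "SDE_ORA_Employees",  "score": 40},
])
```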
Value Proposition of DAC
In my view, DAC offers a lot for a BI Applications implementation. Its value proposition is as follows.
1. It helps to manage pre-built content such as BI Apps. Functionality such as Subject Area assembly and Execution Plan design uses the DAC metadata to work out which tasks to run and how, by automatically calculating dependencies (see the sketch after this list). This in turn makes it easier to include customizations.
2. Pure-play ETL tools such as Informatica specialize in loading one table at a time, whereas DAC specializes in loading a set of related tables.
3. Dynamic and Automatic Detection of Relevant Tasks and Dependencies
o Simplifies the complexity of the myriad combinations of Source Systems and Subject Areas
o Reduces the domain knowledge required to deploy BI Apps
o Does not need a pre-configured process flow
o Merges custom tasks seamlessly into an existing flow
o Reduces TCO
4. DAC execution focuses on efficiently orchestrating many individual operations across ETL tools and databases, such as running PL/SQL blocks or creating materialized views.
5. It helps with efficient index and table-analysis management, with fine-grained control over how each of these is done. It is capable of creating indexes in parallel.
6. It provides runtime analysis to find ETL bottlenecks and fine-tune them.
7. It is incrementally aware and so can manage both full and incremental loads, which helps when incrementally rolling in subject areas and adaptors (also covered in the sketch after this list). For example, if one were to implement the Financials subject area for the initial rollout and subsequently roll out another subject area such as HR, DAC can kick off an incremental run for all the Finance-related tables and the common dimension tables, while performing a full load for those unique to HR.
8. Capable of orchestrating multi-source ETLs.
o Manage ETL from several heterogeneous or distributed sources
o Combine Extracts from Multiple Sources and Load into a common Target
o Automatically handles the challenges of shared truncates, index management and table analysis
o Easy rearrangement of extraction sequence
9. Helps manage parameters, both runtime and static.
10. Easy customization, plus identification and isolation of customizations.
11. DAC metadata helps answer many commonly asked questions, such as which source and target tables are touched by an ETL run or a subject area.
12. Runtime monitoring maintains snapshots of the runs that are easy to understand and make it easy to go back and do post-mortem analysis.
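The sketch below illustrates points 1, 3 and 7 above in hypothetical Python: task-level dependencies are derived from source/target table metadata, and each task's load type is decided from refresh dates, in the spirit of how DAC assembles an execution plan. The task names, table names and metadata shape are invented for illustration.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical task metadata: the tables each task reads and writes.
tasks = {
    "SDE_ORA_GLJournals": {"sources": {"GL_JE_LINES"},   "targets": {"W_GL_OTHER_FS"}},
    "SIL_GLOtherFact":    {"sources": {"W_GL_OTHER_FS"}, "targets": {"W_GL_OTHER_F"}},
    "SIL_DayDimension":   {"sources": set(),             "targets": {"W_DAY_D"}},
}
# Hypothetical refresh dates; None means the table has never been loaded.
refresh_dates = {"W_GL_OTHER_FS": "2024-01-31", "W_GL_OTHER_F": "2024-01-31", "W_DAY_D": None}

# A task depends on whichever task loads one of its source tables.
loaded_by = {tbl: name for name, t in tasks.items() for tbl in t["targets"]}
graph = {name: {loaded_by[s] for s in t["sources"] if s in loaded_by}
         for name, t in tasks.items()}

# Dependency-ordered plan with a full/incremental decision per task.
for name in TopologicalSorter(graph).static_order():
    mode = ("FULL" if any(refresh_dates.get(t) is None for t in tasks[name]["targets"])
            else "INCREMENTAL")
    print(f"{name}: {mode}")
```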
DAC’s Role
The following pie chart (source: Gartner) shows DAC's role in a BI Applications implementation project. In short, it is more than a dependency tool.
Conclusion
Data Warehouse Administration Console (DAC) works together with Informatica to accomplish the ETL for the pre-packaged BI Applications.
Here is what happens behind the scenes:
· DAC publishes the changes from the OLTP system (application change capture)
· Informatica extracts the changes from the change log published by DAC as well as from the base tables
· Informatica loads the data into the stage tables and then the target tables in the data warehouse
· DAC manages performance by dropping indexes, truncating stage tables, rebuilding the indexes and analyzing the tables during the process (as sketched below)
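A hypothetical sketch of that last step: drop the indexes and truncate the stage table before the Informatica workflow runs, then rebuild the indexes and gather statistics afterwards. The table names, index definitions and helper functions are illustrative only; DAC drives the real thing from its metadata.

```python
def execute(cursor, sql):
    print(sql)          # in a real run this would be cursor.execute(sql)


def load_with_housekeeping(cursor, stage_table, target_table, indexes, run_workflow):
    """Wrap an Informatica load with the index/truncate/statistics housekeeping DAC performs."""
    for idx_name in indexes:
        execute(cursor, f"DROP INDEX {idx_name}")
    execute(cursor, f"TRUNCATE TABLE {stage_table}")

    run_workflow()      # Informatica extracts and loads the data here

    for definition in indexes.values():
        execute(cursor, definition)       # recreate each index after the load
    execute(cursor,
            f"BEGIN DBMS_STATS.GATHER_TABLE_STATS(USER, '{target_table}'); END;")


load_with_housekeeping(
    cursor=None,                          # a real database cursor in practice
    stage_table="W_GL_OTHER_FS",          # illustrative table and index names
    target_table="W_GL_OTHER_F",
    indexes={"W_GL_OTHER_F_IDX1":
             "CREATE INDEX W_GL_OTHER_F_IDX1 ON W_GL_OTHER_F (LEDGER_WID)"},
    run_workflow=lambda: print("-- run Informatica workflow SIL_GLOtherFact --"),
)
```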
If you do not use DAC, you have to write your own custom change-capture process and redesign from scratch an ETL method that allows restarting the ETL process from the point of failure at the record level. The biggest saving is that DAC can survive an upgrade, while your custom processes cannot.