AdvancedEPM Announces Arrival of Proprietary Financial Data Bridge for OLAP Technology (FDBOT) Software Application
Datavail | March 29, 2016
The software extracts data and metadata from an Oracle Hyperion Financial Management (HFM) relational database schema and transforms it for consumption by Oracle Essbase Studio. HFM, an application owned, developed, and supported by Oracle Corporation, produces consolidated financial statement data by translating multi-currency data into one or more reporting currencies, eliminating intercompany activity, and applying custom calculations defined within the application interface. The HFM product uses a relational database repository to store the data and metadata exposed to the user through the software's own interface and through ancillary reporting and analysis tools. End users of HFM rarely interact directly with this relational repository; that interaction is handled by the software's application layer.
The FDBOT software extracts data and metadata directly from this relational database repository, bypassing the HFM software’s application layer entirely; the FDBOT software in no way utilizes or depends upon the HFM application software. The HFM application software utilizes a multidimensional data model, including a core set of required, pre-defined dimensions and the optional ability for consumers of the software to define one or more additional dimensions. In addition to extracting the data and metadata described above, the FDBOT software may apply one or more transformations to the data, depending upon the relevant configuration options the user has chosen. After the relevant transformations have been applied to the data records, the resulting data set is written to a table (also known as a fact table) in a relational database repository created by the user. The set of columns (or fields) included in this table will depend upon the user’s configuration choices within the FDBOT software as well as the dimensional data model that was configured in the HFM application software.
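To make the extract-transform-load flow concrete, here is a minimal sketch of the fact-table step using an in-memory SQLite database. All table and column names are invented for illustration; real HFM schemas and FDBOT configuration options differ.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table standing in for an HFM relational data table (names are hypothetical).
cur.execute("""CREATE TABLE hfm_data (
    entity TEXT, account TEXT, period TEXT, value REAL)""")
cur.executemany("INSERT INTO hfm_data VALUES (?, ?, ?, ?)", [
    ("E100", "Sales",   "Jan", 1000.0),
    ("E100", "Sales",   "Feb", 1100.0),
    ("E200", "Expense", "Jan", -400.0),
])

# Fact table whose columns mirror the configured dimensions plus the measure.
cur.execute("""CREATE TABLE fact_fdbot (
    entity TEXT, account TEXT, period TEXT, amount REAL)""")

# Example of a user-configurable transformation: flip the sign of
# expense accounts before writing the fact records.
cur.execute("""
    INSERT INTO fact_fdbot
    SELECT entity, account, period,
           CASE WHEN account = 'Expense' THEN -value ELSE value END
    FROM hfm_data""")
conn.commit()

rows = cur.execute(
    "SELECT * FROM fact_fdbot ORDER BY entity, period").fetchall()
```

In the real product, the set of fact-table columns is driven by the user's configuration and the dimensional model defined in HFM; the fixed columns above are purely for the example.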
The records extracted from the metadata tables undergo a series of transformations to reformat the metadata so that it conforms to the requirements of an Oracle Essbase metadata model. The transformations include the derivation of certain Essbase metadata property values, such as consolidation operator and data storage. After the appropriate transformations have been applied, the metadata records for each dimension are written to a separate table, so that a separate table exists for each dimension. Together with the fact table that was created from the extracted data, the resulting set of tables comprises a star schema data model.
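The metadata transformation can be sketched in the same spirit: derive Essbase-style properties such as consolidation operator and data storage from each extracted member record, and group the results into one table per dimension. The field names and derivation rules below are assumptions for illustration, not the product's actual logic.

```python
# Hypothetical HFM member records; field names are invented.
hfm_members = [
    {"dimension": "Account", "member": "Sales",   "is_calculated": False, "agg_weight": 1},
    {"dimension": "Account", "member": "Expense", "is_calculated": False, "agg_weight": -1},
    {"dimension": "Entity",  "member": "E100",    "is_calculated": True,  "agg_weight": 1},
]

def to_essbase_row(m):
    """Derive Essbase-style metadata properties from an HFM member record."""
    return {
        "member": m["member"],
        # Consolidation operator derived from the member's aggregation weight.
        "consolidation": "+" if m["agg_weight"] >= 0 else "-",
        # Calculated members become Dynamic Calc; all others store data.
        "data_storage": "Dynamic Calc" if m["is_calculated"] else "Store Data",
    }

# One table (here, a list of rows) per dimension, as in the star schema.
dim_tables = {}
for m in hfm_members:
    dim_tables.setdefault(m["dimension"], []).append(to_essbase_row(m))
```

Each per-dimension table produced this way joins to the fact table on the member column, which is what makes the combined result a star schema.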
After the FDBOT software has created the star schema data model, the software issues the necessary commands to Oracle Essbase Studio to build and deploy an Essbase database (also known as a cube). The Essbase Studio application software consumes the fact and dimension tables from the star schema data model to build the cube structure (i.e., the outline) and load data. The cube is loaded with aggregated data values, which are derived by aggregating the fact table records for each unique combination of metadata members. Once Essbase Studio has completed the cube build and data load operations, the FDBOT software issues a command to the Essbase application server to initiate an aggregation operation; the aggregation process calculates values for the upper-level members of each dimension (subject to certain exceptions based upon the metadata configuration). Upon completion of the aggregation process, the Essbase database is ready for consumption by end users via a variety of reporting and analysis tools.
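The "aggregating the fact table records for each unique combination of metadata members" step is essentially a group-by-and-sum over the dimension columns. A minimal sketch, with invented sample rows:

```python
from collections import defaultdict

# Hypothetical fact rows: (entity, account, period, amount).
fact_rows = [
    ("E100", "Sales", "Jan", 600.0),
    ("E100", "Sales", "Jan", 400.0),   # same intersection: summed on load
    ("E100", "Sales", "Feb", 1100.0),
]

# Aggregate the measure for each unique member combination; each key
# corresponds to one cell loaded into the cube.
cells = defaultdict(float)
for entity, account, period, amount in fact_rows:
    cells[(entity, account, period)] += amount
```

The subsequent Essbase aggregation operation then rolls these level-0 cells up to the upper-level members of each dimension, which is handled by the Essbase server rather than by FDBOT.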