Implementation Steps for Oracle Communications Data Model

Oracle Communications Data Model follows the traditional implementation steps of a typical data warehouse or BI project, with some exceptions in the Analysis and Development phases.

The exception areas are shown in bold:

  • Analysis

    • Business Requirement Study

    • Gap Analysis and Identification of

      • Data Model Enhancements

      • BI Reporting / Analytics Enhancements

    • Source Systems Understanding

    • Source to Target Data Element Mapping
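
The source-to-target data element mapping deliverable can be captured in a machine-checkable form, so that target columns with no mapped source surface directly as gaps for the Gap Analysis. A minimal sketch in Python; the source columns, target names, and transform labels below are illustrative placeholders, not actual Oracle Communications Data Model identifiers:

```python
# Hypothetical source-to-target mapping entries. Each entry records where a
# warehouse column comes from and which transformation applies on the way in.
SOURCE_TO_TARGET = [
    {"source": "BILLING.CDR.CALL_START", "target": "CALL_EVT.EVT_STRT_DT", "transform": "to_date"},
    {"source": "BILLING.CDR.DURATION_S", "target": "CALL_EVT.CALL_DRTN",   "transform": "none"},
    {"source": "CRM.CUSTOMER.CUST_ID",   "target": "CUST.CUST_KEY",        "transform": "surrogate_key"},
]

def unmapped_targets(mapping, required_targets):
    """Return the required target columns that have no source mapping yet.

    These are the gaps the analysis phase must resolve (new feeds,
    data model enhancements, or defaults)."""
    mapped = {entry["target"] for entry in mapping}
    return sorted(t for t in required_targets if t not in mapped)
```

Keeping the mapping as data (rather than prose in a spreadsheet) lets the same artifact drive both the gap report and, later, ETL code generation.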

Deliverables include:

  • Summary of high-level business strategy focusing on enabling information needs

  • Conceptual view of the current analytical environment pointing out gaps, deficiencies, problem areas, and so on.

  • Identification of business needs that cannot be answered today.

  • A step-by-step implementation/migration path to the new functional and technical architecture that will support business objectives.

  • Prioritized implementation/migration steps based on business strategy and the potential ROI of each step.

  • An estimated cost to build/migrate to the new architecture.


  • ETL Architecture Definition & ETL Tool Selection

  • ETL Design

    • Mapping / Interface Design

    • Process Flow Design

    • Scheduling / Automation Criteria Identification

  • Logical Data Model Enhancement (as per Gap Analysis Findings)

  • Report Design Enhancement (as per Gap Analysis Findings)

  • Physical Database Design Enhancement / Customization based on LDM Enhancements
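
Process flow design amounts to declaring the dependencies between mappings and letting the scheduler derive a valid run order. A minimal sketch using Python's standard-library `graphlib`; the mapping names are hypothetical, not shipped intra-ETL names:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL process flow: each mapping lists the mappings it
# depends on. Dimensions load before facts, facts before derived layers.
process_flow = {
    "load_customer_dim":       [],
    "load_call_events":        ["load_customer_dim"],
    "derive_usage_aggregates": ["load_call_events"],
    "refresh_olap_cubes":      ["derive_usage_aggregates"],
}

# Any order consistent with the dependencies is a valid schedule; a real
# scheduler would also run independent branches in parallel.
run_order = list(TopologicalSorter(process_flow).static_order())
```

The same dependency declaration feeds the scheduling/automation criteria: a nightly trigger only needs to start the roots of the graph.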


  • ETL Development (Scripting, Mappings, Process Flows, and so on) and Unit Testing

  • Oracle Communications Data Model-based BI Solution Customization / Enhancement (based on Gap Analysis findings):

    • Pre-packaged ETL Scripts

    • BI Reports - Relational

    • OLAP Cubes & Reports

    • Mining Models & Reports

  • System & Integration Testing

  • Documentation & Training (User Documentation & User Training)

  • Acceptance Testing

  • Deployment

    • Deployment in Production Environment

    • Production Data Load into the Data Warehouse

      • Initial / History Data Load

      • Incremental Data Load

  • Implementation / Go Live

  • Maintenance Support
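
The two production load modes differ mainly in scope: the initial/history load materializes the full history once, while the incremental load upserts only the delta since the last run. A minimal in-memory sketch, where dictionaries stand in for warehouse tables (a real load would use set-based SQL such as MERGE):

```python
def initial_load(source_rows):
    """Initial / history load: bring over every row, keyed by its natural key."""
    return {row["key"]: row for row in source_rows}

def incremental_load(warehouse, changed_rows):
    """Incremental load: upsert only the rows changed since the last run."""
    for row in changed_rows:
        warehouse[row["key"]] = row  # insert if new, overwrite if changed
    return warehouse
```

The load strategy chosen here also drives the DBA tuning discussed below: a large history load favors partition-wise bulk inserts, while incremental loads favor indexed merge paths.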

A typical project takes four to six months for the first phase and may be quicker for subsequent phases of the implementation. Note that going live in three months is possible in specific cases:

  • When the implementers are experts in Telecommunications and BI & Data Warehousing and know Oracle Communications Data Model

  • When the sources to map are limited in number, well known, and free of significant data quality, duplication, or correspondence issues

  • When one focuses on delivering Oracle Communications Data Model vanilla (out of the box) with OBIEE as the BI tool, before building anything on top

Additional project accelerators are the use of pre-built adapters and analytics packages.

Resources and Skills Required

The minimum expertise required for an Oracle Communications Data Model implementation is:

  • A Data Modeler who knows telecommunications processes and Oracle Communications Data Model (or the TeleManagement Shared Information/Data model as an alternative), and who can use SQL Data Modeler and related tools

  • At least one BI reporting tool expert to configure the tool (especially the administration of roles and rights, and the review of the business model to be presented) and customize it for the end users.

  • At least two ETL experts in the chosen ETL tools, to map from the sources to Oracle Communications Data Model, with good PL/SQL knowledge to adapt the intra-ETLs or develop new ones on request (with the help of the data modeler).

  • A Project Manager

  • From the customer or as part of the implementation team, a good DBA (part time only) to configure and adapt the database (partitions, parallelism, caching, roles and rights, code optimization, and so on) to enhance the performance of the default Oracle Communications Data Model configuration. Although Oracle Communications Data Model is pre-optimized, it must be re-optimized depending on the data volume of each table (especially the base tables) and the load strategy chosen (see below).