42 Pipeline Manager Utilities

This chapter provides reference information for Oracle Communications Billing and Revenue Management (BRM) Pipeline Manager utilities.

Database Loader

The Database Loader utility loads aggregation data into a database and unloads it from the database.

For information about aggregation, see "Setting Up Pipeline Aggregation".

Dependencies

This utility requires a connection through the DBC database module and the DBL library (libDBLXXX.so). See "Database Connect (DBC)".

Location

Pipeline_Home/tools

where Pipeline_Home is the directory in which you installed Pipeline Manager.

Syntax

dbLoader -r registry [-f files] [-u]

Parameters

-r

Defines the registry file.

-f

Defines the file pattern (regular expression).

-u

Undo mode.

Registry Entries

Table 42-1 lists the Database Loader registry entries.

Table 42-1 Database Loader Registry Entries

Entry Description Mandatory

BULKSIZE

Specifies the Oracle array size for bulk inserts (loadmode 2 and 3).

Yes

DIRECTIONMODE

Defines the selection order of the control files (1: file name, 2: sequence).

Yes

FILES.ARCHIVE.PATH

Specifies the path where the successfully loaded files are stored.

Yes

FILES.ARCHIVE.SUFFIX

Specifies the suffix of the successfully loaded data files.

Yes

FILES.BAD.PATH

Specifies the path where the bad files are stored.

Yes

FILES.BAD.SUFFIX

Specifies the suffix of the bad data files.

Yes

FILES.CONTROL.PATH

Specifies the path for the input aggregate control files.

Yes

FILES.CONTROL.SUFFIX

Specifies the suffix of the input aggregate control files.

Yes

FILES.DATA.PATH

Specifies the path for the input aggregate data files.

Yes

FILES.DATA.SUFFIX

Specifies the suffix of the input aggregate data files.

Yes

FILES.MERGE.PATH

Specifies the path where the source merge data files are stored.

Yes

FILES.MERGE.SUFFIX

Specifies the suffix of the source data files before merging/sorting.

Yes

FILES.REJECT.PATH

Specifies the path where the rejected files are stored.

Yes

FILES.REJECT.SUFFIX

Specifies the suffix of the rejected data files.

Yes

LOADMODE

Specifies how to load data:

  • 1: Single row updates and inserts.

  • 2: Single row updates and bulk inserts.

  • 3: Single row updates and bulk inserts; before loading, the files can be merged, sorted, and split into smaller pieces.

In undo mode, the load mode is always 1.

Yes

MAXSPLITLINES

Specifies the maximum number of lines per data file after splitting (loadmode 3).

Yes

ROLLBACKSEGMENT

Specifies which Oracle rollback segment to use when loading the database. How to set this entry depends on your database software setup.

The Oracle9i software provides an automatic undo management feature, which creates undo tablespaces rather than rollback segments for undo information. If your database uses automatic undo management and you specify a rollback segment for the Database Loader utility, the utility fails when it attempts to load the database. To prevent this problem, comment out or remove this registry entry. If your database does not use automatic undo management, specify a rollback segment.

No

SORTCMD

Specifies the external sort command (loadmode 3).

Yes

SORTING

Specifies whether files of identical structure should be merged and sorted (loadmode 3).

Yes

SORTMAXFILESIZE

Specifies the maximum destination size of the merged and sorted files (loadmode 3).

Yes

SORTTMPDIR

Specifies the path where sort stores temporary files (loadmode 3).

Yes

SPLITTING

Specifies whether to split data files before loading (reduce transaction size) (loadmode 3).

Yes
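
The merge, sort, and split behavior that the SORTING, SORTCMD, SPLITTING, and MAXSPLITLINES entries control for loadmode 3 can be approximated with standard tools. The sketch below is an illustration only, not the utility itself; the file names, the semicolon field layout, and the 2-line split size are assumptions invented for the demo.

```shell
# Illustration of the loadmode 3 preprocessing steps using coreutils.
# All file names and contents are invented for this demo.
mkdir -p /tmp/dbl_demo
cd /tmp/dbl_demo
printf 'b;2\nd;4\n' > part1.dat     # two input data files with
printf 'a;1\nc;3\n' > part2.dat     # identical structure

# SORTING = true: merge and sort files of identical structure
# (the utility invokes the command named by SORTCMD).
sort part1.dat part2.dat > merged.dat

# SPLITTING = true: split the merged file into pieces of at most
# MAXSPLITLINES lines (2 here) to reduce transaction size.
split -l 2 merged.dat chunk_

head -n 1 chunk_aa                  # first line of the first chunk
```

In the real utility these steps run before loading, and the chunked files bound the size of each database transaction.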


Sample Registry

DBLOADER 
  { 
    Active              = TRUE
    ProcessLoopTimeout  = 10
    QueueRequestTimeout = 0
    Instrumentation
    {
      #-----------------------------------------------------------
      # ProbeBroker registry entries.
      # ProbeInfoFilePath - The path that contains all probe
      # info files used by instrumented objects.
      #-----------------------------------------------------------
      ProbeBroker
      {
        ProbeInfoFilePath = ./instrumentation
      }
    }
    LogMessageTable
    {
      MessageFilePath   = ./etc
      MessageFileSuffix = .msg
    }
    DiagnosticDataHandler
    {
      DiagnosticFilePath = ./log
      DiagnosticFileName = diagnostic.dat
    }
    #
    # main parameter
    #
    DIRECTIONMODE   = 2
    LOADMODE        = 2
    BULKSIZE        = 100
    ROLLBACKSEGMENT = R04
    SORTING         = true
    SORTCMD         = sort
    SORTTMPDIR      = .
    SORTMAXFILESIZE = 2000000000
    SPLITTING       = true
    MAXSPLITLINES   = 40000
    #
    # database section
    #
    DataPool
    {
      Database
      {
        ModuleName = DBC
        Module
        {
          DatabaseName = $ORACLE_SID
          UserName     = AGGREGATOR
          PassWord     = 595EA7DFC8C6C3D8A1AFDADC0600180F12771D73        
          AccessLib    = oci11g72
          Connections  = 1
        }
      }
    }
    #
    # File Section
    #
    FILES
    {
      CONTROL
      {
        PATH    = ./data/aggregate/cntl
        SUFFIX  = .ctl
      }
      DATA
      {
        PATH    = ./data/aggregate/done
        SUFFIX  = .dat
      }
      REJECT
      {
        PATH      = ./data/aggregate/reject
        SUFFIX    = .rej
        THRESHOLD = 85
      }
      REJECT_HANDLE
      {
        PATH      = ./data/aggregate/reject
        SUFFIX    = .rej
        THRESHOLD = 85
      }
      ARCHIVE
      {
        PATH    = ./data/aggregate/archive
        SUFFIX  = .arc
      }
      BAD
      {
        PATH    = ./data/aggregate/bad
        SUFFIX  = .bad
      }
      MERGE
      {
        PATH    = ./data/aggregate/merge
        SUFFIX  = .mrg
      }
    }
    #
    # log section
    #
    ProcessLog
    {
      ModuleName = LOG
      Module
      {
        ITO
        {
          MessageFilePath = etc
          MessageFilePrefix = error
          MessageFileSuffix = error.msg
          FilePath = ./data/aggregate/log
          FileName = process
          FilePrefix = DBL_
          FileSuffix = .log
          ProcessName = dbLoader
          MessageGroup = DBLOADER
        }
        Buffer
        {
          Size = 1000
        }
      }
    }
  }

db2irules.pl

Use the db2irules.pl script to extract rule sets from the Pipeline Manager database to the Rule Set XML file.

See "Importing and Exporting Validation Rules" in BRM Developer's Guide.

Important:

This utility uses DBI and DBD drivers, which are not part of the Pipeline Manager installation. Download these drivers from http://www.cpan.org, and then compile and install them separately.

Location

Pipeline_Home/tools/IRules2Db/db2irules.pl

Important:

There are dependencies between the db2irules.pl script and the PerlParser.pm XML library, which is located in the same directory as the script. Always run the script from this location.

Syntax

db2irules.pl [-d] [-u] dbi:dcs password user_name file_path rule_set_id

Parameters

If you start the db2irules.pl script without any parameters, a usage description and an example for each parameter are displayed.

dcs

The database connection string. This required parameter enables the script to access the database. The string is different for each database type. Example dcs for Oracle:

Oracle:orcl

Note:

The dbi prefix in the connection string refers to DBI, the standard database access module for Perl scripts. It defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

password

This parameter is required to connect to the database. It is your standard Pipeline Manager database password.

user_name

This parameter is required to connect to the database. It is your standard user name for the Pipeline Manager database.

file_path

Use this parameter to specify where you want to export the rule set. If you want to use the same directory in which the rule set is stored, use ./ as file path. If you don't set this parameter, the rule set is exported automatically to the current directory.

rule_set_id

Use this parameter to extract only one specific rule set, which is identified by its unique ID. If you don't set this parameter, the db2irules.pl script will extract all rule sets from the database. If you use this parameter, you must use the file_path parameter. This rule_set_id refers to the IFW_RULESET.RULESET database field.

-u

This parameter creates a unique file name for the rule set, based on the date and time. It uses the following format: RULESET_yyyy-mm-dd_hh-mm-ss.xml. Use this parameter to ensure that you do not overwrite an existing XML file when extracting rule sets. If the file name for a rule set contains spaces, replace them with the underscore character (_).

Example:

db2irules.pl -u dbi:Oracle:orcl scott tiger TAP3_VAL

-d

This parameter deletes the specified rule sets from the database after they are extracted. If you use this parameter, a transaction is opened with the database. If any of the rule set deletions fails, the entire delete sequence is rolled back to preserve database integrity. If all rule set tables are deleted successfully, the transaction is committed to the database.

Example:

db2irules.pl -d -u dbi:Oracle:orcl scott tiger
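
The effect of the -u option on file naming can be illustrated with date(1). This is a sketch only: a fixed epoch is formatted so that the output is predictable, and TAP3_VAL is an invented rule set name; the script itself derives the timestamp from the extraction time, so real file names differ on every run.

```shell
# With -u, the output file name is based on the date and time:
# RULESET_yyyy-mm-dd_hh-mm-ss.xml. Formatting a fixed epoch (0)
# shows the shape of the name; TAP3_VAL is a made-up rule set name.
date -u -d @0 +'TAP3_VAL_%Y-%m-%d_%H-%M-%S.xml'
```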

Diagnostic Data Handler

Use the Diagnostic Data Handler to get data about Pipeline Manager after a crash, an exception, or a critical error, or while Pipeline Manager is running.

For more information, see "Using The Diagnostic Data Handler To Get OMF Diagnostic Data" in BRM System Administrator's Guide.

Registry Entries

Table 42-2 lists the Diagnostic Data Handler registry entries.

Table 42-2 Diagnostic Data Handler Registry Entries

Entry Description Mandatory

DiagnosticFilePath

Path to the log file that is created by Diagnostic Data Handler.

Yes

DiagnosticFileName

File name of the log file that is created by Diagnostic Data Handler.

Yes


Sample Registry

DiagnosticDataHandler
{
   DiagnosticFilePath = ./log
   DiagnosticFileName = diagnostic.dat
}

irules2db.pl

Use the irules2db.pl script to insert a rule set from a Rule Set XML file into the Pipeline Manager database.

See "Importing and Exporting Validation Rules" in BRM Developer's Guide.

Important:

This utility uses DBI and DBD drivers, which are not part of the Pipeline Manager installation. Download these drivers from http://www.cpan.org, and then compile and install them separately.

Location

Pipeline_Home/tools/IRules2Db/irules2db.pl

Important:

There are dependencies between the irules2db.pl script and the PerlParser.pm XML library, which is located in the same directory as the script. Always run the script from this location.

Syntax

irules2db.pl [-f] dbi:dcs password user_name rule_set_name backup_file_path

Parameters

If you start the irules2db.pl script without any parameters, a usage description and an example for each parameter are displayed.

dcs

The database connection string. This required parameter enables the script to access the database. The string is different for each database type. Example dcs for Oracle:

Oracle:orcl

Note:

The dbi prefix in the connection string refers to DBI, the standard database access module for Perl scripts. It defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

password

This parameter is required to connect to the database. It is your standard Pipeline Manager database password.

user_name

This parameter is required to connect to the database. It is your standard user name for the Pipeline Manager database.

rule_set_name

Use this parameter to specify the name of the Rule Set XML file that you want to import to the database. This parameter supports fully qualified and relative path names.

Examples:

  • ./tap3_val.xml

  • /home/data/tap3_val.xml

  • /../files/tap3_val.xml

  • tap3_val.xml

backup_file_path

Use this parameter to specify the path where the extracted rule set is backed up before it is deleted from the database and the modified rule set from the Rule Set XML file is inserted in its place. Use this parameter with the -f parameter.

-f

This parameter forces the rule set into the database. The irules2db.pl script connects to the database and starts parsing the Rule Set XML file. When it finds the name of the rule set, it calls the db2irules.pl export script with the -u and -d parameters. If the db2irules.pl script finishes successfully, the irules2db.pl script continues parsing the XML file and imports the rule set into the database. If any of the rule set columns fails to be inserted, the irules2db.pl script rolls back the transaction and exits. If all columns are inserted into the database successfully, the transaction is committed.

LoadIfwConfig

Use this utility to extract data from or load data into the Pipeline Manager database. This enables you to:

  • Migrate data from a legacy database to the Pipeline Manager database. See "Migrating Price List Data From Legacy Databases" in BRM Setting Up Pricing and Rating.

  • Transfer data between Pipeline Manager databases; for example, from a test database to a production database. See "Transferring Data Between Pipeline Manager Databases".

    Caution:

    The 7.4 version of the LoadIfwConfig utility is not backwards-compatible with previous versions of the utility. Any data exported by a previous version of the utility must also be loaded with that same version. In addition, any custom scripts or procedures that are dependent on the utility's functionality might need to be modified to work with the 7.4 version.

The LoadIfwConfig utility can run in these modes:

  • Non-interactive mode: You use commands that batch several related parts of the extracting or loading process. You must enter a full command, including the utility name, for each set of actions.

  • Interactive mode: You issue a command for each step in the process of extracting or loading. After you enter interactive mode, the prompt changes to an angle bracket and commands are single words for performing particular actions. You can view a list of the change sets that will be extracted or loaded.

Location

Pipeline_Home/bin

Syntax: Non-Interactive Mode

LoadIfwConfig   {-rall [-t Modifidate] | -r [-t Modifidate] | -p [f] | -u | -I} 
                [-c] [-nodep] -i InputFile [-o OutputFile] [-h] [-v]

Parameters: Non-Interactive Mode

-rall [-t Modifidate]

Extracts all objects from the Pipeline Manager database. This parameter does not require an input XML file.

Using -t Modifidate retrieves only pricing objects that were modified after the specified timestamp. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

-r [-t Modifidate]

Extracts from the database the objects listed in InputFile.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

-p [f]

Deletes objects from the database.

Using the f parameter turns off the delete confirmation.

-u

Updates the Pipeline Manager database. Data is not actually updated in the database until it is committed with the -c parameter.

-I

Inserts data into the Pipeline Manager database. Data is not actually inserted into the database until it is committed with the -c parameter.

-c

Commits the data to the database. You use this command in conjunction with the -u and -I parameters.

-nodep

Suppresses any object dependency relationships that you configured in the Pipeline_Home/tools/XmlLoader/CustomConfig.xml file. This allows the utility to extract from the database only those objects that meet your criteria and to ignore any dependent objects. For more information about object dependencies, see "About Specifying to Extract Child and Dependent Objects".

-i InputFile

When extracting pipeline data by using the -r or -rall parameter, this is the name of the XML file that specifies the list of objects to extract from the source Pipeline Manager database.

When loading pipeline data by using the -u or -I parameter, this is the name of the XML file that contains the data you are loading into the destination Pipeline Manager database.

When deleting pipeline data by using the -p parameter, this is the name of the XML file that specifies the list of objects to delete from the Pipeline Manager database.

-o OutputFile

Specifies the output file to which the Pipeline Manager data is extracted. By default, the utility writes the output to a file named default.out in the current directory.

-h

Displays help about using the utility.

-v

Displays information about successful or failed processing as the utility runs.
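
A -t Modifidate value in the required ISO-8601 form can be produced with date(1). This sketch only prints a sample command line rather than running the utility; the fixed epoch and the output file name all_objects.out are assumptions invented for the demo.

```shell
# Build an ISO-8601 timestamp (YYYY-MM-DDThh:mm:ss) for -t.
# A fixed epoch keeps the output predictable for this demo.
stamp="$(date -u -d @1000000000 +%Y-%m-%dT%H:%M:%S)"
echo "LoadIfwConfig -rall -t $stamp -o all_objects.out"
```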

Syntax: Interactive Mode

LoadIfwConfig  [read InputFile] [write OutputFile] [retrieve_all [-t Modifidate]] 
               [fetch [-t Modifidate]] [list] [delete] [commit] [update] [insert]  
               [help] [nodep] [verbose on|off] [quit]

Parameters: Interactive Mode

read InputFile

Specifies to read the specified input file into internal memory.

write OutputFile

Specifies the output file to which the Pipeline Manager data is extracted. By default, the utility writes the output to a file named default.out in the current directory.

retrieve_all [-t Modifidate]

Extracts all objects from the Pipeline Manager database.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

fetch [-t Modifidate]

Extracts from the database the objects listed in internal memory. You use this parameter after you use the read parameter.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

list

Lists the current pipeline data stored in internal memory.

delete

Deletes from the database the objects listed in InputFile.

commit

Commits the data to the database. You use this command in conjunction with the update and insert parameters.

update

Updates the Pipeline Manager database. Data is not actually updated in the database until it is committed with the commit parameter.

insert

Inserts data into the Pipeline Manager database. Data is not actually inserted into the database until it is committed with the commit parameter.

help

Displays help about using the utility.

nodep

Suppresses any object dependency relationships that you configured in the Pipeline_Home/tools/XmlLoader/CustomConfig.xml file. This allows the utility to extract only those objects that meet your criteria and to ignore any dependent objects. For more information about object dependencies, see "About Specifying to Extract Child and Dependent Objects".

verbose [on | off]

Sets verbose information:

  • verbose on displays the status of the command most recently executed.

    Use the ProcessLog section of the registry file to specify the name and location of the file where debug messages are written.

  • verbose off displays the status only if there is an error.

quit

Quits from the utility.

Results

If the LoadIfwConfig utility is successful, it displays a confirmation message. If unsuccessful, it displays errors.

Memory Monitor

Use the Memory Monitor module to warn you when available system memory is low and to shut down Pipeline Manager when memory reaches a specified threshold.

For more information, see "Monitoring Pipeline Manager Memory Usage" in BRM System Administrator's Guide.

Registry Entries

Table 42-3 lists the Memory Monitor registry entries.

Table 42-3 Memory Monitor Registry Entries

Entry Description Mandatory

ScaleUnit

Specifies the unit for monitoring memory.

  • P specifies percentage.

  • K specifies kilobytes.

  • M specifies megabytes.

Yes

ShutdownFreeMemLimit

Specifies the amount or percentage of remaining system memory that triggers Pipeline Manager to gracefully shut down.

Note: For percentage, you must enter a value from 1 to 99 inclusive.

Yes

WarningFreeMemLimit

Specifies the amount or percentage of remaining system memory that triggers Pipeline Manager to issue a warning to the user.

Note: For percentage, you must enter a value from 1 to 99 inclusive.

Yes
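
When ScaleUnit is P, the two limits are percentages of total system memory. The arithmetic below is a sketch that uses an assumed total of 8,000,000 KB together with the limit values 10 and 5 to show what the thresholds mean in absolute terms.

```shell
# With ScaleUnit = P, WarningFreeMemLimit = 10 and
# ShutdownFreeMemLimit = 5 mean 10% and 5% of total memory.
# The total (8,000,000 KB) is assumed for this demo.
total_kb=8000000
warn_kb=$(( total_kb * 10 / 100 ))
shutdown_kb=$(( total_kb * 5 / 100 ))
echo "warn below ${warn_kb} KB, shut down below ${shutdown_kb} KB"
```

With ScaleUnit set to K or M, the same two entries are read directly as absolute kilobyte or megabyte amounts instead.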


Sample Registry

ifw
{
    MemoryMonitor
    {
        ScaleUnit = P
        WarningFreeMemLimit = 10
        ShutdownFreeMemLimit = 5
    }
}

pin_container_to_stream_format

Use this utility to create EDR stream, input and output mapping, and input and output grammar files from an EDR container description file. FCT_CallAssembling then uses these files in the process of converting partially assembled call records to a new container description.

For more information on the process of converting EDRs to a new EDR container description, see "Upgrading Incomplete Calls to the New Container Description" in BRM System Administrator's Guide.

Location

BRM_Home/bin

where BRM_Home is the directory in which you installed BRM components.

Syntax

pin_container_to_stream_format -c container_description_filename -g grammar_file_prefix -m mapping_file_prefix -s stream_file_prefix | -h 

Parameters

-c container_description_filename

Specifies the container description file to use to generate a stream file and the mapping and grammar files. Replace container_description_filename with the container description file to use.

-g grammar_file_prefix

Creates the input and output grammar description files based on the container description file. Replace grammar_file_prefix with a prefix to add to the grammar filenames.

-m mapping_file_prefix

Creates the input and output mapping description files based on the container description file. Replace mapping_file_prefix with a prefix to add to the mapping filenames.

-s stream_file_prefix

Creates the stream description file based on the container description file. Replace stream_file_prefix with a prefix to add to the stream filename.

Important:

If you do not specify one or more of the -g, -m, or -s parameters, this utility generates the corresponding files using the container description file name as a prefix. However, if you specify these options, you must also specify their arguments; otherwise, this utility returns an error.

-h

Displays help for this utility.

Example

This example:

pin_container_to_stream_format -c containerDesc.dsc -g OLD_ -m OLD_ -s OLD_ 

creates these files using the information in containerDesc.dsc:

  • OLD_Stream.dsc

  • OLD_InGrammar.dsc

  • OLD_OutGrammar.dsc

  • OLD_InMap.dsc

  • OLD_OutMap.dsc

Results

The pin_container_to_stream_format utility notifies you only if it encounters errors.

pin_recycle

Use this utility to search for failed EDRs in the BRM database and queue the EDRs for recycling or test recycling, or delete them.

This utility calls the Suspense Manager opcodes to perform the recycling. For more information, see "Suspense Manager FM standard opcodes" in BRM Developer's Reference.

Location

BRM_Home/bin

Syntax

pin_recycle [ -f CDR_file] [ -k recycle_key ] [ -d | -D| -r reason_code| -t ] 

Parameters

-f CDR_file

Queues all the failed EDRs that arrived in a single CDR file. Pipeline Manager rates these calls as soon as it can.

-k recycle_key

Searches for and queues EDRs for rating that contain:

  • The recycle_key, an application-specific string that is added to each EDR as it is suspended by Pipeline Manager. See "About Standard Recycling" for details.

  • A status of suspended.

These EDRs are queued for rating by Pipeline Manager as soon as possible.

-d

Searches for and deletes all EDRs with a status of succeeded or written off.

-D

Searches for and deletes all EDRs with a status of succeeded, written off, or suspended.

-r reason_code

Searches for and recycles all EDRs that have the specified reason code.

-t

Specifies a test recycle. In test mode, pin_recycle creates a report about the processing but does not make any changes to the database. Test results are written to the directory and file you specified by using the FCT_Suspense module RecycleLog registry entries. For standard recycling implementations, you must also set the FCT_Suspense LogTestResults registry entry.

Results

This utility logs messages to stdout.

The following message is returned after you use pin_recycle to recycle EDRs:

pin_recycle tool, number_of_EDRs EDRs Submitted for Recycling

The following message is returned after you use pin_recycle to test recycle EDRs:

pin_recycle tool, number_of_EDRs EDRs submitted for test recycling

The following message is returned after you use pin_recycle to delete EDRs:

pin_recycle tool, number_of_EDRs suspended EDRs deleted

purge_np_data.pl

Use this utility to purge records that are older than a specified date and time from the number portability data file. See "Purging and Reloading the Memory Records".

Location

Pipeline_Home/bin

Syntax

purge_np_data.pl NP_FileName TimeStamp [-b backup_filename] [-n] [-help]

Parameters

NP_FileName

Specifies the name of the number portability data file that will be purged.

TimeStamp

Specifies the date and time prior to which all the number portability records are purged. After the purge, the number portability data file is updated to remove the purged records.

Format: YYYYMMDDhhmmss.

-b backup_filename

Specifies the name of the backup file that will contain the unpurged number portability records.

-n

Sorts in the ascending order of the CLI. Default sorting is in the ascending order of the time stamp.

-help

Displays the syntax and parameters for this utility.
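
A TimeStamp argument in the required YYYYMMDDhhmmss format can be generated with date(1). This is a sketch only: a fixed epoch keeps the output predictable, and np_data.txt and np_data.bak are invented file names, not names the utility prescribes.

```shell
# Purge everything older than the epoch (1970-01-01 00:00:00 here),
# formatted as YYYYMMDDhhmmss. File names are made up for the demo;
# the echoed line shows the shape of a full invocation.
ts="$(date -u -d @0 +%Y%m%d%H%M%S)"
echo "purge_np_data.pl np_data.txt $ts -b np_data.bak"
```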

Results

The purge_np_data.pl utility notifies you when it successfully purges the number portability data file. Otherwise, it displays an error message.

RoamingConfigGen64

Use this utility to retrieve the roaming partner data from the Pipeline Manager database and create the roaming configuration data file. The data file is used by the Instances module to configure multiple instances of sequencers, output streams, or system brands based on the template sections or entries in the roaming registry file.

For more information, see "About Configuring Multiple Instances of Sequencers, Output Streams, or System Brands" in BRM System Administrator's Guide.

Location

Pipeline_Home/bin

Syntax

RoamingConfigGen64 -l database_access_library -s server_name [-d database_name] -c operator_code [-o output_path] [-b base_path] [-h]

Parameters

-l database_access_library

The database access library. For example, liboci10g6312d.a for Oracle on AIX.

-s server_name

Specifies the name of the host machine running the Pipeline Manager database.

-d database_name

Specifies the database name of the Pipeline Manager database. The default is an empty string (' ').

-c operator_code

Specifies the home network operator code. The default is PORTL.

-o output_path

Specifies the output path for the data file generated by the RoamingConfigGen64 utility. By default, the data file is saved in the Pipeline_Home/conf/ directory.

-b base_path

Specifies the base path to the directory for Transferred Account Procedure (TAP) and Near Real Time Roaming Data Exchange (NRTRDE) output files. The default path is Pipeline_Home/data/outcollect/.

For example, if the base path is Pipeline_Home/data/outcollect/, the following new subdirectories are created in the Pipeline_Home/data/outcollect/ directory:

  • tapout/ for TAP output files

  • nrtrdeout/ for NRTRDE output files
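
The directory layout created under the base path can be sketched with mkdir. The temporary path /tmp/roaming_demo is an assumption for the demo, standing in for Pipeline_Home/data/outcollect/; only the tapout/ and nrtrdeout/ subdirectory names come from the description above.

```shell
# Reproduce the layout the utility creates under the -b base path:
# tapout/ for TAP output files, nrtrdeout/ for NRTRDE output files.
base=/tmp/roaming_demo/outcollect
mkdir -p "$base/tapout" "$base/nrtrdeout"
ls "$base"
```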

-h

Displays the syntax and parameters for this utility.

Note:

When prompted, enter the database user name and password.

Example

RoamingConfigGen64 -l liboci10g6312d.so -s $ORACLE_SID -c EUR01

where:

  • liboci10g6312d.so is the database access library.

  • $ORACLE_SID is the database alias.

  • EUR01 is the home network operator code.

Results

If successful, the RoamingConfigGen64 utility creates the roaming configuration data file. Otherwise, it displays an error message.

settlement_extract

Use this utility to retrieve roaming settlement information from the IC-Daily tables in the Pipeline Manager database. When Pipeline Manager rates roaming usage, it stores the amounts owed each roaming partner in the IC-Daily tables.

Important:

To ensure only unbilled events are extracted, before running this utility, you must close the bill run for each roaming partner account. You close the bill run by using the Pricing Center. See "Closing a Billrun" in BRM Configuring Roaming in Pipeline Manager.

For more information about roaming and settlement, see "About Rating Roaming Events" in BRM Configuring Roaming in Pipeline Manager.

This utility creates one file containing all settlement information stored in the Pipeline Manager database that has not already been extracted. The settlement information includes the amounts owed to each network that was used for roaming calls.

Note:

To connect to the BRM database, the settlement_extract utility needs a configuration file in the directory from which you run the utility. See "Creating Configuration Files for BRM Utilities" in BRM System Administrator's Guide.

Important:

  • This utility requires Perl version 5.004_00.

  • This utility uses DBI and DBD drivers, which are not part of the Pipeline Manager installation. Download these drivers from www.cpan.org, and then compile and install them separately.

  • (HP-UX only) Before running this utility, you must load the libjava.so library. One way of doing this is to set the LD_PRELOAD environment variable to point to the library file:

For example:

# setenv LD_PRELOAD /u01/app/oracle/product/817/JRE/lib/PA_RISC/native_threads/libjava.so

Location

BRM_Home/apps/uel

Syntax

settlement_extract.pl [-u] dbi:dcs username password [filepath]

Parameters

-u

Creates a unique file name for the new file using the current time. The format of the file name is:

"settlement_YYYY-MM-DD_hh-mm-ss.txt"

dcs

The database connection string. This required parameter enables the script to access the database. The string is different for each database type. Example dcs for Oracle:

Oracle:orcl

Note:

The dbi prefix in the connection string refers to DBI, the standard database access module for Perl scripts. It defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

username

The database username.

password

The database password.

filepath

The location to which the file is written. If you don't include this parameter, the file is written to the current directory.

Results

If successful, the settlement_extract utility creates a roaming settlement data file and reports success. Otherwise, it displays an error message.

stateconfigtool

Use this utility to load state configuration (state.config) files for use with the Pricing Center Pipeline Manager data migration feature.

Important:

Before you run stateconfigtool, make sure that the following files are listed in your system CLASSPATH environment variable:

  • msbase.jar

  • msutil.jar

For more information, see Migrating pipeline pricing data in BRM Pricing Center Online Help.

Location

Pipeline_Home/tools/StateConfigTool

where Pipeline_Home is the directory in which you installed Pipeline Manager.

Syntax

stateconfigtool -f file_name -d database_type -h host -n port -u user_name -p password -i database_id 

Parameters

-f

The path and file name of the state.config file to be loaded. This file contains descriptions of change set state transitions, such as currentState, nextState, and Action.

The default directory is Pipeline_Home/tools/StateConfigTool.

-d

The database type. The supported database is oracle.

-h

The host name of the computer running the Pipeline Manager database.

-n

The port number used by the Pipeline Manager database.

-u

The login name for connecting to the database.

-p

The password for the specified user name.

-i

The database ID of the Pipeline Manager database.

Results

The utility loads the contents of the state.config file into the Pipeline Manager database. The states defined in the file become available in the Change Set Manager when it is restarted.

Related Topics

See "Understanding the Change Set Life Cycle" in BRM Configuring Pipeline Rating and Discounting.

StopRapGen

The StopRapGen utility searches the database to collect the information required by the Stop RAP Generator pipeline to create Stop Return files for the Returned Account Procedure (RAP).

It retrieves information on the following:

  • Transferred Account Procedure (TAP) files that were received by BRM and stored in the database more than seven days ago.

  • Stop Return RAP files that were generated by BRM and sent more than seven days ago to the Visited Public Mobile Network (VPMN) operator.

Note:

The output from the StopRapGen utility is used by the Stop RAP Generator pipeline to generate the Stop Return RAP file.

Use the StopRapGen utility along with the Stop RAP Generator pipeline.

Location

Pipeline_Home/bin

where Pipeline_Home is the directory in which you installed Pipeline Manager.

Syntax

StopRapGen64 database_access_library server_name database_name path [prefix] [days]

Parameters

database_access_library

The database access library. For example, liboci10g6312d.a for Oracle on AIX.

server_name

Specifies the name of the host machine running the Pipeline Manager database.

database_name

Specifies the database ID of the Pipeline Manager database.

path

Specifies the output directory of the flat file generated by the StopRapGen utility. This file is used by the Stop RAP Generator pipeline.

Tip:

The output directory for the StopRapGen utility should be the same as the input directory for the Stop RAP Generator pipeline.

prefix

Specifies the prefix to be added to the output flat file. The default prefix is RC.

days

Specifies the number of days to consider for generating a Stop Return RAP file. The default is 7, in accordance with the RAP standard.
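
The days cutoff can be sketched with shell arithmetic and date(1). The reference time (epoch 1000000000) is an assumption so the output stays predictable; the real utility works from the current date.

```shell
# Compute the cutoff date that is `days` (default 7) before an
# assumed reference time; TAP files received before this date
# qualify for a Stop Return RAP file.
ref=1000000000                     # assumed run time, epoch seconds
days=7
cutoff_epoch=$(( ref - days * 86400 ))
date -u -d "@$cutoff_epoch" +%Y-%m-%d
```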

Example

StopRapGen64 liboci10g6312d.so $ORACLE_SID '' ./data/stoprap/in

where:

  • liboci10g6312d.so is the database access library.

  • $ORACLE_SID is the database alias.

  • '' is the empty string passed in as the database name.

  • ./data/stoprap/in is the output directory for the flat file generated by the StopRapGen utility. This is also the input directory of the Stop RAP Generator pipeline.

Results

The StopRapGen utility generates the input required by the Stop RAP Generator pipeline.

ZoneDBImport

The ZoneDBImport utility loads data into the IFW_STANDARD_ZONE table of the Pipeline Manager database.

This utility uses the following files:

  • Control File (zoneLoader.ctl)

    The zoneLoader.ctl file controls how the data is loaded. It contains information about the table name, column datatypes, field delimiters, and so on.

    Initialize the infile variable with the path and file name of the file that contains the data to be imported.

  • Execution File (zoneLoader.pl)

    Update the entries for the DatabaseName and UserName with the database name and user name of the current database.
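
    A minimal sketch of what a zoneLoader.ctl control file might look like, assuming a semicolon-delimited data file. Everything here is a placeholder: the infile path is invented, and the column names are assumptions, not the authoritative IFW_STANDARD_ZONE layout; use the actual column list from your installation's schema.

    ```
    LOAD DATA
    INFILE '/export/zonedata/standard_zones.dat'  -- initialize infile with your data file
    APPEND
    INTO TABLE IFW_STANDARD_ZONE
    FIELDS TERMINATED BY ';'
    (
      -- placeholder columns: replace with the actual
      -- IFW_STANDARD_ZONE column list from your schema
      ZONEMODEL,
      ORIGIN_AREACODE,
      DESTIN_AREACODE
    )
    ```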

Location

Pipeline_Home/tools

Syntax

./zoneLoader.pl