86 Pipeline Manager Utilities

This chapter provides reference information for Oracle Communications Billing and Revenue Management (BRM) Pipeline Manager utilities.

Database Loader

The Database Loader utility loads and unloads aggregation data into and from a database.

For information about aggregation, see "Setting Up Pipeline Aggregation".

Dependencies

This utility requires a connection to the database through the DBC database module, and it requires the DBL library (libDBLXXX.so). See "Database Connect (DBC)".

Location

pipeline_home/tools

where pipeline_home is the directory in which you installed Pipeline Manager.

Syntax

dbLoader -r registry [-f files] [-u]

Parameters

-r registry

Defines the registry file.

-f files

Defines the file pattern (regular expression).

-u

Runs the utility in undo mode, which unloads previously loaded aggregation data from the database.
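
For example, a typical invocation might look like the following; the registry path and file pattern are illustrative:

dbLoader -r ./conf/dbloader.reg -f "AGG_.*"

To back out the same files, add the -u parameter:

dbLoader -r ./conf/dbloader.reg -f "AGG_.*" -u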

Registry Entries

Table 86-1 lists the Database Loader registry entries.

Table 86-1 Database Loader Registry Entries

Entry Description Mandatory

BULKSIZE

Specifies the Oracle array size for bulk inserts (loadmode 2 and 3).

Yes

DIRECTIONMODE

Defines the selection order of the control files (1 = file name, 2 = sequence).

Yes

FILES.ARCHIVE.PATH

Specifies the path where the successfully loaded files are stored.

Yes

FILES.ARCHIVE.SUFFIX

Specifies the suffix of the successfully loaded data files.

Yes

FILES.BAD.PATH

Specifies the path where the bad files are stored.

Yes

FILES.BAD.SUFFIX

Specifies the suffix of the bad data files.

Yes

FILES.CONTROL.PATH

Specifies the path for the input aggregate control files.

Yes

FILES.CONTROL.SUFFIX

Specifies the suffix of the input aggregate control files.

Yes

FILES.DATA.PATH

Specifies the path for the input aggregate data files.

Yes

FILES.DATA.SUFFIX

Specifies the suffix of the input aggregate data files.

Yes

FILES.MERGE.PATH

Specifies the path where the source merge data files are stored.

Yes

FILES.MERGE.SUFFIX

Specifies the suffix of the source data files before merging/sorting.

Yes

FILES.REJECT.PATH

Specifies the path where the rejected files are stored.

Yes

FILES.REJECT.SUFFIX

Specifies the suffix of the rejected data files.

Yes

LOADMODE

Specifies how to load data:

  • 1: Single row updates and inserts.

  • 2: Single row updates and bulk inserts.

  • 3: Single row updates and bulk inserts. Before loading, the files can be merged or sorted and split into smaller pieces.

Undo mode always uses load mode 1.

Yes

MAXSPLITLINES

Specifies the maximum number of lines per data file after splitting (loadmode 3).

Yes

ROLLBACKSEGMENT

Specifies which Oracle rollback segment to use when loading the database. How to set this entry depends on your database software setup.

If your Oracle9i database uses automatic undo management, comment out or remove this registry entry. If your database does not use undo management, specify a rollback segment.

The Oracle9i software provides an automatic undo management feature, which creates undo tablespaces rather than rollback segments for undo information. If you use this undo management feature and specify a rollback segment for the Pipeline Manager Database Loader utility, the utility fails when it attempts to load the database. To prevent this problem, don't specify a rollback segment.

No

SORTCMD

Specifies the external sort command (loadmode 3).

Yes

SORTING

Specifies whether files of identical structure should be merged and sorted (loadmode 3).

Yes

SORTMAXFILESIZE

Specifies the maximum destination size of the merged and sorted files (loadmode 3).

Yes

SORTTMPDIR

Specifies the path where sort stores temporary files (loadmode 3).

Yes

SPLITTING

Specifies whether to split data files before loading to reduce transaction size (loadmode 3).

Yes

Sample Registry

DBLOADER 
  { 
    Active              = TRUE
    ProcessLoopTimeout  = 10
    QueueRequestTimeout = 0
    Instrumentation
    {
      #-----------------------------------------------------------
      # ProbeBroker registry entries.
      # ProbeInfoFilePath - The path that contains all probe
      # info files used by instrumented objects.
      #-----------------------------------------------------------
      ProbeBroker
      {
        ProbeInfoFilePath = ./instrumentation
      }
    }
    LogMessageTable
    {
      MessageFilePath   = ./etc
      MessageFileSuffix = .msg
    }
    DiagnosticDataHandler
    {
      DiagnosticFilePath = ./log
      DiagnosticFileName = diagnostic.dat
    }
    #
    # main parameter
    #
    DIRECTIONMODE   = 2
    LOADMODE        = 2
    BULKSIZE        = 100
    ROLLBACKSEGMENT = R04
    SORTING         = true
    SORTCMD         = sort
    SORTTMPDIR      = .
    SORTMAXFILESIZE = 2000000000
    SPLITTING       = true
    MAXSPLITLINES   = 40000
    #
    # database section
    #
    DataPool
    {
      Database
      {
        ModuleName = DBC
        Module
        {
          DatabaseName = $ORACLE_SID
          UserName     = AGGREGATOR
          PassWord     = password        
          AccessLib    = oci231
          Connections  = 1
        }
      }
    }
    #
    # File Section
    #
    FILES
    {
      CONTROL
      {
        PATH    = ./data/aggregate/cntl
        SUFFIX  = .ctl
      }
      DATA
      {
        PATH    = ./data/aggregate/done
        SUFFIX  = .dat
      }
      REJECT
      {
        PATH      = ./data/aggregate/reject
        SUFFIX    = .rej
        THRESHOLD = 85
      }
      REJECT_HANDLE
      {
        PATH      = ./data/aggregate/reject
        SUFFIX    = .rej
        THRESHOLD = 85
      }
      ARCHIVE
      {
        PATH    = ./data/aggregate/archive
        SUFFIX  = .arc
      }
      BAD
      {
        PATH    = ./data/aggregate/bad
        SUFFIX  = .bad
      }
      MERGE
      {
        PATH    = ./data/aggregate/merge
        SUFFIX  = .mrg
      }
    }
    #
    # log section
    #
    ProcessLog
    {
      ModuleName = LOG
      Module
      {
        ITO
        {
          MessageFilePath = etc
          MessageFilePrefix = error
          MessageFileSuffix = error.msg
          FilePath = ./data/aggregate/log
          FileName = process
          FilePrefix = DBL_
          FileSuffix = .log
          ProcessName = dbLoader
          MessageGroup = DBLOADER
        }
        Buffer
        {
          Size = 1000
        }
      }
    }
  }

db2irules.pl

Use the db2irules.pl script to extract rule sets from the Pipeline Manager database into the Rule Set XML file.

Note:

This utility uses DBI and DBD drivers that are not part of the Pipeline Manager installation. You download these drivers from https://www.cpan.org and compile and install them separately.

Location

pipeline_home/tools/IRules2Db/db2irules.pl

Note:

Since there are dependencies between the db2irules.pl script and the PerlParser.pm XML library located in the same directory as the script, always run the script from this location.

Syntax

db2irules.pl [-d] [-u] dbi:dcs file_path rule_set_id

Parameters

If you start the db2irules.pl script without any parameters, a usage description and an example for each parameter are displayed.

-d

Deletes the specified rule sets from the database after you have extracted them. If you use this parameter, a transaction is opened with the database. If any of the rule set deletes fail, the entire delete sequence is rolled back to preserve database integrity. If all rule set tables are deleted successfully, the transaction is committed to the database.

-u

Creates a unique file name for the rule set, based on date and time. It uses the following format: RULESET_yyyy-mm-dd_hh-mm-ss.xml. Use this parameter to ensure that you do not overwrite an existing XML file when extracting rule sets. If the file name for a rule set contains spaces, replace them with the underscore character (_).

Example:

db2irules.pl -u dbi:Oracle:orcl TAP3_VAL

dbi:dcs

Specifies the database connection string. This required parameter enables the script to access the database. The string is different for each database type. For example, the dcs for Oracle is:

Oracle:orcl

Note:

The dbi prefix refers to DBI, the standard database access module for Perl scripts. DBI defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

file_path

Specifies where you want to export the rule set. If you want to use the same directory in which the rule set is stored, use ./ as the file path. If you don't set this parameter, the rule set is exported automatically to the current directory.

rule_set_id

Extracts only one specific rule set, which is identified by its unique ID. If you don't set this parameter, the db2irules.pl script will extract all rule sets from the database. If you use this parameter, you must use the file_path parameter. This rule_set_id refers to the IFW_RULESET.RULESET database field.
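
For example, the following command extracts only the rule set with the illustrative ID TAP3_VAL into the current directory:

db2irules.pl dbi:Oracle:orcl ./ TAP3_VAL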

Diagnostic Data Handler

Use the Diagnostic Data Handler to collect data about Pipeline Manager after a crash, exception, or critical error, or while it is running.

Registry Entries

Table 86-2 lists the Diagnostic Data Handler registry entries.

Table 86-2 Diagnostic Data Handler Registry Entries

Entry Description Mandatory

DiagnosticFilePath

Path to the log file that is created by Diagnostic Data Handler.

Yes

DiagnosticFileName

File name of the log file that is created by Diagnostic Data Handler.

Yes

Sample Registry

DiagnosticDataHandler
{
   DiagnosticFilePath = ./log
   DiagnosticFileName = diagnostic.dat
}

irules2db.pl

Use the irules2db.pl script to insert a rule set from the Validation Rules XML file into the Pipeline Manager database.

Note:

This utility uses DBI and DBD drivers which are not part of the Pipeline Manager installation. You download these drivers from https://www.cpan.org and compile and install them separately.

Location

pipeline_home/tools/IRules2Db/irules2db.pl

Note:

Because there are dependencies between the irules2db.pl script and the PerlParser.pm XML library, which is located in the same directory as the script, always run the script from this location.

Syntax

irules2db.pl [-f] dbi:dcs rule_set_name backup_file_path

Parameters

If you start the irules2db.pl script without any parameters, a usage description and an example for each parameter are displayed.

dbi:dcs

The database connection string. This required parameter enables the script to access the database. The string is different for each database type. For example, the dcs for Oracle is:

Oracle:orcl

Note:

The dbi prefix refers to DBI, the standard database access module for Perl scripts. DBI defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

rule_set_name

Use this parameter to specify the name of the Rule Set XML file that you want to import to the database. This parameter supports fully qualified and relative path names.

Examples:

  • ./tap3_val.xml

  • /home/data/tap3_val.xml

  • /../files/tap3_val.xml

  • tap3_val.xml

backup_file_path

Use this parameter to specify the path where the extracted rule set is stored as a backup before it is deleted from the database and the modified rule set is inserted from the Rule Set XML file. Use this parameter with the -f parameter.

-f

This parameter forces the rule set into the database. The irules2db.pl script connects to the database and starts parsing the Rule Set XML file. When it finds the name of the rule set, it calls the db2irules.pl export script with the -u and -d parameters. If the db2irules.pl script finishes successfully, the irules2db.pl script continues parsing the XML file and imports the rule set into the database. If any of the rule set columns fail to be inserted, the irules2db.pl script rolls back the transaction and exits. If all columns are inserted into the database successfully, the transaction for the rule set is committed.
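
For example, the following command forces the rule set in tap3_val.xml into the database and stores a backup of the existing rule set in an illustrative backup directory:

irules2db.pl -f dbi:Oracle:orcl ./tap3_val.xml ./backup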

load_notification_event

Loads the notification XML file from Pipeline Manager into the BRM database. This allows BRM to notify customers when their balance has reached a threshold value during the batch rating process.

You must configure the Batch Controller to run this utility.

Location

BRM_home/bin

where BRM_home is the directory in which you installed the BRM software.

Syntax

load_notification_event [-d] [-v] [-h] XML_file

Parameters

-d

Sets the log level to debug and outputs debug information into the log file for this process. If not set, only error-level information is output.

-v

Displays information about failed or successful processing as the utility runs.

-h

Displays syntax and parameters for this utility.

XML_file

The name and location of the XML file to load into the BRM database. This must be the last parameter listed on the command line.
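
For example, the following command loads an illustrative notification XML file and displays processing information:

load_notification_event -v notification_event.xml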

Results

This utility notifies you when it successfully loads the XML file.

If the utility does not notify you that it was successful, look in the utility log file (default.pinlog) to find any errors. The log file is either in the directory from which the utility was started or in a directory specified in the configuration file.

load_pin_rtp_trim_flist

Use this utility to specify account and service object fields to be included in the flist sent to a real-time rerating, discounting, or zoning pipeline. The main uses for this utility include:

  • Improving system efficiency by removing (trimming) fields that Pipeline Manager doesn't use.

  • Supporting custom iScripts and iRules in the real-time pipeline by adding fields to flists which are not included by default.

You can configure a different set of fields to be included in the flist based on event type.

Account object fields are included in the PIN_FLD_INHERITED_INFO substruct in the flist. Service object fields are included in the PIN_FLD_INHERITED_INFO.PIN_FLD_SERVICE_INFO substruct.

Note:

  • You cannot load separate /config/rtp/trim_flist objects for each brand. All brands use the same object.

  • You can't remove fields from the PIN_FLD_INHERITED_INFO substruct or the subordinate PIN_FLD_INHERITED_INFO.PIN_FLD_SERVICE_INFO substruct.

You specify the list of required fields in an XML file (field_list.xml) and then load the file using the utility.

Note:

  • If you use the utility to add new fields to the flist, you must update the input modules of all the pipelines to add the fields to the EDR container.

  • After you use the utility, you must restart BRM.

Location

BRM_home/bin

Syntax

load_pin_rtp_trim_flist -f field_list.xml [-v] [-d]

Parameters

-f field_list.xml

Specifies the XML file that describes which fields should be read. For a sample XML file, see BRM_home/sys/data/config/pin_config_rtp_trim_flist.xml.

-v

Displays information about successful or failed processing as the utility runs.

-d

Creates a log file for debugging purposes. Use this parameter for debugging when the utility appears to have run with no errors, but the data has not been loaded into the database.
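
For example, the following command loads the sample XML file shipped with BRM and displays processing information:

load_pin_rtp_trim_flist -f BRM_home/sys/data/config/pin_config_rtp_trim_flist.xml -v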

LoadIfwConfig

Use this utility to extract data from or load data into the Pipeline Manager database. This enables you to:

  • Migrate data from a legacy database to the Pipeline Manager database.

  • Transfer data between Pipeline Manager databases; for example, from a test database to a production database. See "Transferring Data Between Pipeline Manager Databases".

    Note:

    The 7.4 version of the LoadIfwConfig utility is not backwards-compatible with previous versions of the utility. Any data exported by a previous version of the utility must also be loaded with that same version. In addition, any custom scripts or procedures that are dependent on the utility's functionality might need to be modified to work with the 7.4 version.

The LoadIfwConfig utility can run in these modes:

  • Non-interactive mode: You use commands that batch several related parts of the extracting or loading process. You must enter a full command, including the utility name for each set of actions.

  • Interactive mode: You issue a command for each step in the process of extracting or loading. After you enter interactive mode, the prompt changes to an angle bracket and commands are single words for performing particular actions. You can view a list of the change sets that will be extracted or loaded.

Location

pipeline_home/bin

Syntax: Non-Interactive Mode

LoadIfwConfig {-rall [-t Modifidate] | -r [-t Modifidate] | -p [f] | -u | -I} [-c] [-nodep] [-i InputFile] [-o OutputFile] [-h] [-v]

Parameters: Non-Interactive Mode

-rall [-t Modifidate]

Extracts all objects from the Pipeline Manager database. This parameter does not require an input XML file.

Using -t Modifidate retrieves only pricing objects that were modified after the specified timestamp. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

-r [-t Modifidate]

Extracts from the database the objects listed in InputFile.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

-p [f]

Deletes objects from the database.

Using the f parameter turns off the delete confirmation.

-u

Updates the Pipeline Manager database. Data is not actually updated in the database until it is committed with the -c parameter.

-I

Inserts data into the Pipeline Manager database. Data is not actually inserted into the database until it is committed with the -c parameter.

-c

Commits the data to the database. You use this command in conjunction with the -u and -I parameters.

-nodep

Suppresses any object dependency relationships that you configured in the pipeline_home/tools/XmlLoader/CustomConfig.xml file. This allows the utility to extract from the database only those objects that meet your criteria and to ignore any dependent objects. For more information about object dependencies, see "About Specifying to Extract Child and Dependent Objects".

-i InputFile

When extracting pipeline data by using the -r or -rall parameter, this is the name of the XML file that specifies the list of objects to extract from the source Pipeline Manager database.

When loading pipeline data by using the -u or -I parameter, this is the name of the XML file that contains the data you are loading into the destination Pipeline Manager database.

When deleting pipeline data by using the -p parameter, this is the name of the XML file that specifies the list of objects to delete from the Pipeline Manager database.

-o OutputFile

Specifies the output file to which the Pipeline Manager data is extracted. By default, the utility writes the output to a file named default.out in the current directory.

-h

Displays help about using the utility.

-v

Displays information about successful or failed processing as the utility runs.
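
For example, the following commands sketch a typical extract-and-load sequence; the file names and timestamp are illustrative. The first command extracts all objects modified after January 1, 2024, and the second inserts the extracted data into another database and commits it:

LoadIfwConfig -rall -t 2024-01-01 -o pricing_data.xml
LoadIfwConfig -I -c -i pricing_data.xml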

Syntax: Interactive Mode

LoadIfwConfig  [read InputFile] [write OutputFile] [retrieve_all [-t Modifidate]] 
               [fetch [-t Modifidate]] [list] [delete] [commit] [update] [insert]  
               [help] [nodep] [verbose on|off] [quit]

Parameters: Interactive Mode

read InputFile

Reads the specified input file into internal memory.

write OutputFile

Specifies the output file to which the Pipeline Manager data is extracted. By default, the utility writes the output to a file named default.out in the current directory.

retrieve_all [-t Modifidate]

Extracts all objects from the Pipeline Manager database.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

fetch [-t Modifidate]

Extracts from the database the objects listed in internal memory. You use this parameter after you use the read parameter.

Using -t Modifidate retrieves only pricing objects that were modified after the specified time. Enter the time in the ISO-8601 format: YYYY-MM-DDThh:mm:ss or YYYY-MM-DD with the server time zone as the default.

list

Lists the current pipeline data stored in internal memory.

delete

Deletes from the database the objects listed in InputFile.

commit

Commits the data to the database. You use this command in conjunction with the update and insert parameters.

update

Updates the Pipeline Manager database. Data is not actually updated in the database until it is committed with the commit parameter.

insert

Inserts data into the Pipeline Manager database. Data is not actually inserted into the database until it is committed with the commit parameter.

help

Displays help about using the utility.

nodep

Suppresses any object dependency relationships that you configured in the pipeline_home/tools/XmlLoader/CustomConfig.xml file. This allows the utility to extract only those objects that meet your criteria and to ignore any dependent objects. For more information about object dependencies, see "About Specifying to Extract Child and Dependent Objects".

verbose [on | off]

Sets verbose information:

  • verbose on displays the status of the command most recently run.

    Use the ProcessLog section of the registry file to specify the name and location of the file where debug messages are written.

  • verbose off displays the status only if there is an error.

quit

Quits from the utility.
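
The following sketch shows an interactive session that reads an illustrative input file, fetches the listed objects from the database, and writes them to an illustrative output file (assuming you enter interactive mode by running the utility without parameters):

LoadIfwConfig
> read objects.xml
> fetch
> list
> write extracted.xml
> quit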

Results

If the LoadIfwConfig utility is successful, it displays a confirmation message. If unsuccessful, it displays errors.

Memory Monitor

Use the Memory Monitor module to warn you when available system memory is low and to shut down Pipeline Manager when memory reaches a specified threshold.

Registry Entries

Table 86-3 lists the Memory Monitor registry entries.

Table 86-3 Memory Monitor Registry Entries

Entry Description Mandatory

ScaleUnit

Specifies the unit for monitoring memory.

  • P specifies percentage.

  • K specifies kilobytes.

  • M specifies megabytes.

Yes

ShutdownFreeMemLimit

Specifies the amount or percentage of remaining system memory that triggers Pipeline Manager to gracefully shut down.

Note: For percentage, you must enter a value from 1 to 99 inclusive.

Yes

WarningFreeMemLimit

Specifies the amount or percentage of remaining system memory that triggers Pipeline Manager to issue a warning to the user.

Note: For percentage, you must enter a value from 1 to 99 inclusive.

Yes

Sample Registry

ifw
{
    MemoryMonitor
    {
        ScaleUnit = P
        WarningFreeMemLimit = 10
        ShutdownFreeMemLimit = 5
    }
}

pin_container_to_stream_format

Use this utility to create EDR stream, input and output mapping, and input and output grammar files from an EDR container description file. FCT_CallAssembling then uses these files in the process of converting partially assembled call records to a new container description.

Location

BRM_home/bin

where BRM_home is the directory in which you installed BRM components.

Syntax

pin_container_to_stream_format -c container_description_filename -g grammar_file_prefix -m mapping_file_prefix -s stream_file_prefix | -h 

Parameters

-c container_description_filename

Specifies the container description file to use to generate a stream file and the mapping and grammar files. Replace container_description_filename with the container description file to use.

-g grammar_file_prefix

Creates the input and output grammar description files based on the container description file. Replace grammar_file_prefix with a prefix to add to the grammar filenames.

-m mapping_file_prefix

Creates the input and output mapping description files based on the container description file. Replace mapping_file_prefix with a prefix to add to the mapping filenames.

-s stream_file_prefix

Creates the stream description file based on the container description file. Replace stream_file_prefix with a prefix to add to the stream filename.

Note:

If you do not specify one or more of the -g, -m, or -s parameters, this utility generates the files using the container description filename as a prefix. However, if you specify these options, you must also specify their arguments. Otherwise this utility returns an error.

-h

Displays help for this utility.

Example

This example:

pin_container_to_stream_format -c containerDesc.dsc -g OLD_ -m OLD_ -s OLD_ 

Creates these files using the information in containerDesc.dsc:

  • OLD_Stream.dsc

  • OLD_InGrammar.dsc

  • OLD_OutGrammar.dsc

  • OLD_InMap.dsc

  • OLD_OutMap.dsc

Results

The pin_container_to_stream_format utility notifies you only if it encounters errors.

pin_recycle

Use this utility to search for failed EDRs in the BRM database and queue the EDRs for recycling or test recycling, or to delete them.

This utility calls the Suspense Manager opcodes to perform the recycling.

Location

BRM_home/bin

Syntax

pin_recycle [-f CDR_file] [-k recycle_key] [-d | -D | -r reason_code | -t]

Parameters

-f CDR_file

Queues all the failed EDRs that arrived in a single CDR file. Pipeline Manager rates these calls as soon as it can.

-k recycle_key

Searches for and queues EDRs for rating that contain:

  • The recycle_key, an application-specific string that is added to each EDR as it is suspended by Pipeline Manager. See "About Standard Recycling" for details.

  • A status of suspended.

These EDRs are queued for rating by Pipeline Manager as soon as possible.

-d

Searches for and deletes all EDRs with a status of succeeded or written off.

-D

Searches for and deletes all EDRs with a status of succeeded, written off, or suspended.

-r reason_code

Searches for and recycles all EDRs that have the specified reason code.

-t

Specifies a test recycle. In test mode, pin_recycle creates a report about the processing but does not make any changes to the database. Test results are written to the directory and file that you specified by using the FCT_Suspense module RecycleLog registry entries. For standard recycling implementations, you must also set the FCT_Suspense LogTestResults registry entry.
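
For example, the following command queues for recycling all failed EDRs that arrived in a single, illustrative CDR file:

pin_recycle -f TAP3_20240101.cdr

The following command test-recycles all suspended EDRs tagged with an illustrative recycle key:

pin_recycle -k roaming_errors -t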

Results

This utility logs messages to stdout.

The following message is returned after you use pin_recycle to recycle EDRs:

pin_recycle tool, number_of_EDRs EDRs Submitted for Recycling

The following message is returned after you use pin_recycle to test recycle EDRs:

pin_recycle tool, number_of_EDRs EDRs submitted for test recycling

The following message is returned after you use pin_recycle to delete EDRs:

pin_recycle tool, number_of_EDRs suspended EDRs deleted

uninstaller

Use this utility to uninstall the BRM server software, client applications, and optional components from a single machine. If your BRM system is distributed among multiple machines, you must run the uninstaller utility on each machine.

This utility does not remove all BRM files and directories from your system or reverse changes made to your configuration files and database.

Location

BRM_home/uninstaller

Syntax

uninstaller  -log BRM_home/uninstaller/uninst
             [ + | - | = ]product product_name
             [ + | - | = ]component component_name product_name
             read text_file_name

Parameters

-log BRM_home/uninstaller/uninst

Logs status and error messages to the uninst log file.

+

Registers the product or component to uninstall.

-

Points to the product or component to uninstall.

=

Verifies that the product or component is registered for uninstallation.

Commands

  • product product_name

    Uninstalls the specified product. You can only uninstall one product at a time.

    The Infranet.prod file, located in the directory where you downloaded and extracted your BRM software, stores the names of all products installed on your system. product_name must match one of the names in this file.

    For example, to uninstall BRM, type:

    % uninstaller -log BRM_home/uninstaller/uninst -product Portal_Base
      
  • component component_name product_name

    Uninstalls the specified component. You must specify the component name and the parent product.

    The comps directory, located in the directory where you downloaded and extracted your BRM software, lists the names of all components installed on your system. component_name must match one of the file names, minus the extension, in this directory.

    The Infranet.prod file, located in the directory where you downloaded and extracted your BRM software, stores the names of all products installed on your system. product_name must match one of the names in this file.

    For example, to remove the Connection Manager (CM) only, type:

    % uninstaller -log BRM_home/uninstaller/uninst -component CM Portal_Base
      
  • read text_file_name

    Reads the text file and performs any batch operations specified in the text file.
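
    For example, the following command runs the batch operations listed in an illustrative text file:

    % uninstaller -log BRM_home/uninstaller/uninst read uninstall_batch.txt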

Results

The uninstaller utility doesn't notify you whether it was successful or unsuccessful. You must look in your directory structure to see if your files were removed.

purge_np_data.pl

Use this utility to purge records that are older than a specified date and time from the number portability data file. See "Purging and Reloading the Memory Records".

Location

pipeline_home/bin

Syntax

purge_np_data.pl NP_FileName TimeStamp [-b backup_filename] [-n] [-help]

Parameters

NP_FileName

Specifies the name of the number portability data file that will be purged.

TimeStamp

Specifies the date and time prior to which all the number portability records are purged. After the data is purged, the number portability data file is updated to remove the purged records.

Format: YYYYMMDDhhmmss.

-b backup_filename

Specifies the name of the backup file that will contain the unpurged number portability records.

-n

Sorts in the ascending order of the CLI. Default sorting is in the ascending order of the time stamp.

-help

Displays the syntax and parameters for this utility.
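
For example, the following command purges all records older than January 1, 2024, from an illustrative data file and writes a backup file as described for the -b parameter:

purge_np_data.pl np_data.txt 20240101000000 -b np_data.bak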

Results

The purge_np_data.pl utility notifies you when it successfully purges the number portability data file. Otherwise, it displays an error message.

RoamingConfigGen64

Use this utility to retrieve the roaming partner data from the Pipeline Manager database and create the roaming configuration data file. The data file is used by the Instances module to configure multiple instances of sequencers, output streams, or system brands based on the template sections or entries in the roaming registry file.

Location

pipeline_home/bin

Syntax

RoamingConfigGen64 -l database_access_library -s server_name [-d database_name] -c operator_code [-o output_path] [-b base_path] [-h]

Parameters

-l database_access_library

The database access library.

-s server_name

Specifies the name of the host machine running the Pipeline Manager database.

-d database_name

Specifies the database name of the Pipeline Manager database. The default is an empty string (' ').

-c operator_code

Specifies the home network operator code. The default is PORTL.

-o output_path

Specifies the output path for the data file generated by the RoamingConfigGen64 utility. By default, the data file is saved in the pipeline_home/conf/ directory.

-b base_path

Specifies the base path to the directory for Transferred Account Procedure (TAP) and Near Real Time Roaming Data Exchange (NRTRDE) output files. The default path is pipeline_home/data/outcollect/.

For example, if the base path is pipeline_home/data/outcollect/, the following new subdirectories are created in the pipeline_home/data/outcollect/ directory:

  • tapout/ for TAP output files

  • nrtrdeout/ for NRTRDE output files

-h

Displays the syntax and parameters for this utility.

Note:

When prompted, enter the database user name and password.

Example

RoamingConfigGen64 -l liboci10g6312d.so -s $ORACLE_SID -c EUR01

where:

  • liboci10g6312d.so is the database access library.

  • $ORACLE_SID is the database alias.

  • EUR01 is the home network operator code.

Results

If it is successful, the RoamingConfigGen64 utility creates the roaming configuration data file. Otherwise, it displays an error message.

settlement_extract

Use this utility to retrieve roaming settlement information from the IC-Daily tables in the Pipeline Manager database. When Pipeline Manager rates roaming usage, it stores the amounts owed each roaming partner in the IC-Daily tables.

Note:

To ensure only unbilled events are extracted, before running this utility, you must close the bill run for each roaming partner account. You close the bill run by using Pricing Center or Pipeline Configuration Center (PCC).

This utility creates one file containing all settlement information stored in the Pipeline Manager database that has not already been extracted. The settlement information includes the amounts owed to each network that was used for roaming calls.

Note:

  • This utility requires Perl version 5.004_00.

  • This utility uses DBI and DBD drivers which are not part of the Pipeline Manager installation. You download these drivers from https://www.cpan.org and compile and install them separately.

For example, you might need to preload shared libraries required by the drivers before running the script:

# setenv LD_PRELOAD /u01/app/oracle/product/817/JRE/lib/PA_RISC/native_threads/libjava.so

Location

BRM_home/apps/uel

Syntax

settlement_extract.pl [-u] dbi:dcs username password [filepath]

Parameters

-u

Creates a unique file name for the new file using the current time. The format of the file name is:

"settlement_YYYY-MM-DD_hh-mm-ss.txt"

dcs

The database connection string. This required parameter enables the script to access the database. The string is different for each database type. For example, the dcs for Oracle is:

Oracle:orcl

Note:

The dbi prefix refers to DBI, the standard database access module for Perl scripts. DBI defines a set of methods, variables, and conventions that provide a consistent database interface, independent of the actual database being used.

username

The database username.

password

The database password.

filepath

The location where the file is written. If you don't include this parameter, the file is written to the current directory.
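
For example, the following command extracts settlement data to a uniquely named file in an illustrative output directory; the user name and password are also illustrative:

settlement_extract.pl -u dbi:Oracle:orcl aggregator password ./settlement_out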

Results

Creates a roaming settlement data file and reports success or displays an error.

stateconfigtool

Use this utility to load state configuration (state.config) files for use with the Pipeline Manager data migration feature.

Note:

Before you run stateconfigtool, make sure that the following files are listed in your system CLASSPATH environment variable:

  • msbase.jar

  • msutil.jar

Location

pipeline_home/tools/StateConfigTool

Syntax

stateconfigtool -f file_name -d database_type -h host -n port -i database_id 

Parameters

-f

The path and file name of the state.config file to be loaded. This file contains descriptions of change set state transitions, such as currentState, nextState, and Action.

The default directory is pipeline_home/tools/StateConfigTool.

-d

The database type. The supported database is oracle.

-h

The host name of the computer running the Pipeline Manager database.

-n

The port number used by the Pipeline Manager database.

-i

The database ID of the Pipeline Manager database.
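
For example, the following command loads a state.config file into an Oracle Pipeline Manager database; the host name, port number, and database ID are illustrative:

stateconfigtool -f ./state.config -d oracle -h pipelinehost -n 1521 -i orcl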

Results

The utility loads the contents of the state.config file into the Pipeline Manager database. The states defined in the file become available in the Change Set Manager when it is restarted.

StopRapGen

The StopRapGen utility searches the database to collect information required by the Stop RAP Generator pipeline to create Stop Return Returned Account Procedure (RAP) files.

It retrieves information on the following:

  • Transferred Account Procedure (TAP) files that were received by BRM and stored in the database more than seven days ago.

  • Stop Return RAP files that were generated by BRM and sent more than seven days ago to the Visited Public Mobile Network (VPMN) operator.

Note:

The output from the StopRapGen utility is used by the Stop RAP Generator pipeline to generate the Stop Return RAP file.

Use the StopRapGen utility along with the Stop RAP Generator pipeline.

Location

pipeline_home/bin

where pipeline_home is the directory in which you installed Pipeline Manager.

Syntax

StopRapGen64 database_access_library server_name database_name path [prefix] [days]

Parameters

database_access_library

The database access library.

server_name

Specifies the name of the host machine running the Pipeline Manager database.

database_name

Specifies the database ID of the Pipeline Manager database.

path

Specifies the output directory of the flat file generated by the StopRapGen utility. This file is used by the Stop RAP Generator pipeline.

Tip:

The output directory for the StopRapGen utility should be the same as the input directory for the Stop RAP Generator pipeline.

prefix

Specifies the prefix to be added to the output flat file. The default prefix is RC.

days

Specifies the number of days to consider for generating a Stop Return RAP file. The default is 7, in accordance with the RAP standard.

Example

StopRapGen64 liboci10g6312d.so $ORACLE_SID '' ./data/stoprap/in

where:

  • liboci10g6312d.so is the database access library.

  • $ORACLE_SID is the database alias.

  • ' ' is the empty string passed in as the database name.

  • ./data/stoprap/in is the output directory of the sample usage data for the StopRapGen utility (the flat file it generates). This is also the input directory of the Stop RAP Generator pipeline.

Results

The StopRapGen utility generates the input required by the Stop RAP Generator pipeline.

ZoneDBImport

The ZoneDBImport utility loads data into the IFW_STANDARD_ZONE table of the Pipeline Manager database.

This utility uses the following files:

  • Control File (zoneLoader.ctl)

    The zoneLoader.ctl file controls how the data is loaded. It contains information about the table name, column datatypes, field delimiters, and so on.

    Initialize the infile variable with the path and file name of the file that contains the data to be imported (see the example after this list).

  • Execution File (zoneLoader.pl)

    Update the entries for the DatabaseName and UserName with the database name and user name of the current database.
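
For example, the infile entry in zoneLoader.ctl might point to an illustrative import data file as follows:

infile '/data/zone/standard_zone_import.dat'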

Location

pipeline_home/tools

Syntax

./zoneLoader.pl