Oracle® Retail POS Suite 14.1/Merchandising 14.1 Implementation Guide
Release 14.1
E59307-03

5 Customization Notes

This chapter provides information about customizing the implementation.

Data Import Extension Points and Development

Oracle Store Solutions provides not only extension points for enhancing or modifying the capabilities of the existing data imports, but also tools for jump-starting an altogether new data import. Do the following to create a new data import module:

  1. Compose an XSD to which the import data conforms. Follow patterns set in existing XSDs for determining order of type declarations.

  2. Generate sample XML based on the XSD. This can be done manually or by using a tool such as the Eclipse EMF plug-in. See:

    http://www.eclipse.org/
    
  3. Map the XSD to the Data Model.

  3. Run the SAXParserGenerator utility against the XSD.

  5. Add new SAXParser to the ImportTranslatorMap specified in ServiceContext.xml.

  6. Use DAOGenerator to generate data access objects (DAOs) for the tables mapped in step 3.

  7. Rename DAO classes to match logical names of tables.

  8. Delete generated DTOs or DAOs that duplicate classes existing in other packages; reuse the existing classes instead.

  9. Update DAOIfc method parameters to pass actual DTO objects.

  10. In the DAO and SQLIfc, remove from UPDATE_SQL any column names that are not updated during the update procedure.

  11. Update DAO get*Statement() methods to map DTO fields to PreparedStatement buckets.

  12. Create a test that reads the XML and sends it to the translator. How the XML is created or read is not important at this time, nor is whether Spring, JUnit, or an application server is used. (A minimal sketch follows this list.)

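For step 12, the following is a minimal sketch of such a test, assuming the generated translator class is EmployeeImportTranslator and exposes a method that accepts the raw XML payload; the translate() method name here is illustrative, not the product's actual API:

    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class EmployeeImportSmokeTest
    {
        public static void main(String[] args) throws Exception
        {
            // Read the sample XML generated in step 2.
            String xml = new String(Files.readAllBytes(
                    Paths.get("EmployeeImportSample.xml")), "UTF-8");

            // Instantiate the generated translator directly; no Spring,
            // JUnit, or application server is needed at this stage.
            // (The method name is illustrative.)
            EmployeeImportTranslator translator = new EmployeeImportTranslator();
            translator.translate(xml);
        }
    }
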
The following sections discuss these steps in more detail. Where these steps overlap with steps for enhancement (as opposed to steps for creating new imports), the enhancement steps are identified.

First, extension points are identified, and techniques for enhancing existing data imports are described. Each of the previously mentioned DIMP modules (Taxation, Merchandise Hierarchy, Store Hierarchy, and Employee) follows the same pattern of implementation and varies only in minor details. This discussion concentrates on the Employee import. Figure 5-1 shows the Employee Data Import static model.

Figure 5-1 Employee Data Import Static Model


Import Adapter and Translator

The entry point for data imports is the ImportIOAdapterIfc. It is configured through a Spring context as either EEImportIOAdapter, for JCA implementations, or FileImportIOAdapter for direct file I/O implementations. The IO Adapter retrieves the bundles from the file system, determines the processing order, and passes the XML stream data to the ImportInitiator, which determines the import type from the payload and passes the string to a translator. The ImportInitiator (as the BeanLocator) provides an ImportTranslatorIfc from the service context by passing the key EmployeeImportTranslator.IMPORT_TRANSLATOR_BEAN_KEY, for example.

The following example shows the EEImportIOAdapter implementation in use:

    <!-- Import IO Adapter Implements oracle.retail.stores.commerceservices.importdata.ImportIOAdapterIfc -->
    <bean id="service_ImportIOAdapter" class="oracle.retail.stores.commerceservices.importdata.EEImportIOAdapter">
    </bean>
    <!-- Alternative direct file I/O implementation:
    <bean id="service_ImportIOAdapter" class="oracle.retail.stores.commerceservices.importdata.FileImportIOAdapter">
    </bean>
    -->

SAXParserGenerator

If you are creating a new data import module and starting with a defined XSD, you can run a simple utility to generate code for a Translator, SAX handlers, simple DTOs, and a skeleton Import DAO. The following is an example of how to run this utility.

Example 5-1 SAXParserGenerator utility command prompt

<source_directory>\modules\utility>java 
oracle.retail.stores.codegen.importtranslator.SAXParserGenerator "C:\Data Import\Design\Employee\EmployeeImport.xsd" 
oracle.retail.stores.commerceservices.employee.importdata 
../../commerceservices/employee/src

This command line example shows that the utility program is Java-based and takes three arguments:

  • The location of the XSD file.

  • The desired package name for the generated source code.

  • The directory in which to place new source code files.

This utility can be configured as an executable target in your favorite Integrated Development Environment (IDE) so that it can be rerun as changes continue to be made to the XSD that defines the format of the new data import.

The code generation uses the Java-based Velocity templates and APIs. See:

http://velocity.apache.org/

Manually Editing Generated Code

The generated code requires additional manual editing before it can be used. For example, the generated ImportDAO has only the barest implementation in its methods; add code to pass each DTO to the correct DAO that can handle it. A minimal sketch of such a dispatch follows.

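The dispatch in the ImportDAO is typically a type check on each incoming DTO. The following is a minimal sketch of such a method; the DTO classes and DAO fields are illustrative, not the product's actual names:

    public void processEntity(java.io.Serializable dto) throws ImportException
    {
        // Route each DTO type to the DAO that knows how to persist it
        // (DTO and DAO names are illustrative).
        if (dto instanceof EmployeeDTO)
        {
            employeeDAO.insert((EmployeeDTO) dto);
        }
        else if (dto instanceof EmployeeAccessDTO)
        {
            employeeAccessDAO.insert((EmployeeAccessDTO) dto);
        }
        else
        {
            throw new ImportException("Unexpected DTO type: " + dto.getClass().getName());
        }
    }
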
Appropriate DTOs might already exist in the codebase. Examine the attributes of a pre-existing DTO to decide whether it or the generated DTO should be used. In some cases, additional code might need to be added. For example, because a single-entity DTO usually represents a single record in the database, the SAX handlers are coded not to process the child DTOs passed to them until the DTO that the handler itself creates is successfully processed.

Example 5-2 EmployeeAccessHandler Process DTO Before Children

/**
 * End handling this element. Calls {@link
 * ImportHandlerIfc#processEntity(java.io.Serializable)}
 * @throws SAXException
 */
public void end() throws SAXException
{
    try
    {
        // process this first
        parent.processEntity(employeeAccessDTO);

        // process all its children
        Iterator iter = children.iterator();
        while (iter.hasNext())
        {
            Serializable child = (Serializable)iter.next();
            parent.processEntity(child);
        }
    }
    catch (ImportException e)
    {
        logger.error("Could not end element " + getText(), e);
        throw new SAXException("Could not end element " + getText(), e);
    }
}

However, in some cases, such as when important attributes are needed to fill the DTOs and must be persisted immediately, the call to parent.processEntity(Serializable) can be moved from the end() method to the start(Attributes) method. The start(Attributes) method is called when parsing the beginning of the XML element. Notice in the following example that the value for "Incremental" defaults to true if it does not exist.

Example 5-3 EmployeeImportHandler Process DTO During Start

/**
 * Start handling this element by inspecting its attributes, if any.
 * @param attributes the attributes given.
 * @throws SAXException
 */
public void start(Attributes attributes) throws SAXException
{
    String incremental = attributes.getValue("Incremental");
    Boolean bIncremental = (incremental != null) ? Boolean.valueOf(incremental) : Boolean.TRUE;
    employeeImportDTO.setEmployeeImportIncrementalAttribute(bIncremental.booleanValue());

    try
    {
        // process this first
        parent.processEntity(employeeImportDTO);
    }
    catch (ImportException e)
    {
        logger.error("Error starting import" + employeeImportDTO, e);
        throw new SAXException("Error starting import" + employeeImportDTO, e);
    }
}

There also might be a scenario where parent XML element values, such as IDs, are required by child DTO objects. These attributes might have to be added manually to the DTOs and set by the handlers. See the Merchandise Import DTO LevelDTO, and the handlers that call its set methods, for an example.

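The following is a sketch of the pattern, assuming a hypothetical parent element carrying a HierarchyID attribute and a child DTO with a matching setter (both names illustrative):

    public void start(Attributes attributes) throws SAXException
    {
        // Capture the parent element's ID so it can be stamped onto the
        // child DTO, which needs it for persistence (the attribute name
        // and setter are illustrative).
        String hierarchyId = attributes.getValue("HierarchyID");
        levelDTO.setParentHierarchyId(hierarchyId);
    }
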
If it seems that the SAX handlers or the DTOs are missing attributes for defined XML elements, there might be errors in the XSD that the SAXParserGenerator cannot decipher. Ensure that your XSD validates properly based upon the schema at:

http://www.w3.org/2001/XMLSchema

Metadata

The top-level element of each import includes metadata pertaining to the import bundle. Among other possible uses, this data is included in import bundle tracking and error logging. The following is an example XML fragment. Consult the development team for the status of data import schemas beyond this release.

<ItemImport Priority="0"
            FillType="FullIncremental"
            Version="1.0"
            Batch="1"
            CreationDate="2001-12-17T09:30:47.0Z"
            ExpirationDate="2007-12-17T09:30:47.0Z"
            xsi:noNamespaceSchemaLocation="ItemImport.xsd"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
. . .

The metadata attributes are defined as follows:

Priority

An integer specifying the order, from lowest to highest, in which multiple files of one type in a bundle should be processed.


Note:

Priority is not currently used. The Manifest.mf file specifies the XML file processing order and any dependencies.

FillType

The feed method: Kill And Fill, Delta Incremental, or Full Incremental. The XSD specifies which of these are allowed for an import type. For example, Tax allows only Kill And Fill, while Item allows all three.

  • Kill And Fill – Deletes (kills) all existing data and then performs record operations to fill tables. The kill is rolled back upon failure during the fill stage.

  • Full Incremental – Incrementally performs record operations in order against production tables. Updates contain the full record needed for the update.

  • Delta Incremental – Same incremental behavior as Full Incremental, but updates need contain only the changed values of each record to be updated.

Version

The version of the application processing the data.

Batch

An integer sequence number, corresponding to the ID of the process that created the file.

CreationDate

A timestamp identifying the file's creation time.

ExpirationDate

A timestamp beyond which a file has become stale and should not be processed. This attribute does not need to be present.

ImportControllerIfc

The current implementation of the ImportControllerIfc operates well in most circumstances. However, there might be circumstances that call for a different version of the controller to be plugged in. For example, a new controller might put a parsed batch onto one of many secondary queues instead of passing it synchronously to a DAO, then returning control to the translator to continue parsing the import.

The secondary queue is another thread that takes the incoming batch and passes it to an instance of the import DAO. This enables multiple batches to be processed at once.

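The following is a minimal sketch of the queueing idea using a standard java.util.concurrent.BlockingQueue; the ImportDAOIfc type and its processBatch method are illustrative, not the product's actual signatures:

    import java.io.Serializable;
    import java.util.List;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class QueuedImportController
    {
        // Bounded queue of parsed batches awaiting persistence.
        private final BlockingQueue<List<Serializable>> queue =
                new ArrayBlockingQueue<List<Serializable>>(10);

        public QueuedImportController(final ImportDAOIfc dao)
        {
            // Secondary queue consumer: drains batches and passes each one
            // to the import DAO while the translator continues parsing.
            Thread worker = new Thread(new Runnable()
            {
                public void run()
                {
                    try
                    {
                        while (true)
                        {
                            dao.processBatch(queue.take()); // illustrative method
                        }
                    }
                    catch (InterruptedException e)
                    {
                        Thread.currentThread().interrupt();
                    }
                }
            });
            worker.setDaemon(true);
            worker.start();
        }

        // Called by the translator; blocks only when the queue is full,
        // then control returns so parsing can continue.
        public void submitBatch(List<Serializable> batch) throws InterruptedException
        {
            queue.put(batch);
        }
    }
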
Oracle Retail POS Suite to Oracle Retail Sales Audit Extension Points and Development

There are three distinct situations in which an implementation team would need to extend the functionality in the Export File Generator:

  • Adding data elements to the RTLog Format.

  • Creating an entirely new fixed length export format.

  • Creating an entirely new export format which is not fixed length.

Adding Data Elements to the RTLog Format

To add VAT information to one or more of the Reference fields in the Transaction Item record of the RTLog, an implementation team takes the following steps:

  1. Define the format of the VAT data.

  2. Depending on the outcome of step 1, it might be advantageous to modify the definition of a Reference field in the Transaction Item record. This causes the creation of an Acme-specific Export Format Configuration file. If this is desirable, copy RTLogFormat.xml to AcmeRTLogFormat.xml and make the modifications in this file.

  3. Define how the columns in the table TR_LTM_SLS_RTN_TX map to the format defined in step 1.

  4. Write a FieldMapper class called AcmeItemVATTax.java to perform the mapping (a sketch follows these steps).

  5. Copy RTLogMappingConfig.xml to AcmeRTLogMappingConfig.xml and make the following change to the new file:

    <TABLE table="TR_LTM_SLS_RTN_TX">
        <MAP column="MO_TX_RTN_SLS" record="TransactionTax" field="TaxAmount"
             fieldMapper="com.acme.exportfile.RTLog.fieldmappers.AcmeItemVATTax"/>
    </TABLE>
    
  6. Modify StoreServerConduit.xml to use AcmeRTLogMappingConfig.xml and AcmeRTLogFormat.xml instead of RTLogMappingConfig.xml and RTLogFormat.xml.

If the Reference field is partitioned correctly, and the values coming from the database to these new fields do not require manipulation, then it is possible that the FieldMapper class is not required.

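If a FieldMapper is required (step 4), it implements the map() method whose signature appears in the example in the next section. The following is a sketch, assuming FieldFormatIfc exposes a value setter and that the VAT layout from step 1 is a simple prefix (both assumptions are illustrative):

    public class AcmeItemVATTax implements FieldMapperIfc
    {
        public int map(String columnValue, Row row, ColumnMapIfc columnMap,
                FieldFormatIfc field, RecordFormatIfc record, EntityIfc entity,
                EntityMapperIfc entityMapper) throws ExportFileException
        {
            // Render the VAT amount from MO_TX_RTN_SLS into the Reference
            // field layout defined in step 1 (the layout and the setter
            // are illustrative).
            field.setValue("VAT" + columnValue);
            return ColumnMapIfc.SUCCESS;
        }
    }
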
Blocking Transaction Export

The RTLog file export feature processes all transactions. However, there may be some kinds of transactions that a customer does not want to send to Oracle Retail Sales Audit. For example, the customer might not want Training Mode transactions to be sent to Oracle Retail Sales Audit. Do the following to prevent the Training Mode transactions from being exported, for example:

  1. Modify the RTLogMappingConfig.xml file. Replace the following code:

        <MAP column="FL_TRG_TRN" record="TransactionHeader" field="SubTransactionType">
            <VALUE_MAPPINGS handleNotFound="success">
               <VALUE_MAPPING DatabaseValue="1" RecordValue="TRAIN"/>
            </VALUE_MAPPINGS>
        </MAP>
    

    With these lines:

        <MAP column="FL_TRG_TRN" record="TransactionHeader" field="SubTransactionType" fieldMapper="oracle.retail.stores.exportfile.rtlog.fieldmappers.TrainingModeTransNotExportableMapper"/>
    
  2. Add a FieldMapper called TrainingModeTransNotExportableMapper.java. This FieldMapper contains the following method:

        public int map(String columnValue, Row row, ColumnMapIfc columnMap,  
                FieldFormatIfc field, RecordFormatIfc record, EntityIfc entity, 
                EntityMapperIfc entityMapper) throws ExportFileException
        {
            // The column is FL_TRG_TRN; it is a boolean where "1" indicates
            // the transaction was created in training mode.
            if (columnValue.equals("1"))
            {
                logger.warn("Not exporting training mode transactions due to a duplicate transaction issue at Oracle Retail Sales Audit.");
                RTLogMappingResultIfc results = (RTLogMappingResultIfc)entityMapper.getResults();
                results.setTransactionExportable(false);
            }
            return ColumnMapIfc.SUCCESS;
        }
    

Creating a New Fixed Length Export Record Format

Oracle Retail has only one way to send transactional data to a customer's back end systems: POSLog. However, it is expensive and time-consuming to extend POSLog, explain it to customers, and develop the code that loads it into the customer's back end.

It might be faster and cheaper to use the Export File Generator to generate the transaction log format that the customer is already consuming.

The generation of all three current formats (DTM for Central Office, POSLog for the customer backend, and RTLog for Oracle Retail Sales Audit) simultaneously has been tested in the development environment.

Do the following to create transaction log export code for Acme, a generic customer:

  1. Work with Acme developers to create a mapping document that describes the relationship between the Oracle database and the current Acme back end system/transaction log format. A mapping exercise of this type must be done even if the customer eventually chooses to use the POSLog to transfer the data. Understanding the customer's current transaction log can provide valuable insight into the data requirements.

  2. Construct an Acme-specific Export Format Configuration file which describes all the records in the Acme transaction log; call this file AcmeTLogFormatConfig.xml.

  3. Create an Acme-specific Mapping configuration file; call this file AcmeTLogMappingConfig.xml.

  4. Create an Acme-specific Entity Reader configuration file; call this file AcmeTLogExtractConfig.xml.

  5. If Acme exports the RTLog for Oracle Retail Sales Audit, the RTLogExportDaemonTechnician and RTLogExportDaemonThread can still be used to export the Acme Tlog formatted data. Just create another entry in StoreServerConduit.xml with a different technician and daemon name. This entry looks like the following:

    <TECHNICIAN name="AcmeTLogExportDaemonTechnician" 
                class="RTLogExportDaemonTechnician" 
                package="oracle.retail.stores.domain.manager.RTLog" 
                export="Y">   
        <PROPERTY propname="daemonClassName"
                  propvalue="oracle.retail.stores.domain.manager.RTLog.RTLogExportDaemonThread"/>
        <PROPERTY propname="daemonName"
                  propvalue="AcmeTLogExportDaemon"/>
                  .
                  .
                  .
    </TECHNICIAN>
    
  6. Modify StoreServerConduit.xml to use AcmeTLogExtractConfig.xml, AcmeTLogFormatConfig.xml and AcmeTLogMappingConfig.xml when exporting the Acme TLog.

  7. Determine the batch ID column to use for this process. By convention, DTM uses TR_TRN.ID_TLOG_BTCH, POSLog uses TR_TRN.ID_BTCH_ARCH, and RTLog uses ID_RTLOG_BTCH. If your system exports RTLog, you must override RTLogExportBatchGenerator.retrieveTransactionList() and RTLogDatabaseAdapter.postResults() to change the column your application uses (see the sketch after this list).

  8. Over the course of development, add table names to AcmeTLogExtractConfig.xml and mapping information to AcmeTLogMappingConfig.xml. Write Acme-specific FieldMapperIfc and AccessorIfc classes as needed.

  9. It is necessary to create an Acme-specific implementation of the MappingResultIfc interface to hold the Acme transactional information; call this class AcmeTLogMappingResult. This necessitates the creation of an Acme-specific EntityMappingObjectFactoryIfc class; call this class AcmeEntityMappingObjectFactory.

  10. It is necessary to create an Acme-specific implementation of the RecordFormatContentBuilderIfc to assemble the Acme-specific export records; call this class AcmeTLogRecordFormatContentBuilder. This necessitates the creation of an Acme-specific RecordFormatObjectFactoryIfc class; call this class AcmeRecordFormatObjectFactory.

  11. Modify StoreServerConduit.xml to use the AcmeEntityMappingObjectFactory and the AcmeRecordFormatObjectFactory when exporting the Acme TLog.

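For step 7, the override is a subclass that substitutes the batch ID column. The following is a sketch only; the helper method and the Acme column name are hypothetical, so match the actual override points in RTLogExportBatchGenerator and RTLogDatabaseAdapter:

    public class AcmeTLogExportBatchGenerator extends RTLogExportBatchGenerator
    {
        // Acme-specific batch ID column (column name hypothetical).
        protected static final String ACME_BATCH_COLUMN = "ID_ACME_TLOG_BTCH";

        // Sketch only: expose the Acme column to the transaction-selection
        // logic in place of ID_RTLOG_BTCH. The real override point is
        // retrieveTransactionList(), whose signature is defined by the
        // product class.
        protected String getBatchColumnName()
        {
            return ACME_BATCH_COLUMN;
        }
    }
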
Exporting a Non-Fixed-Length Record Format

Text styles other than fixed record length have been used to transfer transactional information to the enterprise, for example, comma-delimited and tag-and-value. To support either of these you must complete all the steps in the previous section, as well as the following:

  1. It is likely that you need additional information about the export file format. As a result you must add information to the Export Format Configuration file, and create an Acme-specific implementation of the RecordFormatConfiguratorIfc interface; call this class AcmeRecordFormatConfigurator.

  2. The FieldFormat class formats its data based on the data type and generates a fixed-length field. When all the fields in a record are aggregated, this creates a fixed-length record. This class must be replaced by an Acme-specific implementation; call this class AcmeCommaDelimitedFieldFormat. It might also be necessary to create an Acme-specific implementation of RecordFormatIfc; call this class AcmeCommaDelimitedRecordFormat. (A self-contained illustration of fixed-length versus delimited rendering follows this list.)

  3. Modify AcmeRecordFormatObjectFactory to return AcmeRecordFormatConfigurator, AcmeCommaDelimitedFieldFormat, and AcmeCommaDelimitedRecordFormat.

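The core difference between the two styles is padding versus delimiting. The following self-contained illustration (not the product classes) shows why replacing the field formatting changes the record from fixed length to variable length:

    public class DelimitedVersusFixed
    {
        // Fixed-length rendering pads (or truncates) the value to the
        // declared field length, as FieldFormat does.
        static String fixed(String value, int length)
        {
            String padded = String.format("%-" + length + "s", value);
            return padded.substring(0, length);
        }

        // Comma-delimited rendering emits the raw value plus a separator,
        // so the record length varies with the content.
        static String delimited(String value)
        {
            return value + ",";
        }

        public static void main(String[] args)
        {
            System.out.println("[" + fixed("FHEAD", 8) + "]");  // prints [FHEAD   ]
            System.out.println("[" + delimited("FHEAD") + "]"); // prints [FHEAD,]
        }
    }
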
Object Factories

Object factories provide system implementers with the means to replace base product implementations with classes more appropriate to their needs. The object factory classes appear as entries in configuration files, and oftentimes a configuration file itself functions as an object factory. This section discusses both the object factory aspects and the configuration aspects of the configuration files.

StoreServerConduit.xml

The Store Server Conduit file (<root>\applications\pos\config\conduit\StoreServerConduit.xml) defines at runtime the classes and configuration files that make up the managers and technicians in the Point-of-Service Store Server. One of the technicians it defines is the RTLogExportDaemonTechnician. Following are the classes the Store Server Conduit file defines for use when exporting the RTLog:

Table 5-1 Store Server Conduit File

Class: RTLogExportDaemonTechnician (oracle.retail.stores.domain.manager.rtlog)
Interface: RTLogExportDaemonTechnicianIfc (oracle.retail.stores.domain.manager.rtlog)
Description: Sets up the RTLog Export Process. The Dispatcher instantiates this class and then sets all the other parameters on this object. It is also responsible for managing the batch regeneration process.

Class: RTLogExportDaemonThread (oracle.retail.stores.domain.manager.rtlog)
Interface: RTLogExportDaemonThreadIfc (oracle.retail.stores.domain.manager.rtlog)
Description: Sleeps for a configurable amount of time, then wakes up and initiates the export process.

Class: RTLogDatabaseAdapter (oracle.retail.stores.domain.manager.rtlog)
Interface: DatabaseEntityAdapterIfc (oracle.retail.stores.exportfile)
Description: Provides access to the database for reading each transaction Entity. This particular implementation uses the DataManager/DataTechnician to retrieve this information.

Class: RTLogEncryptingOutputAdapter (oracle.retail.stores.exportfile.rtlog)
Interface: OutputAdapterIfc (oracle.retail.stores.exportfile)
Description: Writes the RTLog file to the configured directory. This particular adapter encrypts the file as it writes it to disk. There is another adapter, RTLogOutputAdapter, which writes the file in clear text.

Class: RTLogEncryptionAdapter (oracle.retail.stores.domain.manager.rtlog)
Interface: EncryptionAdapterIfc (oracle.retail.stores.exportfile)
Description: Provides access to the mechanisms for decrypting values that are encrypted in the database.

Class: ExportFileConfiguration (oracle.retail.stores.exportfile)
Interface: ExportFileConfigurationIfc (oracle.retail.stores.exportfile)
Description: Contains much of the configuration information in the RTLogExportDaemonTechnician; the technician passes this object to the daemon, which passes it to the batch generator, which passes it to the export file generator.

Class: RTLogExportFileResultAuditLog (oracle.retail.stores.domain.manager.rtlog)
Interface: ExportFileResultAuditLogIfc (oracle.retail.stores.exportfile)
Description: Formats the export result information for logging.

Class: EntityMappingObjectFactory (oracle.retail.stores.exportfile)
Interface: EntityMappingObjectFactoryIfc (oracle.retail.stores.exportfile)
Description: Instantiates the classes used to map the database Entity to the export file format.

Class: RecordFormatObjectFactory (oracle.retail.stores.exportfile)
Interface: RecordFormatObjectFactoryIfc (oracle.retail.stores.exportfile)
Description: Instantiates the classes used to set up and generate the export file format.

Class: ExtractorObjectFactory (com.oracle.xmlreplication)
Interface: ExtractorObjectFactoryIfc (com.oracle.xmlreplication)
Description: Instantiates the classes used to generate the database Entity.

Class: RTLogCurrencyAdapter (oracle.retail.stores.domain.manager.rtlog)
Interface: CurrencyAdapterIfc (oracle.retail.stores.exportfile)
Description: Provides currency services.


DomainObjectFactory

The DomainObjectFactory instantiates the RTLogExportBatchGeneratorIfc class. The RTLogExportBatchGenerator builds the WorkUnit (the list of transactions to export) and calls the WorkUnitController (ExportFileGenerator).

RTLogExportBatchGenerator also instantiates the ExportFileGeneratorIfc and the WorkUnitIfc. If you need a different implementation of either class, create a new implementation of RTLogExportBatchGenerator.

ExtractorObjectFactory

The ExtractorObjectFactory instantiates the classes that generate the database Entity class.

One item of note is that the application gains access to this factory through a singleton called ReplicationObjectFactoryContainer. All changes made to these classes must work for both DTM and Export File generation.

EntityMappingObjectFactory

The following table is a list of the classes this factory instantiates:

Table 5-2 EntityMappingObjectFactory Classes

Class: MappingCatalogConfigurator (oracle.retail.stores.exportfile.mapper)
Interface: MappingCatalogConfiguratorIfc (oracle.retail.stores.exportfile.mapper)
Description: Reads the mapping configuration file and builds an EntityMappingCatalogIfc object.

Class: EntityMappingCatalog (oracle.retail.stores.exportfile.mapper)
Interface: EntityMappingCatalogIfc (oracle.retail.stores.exportfile.mapper)
Description: Holds the information that describes the relationship between the tables and columns in the database and the records and fields in the export file. It contains a list of TableMaps and a map of Accessors.

Class: TableMap (oracle.retail.stores.exportfile.mapper)
Interface: TableMapIfc (oracle.retail.stores.exportfile.mapper)
Description: Contains a list of ColumnMaps associated with a table.

Class: ColumnMap (oracle.retail.stores.exportfile.mapper)
Interface: ColumnMapIfc (oracle.retail.stores.exportfile.mapper)
Description: Describes the relationship between a column and a field in a specific export record. It can contain a ValueMapping Hashmap and/or a FieldMapper class to perform more complex mapping actions.

Class: EntityMapper (oracle.retail.stores.exportfile.mapper)
Interface: EntityMapperIfc (oracle.retail.stores.exportfile.mapper)
Description: Controls the mapping process. It stores the result in the MappingResultIfc object.

Class: RTLogMappingResult (oracle.retail.stores.exportfile.rtlog)
Interface: MappingResultIfc (oracle.retail.stores.exportfile.mapper)
Description: Contains the result of mapping an Entity to the export file format.


RTLogMappingConfig.xml

This configuration file is a factory for FieldMapperIfc and AccessorIfc classes.

The simplest mapping occurs when a value goes directly from a column to a field. However, many times the mapping between a column and a field is more complex. If code is required, the configuration file calls out a FieldMapperIfc class to perform this mapping task. A FieldMapperIfc is associated with a particular table/column record/field mapping.

The values in a particular record are built up by processing each individual ColumnMapIfc object. There is no guarantee that all the data for a particular export record resides in a single row in the database; in fact, it is unlikely. For example, a row from the Tender Line Item Table supplies the tender amount, but a row from the Credit Debit Tender Line Item Table supplies the authorization information. Much processing can take place between the times the application has access to each of these rows.

An AccessorIfc object knows how to locate a particular existing "working" export record in the MappingResultIfc object. If a record is not available, the AccessorIfc creates a new one and stores it in the MappingResultIfc object. A sketch of this get-or-create pattern follows.

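The following sketch illustrates the get-or-create pattern an AccessorIfc implements; the method name and the MappingResultIfc/RecordFormatCatalogIfc accessors used here are hypothetical, not the actual interface signatures:

    public class AccessAcmeTransactionHeader implements AccessorIfc
    {
        // Sketch only: return the working TransactionHeader record from
        // the mapping results, creating and registering it on first use
        // (all method names are hypothetical).
        public RecordFormatIfc getRecord(MappingResultIfc results,
                RecordFormatCatalogIfc catalog)
        {
            RecordFormatIfc record = results.getWorkingRecord("TransactionHeader");
            if (record == null)
            {
                record = catalog.createRecord("TransactionHeader");
                results.setWorkingRecord("TransactionHeader", record);
            }
            return record;
        }
    }
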
RecordFormatObjectFactory

Following is a list of the classes this factory instantiates:

Table 5-3 RecordFormatObjectFactory Classes

Class: FieldFormat (oracle.retail.stores.exportfile.formater)
Interface: FieldFormatIfc (oracle.retail.stores.exportfile.formater)
Description: Contains the attributes associated with a field, including name, value, starting index, length, and data type.

Class: RecordFormat (oracle.retail.stores.exportfile.formater)
Interface: RecordFormatIfc (oracle.retail.stores.exportfile.formater)
Description: Contains a list of FieldFormatIfc objects.

Class: RecordFormatCatalog (oracle.retail.stores.exportfile.formater)
Interface: RecordFormatCatalogIfc (oracle.retail.stores.exportfile.formater)
Description: Contains a list of RecordFormatIfc objects.

Class: RecordFormatConfigurator (oracle.retail.stores.exportfile.formater)
Interface: RecordFormatConfiguratorIfc (oracle.retail.stores.exportfile.formater)
Description: Reads the format configuration file and builds a RecordFormatCatalogIfc object.

Class: RTLogRecordFormatContentBuilder (oracle.retail.stores.exportfile.rtlog)
Interface: RecordFormatContentBuilderIfc (oracle.retail.stores.exportfile.formater)
Description: Converts the MappingResultIfc object into the text that is written to the export file.

Class: RTLogItemContainedRecords (oracle.retail.stores.exportfile.rtlog)
Interface: ContainedRecordsIfc (oracle.retail.stores.exportfile.formater)
Description: A list of records, such as discounts, that are part of the item information.

Class: RTLogTransactionContainedRecords (oracle.retail.stores.exportfile.rtlog)
Interface: ContainedRecordsIfc (oracle.retail.stores.exportfile.formater)
Description: A list of records, such as header total records, that are part of a transaction.


Configuration

Each of the configuration files used by this feature (Store Server Conduit, Entity Reader Configuration, Mapping Configuration, and Record Format Configuration) has already been referred to in this document. This section describes them in more detail.

The Store Server Conduit File

The Store Server Conduit file (<root>\applications\pos\config\conduit\StoreServerConduit.xml) defines the following settings for the RTLog Export process.

Table 5-4 Store Server Conduit File

Setting: sleepInterval
Installed product value: 600 (seconds)
Description: The length of time between each execution of the RTLog export process.

Setting: exportDirectoryName
Installed product value: For example, POSLog
Description: The directory where the RTLog is placed.

Setting: formatConfigurationFileName
Installed product value: ../config/rtlog/RTLogFormat.xml
Description: The relative or absolute path of the Export Format configuration file.

Setting: entityReaderConfigurationFileName
Installed product value: ../config/rtlog/RTLogExtractConfig.xml
Description: The relative or absolute path of the Entity Reader configuration file.

Setting: entityMappingConfigurationFileName
Installed product value: ../config/rtlog/RTLogMappingConfig.xml
Description: The relative or absolute path of the Mapping configuration file.

Setting: maximumTransactionsToExport
Installed product value: -1
Description: The maximum number of transactions that should be exported to a single RTLog file. The value -1 indicates there is no limit on the maximum number.


The Export Format Configuration File

The export format configuration file describes each of the export record types. For example, the RTLog specifies the following records:

  • File Header

  • File Tail

  • Transaction Header

  • Transaction Tail

  • Transaction Item

  • Item Discount

  • Item Tax

  • Transaction Tender

The following is a snippet from RTLogFormat.xml:

<?xml version="1.0"?>
<RECORD_FORMATS ... >
        <COMMENT>This file defines the format of the Oracle Retail Sales Audit RTLOG</COMMENT>
        <RECORD_FORMAT_VERSION version="V.12.0.5"/>
        <RECORD_FORMAT name="FileHeader">
               <FIELD_FORMAT name="FileRecordDesciptor" type="char" length="5" value="FHEAD"/>
               <FIELD_FORMAT name="FileLineIdentifier" type="integer" length="10"/>
               <FIELD_FORMAT name="FileType" type="char" length="4" value="RTLG"/>
               <FIELD_FORMAT name="FileCreateDate" type="datetime" length="14"/>
               <FIELD_FORMAT name="BusinessDate" type="date" length="8"/>
               <FIELD_FORMAT name="LocationNumber" type="char" length="10"/>
               <FIELD_FORMAT name="ReferenceNumber" type="char" length="30" value=" "/>
        </RECORD_FORMAT>
                    .
                    .
                    .
</RECORD_FORMATS>

This snippet shows one Record definition (the File Header) composed of seven fields of various types, lengths and default values.

The Entity Reader Configuration File

This file defines the tables that the Entity Reader reads.

The Mapping Configuration File

This file describes the relationship between the tables and columns in the database and the records and fields in the export format. The following is a snippet from RTLogMappingConfig.xml:

<?xml version="1.0"?>
<ENTITY_MAPPER ... >
    <COMMENT>This is a configuration file for the Point-of-Service Transaction to RTLog Mapping</COMMENT>
    <TABLE table="TR_TRN">
        <MAP column="DC_DY_BSN" record="FileHeader" field="BusinessDate"
                    fieldMapper="oracle.retail.stores.exportfile.rtlog.fieldmappers.BusinessDateMapper"/>
        <MAP column="ID_STR_RT" record="FileHeader" field="LocationNumber"
                    fieldMapper="oracle.retail.stores.exportfile.rtlog.fieldmappers.StoreNumberMapper"/>
        <MAP column="TS_TRN_END" record="TransactionHeader" field="RegisterTransactionDate"
                    fieldMapper="oracle.retail.stores.exportfile.rtlog.fieldmappers.DateTimeMapper"/>
.
.
.
        <MAP column="TY_TRN" record="TransactionHeader" field="TransactionType"               
                    mappingStrategyOrder="FieldMapperThenValueMapping"
                    fieldMapper="oracle.retail.stores.exportfile.rtlog.fieldmappers.ExportItemsAndTaxStatusMapper">
                <VALUE_MAPPINGS handleNotFound="error">
                    <VALUE_MAPPING DatabaseValue="1" RecordValue="SALE"/>
                    <VALUE_MAPPING DatabaseValue="2" RecordValue="RETURN"/>
                    <VALUE_MAPPING DatabaseValue="3" RecordValue="PVOID"/>
                            .
                            .
                            .
                </VALUE_MAPPINGS>
        </MAP>
.
.
.        
    </TABLE>
.
.
.
    <ACCESSOR record="FileHeader" 
                            class="oracle.retail.stores.exportfile.rtlog.accessors.AccessFileHeader"/>
    <ACCESSOR record="TransactionHeader" 
                            class="oracle.retail.stores.exportfile.rtlog.accessors.AccessTransactionHeader"/>
.
.
.
</ENTITY_MAPPER>

Looking at this snippet, it is easy to see that the column TR_TRN.DC_DY_BSN maps to the BusinessDate field in the FileHeader record using the BusinessDateMapper class to format the data.

Also note that the application uses a VALUE_MAPPINGS element to transform the value from the column TR_TRN.TY_TRN to the equivalent value in the TransactionType field in the TransactionHeader record.

Development and Testing Tools

There are a number of tools that were developed during the course of this project that are helpful when extending this subsystem.

Classes

The following classes are all located at <root>\modules\exportfile\src\oracle\retail\stores\exportfile\utility:

Table 5-5 Exportfile Utility Classes

Class: ExportTestDriver
Description: A test harness that can be used to develop the configuration files and the FieldMapperIfc and AccessorIfc classes in isolation from the rest of the application. It uses the classes DatabaseEntityAdapterTest, EncryptionAdapterTest, CurrencyAdapterTest, OutputAdapterTest, and ExportFileResultAuditLogTest to emulate system-specific adapters.

An Eclipse run configuration for this class should run out of the exportfile project. The classpath should include the domain, foundation-client, foundation-server, common, utility, foundation-shared, clientinterfaces, and datareplication projects, as well as /thirdparty/apache-ant-1.6.2/lib/xml-apis.jar, /thirdparty/apache-ant-1.6.2/lib/xercesImpl.jar, and /thirdparty/apache/log4j-1.2.8.jar. It should also include the JDBC jar(s) for the database you are using.

You might need to modify this class to use the appropriate JDBC driver, username, password, and transaction IDs.

Class: FileDecryptionUtility
Description: By default the application generates encrypted files. This class reads all the encrypted files from a target directory, decrypts them, and writes them to a single target file. This class uses a single known encryption key.

The main() method has two command line parameters:

  • EncryptedDirectoryName - the pathname of the directory of *.ENC files

  • DecryptedFileName - the pathname of the decrypted file

Class: KeyStoreFileDecryptionUtility
Description: Uses the encryption service defined by the Spring configuration in \modules\exportfile\bin\config\context.

The main() method has two command line parameters:

  • EncryptedDirectoryName - the pathname of the directory of *.ENC files

  • DecryptedFileName - the pathname of the decrypted file

Class: RTLOGReportDriver
Description: Reads an export format configuration file and an export log file, then generates a report file (rtlog_rpt.txt) in the current directory. This saves a lot of effort when trying to determine whether an export file has the correct data in it. The main() method has three command line parameters:

  • ExportFileName - the full or relative pathname of the export file.

  • Either S (sales tax) or V (VAT). This parameter indicates whether the IGTAX amounts should be included in the transaction balance calculation.

  • XMLFormatFileName - the full or relative pathname of the format file.


Executables in the bin Directory

The following BAT files are all located at <root>\modules\exportfile\bin:

Table 5-6 bin Directory BAT Files

File: setenv.bat
Description: Sets up the classpath.

File: RTLogFileDecryption.bat
Description: Executes FileDecryptionUtility.class; it points at the bin\POSLog directory in the default installation, writes the decrypted records to RTLOG.DEC, and uses the default encryption key.

File: RTLogReport.bat
Description: Executes RTLOGReportDriver.class; it reads RTLOG.DEC and uses the export format file ..\config\RTLogFormat.xml.

File: RTLogKeyStoreFileDecryption.bat
Description: Executes KeyStoreFileDecryptionUtility.class; it points at the bin\POSLog directory in the default installation, writes the decrypted records to RTLOG.DEC, and uses the default encryption key.