Oracle® Retail POS Suite Implementation Guide, Volume 1 – Implementation Solutions Release 14.1 E54475-02
The Oracle Retail Point-of-Service client requires the following datasets, each with a producer and a consumer, to support offline functionality:
Employee
Item
Advanced Pricing
Tax
Currency
Store Info
Merchandise Hierarchy
Shipping Method
Reason Codes
Discount
ExportDB
Intra Store Data Distribution Infrastructure (IDDI) automates the following:
DataSet file generation at the Point-of-Service server
DataSet file transfer from Point-of-Service server to Point-of-Service client
Importing dataset files to Point-of-Service client database
The system has been designed to support a pluggable model. The following are all designed to be configurable at deployment time:
DataSetProducerJob
ClientDataSetController
DataSetService
ClientDataSetService
DataSetProducers
StoreInfoDataSetProducer
AdvancedPricingDataSetProducer
CurrencyDataSetProducer
EmployeeDataSetProducer
ItemDataSetProducer
MerchandiseDataSetProducer
OfflineDBProducer
ReasonCodeDataSetProducer
ShippingMethodDataSetProducer
TaxDataSetProducer
DiscountDataSetProducer
DataSetConsumers
StoreInfoDataSetConsumer
AdvancedPricingDataSetConsumer
CurrencyDataSetConsumer
EmployeeDataSetConsumer
ItemDataSetConsumer
MerchandiseDataSetConsumer
OfflineDBConsumer
ReasonCodeDataSetConsumer
ShippingMethodDataSetConsumer
TaxDataSetConsumer
DiscountDataSetConsumer
DerbyDataFormatter
This configuration is accomplished using the Spring Framework.
Table 6-1 lists the Spring bean IDs used for each of the pluggable components.
Table 6-1 Spring Framework Configuration Options
Spring bean ID | Purpose | Provided implementation | Configurable Options |
---|---|---|---|
service_DataSetService |
Configuration for DataSetService. Contains the list of all the DataSetKeys. |
oracle.retail.stores.foundation.iddi.DataSetService |
Generate at startup. To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_ClientDataSetService |
Configuration for ClientDataSetService. Contains the list of all the DataSetKeys. |
oracle.retail.stores.foundation.iddi.ClientDataSetService |
To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY. dataImportFilePath (service_config_DataImportFilePath) |
service_FrequentProducerJob |
Producer Job that runs frequently. Configured to run once every 15 minutes by default. |
org.springframework.scheduling.quartz.JobDetailBean service_DataSetService |
To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_InfrequentProducerJob |
Producer Job configured to run once a day by default. |
org.springframework.scheduling.quartz.JobDetailBean service_DataSetService |
To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_OfflineDBProducerJob |
Producer Job configured to run once a day by default. |
org.springframework.scheduling.quartz.JobDetailBean service_DataSetService |
To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_TriggerFrequentProducer |
Cron Job Trigger class that runs the service_FrequentProducerJob configuration. The Cron Expression value can be modified to configure the job frequency. Cron Expression format: value="0 0,15,30,45 * * * ?". The value parameters, from left to right and separated by spaces, are: Seconds, Minutes, Hours, Day of month, Month, Day of week. To configure more than one value for any parameter, separate the values with commas (,). * indicates any value. |
org.springframework.scheduling.quartz.CronTriggerBean |
service_FrequentProducerJob Cron Expression Value |
service_TriggerInfrequentProducer |
Trigger class that runs service_InfrequentProducerJob configuration |
org.springframework.scheduling.quartz.CronTriggerBean |
service_InfrequentProducerJob Cron Expression Value |
service_ProducerSchedulerFactory |
Registers the services service_TriggerFrequentProducer and service_TriggerInfrequentProducer with the Quartz SchedulerFactoryBean |
org.springframework.scheduling.quartz.SchedulerFactoryBean |
service_TriggerFrequentProducer service_TriggerInfrequentProducer |
service_TriggerOfflineDBProducer |
Trigger class that runs service_OfflineProducerJob configuration |
org.springframework.scheduling.quartz.CronTriggerBean |
service_OfflineDBProducerJob Cron Expression Value |
service_CurrencyProducer |
DataSet Key definition for Currency DataSetProducer |
oracle.retail.stores.domain.iddi.CurrencyDataSetProducer |
dataSetKey (service_config_CUR_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_TaxProducer |
DataSet Key definition for Tax DataSetProducer |
oracle.retail.stores.domain.iddi.TaxDataSetProducer |
dataSetKey (service_config_TAX_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_EmployeeProducer |
DataSet Key definition for Employee Producer |
oracle.retail.stores.domain.iddi.EmployeeDataSetProducer |
dataSetKey (service_config_EMP_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_AdvancedPricingProducer |
DataSet Key definition for Advanced Pricing DataSetProducer |
oracle.retail.stores.domain.iddi.PricingDataSetProducer |
dataSetKey (service_config_PRC_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_ItemProducer |
DataSet Key definition for Item DataSetProducer |
oracle.retail.stores.domain.iddi.ItemDataSetProducer |
dataSetKey (service_config_ITM_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_StoreInfoProducer |
DataSet Key definition for Store Info Producer |
oracle.retail.stores.domain.iddi.StoreInfoDataSetProducer |
dataSetKey (service_config_STORE_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_MerchandiseProducer |
DataSet Key definition for Merchandise Hierarchy Producer |
oracle.retail.stores.domain.iddi.MerchandiseHierarchyDataSetProducer |
dataSetKey (service_config_MER_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_ShippingMethodProducer |
DataSet Key definition for Shipping Method Producer |
oracle.retail.stores.domain.iddi.ShippingMethodDataSetProducer |
dataSetKey (service_config_SHP_MTH_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_ReasonCodeProducer |
DataSet Key definition for Reason Codes Producer |
oracle.retail.stores.domain.iddi.ReasonCodesDataSetProducer |
dataSetKey (service_config_RSN_CODE_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_DiscountProducer |
DataSet Key definition for Discount Producer |
oracle.retail.stores.domain.iddi.DiscountDataSetProducer |
dataSetKey (service_config_DISCOUNT_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_OfflineDBProducer |
DataSet Key definition for OfflineDB Producer |
oracle.retail.stores.domain.iddi.OfflineDBDataSetProducer |
dataSetKey (service_config_OFFLINEDB_KEY), dataExportFilePath (service_config_DataExportFilePath), dataExportZipFilePath (service_config_DataExportZipFilePath), fileWriter (service_FileWriter) |
service_CurrencyConsumer |
DataSet Key definition for Currency DataSetConsumer |
oracle.retail.stores.domain.iddi.CurrencyDataSetConsumer |
dataSetKey (service_config_CUR_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_TaxConsumer |
DataSet Key definition for Tax DataSetConsumer |
oracle.retail.stores.domain.iddi.TaxDataSetConsumer |
dataSetKey (service_config_TAX_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_EmployeeConsumer |
DataSet Key definition for Employee DataSetConsumer |
oracle.retail.stores.domain.iddi.EmployeeDataSetConsumer |
dataSetKey (service_config_EMP_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_AdvancedPricingConsumer |
DataSet Key definition for Advanced Pricing DataSetConsumer |
oracle.retail.stores.domain.iddi.AdvancedPricingDataSetConsumer |
dataSetKey (service_config_PRC_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_ItemConsumer |
DataSet Key definition for Item DataSetConsumer |
oracle.retail.stores.domain.iddi.ItemDataSetConsumer |
dataSetKey (service_config_ITM_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_StoreInfoConsumer |
DataSet Key definition for Store Info DataSetConsumer |
oracle.retail.stores.domain.iddi.StoreInfoDataSetConsumer |
dataSetKey (service_config_STORE_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_MerchandiseConsumer |
DataSet Key definition for Merchandise Hierarchy DataSetConsumer |
oracle.retail.stores.domain.iddi.MerchandiseHierarchyDataSetConsumer |
dataSetKey (service_config_MER_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_ShippingMethodConsumer |
DataSet Key definition for Shipping Method DataSetConsumer |
oracle.retail.stores.domain.iddi.ShippingMethodDataSetConsumer |
dataSetKey (service_config_SHP_MTH_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_ReasonCodesConsumer |
DataSet Key definition for Reason Codes DataSetConsumer |
oracle.retail.stores.domain.iddi.ReasonCodesDataSetConsumer |
dataSetKey (service_config_RSN_CODE_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_DiscountConsumer |
DataSet Key definition for Discount DataSetConsumer |
oracle.retail.stores.domain.iddi.DiscountDataSetConsumer |
dataSetKey (service_config_DISCOUNT_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_OfflineDBConsumer |
DataSet Key definition for OfflineDB DataSetConsumer |
oracle.retail.stores.domain.iddi.OfflineDBDataSetConsumer |
dataSetKey (service_config_OFFLINEDB_KEY), dataImportFilePath (service_config_DataImportFilePath), importHelper (service_OfflineDBHelper) |
service_OfflineDBConsumerJob |
Consumer Job that runs frequently. Configured to run once a day by default |
org.springframework.scheduling.quartz.JobDetailBean |
dataSets. To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_FrequentConsumerJob |
Consumer Job that runs frequently. Configured to run every 15 minutes by default. |
org.springframework.scheduling.quartz.JobDetailBean |
dataSets. To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_InfrequentConsumerJob |
Consumer Job configured to run once a day by default. |
org.springframework.scheduling.quartz.JobDetailBean |
dataSets. To add a new DataSet type, add one more service_config_<<DataSetType>>_KEY |
service_TriggerOfflineDBConsumer |
Cron Job Trigger class that runs the service_OfflineDBConsumerJob configuration. |
org.springframework.scheduling.quartz.CronTriggerBean |
service_OfflineDBConsumerJob CronExpression Value |
service_TriggerFrequentConsumer |
Cron Job Trigger class that runs the service_FrequentConsumerJob configuration. |
org.springframework.scheduling.quartz.CronTriggerBean |
service_FrequentConsumerJob CronExpression Value |
service_TriggerInfrequentConsumer |
Cron Job Trigger class that runs the service_InfrequentConsumerJob configuration. |
org.springframework.scheduling.quartz.CronTriggerBean |
service_InfrequentConsumerJob CronExpression Value |
service_clientSchedulerFactory |
Registers the services service_TriggerFrequentConsumer and service_TriggerInfrequentConsumer with the Quartz SchedulerFactoryBean |
org.springframework.scheduling.quartz.SchedulerFactoryBean |
service_TriggerFrequentConsumer service_TriggerInfrequentConsumer |
service_config_DataExportFilePath |
Configuration for the Data Export File Path. This is a relative path; the application takes its running path and appends the path given in this configuration. |
java.lang.String |
value |
service_config_DataExportZipFilePath |
Configuration for the Data Export Zip File Path. This is a relative path; the application takes its running path and appends the path given in this configuration. Note: The service_config_DataExportFilePath should not contain DataSetKey names (for example: EMPLOYEE, ITEM, CURRENCY, ADVANCED_PRICING, TAX) |
java.lang.String |
value |
service_config_DataImportFilePath |
Configuration for Data Import File Path where the dataset files are downloaded from Point-of-Service server and cached. |
java.lang.String |
value |
service_config_OfflineSchemaSQLFilePath |
Folder configuration for where the offline database schema SQL file is located. |
java.lang.String |
value |
service_config_OfflineSchemaLogFilePath |
Folder configuration for storing the Offline database schema SQL File import log file. |
java.lang.String |
value |
service_OfflineDBHelper |
Point-of-Service client offline Database Helper Class configuration |
oracle.retail.stores.foundation.iddi.OfflineDerbyHelper |
dataImportFilePath, service_config_OfflineSchemaSQLFilePath, service_config_OfflineSchemaLogFilePath |
service_ApplicationVersion |
Application Version retrieval class configuration. |
oracle.retail.stores.pos.PosVersion |
None |
service_DataFormatter |
Data Formatter Helper to format Point-of-Service server data to Derby data import format specifications. |
oracle.retail.stores.foundation.iddi.DerbyDataFormatter |
None |
service_DerbyDBFormatter |
Data Formatter Helper to format Point-of-Service server data to Derby data import format specifications. |
oracle.retail.stores.iddi.DerbyDBFormatter |
None |
service_Filewriter |
Format Derby import files. |
oracle.retail.stores.foundation.iddi.IDDIFileWriter |
Formatter |
service_DerbyWriter |
Bean used to insert data read from store server database into server side offline derby database. |
oracle.retail.stores.iddi.IDDIDerbyWriter |
|
service_config_EMP_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_CUR_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_TAX_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_ITM_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_PRC_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_MER_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_SHP_MTH_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_RSN_CODE_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_STORE_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_DISCOUNT_KEY |
DataSet key Configuration |
java.lang.String |
None |
service_config_OFFLINEDB_KEY |
DataSet key Configuration |
java.lang.String |
None |
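The Quartz cron expression format used by the trigger beans in Table 6-1 can be illustrated with a small standalone sketch. The `CronFields` class below is hypothetical (not part of the product); it simply labels each whitespace-separated field of the default `service_TriggerFrequentProducer` expression:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CronFields {
    // Quartz cron fields, left to right.
    static final String[] NAMES = {
        "Seconds", "Minutes", "Hours", "Day-of-month", "Month", "Day-of-week"
    };

    // Map each whitespace-separated field of a Quartz cron expression to its name.
    static Map<String, String> label(String cron) {
        String[] parts = cron.trim().split("\\s+");
        Map<String, String> fields = new LinkedHashMap<>();
        for (int i = 0; i < NAMES.length && i < parts.length; i++) {
            fields.put(NAMES[i], parts[i]);
        }
        return fields;
    }

    public static void main(String[] args) {
        // Default trigger value from service_TriggerFrequentProducer.
        Map<String, String> fields = label("0 0,15,30,45 * * * ?");
        fields.forEach((name, value) -> System.out.println(name + " = " + value));
    }
}
```

Here the `Minutes` field `0,15,30,45` is what makes the frequent producer fire every 15 minutes, while `?` in `Day-of-week` means "no specific value".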
For Point-of-Service, the ServiceContext.xml file is under <install directory>\<client or server>\pos\config\context.
The timeout interval to start data consumption is configured in the application.xml file. The IDDITimeoutInterval parameter value is set to 15 minutes by default and is configurable.
The IDDIOfflineSupport parameter has been renamed to IDDIOfflineSupportRequired, and the values are reversed. This parameter lets the end user decide whether the client should come up without offline data. If IDDIOfflineSupportRequired is Y, the client does not start if no offline data is available (offline data is required for the client to start). If IDDIOfflineSupportRequired is N, the client starts without offline data (offline data is not required for the client to start).
The batch size of the records used when writing data to the offline file is set in domain.properties with the property IDDIBatchSize.
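For example, a domain.properties entry might look like the following (the value 500 is purely illustrative, not a documented default):

```
IDDIBatchSize=500
```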
IDDI integrates with both the Point-of-Service server and the Point-of-Service client application. IDDI integration with Point-of-Service server produces dataset files on a scheduled basis. IDDI integration with Point-of-Service client downloads the dataset files from Point-of-Service server on a scheduled basis, and the client can then consume those files. IDDI server and client integration is pluggable and configurable.
The Point-of-Service client must be online the first time it is run so that it can download the data from the Point-of-Service server. If no offline data is available, the Point-of-Service client does not function in offline mode.
The client-side database schema must be in sync with server-side database schema.
The tables in Table 6-2 are used in the Derby database at the Point-of-Service client. The database schema for the following tables must match the Point-of-Service server database schema.
Table 6-2 Point-of-Service DataSet Table
DataSet Name | DataSet Tables | |
---|---|---|
Items |
AS_ITM AS_ITM_I8 ID_IDN_PS PA_MF PA_MF_I8 AS_POG AS_ITM_ASCTN_POG AS_ITM_STK CO_UOM CO_UOM_I8 ID_DPT_PS ID_DPT_PS_I8 AS_ITM_RTL_STR CO_ASC_RLTD_ITM CO_CLN_ITM AS_ITM_SRZ_LB |
CO_EV TR_CHN_TMP_PRC CO_CLR CO_CLR_I8 CO_SZ CO_SZ_I8 CO_STYL CO_STYL_I8 CO_EV_I8 CO_EV_MNT CO_EV_MNT_I8 MA_PRC_ITM MA_ITM_PRN_PRC_ITM MA_ITM_TMP_PRC_CHN TR_CHN_PRN_PRC AS_ITM_SRZ_LB_I8 |
Employees |
PA_EM CO_GP_WRK CO_GP_WRK_I8 CO_ACS_GP_RS |
CO_ACS_GP_RS_LS CO_ACS_GP_RS_LS_I8 PA_RS PA_RS_I8 |
Advanced Pricing |
RU_PRDV RU_PRDV_I8 CO_PRDV_ITM RU_PRDVC_MXMH CO_PRCGP_I8 PA_GP_CT PA_GP_CT_I8 |
TR_ITM_MXMH_PRDV CO_EL_PRDV_ITM CO_EL_PRDV_DPT CO_EL_CTAF_PRDV CO_EL_MRST_PRDV CO_EL_TM_PRDV CO_PRCGP |
Tax |
RU_TX_GP RU_TX_RT PA_ATHY_TX CO_TX_JUR_ATHY_LNK CD_GEO |
GEO_TX_JUR CO_GP_TX_ITM CO_GP_TX_ITM_I8 PA_TY_TX |
Currency |
CO_CNY CO_RT_EXC |
CO_CNY_DNM CO_CNY_DNM_I8 |
Store Info |
PA_STR_RTL PA_STR_RTL_I8 |
LO_ADS |
Merchandise Hierarchy |
ST_ASCTN_MRHRC CO_MRHRC_FNC CO_MRHRC_GP CO_MRHRC_GP_I8 |
CO_MRHRC_LV CO_MRHRC_LV_I8 AS_MRHRC_ITM_GP |
Shipping Method |
CO_SHP_MTH |
CO_SHP_MTH_I8 |
Reason Codes |
ID_LU_CD ID_LU_CD_I8 |
LO_DPT_POS_RTL_STR |
The dataset compressed file contains all the dataset flat files of the tables associated with the dataset and metadata information (for example, the Manifest file).
Here is the structure of the dataset compressed file:
<DataSet Flat file>
<DataSet Flat file>
<DataSet Flat file>
META-INF\MANIFEST.MF
The server generates the compressed file to <install directory>\Server\pos\bin\IDDI, and the client copies the compressed file to <install directory>\Client\pos\bin\IDDI_CACHE.
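To make the archive layout concrete, the following standalone sketch (not product code; entry names are taken from the Currency examples in this chapter) writes and re-reads a dataset-style archive using java.util.zip:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class DataSetZipSketch {
    // Write a minimal dataset-style archive: flat files plus a manifest entry.
    static byte[] writeArchive() throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(buffer)) {
            String[] entries = {"CO_CNY.TXT", "CO_RT_EXC.TXT", "META-INF/MANIFEST.MF"};
            for (String name : entries) {
                zip.putNextEntry(new ZipEntry(name));
                zip.write("...".getBytes(StandardCharsets.UTF_8)); // placeholder content
                zip.closeEntry();
            }
        }
        return buffer.toByteArray();
    }

    // List the entry names, as a consumer would when locating the manifest.
    static List<String> entryNames(byte[] archive) throws Exception {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zip = new ZipInputStream(new ByteArrayInputStream(archive))) {
            for (ZipEntry e = zip.getNextEntry(); e != null; e = zip.getNextEntry()) {
                names.add(e.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(entryNames(writeArchive()));
    }
}
```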
The Currency DataSet compressed file (CURRENCY_<<BATCHID>>.ZIP) contains:
META-INF\MANIFEST.MF
CO_CNY.TXT
CO_RT_EXC.TXT
CO_CNY_DNM.TXT
The Manifest file included in each DataSet compressed file contains dataset metadata information in the following format:
DataSetName: <<DataSetName>>
DataSetID: <<DataSetID>>
ApplicationVersion: <<Oracle Retail Point-of-Service Version>>
StoreID: <<StoreID>>
BatchID: <<DataSetBatchID>>
#Add all the Table Names as shown in the format below
DataFile-<<TableName>>: <<Table File Name>>
TableSequence: <<Table Names separated by comma in the order of tables to be imported to Derby>>
The following is the Manifest file example for Currency DataSet:
DataSetName: CURRENCY
DataSetID: 5
ApplicationVersion: pos
StoreID: 04241
BatchID: 20070606084600
DataFile-CO_CNY: CO_CNY.TXT
DataFile-CO_RT_EXC: CO_RT_EXC.TXT
DataFile-CO_CNY_DNM: CO_CNY_DNM.TXT
TableSequence: CO_CNY,CO_RT_EXC,CO_CNY_DNM
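Because the file follows the standard MANIFEST.MF "Name: value" layout, it can be read with java.util.jar.Manifest. The sketch below is illustrative only (the product's own consumer code may read the manifest differently):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.jar.Manifest;

public class ManifestSketch {
    // Parse dataset metadata out of MANIFEST.MF-style text.
    static Manifest parse(String text) throws Exception {
        return new Manifest(new ByteArrayInputStream(text.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        String mf = "DataSetName: CURRENCY\n"
                  + "DataSetID: 5\n"
                  + "BatchID: 20070606084600\n"
                  + "TableSequence: CO_CNY,CO_RT_EXC,CO_CNY_DNM\n";
        Manifest manifest = parse(mf);
        // TableSequence gives the import order of the flat files.
        String[] tables = manifest.getMainAttributes().getValue("TableSequence").split(",");
        System.out.println(manifest.getMainAttributes().getValue("DataSetName")
                + " has " + tables.length + " tables");
    }
}
```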
The following is the format of the DataSet flat file:
<<Table row data with the column values separated by commas (,) and enclosed within double quotes (") if the value is not of a numeric data type. Each table row is followed by a newline character.>>
The following is the DataSet flat file example for CO_CNY table:
1,"US","USD","USD","US","1",2,0
2,"CA","CAD","CAD","CA","0",2,1
3,"MX","MXN","MXN","MX","0",2,3
4,"GB","GBP","GBP","GB","0",2,4
5,"EU","EUR","EUR","EU","0",2,5
6,"JP","JPY","JPY","JP","0",0,6
Note: All the data type values except number type must be within double quotes. |
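The quoting rule in the note above can be sketched as a small hypothetical formatter (this is a sketch of the convention only, not the product's DerbyDataFormatter):

```java
import java.util.StringJoiner;

public class FlatFileRowSketch {
    // Render one table row: numeric values stay bare, everything else is double-quoted.
    static String formatRow(Object... columns) {
        StringJoiner row = new StringJoiner(",");
        for (Object col : columns) {
            if (col instanceof Number) {
                row.add(col.toString());
            } else {
                row.add("\"" + col + "\"");
            }
        }
        return row.toString();
    }

    public static void main(String[] args) {
        // Mirrors the first CO_CNY example row above.
        System.out.println(formatRow(1, "US", "USD", "USD", "US", "1", 2, 0));
    }
}
```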
Extensibility is supported through the interface-based design and the use of the Spring Framework. From an extensibility standpoint, an alternate implementation of any of the exposed interfaces can inherit from one of the out-of-the-box implementation classes and be injected into the system through Spring.
Additionally, the schema has been designed to enable the addition of datasets and dataset tables.
To add a new dataset table to the data model, add a new row to the table CO_DT_ST_TB_IDDI and create a table script in CreateSchema.sql.
The following example walks through the process of adding more tables to the existing DataSet in IDDI.
Insert the tables to be associated with the existing DataSet by adding records to CO_DT_ST_TB_IDDI using SQL.
Run the following queries to insert the table association to DataSet.
Example 6-1 Adding Table Association To Employee DataSet
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (<<Employee DataSet ID>>, <<'Store ID'>>, <<'Table1'>>, <<'Table1.txt'>>, 1);

TableName: CO_DT_ST_TB_IDDI
Column Description
ID_DT_ST : DataSet ID
ID_STR_RT: Store ID
NM_TB    : Table Name
NM_FL    : File Name of the flat file to be generated
AI_LD_SEQ: Table order in which the data is to be exported and imported

For example, get the Employee DataSet ID from the CO_DT_ST_IDDI table:
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (1,'04241','TABLE1','TABLE1.TXT',1);
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (1,'04241','TABLE2','TABLE2.TXT',2);
Add CREATE TABLE scripts in CreateSchema.sql.
CREATE TABLE "offlinedb"."TABLE1" (
  "COLUMN1" <<TYPE>> <<Constraint>>,
  "COLUMN2" <<TYPE>> <<Constraint>>);
CREATE TABLE "offlinedb"."TABLE2" (
  "COLUMN1" <<TYPE>> <<Constraint>>,
  "COLUMN2" <<TYPE>> <<Constraint>>);
To add a table using the build script:
Open <source_directory>\modules\utility\build.xml.
Find the target dataset's offline table list:
ordered.<data set name>.tables
Add the name of the SQL file that contains the create script.
The create scripts are located at <source_directory>\modules\common\deploy\server\common\db\sql\Create.
To add a new DataSet:
Add DataSet information in CO_DT_ST_IDDI.
Add DataSet tables to CO_DT_ST_TB_IDDI.
Create <DataSetKey>Producer
and <DataSetKey>Consumer
classes extending from AbstractDataSetProducer and AbstractDataSetConsumer respectively.
Define service_config_<DataSetKey>
in ServiceContext.xml.
Define service_<DataSetKey>Producer
with class=<DataSetKey>Producer
and service_<DataSetKey>Consumer
with class=<DataSetKey>Consumer
in ServiceContext.xml.
Add service_<DataSetKey>Producer
and service_<DataSetKey>Consumer
to service_DataSetService and service_ClientDataSetService respectively in ServiceContext.xml.
Add DataSet key to service_FrequentProducerJob/service_InfrequentProducerJob
and service_FrequentConsumerJob/service_InfrequentConsumerJob
in ServiceContext.xml.
Add create table scripts and insert the script for the newly added DataSet in CreateSchema.sql.
Do the following to add a new dataset using the build script:
Open <source_directory>\modules\utility\build.xml.
Find the section that defines the offline table lists (target assemble.iddi).
Create the ordered list of tables, following the pattern established in the file. All create scripts are located at <source_directory>\modules\common\deploy\server\common\db\sql\Create
.
Add a call to concat.file for the new data set schema, following the other calls in the file:
<antcall target="concat.file">
  <!-- target.file: the path and name of the file being generated -->
  <param name="target.file" value="${raw.sql.file}"/>
  <!-- file.comment: comment added to the file ahead of the create SQL -->
  <param name="file.comment" value="-- Employee DataSet Tables"/>
  <!-- src.dir: path to the create scripts listed in the "ordered.<data set name>.tables" list -->
  <param name="src.dir" value="${sql.src.dir}"/>
  <!-- file.list: variable holding the ordered list of create scripts -->
  <param name="file.list" value="${ordered.employee.tables}"/>
  <reference refid="comment.filter" torefid="filter"/>
</antcall>
Any existing DataSet Producer and Consumer can be individually configured to run on a scheduled basis.
To configure DataSet Producer:
Add JobDetailBean bean configuration service_<<DataSet>>ProducerJob.
<bean id="service_<<DataSet>>ProducerJob"
      class="org.springframework.scheduling.quartz.JobDetailBean">
  <property name="jobClass">
    <value>oracle.retail.stores.foundation.iddi.DataSetProducerJob</value>
  </property>
  <property name="jobDataAsMap">
    <map>
      <entry key="producer" value-ref="service_DataSetService"/>
      <entry key="dataSets">
        <list>
          <ref local="service_config_<<DataSetKey>>"/>
        </list>
      </entry>
    </map>
  </property>
</bean>
Note: service_config_<<DataSetKey>> should have been configured with the DataSetKey |
Add CronTriggerBean bean configuration service_Trigger<<DataSet>>Producer
<bean id="service_Trigger<<DataSet>>Producer"
      class="org.springframework.scheduling.quartz.CronTriggerBean">
  <property name="jobDetail">
    <ref local="service_<<DataSet>>ProducerJob"/>
  </property>
  <property name="cronExpression" value="0 0,15,30,45 * * * ?"/>
</bean>
The above DataSet is configured to run once every 15 minutes. For more information about configuring using Quartz, see the following web site:
http://www.quartz-scheduler.org/documentation/quartz-2.1.x/tutorials/tutorial-lesson-10
Add service_Trigger<<DataSet>>Producer
to the SchedulerFactoryBean bean configuration:
<bean id="service_ProducerSchedulerFactory"
      class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
  <property name="triggers">
    <list>
      <ref local="service_TriggerFrequentProducer"/>
      <ref local="service_TriggerInfrequentProducer"/>
      <ref local="service_Trigger<<DataSet>>Producer"/>
    </list>
  </property>
</bean>
To configure DataSet Consumer:
Add JobDetailBean bean configuration service_<<DataSet>>ConsumerJob:
<bean id="service_<<DataSet>>ConsumerJob"
      class="org.springframework.scheduling.quartz.JobDetailBean">
  <property name="jobClass">
    <value>oracle.retail.stores.foundation.iddi.ClientDataSetController</value>
  </property>
  <property name="jobDataAsMap">
    <map>
      <entry key="dataSets">
        <list>
          <ref local="service_config_<<DataSetKey>>"/>
        </list>
      </entry>
    </map>
  </property>
</bean>
Note: service_config_<<DataSetKey>> should have been configured with the DataSetKey. |
Add CronTriggerBean bean configuration service_Trigger<<DataSet>>Consumer
:
<bean id="service_Trigger<<DataSet>>Consumer"
      class="org.springframework.scheduling.quartz.CronTriggerBean">
  <property name="jobDetail">
    <ref local="service_<<DataSet>>ConsumerJob"/>
  </property>
  <property name="cronExpression" value="0 0,15,30,45 * * * ?"/>
</bean>
The DataSet is configured to run once every 15 minutes.
Add service_Trigger<<DataSet>>Consumer
to the SchedulerFactoryBean bean configuration:
<bean id="service_clientSchedulerFactory"
      class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
  <property name="triggers">
    <list>
      <ref local="service_TriggerFrequentConsumer"/>
      <ref local="service_TriggerInfrequentConsumer"/>
      <ref local="service_Trigger<<DataSet>>Consumer"/>
    </list>
  </property>
</bean>
The following example walks through the process of adding a new DataSet to the existing IDDI.
Insert the new DataSet information into the database table CO_DT_ST_IDDI using SQL.
Insert the tables associated with the new DataSet into CO_DT_ST_TB_IDDI using SQL.
Run the following queries to insert the new DataSet information and its table associations.
Example 6-2 Adding New DataSet
insert into CO_DT_ST_IDDI (ID_DT_ST, ID_STR_RT, NM_DT_ST)
values (maxid+1, <<'StoreID'>>, <<'DataSetName'>>);

TableName: CO_DT_ST_IDDI
Column Description
ID_DT_ST : DataSet ID
ID_STR_RT: Store ID
NM_DT_ST : DataSet Name

For example:
insert into CO_DT_ST_IDDI (ID_DT_ST, ID_STR_RT, NM_DT_ST) values (6,'04241','NEW');
Example 6-3 Adding Table Association to New DataSet
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (<<New DataSet ID>>, <<'Store ID'>>, <<'Table1'>>, <<'Table1.txt'>>, 1);

For example:
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (6,'04241','TABLE1','TABLE1.TXT',1);
insert into CO_DT_ST_TB_IDDI (ID_DT_ST, ID_STR_RT, NM_TB, NM_FL, AI_LD_SEQ)
values (6,'04241','TABLE2','TABLE2.TXT',2);
Create <DataSetKey>Producer
and <DataSetKey>Consumer
classes extending from AbstractDataSetProducer and AbstractDataSetConsumer respectively.
Example 6-4 DataSetProducer Code
package oracle.retail.stores.domain.iddi;

import oracle.retail.stores.foundation.iddi.AbstractDataSetProducer;
import oracle.retail.stores.foundation.iddi.DataSetMetaData;
import oracle.retail.stores.foundation.iddi.TableQueryInfo;
import oracle.retail.stores.foundation.iddi.ifc.DataSetMetaDataIfc;

public class NewDataSetProducer extends AbstractDataSetProducer
{
    private final String[] TABLE_FIELDS = {"*"};

    /**
     * NewDataSetProducer constructor
     */
    public NewDataSetProducer()
    {
    }

    /**
     * Get DataSetMetaDataIfc reference
     */
    public DataSetMetaDataIfc getDataSetMetaData()
    {
        // Get the table names for the Key
        return dataSetMetaData;
    }

    /**
     * Initialize the MetaData for the DataSetProducer
     */
    public void initializeDataSet()
    {
        dataSetMetaData = new DataSetMetaData(dataSetKey);
    }

    /**
     * Create TableQueryInfo object with the column names to fetch
     * @param tableName
     * @return TableQueryInfo object
     */
    public TableQueryInfo getTableQueryInfo(String tableName)
    {
        TableQueryInfo tableQueryInfo = new TableQueryInfo(tableName);
        tableQueryInfo.setTableFields(TABLE_FIELDS);
        return tableQueryInfo;
    }

    /**
     * Finalize DataSet method
     */
    public void finalizeDataSet()
    {
    }
}
Example 6-5 DataSetConsumer Code
package oracle.retail.stores.domain.iddi;

import oracle.retail.stores.foundation.iddi.AbstractDataSetConsumer;

/**
 * The NewDataSetConsumer defines methods that the application calls to
 * import the new dataset files into the offline database.
 */
public class NewDataSetConsumer extends AbstractDataSetConsumer
{
    /** DataSet key name for the new dataset. */
    private String dataSetKey = null;

    /**
     * @return Returns the dataSetKey
     */
    public String getDataSetKey()
    {
        return dataSetKey;
    }

    /**
     * @param dataSetKey The DataSetKey to set
     */
    public void setDataSetKey(String dataSetKey)
    {
        this.dataSetKey = dataSetKey;
    }
}
Define service_config_<<DataSetKey>>
in ServiceContext.xml:
<bean id="service_config_<<DataSetKey>>" class="java.lang.String">
  <constructor-arg type="java.lang.String" value="<<DataSetKey>>"/>
</bean>

For example:
<bean id="service_config_NEW_KEY" class="java.lang.String">
  <constructor-arg type="java.lang.String" value="NEW"/>
</bean>
Define service_<<DataSetKey>>Producer
with class=<DataSetKey>Producer
and service_<<DataSetKey>>Consumer
with class=<DataSetKey>Consumer
in ServiceContext.xml:
<bean id="service_NewProducer"
      class="oracle.retail.stores.domain.iddi.NewDataSetProducer"
      lazy-init="true" singleton="true">
  <property name="dataSetKey" ref="service_config_NEW_KEY"/>
  <property name="dataExportFilePath" ref="service_config_DataExportFilePath"/>
  <property name="dataExportZipFilePath" ref="service_config_DataExportZipFilePath"/>
</bean>
<bean id="service_NewConsumer"
      class="oracle.retail.stores.domain.iddi.NewDataSetConsumer"
      lazy-init="true" singleton="true">
  <property name="dataSetKey" ref="service_config_NEW_KEY"/>
  <property name="dataImportFilePath" ref="service_config_DataImportFilePath"/>
</bean>
Add service_<<DataSetKey>>Producer and service_<<DataSetKey>>Consumer to service_DataSetService and service_ClientDataSetService respectively in ServiceContext.xml:

<bean id="service_DataSetService" class="oracle.retail.stores.foundation.iddi.DataSetService" singleton="true">
    <property name="producers">
        <map>
            <entry key-ref="service_config_EMP_KEY" value-ref="service_EmployeeProducer"/>
            <entry key-ref="service_config_ITM_KEY" value-ref="service_ItemProducer"/>
            <entry key-ref="service_config_PRC_KEY" value-ref="service_AdvancedPricingProducer"/>
            <entry key-ref="service_config_TAX_KEY" value-ref="service_TaxProducer"/>
            <entry key-ref="service_config_CUR_KEY" value-ref="service_CurrencyProducer"/>
            <entry key-ref="service_config_NEW_KEY" value-ref="service_NewProducer"/>
        </map>
    </property>
</bean>

<bean id="service_ClientDataSetService" class="oracle.retail.stores.foundation.iddi.ClientDataSetService" singleton="true">
    <property name="consumers">
        <map>
            <entry key-ref="service_config_EMP_KEY" value-ref="service_EmployeeConsumer"/>
            <entry key-ref="service_config_CUR_KEY" value-ref="service_CurrencyConsumer"/>
            <entry key-ref="service_config_TAX_KEY" value-ref="service_TaxConsumer"/>
            <entry key-ref="service_config_ITM_KEY" value-ref="service_ItemConsumer"/>
            <entry key-ref="service_config_PRC_KEY" value-ref="service_AdvancedPricingConsumer"/>
            <entry key-ref="service_config_NEW_KEY" value-ref="service_NewConsumer"/>
        </map>
    </property>
    <property name="dataImportFilePath" ref="service_config_DataImportFilePath"/>
</bean>
Add the DataSet key to service_FrequentProducerJob/service_InfrequentProducerJob and service_FrequentConsumerJob/service_InfrequentConsumerJob in ServiceContext.xml:

<bean id="service_FrequentProducerJob" class="org.springframework.scheduling.quartz.JobDetailBean">
    <property name="jobClass">
        <value>oracle.retail.stores.foundation.iddi.DataSetProducerJob</value>
    </property>
    <property name="jobDataAsMap">
        <map>
            <entry key="producer" value-ref="service_DataSetService"/>
            <entry key="dataSets">
                <list>
                    <ref local="service_config_EMP_KEY"/>
                    <ref local="service_config_PRC_KEY"/>
                    <ref local="service_config_TAX_KEY"/>
                    <ref local="service_config_NEW_KEY"/>
                </list>
            </entry>
        </map>
    </property>
</bean>

<bean id="service_FrequentConsumerJob" class="org.springframework.scheduling.quartz.JobDetailBean">
    <property name="jobClass">
        <value>oracle.retail.stores.foundation.iddi.ClientDataSetController</value>
    </property>
    <property name="jobDataAsMap">
        <map>
            <entry key="dataSets">
                <list>
                    <ref local="service_config_EMP_KEY"/>
                    <ref local="service_config_PRC_KEY"/>
                    <ref local="service_config_TAX_KEY"/>
                    <ref local="service_config_NEW_KEY"/>
                </list>
            </entry>
        </map>
    </property>
</bean>
Add CREATE TABLE scripts and insert scripts for the newly added DataSet to CreateSchema.sql:

CREATE TABLE "offlinedb"."TABLE1" (
    "COLUMN1" <<TYPE>> <<Constraint>>,
    "COLUMN2" <<TYPE>> <<Constraint>>);

CREATE TABLE "offlinedb"."TABLE2" (
    "COLUMN1" <<TYPE>> <<Constraint>>,
    "COLUMN2" <<TYPE>> <<Constraint>>);

INSERT INTO CO_DT_ST_IDDI (ID_DT_ST, ID_STR_RT, NM_DT_ST) VALUES (6, '04241', 'NEW');
Do the following to add a new dataset type using the build script:
Open <source_directory>\modules\utility\build.xml.
Find the section that defines the offline table lists (target assemble.iddi).
Create the ordered list of tables, following the pattern established in the file. All create scripts are located at <source_directory>\modules\common\deploy\server\common\db\sql\Create.
Add a call to concat.file for the new data set schema, following the other calls in the file:
<antcall target="concat.file">
    <!-- target.file: the path and name of the file being generated -->
    <param name="target.file" value="${raw.sql.file}"/>
    <!-- file.comment: comment added to the file ahead of the create SQL -->
    <param name="file.comment" value="-- Employee DataSet Tables"/>
    <!-- src.dir: path to the create scripts listed in the "ordered.<data set name>.tables" list -->
    <param name="src.dir" value="${sql.src.dir}"/>
    <!-- file.list: variable holding the ordered list of create scripts -->
    <param name="file.list" value="${ordered.employee.tables}"/>
    <reference refid="comment.filter" torefid="filter"/>
</antcall>
The Point-of-Service client uses the Derby database. However, only minimal code modifications are required to replace the Point-of-Service client database with another database. Do the following to change the Point-of-Service client database:
Add an Offline<<DBName>>Helper class which implements offlineDBHelperIfc.
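As an illustration only, such a helper might look like the following. The actual methods of offlineDBHelperIfc are product-specific and are not documented here; the interface and method names below are hypothetical stand-ins for whatever JDBC details the real interface abstracts, and "HSQLDB" is only an example replacement database.

```java
// Hypothetical sketch only: the real offlineDBHelperIfc interface is defined
// by the product and its methods may differ. This stand-in illustrates the
// kind of JDBC details such a helper would supply for a replacement database.
interface OfflineDBHelperIfc
{
    /** JDBC driver class name for the offline database. */
    String getDriverClassName();

    /** Connection URL for a local offline database stored at the given path. */
    String getDatabaseUrl(String path);
}

// Example helper for a hypothetical replacement database (HSQLDB here is
// only an illustration; substitute the database actually chosen).
public class OfflineHSQLDBHelper implements OfflineDBHelperIfc
{
    public String getDriverClassName()
    {
        return "org.hsqldb.jdbc.JDBCDriver";
    }

    public String getDatabaseUrl(String path)
    {
        return "jdbc:hsqldb:file:" + path;
    }
}
```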
Change the installer to include the new database driver JAR file paths.
Update the <POOL name="jdbcpool" class="DataConnectionPool" package="oracle.retail.stores.foundation.manager.data"> section of the PosLFFDataTechnician.xml file with the driver, databaseUrl, userid, and password.
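As an illustration, the updated pool entry might look like the following. The <POOL> attributes are taken from the text above, but the child-element syntax and placeholder values shown here are hypothetical; consult the shipped PosLFFDataTechnician.xml for the exact format it uses for the driver, databaseUrl, userid, and password settings.

```xml
<!-- Illustrative only: the exact child-element syntax inside <POOL> is
     defined by the shipped PosLFFDataTechnician.xml; the values below are
     placeholders for the replacement database. -->
<POOL name="jdbcpool"
      class="DataConnectionPool"
      package="oracle.retail.stores.foundation.manager.data">
    <PROPERTY propname="driver" propvalue="<<new JDBC driver class>>"/>
    <PROPERTY propname="databaseUrl" propvalue="<<new database URL>>"/>
    <PROPERTY propname="userid" propvalue="<<database user>>"/>
    <PROPERTY propname="password" propvalue="<<database password>>"/>
</POOL>
```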