Workflow to Decode Audit Data
The workflow that you use to decode audit data depends on how you use the data. If you are implementing an archival solution, then you might want a workflow that reads all created records. If you are implementing a more data-oriented solution, then you must implement a workflow that gives you more control over the data.
The following is an example of a high-level flow for reconstructing data; a code sketch of this flow appears after the steps:
1. Start the Audit Trail Item 2 virtual business component.
2. Within a search time window, run a query for Audit Trail Item 2.
3. Run a query for Audit Trail Item - Data, where the date and time match the previous record.
4. Write out the decoded record ID: New Val, Old Val.
   This step is entirely dependent on your specific requirements and can range from writing to a custom table in the Siebel database to sending the decoded data to an external data source using one of the EAI Transports.
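The following is a minimal sketch of this flow, assuming the Siebel Java Data Bean (com.siebel.data). The connect string, credentials, business object name ("Audit Trail"), and field names ("Record Id", "New Val", "Old Val", "Date") are placeholders; confirm the actual names against your Siebel repository before adapting this pattern.

import com.siebel.data.SiebelBusComp;
import com.siebel.data.SiebelBusObject;
import com.siebel.data.SiebelDataBean;
import com.siebel.data.SiebelException;

public class AuditTrailDecodeFlow {

    public static void main(String[] args) throws SiebelException {
        SiebelDataBean dataBean = new SiebelDataBean();
        // Placeholder connect string, credentials, and language code.
        dataBean.login("siebel://siebelhost:2321/SBA/EAIObjMgr_enu",
                       "SADMIN", "password", "enu");
        try {
            SiebelBusObject busObj = dataBean.getBusObject("Audit Trail"); // placeholder name
            SiebelBusComp auditItem = busObj.getBusComp("Audit Trail Item 2");
            SiebelBusComp auditData = busObj.getBusComp("Audit Trail Item - Data");

            // Step 2: query the virtual business component within a search time window.
            auditItem.clearToQuery();
            auditItem.activateField("Record Id");
            auditItem.activateField("New Val");
            auditItem.activateField("Old Val");
            auditItem.activateField("Date");
            auditItem.setSearchSpec("Date",
                ">= '01/01/2024 00:00:00' AND < '01/02/2024 00:00:00'");
            auditItem.executeQuery(true);

            boolean hasRecord = auditItem.firstRecord();
            while (hasRecord) {
                String recordId = auditItem.getFieldValue("Record Id");
                String newVal = auditItem.getFieldValue("New Val");
                String oldVal = auditItem.getFieldValue("Old Val");
                String auditDate = auditItem.getFieldValue("Date");

                // Step 3: query Audit Trail Item - Data where the date and time match
                // the previous record, for example to obtain the physical row IDs.
                auditData.clearToQuery();
                auditData.activateField("Id");
                auditData.setSearchSpec("Date", "= '" + auditDate + "'"); // placeholder field name
                auditData.executeQuery(true);
                // ... read the physical row IDs here (used later to identify exported records) ...

                // Step 4: write out the decoded record. Replace this with your own target,
                // such as a custom table or an EAI transport.
                System.out.println(recordId + ": " + newVal + ", " + oldVal);

                hasRecord = auditItem.nextRecord();
            }

            auditData.release();
            auditItem.release();
            busObj.release();
        } finally {
            dataBean.logoff();
        }
    }
}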
You can modify this flow to restrict the data returned by business component, user, position, and so on. However, the flow must always include the fundamental aspect of querying both the Audit Trail Item 2 virtual business component and the Audit Trail Item - Data business component to construct a record consistent with the format of a previous version of a Siebel Audit Trail record.
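Continuing the sketch above, such restrictions can be expressed as additional search specifications on the virtual business component before the query is executed. The field names used here ("Business Component", "Employee Login") are assumptions to be checked against your repository.

// Continuation of the sketch above; field names are assumptions.
// Restrict the query to one audited business component and one user.
auditItem.setSearchSpec("Business Component", "= 'Contact'");
auditItem.setSearchSpec("Employee Login", "= 'JSMITH'");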
Use this flow to make sure that duplicate records are not included in the export of audit trail data to a data warehouse. In this flow, the row IDs obtained from the Audit Trail Item - Data business component are used not to create records with unique row IDs, but to identify records that have already been exported. This identification occurs in the data warehouse. (Due to the encoded format of the Audit Trail Item - Data business component, a virtual record in the Audit Trail Item 2 virtual business component does not always map one-to-one to a physical database record in the Audit Trail Item - Data business component. Consequently, a record for the Audit Trail Item 2 virtual business component does not always have a unique row ID.)
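As an illustration of this identification step, the physical row IDs read from the Audit Trail Item - Data business component can be compared against the set of row IDs already loaded into the data warehouse. The class below is a hypothetical, in-memory stand-in for that lookup; in practice the check would run against a staging table or similar structure in the warehouse.

import java.util.HashSet;
import java.util.Set;

// Hypothetical stand-in for the identification that normally happens in the
// data warehouse: track which physical row IDs from Audit Trail Item - Data
// have already been exported, so duplicates can be skipped on the next run.
public class ExportedRowIdTracker {

    private final Set<String> exportedRowIds = new HashSet<>();

    // Returns true if the row ID has not been exported yet, and records it.
    public boolean markForExport(String physicalRowId) {
        return exportedRowIds.add(physicalRowId);
    }
}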
Next, you must create a trigger for the workflow that you use to decode audit data. You can use a reactive trigger through runtime events or workflow policies. If you need real-time or near-real-time onward transmission of the audit trail data, then consider using a reactive trigger. However, the volume of transactions on the audit tables is high, so any per-record solution might adversely affect the performance of the entire Siebel application. Any deployment that uses a reactive trigger on audit data requires extensive performance profiling.
For any situation that does not require real-time data transmission, consider an asynchronous batch approach using a mechanism such as repeating component requests. You can tune these requests so that the frequency of the job matches the data and performance requirements of the Siebel application. In many cases, you can schedule these requests to run during periods of reduced activity in the Siebel application. This approach is recommended for the majority of use cases.
You can implement your solution to decode audit trail data either as a custom business service used in conjunction with a Workflow Process, or as a Workflow Process on its own. Using only a Workflow Process might assist in the long-term maintenance of the solution if you think that it might require modification over time; however, this approach also adds complexity to the solution because of the need to step through data sets and restrict records.