Oracle® Fusion Middleware User's Guide for Oracle Business Intelligence Data Warehouse Administration Console
11g Release 1 (11.1.1)
Part Number E14849-02
This appendix describes common problems and how to resolve them.
The process of recovering from the loss of an encryption key involves clearing the encrypted data from the DAC repository, creating a new cwallet.sso file (with a new encryption key), and stamping the DAC repository with the new encryption key.
This procedure involves steps that use automation utility commands. Before you can use these commands, you need to configure the automation utility property file.
To recover from the loss of an encryption key
Remove all the encrypted data from the DAC repository by calling the automation utility command clearEncryptedData.
Call one of the following automation utility commands to generate a new cwallet.sso file:
dbCredentials <cwallet.sso file path> <user name> -withKey
Use this command to generate a cwallet.sso file with an encryption key that you specify.
dbCredentials <cwallet.sso file path> <user name> -randomKey
Use this command to generate a cwallet.sso file with a randomly generated encryption key.
Copy the new cwallet.sso file to the appropriate directory where the DAC repository can access it. The default directory is <DAC_Config_Location>\conf-shared\security\repository.
Stamp the DAC repository with the new encryption key:
Log in to the DAC Client.
The following message will be displayed:
Encryption key validation value is missing from the DAC repository. Would you like to add it now?
Click OK to add the new encryption key to the DAC repository.
Enter the table owner name and password, and click OK.
A message informs you whether the operation was successful.
Distribute the new cwallet.sso file to all DAC installations that connect to this repository.
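Put together, the first three steps might look like the following command-line sketch. The wrapper script name, paths, and user name are placeholders; only the command names and arguments (clearEncryptedData, dbCredentials ... -withKey) come from the procedure above.

```
# Step 1: remove all encrypted data from the DAC repository.
<automation_utility> clearEncryptedData

# Step 2: generate a new cwallet.sso with an encryption key you specify.
<automation_utility> dbCredentials <path>\cwallet.sso dac_admin -withKey

# Step 3: copy the new wallet to where the DAC repository can access it.
copy <path>\cwallet.sso <DAC_Config_Location>\conf-shared\security\repository
```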
When an execution plan runs, if a task fails, the status of the tasks that are dependent on the failed task is changed to Stopped. While tasks are still running, the execution plan's status is Running. After all the tasks have run, if one or more tasks failed, the execution plan's status is changed to Failed. Unrelated tasks run to completion.
To restart an execution plan that failed
Check which tasks in the execution plan failed.
Go to the Current Run tab in the Execute view.
Click the Tasks subtab, and review the Task Status.
For tasks with the status Failed, click Details in the subtab toolbar.
The Details dialog displays additional information about the task.
Fix the problem that caused the task to fail.
Requeue the failed task.
In the Details dialog, click the Task Details tab.
In the Status column, select Queued from the drop-down list.
Restart the execution plan that failed by selecting it in the top pane of the Current Run tab, and then clicking Restart.
DAC will rerun all the tasks with a status other than Completed.
Note: You can also manually run a task, change the task status to Completed, and then restart the execution plan. Tasks with the status Completed will be skipped.
Caution: The DAC Server does not validate tasks that have been run manually.
If the DAC Server crashes while an execution plan is running, some workflows associated with the execution plan may have been submitted to Informatica. When the DAC Server is restarted and the failed ETL process resumes, the workflows that DAC submitted to Informatica before the crash may have completed or failed or may still be running. You do not have to perform any manual intervention. DAC manages the tasks submitted for execution prior to the crash in the following manner:
Running. If a workflow is still running in Informatica, DAC waits for the running workflow to finish execution.
If the workflow has successfully completed, DAC will change the status of the task detail to Completed and will resume execution of the successors.
If the workflow has failed, DAC will change the status of the task detail to Failed and will eventually halt the ETL. This is analogous to a normal failure encountered during ETL execution.
Completed. If the workflow had completed in Informatica, the task will not be re-run. DAC will change the status of the task detail to Completed and will resume execution of the successors.
Failed. If the workflow in Informatica had failed, DAC will resubmit the workflow for execution when the ETL process restarts.
Note: If a failed task includes truncate statements, the truncate action will occur again when the task is restarted. This will also trigger the Upon Failure Restart action if one is defined for the task.
You can discard an execution plan that failed by navigating to the Current Run tab, right-clicking the execution plan, and selecting Mark as Completed. This forces the run status to be updated to Completed. When you submit a request for another run, the DAC Server creates another instance of the execution plan.
Caution: Perform this procedure in a development or test environment only, because it might leave the data in an inconsistent state, causing you to have to reload all of the data.
Tasks that use Informatica Aggregator Transformation can fail when the Sorted Input option is active. The tasks SDE_DTLFORECASTFACT and SDE_COSTLIST are examples of tasks that can fail in such a situation.
To prevent such tasks from failing, in Informatica PowerCenter Designer, navigate to Mapping Designer, open the corresponding mapping, and in the Aggregator transformation, deselect the Sorted Input check box.
If the DAC Server fails during the execution of an execution plan, the status of the execution plan will remain Running. When the DAC Server is started again, it will automatically run the execution plan if the Auto Restart ETL system property is set to True. If the property is set to False, the server will set the status to Failed when it restarts. To execute the execution plan from the point of failure, submit the request to the server again.
The DAC Server will automatically terminate if it loses connection to the DAC repository.
The DAC Server uses pmcmd to initiate the workflows on the Informatica Server. On English-based operating systems, DAC issues the commands in non-blocking mode (asynchronously) and polls Informatica for the status of the workflow. The output of pmcmd getWorkFlowDetails is spooled to the Domain_Home\dac\log directory and then parsed to determine whether the workflow is still running, completed successfully, or failed.
However, on non-English-based operating systems, DAC issues commands in waiting mode (synchronously). This means that when the process completes, the exit code tells DAC whether or not the workflow succeeded.
The commands DAC uses to communicate with Informatica are externalized in a file called infa_commands.xml. The DAC command template does not have a placeholder for specifying the wait mode. Without this wait mode configuration, on a non-English-based operating system, DAC proceeds with the execution even before the workflow finishes executing. This can result in errors, such as Informatica's bulk loader failing because indexes are present, or fact tables being loaded without foreign key references. To fix the problem, go to the Domain_Home\dac\conf folder and edit the file infa_commands.xml. Depending on the version of Informatica you are using, edit either the block called START_WORKFLOW_7 or START_WORKFLOW_8 and verify whether %WAITMODE is in the syntax. If it is not, add it as follows:
For START_WORKFLOW_7, replace the following line:
pmcmd startworkflow -u %USER -p %PASSWORD -s %SERVER:%PORT -f %FOLDER -lpf %PARAMFILE %WORKFLOW
with:
pmcmd startworkflow -u %USER -p %PASSWORD %WAITMODE -s %SERVER:%PORT -f %FOLDER -lpf %PARAMFILE %WORKFLOW
For START_WORKFLOW_8, replace the following line:
pmcmd startworkflow -sv %SERVER -d %DOMAIN -u %USER -p %PASSWORD -f %FOLDER -lpf %PARAMFILE %WORKFLOW
with:
pmcmd startworkflow -sv %SERVER -d %DOMAIN -u %USER -p %PASSWORD %WAITMODE -f %FOLDER -lpf %PARAMFILE %WORKFLOW
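After the edit, the START_WORKFLOW_8 entry in infa_commands.xml might look like the following sketch. The surrounding element structure is illustrative only; the command template itself is the corrected line shown above.

```
<!-- Sketch only: the actual element names in infa_commands.xml may differ
     in your installation. -->
<command name="START_WORKFLOW_8">
pmcmd startworkflow -sv %SERVER -d %DOMAIN -u %USER -p %PASSWORD %WAITMODE -f %FOLDER -lpf %PARAMFILE %WORKFLOW
</command>
```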
Once you modify this file (make the modification on both the DAC Client and DAC Server machines), restart the DAC Server and DAC Client for the changes to take effect.
When you have a multi-source environment, you can use a single execution plan to extract data for a given subject area for all the sources. When you build the execution plan, DAC creates as many instances of the extract tasks as there are sources, without the need to duplicate the workflows or DAC metadata.
At runtime, DAC creates an individual parameter file for each session, which contains the evaluated name-value pairs for all parameters. DAC stores the parameter file at the location specified by the DAC system property InformaticaParameterFileLocation. The default location is <Domain_Home>\dac\Informatica\parameters.
If a task in a multi-source environment fails and you want to restart it, you can do so from DAC or from Informatica. See the following instructions.
To restart a failed task in a multi-source environment using DAC
Go to the Execution Plans tab in the Execute view, and select the appropriate execution plan.
Click the Ordered Tasks subtab.
Query for the task that you want to restart.
Click Unit Test in the toolbar.
The Unit Test dialog displays the steps that DAC will carry out if the task is executed.
Click Execute in the Unit Test dialog.
Note: The DAC system property Dryrun must be set to False for the task to run.
To restart a failed task in a multi-source environment using Informatica
In DAC, locate the parameter file for the failed task.
The parameter file is stored in the location specified by the DAC system property InformaticaParameterFileLocation. The default location is <Domain_Home>\dac\Informatica\parameters.
Copy the file to the location specified by the Informatica parameter $PMSourceFileDir. This is the location where Informatica looks for parameter files.
Note: If DAC is configured correctly, the location specified by $PMSourceFileDir and the location specified by the DAC system property InformaticaParameterFileLocation should be the same. If the locations are the same, the parameter file will already be in the correct location.
Change the parameter file name to match the naming convention Informatica uses for parameter files:
<Informatica directory name>.<Informatica session name>.txt
Run the workflow in Informatica.
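On a UNIX-style shell, the copy-and-rename steps above could be sketched as follows. All directory, file, folder, and session names here are hypothetical stand-ins; substitute the actual locations from InformaticaParameterFileLocation and $PMSourceFileDir.

```shell
# Hypothetical stand-ins for the two directories.
DAC_PARAM_DIR="$PWD/dac_parameters"        # InformaticaParameterFileLocation
PM_SOURCE_FILE_DIR="$PWD/informatica_src"  # $PMSourceFileDir
mkdir -p "$DAC_PARAM_DIR" "$PM_SOURCE_FILE_DIR"

# Stand-in for the parameter file DAC generated for the failed task.
echo '$$EXAMPLE_PARAM=1' > "$DAC_PARAM_DIR/example_task.txt"

# Copy it to the Informatica source-file directory, renaming it to
# <Informatica directory name>.<Informatica session name>.txt
cp "$DAC_PARAM_DIR/example_task.txt" \
   "$PM_SOURCE_FILE_DIR/EXAMPLE_FOLDER.s_Example_Session.txt"
```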
When the DAC repository resides on a non-Unicode Oracle database, the Informatica workflow names may not fit into the corresponding DAC fields, which causes tasks to fail during ETL processes.
To work properly in multi-byte character environments, the DAC repository should be created with the Unicode option selected.
Perform the following procedure to fix an existing DAC repository with this problem. You can use the DAC Client or DAC command line parameters to perform the procedure.
Connect to the existing DAC repository.
Export the entire repository (logical, system, and run time categories).
Stop the DAC Server and all DAC Clients except for the one you are using.
Drop the current repository.
If you are using a DAC Client, restart it.
Create a new repository with the Unicode option selected.
Import the DAC metadata that you exported in step 2.
Re-enter the passwords for all Informatica services and all physical data source connections.
Informatica workflows initiated by DAC fail with error code 17 and the error message "Parameter file does not exist" when the parameter file contains multi-line parameters, as in the following example. The error occurs because DAC issues pmcmd with -lpf in the syntax.
$$SYND_DS_PARTITION_TRUNCATE_SQL_TEXT=SELECT LTRIM(MAX(SYS_CONNECT_BY_PATH('execute immediate ''ALTER TABLE getTableName() TRUNCATE PARTITION ' || '"' ||COUNTRY_REGION_NAME|| '"' ||'''' ,';')) KEEP (DENSE_RANK LAST ORDER BY curr),';') AS SQL_TXT FROM (SELECT SOURCE,COUNTRY_REGION_NAME,ROW_NUMBER() OVER (PARTITION BY SOURCE ORDER BY COUNTRY_REGION_NAME) AS curr, ROW_NUMBER() OVER (PARTITION BY SOURCE ORDER BY COUNTRY_REGION_NAME) -1 AS prev FROM W_SYND_PARTITION_TMP WHERE SOURCE='W_SYNDD_DS_FS') CONNECT BY prev = PRIOR curr START WITH curr = 1
To prevent this issue, edit <DAC_Config_Location>\conf-shared\infa_command.xml and replace all instances of -lpf with -paramfile. This workaround ensures that DAC uses -paramfile in the pmcmd syntax, which allows Informatica to recognize the multi-line parameters.
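The substitution can also be scripted. The following UNIX-style sketch performs the edit on a stand-in copy of the file; the single command-template line is hypothetical content, and the real file lives at <DAC_Config_Location>\conf-shared\infa_command.xml.

```shell
# Create a stand-in infa_command.xml with one command template (hypothetical content).
cat > infa_command.xml <<'EOF'
pmcmd startworkflow -sv %SERVER -d %DOMAIN -u %USER -p %PASSWORD -f %FOLDER -lpf %PARAMFILE %WORKFLOW
EOF

# Replace every occurrence of -lpf with -paramfile (portable sed, no in-place flag).
sed 's/-lpf/-paramfile/g' infa_command.xml > infa_command.xml.tmp
mv infa_command.xml.tmp infa_command.xml
```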
In DAC, you can reset refresh dates at the task level rather than the table or subject area level by using one of the following options:
Create an execution plan with the appropriate subject areas and select the Full Load Always property. Run the execution plan once; this will cause the tasks in the execution plan to run in full mode. After the execution plan successfully completes, delete the execution plan.
Create an execution plan with the appropriate subject areas. Right-click the execution plan and select Reset source(s). Select all the sources, and then click OK.
Note: With this option, because the execution plan runs in full mode, there is a risk that source tables populated by this execution plan could also be included in other subject areas and, in such instances, could be populated in error. To work around this issue, set the DAC system property "No Run" to True, which generates tasks but does not execute them. You can then review the task details in the Current Runs tab to verify whether the correct tasks run in full mode.
If you are using the Oracle (OCI8) connection type when defining a physical data source and receive an error such as "no ocijdbc11," consider the following points:
The JDBC calls go through the local database client installation libraries. This means that the client version and the database version should match. For example, you cannot use an Oracle 9i client to access an Oracle 10g database.
Do not use multiple versions of the same library.
Consider using the Oracle (Thin) connection, which has no overhead of going through the client libraries.
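The two connection types differ in the form of the JDBC URL. The TNS alias, host, port, and service name below are placeholders:

```
# Oracle (OCI8) -- goes through the locally installed Oracle client libraries:
jdbc:oracle:oci8:@TNS_ALIAS

# Oracle (Thin) -- pure Java, no Oracle client installation required:
jdbc:oracle:thin:@//dbhost.example.com:1521/orcl
```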
The DAC behavior for truncating tables when more than one DAC task writes to the same target table is as follows:
The first DAC task truncates the target table. DAC does not truncate the target tables during the execution of the subsequent tasks even if the Truncate Always or Truncate for Full Load flags are checked on the subsequent tasks.
DAC truncates a target table only once during the life span of one ETL execution.
If the DAC task belongs to a group and the group truncate property is different from the individual task truncate properties, the target table will not be truncated.
If you want to override this default behavior, you can create a task action and use the Preceding Action action type to specify in a custom SQL statement that the table should be truncated. For more information about configuring task actions, see "Using Actions to Optimize Indexes and Collect Statistics on Tables".
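For example, the SQL for such a Preceding Action could be as simple as the following statement, where getTableName() is the DAC function that resolves to the task's target table at runtime (treat this as a sketch and validate it against your environment):

```
TRUNCATE TABLE getTableName()
```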
Discrepancies may exist between the status of a DAC task and the status of an Informatica workflow for the following reasons:
The Informatica pmcmd utility stops responding or "hangs." DAC issues a command using the pmcmd utility to start the workflow and waits for this process to finish. If the command hangs, DAC will also hang. DAC issues this command in polling mode: it first issues the command to start the workflow, and then issues the getWorkflowStatus command to find out whether the execution has completed. As long as the pmcmd output states the process is executing, DAC waits. The default polling interval is 30 seconds, so if a pmcmd process runs for more than a few minutes, you can assume it is hanging.
In Windows, you can open the Task Manager and observe the pmcmd executables. If an executable has the same process ID for more than a few minutes, then the process is "hanging."
Informatica loses connection to its own repository. If Informatica loses the connection to the database, even if Informatica Workflow Monitor indicates the workflow ran to completion, Informatica will not be able to update the workflow status to completed. Therefore, when DAC issues a command to determine whether the workflow completed, DAC will always receive a response of "Running." To confirm if the workflow is in a perpetual "Running" status, review the log file with the naming convention folder.workflow.log in the DAC\log folder.
If the DAC Server is not visible to all DAC Clients, make sure the DAC system property "DAC Server Host" has the correct value and not "localhost." Depending on your environment, you may need to use a fully qualified name. You can also try using the IP address of the DAC Server machine. You should be able to ping the DAC Server machine from the DAC Client machine. Make sure the ping requests do not time out.
You should also inspect the network settings to make sure the port number that is registered for the DAC Server Port system property is open and available.
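A quick connectivity check from a DAC Client machine might look like the following; the host name and port number are placeholders for your actual DAC Server Host and DAC Server Port values:

```
# Confirm the DAC Server machine is reachable by name.
ping dac-server.example.com

# Confirm the registered DAC Server port (3141 here, as an example) is open.
telnet dac-server.example.com 3141
```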