Results

This topic provides an overview of quality results, including the process for entering them. It explains how to composite test results, compare specifications, change quality dispositions, and manage expired lots.

Understanding Results

Quality results serve as the basis for measuring actual material characteristics against expected quality records, and for assigning follow-up actions to be taken throughout the supply chain. Results are logged against an individual sample, where scheduled tests are driven by the specification and additional impromptu tests can be added to the sample analysis. After a technician enters results, a Quality Manager has the option to evaluate the results set against targets and limits to determine the sample disposition based on whether the results are acceptable. The collective disposition of samples in a sample group can be used by the application for material lot usage decisions in production and inventory. Composite results and specification comparison capabilities also assist in result interpretation.

Tracing Results by Testing Materials, Instruments, and Tester

The Results window displays test kit information that helps you trace the materials used in testing. If test results need to be investigated, then you can identify both the materials used in testing and the internal or external testing resources that performed the testing.

Restricting the Decimals Displayed for the Result

Saved numeric results are restricted to the stored decimal precision or number of decimal places defined at the test or specification level.
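
For example, assuming standard rounding to the defined precision, a result saved as 4.276 with a precision of two decimal places displays as 4.28, which is equivalent to the Oracle SQL ROUND function:

SELECT ROUND(4.276, 2) AS displayed_result FROM dual;  -- returns 4.28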

Entering a Result

To enter test results, your responsibility must have access to the quality laboratory. You can enter results by accessing the Samples Summary window.

You can edit previously entered result values when the sample has a disposition of In Progress or Complete.

Assigning Result Evaluation and Sample Disposition Automatically

Test results are automatically evaluated for conformance to specification if the test is part of the specification associated to the sample. Results that are in-specification are assigned an evaluation of Accept. If the Control Lot Attributes indicator is enabled on the specification validity rule that is in effect, then the sample disposition can also update the lot status automatically. If all results are in-specification, then the sample disposition is set to Accept. If any of the results are out-of-specification, then the sample disposition is set to Complete and you can change the sample disposition to either Reject or Accept with Variance. The Oracle Process Manufacturing (OPM) Quality Sample Group Disposition workflow must be enabled without notification for the automatic sample disposition change.
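
The following PL/SQL function is a minimal sketch of this disposition rule. The function name and parameter are hypothetical and do not represent the actual OPM workflow code:

-- Illustrative sketch only: derive a sample disposition from the
-- count of out-of-specification results.
CREATE OR REPLACE FUNCTION derive_sample_disposition (
  p_out_of_spec_results IN NUMBER
) RETURN VARCHAR2
IS
BEGIN
  IF p_out_of_spec_results = 0 THEN
    RETURN 'Accept';    -- all results are in-specification
  ELSE
    RETURN 'Complete';  -- change manually to Reject or Accept with Variance
  END IF;
END derive_sample_disposition;
/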

Assigning Result Evaluation Manually

The possible result evaluations are Accept, Accept with Variance, Guaranteed by Manufacturer, Reject, Experimental Error, Cancel, and Void. Each evaluation is described with the Evaluation field later in this topic.

After entering a result, you can change the Evaluation field on the Results window.

Evaluating Expressions as You Report Results

Expression results cannot be entered directly. They are calculated from a mathematical expression that uses the results of the tests referenced in the expression. You can calculate Expression test results before saving the results set. If all tests used in an expression have a result, then the expression is calculated, irrespective of whether each result is within or outside the specification limits. If tests used in an expression have replicates, then the most recent result replicate is used. Test results that are marked for deletion are not considered for expressions. The application displays a warning if the expression result is outside its limits.

Evaluating Partial Expressions

You can use the Oracle SQL NVL function, or a user-defined PL/SQL function, to provide interim calculations of expressions that have missing test results.

Here is an example:

A = B + NVL(C, <numeric value>)

where C is assigned a default numeric value to be used when C is not entered.

In this example expression, A is calculated when B is entered, even without the value of C, because the default numeric value is substituted for C. If the value of C is entered later, then the expression is recalculated to include the actual value of C. Use the Recalculate option to recalculate the expression on an interim basis. The application also recalculates the expression when you save your work.
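
For example, the following SQL*Plus block sketches the interim and final calculations for this expression, using hypothetical values: B = 12.5, a default of 5 for C, and an actual C result of 4.2 entered later:

SET SERVEROUTPUT ON

DECLARE
  b NUMBER := 12.5;  -- result entered for test B
  c NUMBER := NULL;  -- result for test C not yet entered
  a NUMBER;
BEGIN
  a := b + NVL(c, 5);                         -- interim calculation uses the default
  DBMS_OUTPUT.PUT_LINE('Interim A = ' || a);  -- 17.5
  c := 4.2;                                   -- actual result for C entered later
  a := b + NVL(c, 5);                         -- recalculation uses the actual value
  DBMS_OUTPUT.PUT_LINE('Final A = ' || a);    -- 16.7
END;
/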

Compositing Test Results

When results are composited, the arithmetic mean of numeric results and the mode of nonnumeric results can be reported on a Certificate of Analysis. For numeric results, the calculated statistics are the mean, median, high, low, range, standard deviation, and mode. Text range results display the high, low, and mode values. You can enter text as the composite result for non-validated tests. If a test is replicated, then the most recent result replicate is included in the composited calculation. If a test is part of the specification that is tied to the sample and the sample group, then the composite result is evaluated automatically against the specification, and the specification information is displayed. Impromptu tests are displayed in the composite results set without specification information. You can exclude individual sample results from the calculated statistics for a test, add a test, exclude a sample, or add a sample to the composite results set.

Archive and reserve samples are not included in the composite results set. You can view results associated with a parent sample in context with composite results; however, these results do not contribute to the calculated statistics.
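
The composite statistics correspond to standard Oracle SQL aggregate functions. The following query is a minimal sketch; the sample_results table and its columns are illustrative placeholders, not the actual OPM schema:

-- Hypothetical table and column names, for illustration only.
SELECT test_code,
       AVG(result_value)                     AS mean,
       MEDIAN(result_value)                  AS median,
       STATS_MODE(result_value)              AS mode_value,
       MIN(result_value)                     AS low,
       MAX(result_value)                     AS high,
       MAX(result_value) - MIN(result_value) AS result_range,
       STDDEV(result_value)                  AS deviation,
       COUNT(result_value)                   AS tests_used
FROM   sample_results
WHERE  sample_group_id = :sample_group
GROUP  BY test_code;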

Comparing Specifications

You can compare results for an analyzed sample to another item specification with a status of Approved in a specific process-enabled inventory organization. The sample may already have an association with a current specification, where results were evaluated against its limits to determine whether they are in-specification. A comparison specification is selected for a side-by-side comparison of the two specifications evaluated against the sample's results. Tests from the comparison specification can be added to a results set to initiate additional testing. If the new specification is approved for laboratory use or for general use, then you can make the comparison specification the current specification applied to the sample, so that a new sample disposition can be recorded against it.

You cannot compare specifications for a monitoring sample, because you should not replace the specification tied to a monitoring sample. Similarly, you should not retrieve stability samples from the Specification Comparison window. Any changes to a stability study or stability study sample for time point testing must be tracked within the stability study functionality, and not through the Specification Comparison window.

Retesting Samples

In the case of retesting, where a previously performed test is added to the sample again, the replicate number on the Results window increases each time the test is added to the results set. For example, a test is usually run only once on a sample: the test is performed and a result is entered. If something occurs in the laboratory that requires the test to be run again, then you can add the same test to the results set again. The test appears on the Results window with a value of 2 displayed in the Replicate field, and you enter the repeated test result.

Preventing Batch Step Completion

This feature prevents additional batch processing and the completion of a batch step until acceptable results are recorded and approved for the samples taken from the batch step.

You can control batch step status for quality control with a WIP specification validity rule specified for a particular routing step or batch step. Sampling at the batch step requires acceptable results for tests not marked as optional before the batch step can be completed.

Responsibilities

Formulator, Process Engineer, Production Supervisor, Quality Manager

To prevent a batch step from completing until acceptable quality results are approved:

  1. Create a formula.

  2. Change the status of the formula to Approved for General Use.

  3. Create a routing.

  4. Change the status of the routing to Approved for General Use.

  5. Create a recipe that contains the formula and routing.

  6. Create a recipe validity rule for the recipe and the plant to produce the batch.

  7. Change the status of the recipe and the status of the recipe validity rule to Approved for General Use.

  8. Create and approve a specification for the product of the formula.

  9. Create and approve a WIP specification validity rule, specifying the recipe, the routing, or both, and the routing step, or the batch and batch step if these are already created. Ensure that the Control Batch Step Status indicator is selected.

  10. Create a batch from the recipe.

  11. Release the batch.

    • From the Batch Steps window, the Batch Step Quality Status indicates Sample Required.

    • With the Production Sample Creation Notification workflow enabled, the application sends a notification for the creation of a WIP sample when the batch is created.

  12. Release the batch step.

  13. Create a sample from the open notification for the Sample Creation Notification workflow. You cannot complete the batch step at this point because no results are entered.

  14. Enter and save results. If the sample group disposition for the batch step is Accept or Accept with Variance, then the Batch Step Quality Status is updated to Results In Spec, and you can complete the step.

  15. Complete the batch step. If the batch step quality status does not reflect Results In Spec, then a message asking for an override appears. Refer to "Excluding the Override Control of Batch Step Completion" for additional information.

Excluding the Override Control of Batch Step Completion

Designated responsibilities can override the control of batch step completion despite the quality inspection requirement. The privilege to override batch step completion is granted through the menu associated to a responsibility. For example, the Production Supervisor menu can be granted this privilege. Exclude this override function from a responsibility by using the System Administrator responsibility.

  1. Navigate to Application and choose Responsibility.

  2. Query the appropriate responsibility name. For example, query Production Supervisor.

  3. On the retrieved record, add the menu exclusion for the function Batch Steps:Complete without QC.

  4. Save the record. This removes the override capability from the menu associated with this responsibility. When you try to complete a batch step whose results are not entered or are not in specification, an error message displays.

Controlling Updates to Lot Status, Grade, and Hold Date Based on Quality Test Results

This feature helps determine material usage based on updates to lot status, grade, and hold date that are driven from quality results. An inventory, supplier, and WIP specification validity rule can control the update of lot status depending on whether results are in or out-of-specification for all samples in a sample group. A sample disposition that is marked Accept or Accept with Variance is in-specification. A sample disposition that is marked Reject is out-of-specification.

For samples against a particular lot, the final sample group disposition can update lot attributes such as material status and grade for the lot specified. For samples against a parent lot only, the lot attributes update can apply to all lots one level below the parent lot.
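
The following PL/SQL block sketches the mapping from sample group disposition to lot status; the disposition and status codes are illustrative placeholders, not the actual OPM values:

SET SERVEROUTPUT ON

-- Illustrative mapping only: sample group disposition to lot status.
DECLARE
  disposition VARCHAR2(30) := 'REJECT';  -- example disposition
  lot_status  VARCHAR2(30);
BEGIN
  lot_status := CASE disposition
                  WHEN 'ACCEPT'               THEN 'IN_SPEC'      -- in-specification lot status
                  WHEN 'ACCEPT_WITH_VARIANCE' THEN 'IN_SPEC'
                  WHEN 'REJECT'               THEN 'OUT_OF_SPEC'  -- out-of-specification lot status
                END;
  DBMS_OUTPUT.PUT_LINE('New lot status: ' || lot_status);
END;
/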

Responsibility

Quality Manager

To update the status and grade of the sampled lot, perform the following:

  1. Create and approve a specification for the item or lot.

  2. Create and approve an inventory or supplier specification validity rule. Ensure that the Control Lot Attributes indicator is selected. The In-Specification Lot Status and Out-of-Specification Lot Status default if previously defined on the Process Quality Parameters window. Deselect the Lot Optional on Samples indicator so that updates can be enforced on a lot.

  3. Create a sample.

    • If the Sample Creation Notification workflow is not enabled, then you can use the Get Spec function to associate the sample record to the applicable specification validity rule.

    • The lot number is required to update the lot status, lot grade, and hold date based on in-specification or out-of-specification results.

  4. Enter and save results for the sample.

  5. Repeat the sample creation, results entry, and evaluation process if there are several samples to be taken from a single source.

  6. When testing is complete for all samples required, specify or confirm the new lot status, grade, and hold date at the same time as you change the sample group disposition.

    • If all samples in the sample group have the disposition of Accept, then the lot status defaults to In-Specification.

    • If any of the samples have the disposition of Reject, then an Out-of-Specification lot status defaults into the Change Disposition window. This quality-driven lot status update affects all subinventories and locators of the lot.

Completing Receiving Inspection Based on Sample Results

You can determine the receiving inspection result based on the status or acceptability of the samples analyzed in a quality laboratory. After you make a material receipt, the sample creation workflow and supplier specification validity rule expedite the sample drawing and testing process. When the receipt is routed for inspection, the collection plan specifies the requirements for sample collection. The collection plan is based on a template plan for receiving inspection of process-enabled items. Based on the outcome of sampling, you can enter an inspection result, and the accepted or rejected quantity of material. A sample group disposition of Accept or Accept with Variance permits you to enter an inspection result of Accept to approve the shipment. A sample group disposition of Reject only permits an inspection result of Reject, in which case the material is delivered to an alternate location for quarantine or returned to the vendor.

Note: A final disposition for the samples is required before completing the data collection at the receiving inspection. Mandatory collection elements must be entered or defaulted to save the inspection results.

Responsibilities

Quality Inspector, Quality Manager, Quality Technician, System Administrator, and Purchasing Super User

Prerequisites

    • Oracle Quality

    • OPM Quality Management. Refer to "Specifications" for detailed information on entering a specification and item specification validity rule.

    • Approvals Management Engine (AME). Refer to the Oracle Approvals Management Implementation Guide and "Appendix C" for detailed information on setting up and using OPM Quality Management workflows.

    • Oracle Purchasing

To determine receiving inspection results based on sample disposition:

  1. Receive against a purchase order line. A receipt is created.

  2. If sampling is required for the receipt, then receive the workflow notification to take a sample.

  3. Create the supplier sample.

  4. Perform tests per the specification.

  5. Enter results and assign evaluations and actions.

  6. Assign a final sample disposition.

  7. If multiple samples are taken, then enter and composite their results and assign a final sample group disposition.

  8. Query the receipt. Enter the quantity received and the destination location. Receiving inspection is required for this receipt.

  9. Select Inspection. The appropriate collection plan is retrieved.

  10. Select the appropriate specification, if applicable.

  11. Enter quality results in the collection plan. The sample group disposition for the most recent receipt line displays.

  12. Enter the overall inspection result and accept or reject quantities.

  13. Save the collection plan results to complete the receiving transaction.

  14. Deliver the material into the desired location.

Entering Results

The Results window lets a Quality Technician enter results for samples. For samples that are associated to a specification and version, a list of tests from the specification is provided for entry of results. You can add a test to the results set if this function is included in your responsibility. You can edit previously entered result values when the sample has a disposition of Pending, In Progress, or Complete. The application validates the result against the test range or valid test values unless the test is non-validated or guaranteed by the manufacturer. It also validates the result against the specification range if the test is part of the specification. If the result falls into an action zone for experimental error, then an operator message displays when you try to save the result. When you enter results without a specification tied to the sample, certain business rules driven by a specification and the specification validity rule cannot be enforced. You can also select Guaranteed by Manufacturer for each result.

A Quality Manager reviews results based on their conformance to specification as determined by the application. The manager can assign an evaluation of Accept, Accept with Variance, Reject, Cancel, or Void, and a follow-up action for each result. You can access the Composite Results window from this window to aggregate results of multiple samples in a sample group. Also access the Specification Comparison window to compare results to other specifications.

E-Record and E-Signature Approvals

There may be an e-signature event associated with this window. Refer to "Appendix D" for additional information on e-record and e-signature approvals associated with this window.

Managing Optional Tests without Results

Tests that are marked Optional on the specification do not require result entry to finalize the sample disposition as Accept, Accept with Variance, or Reject. However, if you enter a result for an optional test, then you can configure whether the result contributes to the final disposition of the sample by using the Consider Optional Tests Results option in the Process Quality Parameters window; this setting is enforced for the context organization. If an optional test result is included, then its result evaluation uses the existing business rules for changing the sample disposition. For example, an out-of-specification result for an optional test only allows the sample disposition to change to Accept with Variance, Reject, or Cancel.

Refer to "Assigning Guaranteed by Manufacturer Results" for more information on tests without results that are guaranteed by a manufacturer.

Refer to "Associating Selected Test Results to Another Sample" to associate results from a parent sample.

You can use folders, flexfields, and attachments with this window.

See: Oracle E-Business Suite User's Guide and Oracle E-Business Suite Flexfields Guide

Assigning Guaranteed by Manufacturer Results

Manufacturers who repeatedly meet quality specifications defined by their customers establish a good performance record. They guarantee that the quality of their materials meets or exceeds customer specifications without actually sending test results to them. The practice of guaranteeing test results reduces the cost of full inspection on raw materials. By mutual agreement, the process of sampling and testing incoming materials is reduced. These results need to be marked as Guaranteed by Manufacturer in the results set. Tests that are guaranteed by one manufacturer may be required for other manufacturers who have not established a good performance record. The Quality Manager responsibility assigns this functionality.

Guaranteed by Manufacturer displays in the Evaluation column on the Results window for test results that are not entered because the manufacturer guarantees that the material meets specification for the identified tests. This includes results generated for customer samples.

Unlike an optional test result, a Guaranteed by Manufacturer test result belongs to a required test that is needed to fulfill specification requirements. No results are recorded for tests that are guaranteed by their manufacturers. The Quality Manager accepts the blank Guaranteed by Manufacturer result, the In Spec test indicator remains cleared, and the result can be assigned an evaluation of Accept with Variance.

If no result is reported for one of the referenced manufacturer-guaranteed tests, then Expression test calculations remain the same.

Results are not reported for tests that are guaranteed by a manufacturer, so composite results statistics for the sample group do not include them. Manufacturer guaranteed test results appear with a blank result on the View Samples by Test window and the result is omitted from calculating the number of tests included in composite results.

Prerequisites

To enter results for a test, your responsibility must have access to the quality laboratory. The quality laboratory is a process-enabled inventory organization where you test the sample.

To enter test results:

  1. Navigate to the Results window. Select a process-enabled inventory organization that your responsibility can access using the Organizations window. The Find Samples window displays.

  2. Query the sample by sample number, disposition, item, item revision, lot, LPN, or marked-for-deletion status. You can select the LPN number that you want to use from the LOV. The sample number consists of the organization code and the document number. The sample description displays. Sample access is enforced at the test result level.

  3. Composited indicates that results for the sample are aggregated with other samples from the same sample group. Composited does not display when results are not aggregated.

  4. Disposition indicates the disposition of the sample as:

    • Pending when the sample is saved, and results are expected, but not yet recorded. Default.

    • In Progress when at least one result exists for the sample, but testing is not yet complete.

    • Complete when testing of all required tests and all additional tests is complete and result evaluations are assigned.

    • Retain when the sample is taken and put in storage. Results cannot exist for a sample with this disposition. Change the disposition to Pending before results entry.

    • Planned when the sample is scheduled for testing. Change the disposition to Pending before results entry.

  5. Spec indicates the specification and Version associated to the sample.

  6. Remaining Sample Qty displays the unused portion of the sample quantity with its UOM.

  7. Item indicates the item code for the material sampled. Item defaults from the sample and must be set up as process quality-enabled in the Item Master. This field is blank for a monitoring sample.

  8. Revision indicates the item revision, if the item is revision-controlled.

  9. Parent Lot displays if the testing is done on a specific parent lot.

  10. Lot displays if the testing is done on a specific lot.

  11. LPN displays the LPN number associated with the queried sample.

  12. Batch organization and document number display only for a WIP sample.

Tests

  1. Seq indicates the sequence of the test in the specification. This field defaults from the specification used when the sample was created. You can only edit this field when you add a new test. You cannot duplicate a sequence number within the same test set for different tests. Required.

  2. Test is the predefined test code. This defaults from the specification used when the sample was created. You can duplicate a test within the same results set if you have appropriate access. For example, if retesting is required, then an additional test needs to be added.

  3. Test Method indicates the test method used for the selected test, and it defaults from the specification. If there is no specification, then this field defaults from the test definition.

  4. Replicate is the number of times the test is repeated. There are several result entries that correspond to the number of test replicates defined on a specification. If a test replicate is not specified for the test, then the field defaults to 1. Required.

  5. Enter the Result for the test. You can select from the LOV for List of Test Values and Text Range test data types. The result entered must be within the test minimum and maximum ranges for Numeric Range, Numeric Range with Display Text, or Text Range test data types, or it must be in the list of valid values for the List of Test Values test data type. You cannot edit this field for the Expression test data type. You must have access to the laboratory assigned to the test in order to enter results. Result entry is optional for guaranteed by manufacturer tests.

  6. Test Unit indicates the quality test unit of measure for the selected test.

  7. Enter Actual Result Date as the date and time the result was recorded. You can use the LOV to select a date and time from the Calendar window. This field defaults to the system date and time. This date cannot be in the future, or prior to the system minimum date.

  8. Parent Sample displays the parent or source sample if results are associated from another sample.

  9. Target is the expected value to meet specification requirements if the test data type is Non-Validated, Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  10. Minimum is the minimum acceptable value to meet specification requirements, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  11. Maximum is the maximum acceptable value to meet specification requirements, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  12. Priority indicates the test priority. This defaults from the specification or the test definition. Test priority identifies the relative priority of the test among the tests listed in the specification.

  13. Planned Result Date displays the date planned for result entry. This is calculated by adding the test duration from the test method to the sample date drawn, as shown in the sketch after this list.

  14. Test Alias displays an alternate name for the test. This field defaults from the Specifications window.

  15. Test Class displays a class name for the test. This field defaults from the test definition.

  16. Test Data Type is the test type defined on the Tests window.

  17. Test Description is a brief description of the selected test.

  18. Test By Date displays the date that the test must be completed. This is calculated by adding the test expiration time from the specification (in days, hours, minutes, and seconds) to the sample date drawn, as shown in the sketch after this list.

  19. Test Unit Description is a description of the unit of measure used for reporting test results.
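
The following query illustrates the Planned Result Date and Test By Date calculations with example values: a sample drawn on 12-JUN-2024 10:00, a test duration of 2 days, and a test expiration time of 3 days and 12 hours. The actual durations come from the test method and the specification:

-- Example values only; the intervals come from the test method and specification.
SELECT TIMESTAMP '2024-06-12 10:00:00' + INTERVAL '2' DAY AS planned_result_date,
       TIMESTAMP '2024-06-12 10:00:00'
         + INTERVAL '3 12:00:00' DAY TO SECOND            AS test_by_date
FROM   dual;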

Internal

  1. Seq indicates the sequence of the test in the specification. This field defaults from the specification used when the sample was created. You can only edit this field when you add a new test. You cannot duplicate a sequence number within the same test set for different tests. Required.

  2. Test is the predefined test code. This defaults from the specification used when the sample was created. You can duplicate a test within the same results set if you have appropriate access. For example, if retesting is required, then an additional test needs to be added.

  3. Enter Laboratory as the process quality-enabled organization where the test is performed. You must have access to this laboratory to enter results. This field defaults to the quality laboratory organization specified at the time of sample creation. Required.

  4. Enter the Tester as the person performing the test. The tester can be different from the individual who enters results. The field defaults to the current user.

  5. Enter the Reserve Sample when the remaining sample quantity is insufficient to complete the selected test. The reserve sample must belong to the same sample group as the original sample queried. If a reserve sample is entered, then the actual test quantity is consumed from this sample.

  6. Planned Test Qty displays the test quantity defined on the specification, or the test quantity that defaults from the test method. This is the quantity needed to perform the test.

  7. Enter the Actual Test Qty as the quantity of sample used to perform the test. This field defaults from the planned test quantity. You can edit this field after result entry. When you save your work, the actual test quantity is subtracted from the remaining quantity of the sample selected.

  8. Test Qty UOM displays the unit of measure for the test method. This UOM must be convertible to the primary UOM for the item.

  9. Enter the Test Kit Item as the item number for the test kit, reagent, or other material required to perform the test. The item must be a part of the same inventory organization as the quality laboratory of the test on the sample.

  10. Enter the Test Kit Lot as the test kit lot number. This entry ensures traceability to reagent lots if results require investigation. This field is only available if you enter a test kit item.

  11. Test Method Resource displays the instrument defined for the test method associated to the test.

  12. Planned Resource displays the generic resource that is planned to be used for performing the test. This defaults from the defined test method resource.

  13. Planned Resource Instance displays the resource instance that is planned to be used for performing the test.

  14. Enter the Actual Resource as the generic resource used for performing the test.

  15. Enter the Actual Resource Instance as the resource instance used for performing the test.

External

  1. Seq indicates the sequence of the test in the specification. This field defaults from the specification used when the sample was created. You can only edit this field when you add a new test. You cannot duplicate a sequence number within the same test set for different tests. Required.

  2. Test is the predefined test code. This defaults from the specification used when the sample was created. You can duplicate a test within the same results set if you have appropriate access. For example, if retesting is required, then an additional test needs to be added.

  3. Enter the Test Provider as the supplier who performs the test. The LOV provides the current list of valid suppliers.

  4. Test Provider Description displays a description of the test provider.

Evaluation

  1. Seq indicates the sequence of the test in the specification. This field defaults from the specification used when the sample was created. You can only edit this field when you add a new test. You cannot duplicate a sequence number within the same test set for different tests.

  2. Test is the predefined test code. This defaults from the specification used when the sample was created. You can duplicate a test within the same results set if you have appropriate access. For example, if retesting is required, then an additional test needs to be added.

  3. Replicate displays the test repetition number.

  4. Result is the result for the selected test. You can select from the LOV for List of Test Values and Text Range test data types. The result entered must be within the test minimum and maximum ranges for Numeric Range, Numeric Range with Display Text, or Text Range test data types, or it must be in the list of valid values for the List of Test Values test data type. You cannot edit this field for the Expression test data type.

  5. Test Alias displays an alternate name for the test. This field defaults from the Specifications window.

  6. Test Class displays a class name for the test. This field defaults from the test definition in the Tests window.

  7. Test Description is a brief description of the selected test.

  8. In Spec is determined by the application as the evaluation of conformance to the specification, and is:

    • Selected if the result conforms to the specification requirement for the test.

    • Cleared if the result does not meet the specification requirement, the test is a non-validated test data type, or the test is not part of the specification.

      The In Spec indicators for test results that have an evaluation of Void or Cancel do not contribute to determining the disposition of the sample. For example, a sample has three tests: T1, T2, and T3. Tests T1 and T2 are in-specification, and have evaluations of Accept. Test T3 is out-of-specification, and has an evaluation of Accept with Variance. The sample for these tests cannot have a disposition of Accept at this point. However, if the evaluation of test T3 is changed to Void or Cancel, then the sample can have a disposition of Accept.

    • Cleared for tests results that are guaranteed by the manufacturer.

  9. Enter Evaluation as:

    • Accept to confirm that the result meets the specification requirement.

    • Accept with Variance if the result falls within an experimental error region, but is considered acceptable.

    • Guaranteed by Manufacturer if the manufacturer guarantees that the test result meets or exceeds customer specifications. Actual test results are not entered when you use this evaluation.

    • Reject to reject the result for the test.

    • Experimental Error displays if the result falls within the experimental error regions for the test. Change the evaluation to another selection. Refer to the "Entering Experimental Error" topic for an explanation of experimental error.

    • Cancel when the test is canceled.

    • Void to indicate that the test result is not reported due to a test result processing failure. The result cannot be reported because the test method itself failed.

      Refer to the "Understanding Results" topic for a complete discussion of the business rules associated to this field.

  10. Action displays the action code set up in Oracle Inventory that is associated to the experimental error region, or the out-of-specification region for the test result. Select an action if there is no default from the specification.

  11. Optional is selected when the test is not required for the specification. You can complete a result set that is missing optional test results.

  12. Target is the expected value to meet specification requirements if the test data type is Non-Validated, Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  13. Minimum is the minimum acceptable value to meet specification requirements, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  14. Maximum is the maximum acceptable value to meet specification requirements, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field defaults from the Specifications window.

  15. Priority indicates the test priority. This defaults from the specification or the test definition. Test priority identifies the relative priority of the test among the tests listed in the specification.

  16. Actual Result Date indicates the date and time for this result.

  17. Print Additional Test on COA is selected when an additional test is included on the printout of the Certificate of Analysis.

To edit a result:

You can edit previously entered result values when the sample disposition is In Progress or Complete.

  1. Navigate to the Results window.

  2. Query for an existing sample with the sample disposition In Progress or Complete.

  3. Edit the Result in the desired test line.

  4. Enter the Result Date.

  5. Select the Evaluation if there is no default.

  6. Select the Action if there is no default.

To access the Allow Edit Results function:

Make corrections to test results anytime before assigning a final sample disposition. The edit results feature is enabled when the user responsibility has access to the Allow Edit Results function (GMD_ALLOW_EDIT_RESULT).

Changes to saved results are permitted only while the sample disposition is In Progress or Complete. The Quality Manager can correct result entry errors before finalizing the sample disposition.

To mark a result record for purge:

Choose Mark for Purge from the Actions menu. The result record is marked for deletion.

To edit text in a result:

  1. Choose Edit Text from the Actions menu. The Select Text Paragraph window displays.

  2. Make appropriate text changes.

  3. Save the window.

To display the Item Samples or Monitoring Samples windows:

  1. Choose Samples from the Actions menu. The corresponding Item Samples or Monitoring Samples window displays for the selected results set.

  2. Refer to "Entering Samples" for information on samples.

  3. Close the window.

To display the Tests window:

  1. Place the cursor on the desired test line.

  2. Choose Tests from the Actions menu.

  3. Refer to the "Entering Tests" topic for information on tests.

To display the Test Methods window:

  1. Place the cursor on the desired test line.

  2. Choose Test Methods from the Actions menu.

  3. Refer to the "Entering Test Methods" topic for information on test methods.

  4. Close the window.

To display the Specifications window:

  1. Choose Specifications from the Actions menu.

  2. Refer to the "Entering a Specification" topic for information on specifications.

  3. Close the window.

To add tests:

  1. Place the cursor in the test details region.

  2. Choose Add Tests from the Actions menu.

  3. Enter a Test Class to narrow the list of tests.

  4. Select the Test to add. The list of tests available is not limited to those tests listed on the specification.

  5. Test Qty displays the planned test quantity defined on the specification or test method in its Test Qty UOM. If the test is not part of the specification, then you can edit this field.

  6. Click Add to add the selected test to the results set.

To assign a test group to a set of results:

  1. Query the desired result set to assign a test group.

  2. Choose Add Test Group from the Actions menu. You can add tests to a results set that has a sample disposition of Pending, In Progress, or Complete.

  3. Query the desired Test Class. The window displays all tests included in that test class.

    • Ordering is the sequence of the test in the group as entered on the Tests window.

    • Test is the name of the test entered.

    • Description is the description of the test code.

    • Data Type is the test data type as described in "Understanding Tests."

    • Qty is the planned amount of material required for the test. If the test is passive and does not require a physical sample quantity, then leave this field blank.

    • UOM is the unit of measure for the test quantity defined in the test method.

  4. Select Include to copy the tests in the group to the results.

  5. Click OK. Selected tests are added to the results.

  6. Save your work.

To change the disposition of the sample:

  1. Choose Change Disposition from the Actions menu.

  2. Refer to "Changing Quality Disposition."

  3. Select a new disposition for the sample.

  4. Click OK to save changes.

To display the Composite Results window:

If there is at least one sample in the sample group, then this option is available.

  1. Choose Composite Results from the Actions menu.

  2. Refer to the "Compositing Test Results" topic for information on compositing test results.

To display the Specification Comparison window:

  1. Choose Specification Comparison from the Actions menu.

  2. Refer to the "Comparing Specifications" topic for information on comparing specifications.

To associate results from another sample:

  1. Choose Results Association from the Actions menu.

  2. Query the parent sample, then select and review the retrieved results.

  3. Click OK to associate the parent sample results selected to the current sample results record.

  4. Close the window.

    Refer to "Associating Results to Another Sample" for detailed information.

To display the parent sample for an associated result:

  1. Place the cursor on the desired test line that has a result association to a parent sample.

  2. Choose Parent Sample from the Actions menu.

Selecting Samples and Tests for Mass Results Entry

The Mass Results Entry Test Selection window lets you access the Mass Results Entry window by selecting a set of samples and tests prior to mass results entry. Select samples from the Sample Groups window, the Samples By... windows, or the Samples Summary window. You can select item, monitoring, or stability samples with a disposition of Pending, In Progress, or Complete.

You can use folders with this window.

To select samples and tests for mass entry from the Sample Groups window:

  1. Navigate to the Sample Groups window.

  2. Select the sample group that has the samples to process.

  3. Click Mass Results. The Mass Results Entry Test Selection window displays.

  4. The Test and Description display for each test. You can select up to 20 tests assigned to samples at a time.

  5. Enter the Tester as the individual who enters results. Defaults to the current user. Required.

  6. Enter the Result Date as the date and time the result was entered. Defaults to the current system date and time. Required.

  7. Click Select All to select all tests, or Select the tests to enter for the sample group. Deselect Select to remove tests you do not want to enter for the sample group.

  8. Click Mass Results Entry. The Mass Results Entry window displays to enter test results for all samples selected. You cannot enter mass results for archive or reserve sample types.

To select samples and tests for mass entry from the Samples By... windows:

  1. Navigate to the Quality Workbench.

  2. Select Show Active Samples to display only those samples that are active. Active samples are defined as those samples that have a disposition of Pending or In Progress. Deselect this indicator to display item samples with all sample dispositions.

  3. Select View By Organization, Item, Disposition, Test, Test Class, Test Method, or Laboratory to sort samples by organization, item, disposition, test, test class, test method, or laboratory.

  4. Expand the Samples By... node, and select the desired top-level node that contains multiple samples. The appropriate Samples By... window displays.

  5. Select Mass Results Entry next to each sample that you want to include in mass results entry.

  6. Click Mass Results Entry to display the Mass Results Entry Test Selection window. The Test and Description display for each test.

  7. Enter the Tester as the individual who enters the test result. Defaults to the current user. Required.

  8. Enter the Result Date as the date and time the result was entered. Defaults to the current system date and time. Required.

  9. Click Select All to select all tests, or Select the tests to enter for the sample group. Deselect Select to remove tests you do not want to enter for the samples. You can select up to 20 tests assigned to samples at a time.

  10. Click Mass Results Entry. The Mass Results Entry window displays to enter test results for all samples selected. You cannot enter mass results for archive or reserve sample types.

To select samples and tests for mass entry from the Samples Summary window:

  1. Navigate to the Samples Summary window.

  2. Select the sample group that has the samples to process.

  3. Click Mass Results Entry. The Mass Results Entry Test Selection window displays. The Test and Description display for each test.

  4. Enter the Tester as the individual who enters the test result. Defaults to the current user. Required.

  5. Enter the Result Date as the date and time the result was entered. Defaults to the current system date and time. Required.

  6. Click Select All to select all tests, or Select the tests to enter for the sample group. Deselect Select to remove tests you do not want to enter for the sample group. You can select up to 20 tests assigned to samples at a time.

  7. Click Mass Results Entry. The Mass Results Entry window displays to enter test results for all samples selected. You cannot enter mass results for archive or reserve sample types.

To select all tests for results entry:

Click Select All on the Mass Results Entry Test Selection window to select all tests for results entry.

To clear all tests selected for results entry:

Click Clear All on the Mass Results Entry Test Selection window to clear all tests selected for results entry.

Using the Mass Results Entry Window

The Mass Results Entry window lets you enter results in a spreadsheet array for numerous samples without the need for extensive navigation. The primary purpose of this window is for results data collection on samples requiring testing, where result entry validation, expression test calculations, experimental errors, and e-signatures are supported. The samples listed can represent the same sample group material, resource, or locator, and can also span different sample groups for item and lot combinations, resources, or locators.

A sample can be listed several times in the array to correspond with the number of test replicates. You can enter results for each test in a column. Tests are assigned either by a specification, or manually added to each sample. Tests displayed are not necessarily common to each sample. You can view additional details about each sample and test, such as item, lot, and test method in context with the cell selected in the array. This unified view provides increased productivity by reducing window navigation and the requirement to drill down to enter results one at a time. It also gives you a tool to identify tests that still need to be performed on samples.

Refer to "Understanding Results" and "Entering Results" for additional features available after result entry.

The following is a typical scenario for entering mass results: you run 20 samples in duplicate for a chloride test, which generates 40 chloride test results to enter. Select the samples from the Sample Groups or Samples By... windows. Use the Mass Results Entry Test Selection window to select the chloride test for results entry. Navigate to the Mass Results Entry window, enter the results, and save your work. If all test results are not complete, then return to the Mass Results Entry window, identify the samples that still require a chloride test result, perform those tests, and enter the results.

To use the Mass Results Entry window:

  1. Navigate to the Mass Results Entry window.

  2. The following fields are display only:

    • Sample is the combination of the three-letter organization code hyphenated with the sample number.

    • Replicate is the replicate number for the test listed.

  3. Enter the test result for each test replicate.

    • If the cell is disabled from result entry, then the test is not assigned to the sample or your responsibility does not have the privilege to access the laboratory organization.

    • If a result was previously entered for the sample and test combination, then it appears with the Tester and Result Date in the array, and it cannot be edited.

  4. Click Test Details to display the Test Details window to view detailed information for the current test and sample.

  5. Click Experimental Error to display the Experimental Errors window to review any results that fall into an experimental error region and the associated action code assigned by the specification or test definition.

  6. Save your work.

To calculate expression test results:

Click Calculate Expression to calculate an expression test.

Refer to "Understanding Results" for additional information.

To display experimental errors:

Click Experimental Error to display the Experimental Errors window.

Refer to "Displaying Experimental Errors from Mass Results Entry" for additional information.

To display test details:

Click Test Details to display the Test Details window where you can view detailed information for the current test and sample.

Refer to "Displaying Test Details from Mass Results Entry" for additional information.

Displaying Test Details from Mass Results Entry

The Test Details window displays the sample, test, and test method information associated to a specific sample when you select a cell in the Mass Results Entry window. Each cell represents a sample and test combination. This window can remain open. It refreshes as you change cursor focus in the Mass Results Entry window to reflect the current information associated to the cell.

To display test details from the Mass Results Entry window:

  1. Navigate to the Test Details window.

  2. The following fields are display only:

    • Sample Description is the description for the sample.

    • Item is the process quality-enabled item associated to the sample.

    • Revision is the item revision, if the item is revision-controlled.

    • Parent Lot is the parent lot number. This displays if the item is child lot-controlled.

    • Lot is the sample lot.

    • Subinventory is the subinventory code for the sample if its source is either inventory or locator.

    • Locator is the locator code for the subinventory if the sample source is either inventory or locator.

    • Resource is the resource for the sample if its source is resource.

    • Test Method is the test method assigned to the test.

    • Test Unit is the unit of measure used for reporting test results.

    • Test Quantity is the planned quantity of material required for testing. This field defaults from the test method or specification. UOM displays as the inventory UOM for the planned quantity.

    • Batch is the batch number for a WIP sample.

    • Optional is selected if the test is marked as optional on the specification. Otherwise, the test is required.

    • Display Label is the text displayed for a Numeric Range with Display Text test type.

    • Expression is the mathematical expression used to calculate an expression test type.

  3. Click Close.

Displaying Experimental Errors from Mass Results Entry

The Experimental Errors window summarizes any results entered on the Mass Results Entry window that fall into the experimental error regions. It also displays the assigned actions for these results as defined on a test or specification.

Refer to "Entering Experimental Error" for additional information.

To display experimental errors from the Mass Results Entry window:

  1. Navigate to the Experimental Errors window.

  2. The following fields are display only:

    • Sample is the combination of the three-letter organization code hyphenated with the sample number.

    • Test is the code for the test in the current specification.

    • Replicate is the replicate number for the test. The replicate corresponds to the number of times the test is assigned or added to the sample.

    • Result is the test result entered on the Mass Results Entry window.

    • Action Code is the action code associated to the experimental error region for the test result.

Finding Samples from the Results Window

The Find Samples dialog box accessed from the Results window lets you find a sample by sample number, disposition, item, revision, lot, or marked-for-deletion status.

To find a sample from the Results window:

  1. Navigate to the Find Samples dialog box. The context organization displays in the window title. Select a different organization using the Organizations window.

  2. Enter any of the following to narrow your search:

    • Sample as the sample number.

    • Disposition of the sample as Pending, In Progress, Complete, Accept, Accept with Variance, Reject, or Cancel. If you leave this field blank, then it does not include sample disposition in the search.

    • Item as the process quality-enabled item associated to the sample. The item description displays.

    • Revision as the item revision, if the item is revision-controlled.

    • Lot for the item associated to the sample.

  3. Select Marked for Deletion as:

    • Yes to find samples that are marked for deletion.

    • No to find samples that are not marked for deletion.

    • Leave this field blank if you do not want to limit your search based on whether a sample is marked for deletion.

  4. Click Find. The Results window displays the first sample that matches the search criteria.

Compositing Test Results

The Composite Results window lets you aggregate results across samples for the same event or sample group. This window summarizes the composite statistics for results for each test. If results were previously composited, then you can view the existing composite results. Otherwise, the composite result set is created, and it incorporates the most recent data for the samples and results in the sample group. You can view the result for each sample that contributes to the statistics for each test. You can also create samples, and add tests to the composite results set.

E-Record and E-Signature Approvals

There may be an e-signature event associated with this window. Refer to "Appendix D" for additional information on e-record and e-signature approvals associated with this window.

To composite test results:

  1. Navigate to the Sample Groups window.

  2. Select the row of the sample group to composite. Each row begins with a Source, followed by the Item and Description.

  3. Click Composite Results. The Composite Results window displays.

  4. The following fields describe the sample group, and are display only:

    • Sample Taken indicates the number of samples taken for the event.

    • Disposition indicates the disposition of the composite results set for the current specification associated to the sample group.

    • Item indicates the process quality-enabled item code for the material sampled.

    • Revision indicates the item revision.

    • Parent Lot displays the parent lot number, if the item is child lot-controlled.

    • Lot displays the lot number, if the item is lot-controlled.

    • LPN indicates the LPN number for the sample.

    • Specification displays the specification name.

    • Version displays the specification version.

  5. The following fields describe each composite result row, and are display only:

    • Test indicates each test performed on the item listed.

    • Mean indicates the arithmetic mean of the results composited.

    • Median indicates the median value of the results composited.

    • Mode indicates the most frequently occurring value of the results composited. If no value occurs more frequently than any other, then no value displays.

    • Low indicates the lowest result value of the results composited.

    • High indicates the highest result value of the results composited.

    • Range indicates the range of values for the results composited. For numeric based tests, this is the high result value minus the low result value.

    • Deviation indicates the standard deviation of the results composited.

    • Tests Used indicates the number of test results that comprise the statistics. The number of tests used is equal to or less than the total number of result records used to composite the test.

  6. Enter the Non-Validated Result for the Non-Validated test data type.

  7. The following fields are display only:

    • In-Spec indicates that the composite result is in-specification.

    • Total Tests indicates the total number of test results available for composite results.

    • Associated Result indicates that the composite result row reflects results associated from a parent sample group.

    • Report Precision indicates the reporting decimal precision for the test according to the specification.

    • Unit indicates the quality test unit of measure for the selected test.

    • Test Method indicates the test method code for the test.

    • Test Data Type indicates the test data type.

    • Test Description indicates a brief description of the test.

  8. Select Accept to accept the test results.

  9. Click Save Composite to save the composited results.

To view existing composite results:

  1. Select the row of the sample group to view.

  2. Click Composite Results.

  3. You can composite results as described in steps 4 to 6 of the previous task.

  4. Click Save Composite to save the composited results.

To view samples contributing to the composite result for a test:

  1. Select the row in the sample event or sample group.

  2. Click View Samples to display the samples used to composite results for the selected test. Refer to the "Viewing Samples for Composited Results" topic to view the samples for a composited test. You can exclude samples from the composite result set and recalculate the statistics displayed on this window.

To add samples to the composited results:

  1. Click Add Sample to add a sample to the sample group.

  2. Refer to the "Entering Samples" topic for more information on creating samples.

  3. Save the window.

To add a test to the composited results:

Click Add Tests to display the Add Tests window, where you add a test to the composited results.

Refer to "Adding Tests to Composited Results" for additional information.

Viewing Samples for Composited Results

The View Samples For Test window lets you view samples in the sample group that comprise the composited results for a specific test. You can exclude samples, and recalculate the statistics displayed on this window.

You can use flexfields and attachments with this window.

See: Oracle E-Business Suite User's Guide and Oracle E-Business Suite Flexfields Guide

Prerequisites

To display samples for composited results:

  1. Navigate to the Sample Groups window.

  2. Select the row of the desired sample group.

  3. Click Composite Results.

  4. Click View Samples. The View Samples For Test window displays. The sample organization displays in the window title.

  5. The following fields are display only for each selected test:

    • Target indicates the specification target. This field displays only if a target was defined, and the test is part of the current specification tied to the sample.

    • Minimum indicates the specification minimum, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field displays only if the test is part of the current specification tied to the sample.

    • Maximum indicates the specification maximum, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field displays only if the test is part of the current specification tied to the sample.

    • Test Method indicates the test method code for the test.

    • Unit indicates the quality test unit of measure specified at the test level for the selected test.

    • Test Data Type indicates the test data type entered for the test.

  6. The following fields list the samples that comprise the composite result for each selected test:

    • Sample indicates the combination of the three-letter organization code and the sample number.

    • Result indicates the test result for the sample.

  7. Click Exclude to exclude the sample from the composited result for the test.

  8. The following fields are display only:

    • Result Date indicates the date and time the result was entered.

    • Tester indicates the person performing the test. The tester can be different from the individual who enters results.

    • Parent Sample indicates the combination of the three-letter organization code and the parent or source sample number, if one exists. The parent sample can belong to an organization that is different from the sample organization.

    • Disposition indicates the disposition for the sample as Pending, In Progress, Complete, Accept, Accept With Variance, Reject, or Cancel.

      Refer to "Understanding Sampling" for a complete explanation of sample disposition.

    • Reserve Sample indicates the retained sample, held for reserve use, that is used to perform the test. Test quantities are consumed from this sample.

  9. The following fields describe each selected test, result, and sample, and are display only:

    • Description indicates a description of the sample.

    • Source indicates the source of the sample as Customer, Inventory, Physical Location, Resource, Supplier, Stability Study, or WIP.

  10. The following fields are display only and summarize the statistics of results that are not excluded from the composite:

    • Mean indicates the arithmetic mean of the results composited.

    • Median indicates the median value of the results composited.

    • Mode indicates the mode value of the results composited.

    • Low indicates the lowest result of the results composited.

    • High indicates the highest result for the test in results composited.

    • Range indicates the range of values for results composited. For numeric-based tests, this is the high result minus the low result.

    • Deviation indicates the standard deviation of the results composited.

    • Tests Used indicates the number of test results that comprise the statistics. This number is less than or equal to the total number of result records used to composite the test.

    • Total Tests indicates the total number of tests used to composite the results.

To recalculate composite results:

  1. Select or deselect Exclude.

  2. Click Recalculate. The statistics displayed recalculate according to the changes made.

  3. Click OK.
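
As an illustration of the recalculation, the following rough sketch filters out the excluded samples and recomputes the displayed statistics; the data shapes and sample numbers are assumptions.

    import statistics

    def recalculate(sample_results, excluded):
        """Recompute the displayed statistics over samples not marked Exclude.
        `sample_results` maps sample number to result value; `excluded` is
        the set of sample numbers deselected from the composite."""
        included = [r for s, r in sample_results.items() if s not in excluded]
        return {
            "mean": statistics.mean(included),
            "low": min(included),
            "high": max(included),
            "range": max(included) - min(included),
            "deviation": statistics.stdev(included) if len(included) > 1 else 0.0,
            "tests_used": len(included),
        }

    # Exclude an outlier sample, then recalculate.
    print(recalculate({"PR1-0001": 4.1, "PR1-0002": 4.3, "PR1-0003": 9.9},
                      {"PR1-0003"}))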

Adding Tests to Composited Results

The Add Tests window lets you add a test to the results set of samples in the sample group. If the sample group disposition is Accept, Accept with Variance, or Reject, then tests cannot be added.

When adding a test, the planned test quantity and UOM fields default into the Results window from the Specification when the test is part of the specification, or from the Test Method when the test is not part of the specification.
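
As a sketch of this defaulting rule, assuming the specification and test method each carry a planned quantity and UOM per test (the names and data shapes are illustrative):

    def default_test_qty(test, spec_tests, method_defaults):
        """Default the planned test quantity and UOM for an added test:
        taken from the specification when the test is part of it,
        otherwise from the test method."""
        if test in spec_tests:
            return spec_tests[test]       # (qty, uom) defined on the specification
        return method_defaults[test]      # (qty, uom) defined on the test method

    print(default_test_qty("PH", {"PH": (10.0, "mL")}, {"PH": (5.0, "mL")}))
    # (10.0, 'mL')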

You can use flexfields with this window.

See: Oracle E-Business Suite User's Guide and Oracle E-Business Suite Flexfields Guide

Prerequisites

To add a test to the Composite Results window:

  1. Navigate to the Sample Groups window.

  2. Select the row of the desired sample group.

  3. Click Composite Results. The Composite Results window displays. The context organization displays in the window title.

  4. Click Add Tests. The Add Tests window displays.

  5. Enter the Test to add. The test Description displays. Required.

  6. Enter the Test Qty as the planned quantity of material required to perform this test. Test Qty UOM displays the unit of measure in Oracle Inventory for the test method. This UOM must be convertible to the primary UOM for the item.

  7. Select Include to add the test to the selected sample.

  8. The following fields are display only:

    • Sample is the combination of the three-letter organization code and the sample number.

    • Disposition indicates the disposition of the sample as Pending, In Progress, Accept, Accept with Variance, Complete, Cancel, or Reject.

    • Description indicates a description of the sample.

    • Source indicates the source of the sample as Inventory, WIP, Customer, or Supplier.

    • Target indicates the specification target. This field displays only if a target was defined, and the test is part of the current specification tied to the sample. Display indicates the display text for the specification target for the test, if the test type is Numeric Range with Display Text.

    • Minimum indicates the specification minimum, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field displays only if the test is part of the current specification tied to the sample. Display indicates the display text for the specification minimum for the test, if the test type is Numeric Range with Display Text.

    • Maximum indicates the specification maximum, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression. This field displays only if the test is part of the current specification tied to the sample. Display indicates the display text for the specification maximum for the test, if the test type is Numeric Range with Display Text.

    • Unit indicates the quality test unit of measure specified at the test level for the selected test.

    • Method indicates the test method entered for the test.

    • Test Data Type indicates the test data type entered for the test.

    • Expression indicates the expression used for the Expression type test.

  9. Click Add.

Comparing Specifications

The Specification Comparison window lets you compare the current specification associated to a specific sample with a comparison specification that has a status of Approved in a specific process-enabled inventory organization, to determine how closely the results set meets the comparison specification. These comparisons are useful for:

The results evaluated against the current specification are displayed when you access the Specification Comparison window from the Item Samples or Results windows, since appropriate information about the sample and sample group is available. The specification comparison functionality does not apply to retained, monitoring, or stability study samples. The current specification must have a status of Approved for Laboratory Use or Approved for General Use.

With the current specification displayed for an item, you can select a comparison specification using the LOV in the Spec field of the Comparison Specification section. A comparison displays. If no specification is tied to the sample and its results, then you can select a comparison specification using the Spec LOV.

You have the following options:

You can use flexfields with this window.

See: Oracle E-Business Suite User's Guide and Oracle E-Business Suite Flexfields Guide

Prerequisites

To compare results against current and comparison specifications:

  1. Navigate to the Specification Comparison window. The context organization displays in the window title. The Find Samples dialog box displays.

  2. Query the Sample to use for specification comparison in the Find Samples dialog box. Required.

  3. Disposition indicates the disposition of the sample as:

    • Pending when the sample is saved, and results are expected, but not yet recorded. Default.

    • In Progress when at least one result exists for the sample, but testing is not yet complete.

    • Complete when testing of all required tests and all additional tests is complete.

    • Accept when sample testing is complete, and results are in-specification.

    • Accept With Variance when sample testing is complete, and at least one result is in the experimental error region or out-of-specification. However, the sample approver has accepted this with variance.

    • Reject when sample testing is complete, and results are not acceptable, or out-of-specification.

    • Cancel when the sample is canceled.

  4. Item displays the process quality-enabled item associated to the sample.

  5. Revision displays the item revision.

  6. Parent Lot displays the parent lot number if specified on the sample.

  7. Lot displays the lot number if specified on the sample.

  8. LPN displays the LPN number associated to this sample. Click Find.

  9. Enter Spec as the comparison specification for reviewing results in the Comparison Specification section. You can select the LOV for the Spec field to find a specification based on grade, or an Inventory, WIP, Customer, or Supplier Specification Validity Rule.

  10. The following fields are display only:

    • Version indicates the version of the comparison specification.

    • Grade indicates the item grade for the comparison specification.

    • Status indicates the status of the comparison specification and version.

Current Specification

  1. The following fields are display only:

    • Spec indicates the current specification associated to the sample. Required.

    • Version indicates the current specification version.

    • Grade indicates the item grade for the current specification.

    • Status indicates the status of the current specification.

    • Test indicates the code for the test in the current specification.

    • Result indicates the latest result replicate for the selected test for the current specification. The result must be within the test minimum and maximum ranges for Numeric Range, Numeric Range with Display Text, or Expression test data types, or it must be in the list of valid values for the List of Test Values or Text Range test data types.

    • Target indicates the target for the current specification for the selected test.

    • Min indicates the specification minimum for the current specification, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression.

    • Max indicates the specification maximum for the current specification, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression.

    • In Spec is selected if the result is within the specification minimum and maximum, or in conformance with the current specification. In Spec is cleared if the result is outside the specification minimum and maximum or the test is not part of the specification.

Comparison Specification

  1. The following fields are display only:

    • Target indicates the target for the comparison specification, if defined, for the selected test.

    • Min indicates the specification minimum for the comparison specification, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression.

    • Max indicates the specification maximum for the comparison specification, if the test data type is Numeric Range, Numeric Range with Display Text, Text Range, or Expression.

    • In Spec is selected if the result is within the specification minimum and maximum, or in conformance with the comparison specification. In Spec is cleared if the result is outside the specification minimum and maximum or the test is not part of the comparison specification. (A sketch of this evaluation appears after the field list below.)

  2. Description is the test description.

  3. Type is the test data type.

  4. Unit is the test unit for the test.

  5. Method is the test method.

  6. Class is the class code for the test data type.
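
The following minimal sketch illustrates the In Spec evaluation described above, applying the same result to both specifications; the limits shown are illustrative.

    def in_spec(result, minimum=None, maximum=None):
        """Return True when a numeric result falls within the specification
        minimum and maximum; a missing bound is treated as unbounded."""
        if minimum is not None and result < minimum:
            return False
        if maximum is not None and result > maximum:
            return False
        return True

    result = 7.2
    current = {"minimum": 6.5, "maximum": 7.5}      # current specification limits
    comparison = {"minimum": 7.5, "maximum": 8.5}   # comparison specification limits
    print(in_spec(result, **current), in_spec(result, **comparison))  # True False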

To use the comparison specification:

  1. Click Use Comparison Spec. The comparison specification becomes the current specification, and the sample disposition is reset to Complete. If tests are added, then the disposition is set to In Progress.

  2. You have the options to:

    • Select a new comparison specification.

    • Select a new comparison specification and add tests to the results set.

    • Change the disposition based on the new specification to Accept, Accept with Variance, or Reject.

  3. Save the window.

To add tests from the comparison specification:

  1. Click Add Tests. The tests from the comparison specification that are not part of the current specification are added to the end of the results set.

  2. If workflow is enabled, then required test notifications are sent.

  3. Save the window.

To view the sample associated to the sample group:

Click View Sample to display the Item Samples window. Refer to the "Samples" topic for more information on samples.

To display the current specification:

Choose Current Specification from the Actions menu. Refer to the "Specifications" topic for information on specifications. You can only view the specification.

To display the comparison specification:

Choose Comparison Specification from the Actions menu. Refer to the "Specifications" topic for information on specifications. You can only view the specification.

To change disposition:

Choose Change Disposition from the Actions menu. Refer to the "Changing Quality Disposition" topic for information on changing the disposition of a sample.

To display the Item Samples window:

Choose Sample from the Actions menu. Refer to the "Samples" topic for information on samples.

Finding Samples for Specification Comparison

The Find Samples dialog box lets you find a sample that is currently associated to a specification that you can use for comparison to another specification.

Prerequisites

To find samples for specification comparison:

  1. Navigate to the Find Samples dialog box. The context organization displays in the window title.

  2. Enter any of the following to narrow your search:

    • Sample as the sample number.

    • Item as the item sampled.

    • Revision as the item revision, if the item is revision-controlled.

    • Parent Lot as the parent lot sampled.

    • Lot as the lot sampled.

    • LPN as the LPN number for the sample.

    • Disposition as Pending, In Progress, Accept, Accept with Variance, Complete, Cancel, or Reject. If you leave this field blank, then sample disposition is not included in the search.

  3. Click Find. The Specification Comparison window displays any current specification that matches the parameters entered.

Changing Quality Disposition

The Quality Change Disposition dialog box lets you change the disposition of a sample or sample group to indicate the final review of the Quality Manager. When the sample or sample group disposition is finalized to Accept, Accept with Variance, Reject, or Cancel, then additional tests cannot be added. Changing to a final sample group disposition of Accept, Accept with Variance, or Reject can also update lot status, grade, hold date, and the item lot UOM conversion factors between UOM classes. The disposition of a sample or sample group can be assigned automatically or manually according to preseeded business rules. Sample groups that have only one sample have the same disposition as the single sample in that group.

You can access this dialog box from the following windows:

A sample or sample group can represent the quality characteristics of a single lot or multiple lots, indicated by specifying a parent lot only. For samples against a particular lot, the final sample group disposition can update lot attributes such as material status and grade for the lot specified. For samples against a parent lot only, the lot attributes update can apply to all lots one level below the parent lot. Refer to the Oracle Inventory User's Guide for detailed information.
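
The following rough sketch illustrates which lots a final disposition can update under this rule; the data shapes are assumptions.

    def lots_to_update(sampled_lot, parent_lot, child_lots_of):
        """Return the lots whose attributes a final sample group disposition
        can update: the sampled lot itself when one is specified, otherwise
        all lots one level below the parent lot."""
        if sampled_lot:
            return [sampled_lot]
        return child_lots_of.get(parent_lot, [])

    print(lots_to_update(None, "PL-100", {"PL-100": ["PL-100-A", "PL-100-B"]}))
    # ['PL-100-A', 'PL-100-B']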

Assigning Sample Disposition Automatically

The sample disposition is set automatically based on the status of sampling and testing as follows:

Assigning Sample Disposition Manually

You can change the sample disposition manually as follows:

Assigning Sample Group Disposition Automatically

The sample group disposition reflects the disposition of the samples that comprise the sample group for an event. This includes receiving, batch step release, and lot expiration. Disposition is set to:

Assigning Sample Group Disposition Manually

You can change the sample group disposition manually as follows:

Assigning a Lot during Sample Group Disposition

Delayed lot entry lets you create a sample without entering a lot number. Identification of the lot is postponed from sample creation until the lot is known, at the time a final disposition is assigned before movement into inventory. This delay accommodates sampling processes in which the material is analyzed before delivery to inventory. When assigning a final sample group disposition, you can specify a lot, so that quality-driven updates to the lot status, grade, hold date, reason code, and the recommended lot-specific unit of measure conversions can occur simultaneously. For WIP and Supplier samples, the lot must belong to the batch or receipt specified at the time of sampling. Refer to "Delaying the Entry of Lot Numbers" for detailed information.

Changing Lot Status

The application evaluates and defaults the lot status of a sampled lot based on whether the sample group meets the associated specification. You can also change the lot grade if the results meet the specification grade. Lot status and grade are used to allocate inventory properly for production and shipment, or to prevent failed lots from being sold or shipped. These decisions are based on your individual business requirements.

A sample taken against a lot-controlled item and a location-controlled subinventory can change the lot status for the subinventory and locator specified by the sample. You can perform a lot status change if the sample disposition is changed to Accept or Accept with Variance for in-specification results, or to Reject for out-of-specification results.
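
The following sketch illustrates this defaulting rule, using the Lot Status and Out-of-Spec Lot Status defined on the specification validity rule; the status codes are illustrative.

    def default_lot_status(disposition, lot_status, out_of_spec_lot_status):
        """Default the new lot status from the specification validity rule:
        Accept and Accept with Variance take the in-spec Lot Status, Reject
        takes the Out-of-Spec Lot Status, and other dispositions make no
        automatic change."""
        if disposition in ("Accept", "Accept with Variance"):
            return lot_status
        if disposition == "Reject":
            return out_of_spec_lot_status
        return None

    print(default_lot_status("Reject", "GOOD", "QUARANTINE"))  # QUARANTINE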

Changing Lot Hold Date

The Quality Change Disposition dialog box lets you update the hold release date for the sampled lot when changing the sample group disposition to Accept, Accept with Variance, or Reject. The lot hold date is determined by the normal duration of quality inspection or the post-processing lead time after the lot is created in inventory. The Quality Manager can change the default hold date to release the lot earlier or later, depending on the required period of analysis. The lot must be specified on the sample, and Control Lot Attributes must be enabled on the associated specification validity rule, to display and update the lot hold date at the time of final disposition.

Selecting an Inventory Transaction Reason Code

When you select an inventory transaction reason code on this window, only those reason codes that you have been granted permission to access are displayed. To view a reason code on this window, your responsibility must have security enabled for the STSI and GRDI document types.

Entering UOM Conversions

Refer to the "Using Lot UOM Conversions Based on Quality Results" for a complete explanation of how to enter lot unit of measure conversions based on quality results.

Refer to the Oracle Inventory User's Guide for more details on changing lot status and grade.

Prerequisites

To change the quality disposition of a sample:

  1. Navigate to the Quality Change Disposition dialog box.

  2. The following fields are display only:

    • Sample indicates the combination of the three-letter organization code and the sample number. This field is blank for a sample group.

    • Item indicates the process quality-enabled item sampled.

    • Revision indicates the item revision.

    • Parent Lot indicates the parent lot sampled.

    • Lot indicates the lot sampled.

    • LPN indicates the LPN number for the sample.

      Note: If the Delayed LPN Entry checkbox is selected for the chosen validity rule in the WMS-enabled organization, then you can enter the LPN number for the sample if the LPN number is blank. Any change to material status is maintained at the lot level. By performing lot splits, you can associate different statuses with material that is assigned to the same LPN.

    • Spec displays the specification, if one is associated to a sample, and the specification Version.

Disposition

  1. Current Disposition displays the present disposition of the sample.

  2. Retain As displays the sample type as Archive or Reserve sample.

  3. Enter Change Disposition To as:

    • Pending when the sample is saved, and results are expected, but not yet recorded. This is the default disposition.

    • In Progress when at least one result exists for the sample, but testing is not yet complete.

    • Complete when testing of all required tests and all additional tests is complete.

    • Accept when sample testing is complete, and results are in-specification.

    • Accept With Variance when sample testing is complete, and at least one result is in the experimental error region or out-of-specification. However, the sample approver has accepted this with variance.

    • Reject when sample testing is complete, and results are not acceptable, or out-of-specification.

    • Retain when the sample is reserved and put in storage. Typically, results do not exist for a sample with this disposition.

    • Cancel when the sample is canceled.

      Required.

  4. Select Schedule Specification Tests when you are changing an archive sample from a disposition of Retain to Pending.

    • Select this indicator to add all tests listed on the associated specification to the Results window.

    • Deselect this indicator to add tests directly to the sample on the Results window.

  5. Change Lot Status From displays the current status of the lot.

  6. Enter Change Lot Status To as the new lot status. This field defaults to the Lot Status or the Out-of-Spec Lot Status that is defined on the specification validity rule if the Control Lot Attributes indicator is selected.

  7. Change Grade From displays the current lot grade.

  8. Enter Change Lot Grade To as the new lot grade. This field defaults to the grade specified on the associated specification if the Control Lot Attributes indicator is selected on the specification validity rule.

  9. Enter a Reason Code for a lot status change, grade change, or both. This defaults from the Process Quality Parameters window, and is restricted by reason code security.

  10. Enter a Hold Date as the date the lot is released from quality hold for inspection. This date defaults from the Item Lots window in the Oracle Inventory application, and can be changed to reflect expedited or delayed testing.

UOM Conversions

  1. Select Recommend to approve the displayed UOM conversion factor for storage and submission as the item lot UOM conversion. Deselect this indicator if you do not want the conversion sent to Oracle Inventory.

  2. Test displays the test used to process the conversion. This can be a numeric or an expression test. Refer to the "Using Lot UOM Conversions Based on Quality Results" topic for an explanation of how this test is entered and how it is used in the UOM conversion process.

  3. From UOM displays the item primary UOM on the specification. Each of the conversions corresponds to a test on the specification where the Calculate UOM Conversion indicator is selected, the result value is entered, and the result status is not Cancel or Void. For example, the From UOM is kilograms, which is defined as a Mass unit of measure type. The base unit of measure for Mass is pounds.

  4. Current conversion displays the value of the existing lot UOM conversion for the item lot between the item primary UOM and the To UOM (the unit of measure to convert to). If this conversion exists, then it is converted back to the item primary UOM and the To UOM for display, because conversions are stored in the base units.

  5. Replicate displays the test replicate number.

  6. Proposed conversion displays the test result, which is the proposed value for the lot UOM conversion for the item. The result for the specification test is used to calculate the UOM conversion.

  7. To UOM displays the unit of measure to convert to, as specified in the test listed in the specification. For example, the To UOM is liters, which is defined as a Volume unit of measure type. The base unit of measure for Volume is gallons.

  8. Proposed Conversion in Base UOM expresses the conversion in base units: the item base UOM (first field) equals the calculated conversion factor (second field) multiplied by the base To UOM (third field). For example, one pound equals X gallons, where X is the calculated conversion factor. The second line shows the inverse conversion. (A sketch of this arithmetic follows this procedure.)

  9. Click OK.
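
To make the base-unit arithmetic in this procedure concrete, the following sketch uses the same example assumptions: the From UOM is kilograms (Mass, base unit pounds), the To UOM is liters (Volume, base unit gallons), and the test result is assumed to express liters per kilogram for the lot. The numbers are illustrative only.

    KG_PER_LB = 0.45359237      # one pound expressed in kilograms
    GAL_PER_LITER = 0.26417205  # one liter expressed in gallons

    def proposed_conversion_in_base(liters_per_kg):
        """Express a lot-specific conversion taken from the test result
        (liters per kilogram) in base units (gallons per pound), and
        return the factor together with its inverse."""
        gal_per_lb = liters_per_kg * KG_PER_LB * GAL_PER_LITER
        return gal_per_lb, 1.0 / gal_per_lb

    factor, inverse = proposed_conversion_in_base(1.05)  # test result: 1 kg = 1.05 L
    print(f"1 lb = {factor:.6f} gal;  1 gal = {inverse:.6f} lb")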

Associating Results to Another Sample

The Result Association window lets you associate test results from a parent (source) sample to a child (target) item sample. You can associate results for a sample or sample group. This process prevents redundant data entry, and it maintains an association to the parent test result or parent composited test result. One benefit of result association is the ability to leverage intermediate results for finished goods reporting.

In the results association process, all test results that are common to the parent and child sample or sample groups are associated. Parent sample or sample group results can be associated repeatedly with several other child samples or child sample groups. You cannot undo a result association after the child sample results are saved. If you make a mistake in associating a result, then you can delete the child sample, or void the result that was inherited through the association.

After results are associated to the child sample, you can review them and add new tests or replicates as required. Enter results for these additions and save your work. Generate a Certificate of Analysis based on the revised child sample.

Associated results appear on the Quality Workbench indicating the parent sample number. Tests with inherited results do not initiate Oracle Process Manufacturing (OPM) Quality Test Workflow Notifications, and do not contribute to the composite result statistics when the child sample results are aggregated.
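
As an illustration of the association rule, in which only tests common to the parent and child are inherited, the following rough sketch shows the matching step; the data shapes and test codes are assumptions.

    def associate_results(parent_results, child_tests):
        """Copy parent results into the child for tests common to both.
        `parent_results` maps test code to the latest result value;
        `child_tests` is the set of tests scheduled on the child sample."""
        inherited = {test: result for test, result in parent_results.items()
                     if test in child_tests}
        # Inherited results keep a link back to the parent, do not trigger
        # test workflow notifications, and are excluded from the child's
        # composite statistics.
        return inherited

    print(associate_results({"PH": 6.8, "VISCOSITY": 1.05}, {"PH", "COLOR"}))
    # {'PH': 6.8}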

Prerequisites

To associate results for a single sample from the Results window:

  1. Navigate to the Results window.

  2. Query the child Sample.

  3. Choose Result Association from the Actions menu. The Result Association window displays.

  4. Select any of the following fields to find a parent sample:

    • Organization is the organization of the sample.

    • Item is the sample item and its description.

    • Parent Lot is the optional sample parent lot.

    • Lot is the optional sample lot.

  5. Enter Source as the source of the parent sample:

    • Customer indicates that the sample source is from a customer.

    • Inventory indicates that the sample source is from inventory.

    • Supplier indicates that the sample source is from a supplier.

    • WIP indicates that the sample source is from work in process.

  6. Select the desired Parent Sample Group based on the selection criteria. Required. The Organization, Creation Date, and Disposition of the sample group are retrieved.

Include

  1. Select Text to include text associated from the parent sample to the child sample.

  2. Select Flexfield to include the flexfield associated from the parent sample to the child sample.

  3. Select Attachment to include the attachment associated from the parent sample to the child sample.

Result Association

  1. The following fields are display only:

    • Test displays the predefined test code for each result.

    • Parent Replicate is blank because a composite result is used.

    • Result displays the composite test result from the parent sample group.

    • Result Date displays the date the result was entered on the parent sample.

    • Child Replicate displays the replicate number for the result in the child sample.

  2. Select the test record.

  3. Click OK to associate the parent sample results to the child sample.

To associate results from the Composite Results window:

  1. Navigate to the Composite Results window.

  2. Query the child Sample Group.

  3. Choose Result Association from the Actions menu. The Result Association window displays.

  4. Query any of the following fields to find a parent sample group:

    • Organization is the organization of the sample.

    • Item is the process quality-enabled sample item and its description.

    • Parent Lot is the optional sample parent lot.

    • Lot is the optional sample lot.

  5. Enter Source as the source of the parent sample group:

    • Customer indicates that the sample source is from a customer.

    • Inventory indicates that the sample source is from inventory.

    • Supplier indicates that the sample source is from a supplier.

    • WIP indicates that the sample source is from work in process.

  6. Select the desired Parent Sample based on the selection criteria entered. Required.

Include

  1. Select Text to include text associated from the parent sample group to the child sample group.

  2. Select Flexfield to include the flexfield associated from the parent sample group to the child sample group.

  3. Select Attachment to include the attachment associated from the parent sample group to the child sample group.

Result Association

  1. The following fields are display only:

    • Test displays the predefined test code for each result.

    • Parent Replicate displays the highest acceptable replicate of the most recent result for the parent sample.

    • Result displays the test result.

    • Result Date displays the date the result was saved.

    • Child Replicate is blank because a composite result is used.

  2. Select the test record.

Parent Group Samples

  1. The following fields are display only:

    • Organization is the organization where the sample is drawn.

    • Sample is the combination of the three-letter organization code hyphenated with the sample number.

  2. Click OK to associate the composited results to the target sample group.

To display the parent sample for an associated result:

  1. Place the cursor on the desired test line that has a result association to a parent sample.

  2. Choose Parent Sample from the Actions menu. The Samples window displays the parent sample for the associated result.

  3. Close the window.