Test
Complete functional, performance, and user acceptance testing to ensure a successful implementation. The specific testing requirements may vary depending on how you implement and customize the application.
Ensure that you:
- Use a variety of testing techniques. In addition to functional and performance testing, consider usability testing, security testing, system integration testing, and regression testing to ensure that the Fusion Data Intelligence artifacts meet all business requirements.
- Use realistic test data that accurately reflects the data in the production Fusion Data Intelligence instance. This helps ensure that the workbooks work as expected and return accurate results. A quick volume check, such as the sketch after this list, can confirm that the test data resembles production.
- Involve business stakeholders in testing to ensure the development objects meet business needs and requirements.
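
The following is a minimal sketch of such a volume check, assuming python-oracledb connectivity to both the production and test Autonomous Data Warehouse instances. The table names, DSN aliases, credentials, and the 80% threshold are placeholders for your own environment, not part of the product:

```python
# Minimal sketch: sanity-check that test data volumes resemble production.
# Table names, DSN aliases, and credentials below are placeholders.
import oracledb

TABLES = ["DW_GL_JOURNAL_F", "DW_AP_INVOICE_F"]  # hypothetical warehouse tables

def row_counts(dsn: str, user: str, password: str) -> dict[str, int]:
    """Return row counts per table for one Autonomous Data Warehouse instance."""
    counts = {}
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cursor:
            for table in TABLES:
                cursor.execute(f"SELECT COUNT(*) FROM {table}")
                counts[table] = cursor.fetchone()[0]
    return counts

prod = row_counts("prod_low", "ADMIN", "...")   # placeholder credentials
test = row_counts("test_low", "ADMIN", "...")

for table in TABLES:
    ratio = test[table] / max(prod[table], 1)
    flag = "OK" if ratio >= 0.8 else "REVIEW"  # arbitrary 80% threshold
    print(f"{table}: prod={prod[table]}, test={test[table]}, ratio={ratio:.0%} [{flag}]")
```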
Review the following testing phases:
- Functional testing - As a functional administrator:
    - Plan to grant appropriate application roles for each functional activity.
    - Plan to test typical functional activities, such as creating a diversity report.
    - Perform role-based testing of Fusion Data Intelligence metrics.
    - Validate data after loading it into Fusion Data Intelligence. Validate the common library of metrics that exists in both Fusion Data Intelligence and Oracle Transactional Business Intelligence; a reconciliation sketch follows this list.
    - Validate the functional security setup in Fusion Data Intelligence and its alignment with the source system data.
    - Clearly define functional scenarios and use cases.

    See Validate Your Data.
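
As one way to approach the metric validation noted above, the following sketch reconciles a common metric between an Oracle Transactional Business Intelligence export and the Fusion Data Intelligence warehouse. It assumes the OTBI totals were exported to a CSV file; the file name, column names, fact table, and credentials are hypothetical:

```python
# Minimal sketch: reconcile a common metric between an OTBI export and the
# Fusion Data Intelligence warehouse. Assumes the OTBI totals were exported
# to a CSV (period, amount); table and column names are placeholders.
import csv
import oracledb

# Load the OTBI baseline, e.g. total invoice amount per fiscal period.
with open("otbi_invoice_totals.csv", newline="") as f:
    otbi = {row["PERIOD_NAME"]: float(row["TOTAL_AMOUNT"]) for row in csv.DictReader(f)}

# Query the same metric from the warehouse (hypothetical fact table).
with oracledb.connect(user="ADMIN", password="...", dsn="fawtest_low") as conn:
    with conn.cursor() as cursor:
        cursor.execute("""
            SELECT period_name, SUM(invoice_amount)
            FROM dw_ap_invoice_f
            GROUP BY period_name
        """)
        fdi = {period: float(total) for period, total in cursor}

# Report periods where the two sources disagree beyond a small tolerance.
for period, expected in sorted(otbi.items()):
    actual = fdi.get(period, 0.0)
    if abs(actual - expected) > 0.01:
        print(f"MISMATCH {period}: OTBI={expected:.2f}, FDI={actual:.2f}")
    else:
        print(f"OK       {period}: {actual:.2f}")
```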
- Performance testing - As a service administrator and functional administrator:
    - Once you configure the Fusion Data Intelligence test environment, execute the performance tests. This involves running the performance testing scenarios and recording the performance metrics; a timing sketch follows this list. Scale up the additional test environment (ATE) storage, CPUs, and memory as needed to achieve the desired results. When connecting to Oracle Autonomous Data Warehouse to query the Fusion Data Intelligence data, select the Low service level.
    - After completing performance testing, analyze the results to identify performance issues or bottlenecks. The recommendation is to use the Warehouse Refresh Statistics subject area for this analysis. This may involve reviewing performance metrics, analyzing system logs, performing root cause analysis, and assessing the size and customization complexity of the source system.
    - If testing identifies performance issues or bottlenecks, work with the development team to optimize system performance. This may involve changing the system configuration, activating only the required functional areas, modifying the ETL process, or changing the data sources. Note that patching of source or target systems might impact performance, so avoid benchmarking during patching windows.
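
The following sketch illustrates one way to run and record performance scenarios against Oracle Autonomous Data Warehouse on the Low service level, as recommended above. The `_low` suffix on the TNS alias is how Autonomous Database exposes the Low consumer group; the DSN, credentials, and queries are placeholders for your own scenarios:

```python
# Minimal sketch: execute representative report queries on the Low service
# level and record elapsed times. DSN, credentials, and queries are placeholders.
import statistics
import time
import oracledb

SCENARIOS = {  # hypothetical performance-test queries
    "headcount_by_dept": "SELECT department, COUNT(*) FROM dw_worker_d GROUP BY department",
    "invoice_totals": "SELECT period_name, SUM(invoice_amount) FROM dw_ap_invoice_f GROUP BY period_name",
}
RUNS = 5

with oracledb.connect(user="ADMIN", password="...", dsn="fawtest_low") as conn:
    with conn.cursor() as cursor:
        for name, sql in SCENARIOS.items():
            timings = []
            for _ in range(RUNS):
                start = time.perf_counter()
                cursor.execute(sql)
                cursor.fetchall()  # include fetch time, not just execute time
                timings.append(time.perf_counter() - start)
            print(f"{name}: median={statistics.median(timings):.2f}s, "
                  f"max={max(timings):.2f}s over {RUNS} runs")
```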
- User acceptance testing - As a functional administrator and quality assurance specialist:
    - Ensure that the testing end users have the necessary groups and roles to perform the tests.
    - Develop a set of test scenarios covering all aspects of Fusion Data Intelligence. Write test scenarios in a way that's easy to understand and follow, and make them representative of real-life situations that users will encounter; a minimal tracking sketch follows this list.
    - Perform tests based on the test scenarios that were created, including functional tests, usability tests, and performance tests. Document any issues or defects encountered during testing.
    - After all the tests have been completed and any high-priority issues resolved, review and approve the application for production release.
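
As a lightweight illustration of the scenario tracking described above, the following sketch records test scenarios, outcomes, and defects, and blocks sign-off while high-priority defects remain open. The scenario names, roles, and severity convention are illustrative only, not a prescribed format:

```python
# Minimal sketch: record UAT scenarios and outcomes; block release sign-off
# on unresolved high-priority defects. All scenario data is illustrative.
from dataclasses import dataclass, field

@dataclass
class Scenario:
    scenario_id: str
    description: str
    tester_role: str          # role the tester must hold to run the scenario
    passed: bool | None = None
    defects: list[str] = field(default_factory=list)

scenarios = [
    Scenario("UAT-001", "Create a diversity report", "HR Analyst"),
    Scenario("UAT-002", "Drill from summary to detail in a workbook", "Finance Analyst"),
]

# Testers record outcomes as they execute each scenario.
scenarios[0].passed = True
scenarios[1].passed = False
scenarios[1].defects.append("HIGH: drill-down returns no rows for prior periods")

open_blockers = [d for s in scenarios for d in s.defects if d.startswith("HIGH")]
print("Ready for production release" if not open_blockers
      else f"Blocked by {len(open_blockers)} high-priority defect(s): {open_blockers}")
```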
Use the checklist to confirm that these action items are planned. See Test Checklist.