Test

Complete functional, performance, and user acceptance testing to ensure a successful implementation. The specific testing requirements may vary depending on the implementation and customization of the application.

Ensure that you:
  • Use a variety of testing techniques. In addition to functional and performance testing, consider usability testing, security testing, system integration testing, and regression testing to ensure that the Fusion Analytics artifacts meet all business requirements.
  • Use realistic test data that accurately reflects the data in the production Fusion Analytics instance. This helps ensure that the workbooks work as expected and return accurate results; one way to derive such data is sketched after this list.
  • Involve business stakeholders in testing to ensure the development objects meet business needs and requirements.
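
  For example, rather than inventing synthetic records, you can derive test data from a production extract while masking sensitive columns. The sketch below is illustrative only: the file and column names (employees_prod_extract.csv, EMAIL) are hypothetical, and the masking rule should follow your own data-protection policy.

  ```python
  # Sketch: deriving realistic test data by masking a production extract.
  # File and column names (employees_prod_extract.csv, EMAIL) are hypothetical.
  import csv
  import hashlib

  def mask_email(email: str) -> str:
      """Replace an email with a deterministic, non-reversible token."""
      digest = hashlib.sha256(email.encode()).hexdigest()[:8]
      return f"user_{digest}@example.com"

  with open("employees_prod_extract.csv", newline="") as src, \
       open("employees_test.csv", "w", newline="") as dst:
      reader = csv.DictReader(src)
      writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
      writer.writeheader()
      for row in reader:
          row["EMAIL"] = mask_email(row["EMAIL"])  # mask direct identifiers
          writer.writerow(row)                     # keep other values realistic
  ```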
Review the following:
  • Functional testing - As a functional administrator:
    • Plan to grant appropriate application roles for each functional activity.
    • Plan to test the typical functional activities, such as creating a diversity report.
    • Perform role-based testing of Fusion Analytics metrics.
    • Validate data after loading it into Fusion Analytics, and validate the common library of metrics that exists in both Fusion Analytics and Oracle Transactional Business Intelligence (see the reconciliation sketch after this list).
    • Validate the functional security setup in Fusion Analytics and the alignment with the source system data.
    • Clearly define functional scenarios and use cases.

    See Validate Your Data.
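
    Data validation often starts with simple reconciliation queries, for example comparing a warehouse row count against an expected figure from the source system or an Oracle Transactional Business Intelligence report. The following is a minimal sketch using the python-oracledb driver; the credentials, DSN, table name, and expected count are placeholders, not actual Fusion Analytics schema objects.

    ```python
    # Minimal reconciliation sketch using the python-oracledb driver.
    # Credentials, the DSN, and the table name are illustrative placeholders.
    import oracledb

    EXPECTED_ROW_COUNT = 125_000  # e.g., taken from a source-system or OTBI report

    conn = oracledb.connect(
        user="fa_validation",
        password="<password>",
        dsn="mydb_low",  # Low service level, per the performance guidance below
    )
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM dw_fa_x_employee")  # placeholder table
        actual = cur.fetchone()[0]
    conn.close()

    if actual != EXPECTED_ROW_COUNT:
        print(f"Row count mismatch: warehouse={actual}, expected={EXPECTED_ROW_COUNT}")
    else:
        print("Row counts reconcile.")
    ```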

  • Performance testing - As a service administrator and functional administrator:
    • Once you configure the Fusion Analytics test environment, execute the performance tests: run the performance testing scenarios and record the performance metrics. Don't use the additional test instances for performance testing because they're provisioned with less resource capacity than the standard instances provisioned as part of the original order. When connecting to Oracle Autonomous Data Warehouse to query the Fusion Analytics data, select the Low service level (see the timing sketch after this list).
    • After completing performance testing, analyze the results to identify performance issues or bottlenecks. The recommendation is to use the Warehouse Refresh Statistics subject area for this analysis. This may involve reviewing performance metrics, analyzing system logs, performing root cause analysis, and assessing the size and customization complexity of the source system.
    • If testing identifies performance issues or bottlenecks, work with the development team to optimize system performance. This may involve changing the system configuration, activating only the required functional areas, modifying the ETL process, or changing the data sources. Note that patching of the source or target systems might impact performance, so avoid benchmarking during a patching window.

    See Best practices to optimize the Fusion Analytics data pipeline refresh performance - YouTube and About Data Refresh Performance.
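
    To record performance metrics consistently across runs, you can time representative queries programmatically and keep the results for trend analysis. A sketch under the same python-oracledb assumption; the credentials, DSN, and query are illustrative placeholders, and the Low service level is selected through the database's low-service connect alias.

    ```python
    # Sketch: timing a representative query against Autonomous Data Warehouse.
    # Credentials, the DSN, and the query are illustrative placeholders.
    import statistics
    import time

    import oracledb

    QUERY = "SELECT COUNT(*) FROM dw_fa_x_journal_line"  # placeholder query

    conn = oracledb.connect(user="fa_perf", password="<password>", dsn="mydb_low")
    timings = []
    for _ in range(5):  # several runs smooth out caching effects
        start = time.perf_counter()
        with conn.cursor() as cur:
            cur.execute(QUERY)
            cur.fetchall()
        timings.append(time.perf_counter() - start)
    conn.close()

    print(f"median={statistics.median(timings):.2f}s  max={max(timings):.2f}s")
    ```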

  • User acceptance testing - As a functional administrator and quality assurance specialist:
    • Ensure that the testing end users have the necessary groups and roles to perform the tests.
    • Develop a set of test scenarios covering all aspects of Fusion Analytics. Write test scenarios so that they're easy to understand and follow, and make them representative of real-life situations that users will encounter (see the tracking sketch after this list).
    • Run tests based on the scenarios you created, including functional, usability, and performance tests, and document any issues or defects encountered during testing.
    • After all tests are complete and any high-priority issues are resolved, review and approve the application for production release.
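
    Keeping test scenarios and their outcomes in a structured form makes it easier to track defects and decide on sign-off. One possible representation is sketched below; the scenario IDs, roles, and defect references are illustrative, not a prescribed format.

    ```python
    # Sketch: a lightweight structure for tracking UAT scenarios and outcomes.
    # Scenario IDs, roles, and defect references are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class UatScenario:
        scenario_id: str
        description: str
        required_role: str          # role the tester must hold
        status: str = "not_run"     # not_run | passed | failed
        defects: list[str] = field(default_factory=list)

    scenarios = [
        UatScenario("HCM-001", "Create a diversity report", "HR Analyst"),
        UatScenario("FIN-002", "Validate GL balance metrics", "Finance Analyst"),
    ]

    scenarios[0].status = "passed"
    scenarios[1].status = "failed"
    scenarios[1].defects.append("DEF-101: totals differ from the OTBI report")

    open_defects = [d for s in scenarios if s.status == "failed" for d in s.defects]
    print(f"{len(open_defects)} open defect(s) blocking sign-off")
    ```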

Use the checklist to confirm that you've planned for these action items. See Test Checklist.