Oracle® Fusion Middleware
Installing and Configuring Oracle Stream Analytics
12c (12.2.1.3.0)
E95529-03
December 2019
About the Oracle Stream Analytics Installation
This document explains how to install and configure a new Oracle Stream Analytics 12c (12.2.1.3.0) Oracle home.
After you install Oracle Stream Analytics (OSA), you can configure a standalone OSA domain. Note that an OSA standalone-server domain does not require Oracle WebLogic Server. For more information, see Standalone-Server Domains in Administering Oracle Stream Analytics. Optionally, you can integrate Apache Spark with your OSA installation, as described in Installing and Configuring Apache Spark.
If you are using a previous version of Oracle Event Processing (the prior name for OSA), note there is no upgrade to the OSA 12c (12.2.1.3.0) runtime software. If you created an Oracle Event Processing 11g standalone-server domain, then you must install a new OSA 12c (12.2.1.3.0) Oracle home and configure a new OSA 12c (12.2.1.3.0) standalone-server domain.
Obtaining the Product Distribution
The distribution for Oracle Stream Analytics (OSA) is available on the Oracle Technology Network (OTN).
To obtain OSA:
1. Go to the OSA Installer download page on OTN at http://www.oracle.com/technetwork/middleware/complex-event-processing/download/index.html
2. Accept the license agreement, select +Recommended Install Process, and then select the download link for the Oracle Stream Analytics Installer (fmw_12.2.1.3.0_osa_Disk1_1of1.zip).
3. Extract the contents of this .zip file onto your system. One of the extracted files is fmw_12.2.1.3.0_osa_generic.jar, which runs the product installer and installs the software onto your system (see Installing Oracle Stream Analytics).
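The extraction step can be sketched in the shell. The download directory below is an assumption; substitute wherever you saved the .zip file. The archive and JAR names come from this document.

```shell
# DOWNLOAD_DIR is an assumed location -- substitute wherever you saved the .zip.
DOWNLOAD_DIR="${DOWNLOAD_DIR:-$HOME/Downloads}"
INSTALLER_ZIP="fmw_12.2.1.3.0_osa_Disk1_1of1.zip"
INSTALLER_JAR="fmw_12.2.1.3.0_osa_generic.jar"

# Only extract if the archive is actually present.
if [ -f "$DOWNLOAD_DIR/$INSTALLER_ZIP" ]; then
  unzip -o "$DOWNLOAD_DIR/$INSTALLER_ZIP" -d "$DOWNLOAD_DIR"
  ls -l "$DOWNLOAD_DIR/$INSTALLER_JAR"   # the product installer JAR
fi
```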
Installing Oracle Stream Analytics
This section describes how to install the OSA software and create the Oracle home directory.
Topics:
- Starting the Installation Program
  Before running the installation program, you must verify that the JDK and prerequisite software are installed.
- Navigating the Installation Screens
  The installer shows a series of screens where you verify or enter information.
- Verifying the Installation
  After you complete the installation, verify whether it was successful by completing a series of tasks.
Starting the Installation Program
Before running the installation program, you must verify that the JDK and prerequisite software are installed.
To start the installation program:
Note:
You can also start the installer in silent mode using a saved response file instead of launching the installer screens. For more about silent or command line installation, see Using the Oracle Universal Installer in Silent Mode in Installing Software with the Oracle Universal Installer.
When the installation program appears, you are ready to begin the installation.
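As a sketch, the installer can be launched interactively or, per the note above, in silent mode with a saved response file. The JDK path and response file location below are assumptions; the `-silent` and `-responseFile` options are standard Oracle Universal Installer flags (see the silent-mode documentation referenced above for the full option list). The commands are printed rather than executed.

```shell
# Assumed locations -- substitute your certified JDK and download folder.
JAVA_HOME="${JAVA_HOME:-/usr/java/jdk1.8.0_131}"
INSTALLER_JAR="${INSTALLER_JAR:-$HOME/Downloads/fmw_12.2.1.3.0_osa_generic.jar}"

# Interactive (GUI) mode:
echo "$JAVA_HOME/bin/java -jar $INSTALLER_JAR"

# Silent mode with a previously saved response file (hypothetical path):
echo "$JAVA_HOME/bin/java -jar $INSTALLER_JAR -silent -responseFile /tmp/osa_install.rsp"
```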
Parent topic: Installing Oracle Stream Analytics
Navigating the Installation Screens
The installer shows a series of screens where you verify or enter information.
The following table lists the order in which installer screens appear. If you need additional help with an installation screen, click Help.
Table - Oracle Stream Analytics Install Screens
Screen | Description
---|---
Installation Inventory Setup | On UNIX operating systems, this screen opens if this is the first time you are installing any Oracle product on this host. Specify the location where you want to create your central inventory. Make sure that the operating system group name selected on this screen has write permissions to the central inventory location. For more about the central inventory, see About the Oracle Central Inventory in Installing Software with the Oracle Universal Installer. This screen does not appear on Windows operating systems.
Welcome | Review the information to make sure that you have met all the prerequisites, then click Next.
Auto Updates | Select whether to skip automatic updates, select patches, or search for the latest software updates, including important security updates, through your My Oracle Support account.
Installation Location | Specify your Oracle home directory location. You can click View to verify that you are installing Oracle Stream Analytics in the correct Oracle home.
Installation Type | Select either Stream Analytics or Stream Analytics With Examples, then review the items to be installed in the list below the installation types.
Prerequisite Checks | This screen verifies that your system meets the minimum requirements. To view the list of tasks that were verified, select View Successful Tasks. To view log details, select View Log. If any prerequisite check fails, an error message appears at the bottom of the screen. Fix the error and click Rerun to try again. To ignore the error or warning and continue with the installation, click Skip (not recommended).
Installation Summary | Use this screen to verify the installation options you selected. To save these options to a response file, click Save Response File and enter the response file location and name. The response file stores all the information that you entered and enables you to perform a silent installation (from the command line) at a later time. Click Install to begin the installation.
Installation Progress | Shows the installation progress. When the progress bar reaches 100%, click Finish to dismiss the installer, or click Next to see a summary.
Installation Complete | Displays the installation location and the feature sets that were installed. Review this information and click Finish to close the installer.
Parent topic: Installing Oracle Stream Analytics
Verifying the Installation
After you complete the installation, verify whether it was successful by completing a series of tasks.
- Reviewing the Installation Log Files
  Review the contents of the installation log files to make sure that the installer did not encounter any problems.
- Checking the Directory Structure
  The contents of your installation vary based on the options that you selected during the installation.
- Viewing the Contents of the Oracle Home
  You can view the contents of the Oracle home directory by using the viewInventory script.
Parent topic: Installing Oracle Stream Analytics
Reviewing the Installation Log Files
Review the contents of the installation log files to make sure that the installer did not encounter any problems.
By default, the installer writes log files to the Oracle_Inventory_Location/logs directory (on UNIX operating systems) or the Oracle_Inventory_Location\logs directory (on Windows operating systems).
For a description of the log files and where to find them, see Installation Log Files in Installing Software with the Oracle Universal Installer.
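A quick way to surface the most recent installer logs, as a sketch. The inventory location below is an assumption; on many UNIX systems the actual central inventory path is recorded in /etc/oraInst.loc or ~/oraInst.loc.

```shell
# INVENTORY_LOC is an assumed default -- check oraInst.loc for the real path.
INVENTORY_LOC="${INVENTORY_LOC:-$HOME/oraInventory}"

if [ -d "$INVENTORY_LOC/logs" ]; then
  # Newest installer logs first.
  ls -t "$INVENTORY_LOC/logs" | head -5
else
  echo "No logs directory under $INVENTORY_LOC"
fi
```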
Parent topic: Verifying the Installation
Checking the Directory Structure
The contents of your installation vary based on the options that you selected during the installation.
See What Are the Key Oracle Fusion Middleware Directories? in Understanding Oracle Fusion Middleware.
Parent topic: Verifying the Installation
Viewing the Contents of the Oracle Home
You can view the contents of the Oracle home directory by using the viewInventory script.
See Viewing the Contents of an Oracle Home in Installing Software with the Oracle Universal Installer.
Parent topic: Verifying the Installation
Roadmap for Verifying Your System Environment
Before you begin the installation and configuration process, you must verify your system environment.
The following table identifies important tasks and checks to ensure that your environment is properly prepared for installing and configuring OSA.
Table - Roadmap for Verifying Your System Environment
Task | Description | Documentation
---|---|---
Verify certification and system requirements. | Verify that your operating system is certified and properly configured for installation and configuration. | See Verifying Certification, System Requirements, and Interoperability in Planning an Installation of Oracle Fusion Middleware.
Identify a proper installation user. | Verify that the installation user has the proper permissions to install and configure the software. | See Selecting an Installation User in Planning an Installation of Oracle Fusion Middleware.
Select the installation and configuration directories on your system. | Verify that you are able to create the necessary directories for installation and configuration, according to the recommended directory structure. | See About the Directories for Installation and Configuration in Planning an Installation of Oracle Fusion Middleware.
Install a certified JDK. | The installation program for the distribution requires a certified JDK present on your system. | See About JDK Requirements for an Oracle Fusion Middleware Installation in Planning an Installation of Oracle Fusion Middleware.
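The JDK check in the roadmap can be sketched as a simple shell probe of the PATH; this does not verify certification, only presence:

```shell
# Check whether a JDK's java launcher is available before starting the installer.
if command -v java >/dev/null 2>&1; then
  JAVA_STATUS=present
  java -version 2>&1 | head -1   # print the detected version
else
  JAVA_STATUS=missing
  echo "No java found on PATH; install a certified JDK first."
fi
```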
Configuring the Oracle Stream Analytics Domain
An OSA installation does not include Oracle Fusion Middleware Infrastructure, so only standalone-server domains may be created for OSA. You can learn more about standalone-server domains by reading Standalone-Server Domains in Oracle Fusion Middleware Administering Oracle Stream Analytics.
For instructions on configuring a standalone domain, see Create a Standalone-Server Domain in Oracle Fusion Middleware Administering Oracle Stream Analytics. This guide also contains other administrative tasks for OSA, including updating a domain and starting and stopping the servers in the domain.
Installing and Configuring Apache Spark
Apache Spark (Spark) is an open source big data processing framework built around speed, ease of use, and sophisticated analytics.
Note:
OSA supports only cluster mode Spark deployments.

The instructions in this chapter assume the following:

- OSA is installed in OSA_HOME. For example, OSA_HOME=/apps/oracle/middleware.
- The OSA application domain has been created and is referred to as OSA_DOMAIN. For example, OSA_DOMAIN=OSA_HOME/user_projects/domains/osa/defaultserver. Note that there is no step specific to Spark installation when you create the OSA application domain.
- A Spark cluster is set up, using one of the cluster types supported by OSA (such as Spark standalone or Hadoop YARN). For links to the supported Spark version download and documentation, see Prerequisites for Apache Spark Integration.
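The assumed locations can be captured as environment variables, using the example paths given in this document:

```shell
# Example paths taken from this document; adjust to your environment.
OSA_HOME=/apps/oracle/middleware
OSA_DOMAIN=$OSA_HOME/user_projects/domains/osa/defaultserver

echo "OSA_HOME=$OSA_HOME"
echo "OSA_DOMAIN=$OSA_DOMAIN"
```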
When you are ready, use the information in the following sections to integrate Spark with OSA:
- Prerequisites for Apache Spark Integration
  Using Spark with OSA requires installing and configuring third-party components.
- Installing Apache Spark
  Install the version of Spark that is supported by your version of OSA.
- Installing the OSA-Spark Integration Component
  The OSA-Spark integration component adds Oracle's Continuous Query Language (CQL) support, along with an OSA-specific runtime environment, to the Spark framework to implement application deployment.
- Installing Kafka
  Apache Kafka is required by an OSA-Spark environment to send and receive data.
- Configuring the OSA Domain for Spark
  OSA-Spark integration relies on settings in one or more configuration files.
- Configuring the osa.properties File
  The osa.properties file is the configuration file for OSA-Spark integration.
Prerequisites for Apache Spark Integration
Using Spark with OSA requires installing and configuring third party components.
The following table provides information about the components that are required in a Spark cluster environment to use Spark with OSA. This information is referenced from the subsequent topics that describe installing and configuring these components.
Component | Version | Links
---|---|---
Apache Kafka | 0.8.2.2 for Scala 2.10 | Download: http://kafka.apache.org/downloads.html Documentation: http://kafka.apache.org/documentation.html
Hadoop YARN | 2.6.X (recommended) | Download: http://hadoop.apache.org/releases.html
Apache Spark | 1.5.X, prebuilt for Hadoop 2.6 (or your Hadoop version). Note: 1.6.X is not supported. | Download: http://spark.apache.org/downloads.html Documentation: http://spark.apache.org/docs/1.5.1/
Parent topic: Installing and Configuring Apache Spark
Installing Apache Spark
Install the version of Spark that is supported by your version of OSA.
To install Spark:

1. Refer to Prerequisites for Apache Spark Integration to install Spark in SPARK_HOME.

Notes:

- OSA supports only cluster mode Spark deployments.
- Install the version of Spark that is supported by your version of OSA. For 12c (12.2.1.3.0), see Prerequisites for Apache Spark Integration.
- If you want to use Spark together with Hadoop Distributed File System (HDFS) and YARN, install the Spark version that was compiled for the Hadoop version compatible with your current environment.
- Your Spark distribution must also contain SPARK_HOME/lib/spark-examples-X.X.X-hadoopY.Y.Y.jar, which is also required by OSA.
- Spark must be installed on each node in your cluster, and also on the node where the OSA domain runs.
- SPARK_HOME must have the same value on each node, including the OSA node. A convenient way to achieve this is to mount the same network drive to the same location on each node.
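The spark-examples requirement in the notes above can be checked with a short probe; SPARK_HOME defaults to the example path used elsewhere in this guide:

```shell
# SPARK_HOME uses the example path from this guide; adjust as needed.
SPARK_HOME="${SPARK_HOME:-/apps/spark/spark-1.5.1}"

# Look for the spark-examples JAR that OSA requires.
EXAMPLES_JAR=$(ls "$SPARK_HOME"/lib/spark-examples-*-hadoop*.jar 2>/dev/null | head -1)
if [ -n "$EXAMPLES_JAR" ]; then
  echo "Found: $EXAMPLES_JAR"
else
  echo "spark-examples JAR not found under $SPARK_HOME/lib"
fi
```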
Parent topic: Installing and Configuring Apache Spark
Installing the OSA-Spark Integration Component
The OSA-Spark integration component adds Oracle’s Continuous Query Language (CQL) support, along with an OSA-specific runtime environment, to the Spark framework to implement application deployment.
This component is delivered as part of the OSA server installation as a single JAR file at OSA_HOME/oep/spark/lib/spark-osa.jar.

This JAR file must be copied to all worker nodes and also to the OSA node. Ideally, copy it into your Spark installation at SPARK_HOME/lib/spark-osa.jar.
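The copy step can be sketched as follows. The worker host names are hypothetical, and the OSA_HOME and SPARK_HOME defaults are the example paths from this guide; the scp commands are printed rather than executed so you can review them first.

```shell
# Example paths from this guide; WORKER_NODES is a hypothetical host list.
OSA_HOME="${OSA_HOME:-/apps/oracle/middleware}"
SPARK_HOME="${SPARK_HOME:-/apps/spark/spark-1.5.1}"
WORKER_NODES="${WORKER_NODES:-node1 node2 node3}"

# Print the copy command for each worker node (remember the OSA node too).
for node in $WORKER_NODES; do
  echo "scp $OSA_HOME/oep/spark/lib/spark-osa.jar $node:$SPARK_HOME/lib/spark-osa.jar"
done
```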
Parent topic: Installing and Configuring Apache Spark
Installing Kafka
Apache Kafka is required by an OSA-Spark environment to send and receive data.
Kafka is the only external system that an OSA-Spark exploration can accept data from or send data to. If you want to chain two explorations (where one is running in Spark, and the other running in either Spark or OSA), the only way to do this is to explicitly create Kafka targets and streams to connect these explorations. Additionally, Kafka is used to push data from an OSA-Spark exploration back to the live output stream of the exploration editor. This is configured in the osa.properties file using the osa.kafka properties (see Kafka Settings).
To install Kafka:

1. Refer to the download and documentation links in Prerequisites for Apache Spark Integration.

Note:

- OSA only needs to know the Kafka endpoints, such as broker and zookeeper addresses.
- OSA uses short-lived Kafka topics to communicate with applications in Spark. To allow OSA to explicitly delete these topics, consider adding the line delete.topic.enable=true to the server configuration file KAFKA_HOME/config/server.properties (where KAFKA_HOME is the Kafka installation folder) when installing Kafka.
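Adding that broker setting can be sketched idempotently in the shell; the KAFKA_HOME default below is an assumed install folder, and the snippet is a no-op when the configuration file does not exist:

```shell
# KAFKA_HOME is an assumed install folder; adjust to your Kafka installation.
KAFKA_HOME="${KAFKA_HOME:-/opt/kafka}"
CONF="$KAFKA_HOME/config/server.properties"

# Append the setting only if the file exists and does not already contain it.
if [ -f "$CONF" ] && ! grep -q '^delete.topic.enable=true' "$CONF"; then
  echo 'delete.topic.enable=true' >> "$CONF"
fi
```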
Parent topic: Installing and Configuring Apache Spark
Configuring the OSA Domain for Spark
OSA-Spark integration relies on settings in one or more configuration files.
To configure your OSA domain for Spark:

1. Stop the OSA server (OSA domain).
2. Create the Spark configuration folder in your domain as OSA_DOMAIN/config/spark.
3. Create the OSA configuration file in the Spark configuration folder as OSA_DOMAIN/config/spark/osa.properties.
4. Edit the OSA configuration file according to your environment (see Configuring the osa.properties File).
5. Create any additional configuration files required by your environment in the Spark configuration folder. This may include files specific to Hadoop Distributed File System (HDFS) and YARN, such as yarn-site.xml, core-site.xml, hdfs-site.xml, and more. For details, refer to the documentation for your cluster. These files are required by OSA to deploy files to your cluster. OSA needs client-side settings only (such as the name node URL, resource manager URL, timeouts, credential settings, and so on). Do not copy the full server-side YARN/Hadoop configuration.
6. Edit the content of the additional configuration files according to your environment.
7. Start your OSA domain.
8. Access your OSA applications at the following URLs:
   - OSA for Spark: http://host:port/sxspark
   - OSA: http://host:port/sx
   If you accepted the default installation parameters, the URLs are:
   - OSA for Spark: http://localhost:9002/sxspark
   - OSA: http://localhost:9002/sx

Note:
Whenever you deploy an OSA streaming application, the content of the configuration folder is zipped and sent to the Spark cluster. In this way, the same configuration is available to the OSA streaming application while it runs in the cluster.

Tips:

- Find useful configuration templates for different scenarios in OSA_HOME/oep/spark/config-templates. You can start with these files and customize them according to your needs.
- If you want to experiment with different Spark configurations, you can set up multiple configuration folders outside your OSA domain and make the OSA_DOMAIN/config/spark folder a symlink that points to one of your configuration folders at a time. For example, OSA_DOMAIN/config/spark points to your YARN-based OSA configuration in /apps/osa/config/yarn-cluster, or to your Spark standalone cluster configuration in /apps/osa/config/spark-cluster. In this way, you can easily switch between your configurations.
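The configuration-folder steps above can be sketched in the shell. For illustration, OSA_DOMAIN falls back to a scratch directory; in a real environment it is your domain root (for example, /apps/oracle/middleware/user_projects/domains/osa/defaultserver). The symlink commands for the tip are shown as comments with hypothetical folder names.

```shell
# Scratch fallback for illustration only; set OSA_DOMAIN to your real domain root.
OSA_DOMAIN="${OSA_DOMAIN:-$(mktemp -d)}"

mkdir -p "$OSA_DOMAIN/config/spark"               # create the Spark configuration folder
touch "$OSA_DOMAIN/config/spark/osa.properties"   # create the empty OSA configuration file

ls "$OSA_DOMAIN/config/spark"

# Symlink switching between prepared configurations (hypothetical folders):
#   ln -sfn /apps/osa/config/yarn-cluster  "$OSA_DOMAIN/config/spark"
#   ln -sfn /apps/osa/config/spark-cluster "$OSA_DOMAIN/config/spark"
```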
Parent topic: Installing and Configuring Apache Spark
Configuring the osa.properties File
The osa.properties file is the configuration file for OSA-Spark integration.
- General Deployment Settings
  General deployment settings in osa.properties define the Spark cluster type and distribution folder.
- Kafka Settings
  Kafka settings in osa.properties define the Kafka brokers and zookeeper.
- JAR File Settings
  JAR file settings in osa.properties define the paths to the Spark assembly and example JAR files.
- Runtime Settings
  Runtime settings in osa.properties tune the deployed OSA streaming applications.
- Spark Settings
  Spark settings in osa.properties define resource consumption of the deployed Spark application in the cluster.
- Example osa.properties File
  Review an example configuration in the osa.properties file.
Parent topic: Installing and Configuring Apache Spark
General Deployment Settings
General deployment settings in osa.properties define the Spark cluster type and distribution folder.

The following table describes the general deployment settings in osa.properties.

Parameter | Description
---|---
osa.deploy.spark.master | Mandatory. Specifies the Spark cluster type and master URL. You can configure OSA for Spark running on a Spark standalone cluster (for example, spark://spark.mycompany.com:6066) or on a YARN cluster (yarn-cluster).
osa.deploy.spark.fileshare | Mandatory for Spark standalone; not used for YARN. Specifies the shared location that OSA can use as a distribution folder for the applications. The folder must be accessible from every node under the same path. This can be a Network File System (NFS) or Hadoop Distributed File System (HDFS) path. If you want to use HDFS, client configuration files (core-site.xml, hdfs-site.xml) must be provided in the configuration folder. Example: /osa-share/spark-deployments
Parent topic: Configuring the osa.properties File
Kafka Settings
Kafka settings in osa.properties define the Kafka brokers and zookeeper.

The following table describes the Kafka settings in osa.properties.

Parameter | Description
---|---
osa.kafka.brokerlist | Mandatory. Comma-separated list of Kafka brokers in the form host:port. Example: kbroker1.mycompany.com:9092,kbroker2.mycompany.com:9092
osa.kafka.zookeeper | Mandatory. The Kafka zookeeper in the form host:port. Example: zk.mycompany.com:2181
Parent topic: Configuring the osa.properties File
JAR File Settings
JAR file settings in osa.properties define the paths to the Spark assembly and example JAR files.

Deploy the spark-assembly and spark-examples packages of your Spark distribution on each worker node and on the OSA node. The path to each JAR must be the same on all nodes. Additionally, the spark-osa package must be copied to each node.

The following table describes the JAR file settings in osa.properties.

Parameter | Description
---|---
osa.jars.spark-assembly | Mandatory. Path to the spark-assembly JAR file. Example: /apps/spark/spark-1.5.1/lib/spark-assembly-1.5.0-hadoop2.6.0.jar
osa.jars.spark-examples | Mandatory. Path to the spark-examples JAR file. Example: /apps/spark/spark-1.5.1/lib/spark-examples-1.5.0-hadoop2.6.0.jar
osa.jars.spark-osa | Mandatory. Path to the spark-osa JAR file, which you must copy to each node manually. Example: /apps/spark/spark-1.5.1/lib/spark-osa.jar
Parent topic: Configuring the osa.properties File
Runtime Settings
Runtime settings in osa.properties tune the deployed OSA streaming applications.

The following table describes the runtime settings in osa.properties.

Parameter | Default | Description
---|---|---
osa.runtime.executor.instances | 1 | Optional. Specifies how many workers perform stream processing in parallel. Note that this number affects the required executor cores. This parameter value applies to each deployed OSA application.
osa.runtime.batchDuration | 1000 ms | Optional. Specifies the batch interval of the streaming data processing.
Parent topic: Configuring the osa.properties File
Spark Settings
Spark settings in osa.properties define resource consumption of the deployed Spark application in the cluster.

Note:
OSA may override your Spark property settings in osa.properties if they are not sufficient to run OSA streaming applications in the Spark cluster.

The following table describes the Spark settings in osa.properties that are most typically used in the context of OSA-Spark. Setting values lower than the default values may result in an out-of-memory error. For more information about Spark property settings, see http://spark.apache.org/docs/1.5.1/configuration.html.

Parameter | Default (minimum) | Description
---|---|---
spark.executor.memory | 1800m | Optional. Amount of memory to use per executor process.
spark.executor.cores | 1 | Optional. The number of cores to use on each executor. In standalone mode, setting this parameter allows an application to run multiple executors on the same worker, provided that there are enough cores on that worker. Otherwise, only one executor per application runs on each worker.
spark.driver.memory | 1500m | Optional. Amount of memory to use for the driver process.
spark.driver.cores | 1 | Optional. Number of cores to use for the driver process, in cluster mode only.
Parent topic: Configuring the osa.properties File
Example osa.properties File
Review an example configuration in the osa.properties file.
osa.deploy.spark.master=spark://spark.mycompany.com:6066
osa.deploy.spark.fileshare=/osa-share/spark-deployments
osa.kafka.brokerlist=kbroker1.mycompany.com:9092,kbroker2.mycompany.com:9092
osa.kafka.zookeeper=zk.mycompany.com:2181
osa.jars.spark-assembly=/apps/spark/spark-1.5.1/lib/spark-assembly-1.5.0-hadoop2.6.0.jar
osa.jars.spark-examples=/apps/spark/spark-1.5.1/lib/spark-examples-1.5.0-hadoop2.6.0.jar
osa.jars.spark-osa=/apps/spark/spark-1.5.1/lib/spark-osa.jar
osa.runtime.executor.instances=1
osa.runtime.batchDuration=1000ms
spark.executor.cores=2
spark.executor.memory=3g
spark.driver.memory=2500m
In this example:

- Spark runs in a standalone cluster and the OSA applications are distributed through an NFS file share mounted to /osa-share/spark-deployments on each node.
- Kafka is installed and configured, and has two dedicated broker servers and a separate zookeeper server.
- Spark 1.5.1 is installed on each node to /apps/spark/spark-1.5.1.
- Runtime parameters specify that there is no parallel processing (a single executor instance) and the batch interval is set to 1 second.
- Spark-specific parameters specify spark.executor.cores=2 (defaults to 1), spark.executor.memory=3g (defaults to the 1800m minimum), and spark.driver.memory=2500m (defaults to the 1500m minimum).
- This configuration does not use HDFS or YARN, so the osa.properties file is the only configuration file that must be copied to the configuration folder.
Parent topic: Configuring the osa.properties File
Uninstalling the Software
Follow the instructions in this section to start the Uninstall Wizard and remove the software.
If you want to uninstall the product in a silent (command-line) mode, see Running the Oracle Universal Installer for Silent Uninstallation in Installing Software with the Oracle Universal Installer.
- Starting the Uninstall Wizard
- Navigating the Uninstall Wizard Screens
- Removing the Oracle Home Directory Manually
  After you uninstall the software, you must manually remove your Oracle home directory and any existing subdirectories that the Uninstall Wizard did not remove.
- Removing the Program Shortcuts on Windows Operating Systems
  On Windows operating systems, you must also manually remove the program shortcuts; the Uninstall Wizard does not remove them for you.
Starting the Uninstall Wizard
To start the Uninstall Wizard:
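As a sketch, Oracle Universal Installer-based products are typically deinstalled from a script under the Oracle home; the exact path below is an assumption based on standard OUI layouts, so verify it against your installation before running. The command is printed rather than executed.

```shell
# ORACLE_HOME is your OSA Oracle home; the deinstall script location is an
# assumed standard OUI layout -- confirm it exists in your installation.
ORACLE_HOME="${ORACLE_HOME:-/home/Oracle/product/ORACLE_HOME}"

# On UNIX; on Windows the equivalent script is typically deinstall.cmd.
echo "$ORACLE_HOME/oui/bin/deinstall.sh"
```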
Parent topic: Uninstalling the Software
Navigating the Uninstall Wizard Screens
The Uninstall Wizard shows a series of screens to confirm the removal of the software.
If you need help on any screen listed in the following table, click Help on that screen.
Table - Uninstall Wizard Screens and Descriptions
Screen | Description
---|---
Welcome | Introduces you to the product Uninstall Wizard.
Uninstall Summary | Shows the Oracle home directory and its contents that will be uninstalled. Verify that this is the correct directory. If you want to save these options to a response file, click Save Response File and enter the response file location and name. You can use the response file later to uninstall the product in silent (command-line) mode. See Running the Oracle Universal Installer for Silent Uninstallation in Installing Software with the Oracle Universal Installer. Click Deinstall to begin removing the software.
Uninstall Progress | Shows the uninstallation progress.
Uninstall Complete | Appears when the uninstallation is complete. Review the information on this screen, then click Finish to close the Uninstall Wizard.
Parent topic: Uninstalling the Software
Removing the Oracle Home Directory Manually
After you uninstall the software, you must manually remove your Oracle home directory and any existing subdirectories that the Uninstall Wizard did not remove.
For example, if your Oracle home directory is /home/Oracle/product/ORACLE_HOME
on a UNIX operating system, enter the following commands:
cd /home/Oracle/product
rm -rf ORACLE_HOME
On a Windows operating system, if your Oracle home directory is C:\Oracle\Product\ORACLE_HOME
, use a file manager window and navigate to the C:\Oracle\Product
directory. Right-click on the ORACLE_HOME folder and select Delete.
Parent topic: Uninstalling the Software
Removing the Program Shortcuts on Windows Operating Systems
On Windows operating systems, you must also manually remove the program shortcuts; the Uninstall Wizard does not remove them for you.
To remove the program shortcuts on Windows:
-
Change to the following directory:
C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Oracle\ORACLE_HOME\Product
-
If you only have one product installed in your Oracle home, delete the ORACLE_HOME directory. If you have multiple products installed in your Oracle home, delete all products before you delete the ORACLE_HOME directory.
Parent topic: Uninstalling the Software
Oracle Fusion Middleware Installing and Configuring Oracle Stream Analytics, 12c (12.2.1.3.0)
E95529-03
Copyright © 2015, 2019, Oracle and/or its affiliates. All rights reserved.
Primary Author: Oracle Corporation
Documentation for installers and system administrators that describes how to install and configure Oracle Stream Analytics.
This software and related documentation are provided under a license agreement containing restrictions on use and disclosure and are protected by intellectual property laws. Except as expressly permitted in your license agreement or allowed by law, you may not use, copy, reproduce, translate, broadcast, modify, license, transmit, distribute, exhibit, perform, publish, or display any part, in any form, or by any means. Reverse engineering, disassembly, or decompilation of this software, unless required by law for interoperability, is prohibited.
The information contained herein is subject to change without notice and is not warranted to be error-free. If you find any errors, please report them to us in writing.
If this is software or related documentation that is delivered to the U.S. Government or anyone licensing it on behalf of the U.S. Government, then the following notice is applicable:
U.S. GOVERNMENT END USERS: Oracle programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation, delivered to U.S. Government end users are "commercial computer software" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such, use, duplication, disclosure, modification, and adaptation of the programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation, shall be subject to license terms and license restrictions applicable to the programs. No other rights are granted to the U.S. Government.
This software or hardware is developed for general use in a variety of information management applications. It is not developed or intended for use in any inherently dangerous applications, including applications that may create a risk of personal injury. If you use this software or hardware in dangerous applications, then you shall be responsible to take all appropriate fail-safe, backup, redundancy, and other measures to ensure its safe use. Oracle Corporation and its affiliates disclaim any liability for any damages caused by use of this software or hardware in dangerous applications.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC trademarks are used under license and are trademarks or registered trademarks of SPARC International, Inc. AMD, Opteron, the AMD logo, and the AMD Opteron logo are trademarks or registered trademarks of Advanced Micro Devices. UNIX is a registered trademark of The Open Group.
This software or hardware and documentation may provide access to or information about content, products, and services from third parties. Oracle Corporation and its affiliates are not responsible for and expressly disclaim all warranties of any kind with respect to third-party content, products, and services unless otherwise set forth in an applicable agreement between you and Oracle. Oracle Corporation and its affiliates will not be responsible for any loss, costs, or damages incurred due to your access to or use of third-party content, products, or services, except as set forth in an applicable agreement between you and Oracle.
This documentation is in preproduction status and is intended for demonstration and preliminary use only. It may not be specific to the hardware on which you are using the software. Oracle Corporation and its affiliates are not responsible for and expressly disclaim all warranties of any kind with respect to this documentation and will not be responsible for any loss, costs, or damages incurred due to the use of this documentation.
For information about Oracle's commitment to accessibility, visit the Oracle Accessibility Program website at http://www.oracle.com/pls/topic/lookup?ctx=acc&id=docacc.
Access to Oracle Support
Oracle customers that have purchased support have access to electronic support through My Oracle Support. For information, visit http://www.oracle.com/pls/topic/lookup?ctx=acc&id=info or visit http://www.oracle.com/pls/topic/lookup?ctx=acc&id=trs if you are hearing impaired.