1 Introduction to GoldenGate Stream Analytics

The Oracle GoldenGate Stream Analytics (GGSA) runtime component is a complete solution platform for building applications that filter, correlate, and process events in real time. With flexible deployment options of standalone Spark or Hadoop YARN, it is a versatile, high-performance event-processing engine. GGSA enables Fast Data and Internet of Things (IoT) use cases, delivering actionable insight and maximizing value on large volumes of high-velocity data from varied data sources in real time. It enables distributed intelligence and low-latency responsiveness by pushing business logic to the network edge.

Key features of GGSA:

  • Natively integrated with Oracle GoldenGate to process and analyze transaction streams from relational databases
  • Interactive pipeline designer with live results to instantly validate your work
  • Zero-code environment to build continuous ETL and analytics workflows
  • Pattern library for advanced data transformation and real-time analytics
  • Extensive support for processing geospatial data
  • Secure connectivity to diverse data sources and sinks
  • Built-in support for real-time visualizations and dashboards
  • Automatic application state management
  • Automatic configuration of pipelines for high availability and reliability
  • Automatic configuration of pipelines for lower latency and higher throughput
  • Automatic log management of pipelines for better disk space utilization

GGSA Architecture Overview

[Figure: GGSA Architecture]

Acquiring data

Stream Analytics can acquire data from any of the following on-premises and cloud-native data sources:
  • GoldenGate: Natively integrated with Oracle GoldenGate, Stream Analytics offers data replication for high-availability data environments, real-time data integration, and transactional change data capture.
  • Oracle Cloud Streaming: Ingest continuous, high-volume data streams that you can consume or process in real-time.
  • Kafka: A distributed streaming platform used for metrics collection and monitoring, log aggregation, and so on.
  • Java Message Service: Allows Java-based applications to send, receive, and read distributed communications.
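As an illustration of what an incoming event looks like, change records arriving from GoldenGate over a Kafka topic are typically serialized as JSON. The field names below (`op_type`, `table`, `after`) are assumptions chosen for illustration, not the exact GGSA wire format:

```python
import json

# Hypothetical GoldenGate-style change event as it might arrive on a
# Kafka topic; the field names are illustrative, not the exact GGSA format.
raw = """
{
  "op_type": "I",
  "table": "SALES.ORDERS",
  "after": {"ORDER_ID": 1001, "AMOUNT": 250.0, "STATUS": "NEW"}
}
"""

event = json.loads(raw)
print(event["table"])            # SALES.ORDERS
print(event["after"]["AMOUNT"])  # 250.0
```

A stream definition in GGSA maps such a payload to a typed event shape that downstream pipeline stages can query.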

Processing data

With Stream Analytics, you can filter, correlate, and process events in real time.
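These filter and correlation stages are configured visually in the pipeline designer, but conceptually they apply logic like the following minimal sketch. The event shapes, field names, and threshold are assumptions for illustration only:

```python
# Illustrative sketch of the kind of filter + correlate logic a streaming
# pipeline applies; event fields and the 100.0 threshold are assumptions.
orders = [
    {"order_id": 1, "customer": "A", "amount": 500.0},
    {"order_id": 2, "customer": "B", "amount": 20.0},
    {"order_id": 3, "customer": "A", "amount": 700.0},
]
payments = [
    {"order_id": 1, "status": "PAID"},
    {"order_id": 3, "status": "PENDING"},
]

# Filter stage: keep only high-value orders.
high_value = [o for o in orders if o["amount"] >= 100.0]

# Correlate stage: join filtered orders with payment events on order_id.
pay_by_id = {p["order_id"]: p["status"] for p in payments}
correlated = [
    {**o, "status": pay_by_id.get(o["order_id"], "UNKNOWN")}
    for o in high_value
]
print(correlated)
```

In a real pipeline this correlation would typically be bounded by a time window over the two streams rather than joining complete in-memory lists.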

Perform actions on the data

After Stream Analytics processes the data, you can output the results to any of the following external target data sources:
  • Coherence
  • Kafka
  • Oracle Cloud Streaming
  • Java Message Service
  • Database
  • Notification
  • REST

See Managing Targets.

Steps to Build Continuous ETL and Real-Time Analytics Pipelines

Step 1 Create a Connection

You must create a connection to an external system, to be used when creating a stream in Step 2.

Supported stream sources:
  • Kafka
  • OCI Streaming Service
  • JMS
  • Oracle Advanced Queuing

See Managing Connections.

Step 2 Create a Stream

From the Catalog, create a Stream using the connection from Step 1.

Supported stream definitions:
  • File
  • Kafka
  • JMS
  • AQ
  • GoldenGate

See Managing Streams.

Step 3 Create a Pipeline

From the Catalog, create a Pipeline using the stream from Step 2.

See Creating a Pipeline.

Step 4 Add Business Logic

Add business logic to the pipeline to analyze the input data stream.

See Creating a Pipeline to Transform and Analyze Data Streams.

Step 5 Publish the Pipeline

See Publishing a Pipeline.