Managing Service Connectors
This section describes how to manage service connectors.
A service connector defines the flow of data between a source and target service.
Prerequisites
IAM policies: To use Service Connector Hub, you must be given the required type of access in a policy written by an administrator, whether you're using the Console or the REST API with an SDK, CLI, or other tool.
To move data, your service connector must have authorization to access the specified resources in the source, task, and target services. Some resources are accessible without policies.
Default policies providing the required authorization are offered when you use the Console to define a service connector. These policies are limited to the context of the service connector. You can either accept the default policies or ensure that you have the proper authorizations in group-based policies.
For more information about service connector authorization, see Access to Source, Task, and Target Services.
If you get a response that you don't have permission or are unauthorized, check with your administrator. You may not have the required type of access in the current compartment. For more information on user authorizations, see Authentication and Authorization.
Using the Console
To create a service connector
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose a compartment you have permission to work in (on the left side of the page). The page updates to display only the resources in that compartment. If you're not sure which compartment to use, contact an administrator.
- Click Create Service Connector.
- On the Create Service Connector page, fill in the settings:
- Connector Name: User-friendly name for the new service connector. Avoid entering confidential information.
- Description: Optional identifier.
- Resource Compartment: The compartment where you want to store the new service connector.
- Configure Service Connector:
- Source: Select the service containing the data you want to transfer from the following options.
- Logging: Transfer log data from the Logging service. See Logging Overview.
- Monitoring: Transfer metric data points from the Monitoring service.
- Streaming: Transfer stream data from the Streaming service. See Streaming.
- Target: Select the service that you want to transfer the data to.
- Configure your source and task:
- Configure source connection:
Click the source service you want.
Logging Fields:
- Compartment Name
- Log group
- Logs
Comments:
- Supported tasks and targets
  All targets are supported by a service connector that is defined with a Logging source and an optional task (Functions or Logging).
Callouts for Logging source
1. Service Connector Hub reads log data from Logging.
2. Optional: If configured, Service Connector Hub triggers one of the following tasks:
   - Functions task for custom processing of log data.
   - Log Filter task (Logging service) for filtering log data.
3. The task returns processed data to Service Connector Hub.
4. Service Connector Hub writes the log data to a target service.
Example of a service connector that uses Logging as source, with Functions as task: Scenario: Send Log Data to an Autonomous Database.
- Log group for Notifications target is limited to _Audit.
- For log input schema, see LogEntry.
- For more information about Logging, see Logging Overview.
Monitoring Fields:
- Metric compartment: Select the compartment containing the metrics you want. Maximum metric compartments per Monitoring source: 5.
- Namespaces: Select one or more metric namespaces that include the metrics you want. All metrics in the selected namespaces are retrieved. Each namespace must begin with "oci_". Example: oci_computeagent.
  Maximum namespaces per Monitoring source (across all metric compartments): 50. Example sets of compartments in a single source that remain within this maximum:
  - 5 metric compartments with 10 namespaces each
  - 3 metric compartments with varying numbers of namespaces (20, 20, 10)
  - 1 metric compartment with 50 namespaces
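Taken together, the limits above (at most 5 metric compartments, at most 50 namespaces in total, and the oci_ prefix requirement) can be checked up front. The following sketch is illustrative only; the dict-of-lists input shape and the function name are assumptions, not part of any OCI SDK.

```python
# Hypothetical pre-flight check for a Monitoring source configuration.
# Limits come from the documentation above; the input shape (a map of
# compartment OCID -> list of namespaces) is assumed for illustration.

MAX_COMPARTMENTS = 5   # metric compartments per Monitoring source
MAX_NAMESPACES = 50    # namespaces across all metric compartments

def validate_monitoring_source(compartments):
    """Return a list of problems found in a {compartment: [namespaces]} map."""
    problems = []
    if len(compartments) > MAX_COMPARTMENTS:
        problems.append(f"too many metric compartments: {len(compartments)}")
    total = sum(len(ns) for ns in compartments.values())
    if total > MAX_NAMESPACES:
        problems.append(f"too many namespaces across compartments: {total}")
    for ns_list in compartments.values():
        for ns in ns_list:
            if not ns.startswith("oci_"):
                problems.append(f"namespace does not start with oci_: {ns}")
    return problems

# Example: 3 compartments with 20, 20, and 10 namespaces stays within limits.
example = {
    f"ocid1.compartment.oc1..example{i}": [f"oci_ns{i}_{j}" for j in range(n)]
    for i, n in enumerate((20, 20, 10))
}
print(validate_monitoring_source(example))  # []
```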
Comments:
- Supported tasks and targets
The following targets are supported by a service connector that is defined with a Monitoring source and (optional) Functions task: Functions, Object Storage, and Streaming.
Callouts for Monitoring source
1. Service Connector Hub reads metric data from Monitoring.
2. Optional: If configured, Service Connector Hub triggers the following task:
   - Functions task for custom processing of metric data.
3. The task returns processed data to Service Connector Hub.
4. Service Connector Hub writes the metric data to a target service.
- Format of received metric data
The metric data received from Monitoring is raw. (Contrast with aggregated data shown in metric charts.)
A response from Monitoring typically contains several metric data points. The following abbreviated example of a response shows two raw metric data points for disk bytes read from a compute instance:

  [
    {
      "namespace": "oci_computeagent",
      "compartmentId": "ocid1.tenancy.oc1..exampleuniqueID",
      "name": "DiskBytesRead",
      "dimensions": {
        "resourceId": "ocid1.instance.region1.phx.exampleuniqueID"
      },
      "metadata": {
        "unit": "bytes"
      },
      "datapoints": [
        { "timestamp": "2022-03-10T22:19:00Z", "value": 10.4 },
        { "timestamp": "2022-03-10T22:20:00Z", "value": 11.3 }
      ]
    }
  ]
- Format of data at target: See the target under Configure target connection.
- For more information about the Monitoring service, see Monitoring.
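A payload in this shape can be consumed as ordinary JSON. The sketch below parses the example response shown above; the summarize helper is illustrative only, not part of any OCI SDK.

```python
import json

# Minimal sketch of consuming the raw Monitoring payload shown above.
# Field names match the example response; how the payload is delivered
# to you (Object Storage, Streaming, a function) is out of scope here.

raw = """
[
  {
    "namespace": "oci_computeagent",
    "compartmentId": "ocid1.tenancy.oc1..exampleuniqueID",
    "name": "DiskBytesRead",
    "dimensions": {"resourceId": "ocid1.instance.region1.phx.exampleuniqueID"},
    "metadata": {"unit": "bytes"},
    "datapoints": [
      {"timestamp": "2022-03-10T22:19:00Z", "value": 10.4},
      {"timestamp": "2022-03-10T22:20:00Z", "value": 11.3}
    ]
  }
]
"""

def summarize(payload):
    """Yield (metric name, unit, datapoint count) for each metric in the list."""
    for metric in json.loads(payload):
        yield metric["name"], metric["metadata"]["unit"], len(metric["datapoints"])

for name, unit, count in summarize(raw):
    print(name, unit, count)  # DiskBytesRead bytes 2
```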
Streaming Fields:
- Compartment: Select the compartment containing the stream.
- Stream pool: Select the name of the stream pool containing the stream.
  Note
  Private endpoint configuration is supported. For stream pool configuration details, see Creating Stream Pools.
- Stream: Select the name of the stream that you want to receive data from.
- Read position: Specify the cursor position from which to start reading the stream. For more information, see Using Cursors.
- Latest: Starts reading at messages published after saving the service connector.
- Trim Horizon: Starts reading at the oldest available message in the stream.
Comments:
- Supported tasks and targets
The following targets are supported by a service connector that is defined with a Streaming source and (optional) Functions task: Functions, Notifications, Object Storage, and Streaming. The Notifications target is supported except when the Functions task is used.
Callouts for Streaming source
1. Service Connector Hub reads stream data from Streaming.
2. Optional: If configured, Service Connector Hub triggers the following task:
   - Functions task for custom processing of stream data.
3. The task returns processed data to Service Connector Hub.
4. Service Connector Hub writes the stream data to a target service.
- For more information about Streaming, see Streaming.
Note
- For stream input schema, see Message Reference.
- Notifications target with Streaming source: All messages are sent as raw JSON blobs.
- Configure task: Click the task you want.
Log filter task Fields:
- Audit Logs: When _Audit is selected for Log group:
- When Attribute is selected for Filter type:
- Filter type: Attribute
- Attribute name
- Attribute values
- When Event type is selected for Filter type:
- Filter type: Event type
- Service name
- Event type
- Service Logs, Custom Logs: When another log group (not _Audit) is selected for Log group:
- Property
- Operator
- Value
Comments:
- This task filters source logs using the Logging service.
Function task Fields:
- Select task: Select Function.
- Compartment: Select the compartment containing the function.
- Function application: Select the name of the function application that includes the function you want.
- Function: Select the name of the function you want to use to process the data received from the source.
For use by the service connector as a task, the function must be configured to return one of the following responses:
- List of JSON entries (must set the response header Content-Type=application/json)
- Single JSON entry (must set the response header Content-Type=application/json)
- Single binary object (must set the response header Content-Type=application/octet-stream)
- Show additional options: Optimal batch size: Specify limits for each batch of data sent to the function.
  - Use automatic settings
  - Use manual settings
    - Batch size limit (KBs)
    - Batch time limit (seconds)
Comments:
- This task processes data from the source using the Functions service.
- For supported targets with Streaming, see Streaming Source.
Note
- Service Connector Hub does not parse the output of the function task. The output of the function task is written as-is to the target. For example, when using a Notifications target with a function task, all messages are sent as raw JSON blobs.
- Functions are invoked synchronously with 6 MB of data per invocation. If data exceeds 6 MB, then the service connector invokes the function again to move the over-limit data. Such over-limit invocations are handled sequentially.
- Functions can execute for up to five minutes.
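The response requirements above can be made concrete with a short sketch. This is plain Python, not tied to any Functions SDK; build_response is a hypothetical helper showing only the body and header shape the service connector expects for a list of JSON entries.

```python
import json

# Sketch of the response shape a function task must produce: a JSON list
# body with Content-Type=application/json. build_response is a made-up
# helper for illustration; a real function would return this via its
# framework's response object.

def build_response(entries):
    """Return (body, headers) for a list of JSON entries."""
    body = json.dumps(entries)
    headers = {"Content-Type": "application/json"}
    return body, headers

# Example: mark each received log entry as inspected before returning it.
received = [{"message": "GET /"}, {"message": "POST /data"}]
processed = [{**e, "inspected": True} for e in received]
body, headers = build_response(processed)
print(headers["Content-Type"])  # application/json
```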
- Configure target connection: Select the Service and the Compartment (where the target service resides), and fill in more fields as needed. Click the target service you want.
Functions Fields:
- Compartment: Select the compartment containing the function.
- Function application: Select the name of the function application that includes the function you want.
- Function: Select the name of the function you want to send the data to.
Comments:
Note
- The service connector flushes source data as a JSON list in batches. Maximum batch, or payload, size is 6 MB.
- Functions are invoked synchronously with 6 MB of data per invocation. If data exceeds 6 MB, then the service connector invokes the function again to move the over-limit data. Such over-limit invocations are handled sequentially.
- Functions can execute for up to five minutes.
- Do not return data from Functions targets to service connectors. Service Connector Hub does not read data returned from Functions targets.
Logging Analytics Fields:
- Compartment: Select the compartment containing the log group.
- Log group: Select the log group you want.
Monitoring Fields:
- Compartment: Select the compartment containing the metric.
- Metric namespace: Select the metric namespace that includes the metric you want. It can be an existing or new namespace.
- Metric: Select the name of the metric that you want to send the data to. It can be an existing or new metric.
- Configure dimensions (optional): Specify a name-value key pair for each dimension you want to send data to. The name can be custom, and the value can be either static or a path to evaluate. Use dimensions to filter the data after the log data is moved to a metric. For an example dimension use case, see Scenario: Create Dimensions for a Monitoring Target.
To open the Add dimensions panel, click Add dimensions. Following are instructions for this panel.
To extract data (path value)
- Under Select path, browse the available data for the path you want. The six latest rows of log data are retrieved from the log specified under Configure source connection.
- Select the check box for the path you want.
Note
If no data is available, then you can manually enter a path value with a custom dimension name. The path must start with logContent, using either dot (.) or index ([]) notation. Dot and index are the only supported JMESPath selectors. Examples:
- logContent.data (dot notation)
- logContent.data[0].content (index notation)
For more information about valid path notation, see JmesPathDimensionValue.
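To make the notation concrete, here is a toy resolver that walks a structure using only dot and index selectors. It is an illustration, not the service's JMESPath implementation, and the sample log_content data is invented.

```python
import re

# Toy resolver for the dot (.) and index ([]) notation described above.
# Supports only the two selector forms the service accepts; everything
# else raises. The sample log_content structure is made up.

def resolve(path, log_content):
    """Evaluate a logContent path against a dict/list structure."""
    if not path.startswith("logContent"):
        raise ValueError("path must start with logContent")
    value = log_content
    # Tokenize the remainder, e.g. ".data[0].content" -> "data", 0, "content".
    for index, key in re.findall(r"\[(\d+)\]|\.([A-Za-z_]\w*)", path[len("logContent"):]):
        value = value[int(index)] if index else value[key]
    return value

log_content = {"data": [{"content": "hello", "bucketName": "b1"}]}
print(resolve("logContent.data[0].content", log_content))  # hello
print(resolve("logContent.data", log_content))
```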
Example of a selected path (bucketName) and an unselected path (eTag):
Under Edit path, the following fields are automatically populated from your selected path:
- Dimension name
- Value
Optionally edit the dimension name.
To tag data (static value)
- Under Static values, enter a Dimension name and Value. Example dimension name and value: traffic and customer.
Comments:
Note
- Do not use the reserved oci_ prefix for new metric namespaces and names. Metrics are not ingested when reserved prefixes are used. See Publishing Custom Metrics and PostMetricData Reference (API).
- When typing a new metric namespace or name, press ENTER to submit it.
What's included with your metric
In addition to any dimension name-value key pairs you specify under Configure dimensions, the following dimensions are included with your metric:
- connectorId: The OCID of the service connector that the metrics apply to.
- connectorName: The name of the service connector that the metrics apply to.
- connectorSourceType: The source service that the metrics apply to.
The timestamp of each metric data point is the timestamp of the corresponding log message.
Notifications Fields:
- Compartment: Select the compartment containing the topic.
- Topic: Select the name of the topic that you want to send the data to.
- Message format: Select the option you want:
  - Send formatted messages: Simplified, user-friendly layout. To view supported subscription protocols and message types for formatted messages, see Friendly Formatting.
  - Send raw messages: Raw JSON blob.
  Note
  Message format is available for service connectors with Logging source only. It is not available for service connectors with function tasks. When this option is not available, messages are sent as raw JSON blobs.
Comments:
Note
- Log group for Notifications is limited to _Audit.
- SMS messages exhibit unexpected results for certain service connector configurations. This issue is limited to topics that contain SMS subscriptions for the indicated service connector configurations. For more information, see Multiple SMS Messages for a Single Notification.
Object Storage Fields:
- Compartment: Select the compartment containing the bucket.
- Bucket: Select the name of the bucket that you want to send the data to.
- Show additional options:
- Batch size (in MBs)
- Batch time (in milliseconds)
Comments:
- Batch rollover details:
- Batch rollover size: 100 MB
- Batch rollover time: 7 minutes
- Files saved to Object Storage are compressed using gzip.
- Format of data moved from a Monitoring source: Objects. The service connector partitions source data from Monitoring by metric namespace and writes the data for each group (namespace) to an object. Each object name includes the following elements.
<object_name_prefix>/<service_connector_ocid>/<metric_compartment_ocid>/<metric_namespace>/<data_start_timestamp>_<data_end_timestamp>.<sequence_number>.<file_type>.gz
Within an object, each set of data points is appended to a new line.
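As an illustration of the naming pattern, the sketch below assembles an object name from its components. The function and all component values are hypothetical placeholders; the actual timestamp and sequence formats are determined by the service.

```python
# Hypothetical helper assembling an object name from the documented
# pattern. All component values below are placeholders, not real OCIDs,
# and the timestamp/sequence formats are assumptions for illustration.

def object_name(prefix, connector_ocid, compartment_ocid, namespace,
                start_ts, end_ts, sequence, file_type="json"):
    return (f"{prefix}/{connector_ocid}/{compartment_ocid}/{namespace}/"
            f"{start_ts}_{end_ts}.{sequence}.{file_type}.gz")

name = object_name(
    "metrics", "ocid1.serviceconnector.oc1..example",
    "ocid1.compartment.oc1..example", "oci_computeagent",
    "2022-03-10T22:19:00Z", "2022-03-10T22:20:00Z", 0,
)
print(name)
```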
Streaming Fields:
- Compartment: Select the compartment containing the stream.
- Stream: Select the name of the stream that you want to send the data to.
  Note
  Private endpoint configuration is supported. For stream pool configuration details, see Creating Stream Pools.
Comments:
- Format of data moved from a Monitoring source: Each object is written as a separate message.
- Show Advanced Options: If you have permissions to create a resource, then you also have permissions to apply free-form tags to that resource. To apply a defined tag, you must have permissions to use the tag namespace. For more information about tagging, see Resource Tags. If you are not sure whether to apply tags, skip this option (you can apply tags later) or ask your administrator.
Default policies are offered for any authorization required for this service connector to access source, task, and target services.
You can get this authorization through these default policies or through group-based policies. The default policies are offered whenever you use the Console to create or edit a service connector. The only exception is when the exact policy already exists in IAM, in which case the default policy is not offered. For more information about this authorization requirement, see Authentication and Authorization.
- To accept default policies, click the Create link provided for each default policy.
Note
- If you don't have permissions to accept default policies, contact your administrator.
- Automatically created policies remain when service connectors are deleted. As a best practice, delete associated policies when deleting the service connector.
View links are provided for you to optionally review the newly created policies.
- Click Create to create the service connector.
Within a few minutes, the service connector begins moving data according to its configuration. The service connector applies tasks to data from the source service and then moves the data to the target service.
To edit a service connector
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to edit.
- Click Edit.
- Make your changes.
Note
If you did not previously create the default access policy to allow this service connector to write to the target service, you can do so now. You can get this authorization through these default policies or through group-based policies. The default policies are offered whenever you use the Console to create or edit a service connector. The only exception is when the exact policy already exists in IAM, in which case the default policy is not offered. For more information about this authorization requirement, see Authentication and Authorization.
- Click Save Changes.
If you updated the source service or tasks, then data movement may pause for a few minutes, as indicated by Data Freshness metrics. Within a few minutes, the service connector begins moving data according to its configuration. The service connector applies tasks to data from the source service and then moves the data to the target service.
To select the message format for a Notifications target
Friendly message formats are available with service connectors that use Notifications as the target.
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to edit.
- Click Edit.
- Under Configure target connection, select the Message format you want:
  - Send formatted messages: Simplified, user-friendly layout. To view supported subscription protocols and message types for formatted messages, see Friendly Formatting.
  - Send raw messages: Raw JSON blob.
  Note
  If you did not previously create the default access policy to allow this service connector to write to the target service, you can do so now. You can get this authorization through these default policies or through group-based policies. The default policies are offered whenever you use the Console to create or edit a service connector. The only exception is when the exact policy already exists in IAM, in which case the default policy is not offered. For more information about this authorization requirement, see Authentication and Authorization.
- Click Save Changes.
If you updated the source service or tasks, then data movement may pause for a few minutes, as indicated by Data Freshness metrics. Within a few minutes, the service connector begins moving data according to its configuration. The service connector applies tasks to data from the source service and then moves the data to the target service.
To activate a service connector
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to activate.
- Click Activate and then confirm.
The service connector immediately begins moving data according to its configuration, applying tasks to data in the source service and then moving the data to the target service.
To deactivate a service connector
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to deactivate.
- Click Deactivate and then confirm.
The service connector stops moving data.
To move a service connector to a different compartment
Default policies stop working for moved service connectors. To give a moved service connector the required authorization, edit the service connector using the Console and accept the offered default policy. For more information about service connector authorization, see Access to Source, Task, and Target Services.
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to move.
- Click Move Resource.
- Choose the destination compartment from the list.
- Click Move Resource.
To delete a service connector
Automatically created policies remain when service connectors are deleted. As a best practice, delete associated policies when deleting the service connector.
- Open the navigation menu and click Analytics & AI. Under Messaging, click Service Connector Hub.
- Choose the Compartment containing the service connector.
- Click the name of the service connector you want to delete.
- Click Delete and then confirm.
The service connector stops moving data.
Using the Command Line Interface (CLI)
Open a command prompt and run oci sch service-connector list to list service connectors in the specified compartment:
oci sch service-connector list --compartment-id <compartment_OCID>
Open a command prompt and run oci sch service-connector get to get the specified service connector:
oci sch service-connector get --service-connector-id <service_connector_OCID>
Open a command prompt and run oci sch service-connector create to create a service connector:
oci sch service-connector create --display-name "<display_name>" --compartment-id <compartment_OCID> --source [<source_in_JSON>] --tasks [<tasks_in_JSON>] --target [<targets_in_JSON>]
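As an illustration, the sketch below builds example source and target JSON payloads and the resulting command line. The kind values and field names shown are assumptions based on common Service Connector Hub configurations; consult the CreateServiceConnector API reference for the authoritative schema.

```python
import json

# Illustrative payloads for the --source and --target arguments. The
# "kind" discriminators and field names are assumptions for this sketch;
# all OCIDs are placeholders.

source = {
    "kind": "logging",
    "logSources": [{
        "compartmentId": "ocid1.compartment.oc1..exampleuniqueID",
        "logGroupId": "ocid1.loggroup.oc1.phx.exampleuniqueID",
    }],
}
target = {
    "kind": "objectStorage",
    "namespace": "example-namespace",
    "bucketName": "example-bucket",
}

# The CLI accepts the JSON inline (or via file://), for example:
command = (
    "oci sch service-connector create "
    '--display-name "example-connector" '
    "--compartment-id ocid1.compartment.oc1..exampleuniqueID "
    f"--source '{json.dumps(source)}' "
    f"--target '{json.dumps(target)}'"
)
print(command)
```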
Open a command prompt and run oci sch service-connector update to edit a service connector:
oci sch service-connector update --service-connector-id <service_connector_OCID> --display-name "<display_name>" --source [<source_in_JSON>] --tasks [<tasks_in_JSON>] --target [<targets_in_JSON>]
Open a command prompt and run oci sch service-connector activate to activate the specified service connector:
oci sch service-connector activate --service-connector-id <service_connector_OCID>
Open a command prompt and run oci sch service-connector deactivate to deactivate the specified service connector:
oci sch service-connector deactivate --service-connector-id <service_connector_OCID>
Open a command prompt and run oci sch service-connector change-compartment to move the service connector to the specified compartment:
oci sch service-connector change-compartment --service-connector-id <service_connector_OCID> --compartment-id <destination_compartment_OCID>
Open a command prompt and run oci sch service-connector delete to delete the specified service connector:
oci sch service-connector delete --service-connector-id <service_connector_OCID>
Using the API
For information about using the API and signing requests, see REST APIs and Security Credentials. For information about SDKs, see Software Development Kits and Command Line Interface.
Use these API operations to manage service connectors:
- ActivateServiceConnector
- ChangeServiceConnectorCompartment
- CreateServiceConnector
- DeactivateServiceConnector
- DeleteServiceConnector
- GetServiceConnector
- ListServiceConnectors
- UpdateServiceConnector
Use these API operations to manage work requests: