11 Configuring Jetty for Oracle Event Processing

This chapter describes how to configure Jetty for use with Oracle Event Processing, including network I/O and work managers, and how to configure a Jetty server instance.

This chapter includes the following sections:

  • Section 11.1, "Overview of Jetty Support in Oracle Event Processing"

  • Section 11.2, "Configuring a Jetty Server Instance"

  • Section 11.3, "Example Jetty Configuration"

11.1 Overview of Jetty Support in Oracle Event Processing

Oracle Event Processing supports Jetty (see http://www.mortbay.org/) as a Java Web server for deploying HTTP servlets and static resources.

Oracle Event Processing support for Jetty is based on Version 1.2 of the OSGi HTTP Service. This API provides the ability to dynamically register and unregister javax.servlet.Servlet objects (see http://java.sun.com/products/servlet/docs.html) and static resources with the run time. This specification requires, at minimum, version 2.1 of the Java Servlet API.

Oracle Event Processing supports the following features for Jetty:

  • Servlets

  • Network I/O integration

  • Thread pool integration

  • Work managers

For details about configuring Jetty, see Section 11.2, "Configuring a Jetty Server Instance."

11.1.1 Servlets

In addition to supporting typical (synchronous) Java servlets, Oracle Event Processing supports asynchronous servlets. An asynchronous servlet receives a request, performs some work on a thread, and then releases that thread while waiting for the work to complete; when the work completes, the servlet acquires another thread and sends the response.

11.1.2 Network I/O Integration

Oracle Event Processing uses network I/O (NetIO) to configure the port and listen address of Jetty services.


Jetty has a built-in capability for multiplexed network I/O. However, it does not support multiple protocols on the same port.

11.1.3 Thread Pool Integration

Oracle Event Processing Jetty services use the Oracle Event Processing Work Manager to provide for scalable thread pooling. See Section 11.3, "Example Jetty Configuration."


Jetty provides its own thread pooling capability. However, Oracle recommends using the Oracle Event Processing self-tuning thread pool to minimize footprint and configuration complexity.

11.1.4 Jetty Work Managers

Oracle Event Processing allows you to configure how your application prioritizes the execution of its work. Based on rules you define and by monitoring actual run time performance, you can optimize the performance of your application and maintain service level agreements. You define the rules and constraints for your application by defining a work manager.

This section describes:

  • How Oracle Event Processing uses thread pools

  • Work manager configuration

For more information, see Section 11.2.3, "work-manager Configuration Object."

11.1.4.1 Understanding How Oracle Event Processing Uses Thread Pools

Oracle Event Processing uses a single thread pool, in which all types of work are executed. Oracle Event Processing prioritizes work based on rules you define and on run-time metrics, including the actual time it takes to execute a request and the rate at which requests are entering and leaving the pool.

The common thread pool changes its size automatically to maximize throughput. The queue monitors throughput over time and, based on history, determines whether to adjust the thread count. For example, if historical throughput statistics indicate that a higher thread count increased throughput, Oracle Event Processing increases the thread count. Similarly, if statistics indicate that fewer threads did not reduce throughput, Oracle Event Processing decreases the thread count.

11.1.4.2 Understanding Work Manager Configuration

Oracle Event Processing prioritizes work and allocates threads based on an execution model that takes into account defined parameters and run-time performance and throughput.

You can configure a set of scheduling guidelines and associate them with one or more applications, or with particular application components. For example, you can associate one set of scheduling guidelines with one application, and another set with other applications. At run time, Oracle Event Processing uses these guidelines to assign pending work and enqueued requests to execution threads.

To manage work in your applications, you define one or more of the following work manager components:

  • fairshare—Specifies the average thread-use time required to process requests.

    For example, assume that Oracle Event Processing is running two modules. The Work Manager for ModuleA specifies a fairshare of 80 and the Work Manager for ModuleB specifies a fairshare of 20.

    During a period of sufficient demand, with a steady stream of requests for each module such that the number of requests exceeds the number of threads, Oracle Event Processing allocates 80% and 20% of the thread-usage time to ModuleA and ModuleB, respectively.


    The value of a fair share request class is specified as a relative value, not a percentage. Therefore, in the above example, if the request classes were defined as 400 and 100, they would still have the same relative values.

  • max-threads-constraint—This constraint limits the number of concurrent threads executing requests from the constrained work set. The default is unlimited. For example, consider a constraint defined with maximum threads of 10 and shared by 3 entry points. The scheduling logic ensures that not more than 10 threads are executing requests from the three entry points combined.

    A max-threads-constraint can be defined in terms of the availability of a resource that requests depend on, such as a connection pool.

    A max-threads-constraint might, but does not necessarily, prevent a request class from taking its fair share of threads or meeting its response time goal. Once the constraint is reached, Oracle Event Processing does not schedule requests of this type until the number of concurrent executions falls below the limit. Oracle Event Processing then schedules work based on the fair share or response time goal.

  • min-threads-constraint—This constraint guarantees the number of threads that the server allocates to affected requests in order to avoid deadlocks. The default is zero. A min-threads-constraint value of one is useful, for example, for a replication update request, which is called synchronously from a peer.

    A min-threads-constraint does not necessarily increase a fair share. This type of constraint has an effect primarily when the Oracle Event Processing instance is close to a deadlock condition. In that case, the constraint causes Oracle Event Processing to schedule a request even if requests in the service class have received more than their fair share recently.
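Expressed in config.xml terms, the ModuleA/ModuleB fair-share scenario above might be sketched as follows, using the work-manager parameters described in Section 11.2.3, "work-manager Configuration Object." The work manager names and values are illustrative only:

```xml
<!-- Hypothetical work managers for the fair-share example above:
     ModuleA receives roughly 80% and ModuleB roughly 20% of
     thread-usage time when demand exceeds the available threads. -->
<work-manager>
    <name>ModuleAWorkManager</name>
    <fairshare>80</fairshare>
</work-manager>
<work-manager>
    <name>ModuleBWorkManager</name>
    <fairshare>20</fairshare>
    <!-- Guarantee one thread so synchronous peer calls cannot deadlock -->
    <min-threads-constraint>1</min-threads-constraint>
</work-manager>
```

Because fair share values are relative rather than percentages, defining the two values as 400 and 100 would yield the same scheduling behavior.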

11.2 Configuring a Jetty Server Instance

You use the following configuration objects to configure an instance of the Jetty HTTP server in the config.xml file that describes your Oracle Event Processing domain:

  • jetty

  • netio

  • work-manager

  • jetty-web-app

For information on security configuration tasks that affect Jetty, see Section 10.8.1, "Configuring Jetty Security".

For more information, see:

  • Section 11.2.1, "jetty Configuration Object"

  • Section 11.2.2, "netio Configuration Object"

  • Section 11.2.3, "work-manager Configuration Object"

  • Section 11.2.4, "jetty-web-app Configuration Object"

  • Section 11.2.5, "Developing Servlets for Jetty"

  • Section 11.2.6, "Web Application Deployment"

11.2.1 jetty Configuration Object

Use the parameters described in the following table to define a jetty configuration object in your config.xml file.

Table 11-1 Configuration Parameters for the jetty Element

network-io-name (String)—The name of the NetIO service used. The NetIO service defines the port the server listens on. See Section 11.2.2, "netio Configuration Object" for details.

work-manager-name (String)—The name of the Work Manager to use for thread pooling. If not specified, the default work manager is used. See Section 11.2.3, "work-manager Configuration Object."

scratch-directory (String)—The name of a directory where temporary files required for web applications, JSPs, and other types of Web artifacts are kept.

debug-enabled (boolean)—Whether to enable debugging in the Jetty code using the OSGi Log Service.

name (String)—The name of the Jetty server instance.
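Using these parameters, a minimal jetty configuration object might be sketched as follows. The referenced netio and work-manager names are illustrative and must match objects defined elsewhere in config.xml:

```xml
<jetty>
    <name>JettyServer</name>
    <network-io-name>NetIO</network-io-name>
    <work-manager-name>JettyWorkManager</work-manager-name>
    <scratch-directory>Jetty</scratch-directory>
</jetty>
```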

11.2.2 netio Configuration Object

Use the parameters described in the following table to define a netio configuration object in your config.xml file.

Table 11-2 Configuration Parameters for the netio Element

name (String)—The name of this configuration object.

port (int)—The listening port number.

listen-address (String)—The address on which an instance of the netio service listens for incoming connections:

  • It may be set to a numeric IP address in the a.b.c.d format, or to a host name.

  • If not set, the service listens on all network interfaces.

The value of this parameter cannot be validated until the service has started.
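For example, a netio object that listens on port 9002 on all network interfaces might be sketched as follows; the name and port number are illustrative:

```xml
<netio>
    <name>NetIO</name>
    <port>9002</port>
    <!-- listen-address omitted: listen on all network interfaces -->
</netio>
```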

11.2.3 work-manager Configuration Object

Use the parameters described in the following table to define a work-manager configuration object in your config.xml file.

Table 11-3 Configuration Parameters for the work-manager Element

min-threads-constraint (int)—The minimum number of threads this work manager uses.

fairshare (int)—The fair share value this work manager uses.

max-threads-constraint (int)—The maximum number of threads this work manager uses.

name (String)—The name of this work manager.

11.2.4 jetty-web-app Configuration Object

Use the following configuration object to define a Web application for use by Jetty:

Table 11-4 Configuration Parameters for the jetty-web-app Element

context-path (String)—The context path where this Web application is deployed in the Web server's name space. If not set, it defaults to "/".

scratch-directory (String)—The location where Jetty stores temporary files for this Web application. Overrides the scratch-directory parameter of the jetty configuration object; see Section 11.2.1, "jetty Configuration Object."

path (String)—A file name that points to the location of the Web application on the server. It may be a directory or a WAR file.

jetty-name (String)—The name of the Jetty service to which this Web application is deployed. It must match the name of an existing jetty configuration object; see Section 11.2.1, "jetty Configuration Object."

name (String)—The name of this configuration object.
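Putting these parameters together, a hypothetical Web application deployment might be sketched as follows. The application name, context path, WAR file location, and Jetty service name are all illustrative:

```xml
<jetty-web-app>
    <name>myWebApp</name>
    <context-path>/myapp</context-path>
    <path>applications/myapp.war</path>
    <!-- Must match the name of an existing jetty configuration object -->
    <jetty-name>JettyServer</jetty-name>
</jetty-web-app>
```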

11.2.5 Developing Servlets for Jetty

Oracle Event Processing supports development of servlets for deployment to Jetty: you create a standard Java EE Web application and configure it using the jetty-web-app configuration object, as described in Section 11.2.4, "jetty-web-app Configuration Object."

11.2.6 Web Application Deployment

Oracle Event Processing supports deployments packaged either as WAR files or as exploded WAR files, as described in version 2.4 of the Java Servlet Specification.

You can deploy pre-configured web applications from an exploded directory or WAR file by including them in the server configuration.

Security constraints specified in the standard web.xml file are mapped to the Common Security Services security provider. The Servlet API specifies declarative role-based security, which means that particular URL patterns can be mapped to security roles.
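For example, a standard web.xml deployment descriptor might map a URL pattern to a security role as follows; the resource name, pattern, and role are illustrative:

```xml
<!-- Standard Servlet declarative security: requests matching the
     URL pattern require the caller to be in the "admin" role. -->
<security-constraint>
    <web-resource-collection>
        <web-resource-name>AdminPages</web-resource-name>
        <url-pattern>/admin/*</url-pattern>
    </web-resource-collection>
    <auth-constraint>
        <role-name>admin</role-name>
    </auth-constraint>
</security-constraint>
<security-role>
    <role-name>admin</role-name>
</security-role>
```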

11.3 Example Jetty Configuration

The following snippet of a config.xml file provides an example Jetty configuration; only Jetty-related configuration information is shown:

Example 11-1 Example Jetty Configuration
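A sketch of such a configuration, combining the netio, work-manager, jetty, and jetty-web-app objects described in this chapter, might look like the following; all names, port numbers, constraint values, and paths are illustrative:

```xml
<netio>
    <name>NetIO</name>
    <port>9002</port>
</netio>
<work-manager>
    <name>JettyWorkManager</name>
    <min-threads-constraint>5</min-threads-constraint>
    <max-threads-constraint>10</max-threads-constraint>
</work-manager>
<jetty>
    <name>JettyServer</name>
    <network-io-name>NetIO</network-io-name>
    <work-manager-name>JettyWorkManager</work-manager-name>
    <scratch-directory>Jetty</scratch-directory>
</jetty>
<jetty-web-app>
    <name>myWebApp</name>
    <context-path>/myapp</context-path>
    <path>applications/myapp.war</path>
    <jetty-name>JettyServer</jetty-name>
</jetty-web-app>
```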