
Sun ONE Application Server 7 Performance Tuning Guide

Chapter 2
About Sun ONE Application Server Performance

In this chapter, the following topics are discussed:

Why Tune Application Server?
Understanding Operational Requirements
Capacity Planning
Performance Tuning Sequence
Configuration Files
Logging and Performance


Why Tune Application Server?

Performance can be significantly enhanced by adjusting a few deployment descriptor settings or by modifying the server configuration files. However, it is important to understand your environment and performance goals: an optimal configuration for a production environment is not necessarily optimal for a development environment. This guide helps you understand the tuning and sizing options available, and describes the capabilities and practices that let you obtain the best performance from your Sun ONE Application Server.

The process architecture of Sun ONE Application Server is represented in the following figure:

Figure 2-1  Sun ONE Application Server Process Architecture for a Single Domain


Understanding Operational Requirements

Before you begin to deploy and tune your application on the Sun ONE Application Server, it is important to clearly define the operational environment. The operational environment is determined by high-level constraints and requirements such as the application architecture, security requirements, application usage, available hardware resources, and administration needs.

Application Architecture

The J2EE application model, as shown in the following figure, is very flexible, allowing the application architect to split application logic functionally into many tiers. The presentation layer is typically implemented using servlets and JSPs and executes in the web container.

Figure 2-2  J2EE Application Model

It is not uncommon to see moderately complex enterprise applications developed entirely using servlets and JSPs, while more complex business applications are often implemented using EJBs. The Sun ONE Application Server integrates the web and EJB containers in a single process, so local access to EJBs from servlets is very efficient. However, some application deployments may require EJBs to execute in a separate process and to be accessible from standalone client applications as well as from servlets. Based on the application architecture, the server administrator can deploy Sun ONE Application Server in multiple tiers, or host both the presentation and business logic on a single tier.
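To make the local-versus-remote distinction concrete, the following sketch contrasts a co-located lookup through an EJB local home with a cross-process lookup through a remote home. The OrderManager bean, its interfaces, and the JNDI names are hypothetical; only the javax.ejb, javax.naming, and javax.rmi APIs used are standard J2EE.

// Minimal sketch only; OrderManager and its home interfaces are illustrative.
import java.rmi.RemoteException;
import javax.ejb.CreateException;
import javax.ejb.EJBHome;
import javax.ejb.EJBLocalHome;
import javax.ejb.EJBLocalObject;
import javax.ejb.EJBObject;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.rmi.PortableRemoteObject;

interface OrderManagerLocal extends EJBLocalObject { }
interface OrderManagerLocalHome extends EJBLocalHome {
    OrderManagerLocal create() throws CreateException;
}
interface OrderManager extends EJBObject { }
interface OrderManagerHome extends EJBHome {
    OrderManager create() throws CreateException, RemoteException;
}

public class OrderClient {
    // Co-located case: the servlet and the EJB run in the same server process,
    // so the bean is reached through its local home with no remote-call overhead.
    public OrderManagerLocal lookupLocal() throws NamingException, CreateException {
        InitialContext ctx = new InitialContext();
        OrderManagerLocalHome home =
            (OrderManagerLocalHome) ctx.lookup("java:comp/env/ejb/OrderManagerLocal");
        return home.create();
    }

    // Separate-process case: the bean is reached through its remote home;
    // every call then crosses process (and possibly host) boundaries.
    public OrderManager lookupRemote() throws NamingException, CreateException, RemoteException {
        InitialContext ctx = new InitialContext();
        Object ref = ctx.lookup("java:comp/env/ejb/OrderManagerRemote");
        OrderManagerHome home =
            (OrderManagerHome) PortableRemoteObject.narrow(ref, OrderManagerHome.class);
        return home.create();
    }
}

Calls through the local interfaces avoid marshalling and network overhead, which is what makes the co-located configuration efficient.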

It is important that the server administrator understand the application architecture before designing a new application server deployment, or while deploying a new business application to an existing application server deployment.

Security Requirements

Most business applications require security. The various security considerations and available choices are discussed in this section.

User Authentication and Authorization

Application users must be authenticated. Sun ONE Application Server provides three different choices for user authentication.

The default file-based security realm is suitable for development environments, where new applications are being developed and tested. At deployment time, the server administrator can choose between the LDAP and Unix security realms.

LDAP stands for Lightweight Directory Access Protocol. Many large enterprises use LDAP-based directory servers to maintain employee and customer profiles.

Small to medium enterprises that do not already use a directory server may find it advantageous to leverage their existing investment in the Solaris security infrastructure by using the Unix realm.

More information on how to integrate with the various security realms can be found in the Sun ONE Application Server Administrator’s Guide to Security.

The type of authentication mechanism chosen may require additional hardware for the deployment. Typically, a directory server executes on a separate machine and may also require a backup for replication and high availability. Refer to the Sun ONE Directory Server documentation for more information on deployment, sizing, and availability guidelines.

An authenticated user’s access to various application functions may also require authorization checks. If the application uses role-based J2EE authorization, the application server performs additional checking on each request. This adds overhead that must be accounted for during capacity planning.
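As an illustration of such a role-based check, the following sketch uses the standard HttpServletRequest.isUserInRole call; the servlet class and the "manager" role name are hypothetical. Every request that reaches a protected resource pays for this kind of authorization work.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SalaryReportServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {
        // The container has already authenticated the caller against the
        // configured realm (file, LDAP, or Unix). This role check is the
        // additional per-request authorization work described above.
        if (!req.isUserInRole("manager")) {
            res.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        res.getWriter().println("Report data visible to managers only.");
    }
}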

Encryption

For security reasons, sensitive user input and application output must be transmitted in encrypted form. Most business-oriented web applications encrypt some or all of the communication between the browser and the application server, as required. Online shopping applications typically do not encrypt traffic except when the user is completing a purchase or supplying private data, and portal applications such as news and media sites typically do not employ encryption at all. SSL is the most common security framework on the Internet, and is supported by many browsers and application servers.
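As a sketch of encrypting only the sensitive part of an application, the following hypothetical servlet filter (the class name and the /checkout path prefix are illustrative) redirects plain-HTTP requests for the purchase flow to HTTPS while leaving catalog browsing unencrypted.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CheckoutSslFilter implements Filter {
    public void init(FilterConfig config) { }
    public void destroy() { }

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest req = (HttpServletRequest) request;
        HttpServletResponse res = (HttpServletResponse) response;
        // Only the purchase flow is forced onto SSL; browsing stays on plain HTTP.
        if (!req.isSecure() && req.getRequestURI().startsWith("/checkout")) {
            res.sendRedirect("https://" + req.getServerName() + req.getRequestURI());
            return;
        }
        chain.doFilter(request, response);
    }
}

The same effect is more commonly achieved declaratively, by placing a CONFIDENTIAL transport guarantee on the relevant URL patterns in the web application's deployment descriptor; either way, only the encrypted portion of the traffic contributes to the SSL sizing discussed below.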

Sun ONE Application Server supports SSL 2.0 and 3.0 and contains software support for various cipher suites. It also supports the integration of hardware encryption cards for even higher performance. Encryption, particularly when performed by the integrated software support, will impact hardware sizing and capacity planning. When planning the encryption needs for a deployment, the administrator needs to consider the following:

Application Usage

Every application user has expectations with respect to application performance, and these can often be quantified numerically. The server administrator must understand these expectations clearly and use them in capacity planning, to ensure that the completed deployment meets customer needs.

With regard to performance, you need to consider the following:

Hardware Resources

The type and quantity of hardware resources at the disposal of the administrator greatly influence performance tuning and site planning.

Sun ONE Application Server provides excellent vertical scalability. It can scale to efficiently utilize up to 12 high-performance CPUs using just one application server process. A smaller number of application server instances eases maintenance and lowers administration costs. Also, deploying several related applications on fewer application servers can lead to performance improvements, due to better data locality and reuse of cached data between co-located applications. Such servers must also contain large amounts of memory, disk, and network capacity to cope with the increased load.

Sun ONE Application Server can also be deployed on large “farms” of smaller hardware units, with business applications partitioned across the server instances. By employing one or more external load balancers, user traffic can be spread efficiently across all the application server instances. A horizontal scaling approach may improve availability and lower hardware costs, and it is well suited to certain types of applications. However, a greater number of application server instances and hardware nodes must then be administered.

Administration

A single Sun ONE Application Server installation on a machine can be used to create several instances. One or more instances are administered by a single Administration Server, and this grouping of the Administration Server and its administered instances is called a “domain.” Several administrative domains can be created to permit different people to independently administer groups of application server instances.

A single-instance domain can serve as a “sandbox” for a particular developer in a development environment. In this scenario, each developer administers his or her own application server without interfering with other application server domains. A small development group may instead choose to create multiple instances in a shared administrative domain, for collaborative development.

In a deployment environment, the server administrator can create administrative domains based on application and business function. For example, internal human resources applications may be hosted on one or more servers in one administrative domain, while external customer applications are hosted in several administrative domains in a server farm.

Sun ONE Application Server supports virtual server capability for web applications. A web application hosting service provider may wish to host different URL domains on a single Sun ONE Application Server process for efficient administration. The server administrator must determine whether this capability is needed or desirable.

At this point, the server administrator should be able to list all the applications with their broad performance characteristics and security requirements, and to sketch the deployment environment at a high level. The next step is to understand how to predict performance and perform capacity planning.


Capacity Planning

The previous discussion guides the administrator towards defining a preferred deployment architecture. However, the actual size of the deployment is determined by a process called capacity planning.

How does one predict either the capacity of a given hardware configuration, or the hardware resources required to sustain a specified application load and customer criteria? This is done through a careful performance benchmarking process, using the real application with realistic data sets and workload simulation. The basic steps are briefly described below.

  1. Determine performance on a single CPU.

     First determine the largest load that can be sustained with a known amount of processing power. You can obtain this figure by measuring the performance of the application on a uniprocessor machine. You can either leverage the performance numbers of an existing application with similar processing characteristics or, ideally, use the actual application and workload in a testing environment. Make sure that the application and data resources are configured in a tiered manner, exactly as they would be in the final deployment.

  2. Determine vertical scalability.

     Determine exactly how much additional performance is gained when you add processors. That is, you are indirectly measuring the amount of shared-resource contention that occurs on the server for a specific workload. You can either obtain this information from additional load testing of the application on a multiprocessor system, or leverage existing information from a similar application that has already been load tested. Running a series of performance tests on one to eight CPUs, in incremental steps, generally provides a good sense of the vertical scalability characteristics of the system (see the sketch following this list). Make sure that the application, application server, backend database resources, operating system, and so on are properly tuned, so that they do not skew the results of this study.

  3. Determine horizontal scalability.

     If sufficiently powerful hardware resources are available, a single hardware node may meet the performance requirements. However, for better service availability, two or more systems may be clustered. Using an external load balancer and workload simulation, determine the performance benefit of replicating the well-tuned application server node arrived at in step 2.
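As a rough illustration of how the vertical scalability measurements from step 2 can be summarized, the following sketch computes, for each CPU count tested, the measured throughput as a percentage of perfectly linear scaling from the single-CPU baseline. The throughput figures are hypothetical and stand in for your own benchmark results.

public class VerticalScalability {
    public static void main(String[] args) {
        // Hypothetical benchmark results: throughput (TPM) measured at 1, 2, 4, and 8 CPUs.
        int[] cpus = {1, 2, 4, 8};
        double[] throughput = {100.0, 190.0, 350.0, 580.0};

        double baseline = throughput[0];
        for (int i = 0; i < cpus.length; i++) {
            // Scaling efficiency: measured throughput as a percentage of the
            // throughput that perfectly linear scaling from one CPU would yield.
            double efficiency = 100.0 * throughput[i] / (baseline * cpus[i]);
            System.out.println(cpus[i] + " CPU(s): " + throughput[i] + " TPM, "
                    + Math.round(efficiency) + "% of linear scaling");
        }
    }
}

The CPU count at which this percentage begins to fall sharply corresponds to the “knee” of the curve described in Table 2-1 below.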

The following table describes the steps in capacity planning:

Table 2-1  Factors That Affect Performance - Applying Concepts

Concept: User Load
Applying the Concept: Concurrent sessions at peak load
Measurement: Transactions Per Minute (TPM); Web Interactions Per Second (WIPS)
Value Sources: ((Number of concurrent users at peak load) * Expected response time) / (Time between clicks). For example, (100 concurrent users * 2 seconds response time) / (10 seconds between clicks) = 20. (See the worked example following this table.)

Concept: Application Scalability
Applying the Concept: Transaction rate measured on one CPU
Measurement: TPM or WIPS
Value Sources: Measured from a workload benchmark. Needs to be performed at each tier.

Concept: Application Scalability
Applying the Concept: Scalability within a server (additional performance per additional CPU)
Measurement: Percentage gain per additional CPU
Value Sources: Based on curve fitting from the benchmark. Perform tests while gradually increasing the number of CPUs. Identify the “knee” of the curve, where additional CPUs provide uneconomical gains in performance. Requires tuning as described in later chapters of this guide. Needs to be performed at each tier and iterated if necessary. Stop here if this meets performance requirements.

Concept: Application Scalability
Applying the Concept: Scalability within a cluster (additional performance per additional server)
Measurement: Percentage gain per additional server process and/or hardware node
Value Sources: Use a well-tuned single application server instance, as in the previous step. Measure how much each additional server instance and/or hardware node improves performance.

Concept: Safety Margins
Applying the Concept: High availability requirements
Measurement: If the system must cope with failures, size it to meet performance requirements assuming that one or more application server instances are nonfunctional.
Value Sources: Different equations are used if high availability is required.

Concept: Safety Margins
Applying the Concept: Slack for unexpected peaks
Measurement: It is desirable to operate a server at less than its benchmarked peak, to allow some safety margin.
Value Sources: 80% system capacity utilization at peak loads may work for most installations. Measure your deployment under real and simulated peak loads.
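To make the user-load row of Table 2-1 concrete, the following sketch simply evaluates the formula above with the same illustrative figures: 100 concurrent users, a 2-second expected response time, and 10 seconds between clicks.

public class UserLoadEstimate {
    public static void main(String[] args) {
        // Illustrative inputs, taken from the Table 2-1 example.
        double concurrentUsersAtPeak = 100.0;
        double expectedResponseTimeSeconds = 2.0;
        double timeBetweenClicksSeconds = 10.0;

        // (Concurrent users at peak load * expected response time) / (time between clicks)
        double load = (concurrentUsersAtPeak * expectedResponseTimeSeconds)
                / timeBetweenClicksSeconds;

        System.out.println("Load to size for: " + load); // prints 20.0
    }
}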


Performance Tuning Sequence

Tuning a deployment may be performed in the following sequence:


Configuration Files

The files init.conf, obj.conf, and server.xml are Sun ONE Application Server configuration files containing many attributes that can be modified to improve performance. They are frequently mentioned within this guide and can be found in the directory:

<APPSERVER_HOME>/appserv/domains/<DOMAIN_NAME>/<SERVER_NAME>/config/

APPSERVER_HOME is the installation directory for the Sun ONE Application Server. DOMAIN_NAME and SERVER_NAME refer to the domain and server names for the server instance to be configured.
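For example, if the Sun ONE Application Server were installed in the hypothetical directory /opt/sun/appserver7, with a domain named domain1 and a server instance named server1, the configuration files would reside in /opt/sun/appserver7/appserv/domains/domain1/server1/config/.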

The following figure shows the configuration files for a given instance.

Figure 2-3  Sun ONE Application Server Configuration Files

The config/backup directories contain replicas of the server configuration files; these backups are created by the administration server instance. In general, users should not change the configuration files by hand. If the configuration files are edited by hand, make a copy of the files and place it in the backup directory, and then restart the server instance.


Logging and Performance

The Sun ONE Application Server produces log messages and exception stack trace output that are written to the log file, which can be found in the logs directory of the instance. Naturally, the volume of log activity can impact server performance, particularly in benchmarking situations.

By default, the log level is set to INFO. The log level can be set for all server subsystems by changing the level attribute of the log_service element. You can also override the logging level for a particular subsystem; for example, the mdb_container can produce log messages at a different level than the server default if you adjust the log_level attribute under the mdb_container element. To get more debug messages, set the log level to FINE, FINER, or FINEST. Under benchmarking conditions, it may be appropriate to set the log level to SEVERE.





Copyright 2003 Sun Microsystems, Inc. All rights reserved.