Sun Update Connection - Automated Baseline Management Service 1.0 User's Guide

Chapter 1 Sun Update Connection - Automated Baseline Management Service 1.0 (Overview)

This chapter describes the Sun Update Connection - Automated Baseline Management Service (ABMS) 1.0 service offering that uses the Traffic Light Patch Management (TLP) 2.3 tool. The TLP tool automatically generates patch sets for multiple systems in large data centers. This chapter specifically provides overview information about the TLP 2.3 tool.


For the step-by-step procedures that are associated with installing and using the TLP tool, see Chapter 2, Working With the Sun Update Connection - Automated Baseline Management Service 1.0 (Tasks). For TLP reference information, see Chapter 3, Sun Update Connection - Automated Baseline Management Service 1.0 (Reference).

Detailed information on the recommended patching strategy for the Solaris Operating System (Solaris OS) is not included in this guide. For this information, go to http://docs.sun.com/db/doc/817-0574-12.

This guide is written for experienced engineers and system administrators. This guide does not explain the system administration tools in the Solaris OS or the fundamentals of patch management. For additional information on these topics, see Table 1–1.

Where to Find Additional Solaris System Administration and Patch Management Information

Table 1–1 Additional Patch Management and System Administration Documentation

Topic: OS Installation - Solaris JumpStart and advanced installations
For information: Solaris 10 Installation Guide: Custom JumpStart and Advanced Installations

Topic: OS Installation - Solaris Flash archive installations
For information: Solaris 10 Installation Guide: Solaris Flash Archives (Creation and Installation)

Topic: OS Installation - Network-based installations
For information: Solaris 10 Installation Guide: Network-Based Installations

Topic: OS Installation - Solaris Live Upgrade installations
For information: Solaris 10 Installation Guide: Solaris Live Upgrade and Upgrade Planning

Topic: Basic system administration
For information: System Administration Guide: Basic Administration

Topic: Advanced system administration
For information: System Administration Guide: Advanced Administration

Topic: Networking
For information: System Administration Guide: Network Services

Topic: Patch management in the Solaris OS
For information: The Solaris patch management strategy documentation at http://docs.sun.com/db/doc/817-0574-12

What Is the TLP Tool?

TLP is a command-line interface (CLI) tool that centralizes patch set creation for multiple systems in large data centers. The TLP tool enables you to plan patch deployment cycles that fit into your change management policies. The tool also has an HTML interface that provides an overview of the patch status of all your systems. The TLP tool was designed to assist you in performing many of the complex and time-consuming tasks that exist in large data centers. The main focus of TLP is patch set creation for Solaris systems. TLP uses external analyzers to analyze each system. The analyzer selects the patches to include in a given patch set, based on the system information that is gathered.

The TLP tool's functionality differs from the Patch Manager software. With Patch Manager, the tasks of patch analysis, the downloading of patches, patch set creation, and patch installation all take place on the target system. With the TLP tool, patch analysis, the downloading of patches, and patch set creation are performed on the server. Patch installation is a separate task that you perform by using scripts that are provided by the TLP tool.


Note –

The TLP tool does not install patch sets. Patch set installation is a separate task that you perform after the tool automatically generates the patch sets for your systems.


Advantages of Using the TLP Tool

This section highlights several key advantages of using the TLP tool.

TLP Features and Benefits

This section describes features and benefits of the TLP tool.

TLP Server and Client Software Installation Requirements

The TLP tool consists of two software packages: the TLP server software and the TLP client software. The TLP server software is installed on a dedicated system. The TLP client software is installed on all the systems in the data center that are to be patched. The TLP client software requires approximately 10 Mbytes of disk space; no further disk space is required on the TLP client systems.

On the TLP server system, you need to reserve additional disk space. For each set of baselines that you use, you need an additional 3 to 4 Gbytes of disk space. The amount of space that is required for each client system varies between 1 and 5 Mbytes, although in rare instances it can reach 200 Mbytes. Because the created patch sets are built from symbolic links that point to the patch repository, each patch set requires only about 100 Kbytes of disk space.
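
For example, a hypothetical data center with one set of baselines and 200 client systems would need roughly 3 to 4 Gbytes for the baselines, up to about 1 Gbyte (200 x 5 Mbytes) for the per-client data, and only about 20 Mbytes (200 x 100 Kbytes) for the generated patch sets.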

The TLP tool is a pure Perl application and uses no native code. Because all of the required modules are distributed within the TLP tool, no additional Perl modules are needed. The minimum Perl version that is required to run TLP is 5.005_03. This version is bundled with the Solaris 7 OS.
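
Because TLP ships with all of the Perl modules that it needs, a quick check of the installed Perl version is typically the only prerequisite check. The following commands are a minimal sketch; /usr/bin/perl is the standard Perl location on Solaris systems.

$ /usr/bin/perl -v
$ /usr/bin/perl -e 'print "$]\n"'

The second command prints the numeric version string, which must be 5.00503 (that is, 5.005_03) or higher.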

TLP Concepts

This section describes the main concepts and functionality of the TLP tool.

TLP Definitions

This section defines some of the key terms that are used within TLP.

Patch Set Creation

The most important objective of the TLP tool is patch set creation. The tool's job is to create individual patch sets that you can easily install. One of the main components of TLP is its phase concept. A snapshot is a closed set of patches, for example, a Sun Baseline or an Enterprise Installation Standards CD-ROM (EIS-CD). The tool tags each snapshot with a date. A snapshot becomes a phase when a color designation and a name are assigned to it. Using historical patches ensures that the patches that are included in a snapshot have matured over time. All of the snapshots are regularly checked for withdrawn patches, or bad patches, which are then replaced. The patch analyzer is responsible for determining the applicable patches for a given system.

Patch Analysis

System analysis engines are used to analyze the patch requirements of a single system. These analyzers gather information about installed and missing patches and provide an exact list of the patches that are missing on a given system. All of these patches are applicable to the target system. The TLP tool then compares the output of the analyzer against a given baseline and calculates the gap between both sets to create a system-specific patch set. TLP uses an EIS-CD to define a unique baseline for your data center. The tool takes the output of the selected analyzer and combines it with the patches in the selected baseline to create a new patch set. You can then install these patch sets with minimal effort. In mathematical terms, this process is a simple intersection of sets. However, in practice, the process is much more complicated.
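
The following sketch illustrates this set calculation with standard shell commands; it is not a command sequence that the TLP tool itself provides, and the file names are hypothetical. Assume that each file contains one patch ID per line: one file lists the patches that the analyzer reports as missing, and the other lists the patches in the baseline.

$ sort analyzer-missing.txt > missing.sorted
$ sort baseline-patches.txt > baseline.sorted
$ comm -12 missing.sorted baseline.sorted > patch-set.txt

The comm -12 command prints only the lines that appear in both files, that is, the patches that are both missing on the system and contained in the baseline. This intersection corresponds to the system-specific patch set, before dependency resolution and ordering are applied.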

You can configure the TLP tool to run with several different external system analyzers. You can also combine two or more analyzers. Because TLP uses analysis engines to create system-specific patch sets, the number of patches that are actually installed is minimized, thus reducing the required maintenance window time.

PatchPro, which the TLP client uses to analyze the target system, is one example of such an external analyzer.

Utilization of EIS-CDs

Utilizing EIS-CDs enables you to bring all of the systems in your data center to a unique patch level, meaning that each system in the data center appears to have been installed with the underlying EIS-CD. Because the patch sets for all of the systems were created with the same baseline, these systems also appear to have gone through installation at the same time.

TLP Modules and Roles

TLP is based on a modular framework. Within the TLP tool, some modules are specific to the TLP server and other modules are specific to the client. This modular framework enables you to replace individual modules, which are independent of each other. You can add new functionality or adapt the tool to changing conditions, for example, by replacing an existing patch analysis tool. You can also define different roles for different tasks. These roles are represented by interchangeable modules. The script that is used to create patch sets refers only to these roles, thereby enabling TLP to adapt to changes in its environment. In addition, you can declaratively include new modules in the default configuration file. Existing modules can be used as a basis for the development of new modules. To fully understand TLP functionality, you need a basic understanding of the tool's modular architecture.

The following output shows a portion of the TLP server configuration file. In this example, you can see the association between roles and modules.


<Module sunsolve>
Class  Tlp::Loader::SunSolve    # Perl module to use
       ....                     # Module-specific configuration
</Module>

Loader sunsolve                 # Associate the module with the Loader role

Note that the role Loader is associated with the module, sunsolve. The module sunsolve is associated with the class Tlp::Loader::SunSolve. If you chose to download patches from another location, you would modify the TLP configuration file, replacing this module with another Loader module.
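
For illustration only, the following fragment sketches such a replacement. The module name mymirror and the class Tlp::Loader::MyMirror are hypothetical placeholders and are not part of the TLP distribution.

<Module mymirror>
Class  Tlp::Loader::MyMirror    # Hypothetical Perl module that loads patches from a local mirror
       ....                     # Module-specific configuration
</Module>

Loader mymirror                 # Associate the hypothetical module with the Loader role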

For more information on the TLP server and client module definitions, see TLP Roles and Modules.

TLP Processes

This section describes TLP concepts and related processes, which include the baseline loop and the analysis loop.

Figure 1–1 describes the TLP cycles and processes. Note that some of these processes are manually performed, while others are automatically or semiautomatically performed. These processes are described in greater detail in the sections that follow.

Figure 1–1 TLP Process Cycles for Patch Set Creation and Installation

Graphic that shows TLP processes. Processes are colored light
grey for manual, white for semiautomatic, and dark grey for automatic.


Note –

The TLP tool's primary function is patch set creation. Although the TLP patch set installation process is included in Figure 1–1, the TLP tool does not provide this functionality. Patch set installation is a separate task that you perform after the tool has created the patch sets.


TLP Baseline Loop

TLP uses the concept of baselines, which are sets of patches and patch revisions that are frozen at a given point in time. Each set is tested as a unit and does not change after it is released. Baselines enable you to bring all of your systems to the same patch level. Because the patches in a baseline are tested together, the risk of patch incompatibilities is reduced. Having all the systems in a large data center at the same patch level makes administration and error detection much easier.

A new baseline is usually selected and installed once per quarter. For example, an EIS-CD dated January 2005 contains all of the patches that were burned in January 2005. When a new baseline is installed, TLP automatically updates the various reports.

TLP Analysis Loop

The analysis loop is central to the TLP process. It runs for each client system in the data center. After you have installed TLP, you can choose to set up a cron job that starts TLP automatically once per week, as sketched after this paragraph. The TLP client uses PatchPro to analyze the target system. The output of all the target systems is collected on the TLP server. The TLP server then compares the output with the installed baseline and creates a patch list. TLP automatically adds or removes patches according to whitelist and blacklist configuration files. These manually maintained files enable you to add or remove patches if necessary, for example, when a system requires special patches because of an application requirement. For more information on maintaining whitelist and blacklist configuration files, see How to Customize Whitelists and Blacklists. In addition, TLP checks the WITHDRAWN patches list and removes any bad patches from the baseline. You can update this list by using the TLP CLI commands, or by setting up a cron job. You should plan to update the WITHDRAWN patches list on a weekly basis. For more information on working with the WITHDRAWN patches list, see How to Update the WITHDRAWN Patches List.
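
The following crontab entries are a sketch only. The installation paths and the exact commands are assumptions and must be adapted to your installation; see the referenced procedures for the supported commands.

# On each client system, run the TLP client analysis every Sunday at 01:00.
0 1 * * 0 /opt/SUNWtlp/tlpc > /dev/null 2>&1

# On the TLP server, create the patch sets every Sunday at 04:00 without screen output.
0 4 * * 0 /opt/SUNWtlp/tlp --quiet main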

The TLP tool then takes the resulting list and checks for patch dependencies. If any missing patches are found, the tool attempts to download these patches from the SunSolve web site. Note that Internet connectivity is required to complete this task. In addition, you need a login and password to access the SunSolve web site to download patches.


Note –

The new SunSolve web site allows the downloading of patches with arbitrary revision levels, which was not possible with the old download method. This capability is required for the TLP tool to work properly.


If no Internet connection exists, you need to obtain any missing patches manually. After the patch dependencies are resolved, all of the patches are put in the correct order. At this time, the tool removes any patches that cannot be installed automatically. In addition, all firmware and OpenBoot PROM patches that require special treatment are stored in a separate directory for manual installation at a later time. The result is a final patch set that is placed in a dedicated directory. For ease of use, TLP also provides installation and back-out scripts, along with README files, for each patch set.

The last step in this process is the installation of the patch sets. This step is a separate task that you perform after the TLP tool creates the patch sets for your systems. For more information on the patch set installation process, see How to Install a TLP Patch Set.

Deployment Method

To install patch sets, you can choose one of the following installation options:

If you use Solaris Live Upgrade, the TLP patch sets fit the selected deployment method. For ease of deployment, TLP provides installation and back-out scripts, along with README files. These README files include additional useful data, such as a collection of special installation instructions. For more information about installing the TLP software, see TLP Software Installation (Task Map).

Sun Update Connection - Automated Baseline Management Service 1.0 Service Offering Activities and Deliverables

The Sun Update Connection - ABMS 1.0 service offering that uses the TLP 2.3 tool is divided into two stages, an initial stage and an ongoing stage. Table 1–2 describes these activities and deliverables.

Table 1–2 Sun Update Connection - ABMS 1.0 Service Offering Activities and Deliverables

Task                                                     Stage/Frequency of Occurrence
Determine strategy                                       Initial
Determine frequency of patch cycles                      Initial (Default: quarterly)
Define test scenarios                                    Initial
Determine fallback/back-up mechanism                     Initial
Install TLP tools                                        Initial
Instruct system administrators in the use of TLP tool    Initial
Run patch updates                                        Ongoing
Update patch baselines                                   Ongoing (up to 4 times per year)
Automatic data analysis                                  Ongoing
Automatic generation of system-specific patch sets       Ongoing
Install patch sets                                       Ongoing


Note –

This guide does not contain information on all of the tasks that are described in the previous table. Information about patch strategy, including the Patch Strategy Checklist, can be found at http://onestop/tlp. Information on Solaris patch management strategy can be found at http://docs.sun.com/db/doc/817-0574-12.


TLP Configuration

TLP configuration is divided into three parts:

For a more detailed explanation of each part, see TLP Configuration Information.

TLP Server Configuration File

TLP obtains its server configuration information from a single file. The name of the default configuration file depends on the name of the TLP script that you call. For example, if you call tlp, the default configuration file that is used is conf/tlp.cfg. If you create a symbolic link named cpc that points to tlp and call the tool through this link, the default configuration file that is used is conf/cpc.cfg. With this mechanism, you can easily select different default configurations. Note that you can always override the default configuration by using the --config command-line option. For more information, see TLP Default Command-Line Options and Arguments.
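
The following commands are a minimal sketch of this naming mechanism, run from the TLP installation directory. The configuration file path that is passed to the -c option is hypothetical.

$ ln -s tlp cpc                     # cpc is now a symbolic link to the tlp script
$ ./cpc main                        # uses conf/cpc.cfg as the default configuration
$ ./cpc -c /var/tmp/test.cfg main   # the -c option overrides the default configuration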

TLP Client Configuration File

TLP obtains its client configuration information from a single file. The name of the default configuration file depends on the name of the TLP client script that you call. For example, if you call tlpc, the default configuration file that is used is conf/tlpc.cfg. If you create a symbolic link named mytlpc that points to tlpc and call the tool through this link, the default configuration file that is used is conf/mytlpc.cfg. With this mechanism, you can easily select different default configurations. Note that you can always override the default configuration by using the --config command-line option.


Example 1–1 TLP Server Configuration File

This example shows the beginning portion of the TLP server configuration file. This configuration is also used as the default configuration in the conf/tlp.cfg file.


###################################################################
#
# Configuration file for TLP Patch-Management
# 
###################################################################


# ===================================================================
# Global variables 

# The global variable BaseDirectory is implicitly set to the TLP
# installation directory but can be overridden. Please note that all
# relative paths given here are relative to the current working
# directory.
# BaseDirectory /opt/SUNWtlp

# You can define your own variables here and refer to them later, e.g. if
# you define "DataDirectory  /usr/local/tlp" you can later use it like
# in "SnapshotDirectory $DataDirectory/repository"

DataDirectory $BaseDirectory/data

# Helper-Programs
# Tar /usr/bin/tar
# Uncompress /usr/bin/uncompress
# Gzip /usr/bin/gzip
# Zip /usr/bin/zip
# Bzcat /usr/bin/bzcat
# Pkginfo /usr/bin/pkginfo
# Pkgadd /usr/bin/pkgadd

...

# ===================================================================
# Patch Repository. Only one repository can be used at a time.

<Module repo>
Class Tlp::Repository::DirRepository	  # A file repository

  # Directory holding all snapshots
  SnapshotDirectory  $DataDirectory/repository
  # Directory holding phases
  PhaseDirectory     $DataDirectory/phase
  # Directory holding all Patches
  PatchDirectory     $DataDirectory/patches


  # Phases. The following directives should be given in the order
  # "most current" to "least current" (e.g. Phase "GREEN" before Phase
  # "AMBER") The interval is given in month or day differences to the
  # previous phase
	<Phase GREEN>
		Interval 30 Days
	</Phase>
	<Phase AMBER>
		Interval 3 Months
	</Phase>
</Module>

Repository repo

...

The format of the TLP server and client configuration files is similar to the well-known Apache configuration format. Lines beginning with a hash mark (#) and empty lines are ignored. Spaces at the beginning and the end of a line, as well as tab characters, are also ignored. If you need to include spaces at the beginning or end of a value, enclose the value in double quotation marks. A configuration variable is set by giving its name, followed by the assigned value; an equal sign between them is optional. Relative paths in path variables are resolved relative to the current directory from which you called tlp.
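
The following fragment, a sketch only, illustrates these syntax rules. The DataDirectory and Tar directives appear in the default configuration file; the MyValue variable is a hypothetical example.

# Comment lines and empty lines are ignored.
DataDirectory $BaseDirectory/data          # a name followed by a value
Tar = /usr/bin/tar                         # the equal sign is optional
MyValue "  spaces inside quotation marks are preserved  "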

TLP Command-Line Functionality

The TLP 2.3 tool uses the following commands, options, and arguments:


tlp [global options] command [command options] [arguments] 

TLP behavior is controlled by commands. These commands determine the mode of operation for the tool. For example, the command tlp main is used to generate patch sets. Other commands are used for maintenance or verification purposes. The name of a command reflects the name of the corresponding module, as it is defined in the default configuration file. See TLP Configuration Information. There are two different sets of options: global options, which apply to all commands, and command options, which are specific to the selected command.

TLP Global Options

The following are the TLP global options:

--conf config, -c config

Provides a configuration file. With this option, you can point TLP to a different configuration file. The default configuration file name depends on the name of the actual script, or the symbolic link that points to this script.

--help, -h

Prints out a help message. Depending on the use of the --verbose option, either the syntax alone or a detailed description is printed out.

--verbose, -v

Prints verbose output. This option influences help messages, as well as the output that is created during a TLP run.

--quiet

Suppresses all screen output. This option is especially useful for automated operations, such as cron jobs.

--version

Prints version information. This option prints the version number of TLP, the version of Perl that is used, and the version numbers of all the modules that are used within TLP.
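
For example, the global options can be combined with a command such as main. The configuration file path in the second command is hypothetical.

$ tlp --version
$ tlp --quiet -c /export/tlp/production.cfg main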

TLP Server Command Options

You can obtain a list of all the available TLP commands by typing the following command:


$ tlp -h

The following are the tlp command options and their associated TLP modules:

main (Tlp::Main)

Main module for creating a patch set

dir-producer (Tlp::Producer::DirProducer)

Module for creating a patch set within a directory

patchdiag (Tlp::Resolver::PatchDiag)

Dependency resolver that uses the patchdiag.xref file

snapshot (Tlp::Snapshot::Snapshot)

Module for handling snapshots

sunsolve (Tlp::Loader::SunSolve)

Module for downloading a patch by using a URL

repo (Tlp::Repository::DirRepository)

Module for handling snapshots stored in a directory

explorer (Tlp::Collector::Explorer)

Module for maintaining Explorer dumps

The command options are module-specific and vary from module to module. To obtain the options for a specific command, use the following command:


$ tlp -h command

Or, you can use this command:


$ tlp -v -h command

where command is the name of the module, for example, main.

TLP Reporting

Every time you create a patch set, the TLP tool automatically generates a report in two formats, HTML and ASCII text. These reports are stored in a directory for later retrieval and interpretation. Generation of the HTML report is completed within a few seconds. These reports contain essentially the same information. However, the HTML report is easier to interpret.


Note –

Detailed information about the text report that is automatically generated is not included in this guide.


The tool reports the patch status for each system in the data center in the form of a dashboard. For each phase, a color is designated. This color designation is determined by the default configuration file, which defines the GREEN and AMBER phases shown in Example 1–1.

Any system that is not compliant with a given phase is displayed in the HTML report with a RED color designation.

These color designations are based on how much the system's patches differ from the current patch baseline. The local patch database is updated at agreed-upon intervals, which in turn initiates the analysis process and the patch set creation. These patch sets are installed at the appropriate time, typically during a maintenance window.

TLP Logging

The default configuration file for logging is conf/log.cfg. This file contains some examples for fine-tuning TLP logging. By default, errors are logged into the log/tlp.log file and rotated when 1 Mbyte of logging data has been recorded. Up to three backup logs are kept. With TLP logging, you can do the following:

See the examples in log.cfg for additional ideas. A detailed description of all the logging features is not within the scope of this document.