A LogEvent sink is a LogListener that performs a final action on a LogEvent. This can include writing the LogEvent to a file, sending the LogEvent as email, or writing the LogEvent to a database. Oracle ATG Web Commerce defines several different kinds of LogEvent sinks:


PrintStreamLogger

A PrintStreamLogger writes logging messages to a PrintStream. By default, a PrintStreamLogger is configured to write logging messages to System.out, which usually appears on the console.

A PrintStreamLogger is useful as a debugging tool during development. Oracle ATG Web Commerce defines a PrintStreamLogger called /atg/dynamo/service/logging/ScreenLog of the atg.nucleus.logging.PrintStreamLogger class. By default, the ScreenLog component is a logListener for all Nucleus components that implement ApplicationLogging. You can disable the ScreenLog component by setting its loggingEnabled property to false. This is the recommended setting for live Oracle ATG Web Commerce sites.
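For example, the ScreenLog component can be disabled with a properties-file override such as the following (a sketch; only the loggingEnabled setting is shown, and the localconfig override location is one common choice):

```
# <ATG10dir>/home/localconfig/atg/dynamo/service/logging/ScreenLog.properties
# Disable console logging, as recommended for live sites
loggingEnabled=false
```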


FileLogger

A FileLogger writes logging messages to a text file. Two properties define an instance of a FileLogger:




logFilePath
The path to the directory that holds the log file. The path can be relative to the directory where the Oracle ATG Web Commerce server runs. For example, logFilePath=./logs points to the <ATG10dir>/home/logs directory, while logFilePath=logs points to the <ATG10dir>/home/servers/<server>/logs directory.


logFileName
The actual name of the log file, within the logFilePath. For example, if logFilePath is ./logs and logFileName is warnings.log, the logging messages are written to <ATG10dir>/home/logs/warnings.log.

You can disable any FileLogger component by setting its loggingEnabled property to false.
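Putting these properties together, a FileLogger component definition might look like the following (a sketch; the component path and file names are illustrative):

```
# /atg/dynamo/service/logging/WarningsLog.properties (hypothetical component)
$class=atg.nucleus.logging.FileLogger
logFilePath=./logs
logFileName=warnings.log
loggingEnabled=true
```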


RotatingFileLogger

A RotatingFileLogger is a subclass of atg.nucleus.logging.FileLogger that periodically archives its log file to another directory. This prevents log files from growing without bound, while still letting you keep some log file history.

The archiving is controlled by the following properties:




scheduler
The Scheduler to use to perform the archiving. This is usually set to /atg/dynamo/service/Scheduler.


schedule
The Schedule to use to perform the archiving (see Configuring a Schedulable Component). This is often set to a CalendarSchedule, which can perform the archiving on a calendar-based schedule, such as every Sunday morning at 1:00 a.m.


logArchivePath
The directory where the archived log files are placed. This is usually different from the logFilePath, making it easier to manage your log files and your archive files separately.


maximumArchiveCount
The maximum number of archive files that are kept for a particular log file. After this maximum has been reached, the oldest file is discarded whenever the log file is archived.


archiveCompressed
Specifies whether log files are compressed before being archived. See the discussion of compression below.
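Putting these properties together, a RotatingFileLogger component might be configured as follows (a sketch; the component path and values are illustrative, not the actual InfoLog configuration):

```
# /atg/dynamo/service/logging/MyRotatingLog.properties (hypothetical component)
$class=atg.nucleus.logging.RotatingFileLogger
logFilePath=./logs
logFileName=info.log
scheduler=/atg/dynamo/service/Scheduler
schedule=calendar * . 1 1 0
logArchivePath=./logs/archives
maximumArchiveCount=10
archiveCompressed=true
```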

When the log file is archived, it is moved from the logFilePath to the logArchivePath and renamed <logFileName>.0. If a <logFileName>.0 already exists, it is first renamed <logFileName>.1; likewise, <logFileName>.1 is renamed <logFileName>.2, <logFileName>.2 is renamed <logFileName>.3, and so on. This rotation stops at the maximumArchiveCount: if the maximumArchiveCount is 10, <logFileName>.9 is not moved to <logFileName>.10, but is instead erased.

After the log file is archived, a new log file is opened in the logFilePath, and logging continues as normal.
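The rotation scheme described above can be sketched as follows (a standalone Python illustration of the renaming logic, not ATG code; the function and parameter names simply mirror the property names discussed here):

```python
import os
import shutil

def archive_log(log_file_path, log_file_name, log_archive_path, maximum_archive_count):
    """Mimic the RotatingFileLogger archiving step: shift old archives up
    by one, discard the oldest at the maximum, then move the live log
    file into slot 0."""
    os.makedirs(log_archive_path, exist_ok=True)

    # Discard the oldest archive if keeping it would exceed the maximum.
    oldest = os.path.join(log_archive_path, f"{log_file_name}.{maximum_archive_count - 1}")
    if os.path.exists(oldest):
        os.remove(oldest)

    # Rename <name>.N to <name>.N+1, working from highest to lowest.
    for n in range(maximum_archive_count - 2, -1, -1):
        src = os.path.join(log_archive_path, f"{log_file_name}.{n}")
        if os.path.exists(src):
            shutil.move(src, os.path.join(log_archive_path, f"{log_file_name}.{n + 1}"))

    # Move the live log into slot 0; a new log file is then opened.
    live = os.path.join(log_file_path, log_file_name)
    if os.path.exists(live):
        shutil.move(live, os.path.join(log_archive_path, f"{log_file_name}.0"))
```

With maximum_archive_count set to 10, the sketch never produces a <logFileName>.10: the loop stops shifting at .9, which is deleted first.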

You also have the option of compressing log files before they are archived. If the archiveCompressed property is set to true, log files are compressed into ZIP format as they are archived, and the archived log files have the extension .zip. These compressed log files can be read by a standard ZIP file reader, or extracted with the jar command that comes with the JDK:

jar xvf info.log.0.zip

One example instance of RotatingFileLogger can be found at /atg/dynamo/service/logging/InfoLog. Its properties include the following schedule, which archives the log file once a week:


schedule=calendar * . 1 1 0

EmailLogger

An EmailLogger takes log messages and sends them out as email to a list of recipients. This is useful for system administrators who wish to be notified whenever certain parts of the system malfunction. Administrators who use email-to-pager gateways can be paged when certain critical events take place.

The EmailLogger batches log messages before sending them as email. This is extremely valuable in situations where the system malfunctions in such a way that it is generating many error messages in a short amount of time. In such a situation, an administrator finds it much more helpful to receive, say, ten pieces of email with 100 error messages in each, than to receive 1000 messages with one error in each. The logger can be triggered to send its batched log messages when a certain number of messages are batched, or after a certain amount of time.
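The batching behavior can be sketched as follows (a standalone Python illustration, not ATG code; the class and parameter names are hypothetical, chosen to mirror the logEventThreshold property discussed below):

```python
class BatchingEmailSink:
    """Illustrative sketch of threshold-based batching: log events
    accumulate and are flushed as a single email per batch."""

    def __init__(self, send, log_event_threshold=10):
        self.send = send                            # callable that delivers one email body
        self.log_event_threshold = log_event_threshold
        self.batch = []

    def log_event(self, message):
        self.batch.append(message)
        if len(self.batch) >= self.log_event_threshold:
            self.flush()

    def flush(self):
        # Invoked here when the threshold is reached; a periodic schedule
        # would also call this so events never wait indefinitely.
        if self.batch:
            self.send("\n".join(self.batch))
            self.batch = []
```

With a threshold of 100, a burst of 1000 errors yields ten emails of 100 messages each, matching the scenario described above.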

When the logger sends its email message, it generates an EmailEvent, which is then sent to an EmailSender.

The following properties control the configuration of an EmailLogger:




logEventThreshold
The number of log messages that are batched before being sent as email.


schedule
Messages are not sent until the logEventThreshold is reached; if the threshold is 10 and only 9 log events have been issued, no email is sent until the 10th is received. By specifying a schedule, you can tell the EmailLogger to send out email according to a time trigger as well as a threshold. For example, if the schedule is set to every 5 minutes, email is sent within 5 minutes of receiving a log event, whether or not the log event threshold has been reached.


scheduler
If you specify a schedule, you must also specify a scheduler. This is usually set to /atg/dynamo/service/Scheduler.


emailMessageSender
A pointer to the EmailSender that performs the task of sending email. This is usually set to /atg/dynamo/service/SMTPEmailQueue.


defaultRecipients
A comma-separated list of the email addresses of the intended recipients. For example: sysadmin@example.com,test@example.com.


defaultFrom
The value that appears in the From field of the email.


defaultSubject
The value that appears in the Subject field of the email.


defaultBody
Text placed here appears at the top of the email body. The log messages are placed after the defaultBody.

A sample EmailLogger can be found at /atg/dynamo/service/logging/EmailLog. Its properties include the following:

schedule=every 5 minutes
defaultSubject=Main Reactor Core Down
defaultBody=Run now!