-
+
Server Configuration
This parameter can only be set in the postgresql.conf
file or on the server command line.
-
- If log_destination is set to csvlog,
- the log is output as comma-separated values. The format is:
- timestamp with milliseconds, username, database name, session id, host:port number,
- process id, per process line number, command tag, session start time, transaction id,
- error severity, SQL state code, statement/error message.
+ If csvlog is included in log_destination,
+ log entries are output in comma-separated
+ value (CSV) format, which is convenient for loading them into programs.
+ See the section Using CSV-Format Log Output below for details.
+ logging_collector must be enabled to generate
+ CSV-format log output.
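For example, a minimal postgresql.conf sketch that enables CSV-format log
output (the particular destination list is an illustrative assumption, not
part of this patch):

    logging_collector = on               # required for csvlog output
    log_destination = 'stderr,csvlog'    # emit both plain-text and CSV-format logs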
This parameter allows messages sent to
stderr,
- and CSV logs, to be
+ and CSV-format log output, to be
captured and redirected into log files.
- This method, in combination with logging to
- stderr,
- is often more useful than
+ This approach is often more useful than
logging to
syslog, since some types of messages
might not appear in
syslog output (a common example
is dynamic-linker failure messages).
This parameter can only be set at server start.
- logging_collector must be enabled to generate
- CSV logs.
This parameter can only be set in the postgresql.conf
file or on the server command line.
- If log_destination is set to csvlog,
+ If CSV-format output is enabled in log_destination,
.csv will be appended to the timestamped
- log_filename to create the final log file name.
- (If log_filename ends in .log, the suffix is overwritten.)
- In the case of the example above, the
- file name will be server_log.1093827753.csv
+ log file name to create the file name for CSV-format output.
+ (If log_filename ends in .log, the suffix is
+ replaced instead.)
+ In the case of the example above, the CSV
+ file name will be server_log.1093827753.csv.
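For example (a sketch assuming log_filename contains no %-escapes, in which
case the server appends the epoch of the log file's creation time):

    log_filename = 'server_log'
    # plain-text log file:  server_log.1093827753
    # CSV-format log file:  server_log.1093827753.csv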
- Here is a list of the various message severity levels used in
- these settings:
-
- DEBUG[1-5]
-    Provides information for use by developers.
-
- INFO
-    Provides information implicitly requested by the user,
-    e.g., during VACUUM VERBOSE.
-
- NOTICE
-    Provides information that might be helpful to users, e.g.,
-    truncation of long identifiers and the creation of indexes as part
-    of primary keys.
-
- WARNING
-    Provides warnings to the user, e.g., COMMIT
-    outside a transaction block.
-
- ERROR
-    Reports an error that caused the current command to abort.
-
- LOG
-    Reports information of interest to administrators, e.g.,
-    checkpoint activity.
-
- FATAL
-    Reports an error that caused the current session to abort.
-
- PANIC
-    Reports an error that caused all sessions to abort.
+ The table below explains the message
+ severity levels used by
+ PostgreSQL. If logging output
+ is sent to syslog or Windows'
+ eventlog, the severity levels are translated
+ as shown in the table.
+
+ Message severity levels
+
+ Severity         Usage                                  syslog    eventlog
+ ---------------  -------------------------------------  --------  -----------
+ DEBUG1..DEBUG5   Provides successively-more-detailed    DEBUG     INFORMATION
+                  information for use by developers.
+ INFO             Provides information implicitly        INFO      INFORMATION
+                  requested by the user, e.g., output
+                  from VACUUM VERBOSE.
+ NOTICE           Provides information that might be     NOTICE    INFORMATION
+                  helpful to users, e.g., notice of
+                  truncation of long identifiers.
+ WARNING          Provides warnings of likely problems,  NOTICE    WARNING
+                  e.g., COMMIT outside a transaction
+                  block.
+ ERROR            Reports an error that caused the       WARNING   ERROR
+                  current command to abort.
+ LOG              Reports information of interest to     INFO      INFORMATION
+                  administrators, e.g., checkpoint
+                  activity.
+ FATAL            Reports an error that caused the       ERR       ERROR
+                  current session to abort.
+ PANIC            Reports an error that caused all       CRIT      ERROR
+                  database sessions to abort.
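As an illustration of sending log output to syslog, where the above
translation applies, a postgresql.conf sketch (the facility and ident values
are assumptions for the example, not part of this patch):

    log_destination = 'syslog'
    syslog_facility = 'LOCAL0'   # syslog facility to log under
    syslog_ident = 'postgres'    # program name that tags each message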
- Using the csvlog
+ Using CSV-Format Log Output
Including csvlog in the log_destination list
provides a convenient way to import log files into a database table.
- Here is a sample table definition for storing csvlog output:
+ This option emits log lines in comma-separated-value format,
+ with these columns: timestamp with milliseconds, username, database
+ name, session id, host:port number, process id, per-process line
+ number, command tag, session start time, transaction id, error
+ severity, SQL state code, statement/error message.
+ Here is a sample table definition for storing CSV-format log output:
CREATE TABLE postgres_log
(
- log_time timestamp,
+ log_time timestamp with time zone,
username text,
database_name text,
- sessionid text not null,
+ sessionid text,
connection_from text,
- process_id text,
- process_line_num int not null,
+ process_id integer,
+ process_line_num bigint,
command_tag text,
- session_start_time timestamp,
- transaction_id int,
+ session_start_time timestamp with time zone,
+ transaction_id bigint,
error_severity text,
sql_state_code text,
statement text,
+ PRIMARY KEY (sessionid, process_line_num)
);
- In order to import into this table, use the COPY FROM command:
+ To import a log file into this table, use the COPY FROM
+ command:
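For example (a sketch; the file path is hypothetical):

    COPY postgres_log FROM '/full/path/to/logfile.csv' WITH csv;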
- There are a few things you need to import csvlog files easily and
- automatically:
+ There are a few things you need to do to import CSV log
+ files easily and automatically:
- Use a consistent, predictable naming scheme for your log files
- with log_filename. This lets you guess what
- the file name will be and know when an individual log file is
- complete and therefore ready to be imported.
+ Set log_filename and
+ log_rotation_age to provide a consistent,
+ predictable naming scheme for your log files. This lets you
+ predict what the file name will be and know when an individual log
+ file is complete and therefore ready to be imported.
- Set log_truncate_on_rotate = on so that old
- log data isn't mixed with the new in the same file.
+ Set log_truncate_on_rotation to on so
+ that old log data isn't mixed with the new in the same file.
- The example above includes a useful primary key on the log
- file data, which will protect against accidentally importing
- the same information twice. The COPY command commits all of
- the data it imports at one time, and any single error will
- cause the entire import to fail.
- If you import a partial log file and later import the file again
- when it is complete, the primary key violation will cause the
- import to fail. Wait until the log is complete and closed before
- import. This will also protect against accidentally importing a
- partial line that hasn't been completely written, which would
- also cause the COPY to fail.
+ The table definition above includes a primary key specification.
+ This is useful to protect against accidentally importing the same
+ information twice. The COPY command commits all of the
+ data it imports at one time, so any error will cause the entire
+ import to fail. If you import a partial log file and later import
+ the file again when it is complete, the primary key violation will
+ cause the import to fail. Wait until the log is complete and
+ closed before importing. This procedure will also protect against
+ accidentally importing a partial line that hasn't been completely
+ written, which would also cause COPY to fail.
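To illustrate the protection the primary key provides, a sketch of importing
the same file twice (the file name is hypothetical):

    -- First import of a complete, closed log file succeeds.
    COPY postgres_log FROM '/full/path/to/server_log.1093827753.csv' WITH csv;

    -- A second import of the same file fails with a duplicate-key error on
    -- PRIMARY KEY (sessionid, process_line_num); because COPY commits all of
    -- its rows at once, no duplicate rows are added.
    COPY postgres_log FROM '/full/path/to/server_log.1093827753.csv' WITH csv;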