Spring Boot: using Logback to generate log files by date and size

Posted by jklanka on Mon, 06 Apr 2020 10:22:10 +0200

Adapted from: https://www.ericgg.com/archives/3848.html

By default, Spring Boot does not split its log file by day, so in production the log keeps growing, which makes troubleshooting harder. After going through a lot of material, I ended up with the configuration below, which splits log files cleanly by day, by size, and by log level. The configuration is as follows.

Because Spring Boot favours convention over configuration, the same applies to logging. The official documentation (https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-logging.html) notes that several logging frameworks are already included as dependencies, with Logback as the default choice. In other words, Spring Boot already depends on Logback, so you do not need to add the dependency manually.
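As a side note (a minimal sketch, assuming a Maven build with the web starter), Logback arrives transitively through spring-boot-starter-logging, which the standard starters already pull in, so the only dependency you declare is the starter itself:

<!-- Logback comes in transitively via spring-boot-starter-logging; no explicit Logback dependency is needed -->
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-web</artifactId>
</dependency>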

First of all, the Logback configuration is usually different in each environment, so my approach is as follows:

The project's application.properties is already split into environment-specific property files selected through spring.profiles.active: application-dev.properties (development), application-uat.properties (UAT/testing) and application-prod.properties (production). The base application.properties may contain little more than spring.profiles.active, with the real settings living in the profile-specific files, because Spring Boot always loads application.properties by default and then uses spring.profiles.active to decide which profile-specific file to apply. That makes four properties files in total.
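A minimal sketch of that layout (the property value here is just a placeholder for illustration):

# application.properties - may contain little more than the active profile
spring.profiles.active=prod

# application-dev.properties / application-uat.properties / application-prod.properties
# hold the real, environment-specific settings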

OK, that was a brief look at the application.properties setup. Now for the main part.

Next, create three files: logback-spring-dev.xml, logback-spring-uat.xml and logback-spring-prod.xml, then point logging.config in each environment's properties file at the corresponding Logback XML file:

logging.config=classpath:logback-spring-prod.xml
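Spelled out per environment (assuming the file names above), each profile-specific properties file points at its own Logback configuration:

# application-dev.properties
logging.config=classpath:logback-spring-dev.xml

# application-uat.properties
logging.config=classpath:logback-spring-uat.xml

# application-prod.properties
logging.config=classpath:logback-spring-prod.xml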

The contents of the Logback XML files are as follows.

In logback-spring-dev.xml (the development environment) there is no need to write to a file; printing to the console is enough:

<?xml version="1.0" encoding="UTF-8"?>

<!-- Log levels from high to low: OFF, ERROR, WARN, INFO, DEBUG, TRACE, ALL -->
<!-- Output rule: a log event is written only if its level is equal to or higher than the root level -->
<!-- Each appender below can use a filter so that only messages of exactly one level end up in its file -->


<!-- Attributes:
	scan: when true, the configuration is reloaded whenever the file changes. Defaults to true.
	scanPeriod: how often the configuration file is checked for changes. If no time unit is given, milliseconds are assumed; only effective when scan is true. Defaults to 1 minute.
	debug: when true, Logback prints its internal status information so you can see what it is doing. Defaults to false. -->
<configuration scan="true" scanPeriod="60 seconds" debug="false">
	<contextName>d1money-web-ys-ems</contextName>

	<!-- ConsoleAppender: output logs to the console -->
	<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
		<!-- Format the log -->
		<encoder>
			<pattern>
				%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger -%msg%n
			</pattern>
		</encoder>
	</appender>

	<logger name="java.sql.PreparedStatement" value="DEBUG" />
	<logger name="java.sql.Connection" value="DEBUG" />
	<logger name="java.sql.Statement" value="DEBUG" />
	<logger name="com.ibatis" value="DEBUG" />
	<logger name="com.ibatis.common.jdbc.SimpleDataSource" value="DEBUG" />
	<logger name="com.ibatis.common.jdbc.ScriptRunner" level="DEBUG" />
	<logger name="com.ibatis.sqlmap.engine.impl.SqlMapClientDelegate"
		value="DEBUG" />
	<logger name="com.apache.ibatis" level="TRACE" />



	<!-- root level DEBUG -->
	<root level="debug">
		<!-- console output  -->
		<appender-ref ref="STDOUT" />
	</root>
</configuration>  

Then there are logback-spring-prod.xml and logback-spring-uat.xml. The UAT configuration differs from prod only in the file location, so only one copy is pasted here; just change the path (the only line that differs is shown after the configuration below). Production and UAT should not log to the console, otherwise the catalina.out file will explode.

<?xml version="1.0" encoding="UTF-8"?>

<!-- Log levels from high to low: OFF, ERROR, WARN, INFO, DEBUG, TRACE, ALL -->
<!-- Output rule: a log event is written only if its level is equal to or higher than the root level -->
<!-- Each appender below uses a filter so that only messages of exactly one level end up in its file -->


<!-- Attributes:
	scan: when true, the configuration is reloaded whenever the file changes. Defaults to true.
	scanPeriod: how often the configuration file is checked for changes. If no time unit is given, milliseconds are assumed; only effective when scan is true. Defaults to 1 minute.
	debug: when true, Logback prints its internal status information so you can see what it is doing. Defaults to false. -->
<configuration scan="true" scanPeriod="60 seconds" debug="false">
	<contextName>d1money-web-ys-ems</contextName>
	
	<!-- Log file output directory -->
	<property name="log_dir" value="/soft/apache-tomcat-8.5.30-ems/logs" />
	<!-- Keep at most 30 days of log history -->
	<property name="maxHistory" value="30" />
	<!-- Roll over to a new file once the active file exceeds this size -->
	<property name="maxFileSize" value="10MB" />


	<!-- ERROR-level log -->
	<!-- RollingFileAppender: writes to the active file and rolls it over into archive files once a condition is met -->
	<appender name="ERROR"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<!-- Filter: record only ERROR-level messages -->
		<filter class="ch.qos.logback.classic.filter.LevelFilter">
			<level>ERROR</level>
			<onMatch>ACCEPT</onMatch>
			<onMismatch>DENY</onMismatch>
		</filter>
		<!-- TimeBasedRollingPolicy: the most commonly used rolling policy; it rolls over based on time and also acts as the trigger for the rollover -->
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			
			<!-- Output location; the path can be relative or absolute -->
			<fileNamePattern>
				${log_dir}/app_error.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>
			<!-- Optional: controls the maximum number of archive files to keep; older files are deleted. For example, with monthly rollover and maxHistory set to 6, only the last 6 months of files are kept. Note that directories created for archiving are deleted together with the old files. -->
			<maxHistory>${maxHistory}</maxHistory>
			<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>${maxFileSize}</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
			
		</rollingPolicy>
		

		<!-- Alternative: FixedWindowRollingPolicy keeps a fixed window of archives. When the active file exceeds the trigger size (e.g. 20MB) a new archive is created; with a window of 1 to 3, once three archives exist the oldest one is overwritten.
			<rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
			<fileNamePattern>${log_dir}/app_error.%i.log.zip</fileNamePattern>
			<minIndex>1</minIndex>
			<maxIndex>3</maxIndex> </rollingPolicy> -->
		<!-- SizeBasedTriggeringPolicy watches the size of the active file and tells the RollingFileAppender to roll over once it exceeds the given size.
			<triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
			<maxFileSize>5MB</maxFileSize>
			</triggeringPolicy> -->

		<encoder>
			<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
		</encoder>
	</appender>



	<!-- WARN Level log appender -->
	<appender name="WARN"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<!-- Filter: record only WARN-level messages -->
		<filter class="ch.qos.logback.classic.filter.LevelFilter">
			<level>WARN</level>
			<onMatch>ACCEPT</onMatch>
			<onMismatch>DENY</onMismatch>
		</filter>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<!-- Roll over by day -->
			<fileNamePattern>
				${log_dir}/app_warn.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>
			<!-- Maximum log history 30 days -->
			<maxHistory>${maxHistory}</maxHistory>
			<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>${maxFileSize}</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
		</rollingPolicy>
		<encoder>
			<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
		</encoder>
	</appender>




	<!-- INFO Level log appender -->
	<appender name="INFO"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<!-- Filter: record only INFO-level messages -->
		<filter class="ch.qos.logback.classic.filter.LevelFilter">
			<level>INFO</level>
			<onMatch>ACCEPT</onMatch>
			<onMismatch>DENY</onMismatch>
		</filter>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<!-- Roll over by day -->
			<fileNamePattern>
				${log_dir}/app_info.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>
			<!-- Maximum log history 30 days -->
			<maxHistory>${maxHistory}</maxHistory>
			<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>${maxFileSize}</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
		</rollingPolicy>
		<encoder>
			<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
		</encoder>
	</appender>




	<!-- DEBUG Level log appender -->
	<appender name="DEBUG"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<!-- Filter: record only DEBUG-level messages -->
		<filter class="ch.qos.logback.classic.filter.LevelFilter">
			<level>DEBUG</level>
			<onMatch>ACCEPT</onMatch>
			<onMismatch>DENY</onMismatch>
		</filter>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<!-- Roll over by day -->
			<fileNamePattern>
				${log_dir}/app_debug.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>
			<!-- Maximum log history 30 days -->
			<maxHistory>${maxHistory}</maxHistory>
			<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>${maxFileSize}</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
		</rollingPolicy>
		<encoder>
			<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
		</encoder>
	</appender>

	<!-- TRACE Level log appender -->
	<appender name="TRACE"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<!-- Filter: record only TRACE-level messages -->
		<filter class="ch.qos.logback.classic.filter.LevelFilter">
			<level>TRACE</level>
			<onMatch>ACCEPT</onMatch>
			<onMismatch>DENY</onMismatch>
		</filter>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<!-- Roll over by day -->
			<fileNamePattern>
				${log_dir}/app_trace.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>
			<!-- Maximum log history 30 days -->
			<maxHistory>${maxHistory}</maxHistory>
			
			<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>${maxFileSize}</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
		</rollingPolicy>
		<encoder>
			<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
		</encoder>
	</appender>

	<logger name="java.sql.PreparedStatement" value="DEBUG" />
	<logger name="java.sql.Connection" value="DEBUG" />
	<logger name="java.sql.Statement" value="DEBUG" />
	<logger name="com.ibatis" value="DEBUG" />
	<logger name="com.ibatis.common.jdbc.SimpleDataSource" value="DEBUG" />
	<logger name="com.ibatis.common.jdbc.ScriptRunner" level="DEBUG" />
	<logger name="com.ibatis.sqlmap.engine.impl.SqlMapClientDelegate"
		value="DEBUG" />
	<logger name="com.apache.ibatis" level="TRACE" />

	<!-- root level DEBUG -->
	<root level="debug">
		<!-- File output -->
		<appender-ref ref="ERROR" />
		<appender-ref ref="INFO" />
		<appender-ref ref="WARN" />
		<appender-ref ref="DEBUG" />
		<appender-ref ref="TRACE" />
	</root>
</configuration>  
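As mentioned above, logback-spring-uat.xml is identical except for the output directory; the only line that changes is the log_dir property (the UAT path below is just an assumed placeholder):

	<!-- in logback-spring-uat.xml, everything else stays the same as in the prod file -->
	<property name="log_dir" value="/soft/apache-tomcat-8.5.30-ems-uat/logs" />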

The end result is log files split by both date and size: a new file is started each day and, when a file exceeds the configured maximum size, a new one is created with an index that counts up from 0.
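With the prod configuration above, the files on disk end up looking roughly like this (dates and indices are only illustrative):

/soft/apache-tomcat-8.5.30-ems/logs/app_error.2020-04-06.0.log
/soft/apache-tomcat-8.5.30-ems/logs/app_info.2020-04-06.0.log
/soft/apache-tomcat-8.5.30-ems/logs/app_info.2020-04-06.1.log   (created once the .0 file passed 10MB)
/soft/apache-tomcat-8.5.30-ems/logs/app_warn.2020-04-07.0.log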
