I need to handle duplicates of specific exceptions in my logs.
I use slf4j and logback for logging in my application. The application uses several external services (DB, Apache Kafka, third-party libs, etc.). When the connection to such a service is lost, I get an exception such as:
[kafka-producer-network-thread | producer-1] WARN o.a.kafka.common.network.Selector - Error in I/O with localhost/127.0.0.1
java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_45]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_45]
at org.apache.kafka.common.network.Selector.poll(Selector.java:238) ~[kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:192) [kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:191) [kafka-clients-0.8.2.0.jar:na]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:122) [kafka-clients-0.8.2.0.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
The problem is that I get this message every second. It floods my log file, which grows to a few GB within N hours.
I want this exception to be logged only once every 1-5 minutes. Is there any way to suppress duplicate exceptions in the log file?
Possible solutions:
Ignore all logs for a specific package and class.
[bad, because I could miss important messages]
Use http://logback.qos.ch/manual/filters.html#DuplicateMessageFilter
[bad, because I can only set the AllowedRepetitions and CacheSize properties; it matches all messages, but I need to target only a specific exception]
Write a custom filter (see the sketch below).
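For the custom-filter route, here is a minimal sketch, assuming plain logback-classic and that you are free to register your own TurboFilter. It lets an event carrying a given exception type through at most once per configurable window and stays neutral for everything else, so all other messages are unaffected. The class ThrottlingExceptionFilter, the package com.example.logging and the suppressSeconds property are illustrative names, not an existing logback class.

package com.example.logging;

import java.net.ConnectException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.slf4j.Marker;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.TurboFilter;
import ch.qos.logback.core.spi.FilterReply;

// Hypothetical filter: deny repeated events that carry the matched exception
// type within a time window, stay NEUTRAL for everything else.
// Registered in logback.xml roughly as:
//   <turboFilter class="com.example.logging.ThrottlingExceptionFilter">
//     <suppressSeconds>60</suppressSeconds>
//   </turboFilter>
public class ThrottlingExceptionFilter extends TurboFilter {

    // suppression window in seconds; settable from logback.xml
    private int suppressSeconds = 60;

    // last timestamp (ms) at which an event with a given exception class was let through
    private final Map<String, Long> lastLogged = new ConcurrentHashMap<>();

    @Override
    public FilterReply decide(Marker marker, Logger logger, Level level,
                              String format, Object[] params, Throwable t) {
        // Only throttle the exception type(s) we care about; null or other
        // throwables pass through untouched.
        if (!(t instanceof ConnectException)) {
            return FilterReply.NEUTRAL;
        }
        String key = t.getClass().getName();
        long now = System.currentTimeMillis();
        Long last = lastLogged.get(key);
        if (last == null || now - last >= suppressSeconds * 1000L) {
            lastLogged.put(key, now);     // benign race: at worst one extra event slips through
            return FilterReply.NEUTRAL;   // log this occurrence
        }
        return FilterReply.DENY;          // drop the duplicate
    }

    public void setSuppressSeconds(int suppressSeconds) {
        this.suppressSeconds = suppressSeconds;
    }
}

A TurboFilter seems like the right hook here because it runs before the logging event is built, so suppressed duplicates are cheap, and it receives the Throwable directly when the caller passes it to the logger, as the Kafka Selector does in the message above. The same idea could also be implemented as a regular appender-attached filter if you only want to throttle a single appender, although there you would inspect the event's throwable proxy instead of the Throwable itself.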
Or maybe you know of an already implemented solution?