I'm interested in sending all Rails application logging to a database (MySQL or MongoDB), either in addition to or instead of a log file. There are a few reasons, most of which revolve around log file analysis. We already use Google Analytics, but there are a variety of things we want to do that aren't as workable in Analytics.
Furthermore, I'd like to investigate issues in "real time" by looking at the logs. Sifting through a log file is a tedious way to do that, and I'd like better searching and filtering than a log file (easily) allows.
Finally, I often want to examine something closer to visitor behavior: for example, tracing a user's path through the site so I can see the last page they were looking at before an error occurred. Since we have multiple app servers, the separate log files make this a real pain. If all the data were in a database, I could easily reconstruct the proper sequence of pages for a given visitor. I know Syslog would be one way to solve this particular piece (a single log file/repository), but I want to combine that with the better searching abilities I associate with database queries.
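To make that concrete, here's the kind of query I'd want to be able to run once everything is in one place (the RequestLog model and the visitor_id column are hypothetical, purely for illustration):

```ruby
# Reconstruct the sequence of pages a given visitor hit, across all
# app servers, ordered by time. Model and columns are made up.
RequestLog.where(visitor_id: visitor_id).order(:created_at).pluck(:path)
```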
I'm wondering what folks recommend to solve this. Do you log directly to a database, or do you dump log files into a DB (and if so, what's your approach for keeping it essentially real-time, as up to date as the log file itself)?
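For the direct-logging approach, I picture something along these lines: a minimal sketch assuming the mongo gem, with the "app_logs" database and "rails_logs" collection names made up for illustration:

```ruby
require "logger"
require "socket"
require "mongo"

# Sketch of a Logger subclass that writes each entry to a MongoDB
# collection instead of a file.
class MongoLogger < Logger
  def initialize(collection)
    @collection = collection
    super(nil) # no log device; we override #add below
  end

  def add(severity, message = nil, progname = nil)
    return true if severity < level
    message ||= block_given? ? yield : progname
    @collection.insert_one(
      severity: format_severity(severity),
      message:  message.to_s,
      host:     Socket.gethostname,
      time:     Time.now.utc
    )
    true
  end
end

# e.g. in config/environments/production.rb:
#   client = Mongo::Client.new(["localhost:27017"], database: "app_logs")
#   config.logger = MongoLogger.new(client[:rails_logs])
```

One entry per log line obviously means a lot of inserts, which is part of why I'm asking how others handle this in practice.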
I'm currently trying to decide at what level I want this logging, because another thing I've looked at is writing a small piece of Rack middleware that would log all requests. This would miss the extra output that normal Rails logging dumps out (the SQL, cache hits and misses, etc.), but it would achieve a big part of my goal, and it seems to have the advantage of not disturbing anything else in the system. A rough sketch of what I have in mind is below.
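Something like this, where RequestLog is a hypothetical ActiveRecord model with http_method/path/query/status/remote_ip/duration_ms columns:

```ruby
# Sketch of Rack middleware that records one row per request.
class RequestLogger
  def initialize(app)
    @app = app
  end

  def call(env)
    started = Time.now
    status, headers, body = @app.call(env)
    RequestLog.create(
      http_method: env["REQUEST_METHOD"],
      path:        env["PATH_INFO"],
      query:       env["QUERY_STRING"],
      status:      status,
      remote_ip:   env["REMOTE_ADDR"],
      duration_ms: ((Time.now - started) * 1000).round
    )
    [status, headers, body]
  end
end

# config/application.rb:
#   config.middleware.use RequestLogger
```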
Anyway, I'm not looking for one right answer; I'd rather start a discussion and hear what anyone else might be doing along these lines.