Log queries in JSON format #871
Cross-reference to #561.
Something along these lines, which I used in the mysql_proxy lua version I have:
There may well be some other interesting values we can pick up from the DB without causing any extra work.
I would be very interested in this feature; it would allow me to deploy ProxySQL at at least one of my customers, as they require full audit logging.
@renecannao Any update on this? A bunch of us are very eager to see this come to fruition.
+1
Eager to see this feature.
I have a preliminary prototype of this working on a fork, using the Protocol Buffers v3 JSON mapping. We explored using Protocol Buffers (as opposed to just using a JSON library) thinking we might simultaneously contribute a more programming-language-friendly binary serialization than what's currently available, before realizing that proto2 support had been added and subsequently removed quite some time ago (see the corresponding Google Group thread). I'm curious to hear how the maintainers feel about these changes and whether (either as-is or with modifications) they could be an acceptable solution to this feature request.
@rkennedy-zendesk : I replied to your email thread (a bit late). I like your prototype. This is far from a complete review, but I have some comments at this moment:
Alternatively to the two previous points:
Probably this solution allows more flexibility; for example, it may be possible to have mixed logging based on the value of
Ryan, following your question on the mailing list, I think that swapping the Protocol Buffers-based implementation for one built specifically for JSON output is the right way to go. My comment above is still relevant though, and I would appreciate your feedback on file header vs. event header (or yet another solution).
I'm working on this right now. Do you have any preferred JSON libraries for C/C++? I've been looking at a few and some of them have some serious baggage in terms of their dependencies. I'm going to give
I'm curious how this would work in practice for things wanting to consume the log file. We could consider optionally adding a file suffix based on the file format. I'm curious to hear feedback from the folks asking for the feature about what their use case would be, to know whether some sort of file/row header would be beneficial or burdensome. Our primary use case is going to be to have
I've rewritten the JSON support using json-c (fun fact: the Ubuntu 16 package is ~7MB with json-c and ~11MB with protobuf). Changes can be seen here; I've branched v2.0.0-json-support off the v2.0.0 mainline. Regarding multiple file formats, my personal preference would be to indicate the format in the filename, if anywhere. I'm mostly concerned about logging pipelines (many of which I suspect will be off-the-shelf solutions) needing to handle this extra bit of information. As of right now the log file is rolled over if the format changes, so each file should only contain records of a single format. Additionally, the setting parameter only allows one format at a time, so there's no possibility of producing
@rkennedy-zendesk Could I install and run your version in production? https://github.com/rkennedy-zendesk/proxysql/tree/v2.0.0-json-support is the v2.0.0-json-support branch, right? Any recommendations or tips I should know?
If you're feeling adventurous, I have an Ubuntu 16 build on my fork:
https://github.com/rkennedy-zendesk/proxysql/releases/tag/v2.0.0-jsonc-json-query-logging
It was built from a branch in my fork (
https://github.com/rkennedy-zendesk/proxysql/tree/v2.0.0-json-support)
about a month and a half ago, which is currently 70 commits behind the
v2.0.0 mainline. It's been lightly tested locally, but I've been
unsuccessful in figuring out how to get the ProxySQL tests to run as both
the tests and instructions appear to be a bit dated.
I checked the CHANGELOG but it's not updated. Has the JSON format been added to 2.0.0-rc2? Did anyone find a way to run eventslog_reader_sample in a follow mode, like tail -f?
When will this feature be released? Any update?
I really want this feature (query logging in JSON). When will it ship in a new version?
Enhancements:
- Added metrics rows_affected and rows_sent
- Added global variable mysql-eventslog_default_log: if 1, logging is enabled for every query unless explicitly disabled in mysql_query_rules.log. Default is 0
- Added global variable mysql-eventslog_format: default is 1 (legacy format). A value of 2 enables logging in JSON format (issue #871). Changing the value at runtime causes the current file to be closed and a new one created
- Fixed logging for prepared statements: until 2.0.5 only some percentage of prepared statements was correctly logged
- Extended tables stats_mysql_query_digest and stats_mysql_query_digest_reset to also include sum_rows_affected and sum_rows_sent
- Extended `eventslog_reader_sample.cpp` to support the new enhancements
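Putting the new variables together, enabling JSON query logging should look something like the following session against the ProxySQL admin interface (a sketch based on the variable names above; the filename is an example):

```sql
-- Connect to the ProxySQL admin interface (default port 6032), then:
SET mysql-eventslog_filename='queries.log';
SET mysql-eventslog_format=2;        -- 2 = JSON, 1 = legacy binary format
SET mysql-eventslog_default_log=1;   -- log every query unless a rule disables it
LOAD MYSQL VARIABLES TO RUNTIME;
SAVE MYSQL VARIABLES TO DISK;
```

Per the note above, changing mysql-eventslog_format at runtime closes the current log file and opens a new one, so a single file never mixes formats.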
This is extremely useful for audit logging as well, since as far as I can tell neither the MariaDB nor the Percona audit logging plugin, nor the general log, logs affected rows / sent rows. This can be quite helpful as a metric to warn / alert on.
Thanks for the feedback! :)
In case you are working with Go, you can use our lib:
Closing.
Query logging is already possible, but queries are logged in a binary format.
Logging queries in JSON format will make it possible to ship them to Elasticsearch, Kibana, and the like.
See http://woodygsd.blogspot.com.es/2014/07/how-do-you-log-problem-like-mysql.html for ideas.