Logfile full within 1s results in error #378
Do you have any updates or a fix for this issue?
No. I just observed that this may happen, but it doesn't occur in practice in our environment, so I didn't write a fix.
When I run on the target, I couldn't reproduce the issue. It does reproduce when I use CppUnit to test glog. Do you think we could check the size of the file?
As far as I have seen, the issue is not related to the file size itself. Whenever a new log file is opened within 1 s of the previous one, e.g. because the file size limit was reached within 1 s, the new file gets the same name as the previous file and therefore cannot be created, resulting in an error. I see two solutions:
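To make the failure mode concrete, here is a minimal sketch of the collision (this is not glog's code; the filename pattern and the PID 31822 are illustrative): the log file name embeds a timestamp with one-second resolution, so two files opened within the same second get identical names, and creating the second one fails with "File exists".

#include <cstdio>
#include <ctime>

int main() {
  char ts[32];
  std::time_t now = std::time(nullptr);
  std::strftime(ts, sizeof(ts), "%Y%m%d-%H%M%S", std::localtime(&now));
  // Two runs of this within the same second print the same name.
  std::printf("prog.host.user.log.INFO.%s.%d\n", ts, 31822);
  return 0;
}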
Yes, I will write unit tests for the within-1-s case and test again. One more question: do you know of an API to get the buffer size when using glog?
Set FLAGS_max_log_size, or the environment variable GLOG_max_log_size. See logging.cc: GLOG_DEFINE_int32(max_log_size, 1800, ...)
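For example (a sketch; the value 100 is arbitrary, and max_log_size is in MB):

// Equivalently, without recompiling (assumes your glog build reads
// GLOG_-prefixed environment variables): GLOG_max_log_size=100 ./prog
FLAGS_max_log_size = 100;  // rotate log files at roughly 100 MB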
I have a small test that limits the max log size to a few MB and then writes logs fast:
FLAGS_max_log_size = 5;
for (int i = 0; i < 1024000; i++)
  LOG(INFO) << i;
Results in:
COULD NOT CREATE A LOGGINGFILE 20181030-122434.31822!Could not create logging file: File exists
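For reference, a self-contained version of the reproduction above (a sketch, not the exact test from this thread; the /tmp log directory is an assumption):

#include <glog/logging.h>

int main(int argc, char* argv[]) {
  FLAGS_max_log_size = 5;   // cap each log file at ~5 MB
  FLAGS_log_dir = "/tmp";   // assumed writable directory for rotated files
  google::InitGoogleLogging(argv[0]);
  // Logging this fast fills a 5 MB file in well under one second,
  // which triggers the same-second filename collision described above.
  for (int i = 0; i < 1024000; i++)
    LOG(INFO) << i;
  return 0;
}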
If I increase max_log_size, I see that the "next file" that is created is always an integral number of seconds newer. E.g. if the first file is "p1.h1.u1log.INFO.20181030-122434.31822", the next one is "p1.h1.u1log.INFO.20181030-122435.31822", never something like "...122434.90000".
I.e. whenever a log file fills up within 1 s, we get this error.
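One generic way to avoid this kind of collision (a sketch of the idea only; this is not glog's code, and not necessarily one of the solutions mentioned above) is to append a sequence number when the timestamped name is already taken:

#include <string>
#include <sys/stat.h>

// Returns a name that does not exist yet by appending ".1", ".2", ...
// to the second-resolution base name. Sketch only; it ignores the race
// between the existence check and the later open.
// Usage: std::string name = UniqueLogName("prog.log.INFO.20181030-122434.31822");
static std::string UniqueLogName(const std::string& base) {
  struct stat st;
  if (stat(base.c_str(), &st) != 0) return base;  // base name is free
  for (int seq = 1;; ++seq) {
    std::string candidate = base + "." + std::to_string(seq);
    if (stat(candidate.c_str(), &st) != 0) return candidate;
  }
}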