
Logfile full within 1s results in error #378

Closed
plmuon opened this issue Oct 30, 2018 · 7 comments

Comments

@plmuon
Contributor

plmuon commented Oct 30, 2018

I have a small test that limits the max log size to a few MB and then writes logs fast:

    FLAGS_max_log_size = 5;
    for (int i = 0; i < 1024000; i++)
      LOG(INFO) << i;

Results in:

    COULD NOT CREATE A LOGGINGFILE 20181030-122434.31822!Could not create logging file: File exists

If I increase the max_log_size, I see that the "next file" that is created is always an integral number of seconds newer. E.g. if the first file is "p1.h1.u1log.INFO.20181030-122434.31822", the next one would be "p1.h1.u1log.INFO.20181030-122435.31822", not something like "...122434.90000".

I.e. whenever a log file is filled within 1s, we get this error.
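
For completeness, a self-contained version of the reproduction is sketched below; the initialization call is the usual glog setup, and the loop count is arbitrary: anything that fills a 5 MB file in well under a second will do.

    #include <glog/logging.h>

    int main(int argc, char* argv[]) {
      google::InitGoogleLogging(argv[0]);
      FLAGS_max_log_size = 5;   // cap each log file at roughly 5 MB
      // ~1M short lines fill the 5 MB file in well under a second, so glog rolls
      // over to a new file whose name carries the same second-granularity
      // timestamp, and creating it fails with "File exists".
      for (int i = 0; i < 1024000; ++i)
        LOG(INFO) << i;
      return 0;
    }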

@letrthong

Do you have any update on a fix for this issue?

@plmuon
Contributor Author

plmuon commented Feb 26, 2019

No, I just observed that this may happen, but it doesn't happen in practice in our environment so I didn't make a fix.

@letrthong

When I run on the target I couldn't reproduce the issue. It only reproduces when I use CppUnit to test glog. Do you think we could check the size of the file?

@plmuon
Contributor Author

plmuon commented Mar 12, 2019

As far as I had seen, the issue is not related to the file size itself. What happens is that whenever a new file is opened within 1s of the previous one, e.g. because the file size limit was reached within 1s, the new file gets the same name as the previous file and therefore cannot be opened, resulting in the error.

I see two solutions:

  1. refuse to reopen a file within 1s, i.e. allow the current file to exceed the specified maximum size, or
  2. reopen with an alternate name when this happens (e.g. append a number until the newly opened file has a unique name, as sketched below), or rename the previous file first.
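
A rough sketch of option 2 follows; the helper name and the POSIX open() call are illustrative assumptions, not glog's actual rollover code:

    #include <cerrno>
    #include <fcntl.h>
    #include <string>

    // Hypothetical helper: create the log file exclusively, appending ".1", ".2",
    // ... to the candidate name until creation succeeds, so a rollover within the
    // same second no longer collides with the file created a moment earlier.
    int OpenUniqueLogFile(const std::string& base_name, std::string* chosen_name) {
      std::string candidate = base_name;
      for (int suffix = 1; ; ++suffix) {
        int fd = open(candidate.c_str(), O_WRONLY | O_CREAT | O_EXCL, 0664);
        if (fd >= 0) {            // freshly created, no name collision
          *chosen_name = candidate;
          return fd;
        }
        if (errno != EEXIST)      // a real error (permissions, ENOSPC, ...): give up
          return -1;
        candidate = base_name + "." + std::to_string(suffix);
      }
    }

Option 1 is even simpler (skip the rollover while the timestamp has not advanced), at the cost of letting the current file overshoot the configured limit.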

@letrthong

Yes, I will write a unit test that fills the log within 1s to verify this again. One more thing: do you know of an API to get the buffer size when using glog?

@plmuon
Contributor Author

plmuon commented May 21, 2019

Set FLAGS_max_log_size. See logging.cc:

    GLOG_DEFINE_int32(max_log_size, 1800,
                      "approx. maximum log file size (in MB). A value of 0 will "
                      "be silently overridden to 1.");

@plmuon
Contributor Author

plmuon commented May 21, 2019

Set FLAGS_max_log_size, or the environment variable GLOG_max_log_size.

See logging.cc:

    GLOG_DEFINE_int32(max_log_size, 1800,
                      "approx. maximum log file size (in MB). A value of 0 will "
                      "be silently overridden to 1.");

@plmuon plmuon closed this as completed May 21, 2019
@plmuon plmuon reopened this May 21, 2019
@sergiud sergiud closed this as completed Mar 30, 2021
@sergiud sergiud mentioned this issue May 6, 2021