
Limit the size of log files generated by the toolset (#3466)

When a framework generates an enormous log file, that's an indication the
framework has a problem we need to fix.  But the failure mode of "our
continuous benchmarking environment goes down because of a full disk,
needs manual intervention, and can't be restarted until we fix the
framework" is not acceptable.
Michael Hixson, 7 years ago
commit 567abd0d07
1 changed file with 6 additions and 1 deletion:
  1. toolset/utils/output_helper.py (+6, -1)

+ 6 - 1
toolset/utils/output_helper.py

@@ -9,6 +9,11 @@ seq = re.compile(r'\x1B\[\d+m')
 
 FNULL = open(os.devnull, 'w')
 
+# To prevent the entire disk from being consumed, refuse to
+# append more lines to a log file once it's grown too large.
+# Logs that hit this limit are probably repeating the same
+# message endlessly anyway.
+TOO_MANY_BYTES = 50 * 1024 * 1024
 
 def log(log_text=None, **kwargs):
     '''
@@ -47,7 +52,7 @@ def log(log_text=None, **kwargs):
             sys.stdout.write(new_log_text)
             sys.stdout.flush()
 
-        if file is not None:
+        if file is not None and os.fstat(file.fileno()).st_size < TOO_MANY_BYTES:
             file.write(seq.sub('', log_text))
             file.flush()
     except: