[cups.general] [Fwd: [Printing-user-general] Why has nothing changed?]

Michael Sweet mike at easysw.com
Tue Aug 1 04:08:50 PDT 2006


Johannes Meixner wrote:
> Hello,
> 
> On Jul 31 11:34 wtautz wrote (shortened):
>> Michael Sweet wrote:
>>> wtautz wrote:
>>>> Michael, Would it be possible to have a per queue log files setup?
> ...
>>> As for providing separate log files per printer, there are some
>>> scaling issues to consider (one file per active printer...) as well
>>> as how to expose this in a secure way.
> ...
>> I have filed STR #1873.
> 
> This seems to be almost a duplicate of my old feature request
> for per job log files:
> http://www.cups.org/str.php?L1228
> 
> I think logs per queue are still a bit inconvenient for the user
> because the messages of the various jobs from various users are
> stored in one log file (but then at least no longer mixed up
> but in convenient job-by-job order) so that a user may have to
> search a bit for the messages of his particular job.

Actually, in the future it will be possible to send multiple
jobs to the same printer at the same time - some IBM "production"
printers need this to run at full speed.

> Regarding too many open file descriptors:
> Perhaps it is possible to add the log messages per job
> to the existing job control file?

The job control file is in IPP format and is rewritten whenever the
job state or other attributes change, so it isn't a good choice
for storing log data.  Regardless, we don't keep that file
open while the scheduler runs; we only open, read/write, and close
it as needed.

Right now the scheduler holds three file descriptors open for each
printing job (the backend pipe, the back-channel pipe, and the stderr
pipe), which limits the maximum number of simultaneously printing
jobs to about 230 under the typical 1024 file descriptor limit.
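The arithmetic behind that figure can be sketched as follows; this is
not CUPS code, and the ~330 descriptors assumed reserved for listeners,
client connections, and the access/error logs is my own guess, chosen
only to reproduce the numbers quoted here:

```c
/*
 * Rough sketch of the descriptor budget (assumed numbers, not CUPS
 * code): subtract the descriptors the scheduler needs for itself,
 * then divide what's left by the pipes each printing job consumes.
 */
int max_printing_jobs(int fd_limit, int reserved, int fds_per_job)
{
  return ((fd_limit - reserved) / fds_per_job);
}
```

With a 1024 limit, ~330 reserved, and 3 pipes per job this gives 231,
close to the 230 above; at 4 descriptors per job (one extra open log
file each) it gives 173, near the 170 figure below.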

Adding an open log file per job (whether the log is per-job or
per-printer) would reduce this to about 170 unless we implement
some sort of open log file "pool" to open and close the files
on demand.
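Such a pool might look like the following sketch; the pool size,
names, and LRU policy are all assumptions, not an existing CUPS
design:

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define POOL_SIZE 8			/* assumed cap; would be tuned */

typedef struct				/* one cached open log file */
{
  char path[256];			/* log file this slot serves */
  int  fd;				/* open descriptor for it */
  int  used;				/* non-zero once slot is filled */
  long last_used;			/* logical clock for LRU */
} log_slot_t;

static log_slot_t pool[POOL_SIZE];
static long       pool_clock = 0;

/*
 * Return an open descriptor for "path", opening it in append mode on
 * a miss and evicting the least-recently-used slot when the pool is
 * full, so at most POOL_SIZE log descriptors are ever open at once.
 */
int pool_get_fd(const char *path)
{
  int i, victim = 0;

  for (i = 0; i < POOL_SIZE; i ++)
    if (pool[i].used && !strcmp(pool[i].path, path))
    {
      pool[i].last_used = ++ pool_clock;
      return (pool[i].fd);		/* hit: already open */
    }

  for (i = 0; i < POOL_SIZE; i ++)	/* pick a free or LRU slot */
  {
    if (!pool[i].used)
    {
      victim = i;
      break;
    }
    if (pool[i].last_used < pool[victim].last_used)
      victim = i;
  }

  if (pool[victim].used)
    close(pool[victim].fd);		/* evict least recently used */

  snprintf(pool[victim].path, sizeof(pool[victim].path), "%s", path);
  pool[victim].fd        = open(path, O_WRONLY | O_CREAT | O_APPEND, 0644);
  pool[victim].used      = 1;
  pool[victim].last_used = ++ pool_clock;

  return (pool[victim].fd);
}
```

The descriptor count stays bounded at POOL_SIZE no matter how many
jobs are printing, at the cost of an open/close cycle on each miss.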

Opening the log file for every message logged would be
prohibitively expensive (read: very slow), and just redirecting
the filter's stderr to the log file would lose the time stamp and
log level info.  We might be able to pipe stderr into another
helper program (cups-logd?) that parses the log messages and writes
them with the standard info, but that will need to be carefully
written to handle multiple writers to the same log file...
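The core of such a helper might look like this sketch; "cups-logd" is
only the suggested name above, and the use of advisory flock() locking
to serialize multiple writers is my assumption, not an existing CUPS
mechanism:

```c
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <sys/file.h>
#include <unistd.h>

/*
 * Hypothetical log-helper core: take one filter stderr line (e.g.
 * "DEBUG: ready to print"), prepend the standard timestamp, and
 * append it to the shared log under an advisory lock so concurrent
 * writers can't interleave partial lines.  Returns bytes written.
 */
int log_line(FILE *logfp, const char *raw)
{
  char   stamp[64];			/* formatted timestamp */
  time_t now = time(NULL);
  int    n;

  strftime(stamp, sizeof(stamp), "[%d/%b/%Y:%H:%M:%S %z]",
           localtime(&now));

  flock(fileno(logfp), LOCK_EX);	/* serialize concurrent writers */
  n = fprintf(logfp, "%s %s\n", stamp, raw);
  fflush(logfp);			/* flush before releasing lock */
  flock(fileno(logfp), LOCK_UN);

  return (n);
}
```

A real helper would also need to parse the leading DEBUG/INFO/ERROR
prefix and honor the configured log level before writing.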

> I assume that when only INFO, WARN, and ERROR messages are stored,
> it should avoid flooding the files with tons of debug messages.

Without the debug messages, the log file would be pretty much useless
for tracking down problems... :)

> Perhaps simply a reserved block of fixed size for log messages
> in the existing job control file is sufficient?

No, as I mentioned above the job control file isn't suitable for this.

-- 
______________________________________________________________________
Michael Sweet, Easy Software Products           mike at easysw dot com
Internet Printing and Publishing Software        http://www.easysw.com
