printing UTF-8 text mixed with PCL code

Helge Blischke h.blischke at acm.org
Sun Jul 17 05:48:39 PDT 2011


Matthias Apitz wrote:

> Hello,
> 
> We successfully ported a complex Library Management System from ISO 8859-1
> to UTF-8 (database, UNIX application servers, Java frontends, etc.). This is
> only background to explain where one leftover problem in printing comes
> from.
> 
> One of the application's features is producing small slips of paper (A5)
> with (in the past) a mixture of ISO 8859-1 text describing a book (title,
> author, etc.) and some library management information, for example the
> shelf number. For workflow reasons in the library, some of this text must be
> highlighted in the printout (bold, larger font) and one number is printed in
> an OCR-B font.
> 
> In the old ISO 8859-1 world this was just a mix of text with some PCL
> sequences to let the printer do its work correctly, i.e. switching to bold
> and to OCR-B (a sketch of such a stream follows this message). Now in the
> UTF-8 world we can print UTF-8 text to PostScript with CUPS' texttops
> filter, but of course we cannot embed the PCL for the above-mentioned fonts
> into that. The currently implemented solution is a fallback to ISO 8859-1
> (running the mix of text and PCL through iconv), but of course it cannot
> stay this way.
> 
> We could change the application that produces the data to emit, for example,
> HTML with tags for the bold font; I was also thinking of producing groff or
> LaTeX code for additional postprocessing that would end up as PostScript.
> 
> Any other ideas or comments?
> 
> Thanks in advance
> 
>     matthias
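
For context, the job described above is nothing more than ISO 8859-1 text with
PCL 5 escape sequences spliced in. The sketch below is purely hypothetical:
the reset Esc E, the Latin-1 symbol set Esc(0N and the bold toggle
Esc(s3B / Esc(s0B are standard PCL 5, but the OCR-B selection is
printer-dependent and only hinted at (the exact sequence is best taken from
the printer's own PCL font list printout).

/* slip.c -- hypothetical sketch of the old-style job: ISO 8859-1 text with
 * PCL 5 escape sequences for bold and OCR-B spliced in. */
#include <stdio.h>

#define ESC "\033"

int main(void)
{
    fputs(ESC "E" ESC "(0N", stdout);            /* reset, ISO 8859-1 symbol set */

    /* Title line, highlighted: stroke weight bold on, then back to medium. */
    fputs(ESC "(s3B" "Buddenbrooks / Thomas Mann" ESC "(s0B" "\r\n", stdout);

    /* Plain descriptive text; the umlaut is a single ISO 8859-1 byte. */
    fputs("Standort: Lesesaal, Geb\xe4ude 3\r\n", stdout);

    /* Shelf number meant for OCR-B: switch to the OCR-B symbol set (1O);
     * the matching typeface selection (Esc(s...T) is printer-specific and
     * omitted here. */
    fputs(ESC "(1O" "4711-0815\r\n", stdout);

    fputs(ESC "E", stdout);                      /* reset again, ejects the page */
    return 0;
}

A larger point size for the highlighted lines works the same way, with the
height attribute Esc(s#V.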

Well, why the conversion to PostScript at all? From your post I guess that
formerly the "text mixed with PCL code" was transferred to the printer(s)
without further manipulation.
Why not convert the UTF-8 text back to ISO 8859-1 for printing?
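
A minimal sketch of that conversion, assuming the job arrives on stdin as one
stream of UTF-8 text with the PCL escapes already embedded (the escape bytes
are plain ASCII and pass through iconv(3) unchanged; the "//TRANSLIT" suffix,
which approximates characters that have no Latin-1 equivalent, is a glibc
extension and an assumption about your libc):

/* to_latin1.c -- hypothetical filter: UTF-8 job with embedded PCL on stdin,
 * ISO 8859-1 bytes on stdout, ready to be sent to the printer as raw data. */
#include <iconv.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Slurp the whole job; the slips are small. */
    size_t cap = 1 << 16, len = 0, n;
    char *in = malloc(cap);
    while ((n = fread(in + len, 1, cap - len, stdin)) > 0) {
        len += n;
        if (len == cap)
            in = realloc(in, cap *= 2);
    }

    iconv_t cd = iconv_open("ISO-8859-1//TRANSLIT", "UTF-8");
    if (cd == (iconv_t)-1) {
        perror("iconv_open");
        return 1;
    }

    /* Transliteration can expand a character (e.g. "œ" -> "oe"), so leave headroom. */
    size_t outcap = 4 * len + 16, outleft = outcap;
    char *out = malloc(outcap);
    char *ip = in, *op = out;
    size_t inleft = len;

    if (iconv(cd, &ip, &inleft, &op, &outleft) == (size_t)-1) {
        perror("iconv");   /* malformed UTF-8 or an unconvertible character */
        return 1;
    }
    fwrite(out, 1, outcap - outleft, stdout);
    iconv_close(cd);
    return 0;
}

Build it with "cc -o to_latin1 to_latin1.c" (add -liconv where iconv lives in a
separate library) and pipe the punched data through it before handing it to the
queue unfiltered, e.g. with "lp -o raw".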

Helge




