Why text=binary mounts
Wed Jan 14 03:39:00 GMT 1998
>I guess the point I was trying to make is that it doesn't seem to me that
>there is a good argument for there to be text processing functionality in
>the fopen() family of functions (I know, it's a little late now!). The
>difference I see between 'modes' and 'formats' is this: we don't have a
>JPG mode or a WAV mode in fopen(), so why do we have a text mode? When
>somebody wants to open up and manipulate a JPEG file, they use a JPEG
>library that gives them access to methods that are meaningful only on JPEG
>files. I see text files in the same way. If you want to read a line of
>text, it seems to me that the most logical thing to do would be to use a
>library which gave you access to functions such as fscanf() etc. which have
>no meaning for generic (binary) files. This library then would be the
>place to do things like making all text files look the same to the
>programmer whether they're DOS/UNIX/Mac/whatever, in the same way that a
>PCX library might 'gloss over' the differences between the different PCX
>versions.
Good point. It's also important to remember that not all text is ASCII or
ANSI; there's EBCDIC and a whole bunch of others too. Maybe a decent
text library could even handle Unicode files as well (I know little about
Unicode, so don't flame me please). Personally, when I open a
file, I expect to get what's there. That *should* be the default. A file is
just a bunch of bytes and that's the way it should be treated. If you want
some kind of filter or interpretation, get a library.
A well-written text processing program should recognise any combination of
<cr> and <lf> as an end-of-line marker, and should write either the
operating system default (but the OS should have no concept of "text"
files), or an ANSI standard (if there is such a beast), or maybe even a
format selected by the user.
Even better would be if your program could register a callback function
with the text processing library, allowing complete control. For example, say
I define a text file format in which each line is a field of 81 characters:
the first byte gives the length of text on the line, and each subsequent
character is represented by twice its alphabet position, plus one if uppercase
(a=2, A=3, b=4, etc.). How does fopen(fname, "rt") handle this? It is
a text file. It doesn't use ANSI characters, but it could, and it still
wouldn't be handled correctly. So how is this "text mode"? It's not; it's
"let's kludge the end-of-line" mode. Text mode should imply that there's no
post-processing to be done on the input: you open the file with the proper
format filter and treat it as text from there on in.
Other advantages: opening binary files with a hex-mode filter, or even
executables with a disassembly/assembly filter, and, of course, using
whichever editor you prefer as long as it's compiled with the text library.
Of course, it could be done by patching fopen(), but the behaviour of that is
already standardised and it would be a kludge. What's needed is a properly
designed, separate text library.