This is the mail archive of the cygwin mailing list for the Cygwin project.


RE: can't read sequential files

On 01 November 2007 06:43, zirtik wrote:

> After adding the line:
> 	if (fp==NULL)
> 	{
> 	   printf("error, NULL pointer!\n");
> 	   return(1);
> 	}
> and then rebuilding the code, everything worked. But it's strange that if I
> delete that code segment and never check whether the fp pointer is NULL or
> not, I always get a segmentation fault. Can this be some kind of an
> optimization problem? I don't know why it happens. Thank you for the
> comments.

  A NULL pointer is never valid in C, and a segfault is what you get if you
try to make use of one (by dereferencing it).  You get a NULL pointer back
from fopen when it fails; a lot of library routines do this to indicate
failure, because any other value could be a valid pointer.  The other library
routines, such as fread and fwrite, will assume that you have done your error
checking and won't be passing them a NULL pointer, so they won't bother to
check what file pointer you pass them, they'll just go ahead and try and use
it.  So if you get a NULL pointer back from fopen and you don't check for it,
your code carries on and passes that same pointer to fread, which tries to use
it as if it pointed to a real FILE object, and crashes.

  The comp.lang.c FAQ has an entire section on NULL pointers, section 5.  It
should be available at
but the website seems to be temporarily down right now; there's also a copy at
Can't think of a witty .sigline today....

