
How to deal with stubborn "Dereference to null pointer" errors?


Sincrono (Technical User), Sep 25, 2012
When errors appear at random, say, when identical procedure sequences of a program crash only sometimes, the first thing we doubt is the reliability of the code. But in this case I can't imagine how a simple initialization like the one below can fail only sometimes. The code is compiled in the LabWindows/CVI environment (the original post included a screenshot of the error).

Code:
_buffer = (double*)malloc(elementos*sizeof(double));

for (iii=0; iii< elementos; iii++)
    _buffer[iii]=0.0;

On the first iteration (iii=0), LabWindows shows a "Dereference to null pointer" error message.
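
For reference, here is a minimal stand-alone sketch of how that very first write can crash (not the original program; the huge element count is made up to force the failure): when malloc cannot satisfy the request it returns NULL, and the assignment to _buffer[0] then dereferences that null pointer.
Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Deliberately absurd element count so that malloc fails
       (a hypothetical value, for illustration only). */
    size_t elementos = (size_t)-1 / sizeof(double);
    double *_buffer = (double*)malloc(elementos * sizeof(double));

    printf("_buffer = %p\n", (void*)_buffer);  /* prints a null pointer */

    _buffer[0] = 0.0;  /* dereference of a null pointer: the crash described above */
    return 0;
}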

I've heard that in these cases it's better to switch to another programming language that deals better with dereferences.

Please help!!
 
>I've heard that in these cases it's better to switch to another programming language that deals better with dereferences.
It sounds like, after a crash in the expression x/0.0, you want to switch to a programming language that deals better with divide-by-zero operations.

Better to look at the malloc specification:
C Standard said:
...If the space cannot be allocated, a null pointer is returned. If the size of the space requested is zero, the behavior is implementation-defined: either a null pointer is returned, or the behavior is as if the size were some nonzero value, except that the returned pointer shall not be used to access an object.

Now test whether the value of elementos is too large, negative, or equal to zero. Good practice: always check the pointer value returned by malloc (!= NULL). Regrettably, there is no programming language that deals better with incorrect (too large, for example) memory requests.

A null pointer points to nowhere. It's not a dereferencing error: it's an unsuccessful memory allocation request.
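
As a footnote, a minimal sketch of that size sanity check (MAX_ELEMENTOS is a made-up application-specific limit; elementos is assumed to be an int):
Code:
/* Reject suspicious sizes before calling malloc at all. */
#define MAX_ELEMENTOS (100*1000*1000)   /* hypothetical upper bound */

if (elementos <= 0 || elementos > MAX_ELEMENTOS) {
    printf("*** Suspicious element count: %d\n", elementos);
    return;   /* or handle the error however the program prefers */
}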


 
Very good answer.

So, if elementos is well defined, should I just use something like
[C]if (_buffer==NULL)
MessagePopup ("here I am", "try again"); [/C]

??

In that case, should I just repeat the whole procedure (reloading the file) until the error stops??

I tried a stupid thing before

[C]if( _buffer =(double*)malloc(elementos*sizeof(double)) ==NULL){

MessagePopup ("rr", "rr");
return;
}[/C]
(this is in a [C]void abrir (char rwfile[])[/C] function)

So that is why I'd really appreciate explicit code for the good practice you mentioned.

many thanks again, this could really take many headaches away!!
 
Well, it seems you understand the problem ;)
Your if (condition) is not stupid, it's an example of an incorrect expression (see operator precedence in C: == binds tighter than =, so what gets assigned to _buffer is the result of the comparison, not the pointer returned by malloc). Here it is in C language style (look at the parentheses balance):
Code:
if ((_buffer = malloc(elementos*sizeof(double))) != NULL) { /* no need for cast if you have true C compiler */
    printf("*** Bad allocation, elementos == %d\n",elementos);
    ezit(1); /* or what else... */
}
/* let's continue... */
 
Sorry, it must be == NULL in my previous snippet ;(
 
And exit(1), not ezit... Probably, it's bedtime...

Good luck!
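
Putting the two corrections together, the check would read like this (a minimal sketch, assuming <stdio.h> and <stdlib.h> are included; the message text and exit status are just placeholders):
Code:
if ((_buffer = malloc(elementos*sizeof(double))) == NULL) {  /* assign first, then compare */
    printf("*** Bad allocation, elementos == %d\n", elementos);
    exit(1);  /* or report the failure in whatever way fits the program */
}
/* allocation succeeded: _buffer is safe to use from here on */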
 
Thanks for the code. It worked to check the pointer value returned by malloc, but after that, if I repeat the function, memory never gets allocated. For some reason, when memory for _buffer is not allocated, the same happens to cleanbuffer. I thought that because this only happens sometimes, retrying would solve the issue, but it seems not.

For what reason, other than the requested size (not being 0 nor too large), can malloc fail??? I suspect this may be a complicated issue, but I really need help to solve it. It happens so rarely, and apparently at random, that it is very annoying.

Portion of the relevant code
Code:
void abrir (char rwfile[])
{
	
	int iii;
	short buffer_abierto=0;		

	leidos = 0;	

	if (buffer_abierto)
		free (_buffer);
	
	GetFileSize (rwfile, &rwsize);	
	elementos = rwsize/sizeof(short);			

	buffer = (short*)malloc(elementos*sizeof(short));	
	cleanbuffer =(double*)malloc(elementos*sizeof(double));

	if(    ( _buffer=(double*)malloc(elementos*sizeof(double)) )  == NULL){
		
		printf("*** Bad allocation, elementos = %d\n",elementos);
		abrir(rwfile); //Here the function restarts, hopefully
	}
		
	buffer_abierto = 1;
	
	for (iii=0; iii< elementos; iii++){
		
		buffer[iii]=0;		
		cleanbuffer[iii]=0.0;  
		_buffer[iii]=0.0;  
	}
	
	fclose (bin_rw);
	
	for (iii=0; iii< elementos; iii++){
		
		if (buffer[iii]==NULL){
			if (iii==0)
				buffer[0]=0;
			else
				buffer[iii]=buffer[iii-1];
		}
			
		cleanbuffer [iii] = (double)buffer[iii] * BIT1V * (10000.00/(double)amplif)*f;	
	}
			
	free (buffer); 			
	
	
}

Now I'll try with a while loop, hoping that at some point malloc works fine.

Still unresolved but on its way, thanks to you.

 
Think I got it.

I just have to free up memory (i.e., closing other running software) before calling the same function again.

My program is very uneconomical, so 3 GB of RAM is sometimes not enough. I'll have to do something about that. Not much time right now (I must USE the program, not reprogram it, right now), but at least I think I'm going in the right direction.


Thanks again!![thumbsup]
 
Dangerous code. You assign the pointers to global variables, and you never free (and never check) the cleanbuffer allocation. You compare the short buffer[iii] with the pointer value NULL (why?). The last loop is exactly a buffer and cleanbuffer "assign zero to all elements" operation (think about why, and look at the previous loop: it performs exactly the same work more clearly ;).

Moreover: you call the function recursively but don't free the possibly allocated buffer and cleanbuffer. Obviously, your program eats all accessible memory; it's a classic memory leak (and possible stack overflow) example. A recursive call doesn't stop the previous call: it creates a new stack frame and continues to eat memory...

If your program can't allocate a chunk of dynamic memory, then the chances of a successful allocation without massive free operations (especially inside recursive calls ;) are equal to (short, int, long, float, double ;) zero. Try to inform the user, then exit...
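
For illustration, here is one way the allocation part of abrir could look without the recursion, along the lines described above (a sketch only, assuming the same global pointers buffer, cleanbuffer and _buffer as in the posted code; the printf is a placeholder for real error reporting):
Code:
/* Allocate all three buffers, check each one, and clean up and bail out
   on failure instead of calling abrir() again. */
buffer      = (short*) malloc(elementos * sizeof(short));
cleanbuffer = (double*)malloc(elementos * sizeof(double));
_buffer     = (double*)malloc(elementos * sizeof(double));

if (buffer == NULL || cleanbuffer == NULL || _buffer == NULL) {
    printf("*** Bad allocation, elementos = %d\n", elementos);
    free(buffer);       /* free(NULL) is safe, so no extra checks are needed */
    free(cleanbuffer);
    free(_buffer);
    buffer = NULL; cleanbuffer = NULL; _buffer = NULL;
    return;             /* inform the user and give up instead of recursing */
}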
 
That is why I am a tech user [upsidedown].

Didn't understand everything you said, but I hadn't corrected the code, so it was bound to have problems (for example, that cleanbuffer is never freed). The buffer==NULL comparison is a remnant of a previous attempt at solving the issue.

Thanks, I'll pass this to an expert who can look at it in person and explain the details of what you said to me.

 
If possible, try to pass your problem(s) to a personal expert; don't waste time on forums ;)

BTW, many years ago we solved physical problems involving 1-3 GB of data with 192 KB of computer memory. Key words: proper algorithm selection ;)
 