Tek-Tips is the largest IT community on the Internet today!



Memory problem during iterative data output

Status
Not open for further replies.

kostaskon

Technical User
Mar 28, 2011
10
GR
Hi all!

My program is built around three nested loops: for each distance value the program runs over, e.g., 10 energy values, and for each energy value over, e.g., 10 phase values. Using CALL system('mkdir ...') I organize the output into folders accordingly.

Thus, for example the output for the 2nd value of the distance, the 1st value of the energy and the 4th value of the phase would be found at .../2nd_distance/1st_energy/4th_phase.

Pseudocode for the above:
Code:
DO distance = 1.,10.,1.
    DO energy= 1.,10.,1.
        DO phase= 1.,10.,1.
            open file output_dis_en_ph
            ... !Calculate data to be written in output_dis_en_ph
            write output_dis_en_ph
            close output_dis_en_ph
        END
    END
END
When I do test runs (short run times) everything works fine and the folder structure is created as described above. When I attempt the full run (overnight) I return to find the run stopped with error 'severe (43)'. The only folder created is the first of the structure above, and the combined size of the .dat output files within it is about 2.5 GB.

As this matches my RAM size, I suspect the program keeps all data in RAM, crashes when it fills up, and 'dumps' everything it has accumulated so far into a single folder. I don't think this should happen, since I open and close all output files properly within the innermost loop, and the test runs work fine. Any ideas what might cause the problem?

 
The indices in your loops are reals and MUST be integers. The reason is simple: you can't execute a command for the 2.3th time...

Try changing that first. I also advise you to do trial runs with reduced loop ranges before walking away for an overnight run.
 
Thanks for the reply!

Ok, it turns out the data were dumped into only one output folder because of a wrong IF statement ([blush]). I also changed my indices to integers.

The output is now correct, but once the combined size of the output files reaches ~2.5 GB, the size of my RAM, the program still crashes with error "severe (43): file name specification error", which I gather can also occur when memory is insufficient. Since I close each file properly on every iteration, shouldn't each file be written to disk instead of being kept in RAM?
 
Yes, it should. I have been told that this only happens on Windows, but I can't confirm that. Anyway, you can work around it with
Code:
CALL FLUSH(unitnr)
within the loop
 
I don't think a FLUSH statement will solve your trouble. Could you show us the actual OPEN and CLOSE statements, please? I suspect that you forget to close the files properly. I also suggest adding a WRITE statement (on unit *) just before the CLOSE statement to verify that each file really is closed.

If too many files are open simultaneously, you can hit an operating-system limit and, as a consequence, crash the code.

François Jacq
 
I put a write statement just before the closing of each file and it showed that they closed ok in the test runs.

The exact OPEN - CLOSE statements are:
Code:
      DO 10 n10=0,360,30 ! the innermost loop
         OPEN (UNIT=11,FILE=outfile_coords)
         OPEN (UNIT=16,FILE=outfile_invars)
C outfile_coords and outfile_invars are the output paths that change
C iteratively according to distance, energy, phase
         ... !calculations
         CLOSE(UNIT=11)
         CLOSE(UNIT=16)
10    CONTINUE !End of the innermost loop

 
It seems the program is exhausting its memory resources somewhere other than the open/close statements. Look at the code located between the OPEN and CLOSE statements in the inner loop: array allocation/deallocation and other memory-consuming operations.

The files are only a symptom; they just mark the path to the crash...
 
I read here that
Code:
write (6,*)
(which I use a lot in my code) causes some sort of memory leak that
Code:
print *,
does not. Would that make any sense at all, or am I too desperate for an easy solution :)?

Does anyone know a good way to check memory usage and allocation in Fortran?

Thanks a lot for the answers!
 
It must be an error in the filename itself, possibly something in the string that generates the filename from your index counters. Error 43 seems to mean that your filename was not accepted.

You may be opening with STATUS='NEW'; try STATUS='REPLACE' instead. Also, after a long run you may find some indices missing because files have been overwritten.

Try printing only the generated file and directory names to a text file.

Anyway, with only pseudo code, it's hard to help you.
 
Try commenting out (temporarily) all the file I/O in the program, then run it again. I suspect the program will hit the same memory overflow error...
 
Information found on the internet about your error message number (43):

severe (43): File name specification error
A pathname or file name given to an OPEN or INQUIRE statement was not acceptable. The problem may also be related to lack of disk space.

So I suggest you print out all the names of the files before the open statement :

Code:
   write(6,*) trim(outfile_coords)
   write(6,*) trim(outfile_invars)
   flush (6) ! this is not a subroutine but a statement
   OPEN (UNIT=11,FILE=outfile_coords)
   OPEN (UNIT=16,FILE=outfile_invars)
   ...

The "flush(6)" is useful to be sure all the file names get printed before any crash (unit 6 has a buffer which must be flushed frequently, else one can lose pieces of information).

Check also your disk space.


François Jacq
 
For the enlightenment of future generations:

What I finally did (and apparently should have done from the beginning), instead of looping inside Fortran, was to write a batch file that performs the above loops. Since the program executes once per iteration and exits before the next execution begins, the memory problem was eliminated. Overall this solution also seems more "elegant".

The batch file goes something like:
Code:
::distance, energy and phase loops respectively
FOR /L %%d IN (7,1,8) DO (
  FOR /L %%e IN (1000,100,1300) DO (
    FOR /L %%p IN (0,60,360) DO (
      REM write d,e,p in a file for reference
      @ECHO %%d >>bat_input.dat
      @ECHO %%e >>bat_input.dat
      @ECHO %%p >>bat_input.dat
      START /WAIT myprogram.exe)))

Thanks to everyone for their help!
 