Hi all,<br><br>I have a problem with a batch program I'm running. The program calculates results on data stored in several tables. As long as I run the batch program on a small amount of data it works fine. However, when I run it on a large amount of data (meaning the same calculations are repeated many more times), the program becomes slower and slower (I can check the time each loop takes) and finally it blocks, sometimes with the message "Out of memory". I therefore have the impression that my program is "eating" memory. At first I thought it was because of recordsets that I opened and didn't close again, but I made sure all recordsets are closed before the loop restarts, and still the memory problem persists. Does anyone have an idea what else could cause this? I'm using Access 95 (yes, I know it's getting quite old, but switching to a newer version is not (yet) an option, and anyway I would really like to know the cause of this problem, since I suppose it's a fundamental programming error I'm making (as not closing opened recordsets also was, but that one I've learned now thanks to the people on the Tek forum ;-)). My computer has a Pentium III processor with 192 MB of RAM.<br><br>You would really help me a lot if you could give me some tips, so thanks a lot in advance!<br><br>Greetings,<br><br>Dirk<br><br><A HREF="mailto:dirk.news@yucom.be">dirk.news@yucom.be</A><br><br>PS: While the loops are running, three recordsets are constantly open, because the data they contain are needed to guide the batch process. I don't know how that influences memory performance, but I guess they can't be causing the slowdown, because they don't change in size or contents; I only use them to read data from.<br><br>
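To make the question concrete, here is a simplified sketch of the loop pattern described above, in Access 95 VBA with DAO (the table name, field filter, and loop bound are placeholders, not the real ones). One detail worth checking in a case like this: besides calling `.Close`, the recordset object variable should also be set to `Nothing` each pass, so DAO can actually release the object's memory before the next iteration.

```vba
' Simplified sketch of the batch loop (placeholder names: tblData, BatchID).
' Each pass opens a recordset, uses it, closes it, AND releases the
' object variable -- closing alone may not free the DAO object's memory.
Sub RunBatch()
    Dim db As Database
    Dim rs As Recordset
    Dim i As Long

    Set db = CurrentDb()
    For i = 1 To 10000                      ' many passes over the data
        Set rs = db.OpenRecordset("SELECT * FROM tblData WHERE BatchID = " & i)
        ' ... perform the calculations with rs here ...
        rs.Close                            ' close the recordset ...
        Set rs = Nothing                    ' ... and release the object
    Next i
    Set db = Nothing
End Sub
```

If the real code closes the recordsets but never sets the variables to `Nothing`, that is one plausible place the memory could be going.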