FractalWalk
Technical User
This is a general question, not about specific code. I have multiple procedures that open Excel workbooks, extract data into arrays, and then print those arrays to text files. My problem is that after a few runs of doing this with bigger files, I end up with an "Out of memory" error. It isn't tied to any specific file or file size. Sometimes it happens on a very small file, but only after working with numerous larger files, so it certainly looks like a cumulative memory leak.
The arrays should not be the problem: I use a ReDim statement, Erase them, and the End Sub should clear that memory (right?). I also set all my objects to Nothing when done. So I suspect the opening of the Excel files is the culprit. It seems that if I open and close a file from a sub in VBA, the memory for that file is not released when it closes; it takes quitting the application to clear it. The research I have done on the web seems to confirm this, but I couldn't find anything that resolves it.
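To make the pattern concrete, here is roughly what my cleanup looks like (the procedure and variable names here are just illustrative, not my real code):

```vba
' Illustrative extract-and-cleanup pattern (names are made up for this example)
Sub ExtractData()
    Dim arr() As Variant
    Dim ws As Worksheet

    Set ws = ActiveWorkbook.Worksheets(1)
    arr = ws.UsedRange.Value      ' pull the sheet into an array in one shot

    ' ... write arr out to a text file ...

    Erase arr                     ' release the array's memory
    Set ws = Nothing              ' release the object reference
End Sub                           ' locals go out of scope here too
```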
So my question is this: I was using Workbooks.Open and Workbook.Close, but if I instead use an object variable, Set wb = Workbooks.Open(fnme), and then Set wb = Nothing when I am done, does that help with my memory issue, or should it make no difference? Also, does opening read-only help, i.e. Set wb = Workbooks.Open(fnme, True, True), where the third argument is ReadOnly? I have tried that in a test script and haven't hit an "Out of Memory" error yet, but before I rewrite all my macros, I want to make sure there is real value in the change.
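This is the shape of what I am testing, with named arguments instead of the positional True, True (again, fnme and ProcessFile are just placeholder names):

```vba
' Illustrative open/close pattern I am testing (fnme holds the full path)
Sub ProcessFile(fnme As String)
    Dim wb As Workbook

    ' Named arguments: the positional form would be Workbooks.Open(fnme, True, True),
    ' where the second argument is UpdateLinks and the third is ReadOnly
    Set wb = Workbooks.Open(Filename:=fnme, ReadOnly:=True)

    ' ... extract the data ...

    wb.Close SaveChanges:=False   ' close without saving changes
    Set wb = Nothing              ' drop the object reference when done
End Sub
```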
Any other suggestions regarding opening Excel files from within VBA that might help reduce memory overhead?