
"out of memory" exception 1


kittyyo

Programmer
Oct 17, 2001
Hi all!

I'm programming a very RAM-expensive thing and I've come over the following problem:

Until now, when all available RAM was used up on my test machine, Windows started to swap and the application got very slow. All of a sudden, the swapping didn't start any more when RAM was used up, but I always got the error "out of memory" (without further explanation and without hitting my app's error log).

At first I thought this was a Windows issue because I had played around with the swap file, but after a complete Windows reinstall it did not work either.

Do you think this could be a program-caused error, or have you ever had it yourself?

Thank you for any suggestions,
Anne
 
Are you trapping the exception every time you allocate memory?

try
  new(memptr);
except
  on EOutOfMemory do
    writeln('Out of memory!');
end;

That's really all you can do on the Windows platform, or at least with standard Delphi, as far as I'm aware. My main problem with a memory-intensive app of mine is that Windows doesn't report real memory versus swap space, so when it hits swap space the thing starts running like a dog. So I have to code to restrain the amount of memory involved and swap processing to disk if necessary. More code and more headaches. I ended up making my memory allocations into a function that has 5 separate conditionals before it'll pass on an allocation.

If you have something like what is above, then it's a Windows problem because it's not clearing your swap properly. Or you're trying to grab more memory than the machine has (both real and swap).

But from experience, seriously consider artificially limiting the amount of memory you get.
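To give you an idea of the shape of it, here's an untested sketch - the names and the cap are made up for the example, but it shows the pre-checks plus the try/except around the actual allocation:

program allocguard;
{ untested sketch - the names and the cap are made up for the example }
uses sysutils;

const
  max_app_bytes = 64 * 1024 * 1024;  { arbitrary self-imposed cap }

var
  bytes_in_use: longint;
  p: pointer;

{ pre-checks first, then the real allocation wrapped in try/except }
function try_alloc(size: longint; var buf: pointer): boolean;
begin
  try_alloc := false;
  if size <= 0 then exit;                            { sanity check }
  if bytes_in_use + size > max_app_bytes then exit;  { our own cap }
  try
    getmem(buf, size);
    inc(bytes_in_use, size);
    try_alloc := true;
  except
    on EOutOfMemory do
      try_alloc := false;  { caller decides: page to disk, shrink, or stop }
  end;
end;

begin
  bytes_in_use := 0;
  if try_alloc(1024 * 1024, p) then
    writeln('got 1 MB, ', bytes_in_use, ' bytes in use')
  else
    writeln('allocation refused or failed');
end.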
 
Or, now that I see you made the heap fragmentation thread, perhaps it has to do with that. Usually when you do a heap allocation, most allocators will try to grab the first contiguous spot the memory will fit in (and dynamic structures don't make that any easier). If they can't find such a spot, you get a memory error, no matter whether the free fragments taken together could hold what you are wanting to store.

You could get that as well with a heap leak (meaning you don't clean up/dispose all your structures properly). I wonder if there's a function you can run which will check that correctly (like MemAvail in Turbo Pascal). Usually what I did there was write the value out at the very beginning and the very end of the program and make sure the two numbers were equal. If they were not, I had a heap leak to deal with.
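Actually, thinking about it, the Delphi memory manager keeps two counters in the System unit, AllocMemCount and AllocMemSize, that can stand in for the old MemAvail trick - at least in the Delphi versions I've used; check yours. Untested sketch of the begin/end comparison I described:

program leakcheck;
{ untested sketch of the "compare the numbers at start and end" idea.
  AllocMemCount / AllocMemSize live in the System unit of the Delphi
  versions I've used - check yours. }
var
  start_count, start_size: longint;
  p: pointer;
begin
  start_count := AllocMemCount;
  start_size := AllocMemSize;

  getmem(p, 1000);
  { ... real work would go here ... }
  freemem(p, 1000);
  { comment out the freemem above and the leak shows up in the numbers }

  writeln('blocks not freed: ', AllocMemCount - start_count);
  writeln('bytes not freed : ', AllocMemSize - start_size);
end.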
 
Thanks Glenn.

I do not throw any 'out of memory' exception anywhere in my code.
The really annoying thing is that I did not change the code, but the exception suddenly started to occur. (At least, I do not think that any of my little code changes could have caused this error to happen.)
Maybe you're right that it has to do with the memory fragmentation issue - but, as mentioned, my code did not change - what could have caused the error to suddenly happen?

Anyway, I'm also thinking about how I could 'warn' my program that the RAM is about to run out and then finish my simulation, because if the program crashes I get no usable statistics... :-(

BTW, I do all my memory allocations with object constructors or SetLength... no pointers involved (except for Delphi-internal object references).
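Maybe something like this could give me that 'warning'? I haven't tried it yet - it's just what I found in the Win32 help about GlobalMemoryStatus (Windows unit), and the 10% threshold is a made-up number:

program ramcheck;
{ untested - GlobalMemoryStatus and TMemoryStatus come from the Windows unit }
uses windows;

var
  ms: TMemoryStatus;

function low_on_ram: boolean;
begin
  ms.dwLength := sizeof(ms);
  GlobalMemoryStatus(ms);
  { treat "less than 10% of physical RAM still free" as the warning point }
  low_on_ram := ms.dwAvailPhys < (ms.dwTotalPhys div 10);
end;

begin
  if low_on_ram then
    writeln('RAM nearly used up - finish the simulation and save the statistics')
  else
    writeln('memory load ', ms.dwMemoryLoad, '%, ',
            ms.dwAvailPhys div 1024, ' KB physical RAM free');
end.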
 
From looking at the other thread, it sounds like you need to design your program to page off units of work to disk. From what was said there, you use a LOT of memory. Perhaps unnecessarily, I don't know? But it sounds like you need to code a memory restraint in your program and page your completed work to disk. Much more complex, but you need to take your memory requirements way down from what I see in that other thread. WAY DOWN.
 
More thoughts:
"I do not throw any 'out of memory' exception anywhere in my code."

Because you don't technically allocate anything dynamically, I would guess.

"as mentioned, my code did not change - what could have caused the error to suddenly happen?"

Perhaps the memory availability issue. The machine had more processes loaded that made it "suddenly" happen. Remember that memory availability on a system can and will change.

"Anyway, I'm also thinking about how I could 'warn' my program that the RAM is on the way to be used up and then finish my simulation; because if the program crashes I get no satisfying statistics..."

Like I mentioned, limit the amount of memory you use programmatically and then dynamically allocate all the "simulation data" and page your completed work to disk. It sounds like you are going to need to rewrite your program to dynamically allocate all your memory, page in your processing as needed, and take your results and page them to disk. If a variable is statically defined at compile time, I assume the compiler assumes the memory will be present to run it properly.

Basically this is the way my memory intensive app had to be coded to make it run properly. I didn't know whether I'd have 16MB or 16GB of data to process. So I had to work around that.

If you want more details, post back and I can give you a bit of an idea of what I coded to do this.
 
The machine had more processes loaded that made it "suddenly" happen. Remember that memory availability on a system can and will change.

Hm, I thought I could prevent that by shutting down every process I could on my test machine, so that only the necessary system processes were running and nothing else. But maybe the influences are somewhere else?

It sounds like you are going to need to rewrite your program to dynamically allocate all your memory, page in your processing as needed, and take your results and page them to disk.

I'd never thought about that (paging results to disk...) - more details would be great!

Thanks a lot,
Anne
 
"Hm, I thought I could prevent that by shutting down every process I could on my test machine, so that only the necessary system processes were running and nothing else. But maybe the influences are somewhere else?"

You don't know when and how the system processes will be using memory. The best way to think of Windows memory is that it is going to shift like sand in the sea. If you can get the memory on one execution, don't be surprised if you don't get it on the next. There's no way to prevent it; if you have a memory-intensive process, you have to check what you have each time you run your program. From what I know, it's really hard to process gracefully on a set of statically defined variables if you don't know whether they can be allocated or not.

"I've never been thinking about that (paging results to disk...) - more details would be great!"

The first question to ask is whether you can break your processing into segments, i.e.:

1) Can you process one set of data, write it to disk, and then process the next set without issue?
2) Can you page a completed set of data to disk and then recall it easily if you need to change it?

It's a lot more code to consider, but those are initial thoughts to think about in your design. As far as my program goes, I could fill memory up with one set of data, process it, page it all to disk, and continue on until all my data was processed. Then I could pick up my page files and process them until I had one complete correct output, or process the page files into another page file if I ran out of allowable page files. Whatever it would take to preserve the data and continue the processing.

It was a lot more complex and not an easy thing to design or code, but it was necessary (and a lot of learning went into it too, and will continue to go into it I'm sure, since it's not perfect for what I want). Like I said, I could have 16MB of data coming into my program, which would be fine to put into memory, but I could have 16GB too, which would not work. I don't know in advance, so I have to account for it. So I ended up doing a dynamic allocation of work units.

I'll show a sample of that below. The main thing that might be of issue is the calculation of the allocation units, which probably wouldn't hurt to be placed into KB instead of bytes (you'll get an arithmetic overflow if you tweak the memory limit var up). Console mode program, tested under Delphi 3:

program memtest2;

uses sysutils;

{ Prototype test program: grab as much memory in the system as possible
  according to various rules.

  Released to tek-tips.com as demo. Any use of code is to include a statement
  "copyright 2005 by Glenn9999@tek-tips.com" }

const
  memlist = 4096;      { maximum # of work units }
  alloclist = 65536;   { governs size of each work unit }
  upperlimit = 1024;   { upper limit of memory in MB }

type
  arraytype = array[1..alloclist] of longint;
  arrptr = ^arraytype;

var
  mem_allocated: longint;
  mem_segments: longint;
  mem_full: boolean;
  seg_size: longint;
  i: longint;
  proc_seg: longint;
  memarray: array[1..memlist] of arrptr;

function alloc_memory: boolean;
begin
  { have we hit our maximum number of segments? }
  if mem_segments = memlist then
    begin
      mem_full := true;
      writeln('Maximum segments reached.');
    end
  else
  { have we hit our arbitrary maximum allocated? }
  if mem_allocated >= (upperlimit * 1024 * 1024) then
    begin
      mem_full := true;
      writeln('Maximum amount of allotted memory used.');
    end
  else
  { OK now to try to allocate memory }
    begin
      inc(mem_segments);
      try
        new(memarray[mem_segments]);
        inc(mem_allocated, seg_size);
      except
        on EOutOfMemory do
          begin
            mem_full := true;
            writeln('Out of System Memory.');
            dec(mem_segments);
          end;
      end;
    end;
  alloc_memory := mem_full;
end;

begin
  mem_allocated := 0;
  mem_segments := 0;
  proc_seg := 0;
  mem_full := false;
  seg_size := sizeof(arraytype);

  { initial pass - allocate memory while we can }
  while alloc_memory = false do inc(proc_seg);

  { Now that we know memory stats, we can post those to the user }
  writeln(mem_allocated, ' bytes memory in ', mem_segments, ' segments.');
  writeln('Each segment is ', seg_size, ' bytes.');
  writeln;

  writeln('Disposing of ', mem_segments, ' segments.');
  for i := 1 to mem_segments do
    dispose(memarray[i]);
end.

Of course, after the appropriate amount of memory is allocated, I do my processing with it and then dispose of it afterwards. Of course, too, it can be refined, but hopefully it'll give you a good start. If you give memlist a high enough number and comment out the arbitrary size section, you should see it go until you run out of system memory. But as I've noticed in testing too, paging to the swap file makes it run like an old dog (and you're paging your work to disk ANYWAY), so it's best to try to find an arbitrary value that will stick to real memory on most systems. Unfortunately, there's no way to find out how much real memory there is or whether you're going to the swap file or not. (Also, I'll remove any reference to "segments" in the final version - the main program that uses this memory allocation scheme is in alpha 3.)
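As for the paging-to-disk half of it, the bare-bones shape is just writing a finished work unit out and pulling it back in when you need it again. Untested sketch - the file name is made up, and the real program keeps an index of which unit went to which file:

program pagedemo;
{ untested sketch - the file name and layout are made up for the example }

const
  unitsize = 65536;                       { same size as the work units above }

type
  workunit = array[1..unitsize] of longint;
  workptr = ^workunit;

var
  f: file of workunit;                    { typed file: one record per work unit }
  p: workptr;
  i: longint;

begin
  new(p);
  for i := 1 to unitsize do p^[i] := i;   { pretend this is a finished result }

  { page the completed unit out and free the memory for the next one }
  assign(f, 'unit0001.pag');
  rewrite(f);
  write(f, p^);
  close(f);
  dispose(p);

  { later: pull the unit back in if it needs more processing }
  new(p);
  assign(f, 'unit0001.pag');
  reset(f);
  read(f, p^);
  close(f);
  writeln('last value read back: ', p^[unitsize]);
  dispose(p);
end.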

Hope this helps start you on the way to an idea or two.
 
Thanks a lot Glenn! Good to know that there are other people programming memory-intensive things :)
I'll try to integrate a memory limit then, and post back if further questions arise.
 
Alright, I'm sure someone can try to help answer them if I can't.

Meanwhile, I ran a quick test on the program I posted here with the memory collar removed ("if mem_allocated >= (upperlimit * 1024 * 1024) then") and the number of work units set to a very high number. Net result is that it only stops when the memory maximum is hit.

Actually it illustrates exactly what I was trying to get at: the amount of memory available can vary even between executions, and that's the rough thing to figure out, even when you do catch the exception.

Run #1: 1,919,156,224 bytes
Run #2: 1,918,369,792 bytes
Run #3: 1,918,894,080 bytes
Run #4: 1,919,156,224 bytes

And this is a memory variance between different executions.
 
Hey, I just found something VERY interesting on the net:

"Virtual Address Space",

Which says that if not "4GT RAM Tuning" is enabled, only the lower 2 GB of Virtual Memory are available to the user process.

Seems like an answer to my original problem, the "out of memory" exception!
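For what it's worth, GlobalMemoryStatus (Windows unit) also reports the per-process virtual address space, so a quick check like this (untested) should show roughly 2 GB unless 4GT is enabled - which would line up nicely with the ~1.9 GB your test runs got:

program vaspace;
{ untested - just prints what Windows says about our virtual address space }
uses windows;

var
  ms: TMemoryStatus;

begin
  ms.dwLength := sizeof(ms);
  GlobalMemoryStatus(ms);
  { on standard 32-bit Windows this should come out at roughly 2 GB }
  writeln('total virtual address space: ', ms.dwTotalVirtual div (1024 * 1024), ' MB');
  writeln('still free in this process : ', ms.dwAvailVirtual div (1024 * 1024), ' MB');
end.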
 