Re: Memory overcommitting (was Re: http://www.redhat.com/redhat/)

Herve R.-P. (regad@micronet.fr)
Sun, 2 Mar 1997 17:42:12 +0100


No missile intended :-)

Could you explain what algorithm/method fails in an optimistic
allocation scheme like the one all COW UNIXes, including Linux, use?

In fact, I do know of one, invented by a friend (and probably by lots
of other people around the world). It happened in a program that went
faster when it could eat lots of memory.

So he wrote something like while((ch[i++]=malloc(sizeof(chunk)))); to
grab *all* the available memory on the machine ... and got caught by
this allocation scheme ... as did the unlucky X user on the
machine (no, he didn't do that on his own machine :-).
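
For the curious, here is a minimal sketch of that kind of loop
(hypothetical names and sizes, not my friend's actual code). The point
is that under an optimistic allocator malloc() keeps saying yes; the
trouble only shows up when the pages are actually touched:

    #include <stdlib.h>
    #include <string.h>

    #define CHUNK     (1 << 20)   /* 1 MB per allocation, arbitrary */
    #define MAXCHUNKS 65536       /* bound on the pointer table     */

    int main(void)
    {
        static char *ch[MAXCHUNKS];
        int i = 0;

        /* malloc() rarely returns NULL here under overcommit ...   */
        while (i < MAXCHUNKS && (ch[i] = malloc(CHUNK)) != NULL) {
            /* ... it is touching the pages that really commits
               them, and that is where somebody gets hurt.           */
            memset(ch[i], 0, CHUNK);
            i++;
        }
        return 0;
    }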

Of course, this was a "brute force and massive ignorance" method as
the job can be done via /proc/meminfo with little work.
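
Something along these lines, say (a rough sketch; the exact layout of
/proc/meminfo has varied between kernel versions, this one assumes a
"MemFree: NNN kB" line is present):

    #include <stdio.h>
    #include <string.h>

    /* Return free memory in kB as reported by /proc/meminfo,
       or -1 if it cannot be read.                                   */
    long free_kb(void)
    {
        FILE *f = fopen("/proc/meminfo", "r");
        char line[256];
        long kb = -1;

        if (!f)
            return -1;
        while (fgets(line, sizeof line, f)) {
            if (strncmp(line, "MemFree:", 8) == 0) {
                sscanf(line + 8, "%ld", &kb);
                break;
            }
        }
        fclose(f);
        return kb;
    }

    int main(void)
    {
        printf("free: %ld kB\n", free_kb());
        return 0;
    }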

But other than that, I don't see one.

Herve Regad-Pellagru

zen: vt. To figure out something by meditation or by a sudden flash of
enlightenment. Originally applied to bugs, but occasionally applied
to problems of life in general.

>>>>> "John" == John Wyszynski <wyszynsk@clark.net> writes:

John> Thanks to all who have lobbed missiles at me, especially
John> those who believe that they know all that can be known. I
John> simply cannot respond to them all. If this method of
John> allocating memory is indeed as widespread as some have
John> claimed, it hasn't been going on as long as some of you
John> "experts" claim. It is clear that some people have
John> different design "goals" than others. This does not mean
John> that yours is the right answer for everyone else.

John> It may be the explanation why in the last few years I have
John> seen so many programs die for no cause in the middle of the
John> day. (On non-Linux systems so far.) In an operational
John> environment, such havoc is not appreciated.