Shared memory

Thomas Bjork (
Wed, 31 Jul 1996 23:07:40 +0100 (GMT+0100)


I'm working on a project where a process should maintain a huge memory
array (about 30-40 MB).

Other processes are communicating with this process via connect/bind/accept.
Each time a connection is made, the server process is forked. But all my
server processes have to have access to the same memory array.

How do I do this?

shmget can only take 16 MB segments -- why? Can I change the #define in
I cannot use fork alone, because each child gets its own copy of the
array... I have read about the clone call; can I use it?

Please help...

Thomas Bjoerk