Exchanging large data between processes.

Amol

Hi,

I'm writing an application that requires exchanging large data between
processes (on the order of 300 KB). I was trying to use shared
memory for this, but I realised that I could not set up shared memory
for more than a few tens of bytes (not even 100) on the system I was
running.

perror("shmget") prints:
shmget: Invalid argument
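(For reference, a minimal sketch of the call in question: creating, attaching, and removing a System V segment of roughly the size mentioned above, with error checks. The function name is made up for illustration; one common cause of EINVAL at that size is a system SHMMAX limit smaller than the requested segment.)

```c
#include <stdio.h>
#include <string.h>
#include <sys/ipc.h>
#include <sys/shm.h>

/* try_shm: create, attach, touch, and remove a segment of `size` bytes.
   Returns 0 on success, -1 on failure (after printing the errno reason). */
static int try_shm(size_t size)
{
    /* IPC_PRIVATE sidesteps key collisions; fine for related processes. */
    int shmid = shmget(IPC_PRIVATE, size, IPC_CREAT | 0600);
    if (shmid == -1) {
        perror("shmget");          /* EINVAL: size outside [SHMMIN, SHMMAX] */
        return -1;
    }

    char *buf = shmat(shmid, NULL, 0);
    if (buf == (void *)-1) {
        perror("shmat");
        shmctl(shmid, IPC_RMID, NULL);
        return -1;
    }

    memset(buf, 0, size);          /* the segment is ordinary memory now */

    shmdt(buf);
    shmctl(shmid, IPC_RMID, NULL); /* mark the segment for removal */
    return 0;
}
```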

Is it common to use shared memory to exchange large data
between processes? If not then what is the preferred way?
Would using sockets be efficient enough? (Oh yes, and I need the
most efficient way to exchange this data)

Thanks,

~ amol
 
Tristan Miller

Greetings.

> Is it common to use shared memory to exchange large data
> between processes? If not then what is the preferred way?
> Would using sockets be efficient enough? (Oh yes, and I need the
> most efficient way to exchange this data)

Sockets, pipes, and other specialized means of interprocess communication
are not covered by the C standard. Since comp.lang.c deals only with
standard C, you're unlikely to get any helpful replies here. Try posting
your query on comp.unix.programmer or some other newsgroup targeted to
your particular compiler and/or operating system.
 
Valentin Tihomirov

You'd better ask in a Windows (or whatever) newsgroup. I would do this with
a shared variable (shared memory), because with pipes, sockets, or similar
communication you in fact make a copy of the large variable.
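(For comparison, a sketch of the copy-based route being described: a parent and child exchanging ~300 KB over a socketpair, POSIX-specific and with a hypothetical function name. Every byte is copied through kernel buffers on the way across.)

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <sys/wait.h>

/* roundtrip_bytes: child writes n bytes into a socketpair, parent reads
   them all back. Returns 0 if the full n bytes arrived, -1 otherwise. */
static int roundtrip_bytes(size_t n)
{
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) == -1) {
        perror("socketpair");
        return -1;
    }

    pid_t pid = fork();
    if (pid == -1) {
        perror("fork");
        return -1;
    }

    if (pid == 0) {                 /* child: writer */
        close(fds[0]);
        char chunk[4096];
        memset(chunk, 'x', sizeof chunk);
        size_t left = n;
        while (left > 0) {
            size_t len = left < sizeof chunk ? left : sizeof chunk;
            ssize_t w = write(fds[1], chunk, len);
            if (w <= 0)
                _exit(1);
            left -= (size_t)w;
        }
        close(fds[1]);
        _exit(0);
    }

    close(fds[1]);                  /* parent: reader */
    char buf[4096];
    size_t got = 0;
    ssize_t r;
    while ((r = read(fds[0], buf, sizeof buf)) > 0)
        got += (size_t)r;
    close(fds[0]);

    int status;
    waitpid(pid, &status, 0);
    return (got == n) ? 0 : -1;
}
```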
 
