We are writing a multi threaded application that does a bunch of bit twiddling and writes the binary data to disk. Is it possible to have each thread std::fopen
the same file for writing at the same time? The reasoning would be each thread could do its work and have its own access to the writable file.
During the actual reading and writing, yes. Multiple processes can open the same file at the same time, then write back. It's up to the actual processes to ensure they don't do anything nasty. If you're writing the processes, look into flock (file lock).
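A minimal sketch of that advisory-locking idea, assuming a POSIX system; the path, record layout, and function name are hypothetical, and error handling is elided:

```cpp
// Sketch only: advisory inter-process locking with flock(2), assuming POSIX.
#include <cstddef>
#include <fcntl.h>     // open
#include <sys/file.h>  // flock
#include <unistd.h>    // write, close

void append_record(const char* path, const void* buf, std::size_t len) {
    int fd = open(path, O_WRONLY | O_APPEND | O_CREAT, 0644);
    if (fd < 0) return;
    flock(fd, LOCK_EX);   // block until we hold the exclusive lock
    write(fd, buf, len);  // no other flock holder can interleave with this write
    flock(fd, LOCK_UN);   // release so other processes can proceed
    close(fd);
}
```

Note that flock is advisory and tied to the open file description, so it coordinates separate processes (or separate opens); threads sharing a single descriptor within one process should use an ordinary mutex instead.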
Yes, but why didn't you just try it? (It might actually be OS-dependent.) If you want to work at multiple offsets within the file, mmap() may be an option too, depending on your OS. Then you can just index into the file using memory addresses.
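A minimal sketch of the mmap approach, assuming POSIX; the file name and sizes are hypothetical, errors are elided, and the output size must be known up front so the file can be pre-sized:

```cpp
// Sketch only: writing at independent offsets through a shared mapping.
#include <cstddef>
#include <cstring>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    const std::size_t kFileSize = 4 * 4096;  // assumed total output size
    int fd = open("out.bin", O_RDWR | O_CREAT, 0644);
    ftruncate(fd, kFileSize);                // size the file before mapping it
    char* base = static_cast<char*>(
        mmap(nullptr, kFileSize, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
    // Each thread can now store into its own disjoint slice, e.g. thread i
    // owns bytes [i * chunk, (i + 1) * chunk); no seeking is needed.
    std::memcpy(base, "thread 0 output", 15);
    msync(base, kFileSize, MS_SYNC);         // force modifications out to the file
    munmap(base, kFileSize);
    close(fd);
}
```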
The same file can be opened more than once in the same program (or in different programs). Each instance of the open file has its own file pointer that can be manipulated independently.
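A small sketch of that, using two stdio handles with independent positions; "out.bin" is a hypothetical pre-existing file and error handling is elided:

```cpp
// Sketch only: two FILE* handles on one file, each with its own position.
#include <cstdio>

int main() {
    std::FILE* a = std::fopen("out.bin", "r+b");
    std::FILE* b = std::fopen("out.bin", "r+b");
    std::fseek(a, 0, SEEK_SET);    // handle a works at the start of the file
    std::fseek(b, 512, SEEK_SET);  // handle b works at byte 512, independently
    std::fputs("first", a);
    std::fputs("second", b);
    std::fclose(a);                // each handle flushes its own buffer
    std::fclose(b);
}
```

Because each handle buffers independently, the writers should stick to disjoint regions of the file and flush before anything is read back through the other handle.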
std::fstream has functionality defined in terms of the C stdio library. I would be surprised if it were actually specified, but the most likely behavior from opening the same file twice is multiple internal buffers bound to the same file descriptor.
The usual way to simultaneously write to multiple points in the same file is POSIX pwrite or writev. This functionality is not wrapped by C stdio, and by extension not by C++ iostreams either. But having multiple descriptors to the same filesystem file might work too.
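A minimal sketch of the pwrite approach, assuming POSIX; the file name and chunk size are hypothetical and error handling is elided. Since pwrite takes an explicit offset, the threads can share one descriptor without racing on the file position:

```cpp
// Sketch only: threads writing disjoint regions via pwrite(2) on one fd.
#include <cstddef>
#include <fcntl.h>
#include <unistd.h>
#include <string>
#include <thread>
#include <vector>

int main() {
    constexpr std::size_t kChunk = 4096;  // per-thread region, chosen arbitrarily
    int fd = open("out.bin", O_WRONLY | O_CREAT, 0644);
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) {
        workers.emplace_back([fd, i] {
            std::string data(kChunk, static_cast<char>('A' + i));
            // pwrite takes an explicit offset and never touches the shared
            // file position, so the threads need no lock around this call.
            pwrite(fd, data.data(), data.size(),
                   static_cast<off_t>(i) * static_cast<off_t>(kChunk));
        });
    }
    for (auto& t : workers) t.join();
    close(fd);
}
```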
Edit: POSIX open called twice on the same file in Mac OS X produces different file descriptors. So it might work on your platform, but it's probably not portable.
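A quick check along those lines, assuming POSIX; the file name is hypothetical and error handling is elided. Each open() call yields a separate descriptor with its own file offset:

```cpp
// Sketch only: two open(2) calls on one path give independent descriptors.
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>

int main() {
    int fd1 = open("out.bin", O_WRONLY | O_CREAT, 0644);
    int fd2 = open("out.bin", O_WRONLY);
    std::printf("fd1=%d fd2=%d\n", fd1, fd2);  // two different descriptors
    lseek(fd2, 1024, SEEK_SET);  // moving fd2's offset leaves fd1's offset at 0
    write(fd1, "at start", 8);
    write(fd2, "at 1024", 7);
    close(fd1);
    close(fd2);
}
```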
A definitive answer would require connecting these dots:
- fstream works like a C (stdio) stream.
- Whether the same file may be fopened twice at once appears to be unspecified (fopen is only defined to associate a stream with a newly-opened file).
This is a bit more research than I'm up for at the moment, but I'm sure someone out there has done the legwork.
I've written some high-speed multi-threaded data capture utilities, but the output went to separate files on separate hard drives, which were then post-processed.