I’m streaming long runs of real-time data (2,000,000 bytes per second) from an XEM7001 and reading it into a PC with a C# program. Until now I’ve kept the total data size under 2 GB, but I need to expand beyond that, probably up to 8 GB. I’ve found it easiest and safest to buffer all the data in RAM on the PC and then write it out, partly because I need to rearrange the data before saving it.

I find that buffers bigger than 2 GB are difficult to allocate in C#, even in x64 mode, so I break the buffer into 2 GB pieces, up to 4 buffers. The data is acting like it is not contiguous: after some point it turns seemingly random. The garbage collector also doesn’t seem to let me reuse the memory if I do several runs in a row (without exiting the program). I could very well have a programming error, but I haven’t found it yet.

It would be nice to use unmanaged memory, but I haven’t been able to figure out how to cast a pointer to that memory to the byte[] parameter expected by ReadFromBlockPipeOutX.
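For reference, the chunking scheme I’m describing looks roughly like this (a simplified sketch, not my exact code; the ChunkedBuffer name and Write helper are just illustrative, and I’ve used 1 GB chunks here since a single byte[] can’t actually reach a full 2 GB):

```csharp
using System;

// Sketch of the chunked-buffer idea. Each chunk stays under the 2 GB
// single-object limit, and all offset arithmetic is done in long to avoid
// the 32-bit overflow that could make data look "random" past some point.
class ChunkedBuffer
{
    const int ChunkSize = 1 << 30;   // 1 GB per chunk, safely under the byte[] limit
    readonly byte[][] chunks;
    public long Length { get; }

    public ChunkedBuffer(long totalBytes)
    {
        Length = totalBytes;
        int count = (int)((totalBytes + ChunkSize - 1) / ChunkSize);
        chunks = new byte[count][];
        for (int i = 0; i < count; i++)
        {
            long remaining = totalBytes - (long)i * ChunkSize;
            chunks[i] = new byte[(int)Math.Min(ChunkSize, remaining)];
        }
    }

    // Copy one block read (e.g. the byte[] filled by ReadFromBlockPipeOutX)
    // into the store, splitting across chunk boundaries when necessary.
    public void Write(long offset, byte[] src, int count)
    {
        int copied = 0;
        while (copied < count)
        {
            long pos = offset + copied;          // long math throughout
            int chunk = (int)(pos / ChunkSize);
            int within = (int)(pos % ChunkSize);
            int n = Math.Min(count - copied, ChunkSize - within);
            Buffer.BlockCopy(src, copied, chunks[chunk], within, n);
            copied += n;
        }
    }
}
```

Each pipe read goes into one small reusable byte[] staging buffer that gets passed to the read call every time, then copied into the big store, so the only large allocations are the chunks themselves.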
So, has anyone dealt with large streaming data transfers like this in C#? I’m open to suggestions.