I’ve had a problem in the past using PipeOut when there is a chance that the word being sent is all zeros. An all-zero word would terminate my pipe transfer, so I’ve been working around it by padding each byte I send with extra ones. I am currently using the XEM6010 and LabVIEW, and I’d like to know whether I might be doing something wrong, whether this is an issue that has since been fixed, or whether this is simply how the transfer is supposed to work. I would prefer not to use the padding, but I don’t want to remove it if doing so might cause my PipeOut to behave strangely. Thanks.
Why would sending zeros terminate a PipeOut?
I think you may be saying that your LabVIEW code is treating the data from the pipe as a zero-terminated string rather than as a simple data array, and that to get around this behavior in LabVIEW you’re avoiding sending zeros through the pipe.
The better solution would be to use a different data type in LabVIEW rather than a zero-terminated string. I don’t know LabVIEW, though, so maybe one of the LV experts can chime in?
Thanks, that does give me something to think about. The BTPipeOut VI (which essentially makes a call to a DLL function) does return a string, which I then convert to a byte array to obtain the data. I will try changing the type in LabVIEW to see if that fixes it.
The problem was in LabVIEW. The Call Library Function Node parameter was configured to return a string. We switched it to an array of U8, and it worked as expected.