The code below (which I assume is C++) waits for a coherent set of frames.
rs2::frameset frames = pipe.wait_for_frames();
rs2::frame frame = frames.first(RS2_STREAM_DEPTH);
const void* depth_data = frame.get_data(); // Pointer to depth pixels;
// invalidated once the last copy of the frame goes out of scope
Thank you. The code you provided seems to behave in the same way as
frame = pipeline.wait_for_frames()
does; I still get the same offset.
I currently grab a single frame from a different camera (not an Intel product) and then
run the code you provided. The frame I get from my other camera lags behind the
D415 frame set by around 100 ms. I would have assumed that to be impossible if the
code waits for a new set of frames?
It sounds like your project would benefit from hardware sync, where a 400 Series camera can have its capture timing synced with a non-Intel sensor attached to the camera via a cable. The 400 Series cameras have GPIO pins that can be used for this. Intel recently published a white paper about creating a multi-camera setup.
The paper says: "Multiple cameras can be connected to a PC and will be able to stream independent data. The cameras operate in the “Default” mode and will stream asynchronously. However, if it is desired to hardware synchronize (e.g. HW sync) them so they capture at exactly the same time and rate, the cameras will need to be connected via sync cables, and will need to be configured in software to have a single master (or an externally generated sync signal) and multiple slaves. The connector port can be found on the cameras as shown below, and a cable will need to be assembled"
Here is the link to the white paper:
Intel is still working on adding hardware sync support, with a release estimate for this feature of around Q2 2018 (sometime between now and June 2018).
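Once that support ships, configuring master/slave roles will presumably happen through a sensor option. The sketch below is purely illustrative: it assumes a later SDK release exposing an option named RS2_OPTION_INTER_CAM_SYNC_MODE with values 0 = default, 1 = master, 2 = slave, which is not available in today's release.

```cpp
#include <librealsense2/rs.hpp>

// Illustrative only: assumes a future librealsense release that exposes
// RS2_OPTION_INTER_CAM_SYNC_MODE (0 = default, 1 = master, 2 = slave).
void set_sync_mode(rs2::device& dev, float mode)
{
    auto depth_sensor = dev.first<rs2::depth_sensor>();
    if (depth_sensor.supports(RS2_OPTION_INTER_CAM_SYNC_MODE))
        depth_sensor.set_option(RS2_OPTION_INTER_CAM_SYNC_MODE, mode);
}

int main()
{
    rs2::context ctx;
    auto devices = ctx.query_devices();

    // First connected camera acts as master, the rest as slaves.
    for (size_t i = 0; i < devices.size(); ++i)
    {
        rs2::device dev = devices[i];
        set_sync_mode(dev, i == 0 ? 1.f : 2.f);
    }
    return 0;
}
```

Note that the software configuration alone is not enough; per the white paper, the cameras would also need to be physically connected with sync cables through the connector port.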