Sending Video Frames
Because you perform all video compression yourself, you should provide both a full bandwidth and a preview bandwidth video stream. The full bandwidth stream may be any resolution and framerate; the preview bandwidth stream should be 640 pixels wide with square pixels (e.g., 640x360 pixels at a 16:9 image aspect ratio).
You will then need to submit these two frames separately to the SDK. Creating the frames follows the same pattern as the audio example in the previous section. The following is an example of just one of the two required streams:
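The snippet below is a minimal sketch of this step. The structure and constant names it uses (NDIlib_compressed_packet_t, NDIlib_compressed_FourCC_type_H264, the keyframe flag, and the header name) are assumptions taken from the NDI Advanced SDK headers and should be verified against the SDK version you build with; the EncodedFrame type is a placeholder for your own encoder's output.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

#include <Processing.NDI.Advanced.h> // NDI Advanced SDK header; check the exact name/path for your SDK

// Placeholder for one frame produced by your own H.264 encoder.
struct EncodedFrame {
    std::vector<uint8_t> data; // Annex-B bitstream for a single frame
    int64_t pts;               // Presentation timestamp
    int64_t dts;               // Decode timestamp
    bool is_keyframe;          // True for IDR frames
};

// Prefix the encoded frame with a compressed packet header so it can be
// handed to the SDK as one contiguous buffer.
std::vector<uint8_t> build_compressed_packet(const EncodedFrame& frame)
{
    std::vector<uint8_t> buffer(sizeof(NDIlib_compressed_packet_t) + frame.data.size());

    auto* packet = reinterpret_cast<NDIlib_compressed_packet_t*>(buffer.data());
    packet->version = sizeof(NDIlib_compressed_packet_t);  // use the version constant from the header if one is defined
    packet->fourCC = NDIlib_compressed_FourCC_type_H264;   // assumed constant; H.265 uses the HEVC equivalent
    packet->pts = frame.pts;
    packet->dts = frame.dts;
    packet->flags = frame.is_keyframe ? NDIlib_compressed_packet_t::flags_keyframe  // assumed flag names
                                      : NDIlib_compressed_packet_t::flags_none;
    packet->data_size = (uint32_t)frame.data.size();
    packet->extra_data_size = 0; // no out-of-band SPS/PPS in this sketch

    // The bitstream follows the header in the same buffer.
    std::memcpy(buffer.data() + sizeof(NDIlib_compressed_packet_t),
                frame.data.data(), frame.data.size());
    return buffer;
}
```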
Having filled in the compressed data structures, fill in a video header as shown in the following example.
It is crucial that the value of the FourCC specifies whether this is the full or the preview resolution stream.
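For illustration, here is a sketch of the video header for one of the two streams. The ex-FourCC names (NDIlib_FourCC_video_type_ex_H264_highest_bandwidth and ..._lowest_bandwidth) and the cast are assumptions based on the Advanced SDK headers; confirm them, along with the example resolutions and framerate, for your own source.

```cpp
#include <cstdint>
#include <vector>

#include <Processing.NDI.Advanced.h> // NDI Advanced SDK header; check the exact name/path for your SDK

// Wrap an already-built compressed packet (see the previous sketch) in a video
// header and hand it to the sender. The FourCC selects full vs. preview stream.
void send_hx_video(NDIlib_send_instance_t sender,
                   std::vector<uint8_t>& compressed_packet,
                   bool preview_stream)
{
    NDIlib_video_frame_v2_t frame = {};

    // Example resolutions: 1920x1080 full stream, 640x360 preview stream.
    frame.xres = preview_stream ? 640 : 1920;
    frame.yres = preview_stream ? 360 : 1080;
    frame.frame_rate_N = 60000;
    frame.frame_rate_D = 1001;
    frame.frame_format_type = NDIlib_frame_format_type_progressive;
    frame.timecode = NDIlib_send_timecode_synthesize;

    // The FourCC is what tells the SDK whether this is the full or preview stream.
    frame.FourCC = preview_stream
        ? (NDIlib_FourCC_video_type_e)NDIlib_FourCC_video_type_ex_H264_lowest_bandwidth
        : (NDIlib_FourCC_video_type_e)NDIlib_FourCC_video_type_ex_H264_highest_bandwidth;

    // For compressed frames the payload is the packet built earlier.
    frame.p_data = compressed_packet.data();
    frame.data_size_in_bytes = (int)compressed_packet.size();

    // Asynchronous send: the buffer must remain valid until the next send call.
    NDIlib_send_send_video_async_v2(sender, &frame);
}
```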
A critical part of NDI is that you ask the SDK whether you should insert an I-frame. This is done by calling NDIlib_send_is_keyframe_required, which returns true when an I-frame should be inserted so that a downstream source can correctly decode the image without errors.
For instance, this function returns true when there is a new NDI connection, or when a downstream source has dropped a packet and can no longer decode the rest of the GOP (which is detected automatically), and so on.
You are free to insert I-frames whenever your GOP requirements dictate, but when this function returns true you should issue an I-frame at the next possible time. This is required for a good user experience and for a compliant NDI HX source; the stream validation (described later) will verify that this practice is followed.
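A minimal sketch of that check, run once per frame before encoding, might look like the following. NDIlib_send_is_keyframe_required is the call named above (assumed here to take the sender instance and return a bool); the Encoder type and request_idr_frame() are hypothetical stand-ins for whatever control your encoder exposes for forcing an I-frame.

```cpp
#include <Processing.NDI.Advanced.h> // NDI Advanced SDK header; check the exact name/path for your SDK

// Hypothetical encoder interface; replace with your own encoder's controls.
struct Encoder {
    void request_idr_frame() { /* forward to your encoder's keyframe control */ }
};

// Call before encoding each frame so an I-frame can be issued promptly when a
// downstream receiver needs one (new connection, lost packet, etc.).
void maybe_force_keyframe(NDIlib_send_instance_t sender, Encoder& encoder)
{
    if (NDIlib_send_is_keyframe_required(sender))
        encoder.request_idr_frame();
}
```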
Although you can determine your own bitrates for H.264 or H.265 streams, the NDI HX SDK will also provide guidance if you call NDIlib_send_get_target_frame_size. When used with the H.264 or H.265 FourCCs in the structures, this call provides a reasonable average bitrate recommendation based on the selected resolution, framerate, and so on. Estimates are provided for all combinations of framerates and resolutions, but compliance with them is not required.
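A sketch of querying that guidance is shown below. The exact signature of NDIlib_send_get_target_frame_size should be taken from the SDK header; this example assumes it accepts the sender and a video frame describing the resolution, framerate, and FourCC, and returns a recommended size in bytes for an average frame.

```cpp
#include <cstdint>

#include <Processing.NDI.Advanced.h> // NDI Advanced SDK header; check the exact name/path for your SDK

// Convert the SDK's per-frame size recommendation into an average bitrate
// (bits per second) for a 1080p59.94 full bandwidth H.264 stream.
int64_t recommended_bitrate(NDIlib_send_instance_t sender)
{
    NDIlib_video_frame_v2_t desc = {};
    desc.xres = 1920;
    desc.yres = 1080;
    desc.frame_rate_N = 60000;
    desc.frame_rate_D = 1001;
    desc.FourCC = (NDIlib_FourCC_video_type_e)NDIlib_FourCC_video_type_ex_H264_highest_bandwidth;

    // Assumed signature: returns bytes per frame for the described stream.
    const int target_bytes_per_frame = NDIlib_send_get_target_frame_size(sender, &desc);

    const double fps = (double)desc.frame_rate_N / (double)desc.frame_rate_D;
    return (int64_t)(target_bytes_per_frame * 8.0 * fps);
}
```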
Please note that NDI is a real-time API, meaning that you should make every effort to pass video and audio to it as they arrive, in synchronization with each other. They are passed through the transmission layers to the downstream device with the lowest possible latency and may be received by devices that are observing one or both streams (when possible, bandwidth is allocated in transmission only for the streams in use).
Audio and video are sent the moment they are passed to the API, allowing one or both streams to stop or start as needed at any layer. As a result, frames are never "held" to synchronize the streams, which improves performance.