2024 @ NDI Vizrt AB.
Digital Video Basics

Last updated 6 months ago


Digital video involves capturing, processing, compressing, storing, and transmitting moving visual images in digital form. Understanding digital video requires knowledge of several key concepts, including chroma subsampling, bit depth, and color spaces such as YUV (YCbCr) and RGB. These elements are crucial for balancing video quality against bandwidth requirements, particularly in video transmission.

Color Space: YUV vs RGB

Color space refers to the method of representing the range of colors. YUV and RGB are two primary color spaces used in digital video.

RGB is based on the primary colors of light (Red, Green, and Blue) and is used primarily in devices that emit light directly, like computer monitors, TVs, and cameras. It represents colors by combining these three colors at various levels of intensity, suitable for applications where precise color representation is crucial.


In the YUV color space, the Y component represents the luminance, or brightness, of the color, which is essentially a grayscale representation of the image. The U and V components represent the chrominance, or color information, separately from the luminance. Specifically:

U (Chrominance-B): The U component indicates the difference between the blue component and the reference luminance (Y). Essentially, it represents the blue projection of the color minus its luminance, determining how blue, or how opposite of blue (yellowish), the color is.

V (Chrominance-R): The V component indicates the difference between the red component and the reference luminance (Y). It represents the red projection of the color minus its luminance, determining how red, or how opposite of red (greenish), the color is.

The U and V components do not directly correspond to specific colors but to the chromatic difference from the luminance. By adjusting these components, you can shift the hue and saturation of a color, while the Y component maintains the brightness of the color independently of its hue and saturation. This is particularly useful in broadcasting and video compression, where luminance matters more to the perceived quality of the image than color detail. The separation also allows for more efficient compression: the resolution of the U and V components can be reduced relative to the Y component, exploiting the human visual system's lower sensitivity to fine detail in color than in brightness.

Put simply, the U component spans the range between blue and its complementary color, which can be seen as a range from blue toward yellow-green; its midpoint represents a neutral point where the blue influence is minimized. Similarly, the V component spans the range between red and its complementary colors, moving from red toward cyan or green, with its midpoint representing a neutral point where the red influence is minimized.

In essence, the U component controls the blue-yellow balance, while the V component controls the red-cyan (or red-green) balance. By adjusting these two components along with the Y (luminance), you can navigate through the color space to represent a wide range of colors.

The choice between YUV and RGB in a digital video workflow depends on the application's requirements. YUV is typically used for video compression and transmission, where bandwidth efficiency is paramount. In contrast, RGB is used in contexts where accurate color representation and direct control over each primary color are needed, such as in content creation and display technologies.
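The relationship between the two color spaces can be sketched in a few lines of code. This is an illustrative transform using the BT.601 coefficients, one common convention; the exact matrix (BT.601 vs. BT.709, full vs. video range) depends on the format in use, so treat the numbers as an assumption rather than a description of NDI's internal math.

```python
# Sketch: converting a normalized R'G'B' pixel to Y'UV and back,
# using BT.601 coefficients (an assumption -- other standards such
# as BT.709 use different weights).

def rgb_to_yuv(r, g, b):
    """Map R'G'B' in 0..1 to Y' in 0..1 and U, V centered on 0."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: weighted sum
    u = 0.492 * (b - y)                      # blue-difference chroma
    v = 0.877 * (r - y)                      # red-difference chroma
    return y, u, v

def yuv_to_rgb(y, u, v):
    """Inverse transform back to R'G'B'."""
    r = y + v / 0.877
    b = y + u / 0.492
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# Pure white carries all of its information in Y; U and V are zero,
# which is exactly why chroma can be thinned without losing brightness.
print(rgb_to_yuv(1.0, 1.0, 1.0))
```

Note that a grayscale value (R = G = B) always yields U = V = 0, which illustrates the point above: Y alone is a complete grayscale image.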

Chroma Subsampling

With the YUV color space, chroma subsampling can be used to spatially reduce the color information in a signal while retaining the luminance data. This technique leverages the human visual system's greater sensitivity to variations in brightness (luminance) than to color (chrominance). By reducing the amount of color information, chroma subsampling significantly lowers the bandwidth and storage requirements for video data without substantially impacting perceived image quality.

Video signals consist of luminance information, which represents the brightness levels, and chrominance information, which represents the color. Chrominance is further divided into two components, U and V.

Chroma subsampling is expressed in a notation like 4:2:2, 4:2:0, 4:4:4, etc., where the first number (4 in these examples) refers to the reference number of luminance samples in the first row of a block of pixels four wide and two high.

The second number indicates the number of chrominance samples (U and V) in the first row of pixels, showing how many color samples are taken compared to the luminance samples. The third number indicates the number of chrominance samples in the second row of pixels; a value of 0 means the second row reuses the chroma samples of the first row.

4:4:4 subsampling means that no chroma subsampling is applied. Every pixel has its own color information, resulting in the highest quality but also the largest bitrate.

4:2:2 subsampling reduces the color information by half horizontally but keeps full-color information in the vertical direction. It's a common compromise between quality and bandwidth used in professional video environments.

4:2:0 subsampling reduces the color information by half both horizontally and vertically. It is widely used in consumer video formats (e.g., Blu-ray, DVD, streaming media) and in PTZ and prosumer cameras because it significantly reduces the bitrate with minimal visible loss of quality.

4:1:1 subsampling reduces the color information to a quarter of the horizontal resolution while keeping full-color information in the vertical direction.
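The J:a:b ratios above translate directly into raw sample counts. A small sketch (illustrative arithmetic only, not tied to any particular codec) that computes the average number of samples per pixel for each scheme:

```python
# Sketch: relative sample counts for common J:a:b subsampling schemes,
# showing how chroma subsampling cuts raw data before any compression.

def samples_per_pixel(j, a, b):
    """Average samples per pixel: 1 luma sample plus U,V at reduced rates.

    j: luma samples across the reference block's width (usually 4)
    a: chroma samples in the first row
    b: chroma samples in the second row (0 = reuse the first row's)
    """
    chroma_pairs = (a + b) / (2 * j)     # fraction of pixels carrying a U,V pair
    return 1 + 2 * chroma_pairs          # luma + U + V

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0), (4, 1, 1)]:
    spp = samples_per_pixel(*scheme)
    print("%d:%d:%d -> %.2f samples/pixel (%.0f%% of 4:4:4)"
          % (*scheme, spp, 100 * spp / 3))
```

Running this shows the numbers behind the prose: 4:2:2 carries two-thirds of the raw data of 4:4:4, while 4:2:0 and 4:1:1 both carry half, they just distribute the chroma samples differently.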

Alpha Channel

The YUVA color space is an extension of the YUV color space that includes an Alpha channel. In color spaces, the Alpha channel represents the opacity of a color, allowing for varying levels of transparency and compositing images over one another.

The alpha channel sampling is usually the same as the luminance and is expressed in a notation like 4:2:2:4, 4:2:0:4, 4:4:4:4, etc.
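The compositing role of the Alpha channel can be shown with the standard "over" operation. This is the generic straight (non-premultiplied) alpha blend formula, an illustrative sketch rather than anything NDI-specific:

```python
# Sketch: the "over" compositing operation the Alpha channel enables --
# blending a foreground pixel onto a background by its opacity.
# Straight (non-premultiplied) alpha, per channel, values in 0..1.

def over(fg, fg_alpha, bg):
    """Composite one channel value fg over bg with opacity fg_alpha."""
    return fg * fg_alpha + bg * (1 - fg_alpha)

# 50%-opaque white over black yields mid grey:
print(over(1.0, 0.5, 0.0))  # -> 0.5
```

An alpha of 1.0 passes the foreground through unchanged, and 0.0 leaves the background untouched, which is why a per-pixel alpha plane lets one image sit over another with soft edges.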

Bit Depth

Bit depth, also known as color depth, refers to the number of bits used to represent the color of a single pixel in a digital video. Higher bit depth allows for more colors to be displayed, resulting in more detailed and nuanced images. Common bit depths include:

8-bit: Capable of displaying 256 shades per channel, resulting in roughly 16.7 million colors in total (256 × 256 × 256).

10-bit: Can display 1,024 shades per channel, offering over a billion colors. This greater range allows for finer gradients and more detailed color representation, reducing banding in video images.
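The shade and color counts above follow from simple powers of two; a quick sketch of the arithmetic:

```python
# Sketch: how bit depth translates into shades per channel and total
# colors for a three-channel pixel (illustrative arithmetic only).

def color_counts(bits_per_channel):
    shades = 2 ** bits_per_channel        # levels per channel
    total = shades ** 3                   # three channels combined
    return shades, total

for bits in (8, 10):
    shades, total = color_counts(bits)
    print(f"{bits}-bit: {shades} shades/channel, {total:,} colors")
# 8-bit:  256 shades/channel,  16,777,216 colors (~16.7 million)
# 10-bit: 1024 shades/channel, 1,073,741,824 colors (~1.07 billion)
```

Each extra bit doubles the levels per channel, so moving from 8-bit to 10-bit multiplies the total color count by 64, which is where the headroom for smoother gradients and reduced banding comes from.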