Understanding Latency: Causes and Effects
Live streaming to an LED wall can look effortless from the audience's perspective. Behind the scenes, a careful balance of latency, scaling, and robust signal workflows determines whether the picture feels live or feels delayed. The choices you make in cameras, switchers, transports, processors, and even mapping settings each add a few milliseconds. Those milliseconds add up.
At Mobile View Screens, LLC, we guide clients through this puzzle every week across arenas, campuses, festivals, and corporate halls. The goal is consistent: deliver a bright, sharp wall that responds quickly and keeps audio in sync with faces on screen.
Why latency matters on outdoor LED display screens
The audience notices delay most when you show faces and speech at large scale. If the wall runs behind the PA, lips look late. If the sound trails the image, claps feel off. People are surprisingly sensitive to this gap.
- Many live IMAG shows aim for 1 to 2 frames end to end at 60p, roughly 17 to 33 ms in total at about 16.7 ms per frame.
- Broadcast-style remote contribution can run at 200 to 500 ms, as long as the audio sent to the room is delayed to match.
- Hybrid events mix both. The IMAG camera feed must stay tight, while remote callers can tolerate more lag if you align the PA.
An LED wall has two clocks in play, and they often get mixed up. Refresh rate, measured in kHz on LED spec sheets, is the pulse-width modulation frequency that governs flicker and banding in cameras. Frame latency, measured in ms or frames, describes how long a frame takes to move through the system. The first affects image stability. The second affects human perception of “liveness.”
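To keep the two clocks straight, here is a minimal Python sketch that converts each into milliseconds. The 3840 Hz figure is an illustrative driver refresh, not a measurement from any specific panel.

```python
# Two different clocks: PWM refresh vs frame latency.
FPS = 59.94        # system frame rate
PWM_HZ = 3840      # illustrative LED driver refresh from a spec sheet

frame_period_ms = 1000 / FPS      # ~16.68 ms: one frame of latency
pwm_period_ms = 1000 / PWM_HZ     # ~0.26 ms: one PWM cycle

print(f"One frame at {FPS}p: {frame_period_ms:.2f} ms")
print(f"One PWM cycle at {PWM_HZ} Hz: {pwm_period_ms:.3f} ms")
# ~64 PWM cycles fit in each frame -- refresh smooths the image for
# cameras but removes zero frames of delay from the chain.
print(f"PWM cycles per frame: {PWM_HZ / FPS:.0f}")
```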
Where delay hides in a live stream chain
Every device with a frame buffer adds time. Some devices add sub-frame delay; others accumulate whole frames to scale, re-clock, or de-jitter. The trick is not only picking low-latency gear, but also feeding each device a signal it does not need to re-frame.
Below is a practical latency budget for a 60p workflow. Values vary by model and settings, so treat them as planning ranges, not absolutes.
| Stage | Typical latency (ms) | Notes |
|---|---|---|
| Camera sensor + ISP | 5 to 20 | Lower on cinema/live cameras with low-latency modes |
| Wireless camera link | 2 to 7 | RF links for live work; consumer Wi-Fi links often run 60 to 120 ms or more |
| Switcher program path | 8 to 17 | Often 1 frame; additional frames if input needs frame sync |
| Scaler/format converter | 0 to 17 | Many add 0 to 1 frame; avoid double-scaling |
| Encoder (SRT low-latency) | 50 to 150 | Depends on profile, look-ahead, and buffering |
| Network jitter buffer (SRT/RIST) | 50 to 200 | Set by packet loss and desired stability |
| Decoder | 20 to 80 | Hardware decoders tend to be quicker than software on general PCs |
| LED processor/scaler | 8 to 33 | Low-latency modes can be 1 frame; complex scaling can add another |
| Receiver/scan cards + modules | 2 to 10 | Generally sub-frame; depends on driver IC and scan ratio |
When you add those numbers, a direct SDI path to a low-latency LED processor can land near 1 to 2 frames. An SRT path across a campus network may land in the 200 to 400 ms range. An RTMP path often runs 2 to 5 seconds or more, which is fine for a webcast but not for a PA-fed room unless you time-align audio.
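As a planning aid, a short Python sketch can total a budget like the table above. The stage names and ranges here are illustrative, so substitute measured values from your own chain.

```python
# Sum a glass-to-glass budget from planning ranges like the table above.
# Values are illustrative; worst cases rarely all stack at once.
FPS = 59.94

budget_ms = {
    "camera sensor + ISP": (5, 20),
    "switcher program path": (8, 17),
    "LED processor": (8, 33),
    "receiver cards + modules": (2, 10),
}

low = sum(lo for lo, _ in budget_ms.values())
high = sum(hi for _, hi in budget_ms.values())
frame_ms = 1000 / FPS

print(f"Direct SDI budget: {low} to {high} ms "
      f"(~{low / frame_ms:.1f} to {high / frame_ms:.1f} frames)")
```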
Building a low-latency signal path
Pick a master frame rate and stick to it. If your cameras and switcher run at 59.94, feed the LED processor 59.94. Avoid mixing 60 and 59.94. Every conversion invites a frame buffer.
Keep the chain short. If your switcher outputs 1080p59.94, send that directly into the LED processor over SDI. Skip the extra scaler unless you truly need it. If you must route video over IP, pick a transport designed for low latency and set a reasonable jitter buffer.
After the baseline is set, dial in each link in this order.
- Camera I/O: Set low-latency mode, use genlock-capable models if possible, and avoid heavy in-camera noise reduction that adds frames.
- Switcher: Stick to one program format, turn off unnecessary frame syncs, prefer hardware multiview if software adds delay.
- Encoder/Decoder: Choose profiles optimized for speed, keep the GOP short, minimize B-frames, use hardware acceleration (see the encoder sketch below).
- Network: Wired gigabit, QoS for video, avoid Wi-Fi for stage-to-FOH if you care about timing.
- Processor: Enable low-latency or bypass scaling, match input frame rate, use native color format when possible.
- LED Modules: Use high-quality receiver cards and drivers, and keep the scan ratio modest to limit artifacts and improve response.
- Audio: Measure video delay and add matching delay in the PA DSP or console.
This sequence prevents a common problem: solving latency at the processor while the switcher and encoders keep adding frames you did not account for.
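To make the encoder settings concrete, here is a hedged sketch of a low-latency SRT encode driven from Python. It assumes an ffmpeg build with libx264 and libsrt, uses a test pattern as a stand-in for a real SDI capture input, and the receiver address and 120 ms buffer are placeholders.

```python
# A low-latency SRT encode sketch, not a definitive recipe.
import subprocess

cmd = [
    "ffmpeg",
    "-f", "lavfi", "-i", "testsrc2=size=1920x1080:rate=60",  # stand-in source
    "-c:v", "libx264",
    "-preset", "ultrafast",   # favor encode speed over compression
    "-tune", "zerolatency",   # drop look-ahead and frame-reorder delay
    "-g", "30",               # short GOP: half a second at 60p
    "-bf", "0",               # no B-frames, so no reorder buffering
    "-pix_fmt", "yuv420p",
    "-f", "mpegts",
    # ffmpeg's srt latency option is in microseconds: 120000 = 120 ms
    "srt://receiver.example.com:9000?latency=120000",
]
subprocess.run(cmd, check=True)
```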
Smart scaling and mapping
Scaling and mapping choices can make or break perceived speed. An LED processor that rescales every frame will carry a full frame buffer to compute the next output. If the content already matches the wall canvas, the processor can often pass pixels with less work.
Aim for 1:1 mapping from the source to the LED canvas. If the wall is 1792 by 896, build a 1792 by 896 feed from your graphics machine or from a dedicated scaler that can match it pixel for pixel. Many GPU outputs can be set to custom resolutions, and processors offer exact pixel maps. The more you avoid fractional scaling, the less work each device does.
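A quick sketch like the following, using the 1792 by 896 wall from the example above, can flag fractional scaling before load-in:

```python
# Flag fractional scaling between a source feed and the LED canvas.
def mapping_report(src_w, src_h, wall_w, wall_h):
    if (src_w, src_h) == (wall_w, wall_h):
        return "1:1 pixel map -- the processor can pass pixels through"
    sx, sy = wall_w / src_w, wall_h / src_h
    kind = "integer" if sx.is_integer() and sy.is_integer() else "fractional"
    return f"{kind} scale {sx:.3f} x {sy:.3f} -- expect scaling work"

print(mapping_report(1792, 896, 1792, 896))   # custom GPU output, ideal
print(mapping_report(1920, 1080, 1792, 896))  # generic HD feed, fractional
```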
EDID control matters. Present clear EDIDs to your source so it drives the proper pixel clock, color space, and frame rate. If the graphics source presents 4K60 but the processor wants 1080p60, a needless scaler in the middle will add delay. Keep the chain native or use a single high-quality scaler in known low-latency mode.
Color pipelines can be a hidden trap. A 4:4:4 to 4:2:2 to 4:4:4 round trip with limited range conversions can force extra processing. Keep everything in one family if you can. 10-bit 4:2:2 is common in broadcast switchers. Many LED processors prefer 8-bit 4:4:4. Pick the handoff that matches your gear to reduce conversions.
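For a rough sense of why either handoff is workable, this sketch compares raw payloads at 1080p59.94. It counts only active pixel data, ignoring blanking and link overhead.

```python
# Compare raw active-pixel payloads of the two common handoffs.
# 4:2:2 averages 2 samples per pixel; 4:4:4 carries 3.
def payload_gbps(width, height, fps, bits_per_sample, samples_per_px):
    return width * height * fps * bits_per_sample * samples_per_px / 1e9

for label, bits, spp in [("10-bit 4:2:2", 10, 2), ("8-bit 4:4:4", 8, 3)]:
    print(f"{label}: {payload_gbps(1920, 1080, 59.94, bits, spp):.2f} Gbps")
# Payloads are similar, so choose the format your processor handles
# natively and avoid round trips between the two.
```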
Picking the right transport to feed the wall
Direct SDI or HDMI is still the fastest path to a wall. When you need to move video across rooms or buildings, the protocol choice sets your delay floor and your stability ceiling. Decide first whether you need sub-100 ms glass-to-glass or whether a quarter second is fine. Then size your network and jitter buffers.
A quick cheat sheet for common options, with a rough picker sketch after the list:
- Baseband SDI or HDMI: sub-frame to 1 frame, robust, simple cabling or fiber runs
- NDI on a managed LAN: visually lossless, 100 to 250 ms depending on hops and HX use
- SRT point to point: 150 to 350 ms typical with light packet loss headroom
- WebRTC: 100 to 300 ms with careful tuning, more sensitive to jitter
- RTMP/HLS: seconds, great for web audiences, not for IMAG timing
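The picker below turns those floors into a simple filter. The millisecond figures repeat the planning ranges above and should be replaced with numbers measured on your own network.

```python
# Filter transports by a target delay. Floors are planning figures only.
TRANSPORT_FLOOR_MS = {
    "Baseband SDI/HDMI": 17,    # ~1 frame at 59.94p
    "NDI on a managed LAN": 100,
    "WebRTC": 100,
    "SRT point to point": 150,
    "RTMP/HLS": 2000,
}

def viable_transports(target_ms):
    """Transports whose typical floor fits under the target delay."""
    return [name for name, floor in TRANSPORT_FLOOR_MS.items()
            if floor <= target_ms]

print(viable_transports(50))    # IMAG-tight room: baseband only
print(viable_transports(300))   # overflow room: everything but RTMP/HLS
```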
When our teams plan distributed rooms on campuses, we often run a direct SDI feed to the main room wall, then use an SRT or NDI HX path to overflow spaces. The main room gets low-latency IMAG. Overflow rooms get a stable feed that still feels live, with delay aligned to speakers in those rooms.
Genlock, frame sync, and refresh rates
Genlock keeps cameras and switchers marching to the same clock. A switcher locked to tri-level sync can pass frames without re-timing, which reduces the frame-sync penalty on each input. Some LED processors can take external reference to align their output sampling. Others free-run and accept a frame of jitter with an internal buffer. Know which you have.
If the LED processor cannot genlock, keep its input stable and at the exact rate it expects. That reduces rate-conversion buffers. And remember that a 3840 Hz or 7680 Hz LED driver refresh is not a frame rate. It is PWM frequency. High refresh helps cameras capture the wall cleanly and smooths gradients, but it will not reduce frame latency on its own.
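One way to reason about camera banding is to count PWM cycles per exposure, as in this sketch. The rule of thumb that more cycles per exposure means less banding is a simplification, since the real threshold depends on the driver IC and scan ratio; verify with a camera test.

```python
# Count PWM cycles per camera exposure as a rough banding pre-check.
def pwm_cycles_per_exposure(pwm_hz, shutter_denominator):
    return pwm_hz / shutter_denominator

for shutter in (60, 250, 1000):
    cycles = pwm_cycles_per_exposure(3840, shutter)
    print(f"1/{shutter} s at 3840 Hz refresh: ~{cycles:.0f} PWM cycles")
# 1/60 s spans ~64 cycles (smooth); 1/1000 s spans ~4 (banding risk).
```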
Measure it, then align audio
Guesswork about delay rarely ends well. Measure glass-to-glass. A large digital millisecond counter or a clap-slate in view of a camera gives a clean number. Smartphone apps that flash and beep at the same time can also work, as long as you take the reading at the wall.
Once you have the number, apply matching audio delay in the DSP or console for the PA. Many live consoles can insert milliseconds of delay on the main bus or on a matrix feeding the room. If you are feeding in-room presenters with IEMs or foldback, keep those feeds on a low-latency path to avoid disorienting echoes.
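A small helper can turn the measured number into console settings. The 280 ms figure here mirrors the remote-guest example in the field notes below.

```python
# Convert a measured glass-to-glass video delay into PA delay settings.
FPS = 59.94
measured_video_ms = 280   # read from a clap slate or millisecond counter

frames = measured_video_ms / (1000 / FPS)
print(f"Video delay: {measured_video_ms} ms (~{frames:.1f} frames)")
print(f"Insert {measured_video_ms} ms of delay on the PA bus or matrix")
```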
If the LED processor offers a low-latency mode that reduces buffering at the expense of some processing, it is often worth the trade. Combine that with a shorter GOP on your encoder and you can shave tens of milliseconds.
Field notes from Mobile View Screens
At a recent outdoor keynote, the brief called for a 24-foot mobile LED trailer screen under full sun while bringing in a remote VIP. We set the show camera and graphics at 1080p59.94, kept the switcher at the same rate, and fed the wall via SDI into a processor with its low-latency mode active. That path came in near 1.5 frames.
For the VIP, we used a dedicated SRT encoder over a private VLAN with a 120 ms receiver buffer. End-to-end measured around 280 ms. The PA was delayed by 280 ms for segments that featured the remote guest, then switched back to a 33 ms profile for IMAG-heavy moments. The audience stayed with the content, and the stage talent never fought echoes.
Bright midday sun can expose hidden issues on panels, so we also ran the wall at a high brightness profile and kept content contrast tuned for legibility. High-brightness displays with the right driver ICs hold color and punch without crushing mid-tones, which pays off when cameras reframe on faces.
Common pitfalls we see
Even seasoned crews run into the same gotchas. A short checklist helps.
- Double-scaling in a switcher and processor
- Mismatched 59.94 vs 60 timelines
- Wi-Fi hops in the critical path
- Long GOP encodes for live IMAG
- Unlocked cameras forcing framesyncs
- EDID confusion on graphics machines
- PA not delayed to match video
Bringing it all together with service and support
A low-latency LED workflow is equal parts planning and execution. The hardware matters, but so does the discipline to keep formats constant, skip unnecessary scaling, and allocate the right buffers. Testing early with the actual cameras, switcher, processors, and a real wall will catch surprises long before doors open.
Mobile View Screens, LLC has been building and operating outdoor LED display screens and the systems behind them since 1999. Our team brings over 50 years of combined large-screen experience, from mobile LED trailers to modular video walls at arena scale. We carry high-brightness panels that hold up in direct sun, fine pixel pitches for indoor clarity, and multiple processor options that support low-latency operation. Clients rely on fast response, on-site planning, 24/7 support, and backup equipment that keeps shows moving.
