Vendors including Sony, Grass Valley, Ross and Lawo showcased new technology geared toward improving efficiency and interoperability at this week’s gathering in Las Vegas.
LAS VEGAS — Major vendors introduced several new cameras and production switchers at NAB, but the bigger focus was on improving efficiency in existing systems and supporting better interoperability between different vendors’ products.
Getting ‘All Of Our Cameras To Look Alike’
Sony introduced a new high-end camera for live production, the HDC-F5500V Super 35mm 4K CMOS system camera, which uses Variable Neutral Density (VND) technology for increased depth-of-field control and virtual iris capabilities for a wide range of brightness control. The new HDC-F5500V, expected to be available this summer, shares the same infrastructure as Sony’s existing HDC-5500V 2/3-inch camera, including IP networking, remote multi-camera operation and an in-CCU (camera control unit) record option.

With the HDC-F5500V, Sony is addressing a trend of large-sensor Super 35mm cameras being used for live production of sports as well as concerts, said Rob Willox, Sony director of live media solutions.
“A lot of those big concerts want that real cinema depth-of-field look, to really separate the backgrounds from the subjects and give it a nice lush look,” Willox said.
As the lighting at these concert venues can change rapidly with a performer’s lighting effects, Sony has created dynamic filters that automatically compensate for large swings in exposure when shooting live. When shallow depth-of-field cameras are used on such productions, the filters also adjust the picture to maintain the same depth of field over a wide range of lighting conditions.
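In principle, the variable ND idea reduces to simple exposure arithmetic. The sketch below is a conceptual illustration only, not Sony’s implementation; the exposure values and figures are assumptions. It shows how a camera could hold a chosen aperture, and therefore the depth of field, constant by absorbing excess light in the ND filter instead of closing the iris.

```python
import math

def nd_stops_for_constant_aperture(scene_ev, target_ev):
    # Conceptual only: to keep the iris (and therefore depth of field) fixed,
    # a variable ND filter absorbs the difference between the scene's exposure
    # value and the exposure the fixed aperture/shutter/ISO combination needs.
    excess_light_stops = scene_ev - target_ev   # positive = scene too bright
    return max(0.0, excess_light_stops)         # ND can only remove light

def nd_optical_density(stops):
    # One stop of attenuation = a factor of 2 = optical density of log10(2), about 0.3
    return stops * math.log10(2)

# Example: stage lighting jumps 4 stops brighter than the fixed wide-aperture look
# requires, so roughly 1.2 of ND optical density is dialed in instead of stopping down.
stops = nd_stops_for_constant_aperture(scene_ev=14, target_ev=10)
print(stops, round(nd_optical_density(stops), 2))   # 4.0 1.2
```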
Willox also said that sports customers are increasingly using a mix of Sony studio/broadcast, digital cinema and handheld cameras simultaneously for live productions. In response, Sony has created a series of LUTs (lookup tables) that will automatically adjust color and shading across different cameras so their pictures match.
“What we have done over the past year is to try to get all of our cameras to look alike,” he said.
The LUT initiative is working, said Willox, who pointed to the image quality of major sporting events like the Super Bowl, Daytona 500 or CBS’s broadcast of the NCAA Final Four this past weekend, where a broad array of Sony cameras were in use.
“There are a lot of different cameras and a lot of different looks, but the color imagery is all the same, and we’re not pulling you out of the experience because we decided to use a DSLR and a wireless,” Willox said.
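To make the camera-matching idea concrete, the following sketch shows roughly what applying a 3D LUT to a frame involves. It is an illustration only, not Sony’s code: the LUT size, the nearest-neighbor lookup (real systems interpolate) and the identity table are assumptions.

```python
import numpy as np

def apply_3d_lut(image, lut):
    # `lut` is an (N, N, N, 3) table mapping one camera's RGB (in 0..1) to
    # reference RGB. Real systems interpolate (trilinear or tetrahedral);
    # nearest-neighbor lookup is used here only to keep the sketch short.
    n = lut.shape[0]
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# An identity LUT leaves colors untouched; a matching LUT would instead encode
# the color and shading offsets needed to line one camera up with another.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
frame = np.random.rand(1080, 1920, 3)        # stand-in for a camera frame
matched = apply_3d_lut(frame, identity_lut)  # here: the input, quantized to the grid
```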
Sony has also updated the firmware on its CNA-2 Camera Control Network Adapter to support integrated management of multiple sites through Sony’s Global Multi Camera System (GMCS), enabling remote operation of numerous cameras for large-scale sporting events and other applications. Through the optional HZC-MSUCN2 license, simultaneous configuration changes and color adjustments for multiple cameras can be enabled via a web browser.
Willox also highlighted the increasing array of features being built into remote-controlled point-of-view (POV) cameras like the compact, lightweight HDC-P50A box camera, including 4X 4K support for replays from cameras like “SkyCam” POVs in football coverage or POVs mounted behind a hockey goal or basketball backboard.
“There’s an awful lot of camera built into these POV cameras now,” he said.
Sony also showed the latest improvements to the replay system from its subsidiary Hawk-Eye Innovations, which has found traction in recent years with major networks for sports coverage including NFL football. Over the past year Sony has created an asset management system called HawksNest to work with the HawkREPLAY system.
Sony recently announced that Hawk-Eye’s virtual measurement system will be used by the NFL for officiating purposes starting next season, as the primary method of measuring the line to gain for a first down instead of the traditional method of having officials “walk the chains” for 10 yards. The Hawk-Eye digital measurement tool consists of six 8K cameras for optical tracking of the position of the ball. Further strengthening its relationship with the NFL, Sony is also going to provide all headsets for NFL coaching staff starting with the 2025 season.
Sony showed some lower-end technology aimed at the rapidly growing creator community producing for streaming and social media platforms with its Creators’ Cloud suite of software, services and apps, which runs on Sony Ci cloud software using AWS compute. These include the Creators’ App, which connects to cameras to quickly enable live streaming; the Catalyst Browse and Prepare apps, which help source and prepare content for editing; and Monitor & Control, a mobile app for creators that enables wired or wireless video monitoring and precise remote control for single- or multi-camera operations. Sony demonstrated Monitor & Control running multiple cameras from an iPad interface.
Sony’s Nevion VideoIPath networking technology has also now been certified for cloud use by AWS, said Dean LeCointe, director of networked solutions for Sony imaging and professional solutions Americas. Through Nevion, Sony has developed software that translates the network complexity of cloud operations into a simple GUI that a broadcast engineer can easily work with.
Adding ‘A Different Layer To The Story’
Grass Valley also introduced a large-sensor camera designed to capture cinematic shallow depth-of-field images for live production, the LDX 180, which is built on Grass Valley’s popular LDX 100 Series camera platform. Though it has the same form factor as the company’s existing 2/3-inch broadcast cameras, the LDX 180 has a brand-new Grass Valley-developed S35 global shutter CMOS imager. It works with the company’s “Creative Grading” camera shading control panel and tablet application to integrate with 2/3-inch LDX cameras like the LDX 135 and LDX 150 for consistent colorimetry and shared transmission, and it supports SDI and ST 2110 outputs.

While the vast majority of broadcast cameras used for sports and concerts will continue to be 2/3-inch models because of their utility for wide shots, large-imager cameras that can deliver a shallow depth of field are increasingly being used to capture dramatic closeups of athletes or musicians in action, said Grass Valley Director of Product Marketing Ronny van Geel. About 10% of the cameras on a large-scale multicamera production might be large-imager cameras today, he estimated.
“With this camera, you can really add the drama to the production, but within the same visual look,” van Geel said. “It doesn’t break when you watch it, you don’t feel it’s a different camera. You really just have a different layer to the story.”
For smaller productions, Grass Valley also introduced a subcompact version of its K-Frame production switcher, the K-Frame VXP, which has the same processing power and 4K HDR capability but in a maximum 48×24 I/O footprint. It also has a redesigned Karrera panel, which has the same eight-keyer tactile interaction as the Kayenne.
“It’s the same architecture [as Kayenne], but it’s more a cost-efficient way of doing the same level of production with a different panel,” van Geel said.
Grass Valley unveiled two new applications for its software-based AMPP production platform, Sport Producer X and Event Producer X, both designed to allow a single user to control switching, replay, graphics and audio from one intuitive interface. Sport Producer X is aimed at the lower-tier sports and esports markets, while Event Producer X is targeted mostly at corporate and education customers.
In other AMPP news, Grass Valley announced that its Maverik X production switcher, which runs on public or private cloud compute via AMPP, can now be directly controlled via Ross Video’s OverDrive production automation system, a tool used for daily news production by many call-letter stations. Grass Valley and Ross Video said they have worked together to integrate Maverik X with OverDrive as part of their commitment to “open and adaptable production workflows.”
OverDrive, which integrates with newsroom computer systems to automate newscasts via rundown-based control, is now compatible with over 220 third-party devices.
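As a rough illustration of what rundown-based control means in practice, consider the hypothetical sketch below. The data model, item fields and device callbacks are invented for illustration; OverDrive’s actual protocols and integrations are Ross’s own.

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified model of rundown-based control; OverDrive's
# real device protocols and data model are proprietary.

@dataclass
class RundownItem:
    slug: str             # e.g. "A1 COLD OPEN"
    switcher_shot: str    # shot or macro to recall on the production switcher
    graphic: str | None   # lower-third or fullscreen to fire, if any

def execute(rundown, take, fire_graphic):
    # Step through the newscast rundown in story order, driving devices as it goes.
    for item in rundown:
        take(item.switcher_shot)
        if item.graphic:
            fire_graphic(item.graphic)

execute(
    [RundownItem("A1 COLD OPEN", "CAM 1 WIDE", None),
     RundownItem("A2 WX VO", "CAM 2 CU", "L3: METEOROLOGIST")],
    take=lambda shot: print("TAKE", shot),
    fire_graphic=lambda g: print("FIRE", g),
)
```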
“Nobody wants to buy a closed system nowadays,” said van Geel. “Everything, of course, should be open. Nobody is looking for replacing hardware with being locked into a virtual license for the next 10 years. It’s way too dynamic for that.”
Pushing Live Production Into The Public Cloud
Ross Video had switcher news of its own at NAB with the introduction of Carbonite HyperMax, a flexible platform for production switching, routing, video and audio processing, multiviewer monitoring and high-resolution video compositing. The compact product is aimed particularly at mobile trucks and other smaller production spaces. Carbonite HyperMax is designed to be more than a switcher, however. It is based on a software-enabled hardware blade, the “Software Defined Production Engine” (SDPE), that can be configured to run a variety of different Ross tools through dynamic software licensing.

A single software license activates advanced Carbonite HyperMax features such as MaxME, MaxMini and MaxScene on any SDPE blade, while the Ross Platform Manager enterprise-level control system manages software licenses and configurations across SDPE blades and standalone Ross switchers. Carbonite HyperMax is fully compatible with Ross’s TouchDrive family of control surfaces as well as the DashBoard operational control and configuration software.
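Conceptually, the licensing model amounts to mapping purchased features onto whichever blade needs them at the moment. The toy sketch below is an assumption-laden illustration of that idea, not the behavior of Ross Platform Manager; the feature names come from the article, but the mechanics are invented.

```python
# Assumed illustration of dynamic feature licensing on a software-defined blade.
ADVANCED_FEATURES = {"MaxME", "MaxMini", "MaxScene"}

class SDPEBlade:
    def __init__(self, name):
        self.name, self.active = name, set()

    def apply_license(self, features):
        # The same blade can be re-licensed into a different role between shows.
        self.active = set(features) & ADVANCED_FEATURES

blade = SDPEBlade("truck-blade-1")
blade.apply_license({"MaxME", "MaxScene"})
print(blade.name, sorted(blade.active))   # truck-blade-1 ['MaxME', 'MaxScene']
```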
While it is rolling out new hardware products, Ross is also continuing to push live production into the public cloud. The company recently hired AWS veteran Aaron Tunnel as its business development director for cloud solutions, and Tunnel was at NAB to describe Ross’s “Cloud Provisioning Service” (CPS), a new product aimed at simplifying cloud production for broadcasters by automating the configuration of cloud resources and networking needed to perform production tasks.
Tunnel spent nine years at AWS working on cloud adoption strategies for media and entertainment customers, after previously serving as CTO of systems integrator Digital Video Group (DVG). He said that while live production can be made to work in the cloud, there is still a skills gap among broadcasters that needs to be addressed.
“While we’ve been able to do productions in the cloud for a couple of years now, it takes a cloud engineer hours, if not days, to set up a production,” Tunnel said. “The first thing Ross is trying to achieve with CPS is to just make the orchestration of a production a drag-and-drop affair from that perspective. It’s got a GUI interface, and you could drag in different Ross products and hit ‘Go,’ and it will build the cloud infrastructure and allow you to get started on that.”
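The following sketch illustrates the general idea of turning a drag-and-drop production spec into provisioned cloud resources. Everything in it (the spec format, instance types and helper functions) is hypothetical and is not the CPS API.

```python
# Hypothetical illustration of declarative cloud provisioning for a production.
production_spec = {
    "switcher": {"instances": 1, "instance_type": "gpu-large"},
    "replay":   {"instances": 2, "instance_type": "gpu-large"},
    "graphics": {"instances": 1, "instance_type": "cpu-medium"},
}

def provision(spec, launch_instance, connect):
    # Turn a declarative spec into running compute plus the links between apps.
    running = {
        app: [launch_instance(cfg["instance_type"]) for _ in range(cfg["instances"])]
        for app, cfg in spec.items()
    }
    for app, hosts in running.items():      # wire every app into the switcher
        if app != "switcher":
            for host in hosts:
                connect(host, running["switcher"][0])
    return running

provision(production_spec,
          launch_instance=lambda t: f"vm:{t}",
          connect=lambda a, b: print(f"link {a} -> {b}"))
```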
In the future, CPS will also configure Ross hardware products and allow customers to take a snapshot of a production and jump back into it at a later date. A goal is for CPS to work on prem as well as with third-party hardware. It may also offer flexible licensing in the cloud so that customers can choose among on-prem, hybrid and cloud deployments, burst into the cloud, or use the cloud for disaster recovery (DR) as required.
“Really, at the end of the day, our cloud strategy is about flexibility,” Tunnel said. “Customers want choice, and so we want to allow customers to run on prem, run the cloud, run in the hybrid scenario and be agnostic across all those.”
Gabriel Duschinsky, senior manager of cloud product management for Ross Video, said that in the future CPS will have preconfigured workflows for different types of productions, much as templatized graphics are used to quickly create common graphics. But he said the value of the system today is being able to click a button and have CPS spin up compute and set up the networking. He said that installing applications like OverDrive in the cloud used to take two or three days, but that CPS can do it in an hour.
“We can take this a lot further with configuration management, with the time saved to go to a working state,” Duschinsky said. “An operator still has to log into the apps. The difference is, we do all of the hard IT stuff ahead of time.”
A Credit System
Lawo demonstrated how COTS servers can be flexibly configured to run a variety of production applications through its HOME Apps, which provide customers access to various software tools when they purchase “credits” through a Lawo FLEX subscription. Different apps cost different amounts of credits — a multiviewer costs more than a transcoder, for example — and when an app is stopped the credits go back in the customer’s “purse” and can be used for something else.
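The credit mechanics Lawo describes can be pictured with a small sketch like the one below. The class, app names and credit values are assumptions made for illustration; actual FLEX pricing and credit accounting are Lawo’s own.

```python
# Conceptual sketch of a credit "purse" for starting and stopping apps.
class CreditPurse:
    def __init__(self, credits):
        self.credits = credits
        self.running = {}              # app name -> credits it is consuming

    def start(self, app, cost):
        if cost > self.credits:
            raise RuntimeError(f"not enough credits to start {app}")
        self.credits -= cost
        self.running[app] = cost

    def stop(self, app):
        # Stopping an app returns its credits to the purse for reuse elsewhere.
        self.credits += self.running.pop(app)

purse = CreditPurse(credits=10)
purse.start("multiviewer", cost=6)     # heavier apps consume more credits
purse.start("transcoder", cost=3)
purse.stop("multiviewer")              # credits freed for something else
print(purse.credits)                   # 7
```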

Lawo’s premise with HOME is that its hardware-agnostic apps can be run on any type of compute, though right now it is bundling them with high-powered servers, and that they can be controlled from anywhere via a web browser. One of its NAB introductions was Lawo Workspaces, which are remotely accessible user interfaces wrapped around modular microservice-based HOME Apps that provide specific production functionality on the go. Via an HTML5-native UI, HOME Apps with Lawo Workspaces can be controlled via browser from any desktop, laptop, tablet, phone or even AR headset.
The “HOME Commentary” app is a flexible system for both “off-tube” and on-site commentary. It allows commentators or contributors to monitor up to two video streams, send their audio and video to production for contribution or monitoring, and interact with production coordination via a built-in talkback function. In its most compact form, HOME Commentary only requires a portable or mobile host device, a microphone and a pair of headphones for a fully functional commentary station.
The commentator’s coordination mix tunnel and talkback are processed in the HOME Apps backend but can be controlled directly from the Workspace UI, while a built-in audio engine provides local mixing and low-latency monitoring directly in the HOME Commentary app.
Another new Workspaces app, the “HOME Video Monitor,” provides low-latency video and audio monitoring to broadcast professionals anywhere, via laptops, mobile devices or AR headsets. HOME Video Monitor can show between one and nine concurrent video streams, with or without audio metering. And a third, the “HOME mc² crystal Controller,” provides an additional control interface for compact Lawo “crystal” audio consoles used in combination with a large “mc²” audio control surface operated by the lead audio mixer in production hubs and live venues. With HOME mc² crystal Controller, a tablet or laptop placed behind a compact crystal console can display high-resolution audio meters and a video feed, extending the crystal console’s functionality with additional on-screen touch controls.
Lawo also demonstrated a new “HOME Intelligent Multiviewer” based on its server-based processing platform for on-prem and cloud production. The system dynamically allocates processing power based on the multiviewer job at hand, and when paired with Lawo’s .edge SDI-to-IP gateway and edge processing solution, it minimizes bandwidth and CPU usage by intelligently selecting optimal downsized video proxies for its mosaic layouts.
The number of PiPs (pictures-in-picture) in the HOME Intelligent Multiviewer can easily be adapted to changing requirements, scaling from one to 64 splits simply by setting the relevant parameter in the HOME GUI.
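As a back-of-the-envelope illustration of why proxy selection saves resources, the sketch below picks the smallest proxy tier that still fills a PiP tile as the split count grows. The proxy ladder and square-grid assumption are invented for the example; this is not Lawo’s algorithm.

```python
import math

def pick_proxy_height(canvas_height_px, n_pips, proxy_ladder=(180, 270, 360, 540, 1080)):
    # With more PiPs each tile gets smaller, so a lower-resolution proxy suffices,
    # which is where the bandwidth and CPU savings come from. Assumes a roughly
    # square grid of tiles; the proxy ladder is invented for the example.
    rows = math.ceil(math.sqrt(n_pips))
    tile_height = canvas_height_px // rows
    for h in proxy_ladder:               # smallest proxy that still fills the tile
        if h >= tile_height:
            return h
    return proxy_ladder[-1]

for pips in (1, 4, 16, 64):
    print(pips, pick_proxy_height(1080, pips))   # 1080, 540, 270, 180
```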
Lawo marketing manager Andreas Hilmer demonstrated the new HOME multiviewer in different scenarios, quickly switching from a master control room (MCR) configuration to a production control room (PCR) configuration and then to an audio control room, each with a different arrangement of PiPs, using Lawo’s VSM control system. A graphical display showed the differences in CPU usage between the three scenarios.
“If you have a dynamic media facility, it’s important that your control system can handle dynamic resources,” Hilmer said. “That’s what we have built up here.”