Editor’s note: Michael Narracci is the coordinating director of Red Sox broadcasts on NESN and has spent 14 seasons with the network. This is the 11th installment of his “Director’s Cut,” a behind-the-scenes look at NESN’s Red Sox game coverage.
We’re always looking for new technology to use in our baseball broadcasts. Earlier this summer, we used a Sony F-55 4K Super-Mo camera in our telecast; it was the first time the camera had been used in 6X mode on a telecast. In keeping with our mission to push technology in our broadcasts, when Sony approached NESN vice president of engineering David Desrochers about doing a demo of its new Stitch product, we jumped at the opportunity. A few weeks later, during a Red Sox road trip, I met with Sony representatives at their Madison Avenue location in New York City for some preliminary discussions about dates and logistics. On Sept. 10, the demo came to fruition when we used the Sony 4K Stitch system in our NESN Red Sox broadcast.
Our Sony account manager, Steve Dirksmeier, brought his A-Team to Boston to facilitate the demo: product marketing manager Deon LeCointe, acquisition sales support engineer Steve Flynn and sales support engineer manager Joe Perecman. They worked very closely with Desrochers and Game Creek Video engineers Bruce Kerrigan, Pat Calhoun and Tim Jobin to give the demo the best chance of success. Bruce, Pat and Tim prepared the B-unit for the demo on Sept. 8, ensuring we had counter space, a viewing area for NESN management and ownership, video tie-lines, intercom and other infrastructure available.
The Stitch system consists of two Sony F-55 4K cameras, each capable of four times the resolution of a standard HD camera. The cameras were outfitted with Fujinon 19-99mm lenses and mounted on a special jig that aligned them in the proper geometric configuration.
A workstation running the proprietary Sony Stitch software is the brains of the system and is where the “stitching” is done: the two camera images are blended seamlessly into one very wide (32×9) picture. A standard HD image is 16×9.
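For readers curious about the underlying idea, here is a greatly simplified sketch of side-by-side stitching — my own illustration, not Sony’s actual algorithm, which also handles lens geometry and camera alignment. Two 16×9 frames are placed next to each other with a small overlapping strip that is cross-faded so the seam disappears, producing one roughly 32×9 canvas:

```python
import numpy as np

def stitch(left, right, overlap=128):
    """Blend two HxWx3 frames into one H x (2W - overlap) x 3 panorama.

    A simplified sketch: the overlapping strip is linearly cross-faded
    from the left image into the right one to hide the seam.
    """
    h, w, c = left.shape
    canvas = np.zeros((h, 2 * w - overlap, c), dtype=np.float32)
    canvas[:, :w - overlap] = left[:, :w - overlap]
    canvas[:, w:] = right[:, overlap:]
    # Cross-fade weights run from 1.0 (all left) to 0.0 (all right).
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]
    canvas[:, w - overlap:w] = (alpha * left[:, w - overlap:]
                                + (1 - alpha) * right[:, :overlap])
    return canvas

# Two 4K (3840x2160) frames yield a canvas just under 7680 wide,
# i.e. roughly the 32x9 shape described above.
left = np.ones((2160, 3840, 3), dtype=np.float32)
right = np.ones((2160, 3840, 3), dtype=np.float32)
pano = stitch(left, right)
print(pano.shape)
```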
The 32×9 canvas created in the Stitch software. The three colored frames on the canvas are the three virtual HD cameras: one framed on home plate, the second on the first base bag and the third in the area of second base.
A Sony PWS-4400 4K video server and some miscellaneous monitors rounded out the hardware.
The workstation and server were set up inside our NESN B-unit. Typically the B-unit houses engineering support (engineers Bruce Kerrigan, Pat Calhoun and Tim Jobin), robotic operators Joe Francasio and Michael Porta, and miscellaneous storage such as bottled Poland Springs water for our broadcast crew.
The Stitch product had been used during some soccer broadcasts in Europe, but never in a baseball broadcast, so on Sept. 9 we were fortunate enough to have a set day to build the system. During our preliminary planning, we decided that the cameras would be positioned in the Fenway high-third position to provide a “panoramic view” of the entire infield as well as a look into the Red Sox dugout. We also tested a panoramic shot of the entire field of play (minus the left field corner), but we determined that a zoom-in to any base was too pixelated for what we wanted to accomplish.
From this extremely high-resolution image, NESN had the option of creating three separate HD viewing ports (or virtual cameras) that could be panned, tilted and zoomed independently anywhere in the field of view. It’s like having three extra cameras on the broadcast.
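Conceptually, each virtual camera is just an HD-shaped window cropped out of the huge stitched canvas. The sketch below is my own illustration of that idea (the names, numbers and parameters are hypothetical, not Sony’s software): pan and tilt move the window’s center, zoom tightens the crop, and the window is clamped so it never leaves the canvas before being scaled to 1920×1080 for air.

```python
# Hypothetical 32x9 canvas built from two 4K cameras.
CANVAS_W, CANVAS_H = 7680, 2160

def virtual_camera_window(pan_x, tilt_y, zoom):
    """Return (x, y, w, h) of a 16:9 crop centered near (pan_x, tilt_y).

    zoom=1.0 selects a full-height window; zoom=2.0 halves it, and so on.
    The crop would then be scaled to a 1920x1080 output for broadcast.
    """
    h = int(CANVAS_H / zoom)
    w = int(h * 16 / 9)
    # Clamp the window so it stays entirely on the canvas.
    x = min(max(pan_x - w // 2, 0), CANVAS_W - w)
    y = min(max(tilt_y - h // 2, 0), CANVAS_H - h)
    return x, y, w, h

# Three independent windows, e.g. home plate, first base, second base:
print(virtual_camera_window(1200, 1400, 2.0))
print(virtual_camera_window(4000, 1100, 1.5))
print(virtual_camera_window(6200, 1000, 2.5))
```

Each joystick on the tabletop effectively drives one of these windows, which is why the operation feels like a video game.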
The three virtual camera outputs on the monitors on the top shelf, and the 32×9 image on the VGA monitor on the tabletop. The three joysticks on the tabletop control the zoom, pan and tilt of the three virtual cameras, much like a video game. Only bottled water is allowed in our production trucks!
The three virtual cameras let us get a shot even when a traditional game camera wasn’t pointing in the right direction. An example was rookie catcher Dan Butler’s first major league hit. In our standard coverage plan, we don’t have a camera isolated on the Red Sox dugout unless we break one from its traditional responsibilities. When Dan got his first hit, a bunch of players jumped up and clapped for him. Using the Sony Stitch product, we were able to pan over to the dugout with “Virtual Camera 1” and play back the players’ reactions to his hit after it had happened!
Coordinating director Michael Narracci operating the three virtual cameras and server operator Daniel Phipps doing motion control of the footage and logging.
Overall, the demo went well, though some of the scenarios we were hoping for did not happen. One was a close play at one of the bases that we could zoom into to provide the definitive angle of the play; another was the turn of a double play at second base that we could zoom toward to see how the shortstop or second baseman avoids collisions.
It was a pleasure meeting and working with the folks from Sony’s Broadcast Division. This was the first attempt at using Stitch in a baseball game, and we learned a lot. Keep in mind, this is the very beginning of this technology and it will only get better. We had some very productive discussions with Sony on how we thought the product could be better. Perhaps we will see it on a Bruins game this winter.