July 11, 2017
Wired
IT'S THE NIGHT before the world's first professional flag football game, and everything's going wrong. Not that it matters.
The league's two teams, led by former NFL all-pros Mike Vick and Terrell Owens, have just taken the field for a dress rehearsal of sorts. On the first play, someone bobbles the throw-off. (Throw-off? Yes. There's no kicking in flag football.) That should end the play, but no one remembers that. The 14 players on the field keep at it until Jeffrey Lewis, the founder of the American Flag Football League, runs toward the field yelling "Dead! Dead! Whistle!"
Not to worry. Everyone will figure out the rules. The flags pose a bigger problem. They use magnets and wireless radios to help referees pinpoint exactly where a ballcarrier went down. But the magnets won't hold through the vinyl on the flag, and Velcro can't stand up to the rigors of gameplay. League employees pace the sidelines here at Avaya Stadium in San Jose, debating a solution. Maybe Gorilla Glue?
Still, Lewis seems pleased. He summons two people to a monitor propped up in the midfield tunnel to explain why. Chaos reigns on the field, but the broadcast feed looks gorgeous. The virtual first-down line holds in place. The swooping SkyCam looks almost Madden-like. The "Go Clock" that counts down the four seconds until the quarterback must pass or run works perfectly. Replays come from all angles just moments after each play. And with no helmets or pads in the way, viewers can see every grimace, every celebration, every frustrated harrumph. It makes for great TV, which is all that really matters to Lewis.
Lewis and his team spent just nine months building the AFFL: lining up investments, writing the rules, finding players and broadcasters and live-streaming partners. So far, the league consists of just two teams and the goal of launching the Flag Football US Open next year. This isn't the first league to challenge the supremacy of the NFL, but Lewis believes flag football provides a faster, safer, more social media-savvy take on football. He's definitely on to something. Now if the players could just remember the rules.
Redesigning the Gridiron
The way Lewis tells it, the AFFL started on the sidelines of his son Hayden's flag football game. He'd coached the game for a few years, and found the sport surprisingly entertaining. "It just made me say to myself, 'I wonder what this game would look like if it was played by the greatest players in the world?'" he says. He figured he could tap the pool of skilled players who aren't in the NFL for one reason or another and create a league just as exciting.
The game riffs on the one you probably played as a kid. Players spend 60 minutes working their way up and down a standard football field delineated by four 25-yard zones. Each team features a dozen players and fields seven at a time. Everyone plays offense and defense. Games start with a throw-off, with a player of the coach's choice hurling the ball as far as possible. The receiving team has four plays to reach the next 25-yard zone, then the next, and so on until it scores or throw-punts (no kicking, remember?). The game moves quickly—the quarterback has just four seconds to heave the ball or start running, and the defense can rush the backfield after two seconds. Someone pulls your flag? You're down.
Touchdowns from less than 50 yards earn six points. Anything longer earns seven. After scoring a touchdown, teams can go for one, two, or three more points with a conversion. Tackling, blocking, and kicking? Strictly forbidden. The clock runs constantly, until the last two minutes of the first half and the last five of the second half, when the clock stops after each play.
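The point values reduce to a few lines of logic. Here is a minimal sketch in Python (the function names, and the reading that a touchdown of 50 yards or more earns seven points, are ours, not official AFFL rules text):

```python
def touchdown_points(yards: int) -> int:
    """AFFL touchdowns: 6 points under 50 yards, 7 from 50 yards or more (our reading of the rule)."""
    return 7 if yards >= 50 else 6

def conversion_points(attempted: int, successful: bool) -> int:
    """After a touchdown, teams may try for 1, 2, or 3 extra points."""
    if attempted not in (1, 2, 3):
        raise ValueError("conversion tries are worth 1, 2, or 3 points")
    return attempted if successful else 0

# Example: a 62-yard touchdown followed by a successful 3-point conversion.
print(touchdown_points(62) + conversion_points(3, successful=True))  # 10
```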
The league launches next year with the Flag Football US Open, an epic tournament that Lewis envisions drawing 1,000 teams and offering a seven-figure prize. Winning teams will recruit from losing teams, until only the best remain. Think of it as American Idol or American Ninja Warrior, but with a football. The eight best teams will then play the inaugural AFFL season.
To pull this off, Lewis hit up friends, family and investors. Everyone called him crazy. They pointed to the long list of wannabes that hoped to challenge the NFL. Remember the XFL? Ever watched arena football? The United Football League lasted all of four seasons (go Destroyers), and even the NFL's European expansion flopped. The NFL has many problems, but disruption is not one of them.
When I point this out, Lewis waves me off. Those leagues couldn't execute, he says. You can see his point. The XFL debuted to enormous ratings in 2001 but quickly fell apart because no one wants football with wrestling. Every challenger failed because they offered, well, crap. "It would be like, 'Oh, it's the same game as the pros, with people who aren't as good,'" says Darren Rovell, a sports business reporter at ESPN. People do want more football, Rovell says, but the game must offer something novel.
The novelty of flag football, of course, contains something at least one potential investor considered a downside: the complete lack of violence. "He felt the violence of football was the primal element, that it was what drew people to football," Lewis says. Eager to prove the naysayers wrong, Lewis and his team scoured social media data, trying to figure out what people talk about while watching NFL games. Turns out words like touchdown, pass, run, and interception come up a lot. Hit, block, and tackle? Not so much. The NFL offers the Red Zone channel to show non-stop scoring, but as Lewis says, "there's no Hard Hits channel." If social media is any indication, then you'll find all of the most exciting and shareable parts of the NFL in flag football.
The more Lewis researched, the more opportunity he saw on social media. For instance: Football remains America's most popular sport, yet no NFL player appears in the top 20 of ESPN’s list of the 100 most famous athletes. Only Tom Brady (No. 21) and Cam Newton (No. 47) crack the top 50. Lewis firmly believes that's because NFL players are faceless gladiators that viewers can't relate to. In other sports, "whether you're rooting for a guy or against the guy, he's a character, and you feel like you have a relationship to them," Lewis says. Eliminating the helmets and pads makes it easier for viewers to see, and relate to, players. And the AFFL wants those players tweeting, streaming, snapping, and posting nonstop—things that'll earn you a fat fine in the NFL—to deepen their relationships with fans.
Prototype Pigskin
With no players union to fight, billion-dollar TV contracts to protect, or traditions to uphold, the AFFL has been free to use technology in novel ways to enhance the game.
Take the flags: Refs must know exactly where a player is when his flag gets pulled. This is trickier than you think when you've got 14 guys hustling through a play. The flag might flutter away, or be thrown aside by the guy who yanked it. So the league tapped SMT, the company that invented the virtual first-down line and handles scoring and statistics for the NCAA basketball tourney, the Super Bowl, and Stanley Cup. SMT developed a simple vinyl flag with a magnet sewn into one end that sticks to another magnet on a belt around the player's waist. Separate the magnets and a sensor in the belt sends a signal to a small transmitter on each player's shoulder. That transmitter communicates constantly with wide-band wireless receivers, suspended from the stadium's ceiling, that record every player's location 15 times per second. By matching the detach signal to a location on the field, SMT's system can pinpoint to within 3 inches where a flag was pulled. The system's not just for players, by the way: A sensor in the ball provides incontrovertible proof of whether the quarterback threw the pass in time.
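SMT has not published the matching logic, but the basic idea, pairing the time-stamped detach signal with the nearest position sample from the overhead receivers, can be sketched in a few lines of Python (all names and numbers here are illustrative, not SMT's actual system):

```python
from bisect import bisect_left

# Position samples arrive roughly 15 times per second per player:
# (timestamp in seconds, x in yards, y in yards)
positions = [
    (10.000, 31.2, 24.0),
    (10.067, 31.9, 24.1),
    (10.133, 32.6, 24.3),
    (10.200, 33.2, 24.4),
]

def spot_of_flag_pull(detach_time, samples):
    """Return the (x, y) sample closest in time to the flag-detach signal."""
    times = [t for t, _, _ in samples]
    i = bisect_left(times, detach_time)
    neighbors = samples[max(0, i - 1): i + 1]
    _, x, y = min(neighbors, key=lambda s: abs(s[0] - detach_time))
    return x, y

print(spot_of_flag_pull(10.12, positions))  # (32.6, 24.3)
```

A production system would presumably interpolate between samples rather than snapping to the nearest one, which is how precision tighter than the 15-per-second sample spacing (the quoted 3 inches) becomes plausible.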
The technology ensures officiating accuracy, which, combined with subtle tweaks to the rules of football, makes for snappier gameplay. There will be no interminable four-hour games riddled with eight-minute replay reviews; Lewis wants to see games last two hours and not one minute more. Refs in the review booths way up in the press box will leave most calls to their colleagues on the field, but if someone gets something wrong, the computers will know almost immediately.
The flag-pulling stuff is nothing compared to Oasis, SMT's sophisticated system for tracking a player's heart rate, temperature, speed, impact force, and other data in real time. "The idea that you get that stuff live is a damn big deal," says company CEO Gerard J. Hall. All that data can help protect players and help them optimize their performance. Hall says he's talking to the AFFL about doing more with the system.
It's worth noting that football without tackles is a far safer game, one that greatly diminishes the risk of concussions and CTE. But Lewis sidesteps the idea that part of the appeal here is "the NFL, only safer." If nothing else, safety makes for a boring marketing plan. Yet he can't have missed the growing trepidation, if not moral ambivalence, fans feel as the game's dangers become more widely known. Just as the XFL pioneered on-field interviews and the SkyCam, the AFFL could show the potential of tracking player data on the field. At a minimum, it gives Lewis and Hall a backup plan. "That business [Lewis] says he has," Rovell asks, referring to the league itself, "is that really the business? Or is that just a red herring? Is he going to sell the tech to the NFL, which is bigger than any flag football league could ever be?"
Going Live
The day after that haphazard scrimmage, following a long night of production meetings and the swap of those magnets for Velcro, the AFFL put on its first big show. A few hundred fans filed into the stadium, each grabbing a brochure announcing the AFFL and explaining the rules of the game. Some wanted to see a game they knew from childhood played at the highest level. Others held loftier goals. "I might try to get on the field," one slightly drunk observer told me. "We'll see." But mostly people wanted to see the likes of Vick and Owens playing again.
The two rosters combined veterans of the 2003 Pro Bowl and guys who almost made it in the NFL. Vick, Owens, and Chad Ochocinco were among the sport's brightest stars during their illustrious (and controversial) careers. Justin Forsett, Jimmy Clausen, and Jonas Gray played one or two great games before moving on. And guys like Najee Lovett, Yamen Sanders, and Stephen Godbolt never saw a pro gridiron. Some of them consider the AFFL a stepping stone to the NFL. Others seem to simply enjoy playing again. And they share a common motivator: Appearing in the inaugural game earned them equity in the league. Players on the winning team earn even more shares.
The success of the league depends upon the celebrities it can create. And so about 10 minutes before throw-off, fans gathered at the tunnel as players trickled onto the field for autographs and selfies. After mingling with fans, the players hit the field, hooting and hollering and snapping photos of their own.
The two-hour game went more smoothly than the dress rehearsal, with plenty of circus catches, more than 100 points scored, and Vick doing cartwheels. Players ran three-point conversions and seven-point touchdowns. The final play was a pick-six onside throw-off, run back for a touchdown by Max Siebald, a former lacrosse player. None of which makes any sense, but all of which somehow felt appropriate.
Vick's team won, 64-41. A few minutes after the game ended, the players gathered on the sidelines. Evan Rodriguez, a brawny former NFL tight end who caught four touchdown passes and was named the game's MVP, gathered his teammates behind him. He raised his gold iPhone high overhead in his left hand, and everyone grinned and shouted and threw their arms up. Rodriguez snapped a selfie. At that moment, the AFFL was truly born.
Jason Dachman, SVG
SMT graphics system will leverage NFL Next-Gen Stats, be implemented on CBS’s Skycam and all-22 camera
This Sunday’s Patriots-Steelers AFC Championship showdown in Foxboro, MA, will mark the culmination of one of the most technologically innovative NFL on CBS campaigns in history. Several innovations that debuted on the broadcaster’s epic production of Super Bowl 50 have become standard for CBS Sports’ A-game productions. Their deployment will be even more robust on Sunday, when CBS Sports’ largest NFL production of the year will feature more than 50 cameras and the debut of player-tracking graphics on the Skycam and the all-22 camera angle.
“We continue to ramp it up with each game,” says Harold Bryant, executive producer/SVP, production, CBS Sports, “adding more cameras, including more high-speed, super-slo-mo, and 4K [cameras]. We’ll have more than 50 cameras when you add it all up; just a few years ago, that was a Super Bowl level. So that’s pretty impressive for an AFC Championship game.”
Virtual Tracking To Be Integrated Into Replays
For the first time, CBS Sports will implement player-tracking graphics (using the NFL Next-Gen Stats system) on the Skycam and the high-angle all-22 camera position showing all the players on the field. CBS has been working with SMT in recent weeks to develop the system (including a full live test during last week’s Divisional game in New England), which will integrate virtual-tracking graphics/telestration of players in replays.
“Essentially, this is an enhanced telestrator that’s combined with NFL Next-Gen Stats,” says Bryant. “On a big play, you will be able to see the exact moves LeGarrette Blount, for example, makes on a big run using a virtual line drawn on the field. Or, if there’s a defender that’s mirroring Antonio Brown step for step, we can put lines on the field to see that. If every receiver is covered — you hear that a lot — we can highlight all the defenders and show how tight they are against all receivers. The [all-22 camera] will give us a high look, and [the Skycam] will give us a lower wide look.”
According to Bryant, the system is fast enough to serve as the first or second instant replay, allowing the production team to integrate the segments into the telecast as they would any other replay.
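Neither CBS nor SMT has detailed the implementation, but conceptually the effect boils down to projecting a player's tracked field positions into the replay camera's image and drawing a line through them. A rough Python sketch, assuming a known field-to-image homography for the camera (the calibration values below are invented for illustration):

```python
import numpy as np

def project_path(field_xy, homography):
    """Map (x, y) field coordinates (yards) to pixel coordinates via a 3x3 homography."""
    pts = np.hstack([field_xy, np.ones((len(field_xy), 1))]) @ homography.T
    return pts[:, :2] / pts[:, 2:3]  # perspective divide

# A runner's tracked path (yards) and a made-up camera calibration.
path_yards = np.array([[20.0, 26.5], [23.1, 25.8], [26.4, 27.2], [30.0, 30.1]])
H = np.array([[12.0,   0.0, 100.0],
              [ 0.0, -12.0, 620.0],
              [ 0.0,   0.0,   1.0]])
print(np.round(project_path(path_yards, H), 1))  # polyline vertices to draw on the replay frame
```

Because the Skycam and the all-22 camera move, the mapping has to be re-derived for every frame from the camera's tracked pan, tilt, and zoom, which is similar to the calibration the virtual first-down line already relies on.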
CBS Sports also worked with SMT to implement the 1st-and-Ten virtual graphic on the Skycam system, which has been used throughout CBS’s NFL Playoffs coverage and will be integrated into the AFC Championship game telecast.
Pylon Cams Come of Age
Over the past two seasons, pylon cams have dramatically changed the way broadcasters cover the NFL, creating new, never-before-seen angles of key scoring plays. CBS has helped lead that effort, and Bryant expects the use of pylon cams to become more prevalent.
“The pylon cams have been a tremendous addition to football broadcasts,” he explains. “I believe it’s going to become a staple in all major football broadcasting in the [near future]. It gives you that great look down the goal line and end line, and that is so important since so many plays happen in those corners of the end zone. It gives you that really low, intimate look, especially when a player is reaching out trying to get the ball over the pylon, which we had last week. And it will only get better as the cameras get better, so there’s a lot of potential for growth with the pylon cameras as the technology improves.”
Inside the Truck and On the Field
The AFC Championship production will be run out of NEP’s SSCBS, which was rolled out for the 2015 season and continues to serve as the NFL on CBS A unit. At the front bench will be Coordinating/Lead Producer Lance Barrow and Lead Game Director Mike Arnold, both working their 13th consecutive AFC Championship game for CBS.
Among the highlights of the 50-plus cameras deployed at Gillette Stadium are 10 high-frame-rate cameras of various speeds and multiple 4K cameras, including a Sony HDC-4800 4K 8X slo-mo system.
Wrapping Up the 2016-17 Campaign
Sunday marks the finale of CBS Sports’ 57th year broadcasting the NFL, and, with its Sunday schedule and the first half of the Thursday Night Football package, the network aired more NFL games than any other broadcaster this season. Among the highlights for Bryant was a simple and sleek new graphics package, which debuted at Super Bowl 50 and carried into the 2016 regular season.
“One of the things we’re most proud of is the implementation of the new graphics look this year,” he says. “We didn’t let up [after] a Super Bowl year, and graphics was a big part of it. For us, it was continuing to push to advance our coverage — whether it was trying out the new virtual graphics with SMT, working in more high-frame-rate cameras, or using the pylon camera more. Sometimes, those things happen for the Super Bowl only and don’t carry over to the season, but we brought a good number of the achievements from Super Bowl into the regular season and the playoffs. So that’s what made us very proud this year.”
The 2019 College Football Playoff National Championship concludes tonight at Levi’s Stadium in Santa Clara, CA. Like every other football game, it will feature two teams — in this case, Alabama and Clemson — and one broadcaster. For its part, ESPN is once again all-in for the big game, deploying more than 310 cameras to cover all the action and providing 17 viewing options via the MegaCast over 11 TV and radio networks and via the ESPN app.
“The thing that makes this event is the volume and magnitude of what we put behind it but also the time frame,” says John LaChance, director, remote production operations, ESPN. “[There are] other marquee events, which stand alone, but, with the volume and viewer enhancements being done here in a 72-hour window to get everything installed, this event [is] in a unique classification. Trying to integrate everything into place was a herculean effort.”
The game wraps up a season in which ESPN’s production team delivered more than 160 games to ABC and ESPN and more than 1,000 games to various other ESPN platforms.
“To watch that volume and make sure all the pieces are in place is a highlight for all of us, [seeing] it go from plan to working,” says LaChance. “You always have things that are challenges, but it’s about how quickly you can recover, and I think we’ve done it well.”
The core of ESPN’s production efforts will be done out of Game Creek Video’s 79 A and B units with Nitro A and B handling game submix, EVS overflow, 360 replay, robo ops, and tape release. ESPN’s team creating 17 MegaCast offerings is onsite, housed in Nitro, Game Creek’s Edit 3 and Edit 4 trailers, and TVTruck.tv’s Sophie HD. Game Creek Video’s Yogi, meanwhile, is on hand for studio operations, and Maverick is also in the compound. All told, 70 transmission paths (50 outbound, 20 inbound) will be flowing through the compound, and 40 miles of fiber and cable has been deployed to supplement what already exists at Levi’s Stadium.
Also on hand are Fletcher, which is providing robotics; BSI, handling wired pylons and RF audio and video; 3G, which is in charge of the line-to-gain PylonCam and the first-and-10–marker camera; Vicareo, with the Ref Cams; and CAT Entertainment, for UPS and power. SMT is on board for the 1st & Ten lines; PSSI, for uplink; Bexel, for RF audio and other gear; and Illumination Dynamics, for lighting.
“It’s a team effort,” says LaChance. “I couldn’t be prouder of the team we assembled here and the vendors, technicians, leads, and staff that have, over the course of the last several months and weeks when it gets to a fever pitch, put it all together.”
The Camera Contingent
A large part of the 300-plus-camera arsenal comprises 160 4K DSLR cameras deployed for the 4D Replay system, which will provide definitive looks at every play from every angle. Those cameras are mounted around the stadium; combined, they provide images that can be merged on computers, enabling an operator to zoom around a play and show any angle.
One place where the 4D system is poised to shine is the Red Zone. The 4D Replay team and ESPN have created templates that can cut the time needed to synthesize the images for plays around the goal line and pylons to eight seconds.
Besides the 160 4D replay cameras, plenty of cameras are focused on the game action, including 90 dedicated to game coverage. Among those are 10 super-slo-mo cameras, nine 4K game cameras, 15 RF cameras, two SkyCams, and two aerial cameras in a blimp and fixed-wing aircraft. The vast majority of cameras are Sony models (mostly Sony HDC-2500 and HDC-4300 with one HDC-4800 in 4K mode) coupled with Canon lenses, including five 100X, two 95X, 21 wide-angle, and 14 22X and 24X lenses. Seven 86X lenses and a 27X lens are also in use.
The game-coverage cameras are complemented by specialty cameras. Four Vicario Ref Cams will be worn by the officials; a line-to-gain RF PylonCam will move up and down the sideline with the first-and-10 marker, which also has a camera; and eight PylonCams around the end zones provide a total of 28 cameras.
The RefCam is new this year, having been tested during last year’s final in Atlanta. The MarkerCam did debut last year, and LaChance says it has been improved: “It has a c360 Live camera in the target portion of the marker to give a 180-degree perspective in 4K. The operator can push in and get a great perspective; we are taking it to another level with the push in.”
A second c360 camera will also be in use on the second SkyCam, again giving the ESPN team the ability to zoom in and capture images.
Another exciting new offering is AllCam, a system designed by ESPN’s in-house team and ChyronHego. It stitches images from three 4K cameras placed alongside the all-22 camera position and gives the production team the ability to zoom in anywhere on the field to capture events that might have taken place away from the action. For example, in a test at a bowl game, the system was used to show an unnecessary-roughness violation that took place during a kickoff far from the other players, who were focused on the run-back.
“It’s another good example of the partnerships we have and working for a common goal,” says LaChance.
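ChyronHego and ESPN haven't published AllCam's internals, but the core operation, stitching adjacent camera images into one wide canvas and cropping a virtual zoom window from it, can be approximated like this (a simplified Python sketch that assumes pre-rectified, side-by-side frames; the real system also blends overlapping fields of view and corrects lens geometry):

```python
import numpy as np

def stitch_side_by_side(frames):
    """Naively concatenate pre-rectified frames of equal height into one wide canvas."""
    return np.concatenate(frames, axis=1)

def virtual_zoom(canvas, cx, cy, w=1920, h=1080):
    """Crop a w x h window centered on (cx, cy); the switcher would scale it to output size."""
    x0 = max(0, min(canvas.shape[1] - w, cx - w // 2))
    y0 = max(0, min(canvas.shape[0] - h, cy - h // 2))
    return canvas[y0:y0 + h, x0:x0 + w]

# Three dummy 4K frames standing in for the cameras alongside the all-22 position.
frames = [np.zeros((2160, 3840, 3), dtype=np.uint8) for _ in range(3)]
canvas = stitch_side_by_side(frames)                   # 2160 x 11520 pixels
penalty_view = virtual_zoom(canvas, cx=9200, cy=1400)  # e.g., a flag thrown away from the ball
print(canvas.shape, penalty_view.shape)
```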
Beyond the game-coverage cameras, there are 20 cameras dedicated to the various MegaCast feeds, 29 for ESPN College GameDay, and nine for the SEC Network. ESPN Deportes also has two dedicated cameras.
All told, the production team will have access to 320 sources via 170 channels of EVS playback as well as 32 channels of Evertz Dreamcatcher playback. There are also two Sony PWS-4500 servers in use, a Sony BPU-4800 4K record server, and two c360 record servers.
Non-Stop Action — for the Production Team
The game wraps up a busy time for the production team as well as for those who work at Levi’s Stadium. LaChance credits Jim Mercurio, VP, stadium operations/GM, Levi’s Stadium, and Nelson Ferreira, director, technical operations, San Francisco 49ers, with being an important part of the process during the past year.
“It’s a solid venue and great group of folks to work with, and that helps,” says LaChance. “They have done the Super Bowl here, and they do a lot of great events, so they are well-equipped. We had to supplement with some fiber, but they had a great infrastructure to start with.”
As for the ESPN team, everybody worked on one of the two semifinals as well as an additional bowl game.
“Folks that did the Cotton Bowl headed on to the Sugar Bowl, and those that did the Orange Bowl headed to the Rose Bowl,” says LaChance. “A lot of the people here have been non-stop since the Christmas Day offerings for the NBA, then right into a semifinal assignment, then the second of the New Year’s bowl offerings, and then making their way here to Santa Clara for one of the largest events the company does every year.”
For anyone looking to see what the new toys will bring to the show, LaChance recommends tuning into the TechCast, which will have a sampling of everything that will be used, including 4D Replay, C360, and the RefCam.
“Besides the game itself,” he says, “tune into the TechCast. Hopefully, the weather is good for us, and we can offer the BlimpCast from the Goodyear airship, which is another opportunity to provide a unique look for viewers at home.”
2018 was one of the most eventful years for sports production in recent memory, with the 2018 PyeongChang Olympics and the 2018 FIFA World Cup capturing the nation’s attention and annual events like the College Football Playoff National Championship Game, the Super Bowl, and the NFL Draft breaking production records and test-driving new technologies and workflows. As if there weren’t enough going on stateside, this year’s Road Warriors features an expanded look at what went on across the Atlantic. Here is Part 2 of SVG’s look at some of the sports-production highlights from the past year.
US OPEN
USTA Billie Jean King National Tennis Center, Flushing Meadows, NY
August 27–September 9
For ESPN, it simply doesn’t get bigger than US Open tennis. In the network’s fourth year as host broadcaster and sole domestic-rights holder — part of an 11-year rights deal — the technical and operations teams continued to evolve production workflows and add elements. Highlights this year included the debut of a Fletcher Tr-ACE/SimplyLive ViBox automated production system covering the nine outer courts and several new camera systems.
“This truly is the largest event that ESPN produces out of the thousands of events that we do all year,” said ESPN Director, Remote Operations, Dennis Cleary, “and it’s all done in a 3½-week span.”
For the first time, ESPN covered all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts. Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system was deployed on each court (with four robos per court) and relied on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic-camera operator and one ViBox director/producer covered each of the nine courts.
New this year was a two-point aerial CineLine system (provided by Picture Factory) running between Louis Armstrong Stadium and Court 10, a run of roughly 1,000 ft. After a successful debut at Wimbledon in June and the Australian Open in January, Telstra Broadcast Services’ NetCam made its US Open debut. The Globecam HD 1080i/50 POV miniature robotic camera was deployed on each side of the net for singles matches at Arthur Ashe Stadium, Armstrong, and the Grandstand, providing viewers with a close-up look at the action on the court. In addition, both Intel’s True View 360-degree camera system and the SpiderCam four-point aerial system returned to Ashe.
The US Open production compound was almost unrecognizable from five years ago, prior to ESPN’s taking over as host broadcaster. What had been a caravan of production trucks became two permanent structures housing ESPN’s NTC broadcast center and production/operations offices, along with two ultra-organized stacks of temporary work pods housing the TOC, vendors, international broadcasters, and ESPN’s automated production operation for the outer courts. NEP’s NCP8 was on hand for ESPN’s ITV operation (serving AT&T/DirecTV’s US Open Mix Channel), and NEP’s Chromium and Nickel were home to the USTA’s world-feed production. — JD
U.S. OPEN
Shinnecock Hills Golf Club, Shinnecock Hills, NY
June 14-17
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
Game Creek Video’s Encore production unit was at the center of the coverage for Fox and FS1, with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit handled production control for one of the featured groups, Edit 4 supported all iso audio mixes, and Edit 2 was home to five edit bays with equipment and support provided by Creative Mobile Solutions Inc. (CMSI). There was also the 4K HDR show, which was produced out of Game Creek Maverick.
“All the Sony HDC-4300 cameras on the 7th through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” noted Brad Cheney, VP, field operations and engineering, Fox Sports, during the tournament. There were also six Sony PXW-Z450’s for the featured holes and featured groups, the output of two of them delivered via 5G wireless.
In terms of numbers, Fox Sports had 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless units, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps), and 218 microphones. Tons of data was passed around: 3 Gbps of internet data was managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
Each course provides its unique challenges. At Shinnecock Hills, they included the roads running through the course, not to mention the hilly terrain, which also had plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch. RF wireless coverage was provided by CP Communications. There were 26 wireless cameras on the course, along with 18 wireless parabolic mics and nine wireless mics for on-course talent. CP Communications also provided all the fiber on the course. — KK
MLB ALL-STAR GAME
Nationals Park, Washington, DC
July 17
With its biggest summer drawing to a close with the MLB All-Star Game, Fox certainly showed no sign of fatigue technologically. Not only did the network roll out a SkyCam system for actual game coverage for the first time in MLB history, but Fox also deployed its largest high-speed–camera complement (including all 12 primary game cameras), two C360 360-degree camera systems, and ActionStreamer POV-style HelmetCams on the bullpen catcher, first-base coach, and Minnesota Twins pitcher José Berríos.
“People always used to say Fox owned the fall with NFL and MLB Postseason, but, this year, we owned May through July, too, with the U.S. Open, World Cup, and now All-Star,” said Brad Cheney, VP, field operations and engineering, Fox Sports. “The capabilities of our [operations] team here are just unsurpassed. For big events, we used to throw everything we had at it, and it was all hands on deck. That’s still the case, but now, when we have big events, everybody’s [scattered] across the globe. Yet we’re still figuring out ways to raise the bar with every show.”
Between game coverage and studio shows, Fox Sports deployed a total of 36 cameras (up from 33 in 2017) at Nationals Park, highlighted by its largest high-speed–camera complement yet for an All-Star Game. Building on the efforts of Fox-owned RSN YES Network, all 12 of Fox’s Sony HDC-4300 primary game cameras were licensed for high-speed: six at 6X slo-mo, six at 2X slo-mo. This was made possible by the ultra-robust infrastructure of Game Creek Video’s Encore mobile unit.
Fox also had two Phantom cameras running at roughly 2,000 fps (at low first and low third) provided by Inertia Unlimited and a pair of Sony P43 6X-slo-mo robos at low-home left and low-home right provided by Fletcher. Fletcher provided nine robos in all — including low-home Pan Bar robo systems that debuted at the 2017 World Series — and Inertia Unlimited provided a Marshall POV in both teams’ bullpen and batting cage.
CP Communications supplied a pair of wireless RF cameras: a Sony P1r mounted on a MōVI three-axis gimbal and a Sony HDC-2500 handheld. An aerial camera provided by AVS was used for beauty shots — no easy task in security-conscious Washington.
Inside the compound, a reshuffling of USGA golf events allowed Game Creek Video’s Encore mobile unit (A, B, and C units), home to Fox’s U.S. Open and NFL A-game productions, to make its first All-Star appearance.
The primary control room inside the Encore B unit handled the game production, and a second production area was created in the B unit to serve the onsite studio shows. — JD
The Open Championship
Carnoustie Golf Links, Angus, UK
July 19-22
Sky Sports used its Open Zone in new ways to get closer to both players and the public in its role as the UK live broadcaster from Carnoustie. On Thursday and Friday, Sky Sports The Open channel was on the air from 6:30 a.m. to 9:00 p.m. Featured Group coverage of the 147th Championships was available each day via the red button and on the Sky Sports website. Viewers could also track players’ progress in Featured Hole coverage on the red button, with cameras focusing on the 8th, 9th, and 10th holes. Sky Sports had a team of 186 people onsite in Carnoustie for The Open, which included Sky production and technical staff and the team from OB provider Telegenic. — Fergal Ringrose
WIMBLEDON
All England Lawn Tennis and Croquet Club, Wimbledon, UK
July 2-15
At 11:30 a.m. on Monday, July 2, coverage of the Wimbledon Championships went live from the AELTC, produced for the first time by a new host broadcaster. After more than 80 years under the BBC’s expert guidance, the host baton was passed to Wimbledon Broadcast Services (WBS), bringing production of the Championships in-house. Going live on that Monday was the culmination of two years of planning, preparation, and testing: a process that has allowed the AELTC to “take control” of the event coverage and provide international rightsholders with a better service as well as add some new twists, such as Ultra High Definition (UHD), a NetCam on both Centre Court and No.1 Court, and multicamera coverage of all 18 courts. — Will Strauss
FRENCH OPEN
Stade Roland-Garros, Paris
May 27–June 10
Tennis Channel was once again on hand in a big way at the French Open. The expanded coverage this year meant more than 300 hours of televised coverage for fans in the U.S. as well as 700 hours of court coverage via Tennis Channel Plus. The Fédération Française de Tennis (FFT) increased overall court coverage this year, and Tennis Channel made sure all of that additional coverage made it to viewers. Tennis Channel had approximately 175 crew members onsite, working across the grounds as well as in a main production-control room, an asset-management area, six announce booths, and a main set on Place des Mousquetaires. The production facilities were provided by VER for the fifth year. CenturyLink provided fiber transport to the U.S. via 10-Gbps circuits. — KK
The Professional Fighters League (PFL) and SMT (SportsMEDIA Technology) announced an exclusive, long-term technology partnership. Under the terms of the agreement, SMT will partner with the PFL to create proprietary technology that will measure real-time MMA fighter performance analytics along with biometric and positional data that will provide fans a live event experience across all platforms.
Starting in 2019, SMT will help power the PFL’s vision of the first-ever SmartCage. The SmartCage will utilize biometric sensors and proprietary technology that will enable the PFL to measure and deliver real-time fighter performance data and analytics, what the PFL is dubbing: Cagenomics. PFL fans watching linear and digital broadcasts of the league’s Regular Season, Playoff, and Championship events will experience a new dimension of MMA fight action with integration of live athlete performance and tracking measurements including: speed (mph) of punches and kicks, power ratings, heart rate tracking, energy exerted, and more.
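The announcement doesn't explain how those numbers will be computed, but a stat like punch speed could plausibly fall out of time-stamped positional samples from a wrist sensor. A hypothetical Python sketch with made-up data:

```python
import math

# Hypothetical wrist-sensor samples: (timestamp in seconds, x, y, z in meters)
samples = [
    (0.00, 0.40, 1.30, 0.00),
    (0.05, 0.55, 1.32, 0.02),
    (0.10, 0.78, 1.33, 0.05),
    (0.15, 0.98, 1.34, 0.07),
]

def peak_speed_mph(track):
    """Peak speed between consecutive position samples, converted to mph."""
    peak = 0.0
    for (t0, *p0), (t1, *p1) in zip(track, track[1:]):
        peak = max(peak, math.dist(p0, p1) / (t1 - t0))  # meters per second
    return peak * 2.23694

print(round(peak_speed_mph(samples), 1))  # about 10.4 mph with this dummy data
```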
“The Professional Fighters League is excited to be partnering with SMT to advance the sport of MMA. The PFL’s new SmartCage will revolutionize the way MMA fans experience watching live fights as next year every PFL fight will deliver unprecedented, real-time fighter performance data and analytics, biometric tracking, and an enhanced visual presentation of this great sport,” says Peter Murray, CEO, Professional Fighters League. “Not only will PFL fans benefit from our SmartCage™ innovation, but our pro fighters will now have access to new performance measurement data, analysis, and tools to help them train and compete. The PFL’s vision has always been two-fold: deliver the absolute best experience to fans and be a fighters first organization and with the SmartCage we will accomplish both.”
“SMT is thrilled to be collaborating with the Professional Fighters League’s forward-thinking innovation team to bring our latest and greatest technology to PFL events,” says Gerard J. Hall, Founder & CEO, SMT. “Starting in 2019, PFL fans will begin to see real-time, live, innovative technology that is unique to the PFL in the MMA space. SMT’s OASIS Platform will provide the PFL with a seamlessly integrated system that combines live scoring with real-time biometric and positional data to enhance the analysis, storytelling and graphic presentation of the PFL’s Regular Season, Playoffs and Championship events next season.”
The PFL 2018 Championship takes place on New Year’s Eve live from The Hulu Theater at Madison Square Garden and consists of the 6 world title fights in 6 weight classes of the PFL 2018 Season. Winners of each title bout will be crowned PFL World Champion of their respective weight class and earn $1M. The PFL Championship can be viewed live on Monday, December 31 on NBC Sports Network (NBCSN) from 7 to 11 pm ET in the U.S. and on Facebook Watch in the rest of the world.
Professional Fighters League
The Professional Fighters League (PFL) presents MMA for the first time in a sport-season format, in which individual fighters compete in a regular season, playoffs, and championship. The PFL season features 72 elite MMA athletes across six weight classes, with each fighting twice in the PFL Regular Season in June, July, and August. The top eight fighters in each weight class advance to the single-elimination PFL Playoffs in October. The PFL Championship takes place on New Year’s Eve at Madison Square Garden, with the finalists in each of the six weight classes competing for the $10 million prize pool. The PFL is broadcast live on NBC Sports Network (NBCSN) and streamed live worldwide on Facebook Watch. Founded in 2017, the PFL is backed by a group of sports, media, and business titans. For more info, visit PFLmma.com.
SMT
SMT (SportsMEDIA Technology) is the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, providing clients with scoring, statistics, virtual insertion, and messaging for broadcasts and live events. For the past 30 years, SMT’s solutions have been used at the world’s most prestigious live sports events, including the Super Bowl, Indy 500, Triple Crown, major golf and tennis events, MLB’s World Series, Tour de France, and the Olympics. SMT’s clients include sports governing bodies; major, regional, and specialty broadcast networks; event operators; sponsors; and teams. The 32-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
SMT is once again one of the busiest vendors on hand at the US Open, providing a cavalcade of technology to serve the USTA, broadcasters, spectators, athletes, and media onsite at the USTA Billie Jean King National Tennis Center (NTC). In addition to providing the much discussed serve clock, SMT — now in its 25th year at the Open — is providing scoring systems, scoring and stats data feeds, LED scoreboards, TV interfaces, IPTV systems, and match analysis.
“This event, just like any Grand Slam, is becoming a three-week event,” says Olivier Lorin, business development manager, SMT. “We have more and more recipients asking for data. Today, we’re actually sending 19 different data feeds to recipients for their own platform. Obviously, we have to get the authorization from the USTA, but then they use that for whatever.”
Countdown to the Serve
An on-court digital clock, similar to the shot clock in basketball and the play clock in football, counts down the allotted 25 seconds before a player must begin the serve (previously, the 20-second clock was visible only to the chair umpire).
After the USTA announced plans to display a countdown clock for this year’s tournament, SMT introduced the clock at ATP and WTA events leading up to the Open — most recently, in Winston-Salem, NC, and Cincinnati — to help players acclimate to it.
“The USTA has been looking to do the serve clock at the US Open for a few years, starting in 2016 with the Juniors and then the qualifiers as an experiment, which all went very well,” says Lorin. “The Australian Open and the French Open also did it in quallies, but the US Open wanted to be the first [Grand Slam] to do this for all events, and we were able to work with them to make that happen.”
The clock, visible to players and spectators alike, begins to tick down immediately after the chair umpire announces the score. The umpire will issue a time violation if the player has not started the service motion at the end of the countdown. The first time the clock hits zero before a player begins the motion, the player receives a warning. For every subsequent time, the player loses a first serve. SMT is driving umpire scoring on all 16 courts and offsite for Junior Qualifying (eight courts).
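The escalation amounts to a tiny state machine: warn on the first expiry, then penalize. A minimal Python sketch (per player, per match; illustrative only, not SMT's software):

```python
class ServeClock:
    """Minimal model of the US Open serve-clock escalation."""

    def __init__(self, limit_seconds=25):
        self.limit = limit_seconds
        self.expirations = 0

    def on_expiry(self):
        """Called when the countdown hits zero before the service motion starts."""
        self.expirations += 1
        if self.expirations == 1:
            return "time violation: warning"
        return "time violation: loss of first serve"

clock = ServeClock()
print(clock.on_expiry())  # time violation: warning
print(clock.on_expiry())  # time violation: loss of first serve
```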
Lorin sees a benefit to TV in the five-minute warmup clock and the serve clock: “At least seven minutes [is saved], so the match is going to [end] on time more often.”
Serving the Media: IPTV and CCTV
SMT is also responsible for the infrastructure for the USTA’s CCTV, IPTV, and Media Room. The IPTV system for the Media Center at this year’s US Open is now “browser-independent.” It allows users to select and view up to five streams/videos at one time from any of the digitally encoded channels available on the 13-channel CCTV system. In addition, the system allows access to archived player interviews. The IPTV system also includes real-time scores, match stats, draws, schedule, results, tournament stat leaders, US Open history, and WTA/ATP player bio information.
“It’s a very slick interface, and the USTA has been very positive about it,” says Lorin. “Today, it is still under a controlled environment here at the US Open, but, if the US Open wanted to make this open to anybody on the outside, we could easily provide a solution for them to log in and have the same information, with the exception of live video.”
Automation Is Key to New Outer-Courts Coverage
A fixture at live-sports-broadcast compounds, SMT is also providing a variety of services to domestic-rights holder and host broadcaster ESPN, as well as other broadcasters onsite. ESPN is deploying an SMT automated-graphics interface as part of its new automated-production system for outer-court coverage, which relies on a Fletcher Tr-ACE motion-detecting robotic camera system and SimplyLive’s ViBox all-in-one production system.
An SMT touchpad at each of the 16 workstations is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The beauty of our system is that everything is automated and driven by the score notification of the umpire’s tablet,” says Lorin. “We have built up prematch graphics so we know that, when the umpire hits warmup on the tablet, a bio page for both players and a head-to-head graphic will appear, and then they’ll go to the match. When the match starts, the system is just listening to the score notifications, and we have built-in notifications for five aces and things like that. The only thing that is manual and left to the producer for that court is the set summary and the match summary for statistics.”
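SMT hasn't published its trigger format, but the behavior Lorin describes, graphics fired automatically off the umpire's score notifications and preconfigured stat thresholds, maps naturally onto a small rules table. A hypothetical Python sketch:

```python
# Preconfigured informational triggers: (stat key, threshold, graphic template)
TRIGGERS = [
    ("aces", 10, "ACE_MILESTONE"),
    ("unforced_errors", 10, "UE_MILESTONE"),
]

def graphics_for_update(stats, already_shown):
    """Return graphic templates to fire after a score notification updates the match stats."""
    fired = []
    for key, threshold, template in TRIGGERS:
        if stats.get(key, 0) >= threshold and template not in already_shown:
            fired.append(template)
            already_shown.add(template)
    return fired

shown = set()
print(graphics_for_update({"aces": 10, "unforced_errors": 4}, shown))  # ['ACE_MILESTONE']
print(graphics_for_update({"aces": 11, "unforced_errors": 4}, shown))  # [] (already shown)
```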
Also From SMT: Prize Money Report, LED Superwall, More
This year, SMT has updated its Official Prize Money Report, which calculates prize money at the end of the tournament and generates a report distributed to media officials.
SMT also provides content for the massive outdoor LED Superwall at the main entrance of Arthur Ashe Stadium, displaying scoring-system content: schedules, results, matches-in-progress scores, custom information messages (for example, weather announcements). SMT designs the scoring graphics and provides live updates.
“One of the big things is, we rebranded the US Open package for 2018 with a new logo, a new font, and a new background,” says Lorin. “As a result, we had to apply those design changes across all the platforms we are serving. One of the things we try to do more and more in the video production is, instead of having the typical headshot of a player, to integrate more action shots and motion shots, which are a lot more appealing to the design.”
Other services SMT provides to the US Open on behalf of USTA include stats entry on seven courts; serve-speed systems and content on seven courts; playback controls, including lap selector and data-point scrubbing; draw creation and ceremony; and match scheduling.
For the first time, ESPN is covering all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts at the USTA Billie Jean King National Tennis Center (NTC). Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system has been deployed on each court (with four robos per court) and relies on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic camera operator and one ViBox director/producer is covering each of the nine courts.
“With one production room and one rack room here, we are essentially replacing what would have traditionally been nine mobile units,” notes ESPN Director, Remote Operations, Dennis Cleary. “We’ve been working on this plan for a long time, and there is just no way we would have been able to cover all these courts in a traditional [production model]. SimplyLive has been used at other [Grand Slams], and it was used with Fletcher Tr-ACE at Wimbledon but not really to this extent. We feel that we have taken it to the next level [and] are integrating it with our overall [show] and adding elements like electronic line calling and media management.”
With all 16 courts now accessible, ESPN can present true “first ball to last ball” live coverage across its linear networks and the streaming platforms (a total of 130 TV hours and 1,300 more streaming on the ESPN app via ESPN3 and ESPN+). Moreover, ESPN was able to provide the USTA with live coverage of last week’s qualifying rounds for the first time, deploying the Tr-ACE/ViBox system on five courts.
In addition, ESPN, which serves as the US Open host broadcaster, has been able to provide any rightsholder with a live feed of a player from its country — regardless of the court and including qualifying rounds.
On the Outer Courts: LiDAR Drives Fletcher Tr-ACE System
Four Fletcher robotic systems with Sony HDC-P1 cameras have been deployed on each of the nine outer courts: two standard robos (traditional high play-by-play and reverse-slash positions) and two Tr-ACE automated robos (to the left and right of the net).
“From the beginning, one of ESPN’s big focuses was increasing the camera quality of what was being done on the outer courts,” says Fletcher Sports Program Manager Ed Andrzejewski. “So we built everything around the Sony P1’s to increase the camera quality to match the main [TV courts]. When they send a feed to the rightsholder in Australia and the player they are interested in is on one of those outer courts, they wanted the basic quality to be the same as in the bigger stadiums. I think we’ve been able to accomplish that.”
Between the two Tr-ACE cameras is “the puck,” which powers the Tr-ACE system at each court via a custom-designed LiDAR (Light Detection and Ranging) image-recognition and -tracking system. The LiDAR tracks every moving object on the court (the ball, players, ball kids, judges) and provides the two Tr-ACE cameras with necessary data to automatically follow the action on the court. The LiDAR can also sense fine details on each player (such as skin tone or clothing color), allowing the cameras to tell the difference between a player and other moving objects.
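Fletcher hasn't said how Tr-ACE turns those LiDAR tracks into camera moves, but the core loop presumably looks something like the sketch below: pick a target from the classified tracks (the ball if it is visible, otherwise the players' midpoint) and compute the pan angle toward it. Everything here, names included, is an assumption for illustration:

```python
import math

def aim_point(tracks):
    """Follow the ball if it is tracked; otherwise follow the midpoint of the players."""
    balls = [t for t in tracks if t["label"] == "ball"]
    if balls:
        return balls[0]["x"], balls[0]["y"]
    players = [t for t in tracks if t["label"] == "player"]
    return (sum(p["x"] for p in players) / len(players),
            sum(p["y"] for p in players) / len(players))

def pan_angle_deg(camera_xy, target_xy):
    """Pan angle from the camera position toward the target, in degrees."""
    dx, dy = target_xy[0] - camera_xy[0], target_xy[1] - camera_xy[1]
    return math.degrees(math.atan2(dy, dx))

tracks = [
    {"label": "player", "x": 3.0, "y": 10.0},
    {"label": "player", "x": 5.0, "y": 2.0},
    {"label": "ball", "x": 4.2, "y": 6.5},
]
print(round(pan_angle_deg((-6.0, 6.0), aim_point(tracks)), 1))  # pan toward the ball
```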
A Room of Its Own: Nine Mobile Units in a Single Room
ESPN has erected a dedicated production room for the Tr-ACE/ViBox operation across from its NTC Broadcast Center. Inside this room are nine workstations featuring one Fletcher Tr-ACE camera operator and one ViBox director/producer each.
The Tr-ACE operator monitors the camera coverage and can take control of any of the four cameras at any point during the match. Meanwhile, the ViBox operator cuts cameras and rolls replays. An SMT touchpad at the workstation is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The camera op and director are constantly communicating,” Andrzejewski explains. “ESPN put a lot of trust in us with this, so we brought out the best people we could and have some of the best [robo operators] in the business here. There was a lot of onsite learning, but we were able to give everyone lots of time on the system during setup and qualifying.”
The coverage does not feature commentary, so all nine courts are being submixed out of a single audio room using a single Calrec audio console and operator.
Also inside the automated production room are a video area to shade all 36 cameras, an SMT position to manage the automated graphics systems deployed at each workstation, an electronic line-calling position (which was not available for the systems at Wimbledon), and a media-management area, which was used during qualifying to record all five courts (this operation moved to the NTC Broadcast Center once draw play began on Monday).
Since the automated-production systems had to be up and running for qualifying rounds last week, ESPN built the operation on an island entirely separate from the Broadcast Center.
“It was just too costly and just not sensible to bring the full broadcast center up a week early,” notes Cleary. “So this entire operation is all standalone. All the equipment from Fletcher, SimplyLive, Gearhouse, and even transmission is all separate and on its own.”
Two-Plus Years of Development Pays Off
Although automated production is nothing new for the US Open — Sony Hawk-Eye technology had been used for several years to produce coverage from five outside courts — this new system has expanded the ability to truly cover every ball of the tournament.
Use of the Tr-ACE/ViBox system at Wimbledon in June and now at the US Open was a long time coming. Fletcher has been developing the Tr-ACE system for 2½ years and demonstrated it offline on one court at the NTC last year. In addition to the Fletcher and SimplyLive teams, ESPN Senior Remote Operations Specialist Steve Raymond, Senior Operations Specialist Chris Strong, and Remote Operations Specialist Sam Olsen played key roles in development of the system and its implementation this week.
“This is certainly a new workflow for us, so a lot of thought and time went into it before we deployed it,” says Olsen. “We felt that the ViBox and the Tr-ACE would certainly give us the ability to produce a high level of content using an automated [workflow], and it’s worked out really well thus far. Having it for the qualifying rounds for the first few days also served as a great test bed. I think the best way to put it is, we’ve grown into it and we’ll develop it and take it to a higher level each time we use it.”
By Jason Dachman, Chief Editor, SVG
Thursday, August 2, 2018 - 2:52 pm
After a move from Los Angeles to Madison, WI, prior to last year’s event, the CrossFit Games production operation has continued to grow prodigiously. The “Woodstock of Fitness” has grown from a production comprising 35 crew members working out of a single mobile unit just six years ago to one of the largest live productions on the annual sports calendar: more than 10 NEP mobile units, a crew of more than 300, and 50-plus cameras. Add in the fact that the CrossFit competitions change from year to year, and it becomes clear just how challenging the event can be for the production team.
This year’s CrossFit Games — Aug. 1-5 at the Alliant Energy Center in Madison — are being streamed on Facebook, CBSSports.com, and the CBS Sports App and televised live on CBS (one-hour live look-ins on Saturday and Sunday plus a recap show) with a daily highlights show on CBS Sports Network.
CrossFit has its own live-streaming team onsite and handles in-house production for the videoboards at Alliant Energy Center. SMT, which is CrossFit’s scoring partner, provides a wealth of presentation options for the boards as well.
CrossFit has used TVU Networks bonded-cellular and IP systems for several years for point-to-point transmission. This year, CBS Digital also used a TVU system to take in streams from the CrossFit Regionals earlier this summer. That success led to a similar partnership for the Games, with CBS Digital receiving all the live competitions on two streams via TVU receivers.
As CrossFit Games’ Footprint Grows, So Does the Live Production
The Games themselves have expanded and become more complex. The production team is tasked with covering multiple venues throughout Alliant Energy Center, primarily The Coliseum and North Park Stadium. This year, the stadium has been expanded to hold 10,000 people (nearly 50% more than for the 2017 edition) and has added a new videoboard.
July 18, 2018
Sports Video Group
SMT was back at MLB All-Star in Washington, providing Fox Sports its live virtual–strike-zone system and, for the 14th consecutive year, virtual signage.
SMT rendered the virtual–strike-zone graphic, as well as the watermarks when viewers saw the ball cross the plate.
SMT’s Peter Frank was on hand at 2018 MLB All-Star to support Fox Sports’ virtual efforts.
SMT handled virtual signage behind the plate for Fox’s Camera 4 (the primary pitcher/batter camera) and tight center field. For the third year in a row, the company also integrated its system with the high-home position, inserting virtual signage on the batter’s eye in center field.
“We use stabilization for virtual signage on the main camera, which is used for the virtual strike zone, so that helps out with the stability of both graphics,” said SMT Media Production Manager Peter Frank. “Two years ago at MLB All-Star in San Diego was the first time we did [virtual signage on] the batter’s eye, and Fox was really happy with it. So we also brought it back in
July 5, 2018
Sports Video Group
After a successful pilot game last year, the American Flag Football League (AFFL) is back in action this summer with the U.S. Open of Football (USOF) Tournament. The final 11 games of the tournament kick off NFL Network’s AFFL coverage, and the network is embracing the “Madden-style” coverage and the production elements it debuted last year, including using a SkyCam as the primary game angle, deploying RF Steadicams inside the huddle, rolling out customized SMT virtual graphics across the field, and miking players throughout the game.
“After last year’s pilot show, there was a lot of great feedback. Everybody liked the football on the field and the direction the technology was going,” says Johnathan Evans, who served as executive producer and director of last year’s production and is directing the NFL Network telecasts this year. “So our coverage is going to be almost exactly the same as last year, with a few differences since we are doing 11 games instead of just one. We have come up with a great formula that hasn’t been tried on a consistent basis before and offers a different perspective from watching a [traditional] football broadcast. With [AFFL], you’re watching from the quarterback perspective; you’re watching it just like you’re playing a Madden NFL [videogame].”
How It Works: Breaking Down the AFFL Format
The 12 teams featured in the USOF Playoffs are composed of eight amateur squads in the America’s Bracket (derived from four rounds of play that began with 128 teams) and four teams captained by celebrities in the Pro Championship Bracket. NFL Network’s USOF coverage began with the America’s Bracket Quarterfinal last weekend from Pittsburgh’s Highmark Stadium and continues with the semifinals this weekend at Atlanta’s Fifth Third Bank Stadium, the America’s Bracket Final and Pro Bracket Final on July 14 at Indianapolis’s Butler Bowl, and the $1 million Ultimate Final (featuring both bracket champions) on July 19 at Houston’s BBVA Compass Stadium.
The 11 AFFL telecasts on NFL Network will feature SMT virtual graphics, including the Go Clock.
The 7-on-7, no-contact 60-minute AFFL games feature many of the same rules that average Americans know from their backyard games. The same players are on the field for both offense and defense, and a team must go 25 yards for a first down. There is no blocking; instead, a “Go Clock” indicates when the defense can rush the QB (after two seconds) and when the QB must release the ball or cross the line of scrimmage (four seconds). There are also no field goals (or uprights, for that matter), and kickoffs are replaced with throw-offs.
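For readers who want to picture those timing rules concretely, here is a minimal sketch in Python (illustrative only — not SMT’s or the AFFL’s implementation; the names are hypothetical) of how the two-second rush window and the four-second release deadline relate to the time since the snap.

    # Minimal sketch of the AFFL timing rules described above.
    # Illustrative only; names and structure are hypothetical, not SMT's system.
    RUSH_AFTER_S = 2.0   # defense may rush the quarterback after two seconds
    RELEASE_BY_S = 4.0   # quarterback must throw or cross the line by four seconds

    def go_clock_state(elapsed_s: float) -> dict:
        """Return what the Go Clock would show for a given time since the snap."""
        return {
            "seconds_remaining": max(0.0, RELEASE_BY_S - elapsed_s),
            "defense_may_rush": elapsed_s >= RUSH_AFTER_S,
            "qb_must_release": elapsed_s >= RELEASE_BY_S,
        }

    if __name__ == "__main__":
        for t in (0.5, 2.1, 4.2):
            print(t, go_clock_state(t))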
“This is not only a sport that creates a lot of intensity and energy; it’s also a sport that you as an average person can relate to because you’re watching an average person play the game,” says Evans. “You’re not watching professional athletes. You’re watching amateurs playing a sport that you can play at home. That is something that every single viewer can relate to.”
Inside the Production: It’s All About Access
By using the SkyCam for play-by-play, RF Steadicams on the field, and player mics, the AFFL and NFL Network are focused on providing fans unprecedented up-close-and-personal access to the action on the field.
“We’re most excited about having SkyCam as our game camera, which really adds a different perspective, and also having everybody miked up so we can hear everything that’s going on and listen in,” says producer Tom McNeely. “We’re focused on making [viewers] feel like they’re right there on the field with these guys. Bringing them into the huddle with our cameras and microphones — we will have somebody sitting in the truck with a mute button in case the players are a little rambunctious — is going to make this really appealing and fun.”
The upcoming NFL Network AFFL productions will deploy Game Creek Video mobile units and feature an average of eight cameras: the SkyCam system, two traditional 25-yard-line angles for isos, a mid-level end-zone angle, one handheld high-speed camera, a jib on a cart roving the sidelines, and two RF cameras (Steadicam and a MōVI).
“The only new camera we are adding is a second [RF camera] so we can cover both sides of the football,” says Evans. “Last year, we had only one Steadicam, which was great, but I realized that we were losing the intimacy on both sides of the ball. Before you get to the red zone, it’s great to be inside the huddle and see from behind the quarterback on the offensive side of the ball. But, once you get to the red zone, you need to get ready for a touchdown, so you have to switch your Steadicam to the defensive side of the ball, and you hope to get a touchdown in the end zone. This time, in Indianapolis and in Houston, we’re going to have a Steadicam on both sides of the ball to retain the potential atmosphere for every single play. Before the snap, during the snap, and after the snap, you’re going to have that great intensity right in your face the entire time.”
Go Clock Returns; Interactive Line of Scrimmage Debuts
The Go Clock, designed by SMT specifically for the fast-paced AFFL, is also back after playing a major role in defining the league’s production style during its pilot game. The system synchronizes with in-stadium displays to indicate when the defense can rush the quarterback.
“The Go Clock was a big success, and we’re bringing it back this year,” says Evans. “We’re also introducing a line of scrimmage that will change color when [the defense] is able to rush. So the virtual graphics are still there and play a big role [in the production].”
The same SMT virtual 1st & Ten line used in NFL broadcasts will be deployed from the company’s Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used several times by NBC Sports last NFL season.
SMT’s Design Studio also designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package.
SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats (the same system used for NCAA and NFL games).
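As a rough illustration of what a dual-channel score bug consumes, the sketch below (hypothetical field names; not the actual SportsCG or QB Stats data model) shows a single shared game state that a primary channel and a remotely operated second channel could both render from.

    # Hypothetical sketch of a shared score-bug/stats state; field names are
    # illustrative only and do not reflect the real SportsCG or QB Stats feeds.
    import json

    game_state = {
        "clock": "12:47",
        "score": {"HOME": 26, "AWAY": 20},
        "possession": "HOME",
        "qb_stats": {"completions": 11, "attempts": 15, "yards": 168, "tds": 3},
    }

    def render_score_bug(state: dict) -> str:
        """Flatten the shared state into a one-line bug for either output channel."""
        (team_a, pts_a), (team_b, pts_b) = state["score"].items()
        return f"{team_a} {pts_a} - {team_b} {pts_b} | {state['clock']}"

    print(render_score_bug(game_state))          # primary channel: score bug
    print(json.dumps(game_state["qb_stats"]))    # second channel: in-game stats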
In addition to the virtual elements, the AFFL has enhanced the physical first-down marker used on the field, so that it digitally displays the down, play clock, game clock, and possession arrow. The system also emits an audible alert when the rusher can break the line of scrimmage after two seconds and when the quarterback has to throw the ball after four seconds.
Beyond the Tech: Storytelling, NFL Network Integration
Aside from the production elements, the AFFL also offers a host of great storytelling opportunities surrounding the squads of Average Joes on the field. McNeely, who knows a thing or two about telling the stories of unknowns on the field, having produced a dozen Little League World Series for ESPN, sees the AFFL as a one-of-a-kind storytelling opportunity.
“These aren’t pro names or pro teams; you’re starting from scratch telling those stories. There are a lot of great stories and personalities with layers — [such as] a 50-year-old, 5-ft.-8 quarterback with a potbelly leading the team from Tennessee or one of the amazing athletes who fell short of the NFL but played in the CFL or the Arena League,” says McNeely. “When I first met [AFFL CEO/founder] Jeff Lewis, who has worked so closely with Jonathan and all of us to develop this, he mentioned what a huge fan he was of the Little League World Series. And he promised us all the access we needed so that we would be able to introduce these players and tell their stories.”
NFL Network’s commitment to the AFFL goes well beyond just televising 11 games, however. Not only do the telecasts feature NFL Network talent like Good Morning Football’s Kay Adams (serving as sideline reporter throughout the tournament) and NFL Total Access host Cole Wright (calling play-by-play on July 14), but the network is also incorporating AFFL segments into its daily studio programming, social-media channels, and digital outlets in an effort to appeal to football-hungry fans during the NFL offseason.
“We really feel like there’s a huge opportunity here during the summer, when the NFL really has nothing going on,” says McNeely. “We’re excited to see some traction with social media and on the NFL Network. They are doing a lot to promote [the AFFL] on their studio shows, and we’re hoping it takes off. I think there will be a grassroots push for this similar to what you’ve seen with the Little League World Series.”
June 29, 2018
Sports Video Group
While the broadcast debut of Dale Earnhardt Jr. in the NASCAR on NBC booth is creating plenty of buzz around NBC’s first races of the season this weekend at Chicagoland Speedway, the uber-popular retired driver isn’t the only new addition to the network’s NASCAR coverage this year. Echoing its rink-side “Inside the Glass” position on NHL coverage, NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box frame that will be located along pit road for pre- and post-race coverage at each speedway throughout the season.
NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box located on pit road – for its NASCAR pre/post-race shows.
“The Peacock Pit Box is going to put us in the middle of the action,” says NBC Sports Group Executive Producer Sam Flood. “We’ve had the big set down on the grid for the first three years of [our NASCAR rights] contract. We realized that sometimes the fans departed from that area as we got closer to race time and took away some of the sense of place. So the idea was to have a real sense of place throughout the day, starting with the pre-race show. And most importantly, it gives us a place inside that mayhem that is pit road, which has become one of the most exciting places at the racetrack each week.”
Inside the Peacock Pit Box: Two Levels With Plenty of Tech Firepower
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road. In addition to serving as the home to NASCAR on NBC’s pre-race coverage throughout the season, the structure also features an arsenal of robotic cameras that will aid in NBC’s coverage of pit road throughout each race.
“Sam [Flood] and Jeff [Behnke, VP, NASCAR production, NBC Sports Group] first had the vision, and then there were a lot of great creative and technical people that helped to bring it to life,” says NBC Sports Technical Manager Eric Thomas. “They wanted to give our announcers a unique vantage point of the field of play – and that’s obviously pit lane. It’s like the 50-yard line in football or center ice in hockey. Our [announcers] will have an elevated position between all the teams right in the middle of the action, so they not only can see the racetrack but also see the competitors on either side of them.”
The NASCAR on NBC team worked with the NBC Sports Group design team in Stamford, CT, to design the Peacock Pit Box, while Nitro Manufacturing built the structure and Game Creek Video provided technical support and equipment.
The top level of the Peacock Pit Box will serve as the primary home for NBC Sports’ Monster Energy NASCAR Cup Series and Xfinity Series pre- and post-race coverage, with host Krista Voda and analysts Kyle Petty and Dale Jarrett occupying the desk. One handheld and three robotic cameras will be on hand for pre/post-race shows.
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road.
“It’s a nice dance floor that can support our announcers and various different configurations,” says Thomas. “We have to work within the space of the pit stall, which depends on the track. We have neighbors on either side of us, so we want to really be respectful of the teams and not interfere with them whatsoever. So we’re going to fit in our space very neatly and very cleanly without having an impact on the actual event. We wanted to make it as big as we could to make our announcers as comfortable as possible and also provide the technical equipment to produce a quality show.”
Meanwhile, the lower level of the Pit Box will provide additional broadcast positions with two wired cameras and, occasionally, an RF camera and/or a small jib (depending on the size of the pit box at each track). The space features interactive displays and a show-and-tell position for analysts like Daytona 500-winning crew chief Steve Letarte to deliver deeper analysis of the track action.
“The technology will be there for Steve to [provide deeper analysis], particularly in the Xfinity races, where he’s going to be hanging down on pit road in a pit box, restarting his old career of looking at the race when you only can see half the racetrack on pit road,” says Flood. “We think by [locating] Steve [there], it will give him more opportunity to focus that unique mind of his on what the heck all the other cars are doing on the track. So we see that as a huge advantage.”
The lower level also features a patio position where NBC will look to conduct interviews with drivers, pit crew chiefs, owners, and NASCAR officials throughout its race coverage.
All About Flexibility: Nine Robo Positions Give NBC Plenty of Options
Since NBC’s pre- and post-race setup will vary week-to-week depending on the track, Thomas and company were tasked with making the Peacock Pit Box as versatile as possible. With that in mind, the upper level features nine different robotic camera positions. Three robos can be deployed at a time and – thanks to the small, lightweight cameras and custom-developed camera mounts deployed on the Pit Box – the operations team can quickly swap camera positions at any time during NBC’s coverage.
Beloved NASCAR driver Dale Earnhardt Jr., who retired after last season, makes his broadcast debut as a NASCAR on NBC analyst this weekend at Chicagoland.
“If our director wants to change the shot or we want to totally rotate 180 degrees, we can do that in about 10 minutes,” says Thomas. “If we want to do a show with the track in the background first and then, a few minutes later, we want to look toward the garage with a different set of announcers, we can move the cameras quickly and make that happen. So it’s very flexible.”
In addition to being used for pre- and post-race studio coverage, these robos will be utilized for coverage of the action on pit road throughout NASCAR on NBC telecasts.
“The cameras are going to pull double duty because, if something’s going on in pit lane, those cameras are still going to physically be there. So they are going to give us some different angles that we haven’t seen very much of in the past,” says Thomas. “We’ve tried to create as much flexibility as possible so when Sam and Jeff ask, ‘can we do this?’, then we can say, ‘of course you can.’”
BatCam Returns: Aerial System Headlines NBC’s Army of Cameras
NBC Sports will deploy an average of 55 cameras – including the return of the BatCam point-to-point aerial system to cover the backstretch – on big races at Daytona, Indianapolis, and Homestead-Miami this season. Thomas also expects to use BatCam, which debuted last year and can hit speeds of more than 100 mph, at the Watkins Glen road course this year. The BatCam also drew rave reviews throughout NBC’s Triple Crown coverage this past spring.
NBC Sports is bringing back the BatCam point-to-point aerial system to cover the backstretch at NASCAR races.
The bulk of NBC’s camera complement for NASCAR is made up of Sony HDC-4300’s along with a mix of robos (provided by Robovision) and roving RF cameras. BSI will once again be providing eight RF in-car-camera dual-path systems, which allow two angles to be transmitted from each car at any given moment. Thomas also says his NASCAR on NBC team is currently experimenting with several new camera positions, which he expects to roll out throughout the season.
Going Inside the Action With New Graphics, Analysis Tools
NBC is utilizing SMT’s tools for the fourth straight NASCAR season. This year, the SMT race crawl has been updated to show the live running order and driver statistics at the traditional position on top of the screen and in a new vertical pylon display on the left side. The multiple options provide production with a variety of ways to allow fans to track each driver.
Also new this year is the SMT GOTO interactive touchscreen display, which provides several tools NBC can use throughout each race weekend, giving on-air analysts the ability to telestrate highlights, compare drivers and statistics, and interact with fans on social media.
SMT’s new Broadcast Analytics system has also been added to help enhance the coverage. The system live tracks all the cars during each session and allows production to show a virtual replay of any lap run during practice, qualifying and the race. The system allows production to visualize any lap run by any driver. It can provide a combined display of how a single driver ran on different laps, showing changes they’ve made during the session. The system can also show how different drivers ran the same lap. All of these options will allow fans to see the key moments during each session and better understand how that impacted where each driver finished.
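One way to picture the lap-comparison feature is to store timestamped distance samples for each lap and line two laps up by distance along the track. The sketch below is a bare-bones illustration of that idea under assumed data (it is not SMT’s Broadcast Analytics code).

    # Bare-bones illustration of comparing two recorded laps by distance traveled.
    # The data layout is assumed; SMT's Broadcast Analytics system is far richer.
    from bisect import bisect_left

    # Each lap: (elapsed_seconds, distance_meters) samples from live car tracking.
    lap_a = [(0.0, 0.0), (10.0, 500.0), (20.0, 1050.0), (30.0, 1609.0)]
    lap_b = [(0.0, 0.0), (10.0, 480.0), (20.0, 1010.0), (30.0, 1580.0)]

    def time_at_distance(lap, d):
        """Interpolate the elapsed time at which a lap reached distance d."""
        dists = [sample[1] for sample in lap]
        i = bisect_left(dists, d)
        if i == 0:
            return lap[0][0]
        if i == len(lap):
            return lap[-1][0]
        (t0, d0), (t1, d1) = lap[i - 1], lap[i]
        return t0 + (t1 - t0) * (d - d0) / (d1 - d0)

    # Gap between the laps at the 1,000-meter mark (positive = lap_b was slower there).
    print(round(time_at_distance(lap_b, 1000.0) - time_at_distance(lap_a, 1000.0), 2))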
In the Compound and Back Home in Stamford
Game Creek Video’s PeacockOne (A and B units) will once again serve as the home to the NASCAR on NBC production team on-site, while an additional pair of Game Creek trucks will house mix effects and editing, as well as robo operations and tape release. In all, NASCAR truck compounds will be stocked with an average of 19 trailers (including BSI, Sportvision, NASCAR operations, and more).
“NASCAR does a great job setting up the compounds for us and providing a beautiful sandbox for us to play in,” says Thomas.
In addition, the NBC production team continues to rely more and more on file sharing with the NBC Broadcast Center in Stamford, CT. AT&T and PSSI have partnered to establish fiber connectivity at the majority of the NASCAR tracks and will provide NBC with a circuit back to Stamford for file transfer, as well as home-running individual cameras for at-home productions. Pre- and post-race shows from the Peacock Pit Box will regularly send back cameras to a control room in Stamford, where the show will be produced.
“We started [producing shows out of Stamford] last year and we will expand it more this year,” says Thomas. “It worked well last year and we’re making some improvements this year to make it even more seamless. With the increased support from AT&T and PSSI for network connectivity, I think it’s going to be even better this year. Obviously there are big cost savings on travel [as a result], but the product is of the same quality – so it’s really a win-win.”
SMT (SportsMEDIA Technology) continues its collaboration with the American Flag Football League (AFFL) to provide game management technology for the AFFL’s first U.S. Open of Football Tournament (USOF). The teams playing in the Ultimate Final at BBVA Compass Stadium in Houston will battle for a $1 million cash prize. SMT technical teams will be onsite at the USOF Tournament for every game, providing the customized virtual and clock-and-score technology and graphics package that helped to define the league last year during its launch on June 27 at Avaya Stadium. Retired NFL stars return to the field to captain the teams, along with basketball legends and an Olympic gold medalist. SMT’s virtual 1st & Ten line system, used in NFL broadcasts, will be deployed from its Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used during NBC Sports’ 2017 season. SMT’s virtual Go Clock, designed specifically for the fast-paced AFFL, will synchronize with in-stadium displays to indicate when the defense can rush the quarterback.
SMT’s Design Studio designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package. SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats, the same system SMT uses for NCAA and NFL games. “SMT is proud to have helped the AFFL launch a new sports era, and we are thrilled to build on last year’s great success by offering flag football fans the same platform they’re used to when watching college and NFL games,” says Ben Grafchik, SMT Business Development Manager. “With the debut of our dual-channel SportsCG, we can decrease the production bottleneck associated with rendering graphics on-air, allowing the quickly developing storylines to be told in a more dynamic way.”
June 17, 2018
Sports Video Group
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
The front-bench area in Game Creek Video’s Encore truck is at the center of Fox Sports’ U.S. Open coverage.
Game Creek Video’s Encore production unit is at the center of the coverage for Fox and FS1 with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit is handling production control for one of the featured groups, Edit 4 is handling all iso audio mixes, and Edit 2 is home to five edit bays with equipment and support provided by CMSI. And there is also the 4K HDR show, which is being produced out of Game Creek Maverick.
“All the Sony 4300 cameras on the seventh through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” says Brad Cheney, VP, field operations and engineering, Fox Sports. There are also six Sony PXW-Z450’s for the featured holes and featured group, the output of two of them delivered via 5G wireless.
“We are producing two 4K HDR shows out of one mobile unit with four RF-based 4K cameras,” he adds. “That is another big step forward.”
In terms of numbers, Fox Sports has 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps) and 218 microphones. Tons of data is being passed around: 3 Gbps of internet data is managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
A Second Compound
Each course provides its own unique challenges. At Shinnecock Hills, there are roads running through the course, not to mention the hilly terrain, which also has plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
Director, Field Operations, Sarita Meinking (left) and VP, Field Operations and Engineering, Brad Cheney are tasked with keeping Fox Sports’ U.S. Open production running smoothly.
“We came out here 18 months ago,” says Cheney, “and, when we placed all of our trucks in the compound map, [they] didn’t fit, and that is without the world feed, Sky, TV Asahi, and others. At Erin Hills last year, we had a support tent, and that gave our camera crew more space, dry storage, and a place to work.”
The decision was made to expand on what was done at Erin Hills last year: move the production operations that most benefit from being close to the course to a large field tent located along the third hole. The field tent is about a half mile from the main compound and is home to the technology area (shot-tracing technologies, etc.); the camera, audio, and RF areas; and the robotic cameras provided by Fletcher. Inertia Unlimited President Jeff Silverman is also located in the tent, controlling X-Mo cameras as well as robotic cameras that can be moved around the course to provide different looks.
Cheney says the team took the field tent to a new level by providing an integrated source of distribution and monitoring so that it could effectively be an island to itself. “It has worked out well. People are comfortable there. It’s dry and offers direct access to the course.”
According to Michael Davies, SVP, technical and field operations, Fox Sports, some of the operations in the field tent, such as those related to enhancements like shot tracing and the Visual Eye, could ultimately move even farther from the main compound.
“Typically, they would be in the main compound,” he explains, “but, once we figured out how to connect the two compounds via fiber for a half mile, it [indicates] how far away you can put things [like the shot-tracking production]. It gets the mind going, especially for events like this that can be hard to get to.”
Fox Fiber Technician Bryce Boob (left) and Technical Producer Carlos Gonzalez inside the fiber cabin
Also located closer to the course is the fiber cabin, a move that allows the team to more quickly deal with any connectivity issues on the course. The 37 miles of fiber cable used across the course is monitored in the cabin, and Carlos Gonzalez, technical producer, Fox Sports, and the team troubleshoot and solve any issues.
“We’re isolated from the compound, which can make it a challenge,” he notes, “but we are actually liking it.”
Cheney says that placing the cabin closer to the course means a reduction in the amount of outbound fiber and also makes the operation a true headend. “It’s something that we will continue to do at Pebble next year [for the 2019 U.S. Open] because of the setup there. This has been another good learning experience for us.”
Steps Forward
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch.
“All of the programming in the router was there since day one [in 2015], and we have found new ways to do things,” says Cheney. “To strategically try to pull things out of it just wasn’t worth it. So we started from zero, and it paid off in terms of how quickly we could get up and running.”
Also playing an important part in enhancing the workflows was CMSI and Beagle Networks, which made sure networks and editing systems were all ready to go.
“The team from CMSI and Beagle Networks has been phenomenal in wiring up our networks and making sure it’s robust and all-encompassing,” says Cheney. “We also figured out new ways with IP to control things, move signals, and offer better control for our operators no matter where they are.”
RF wireless coverage this year is being provided completely by CP Communications. There are 26 wireless cameras on the course plus 18 wireless parabolic mics and nine wireless mics for talent on the course. All the signals are run via IP Mesh control systems, and CP Communications also provided all the fiber on the course.
The 5G setup includes a 5G cell mounted on the tower connected to processing gear on the back of a buggy.
Fox Sports is at the forefront of wireless innovation, working with Ericsson, Intel, and AT&T on using next-generation 5G wireless technology to transmit 4K HDR signals from Sony PXW-Z450 cameras to the compound. The 4K cameras are wired into an Ericsson AVP encoder, which sends an IP signal to an Intel 5G MTP (Mobile Trial Platform), which transmits the signal in millimeter wave spectrum via a 28-GHz link to a 5G cell site mounted to a camera tower. That cell site is connected to the Fox IP Network and, in the production truck, to an Ericsson AVP that converts the signal back to baseband 4K.
The potential of 5G is promising, according to Cheney. First, the delay is less than 10 ms, and, conceptually, a 10-Gbps (or even 20-Gbps) 5G node could be placed in a venue and the bandwidth parsed out to different devices, such as cameras, removing the need for cabling.
“You can fully control the system as a whole versus allowing direct management on the device level,” he says.
And, although the current setup requires a couple of racks of equipment, the form factor is expected to get down to the size of a chip within a year.
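To make the bandwidth point concrete, here is a back-of-the-envelope calculation; the per-feed bit rate and overhead figures are assumptions for illustration, not numbers from the Fox/Ericsson/Intel/AT&T trial.

    # Rough, illustrative arithmetic only: how many contribution feeds might share
    # one 5G node. The per-feed rate and overhead are assumptions, not trial data.
    node_capacity_gbps = 10.0
    feed_rate_gbps = 0.2        # assume ~200 Mbps per compressed 4K HDR feed
    overhead_fraction = 0.2     # assume 20% reserved for IP/control overhead

    usable_gbps = node_capacity_gbps * (1 - overhead_fraction)
    print(int(usable_gbps // feed_rate_gbps), "feeds under these assumptions")  # -> 40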
Expanding Innovation
In terms of production elements, Fox Sports’ commitment to ball-tracing on all 18 holes continues in 2018, with the network equipping each tee box with Trackman radar technology. Eight holes are equipped to show viewers a standard ball trace over live video, with enhanced club and ball data. The other 10 holes have Fox FlightTrack, a live trace over a graphic representation of the golf hole, offering more perspective to the viewer.
Beyond tee-shot tracing, three roaming RF wireless cameras are equipped with Toptracer technology, providing trace on approach shots. And new this year is FlightTrack for fairway shots on two holes, Nos. 5 and 16.
Zac Fields, SVP, graphic tech and innovation, Fox Sports, says the goal next year is to expand the use on fairways. “We want to do more next year and also find a way to use that on taped shots as well.”
Virtual Eye, the system at the core of FlightTrack that takes a 3D model of a hole and uses shot data from SMT as well as from the Trackman and Toptracer shot-tracking systems to show the ball flight within the 3D model, has also been expanded. The Virtual Eye production team began its U.S. Open preparation a couple of months back by flying a plane over the course and capturing photos to map the topography. Then, a few weeks ago, a helicopter shot video of the course, and pictures were extracted from the video and laid over the topographical images.
The FlightTrack team is located inside the field tent, making it easier to hit the course and fix any issues related to shot-tracking technology.
One of the goals, says Ben Taylor, operations manager, Virtual Eye, has been to make the system more automated and to allow it to be used on taped shots. For example, the EVS-replay users themselves can now trigger Virtual Eye to be active with the push of a button. And, when the ball comes to a rest, the graphic slides off the screen.
“The system will reset in the background after the shot,” he notes.
Fields and the Fox team have been happy with the performance, particularly the ability for EVS operators to control the graphic overlay. “It’s pretty slick,” he says. “The system takes the EVS feed and runs it through the graphics compositor and then back into the EVS, so the EVS system is recording itself. It seems complex, but, once the operator gets used to it, it’s easy. And now they can do FlightTrack a lot more.”
When Fox Sports took on the challenge of the U.S. Open in 2015, the industry watched to see how it would change the perception of golf coverage. Four U.S. Opens later, it is clear that the innovative spirit that has been part of Fox Sports since its early days continues unabated, especially as the era of sports data takes hold of the visualization side.
“We want to bring the CG world into our coverage and create animations to tell stories like comparing every tee shot a player took on a certain hole or comparing Dustin Johnson’s fade with another player’s draw,” says Fields. “And now we can show how the wind will affect a shot.”
June 8, 2018
Sports Video Group
With the second Triple Crown in just four years on the line, NBC Sports Group is pulling out all the stops for coverage of this weekend’s 150th Belmont Stakes. With Justify poised to capture the final gem of the Triple Crown, NBC Sports Group has boosted its production complement, adding a second onsite studio set, live pointer graphics to identify Justify on the track, and five additional cameras, including the Bat Cam aerial system that drew rave reviews at both the Kentucky Derby and the Preakness Stakes.
“Once Justify won Preakness, we knew what we were in for, and we started putting everything in motion right away,” says Tim Dekime, VP, operations, NBC Sports Group. “The [equipment levels] were increased a good bit, and we added all the bells and whistles. It means a lot more work and preparation, but it’s very exciting for us, and we are very well-prepared.”
All Eyes on Justify: More Cameras and Virtual Tracking Graphics
NEP’s ND1 (A, B, C, and D units) mobile unit will once again be on hand to run the show, with a total of 43 cameras deployed — up from 33 for last year’s non-Triple-Crown race. Besides the Bat Cam aerial system covering the backstretch, the camera arsenal includes a Sony HDC-4800 4K camera (outfitted with a Canon UHD 86X lens) on the finish line, five HDC-4300’s running at 6X slo-mo and five more running at 60 fps, 14 HDC-2500’s (eight hard, six handheld), five HDC-1500’s in a wireless RF configuration (provided by BSI), a bevy of robos (provided by Fletcher) and POVs, and an aerial helicopter (provided by AVS, weather permitting).
Ready for a Triple Crown effort at Belmont: (from left) NEP’s John Roché and NBC Sports Group’s Keith Kice and Tim Dekime
Five other cameras have been added because of the Triple Crown possibility: a POV camera at Justify’s gate and one in the PA booth with announcer Larry Collmus (which will be streamed live on the NBC Sports App), a robo to capture a 360° view of the paddock, an additional RF camera roaming the grounds, and, most notably, the Bat Cam system.
In addition to more cameras, NBC plans to use SMT’s ISO Track system to identify Justify with a virtual pointer graphic live during the race. The system will incorporate real-time data — speed, current standing, and distance from finish line — into the on-air pointer graphic, helping viewers follow Justify and other key horses throughout the day’s races.
“We’ll have a live pointer that tracks Justify during the race that our director [Drew Esocoff] will insert, if needed, [so] the horse will be tracked for the viewers watching at home,” says Coordinating Producer Rob Hyland. “It will have a little arrow pointing to where he is at certain points in the race.”
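To picture the data behind that pointer, the sketch below (hypothetical field names; not the ISO Track interface itself) shows the kind of per-horse record a graphics renderer would need each update: a screen anchor for the arrow plus the speed, standing, and distance-to-finish values printed beside it.

    # Hypothetical sketch of the per-horse record an on-air pointer graphic consumes.
    # Field names and values are illustrative; the real ISO Track feed is not shown here.
    from dataclasses import dataclass

    @dataclass
    class TrackedHorse:
        name: str
        screen_x: int          # pixel position of the pointer anchor in the frame
        screen_y: int
        speed_mph: float
        standing: int          # current running position
        to_finish_m: float     # distance remaining to the finish line

    def pointer_label(h: TrackedHorse) -> str:
        """Compose the text that would sit beside the on-screen arrow."""
        return f"{h.name}  {h.speed_mph:.1f} mph  P{h.standing}  {h.to_finish_m:.0f} m to go"

    print(pointer_label(TrackedHorse("Justify", 960, 420, 37.2, 1, 400.0)))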
Bat Cam Covers the Back Stretch
The Bat Cam was a hit at both Churchill Downs and Pimlico, providing a never-before-seen view of the backstretch and also coming in handy when rain and fog complicated matters for NBC at both the Derby and the Preakness. The two-point cable-cam system can travel 80 mph along the backstretch, running 15-18 ft. above the ground.
“NBC had already used the Bat Cam on NASCAR, so we knew what to expect at the Derby, and it was just a matter of figuring out how to implement it into our show,” says Keith Kice, senior technical manager, NBC Sports. “It’s turned out to be a great [tool for us], especially at [the Preakness]. Even if it wasn’t for all the fog, the infield [at Pimlico] with all the tents and stages and infrastructure makes it very difficult; you really need the Bat Cam just to cover the backstretch because you can’t see it otherwise.”
Given the massive size of the Belmont track, the Bat Cam will cover more ground than at either of the two prior races but will not cover the entire backstretch. The system will run 2,750 ft. — more than 700 ft. longer than at the Kentucky Derby, 500 ft. longer than at the Preakness Stakes — of the 3,000-ft. backstretch.
“The length of the backstretch was definitely a challenge in getting the Bat Cam unit [installed],” says Dekime. “But the benefit here as opposed to Preakness is that there’s nothing in the infield the way that it’s one big party at Pimlico. We are unencumbered, so that’s a positive. The length of the backstretch was a challenge in getting the Bat Cam units to cover most of the backstretch.”
Although NBC and the Bat Cam team were forced to bring in larger cranes at Belmont in order to install the longer system, says NEP Technical Director John Roché, setup and operation of the Bat Cam has improved significantly since the Derby.
“It’s no longer a science experiment like it was before,” he says. “We’re able to get [Bat Cam owner/operator] Kevin Chase all the gear that they need, and they are able to give us what we need pretty easily in terms of terminal gear, intercoms, and everything. It’s pretty much plug-and-play now.”
Hyland adds that the Bat Cam “will not only cover the backstretch of the race but will also provide dramatic reset shots of this vast facility. When the Triple Crown is on the line at Belmont, the energy in this venue is electric, and we want to capture the sense of place.”
Triple Crown Chance Warrants Double the Sets
Besides additional cameras because of the Triple Crown potential, NBC Sports has also added a second studio set. Host Mike Tirico and analysts Randy Moss and Jerry Bailey will man the 18- x 18-ft. set at the finish line, and a secondary 24- x 24-ft. stage located near Turn 2 will feature host Bob Costas and other on-air talent.
“If it was not going to be a Triple Crown, we would likely be down to just the finish-line set,” says Dekime, “but, now that it is, we’ve put the Turn 2 set back into operation.”
SMT’s Betting and Social Media GOTO videoboard will also be situated at the main set for handicapper Eddie Olczyk, who will use the interactive touchscreen for real-time odds and bet payouts for all races throughout the day. The betting touchscreen will enable him to explain to viewers how he handicaps specific races.
In addition to the onsite sets, NBC plans to incorporate several live remote feeds into the telecast, including from Churchill Downs.
“We brought out all of the tools to showcase the Triple Crown attempt, including a number of remotes that will carry live shots from Churchill Downs, where it all began five weeks ago,” says Hyland. “There will be hundreds of people gathered watching the race. We may have a live remote shot from a Yankees-Mets game just a few miles away. We’re working on a couple other fun ones as well, just to showcase this day and this athletic achievement, should it happen.”
Looking Back at a Wet and Wild Triple Crown Campaign
Although the horse-racing gods have granted NBC the potential for a Triple Crown this weekend — and the big ratings that go along with it — the weather gods have not been so kind. After the wettest Kentucky Derby on record and the foggiest Preakness Stakes in recent memory, a chance of rain remains in the forecast for Saturday. However, Roché notes that the proliferation of fiber and the elimination of most copper cabling onsite have significantly reduced weather-related issues.
“Despite torrential downpours on the first two races, we’ve been really fortunate,” says Roché. “And no matter what happens here [in terms of rain], we’re getting a little spoiled having two Triple Crowns in [four] years after a 37-year drought. For us to be able to have an opportunity to show the public how we cover racing, especially with the addition of Bat Cam, in a Triple Crown situation is really an honor.”
Kice seconds that notion: “Having a Triple Crown [in play] makes all the hard work and troubles we went through with the weather and logistics on the first two races even more worthwhile.”
June 6, 2018
Sports Video Group
SMT will provide fan-engagement technology solutions for NBC Sports Group’s broadcast of the 150th Belmont Stakes. This year marks the eighth consecutive Triple Crown collaboration between SMT and NBC Sports Group and is particularly exciting as Justify seeks to become only the second horse since 1978 to win a Triple Crown.
As it did for the Preakness Stakes and the Kentucky Derby, SMT’s suite of products will engage viewers from gate to finish with real-time, data-driven graphics, up-to-the-second odds, and commentator analysis.
SMT’s Live Leaderboard System highlights the running order of the top six horses using positional data updated 30 times per second per horse, ensuring accuracy and speed for SMT’s on-air graphic presentation.
SMT’s ISO Track system identifies the horses and incorporates real-time data such as speed, current standing, and distance from finish line into an on-air pointer graphic, helping viewers follow the action during the race.
SMT’s ticker produces an on-air display of real-time odds and bet payouts using live data from the race’s Tote provider (in-house wagering system). The ticker also curates and visually displays social media feeds that give followers an inside look at happenings at the track.
SMT’s Track Map System gives viewers a display of the lead horse’s real-time position and split times via an on-screen graphic.
SMT’s Betting and Social Media GOTO video board features real-time odds and bet payouts for all the races throughout the day. The system provides an interactive system for talent to explain the process of horse wagering.
The Data Matrix Switchboard (DMX) provides a customized solution for each Triple Crown race, absorbing, collating, and synchronizing live data feeds into SMT’s proprietary horse-racing database. The DMX integrates live data for on-air and off-air graphics in real-time and replay modes, enhancing NBC’s live race presentation and pre- and post-race analysis. These displays also feature real-time advanced odds and minutes-to-post countdowns.
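The synchronization job described here — absorbing several live feeds and lining them up on a common clock — can be pictured with a small sketch (hypothetical feed names and record shapes, not the DMX data model) that merges timestamped records from two sources into one time-ordered stream.

    # Toy illustration of synchronizing two timestamped data feeds into one stream.
    # Feed names and record shapes are hypothetical, not the DMX data model.
    import heapq

    tote_feed = [(12.0, "odds", {"Justify": "4-5"}), (30.0, "odds", {"Justify": "3-5"})]
    timing_feed = [(15.5, "split", {"quarter_mile": "23.4"}), (28.0, "split", {"half_mile": "47.1"})]

    # heapq.merge keeps the combined stream ordered by the first tuple element (timestamp).
    for ts, kind, payload in heapq.merge(tote_feed, timing_feed):
        print(f"t={ts:5.1f}s  {kind}: {payload}")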
“With a Triple Crown in play for the second time in four years, SMT has another unique chance to help document a historic moment,” says Ben Hayes, Manager, Client Services, SMT. “Our systems help novice race fans understand the core aspects of the sport, while also providing in-depth betting and live race analysis for racing aficionados.”
April 24, 2018
Golf Channel
World No. 1 Justin James, Defending Champion Ryan Reisbeck & 2013 Volvik World Long Drive Champion Heather Manfredda Headline First Televised Event of 2018 from Long Drive’s Most Storied Venue
Veteran Sports Broadcaster Jonathan Coachman Making Golf Channel Debut; Will Conduct Play-by-Play at Each of the Five Televised WLDA Events in 2018
Eight men and four women have advanced to compete in tonight’s live telecast of the Clash in the Canyon World Long Drive Association (WLDA) event, airing in primetime from Mesquite, Nevada, at 7 p.m. ET on Golf Channel. In partnership with Golf Mesquite Nevada and taking place at the Mesquite Regional Sports and Event Complex, the competitors headlining the first televised WLDA event of 2018 are World No. 1 Justin James (Jacksonville, Fla.), defending Clash in the Canyon champion Ryan Reisbeck (Layton, Utah), and 2013 Volvik World Long Drive champion Heather Manfredda (Shelbyville, Ky.).
A familiar setting in World Long Drive, Mesquite previously hosted the Volvik World Long Drive Championship and a number of qualifying events dating back to 1997; the World Championship was staged at the same venue as the Clash in the Canyon from 2008 to 2012.
FORMAT: The eight men advanced from Monday’s preliminary rounds, which featured a 36-man field, and will compete within a single-elimination match-play bracket during tonight’s live telecast. The four women advancing from this morning’s preliminary rounds (18-person field) also will use a single-elimination match-play bracket this evening to crown a champion.
COVERAGE: Live coverage of the Clash in the Canyon will air in primetime on Golf Channel from 7-9 p.m. ET tonight, with Golf Central previewing the event from 6-7 p.m. ET. An encore telecast also is scheduled to air later this evening on Golf Channel from 11 p.m.-1 a.m. ET. Fans also can stream the event live using the Golf Channel Mobile App, or on GolfChannel.com.
The production centering around live coverage of the competition will utilize six dedicated cameras, capturing all angles from the hitting platform and the landing grid, including a SuperMo camera as well as two crane-positioned cameras that will track the ball in flight once it leaves the competitor’s clubface. New to 2018 will be an overlaid graphic line on the grid, the “DXL Big Drive to Beat” (similar to the “1st & 10 line” made popular in football), displaying the longest drive during a given match to signify the driving distance an opposing competitor will need to surpass to take the lead. The telecast also will feature a custom graphics package suited to the anomalous swing data typically generated by Long Drive competitors, tracking club speed, ball speed, and apex in real time via Trackman. Trackman technology also will provide viewers with a sense of ball flight, tracing the arc of each drive from the moment of impact.
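The logic behind the “DXL Big Drive to Beat” overlay is simple to sketch: the line sits at the longest measured drive so far in the match and moves whenever it is beaten. The snippet below is an illustrative model only (hypothetical names; it does not show Trackman’s outputs or the broadcast integration).

    # Illustrative model of the "Big Drive to Beat" line: it marks the longest
    # drive recorded so far in the match. Names are hypothetical.
    def drive_to_beat(drives_yards):
        """Return the distance the overlay should mark, or None before any valid drive."""
        valid = [d for d in drives_yards if d is not None]   # None = ball missed the grid
        return max(valid) if valid else None

    match_drives = [356.0, None, 402.5, 377.0]
    print(f"Big Drive to Beat: {drive_to_beat(match_drives):.1f} yards")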
BROADCAST TEAM: A new voice to World Long Drive, veteran sports broadcaster Jonathan Coachman will conduct play-by-play at each of the five WLDA televised events on Golf Channel in 2018, beginning with the Clash in the Canyon. Art Sellinger – World Long Drive pioneer and two-time World champion – will provide analysis, and Golf Channel’s Jerry Foltz will offer reports from the teeing platform and conduct interviews with competitors in the field.
DIGITAL & SOCIAL MEDIA COVERAGE: Fans can stay up-to-date on all of the action surrounding the Clash in the Canyon by following @GolfChannel and @WorldLongDrive on social media. Golf Channel social media host Alexandra O’Laughlin is on-site, contributing to the social conversation as the event unfolds, and the telecast will integrate social media-generated content using the hashtag #WorldLongDrive.
In addition to the latest video and highlights from on-site in Mesquite, www.WorldLongDrive.com will feature real-time scoring. Golf Channel Digital also will feature content from the Clash in the Canyon leading up to and immediately following the live telecast.
Coming off record viewership in 2017 and a season fueled by emergent dynamic personalities, the Clash in the Canyon is the second official event of the 2018 World Long Drive season, following the East Coast Classic in Columbia, South Carolina, which Justin Moose claimed last month.
Showcasing the truly global nature of World Long Drive, several events will be staged in 2018 through officially sanctioned WLDA international partners, including stops in Germany, Japan, New Zealand and the United Kingdom. Additionally, an all-encompassing international qualifier will be staged (late summer) featuring a minimum of four exemptions into the Open Division of the Volvik World Long Drive Championship in September.
April 15, 2018
Boston.com
The light at the end of the tunnel for Boston Marathon runners making the final turn onto Boylston Street will be shining a little brighter this year. One of the changes the Boston Athletic Association made to the finish line for Monday’s 122nd running of the race is a new digital display board, affixed to the photo bridge above the finish line, that will be visible even if the forecasted rain falls.
“The finish times are going to be displayed big and bright and in color on that video board so that the participants and the spectators on Boylston Street will be able to see from afar what the time is,” said Jack Fleming, Chief Operating Officer of the B.A.A.
For their first year with the new board, which is similar to those that ring Gillette Stadium or TD Garden, the race organizers intend to go with a conservative approach and minimal animation. On Friday, it displayed a countdown clock for Saturday’s 5K and on Sunday it will show a tribute to One Boston Day. But the digital display opens up a new path forward for the finish line, and Fleming said that the B.A.A. could use lights and sound to enhance the spectator experience in the years to come.
“Boylston Street is like the home stretch of the Kentucky Derby or when the team comes out of the tunnel in Gillette Stadium,” he said. “We want our participants to feel that same way.”
In 2021, during the 125th Boston Marathon, don’t be surprised if the roar of the crowd over the final 500 meters is set to a background beat. But Fleming said the aesthetic changes will be made in keeping with the tradition of the event. Of course, no matter what sounds are added, the loudest noise in the runners’ heads will always be the ticking of the clock.
To that end, the organizers swapped the old clock — suspended by cable and beam above the street — for two consoles with double-sided clocks facing the oncoming runners on one side and the world’s media on the other. The race tape will be suspended in between the two consoles, and after the elite runners break the tape it will be wheeled out of the way.
Dave McGillivray, the race director, said that runners will notice some changes this year and a few more next year, building towards 2021 when the B.A.A. plans to showcase the finish line as part of the quasquicentennial celebrations. For that race, the organizers are also considering a request for an increased field size or more ancillary events around the Marathon.
The Boston Marathon finish line: a painted strip across a city street that’s taken on a meaning far beyond that.
“Everything to do with 2013 showed us just how loved Boylston Street is by our participants, by our fans, by the neighborhood, by the community,” Fleming said. “So that was sort of the inspiration for taking some actions on it.”
March 23, 2018
Sports Video Group
Although augmented reality is nothing new to sports production — the 1st & Ten line celebrates its 20th anniversary this year — AR has taken a giant leap in the past three years and is dramatically changing the way stories are told, both on the field and in the studio.
From left: Turner Studios’ Zach Bell, Fox Sports’ Zac Fields, Vizrt’s Isaac Hersly, SMT’s John Howell, and ChyronHego’s Bradley Wasilition
At SVG’s Sports Graphics Forum this month, a panel featuring executives from Fox Sports, Turner Sports, The Future Group, ChyronHego, SMT, and Vizrt discussed best-use cases, platforms, and workflows for AR, as well as how its use within live sports coverage is evolving. The one principle the entire panel agreed on was that AR cannot be used for technology’s sake alone: these elements must be used to further the story and provide valuable information to fans.
“Our philosophy has always been to use [AR] as a storytelling tool. We try not to use it for technology’s sake – whether that is in a live event or in the studio,” said Zac Fields, SVP, graphic technology and innovation, Fox Sports. “The interesting thing is that people can interact with [AR] on their phones and are familiar with what AR is now. That puts the onus on us to present those elements at an even higher quality now. [AR has] become the norm now, and it’s just going to continue to grow. The tools are there for people to come up with new ideas. The one thing that I would hope is that we can make it easier [to use] moving forward.”
Fields’s desire for more user-friendly AR creation and integration was echoed throughout the panel by both users and vendors. Although a bleeding-edge AR project may be exciting and create a new experience for the fan, the goal is to create a solution that can be set up and used simply for every game.
“We’re trying to make sure that customers have ease of usability and repeatability every day,” said Isaac Hersly, director, business development, Vizrt. “It is an issue, and we are always looking for tools that are going to make it easier to set up and not need a rocket scientist. You [need to be able to] have someone that can operate the system very simply. That is our challenge, and we are always looking to come up with solutions to solve that.”
Turner Sports Brings Videogame Characters to Life With AR
Last year, Turner Sports teamed with The Future Group to introduce augmented reality to its ELEAGUE coverage. The two companies worked with Ross Video to create lifelike incarnations of videogame characters, allowing fans tuning in to watch games like Street Fighter V or Injustice 2 to see these characters brought to life in the studio.
“I think creating AR characters from the games and bringing them to the audience adds an enormous amount of value for the fans and the viewing experience,” said Zach Bell, senior CG artist, Turner Studios. “If you can take characters or aspects of the game and have them as dimensional elements within that environment, it creates a much richer experience and allows fans of the game to visualize these characters in a new way. That in itself adds an enormous amount of connection to the experience for the viewer.”
Although esports presents a different case from a live game taking place on a field, Bell said, he believes similar AR elements will soon be making their way into live sports content (for example, NBC’s 3D AR elements from player scans during Super Bowl LII).
More Than Just a Game: Bringing AR to the Masses
It was only a couple years ago that high-end AR elements were reserved for the highest-profile sports events, such as NFL A games. However, with the technology’s rapid advance in recent years, AR has become ubiquitous for most national-level live sports productions and is making its way into even lower-tier properties. In addition, AR elements are becoming available on multiple cameras rather than just the main play-by-play camera (such as the SkyCam), and these systems can even be remotely controlled from offsite.
“The technology is allowing us to drive the next generation of this [content],” noted John Howell, creative strategist, SMT. “We have done the yellow [1st & Ten] line for 20 years, but, two years ago, SMT helped to create a technology that allowed us to do it on the SkyCam. Having that optical vision tracking to create the pan-tilt information off a $30,000 camera head for an image has enabled us not only to do this off the SkyCam but also to do it remotely.
“[That allows us to deploy AR] on more shows [more cheaply],” he continued, “and that technology will then trickle down to more shows. It won’t be just on Fox’s 4 p.m. Sunday NFL game or ESPN’s MNF or NBC’s SNF; now this [technology] gets to go on a lot more shows.”
What’s Next?: Getting More From Player-Tracking Chips, Customizing AR
The use of AR and the technology driving it has evolved rapidly over the past few years, raising the question, What’s next? The panel had plenty of predictions regarding the next great leap forward, but the primary point of excitement revolved around the continued advance of player-tracking RFID chips, especially the NFL’s Next-Gen Stats system.
“With the emergence of Zebra [Technologies] chips on players and [the NFL] looking at instrumenting the football [with a chip], you could see how that can tie to your first-down–line [graphic],” said Bradley Wasilition, director, sports analysis/lead sports analyst, ChyronHego. “The first-down line could actually dynamically change color, for example, when the first down is reached. Now, when that chip crosses that line, you can [definitively] say whether it is a first down or a player was out of bounds [on the sideline].
“Or think of a dynamic strike zone in baseball or a dynamic offside line in soccer,” he continued. “These are all different things that don’t necessarily reinvent the wheel, but they take baseline AR and move it into the 21st century.”
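A minimal sketch of the check Wasilition describes might look like the following (the coordinate convention and thresholds are assumptions for illustration, not ChyronHego’s or the NFL’s tracking format): compare the ball chip’s position with the line to gain and the sideline, then recolor the virtual line accordingly.

    # Minimal sketch of a chip-driven first-down/out-of-bounds check, as discussed
    # above. Coordinates and field names are assumptions for illustration only.
    FIELD_WIDTH_YDS = 53.3   # sideline to sideline

    def evaluate_play(ball_x_yds, ball_y_yds, line_to_gain_yds):
        """ball_x = distance downfield; ball_y = distance in from one sideline."""
        out_of_bounds = not (0.0 <= ball_y_yds <= FIELD_WIDTH_YDS)
        first_down = (not out_of_bounds) and ball_x_yds >= line_to_gain_yds
        line_color = "green" if first_down else "yellow"   # dynamic recolor of the line
        return {"first_down": first_down, "out_of_bounds": out_of_bounds, "line_color": line_color}

    print(evaluate_play(ball_x_yds=34.2, ball_y_yds=12.0, line_to_gain_yds=33.0))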
Fields predicted that, as multiplatform content and OTT outlets grow, fans will someday be able to customize their own AR elements within the sports coverage they are watching: “Eventually, it will get to a point where we can put this data in the hands of the viewer on an OTT offering. Once that happens, they can choose to turn off the strike zone over the plate. That is when we’ll really get some flexibility and customization to people so [viewers] can enhance [their experience].”
March 16, 2018
Avixa
Sports. The great common denominator of all conversation. Even if you don’t like sports, you know enough to be able to talk about it, at least for a minute. And sports, by convenient association, is actually one of my favorite ways to talk about what it is that AVIXA members do.
We tell sports stories. Through gigantic video boards (forever “Jumbotrons” to the layman, and hey, that’s alright), humongous speaker systems, tiny microphones, variably sized digital signage displays, and perceptually invisible but actually ridiculously huge lighting systems and projection mapping, AV experience designers make the live event into a highlight reel. Everything has impact, in real time.
So it happens to be that I’m forever on the lookout for evolving ways to tell sports stories in venues. In reading Sports Video Group’s coverage of the Super Bowl, I found another great angle on stadium storytelling. Most sports fans know that we are in the age of abundant sports data analytics, but what I didn’t know is that we are also in the era where those next-gen stats are changing the in-house show on the big screens at stadiums.
In a first for the Super Bowl, the 2018 game brought some television broadcast features to the in-house displays at U.S. Bank Stadium. And on top of that, they challenged audiences with a whole new graphics package featuring next-gen stats (“NGS” if you’re savvy).
With production tools by SportsMEDIA Technology (SMT), the virtual yellow line and some cool new NGS factoids made it to the big-time on the live-game displays. The latter of these came from SMT’s tapping into the NFL Next Gen Stats API to go deeper with the data.
SMT’s goal to delight fans with even more details to obsess over during the game seems like a good one. Especially because, well, “NFL fans are insatiable — they want data,” said Ben Grafchik, Business Development Manager for SMT.
To meet that need, SMT is exploring ways to tie in traditional data points with NGS in a visual format that fans can easily consume during a game. The objectivity and analytical depth of these additions to video board storytelling are compelling to all diehard fans, but in particular, the next-gen stats appeal to next-gen fans, Grafchik added.
These new graphics may have been a first for the Super Bowl, but actually, Vikings fans enjoyed them for the entire season at home at U.S. Bank Stadium. SMT worked with the in-house production team there to add all sorts of visual spice to the show, gradually going more complex with the offerings as the season went on and fans became accustomed to the new depths of data exploration.
But football isn’t the only sport that’s receiving the NGS upgrade. SMT happens to provide video enhancement and virtual insertion graphics for hundreds of major U.S. and international sporting events and broadcasters. So watch for a lot more variety to come both in house and wherever else you consume your sports content. It will certainly give us all a lot more to talk about when we talk about sports.
March 14, 2018
Sportstar Live
For more than 100 years, tennis, unlike team sports, used statistics sparingly. Basketball, baseball and football needed a plethora of stats, such as shooting percentages, batting averages and touchdowns scored, to measure the performances of their athletes and teams. But tennis players were measured chiefly by their wins, losses, titles and rankings. After all, few cared if the Wimbledon champion made 64% of his first serves or the No. 1 player averaged 77 miles per hour on her backhand.
All that changed in the Computer Age. With more information than they ever dreamed possible, tennis coaches, players, media and fans suddenly craved all sorts of revealing match data, not to mention astute analysis of it. No longer was it just whether you won or lost that mattered, but how and why you won or lost — points, games, sets and matches. Training methods, stroke production, tactics and equipment were also dissected and analysed in much greater depth and detail than ever before.
As the demand for data has burgeoned, new technologies, such as sophisticated virtual graphics, tracking technology, statistical applications and telestration, have provided yet more valuable services and information to give athletes that “extra edge.”
Like any prescient, enterprising pioneer, Leo Levin seized the opportunity by developing the first computerised stats system for tennis in 1982. Levin’s seminal work was highlighted by creating the concept of and coining “unforced error,” a term now used in most sports and even by pundits to describe a politician’s self-inflicted blunder.
Since then, the genial 59-year-old, based in Jacksonville, Florida, has covered more than 120 Grand Slam events and countless other tournaments to provide the Association of Tennis Professionals (ATP) and other businesses with match statistics. Levin, dubbed “The Doctor” by broadcaster Mary Carillo for his incisive diagnoses of players’ games, is currently director of sports analytics at SportsMEDIA Technology (SMT), a company that provides custom technology solutions for sporting events.
In this wide-ranging interview, Levin explains his many roles in the exciting, fast-growing field of analytics and how it has changed tennis for the better.
What is sports data analytics?
Sports data analytics is a combination of gathering and analysing data that focuses on performance. The difference between analysis and analytics is that analysis is just gathering the basic data and looking at what happened. Analytics is trying to figure out why and how the basic performance analysis works with other factors to determine the overall performance of the athlete or the team.
When and how did this field start changing amateur and pro tennis? And who were the pioneers?
Honestly, I was. At the end of 1981, the first IBM personal computer hit the market for general consumer use. By the middle of 1982, I was working with a company in California to develop the very first computerised stats system for tennis. The key factor was the way we decided to describe the results of a tennis point in three basic areas. The point had to end with a winner, a forced error, or an unforced error. That created the foundation for how we look at tennis today.
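The three-way classification Levin describes maps naturally onto a simple per-outcome tally. The sketch below is a hypothetical illustration of that bookkeeping, not SMT’s actual system.

```python
from collections import Counter
from enum import Enum

class PointEnd(Enum):
    WINNER = "winner"
    FORCED_ERROR = "forced error"
    UNFORCED_ERROR = "unforced error"

def summarize(points):
    """points: list of (player_who_ended_the_point, PointEnd)."""
    tally = Counter(points)
    return {f"{player}: {end.value}": n for (player, end), n in tally.items()}

# A toy three-point match fragment.
match = [("Player A", PointEnd.WINNER),
         ("Player B", PointEnd.UNFORCED_ERROR),
         ("Player A", PointEnd.FORCED_ERROR)]
print(summarize(match))
```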
How and when did you become interested in tennis analytics?
I was playing on the tennis team at Foothill College in Los Altos, California, about five miles from Stanford University. When I wasn’t playing matches, I was actually charting matches for my team-mates and then providing that information to the coach and the players to try to help them improve their games.
Brad Gilbert, a former world No. 4 and later the coach of Andre Agassi and Andy Murray, played on your Foothill team. Did you help him?
Brad was on that team, and it was interesting because in his first year, he played No. 2. The player who played No. 1 came to me before the state finals where he had to play Brad in the final, and asked me, ‘How do I beat Brad?’ I was able to give him specific information on strategy and tactics that helped him win the state title.
That was the year Brad took his runner-up trophy and smashed it against a tree and vowed never to lose a match the following year. And the following year, Brad didn’t lose a match.
SportsMEDIA Technology’s (SMT) products and services have evolved from a clock-and-score graphic in 1994 to innovative and sophisticated virtual graphics, tracking technology, statistical applications, and telestration. How do you and your team at SMT use these four methods to analyse statistical data at tennis’ four Grand Slams to provide valuable insight that helps players, coaches, broadcasters and the print media determine how and why a match was won or lost?
One of the challenges with tennis, more so than with any other major sport, is the lack of data. When we started doing this, there really wasn’t any consistent gathering of data from matches. So the first piece we developed was simply a system now known as Match Facts. It pulled factual statistical data directly from the chair umpire. That started with the ATP back in the early 1990s. We were then able to create a base for year-round information on the players. It allowed for the next level of analysis. It has expanded from there. We developed the very first serve-speed system to add more data and show how players were winning or losing based on serve speeds. As the technology improved, we’ve been able to harness the new generation of video-tracking technology and then, on the presentation side, use virtual graphics as a way to place data directly into the field of play to help illuminate what is actually going on. Telestration is a tool that allows the broadcasters to get inside the points and help the fans understand the combinations of shots and strategies the players are using.
Your website (www.smt.com) has a section titled “Visual Data Intelligence” with the subtitle, “SMT delivers the world’s most innovative solutions for live sports and entertainment events across the globe.” What is Visual Data Intelligence? And what are its most important, innovative solutions for live sports and entertainment events?
Visual Data Intelligence goes to the heart of what we try to do as a company. In a lot of different sports, there is a lot of information available. But making it useful to the broadcasters, and specifically to the fans, to help them understand the game is a huge part of what we’re providing. That entails simple things like the first-and-10 line in football. That provides the visual set of information for the commentators and fans that really helps them understand where the teams are and how much yardage they need (to get a first down). It’s gotten to the point where fans in the football stadium are yelling, “Where’s the yellow line?” So we’re expanding that to provide the service to the large screens displayed inside the stadium so teams have their own system to be able to show that to the fans.
How does Visual Data Intelligence apply to tennis?
In tennis, where you have a lot of data, the challenge is: how do you provide all that data to the fans and the commentators? We do that through a series of different systems. We have what we call our “open vision system,” which is an IPTV solution that has real-time scoring, stats and video as well as historical data. And it ties it all together and puts it in one place so it provides a true research tool for the commentators and the (print and online) media. Along with that, we have a product we call our “television interface,” which is really a system that drives graphics on air for the broadcasters. This tool allows them to look at the data and see where the trends are. Hit the button and have that information directly on the screen.
Please tell me about the new technology service partnership between Infosys and the ATP, and the analytics and metrics this partnership brings to the tennis world.
I’m not really that aware of what Infosys and the ATP are doing. But I do know that a lot of that hinges on the technology we created for Match Facts. One of the unique things about tennis is the scoring system. Unlike other sports, the player or team that wins the most points doesn’t necessarily win the match. That’s not how our scoring system works. I think they are trying to take a deeper look into the individual points, and how winning or losing specific points in key situations impacts a player’s ability to win or lose matches. The same is true for total games. That’s one of the challenges when you’re trying to do analysis of tennis. In a lot of other sports, you’re just looking at the raw numbers and saying how many points did he score or how many rebounds did she get or how many yards did they gain. But in tennis, it has to be compartmentalised into specific performances in specific situations.
How do insights from game and training data analytics improve coaching?
The key to coaching and player improvement is first to understand what is going on out on the court. It’s a matter of gathering data. One of the challenges tennis has faced because of its late start in the world of statistics and data analysis has been a reluctance by a lot of coaches and players to rely on anything other than what they see and feel. So the real challenge and the real key is to be able to relate the data to what coaches see and what players feel out on the court. When you can make that connection, you have a real chance for improvement.
What are one or two insights that have improved coaching?
The challenge is that every player is different. What the data analysis allows you to do is to customise those things and focus not on what a player does, but what your player does, and how you can get the most out of your player’s game. A simple example: when we first started doing detailed statistics and analysis, we worked with the Stanford University tennis programme. Their No. 1 woman player, Linda Gates, was struggling, and the coaches couldn’t figure out where or why. We did an analysis of her game, and we found out that she was dominating on her service points in the deuce court but struggling in the ad court. It wasn’t visually obvious. The coaches couldn’t put their finger on what the problem was. But once we started looking at the numbers and the data, it allowed them to focus in practices on her ad-court shot patterns. Linda went on to win the NCAA Championships that year, 1985, in singles and doubles (with Leigh Anne Eldredge).
An Infosys ATP “Beyond The Numbers” analysis of Rafael Nadal’s resurgence to No. 1 in the Emirates ATP Rankings showed that Nadal ranked No. 1 on tour in 2017 for winning return points against first serves, at 35.2 percent (971/2761). That metric shoots up to an astounding 43.4 percent (454/1045) for his clay-court matches. Which other stunning statistics help explain why other players have had outstanding years this decade?
This goes to the basics of looking at players’ strengths and weaknesses. One stat I always look at is serve and return performance because I still split the game up that way. It’s interesting that when you look at a player like Nadal, you see that he is not only dominant on return of serve. He’s also dominant on his own second serve.
Even with all the analytics we have, an old maxim still holds true: “You’re only as good as your second serve.” You’ll find the players at the top of the rankings for the last four or five years were also at the top of both second serve points won and return of second serve points. Despite all the focus on power and big serves, second serve performance is really a huge key to understanding a player’s overall strengths and weaknesses.
How much do the Women’s Tennis Association tour and its players take advantage of analytics?
Although the WTA was a little behind the ATP curve in terms of gathering and storing match data, the good news is that they’ve now caught up. Their association with SAP, and the fact that they’re now using a Match Facts system to provide data for the players on a match-by-match basis, has moved them up the curve.
Which pro players have benefited most from tennis analytics so far? And in what specific ways?
That’s a tough question. Because I don’t work directly with the players and coaches as I used to, I don’t know who is utilising the data more so than others. You can tell just by looking at Roger Federer’s improvement over the last year that his team used analytics to determine that he needed to be more aggressive on his backhand. He’s now hitting a much higher percentage of topspin backhands than he did in previous years and that change has made his game more balanced and puts a lot more pressure on his opponents. Playing to Roger’s backhand used to be the safe play — it’s not any more.
Another area of Federer’s game that came to light using analytics was the difference between his winning and losing matches at Wimbledon. When you compare his final match wins to his matches lost since he won his first Wimbledon in 2003 — 8 titles, 7 matches lost — the numbers that jump out are all about his return of serve, and specifically, his performance on break points. Federer’s serving performance barely changed, but his return game fell dramatically in his losses. In his Wimbledon final wins, Federer converted 30 of 69 break points for 43%. In his losses, he converted only 9 of 53 for 17%. In both cases, he averaged around 8 break points per match. In his wins, he converted almost 4 per match, but in his losses he converted just over once per match. His team looked at that crucial data, added in the fact that nearly all his opponents served and volleyed on 2% or less of their service points, and concluded that Roger needed to work on hitting his returns deep and not worry about his opponents coming in behind their serves.
Younger players are taking most advantage of the information because they’ve grown up in that world. They’re used to the electronics and the digital experience and having all that information available to them.
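The Wimbledon break-point figures cited above reduce to a few ratios; as a quick arithmetic check, using only the counts quoted in the answer:

```python
# Break-point counts from the answer above: wins (8 finals won) vs. losses (7 matches lost).
wins_converted, wins_chances, wins_matches = 30, 69, 8
losses_converted, losses_chances, losses_matches = 9, 53, 7

print(f"Wimbledon final wins: {wins_converted / wins_chances:.0%} converted, "
      f"{wins_chances / wins_matches:.1f} chances and "
      f"{wins_converted / wins_matches:.1f} conversions per match")
print(f"Wimbledon losses:     {losses_converted / losses_chances:.0%} converted, "
      f"{losses_chances / losses_matches:.1f} chances and "
      f"{losses_converted / losses_matches:.1f} conversions per match")
```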
How do these insights enhance the fan experience?
I credit (renowned former NFL analyst) John Madden for being one of the very first TV commentators to take fans inside the game and explain things they didn’t necessarily see. Madden would explain to women football fans what the centre or guard was doing on a particular play and why a back’s 50-yard run came down to one really good block.
What we’re trying to do in tennis and what these insights have provided is to do the same kind of things for tennis fans. Help get them inside the game so they understand the nuances of what’s happening on the court, and they’re not just watching two guys running around hitting the ball.
What is radar-based tracking, which is now used by the United States Olympic Committee (USOC) for every throw an Olympic athlete makes? Is it being used in tennis?
Radar-based tracking simply tracks the speed and location of the ball or object being thrown or hit. In tennis, radar has typically been used for serve speeds; that is something we pioneered in the late 1980s. The broader tracking used in tennis has been video-based, as opposed to radar-based. The advantage of video is that you can track the movement of the players as well as the movement of the ball, and from a variety of positions and angles.
Can analytics predict which junior players will someday become world-class players or even champions? And if so, can it guide their coaches and national federations to increase the odds that will happen?
Not yet. The challenge is that prediction is different from analysis. You’re trying to draw conclusions from the data, and we don’t have a complete set of data. You could predict which junior players will become world-class players if you had genetics, biomechanics and all the physical characteristics measured, as well as analytics measuring the player’s overall performance on the court. Then we could see whether or not they have specific markers that indicate they will make that jump. But the bottom line is that there are so many factors involved. And a lot of it has to do with the physical side that you can’t necessarily determine from data.
What is bioanalytics? And why is measuring and analysing an elite athlete’s perspiration important?
We’re pioneering bioanalytics in football now. We’re taking biometric readings from players at the university level. The players are equipped with motion sensors and full biometric readers, which are reading things like heart rate, body temperature and respiration. And they’re combining that with the movement data from the tracking information. With that, we’re able to measure the physical output of the players. The sensors in the helmet measure impacts (from collisions).
We’ve been working on this project for a few years. It’s been used for the football programme at Duke University, and we’re in the process of adding a couple more universities to the project. At this stage, it’s being used for medical purposes: when a player is on the practice field, the staff knows immediately if his heart rate starts racing or his body temperature goes up too high, and they can pull him out of practice right away and get him more electrolytes and hydration. They also weigh the players before and after every practice so they know how much fluid each player has lost during practice.
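A minimal sketch of the kind of live check Levin describes, flagging readings that cross safety limits. The thresholds, field names, and data format are invented for illustration; they are not Duke’s or SMT’s actual values, and real programs would use individualized baselines set by medical staff.

```python
# Hypothetical safety limits; real systems would personalize these per athlete.
LIMITS = {"heart_rate_bpm": 195, "body_temp_c": 39.5}

def practice_alerts(reading):
    """reading: dict with 'player', 'heart_rate_bpm', 'body_temp_c' keys."""
    return [f"{key} = {reading[key]} exceeds {limit}"
            for key, limit in LIMITS.items() if reading.get(key, 0) > limit]

sample = {"player": "WR 83", "heart_rate_bpm": 201, "body_temp_c": 38.9}
for alert in practice_alerts(sample):
    print(f"{sample['player']}: {alert} -> pull from practice, rehydrate")
```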
How is bioanalytics used in tennis?
Unlike a team sport where a team can outfit all its players with this equipment, tennis players are all independent contractors. So it’s going to take more of a nationalistic approach — something like what the USTA is doing — to step in and say, “For our junior players, we’re going to outfit some courts and we’re going to provide this level of analysis on the physical side.”
Does analytics apply to tennis equipment and court surfaces? And if so, how?
Sure, it can. Analytics can identify how well players perform using different types of equipment and on different surfaces. For instance, if you’re using some tracking technology to determine what racquet and string combination allows a player to have the most amount of power, that’s a relatively simple exercise. You run a player through a set of drills, hitting particular shots, and measuring the speed of the ball coming off the racquet.
For surfaces, analytics can really help with identifying the type of shots that have an effect on particular surfaces or areas where players’ games break down. For example, you have players who have a long backswing, and that works really well on a slower surface where they have time to take a big backswing. But when you put them on a faster court, where the ball bounces lower and faster, it upsets their timing, and it makes it more difficult for them to adjust. Analytics measures the court’s bounce speed and bounce trajectory. So you can take a player and modify his game on a particular surface taking into account how the ball reacts to it.
You’ve analysed thousands of matches. Which factors influence the outcome of matches the most in men’s tennis and women’s tennis? And why?
The No. 1 factor typically is unforced errors. If you’re making mistakes, you’re basically giving the match to your opponent. Being able to measure and quantify that is a huge factor for player improvement. That entails understanding where you’re making your mistakes — which shots and what situations. The caveat to that is that there are certain players whose games are based on absolutely controlling the pace and tempo of the match. And they have the tools to do that. Two of the best players ever to do that are Steffi Graf and Serena Williams.
What are the disadvantages of and dangers involved with analytics? Will some number crunchers and coaches go overboard with analytics and be guilty of violating Occam’s razor?
The simple danger is to rely on data alone. The challenge is that you have to make the data relatable to what the player is doing physically and mentally on the court. Analytics doesn’t necessarily measure the mental side of the game, at least not yet. If you’re focusing so much on the analytics of certain shots and not looking at the big picture of their mental focus and how they’re preparing for matches, you can get into trouble.
Since tennis players vary greatly in temperament, talent, current form and other variables, do practitioners of analytics risk over-concluding from their numbers? And what mistakes have you and others made in this regard?
There is always a risk. Data can provide you with valuable information. Then you make that next leap that says, “This information says this, and therefore we have to do this, or therefore we have an issue.” I’ll give you a simple story from a few years ago. Jim Grabb, who was the No. 1 doubles player in the world then, came up to me at a tournament before the US Open and said, “I’m struggling with my first volley in singles. I can’t make a first volley.” And I told him, “You’re the No. 1 doubles player in the world. You have great volleys. And you’re saying you can’t make a first volley in singles.” He says, “Yeah.”
A lot of coaches would say, “How are you hitting it? Let’s analyse the stroke.” I asked, “When you step to the baseline to hit the serve, where is your first volley going?” Jim looked at me like I was speaking a foreign language. So I asked again, “Before you hit your first serve, where are you going to hit your first volley?” He said, “I just react to the ball. I don’t know what you’re talking about.”
So I suggested, “Do this. Every first volley goes to the open court. You serve wide in the deuce court and you volley wide into the ad court. You serve wide in the ad court and volley wide into the deuce court. Just for your first volleys.”
Jim goes out to play and comes back and says, “I didn’t miss a first volley.” The next week he got to the fourth round of the US Open, his best result at a Grand Slam (event) ever in singles. That had to do with the fact that all it really required was a little bit of focus by the player. It didn’t require a level of analysis and stroke production changes. It was simply eliminating decision-making.
What is the connection between analytics and the established field of biomechanics?
Analytics can tell you how a player is performing or how a stroke is performing in key situations. That can then identify that we need to examine the biomechanics of the stroke, particularly if it is breaking down under pressure. Or we can determine that the errors are occurring when the ball is bouncing four feet in the air versus three feet in the air, so their contact point is a foot higher. Now we can look at the biomechanics and see what the player is doing when the ball is a foot higher.
What are player rating systems? And what is the connection between analytics and player rating systems? How valid is the Universal Tennis Ratings system?
I don’t think there is any now. But that’s a direction we can take in the future.
Which match statistic or statistics do you foresee becoming increasingly important as a result of analytics?
I think you’ll see more focus on key-point performance as we do more and more analysis of players’ games in key pressure situations. Because you’re serving half of the time and returning serve half of the time, analytics will look increasingly at each half of the game. We talk a lot about unforced errors, but are they occurring on your serve games or your return games? We talk about aggressive play and taking control of the points, but when is that happening? In the serve games or the return games? On the first serve or the second serve?
Data analytics is undeniably changing tennis. Do you think it will revolutionise tennis?
Absolutely! Because the game is always changing. The technology around tennis and all sports keeps changing. Analytics is going to make the athletes better. It’s going to provide them with insights about how they can be at their peak for the key matches. It will help them train better, prepare better, execute shots better under pressure. All those pieces and parts will be available for athletes. And all of their nutritional, sleep, and training regimens will also help tennis players to perform better.
March 9, 2018
Sports Video Group
The 2018 NASCAR season is underway, and with it comes a new remote-production workflow for NASCAR whereby camera and audio signals are sent from race tracks to NASCAR’s production center in Charlotte, NC. The effort began with the Rolex 24 at Daytona and will continue with the WeatherTech SportsCar Championship series next week and the ARCA Racing Series as the season progresses.
“We have done a lot of testing at smaller events the past couple of years, but this year we wanted to push the limits and see what we can do,” says Steve Stum, VP of operations and technical production, NASCAR Productions.
The Rolex 24 race used NEP’s NCP IV production unit to put out 12 hard cameras, two RF cameras for the pit announcers, and 14 in-car cameras around the track. RF was handled by 3G, and a tech manager and engineering team ensured that 28 video and 75 audio signals were sent to Charlotte via a single antenna from PSSI Global Services. PSSI leveraged its C27 mobile teleport, equipped with cutting-edge Newtec modulators and GaN SSPB amplifiers from Advantech Wireless.
Rick Ball, Director of Broadcast Sports at PSSI Global Services, adds: “We’re not afraid to go where no one has gone before, and we’re proud that our efforts continue to create new possibilities in live television.”
Once the signals are back in Charlotte, the director, producer, TD, replay operators, SMT virtual-graphics operators, and announcers create the show.
“Round trip, the latency is 1.09 seconds, so we have camera returns and feeds for the screens for the fans in the stands,” adds Stum.
With upwards of a third of production costs sunk into travel, Stum says the goal is to put more money into the production itself, get more specialized equipment, and have a production-truck unit that is more aligned with the needs of a remote production.
The efforts are part of a season that Stum says has been going great so far. And all of the testing prior to the Rolex race paid off: Stum says the nerves at the beginning subsided as the workflow was proven out.
March 2, 2018
Sports Video Group
As the NFL Scouting Combine becomes an increasingly fan-focused event onsite, NFL Media is expanding its already sizeable coverage of the annual event in Indianapolis. Last year, the NFL added Combine events, including the bench press and press conferences, at the Indianapolis Convention Center next door to Lucas Oil Stadium and allowed a limited number of fans into the stadium’s upper bowl in an effort to boost the NFL Combine Experience. With that in mind, NFL Network and NFL Digital outlets are rolling out their biggest productions to date to cover the growing parade of events taking place at both locations.
“We attack this show with everything we have in order to cover it from every aspect,” says Dave Shaw, VP, production, NFL Media. “The league has continued to expand the fan-focused aspect of the Combine at the convention center. They started that last year and are putting even more events over there this year. So we’ve expanded our show to cover some of the more fan-friendly stuff.”
For its 14th Combine, NFL Media is delivering a whopping 52 hours of live coverage during the event (Feb. 28 – March 5), including 34 hours from Indianapolis: 26 hours of Combine coverage Friday-Monday and eight hours of press conferences Wednesday and Thursday.
“This event really didn’t become ‘an event’ until it was covered by NFL Network,” says Christine Mills, director, remote operations, NFL Media. “It’s grown and evolved, and now fans are becoming more involved [onsite]. It’s interesting how it’s grown from a very small intimate event essentially just for scouts to an event covered by NFL Network and NFL Digital and on social. It’s grown into a fan-facing event, but it has kept that intimate feel at its core.”
Onsite in Indy: Encore and Pride, Four Sets Drive Multiplatform Production
Despite the expansion, NFL Media has maintained the same footprint in the truck compound at Lucas Oil Stadium. Game Creek Video’s Encore is serving the NFL Network show, and Pride is handling the streaming coverage.
The trucks onsite are fully connected to NFL Media’s broadcast center in Culver City, CA, via diverse fiber circuits (with 12 muxed feeds going each way) to allow extensive file-transfer and backhaul of camera feeds.
“For our coverage, we treat this like we’re covering a high-end game,” notes Shaw. “It’s a very slick production that moves quickly. It is a bit of a marathon, but our production teams do an outstanding job of rolling in features and keeping the action moving. It’s an important show for the NFL Network and NFL Media group because it’s the baseline for what we are about, which is giving viewers the inside look and showing fans what they should look for in the upcoming players.”
NFL Media has deployed a total of four sets — three at Lucas Oil (one on the field, two on the concourse level) and one at the convention center — to serve its 23-deep talent roster. Two of the three sets at the stadium are dedicated to the digital operation; NFL Network is manning the convention-center set, which is primarily for press-conference coverage.
“The setup we have at the convention center for NFL Network is very similar to [Super Bowl] Opening Night, where they have eight podium positions set up and we’re right in the middle of that room,” says Mills. “It ends up being a really fun and busy couple of days, especially with the fans more involved now [onsite].”
In addition to the four sets, NFL Network has a position in the traditional announce booth at Lucas Oil Stadium, as well as an interview location in a suite, where head coaches often stop by. For example, last year, NFL Media landed a rare interview with Patriots coach Bill Belichick in this location.
“Most of the head coaches are here in a casual atmosphere trying to pull something away from some of these players they’re evaluating,” says Shaw. “And the coaches have [free rein over] where they want to be in the building, so sometimes they will stop by the announce booth. Having Belichick stop by and do some time with our guys took us all off guard a little, but it was great and got a lot of attention. What’s exciting is, you don’t know what you’re going to pull off here since you have all the coaches and GMs. It’s a lot of fun trying to get in their minds and hearing what they have to say in this kind of atmosphere.”
The Camera Complement: SkyCam, Robos, and TeamCams
Between NFL Network and NFL Digital, the operation is deploying a combined 37 cameras at the two venues, including a SkyCam at the stadium and a large complement of robos (provided by Indy-based Robovision) at both locations. In addition, five ENG cameras are roving the grounds capturing content, which is being sprinkled into both the linear and the streaming coverage.
NFL Media will continue to spotlight the 40-yard–dash drill, with a high-speed camera capturing the smallest details. In addition, SMT is providing virtual graphics and graphics overlays for visual comparison of prospects with one another or with current NFL players’ Combine performances (for example, projected top pick QB Sam Darnold vs. Pro Bowl QB Carson Wentz’s sprint).
In addition, NFL Media is leveraging its Azzurro TeamCam system to provide live shots throughout its press-conference coverage. The TeamCam system, which NFL Network has used for a variety of needs for several years, features a single camera and transports bidirectional HD signals via a public-internet connection — along with IFB, comms, and tally — between Indianapolis and Culver City. In addition to a show produced onsite during the first two days, all press conferences are fed to Culver City via the TeamCam system.
“It’s interesting what we do for our live shots with the TeamCam system,” says Shaw. “We can just do one-off cameras, or we can bring it back; we can do two-ways just with a single camera. It’s a great [tool] for our Wednesday and Thursday coverage.”
NFL Digital Bigger Than Ever at Combine
NFL Digital’s presence continues to grow at the Combine. NFL Now Live is streaming on NFL.com, the NFL app, and Yahoo.com Friday-Monday beginning at 9 a.m. ET. In addition, NFL Media is providing extensive social-media coverage across Twitter, Facebook, Instagram, and Snapchat. Twitter Amplify is being used to produce highlights, distribute on-the-ground original content of top achievements across social networks, and deliver original social content to all 32 NFL clubs. On top of that, for the first time, the NFL is coordinating with some of the top college football programs to share, create, and amplify social-media content from Indianapolis.
In addition to live coverage, each prospect goes through the “Car Wash” following his press conference at the convention center. Each player progresses through interviews with NFL Media’s features team, digital team, and social-media team.
“These [Car Wash] interviews help us build features and get footage for the Draft,” says Shaw. “It also helps us down the road, and we’ll use footage all the way through the season. This is an NFL Media-exclusive event, so we go out of our way to give the avid NFL fan that inside position they don’t usually get to see.”
February 28, 2018
Sports Video Group
NFL Network will produce and broadcast 11 live American Flag Football League (AFFL) games during its debut season, as well as distribute highlights from the AFFL’s upcoming 2018 U.S. Open of Football (USOF) Tournament. The agreement is the first-ever broadcast deal for professional flag football, and “provides a unique opportunity for the NFL to explore digital distribution of AFFL content,” according to the league’s announcement. The 11 game telecasts will be produced by NFL Network and feature NFL Network talent.
“Today marks great progress for football fans and players,” says AFFL CEO/founder Jeffrey Lewis. “As the first-ever broadcast and distribution deal focused on bringing the game of flag football to the broadest possible audience, we are thrilled to partner with NFL Network, the premier platform for football.”
The AFFL is set to launch this summer, and NFL Network is expected to build on the unique use of technology deployed for coverage of the AFFL’s first exhibition game on June 27, 2017, at Avaya Stadium in San Jose, CA. In an effort to create a wholly revamped football-viewing experience similar to the Madden NFL gaming look, the AFFL production team deployed SkyCam as the primary play-by-play angle (prior to NBC Sports’ decision to do so for several games during the 2017 NFL season), RF cameras inside the huddle, and SMT virtual graphics and augmented-reality elements all over the field.
The USOF is a 132-team, single-elimination tournament that will ultimately pit a team of elite former professionals against a team that has conquered a 128-team open national bracket. The tournament marks the AFFL’s first major competition, following an exhibition game in June 2017. NFL Network will televise 11 USOF games live June 29-July 19, concluding with the Ultimate Final, where America’s Champion and the Pros’ Champion will meet in a winner-take-all contest for $1 million.
The four Pro teams are expected to be led by Michael Vick, Chad “Ochocinco” Johnson, basketball duo Nate Robinson and Carlos Boozer, Justin Forsett, and Olympic champion Michael Johnson. Airtimes and broadcast talent for USOF games on NFL Network will be announced at a later date.
“Football fans are passionate about having continuous access to entertaining football content all year round,” said Mark Quenzel, SVP, programming and production, NFL. “AFFL games on NFL Network will give viewers a chance to experience a new kind of football competition in the summer months, and we’re excited for the opportunity to deliver more live programming that fans enjoy.”
The AFFL is extending the application deadline for the USOF from March 1 to March 8. Interested applicants can apply to play in the USOF online. Those selected will play in America’s Bracket, which comprises 128 teams.
February 19, 2018
Sports Video Group
One of the highlights of Turner’s NBA All-Star Saturday Night coverage was the debut of a shot-tracking technology developed by Israeli startup RSPCT. Deployed for the Three-point Contest, RSPCT’s system, which uses a sensor attached to the backboard to identify exactly where the ball hits the rim/basket, was integrated with SMT’s graphics system to offer fans a deeper look at each competitor’s shooting accuracy and patterns.
“There is a story behind shooting, and we believe it’s time to tell it. Shooting is more than just a make or a miss,” says RSPCT CEO Oren Moravtchik. “Turner and the NBA immediately understood that the first time they ever saw [our system] and said, ‘Let’s do it.’”
During Saturday night’s telecast, Turner featured an integrated, scorebug-like graphic showing a circle representing the rim for each of the five racks of balls during the competition. As a player took a shot, markers indicating where the ball hit the rim or landed inside the basket were inserted in real time, building up a grouping for each rack.
“It’s a bridge between the deep analytics that teams are using and the average fan,” says RSPCT COO Leo Moravtchik. “Viewers can understand shooting accuracy faster and better without having to dive into analytics; they clearly see groupings of shots and why a shot is made or missed. Last night, if a player missed all five shots of a rack, you could see why: if they are all going right or all going left.”
The system, which can be set up in just 30 minutes, consists of a small Intel RealSense Depth Camera mounted behind the top of the backboard and connected wirelessly to a small computing unit.
“We have some very sophisticated proprietary algorithms on the sensor,” says Oren Moravtchik. “The ball arrives at a high speed from the three-point line at various angles. We can [capture] the entire trajectory of the ball: where it came from, how it flew in the air, where it hit the basket — everything. We know the height of the player, the release point, and where it hit the basket, and then we can extrapolate back from there.”
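One way to picture the extrapolation Moravtchik describes is to fit the tracked flight to a ballistic arc and read off where it crosses rim height. The sketch below does that with a simple polynomial fit; the sampling, units, and method are assumptions for illustration, not RSPCT’s algorithm.

```python
import numpy as np

def rim_crossing(samples, rim_height=3.05):
    """samples: (t, x, y, z) ball positions in seconds/meters from the sensor.
    Fit x(t), y(t) linearly and z(t) quadratically (gravity), then solve
    z(t) = rim_height on the descending branch to estimate where the ball
    meets the rim plane."""
    t, x, y, z = (np.asarray(c, dtype=float) for c in zip(*samples))
    fx, fy = np.polyfit(t, x, 1), np.polyfit(t, y, 1)
    a, b, c = np.polyfit(t, z, 2)
    roots = np.roots([a, b, c - rim_height])
    t_hit = max(r.real for r in roots if abs(r.imag) < 1e-9)  # later (descending) crossing
    return np.polyval(fx, t_hit), np.polyval(fy, t_hit)

# Toy flight samples: the ball rises, peaks, and starts to descend.
flight = [(0.00, 7.0, 0.0, 2.1), (0.20, 5.6, 0.1, 3.0),
          (0.40, 4.2, 0.2, 3.6), (0.60, 2.8, 0.3, 3.8),
          (0.80, 1.4, 0.4, 3.6)]
print(rim_crossing(flight))   # approximate (x, y) where the ball reaches rim height
```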
Although Saturday night marked the debut of the RSPCT system for the NBA, Leo Moravtchik sees far more potential once complete data sets on players can be captured — such as a full playoff series or even a full season.
“There may be an amazing player shooting 18 out of 20 from every [three-point] location, but there are differences between locations beyond just field-goal percentage,” he says. “Based on our data, we not only can show them their shooting [tendencies], we can actually project their field goals for the next 100 shots. We can tell them, ‘If you are about to take the last shot to win the game, don’t take it from the top of the key, because your best location is actually the right corner.’”
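As a tiny illustration of that per-location projection, the snippet below computes make rates by spot from a hypothetical shot log and scales the best spot to 100 attempts. Both the data and the naive projection are invented for illustration; RSPCT’s models are far richer.

```python
from collections import defaultdict

def location_rates(shots):
    """shots: list of (location, made: bool). Returns make rate per location."""
    agg = defaultdict(lambda: [0, 0])          # location -> [makes, attempts]
    for loc, made in shots:
        agg[loc][0] += int(made)
        agg[loc][1] += 1
    return {loc: makes / att for loc, (makes, att) in agg.items()}

log = [("top of key", True), ("top of key", False), ("top of key", False),
       ("right corner", True), ("right corner", True), ("right corner", False)]
rates = location_rates(log)
best = max(rates, key=rates.get)
print(rates, "-> expected makes over next 100 from", best, "=", round(rates[best] * 100))
```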
RSPCT is not only focusing on sports-broadcast and media clients but also marketing the system as a scouting and player-development tool.
“We’re [targeting] NBA teams, college teams, and even high school and amateur teams,” says Leo Moravtchik. “Wherever there is a basket — camps, gyms, schools — people want to see how they are shooting. We can bring it there because it’s a 30-minute installation and very cost-effective.”
February 16, 2018
Sports Video Group
The 60th running of the Daytona 500 takes place this Sunday, and Fox Sports, as it has done every year, again has found a way to push the technological envelope and expand on the resources dedicated to broadcasting the Great American Race. Coverage of this year’s race includes the introduction of Visor Cam, the return (and refinement) of the dedicated Car Channels on Fox Sports GO, and — in an industry first — a tethered drone that will provide live coverage from behind the backstretch at Daytona International Speedway.
“Every year, there’s something new,” says Mike Davies, SVP, field and technical operations, Fox Sports. “The Daytona 500 is always a great way to kick off the first part of the year in terms of technological testing: a lot of the things that we bring down to Daytona to look at, to test, and to try are things that manifest themselves later and in other sports. It’s a lot of fun to dream these things up.”
A Unique Point of View
This weekend’s race will feature all the camera angles that racing fans have come to expect, plus a few new views that promise to enhance the broadcast. Fans have grown accustomed to seeing their favorite drivers up close thanks to in-car cameras, but, on Sunday, they’ll be able to see what the driver sees.
Visor Cam, which first appeared at the Eldora NASCAR Camping World Truck Series race last year, makes its Daytona 500 debut this weekend. The small camera, developed by BSI, will be clipped to the helmets of Kurt Busch (last year’s Daytona 500 champion) and Daniel Suarez.
“You can try to put cameras everywhere you can, but seeing what the driver is seeing through a camera placed just above his eye line on his visor is pretty cool,” says Davies. “We’re looking forward to having that at our disposal.”
Fox Sports worked closely with NASCAR and ISC to provide aerial drone coverage of the Daytona 500. The drone, which will be tethered to allow longer periods of flight time, will move around behind the backstretch — outside of the racing area — to cover the race from a new angle.
Gopher Cam, provided by Inertia Unlimited, returns for its 10th year with enhanced lens quality for a wider, clearer field of view. Three cameras will be placed in the track, including one in Turn 4 and another on the backstretch.
Cameras, Cameras Everywhere
Fox Sports will deploy a record number of in-car cameras during the Daytona 500. In total, Sunday’s broadcast will feature 14 in-car cameras, including one in the pace car — more than in any NASCAR race in the past 15 years. Each car will be outfitted with three cameras for three viewing angles.
Last year, Fox Sports launched two dedicated Car Channels on the Fox Sports GO app, each focusing on a single driver. For this year’s race, Fox Sports has opted for a team approach, showing multiple drivers, cars, and telemetry data on the channel.
In total, Fox Sports will deploy 20 manned cameras, including three Sony HDC-4300s operating in 6X super-slo-mo, one Sony HDC-4800 operating in 16X HD slo-mo, and an Inertia Unlimited X-Mo capturing 1,000 frames per second. Fox Sports will outfit its Sony cameras with a variety of Canon lenses, ranging from handheld ENG to the DIGISUPER 100. The network will also have four wireless roving pit/garage camera crews, 10 robotic cameras around the track (plus three robotic Hollywood Hotel cameras), and a jib camera with Stype augmented-reality enhancement. The Goodyear Blimp will provide aerial coverage.
Not to be forgotten, viewers will be treated to all the sounds of the race as well, thanks to more than 100 microphones surrounding the track. Fox Sports plans to make use of in-car radios throughout the broadcast, both in real time (having the drivers and crew chiefs narrate the race) and after the fact (using the audio to tell a story).
A Compound Fit for the Super Bowl of Racing
For the first time in 12 years, Game Creek Video’s FX mobile unit will not handle Fox Sports’ Daytona 500 production. Instead, Game Creek’s Cleatus (known by another network as PeacockOne) will be responsible for the main race broadcast and will be joined in the compound by 11 additional units for digital production, editing, RF cameras and audio (BSI), telemetry and graphics (SMT), and studio production. Two satellite uplink trucks will be onsite, as well as a set of mobile generators that will provide nearly 2 MW of power independent of the local power source.
Fox Sports is shaking up its transmission as well, relying on an AT&T gigabit circuit capable of transmitting eight video signals (and receiving four) via fiber by way of its Charlotte, NC, facility to Fox Sports’ Pico Blvd. Broadcast Center in Los Angeles.
“Based on some of the things that we’re doing for the World Cup in Moscow as well as home-run productions for MLS and college basketball, we’ve taken some of that knowledge and leveraged it to do full-on contribution for NASCAR,” Davies explains. “It’s exciting, it’s scalable, and we’re looking forward to doing it. AT&T has put in a circuit at every track or is in the process of doing so, so this is a first foray into IP transmission as it relates to NASCAR.”
The benefit of transitioning to IP transmission, according to Davies, is the volume of content that Fox Sports will be able to send from tracks that notoriously lack connectivity. “At the end of the day,” he says, “we’ll be able to leverage resources from Charlotte and Pico to do more things. Right now, we’re able to contribute more to our Charlotte shows via fiber, but, like everything in technology, the more we get used to it and the more we know how to use it, the more useful it’s going to be.”
Daytona 500 Gets a Graphics Makeover
The on-air graphics package for the Daytona 500 will be new, featuring much of the look and feel of Fox Sports’ football, basketball, and baseball graphics with all the data that NASCAR fans expect.
Fox Sports will up the ante on virtual graphics and augmented reality, deploying Stype camera-tracking technology (with a Vizrt backend) on a jib between Turns 3 and 4 in order to place 3D graphics within the broadcast. For example, the system can be used to create virtual leaderboards, sponsor enhancements, and race summaries that are placed on Turn 3 as virtual billboards.
“Where that jib is between Turns 3 and 4, you can place graphics [on screen in] such a way that you don’t necessarily have to leave the track in order to get information across,” Davies explains. “In the past, we might have used full-screen graphics, but now, we can put the graphics in space, and it looks pretty cool. It’s the third year that we’ve been doing that, and we seem to get better at it each year.”
The network has also enhanced its 3D-cutaway car, putting these graphics in the hands of the broadcast team. And, in the booth, Fox Sports NASCAR analyst Larry McReynolds will have his own dedicated touchscreen, allowing him to enhance any technical story and give the viewer clear illustrative explanations during the race.
A Company-Wide Effort
Between the production personnel, camera operators, engineers, on-air talent, and many more, Fox Sports currently has 300 people onsite at the Daytona International Speedway. In addition, Fox Sports’ Pico and Charlotte facilities, as well as its network-operations center in The Woodlands, TX, are very much a part of the action. And, when the Daytona 500 starts on Sunday, all will be ready to deliver this year’s race to NASCAR fans everywhere.
“Between everything that you’re going to see on-screen and everything under the hood, these are all things that are going to help the company as a whole,” says Davies. “We’ve been able to bring together all of the resources across the company, and it’s particularly exciting to get everybody working as one on this event.”
February 8, 2018
Digital Journal
DURHAM, N.C.--(Business Wire)--NBC Olympics, a division of the NBC Sports Group, has selected SMT to provide real-time, final results and timing interfaces for its production of the XXIII Olympic Winter Games, which take place in PyeongChang, South Korea, from February 8 - February 25. The announcement was made today by Dan Robertson, Vice President, Information Technology, NBC Olympics, and Gerard J. Hall, Founder and CEO, SMT.
Since 2000, SMT has been a key contributor to NBC Olympics’ productions by providing results integration solutions that have enhanced NBC’s presentations of the Games via on-air graphics, scheduling, and searches for content in the media-asset–management (MAM) system.
For the 2018 Olympic Winter Games, SMT will deliver TV graphics interfaces for NBC Olympics’ Chyron Mosaic systems in its coverage of alpine skiing, freestyle skiing, snowboarding, figure skating, short track speed skating, speed skating, bobsled, luge, skeleton, ski jumping and the ski jumping portion of Nordic combined.
SMT’s Point-in-Time software system integrates live results to allow commentators to locate a specific time during a competition in both live and recorded coverage. The software graphically shows key events on a unified timeline so that NBC Olympics commentators can quickly see how a race began, when a lead changed, where an athlete’s performance improved, and the kinds of details that dramatically enhance the incredible stories of triumphs and defeats intrinsic to the 2018 Winter Games.
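A bare-bones sketch of the timeline idea described above, indexing results events by timecode so a commentator could jump straight to a moment such as a lead change. The class, event labels, and structure are illustrative assumptions, not SMT’s data model.

```python
from bisect import bisect_right

class Timeline:
    """Keeps competition events ordered by timecode (seconds from start)."""
    def __init__(self):
        self.events = []                      # list of (timecode, label)

    def add(self, timecode, label):
        self.events.append((timecode, label))
        self.events.sort()

    def at_or_before(self, timecode):
        """Most recent event at or before the given timecode."""
        i = bisect_right(self.events, (timecode, chr(0x10FFFF)))
        return self.events[i - 1] if i else None

    def find(self, keyword):
        return [(t, lbl) for t, lbl in self.events if keyword in lbl]

tl = Timeline()
tl.add(12.4, "start: skater A leads")
tl.add(58.9, "lead change: skater B ahead")
tl.add(101.3, "finish: skater B wins")
print(tl.find("lead change"))                 # cue point for a replay
print(tl.at_or_before(60.0))                  # what had just happened at 60 s
```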
“The complexity and sheer amount of scoring, tracking, and judging data that comes with an event of this size, both real-time and post production, is beyond compare,” said Robertson. “The ability to organize and deliver it aids NBC’s production in presenting the stories of these amazing athletes, and requires nothing short of the capabilities, innovation and track record of SMT.”
“It is our privilege to provide our expertise, experience, and results reporting technology for NBC Olympics’ production of the 2018 Olympic Winter Games, SMT’s 10th straight Olympics,” said Hall. “Our team of 10 on-site engineers have rigorously prepared for PyeongChang with a tremendous amount of testing and behind-the-scenes work, ensuring SMT delivers seamless services of a scope and scale unprecedented in a sports production.”
SMT’s partnership with NBC Olympics began with the 2000 Sydney Games and has included providing graphics interfaces as well as NBC’s digital asset management interface, which helped the network receive Emmy Awards for “Outstanding Team Technical Remote” following the 2008 and 2016 Games.
About NBC Olympics
A division of the NBC Sports Group, NBC Olympics is responsible for producing, programming and promoting NBCUniversal's Olympic coverage. It is renowned for its unsurpassed Olympic heritage, award-winning production, and ability to aggregate the largest audiences in U.S. television history.
For more information on NBC Olympics’ coverage of the PyeongChang Olympics, please visit: http://nbcsportsgrouppressbox.com/.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events, including the Super Bowl, major golf and tennis events, the Indianapolis 500 and the World Series. The 31-time Emmy Award-winning company is headquartered in Durham, N.C. For more information, visit smt.com.
February 5, 2018
Sports Video Group
To put it mildly, the 2017-18 NFL campaign has been a memorable one for SkyCam. In a matter of months, the dual-SkyCam model — an unheard-of proposition just a season ago — has become the norm on high-profile A-game productions. The company also unveiled SkyCommand, its at-home-production offering, in conjunction with The Switch, with plans to continue to grow this central-control model. And, last year, SkyCam worked with SMT to debut the 1st & Ten line and other virtual graphics on the SkyCam system; today, it is standard practice on almost any show using a SkyCam.
At Super Bowl LII, SkyCam once again deployed dual SkyCams, with the high-angle system focusing on an all-22 look and the lower SkyCam focusing on play-by-play. SVG sat down with Chief Technology Officer Stephen Wharton at U.S. Bank Stadium during Super Bowl Week to discuss SkyCam’s role in NBC’s game production, the rapidly growing use of dual SkyCams by broadcasters, NBC’s use of the system as the primary play-by-play game camera on a handful of Thursday Night Football games this season, and an update on the company’s SkyCommand at-home-production control system, which was announced earlier this year.
Tell us a bit about your presence at U.S. Bank Stadium and the role SkyCam will play in NBC’s Super Bowl LII production?
We were fortunate enough to be here with Fox for the Wild Card Game, and that allowed us to keep a majority of our infrastructure in place. Also, when the stadium was built, they built in a booth for SkyCam and cabled the building, so that obviously helped us quite a bit. But we’ve been here since Sunday working with the halftime show to make sure that our rigging isn’t in the way of them and they’re not in the way of us. And then, Monday, full crew in for Tuesday first-day rehearsal, and then all the way through the week.
In a matter of months, several major NFL broadcasters have adopted the dual-SkyCam model. What are the benefits of two SkyCams?
We used to say you knew you had a big show when you had SkyCam on it. Now you have a big show when you have two SkyCams on it. I think one of the key driving factors for [the increased use of] dual SkyCam was working with the NFL and the broadcasters to better highlight Next Gen Stats. And, working with SMT on their auto render system, one of the big values that we now bring is this ability to show you the routes and what’s going on with each player as the play develops from the overhead all-22 position.
It just so happened that, as the dual systems started to evolve, we got this amazing opportunity in Gillette Stadium when the fog came in and no other cameras could be used. Typically, you think of SkyCam as being used for the first replay camera; we’re not necessarily live. But, in that instance, we had to go live with SkyCam, and the first replay became the high SkyCam. That opportunity changed how we are seen and used. It demonstrated what you could do with SkyCam, and that obviously penetrated all the other networks. You get two totally different angles, one more tactical and one play-by-play, and there’s really no sacrifice. You’re not giving anything up on the lower system; you’re actually helping because you don’t have to chase down beauty shots and comebacks since the upper system can do that. The lower system can just focus on play-by-play.
Do you expect the use of dual SkyCams for NFL coverage to continue to grow next season?
I think that you’ll continue to see the dual SkyCams become more of the norm, not just for the playoff games but for most A-level shows, because it brings such a value for both Next Gen Stats and the broadcasters. We’re obviously super excited about that.
I think there’s a bifurcation between audiences in terms of [SkyCam] as a primary angle: some really love it, and some don’t like it. But what you’re seeing in broadcast today with the growth of technology and evolving media is that people end up with a buffet of options to choose from: OTT, streaming, mobile, television, or something else. And there is a market for all of it. I think, at the national level, you’ll see more play-by-play action live from SkyCam because broadcasters will be able to use it and distribute it however they like.
At NAB 2017, you introduced SkyCommand, an at-home–production tool that allows SkyCam operators to be located remotely. Do you have any update on this platform, and are broadcasters using it already?
We have seen tremendous interest. People are asking where and when they can do this, but there are obviously a couple different challenges we have to address: one, since it’s a cost-saving model, you’re looking at lower-tier shows in venues that don’t have much infrastructure in most cases. That said, when you take lower-tier games that happen to take place in venues that [have the necessary infrastructure], it becomes very appealing. Most of our network partners have been very interested in finding ways of utilizing SkyCommand for [at-home] production. [Our partners] Sneaky Big Studios and SMT are on board, and we’re looking at doing a lot more of it in 2018. We’ve actually got some pilot programs already.
Just a couple weeks ago, we relocated SkyCam into an 80,000-sq.-ft. facility a few miles down the road from our old facility. It’s a brand-new facility, built from the ground up, that’s tailored to our needs. We’ve got two entire broadcast booths with SkyCommand in mind. One is a network-operations center with full streaming capabilities and data connectivity to the games that we’re doing. Beyond SkyCommand, when our operators are onsite, we will have a guy in Fort Worth who is basically at the NOC watching the game. This person will be looking at the responses coming out of the computer systems and will be on PLs with the [on-site operators]. And then we can send that video back to the NOC and address any type of issues that we have; it gives us a great ability to manage that. The second booth is where we can actually put an operator and a pilot.
We’re continuing to work with the network vendors — The Switch, CenturyLink, and others — but we’ve already got full 10-gig fiber to the facility. So we’re working now to put all that in place for SkyCommand. I think you’ll see that more in 2018.
In what other sectors is SkyCam looking to grow in the near future?
We’re also trying to expand [permanent SkyCam installations] throughout the NFL. I expect that we will have some other announcements coming out shortly about additional teams building on what we did with the Baltimore Ravens last year. Those team SkyCams will continue to grow in 2018, and we’re looking at leveraging SkyCommand specifically for those cases.
February 5, 2018
Sports Video Group
SMT (SportsMEDIA Technology) is bringing a number of Super Bowl firsts to Minneapolis on both the broadcast and the in-venue production side. On NBC’s Super Bowl LII broadcast, SMT will deploy a telestrator on the high SkyCam for the first time and also will have the 1st & Ten line available on additional cameras. The in-venue production will offer the 1st & Ten line on the videoboards for the first time in a Super Bowl and will also feature enhanced NFL Next Gen Stats integration.
“It’s always exciting to do something brand new for the first time,” says SMT Coordinating Producer Tommy Gianakos, who leads the NBC SNF/TNF team. “And it’s even better when you’re doing it on the biggest show of the year with a lot of extra pieces added on top.”
In addition, during the Super Bowl LII telecast, NBC Sports’ production team will have access to a new telestration system on the high SkyCam for first replays.
“We’re now adding some telestration elements on SkyCam,” Gianakos explains. “In the past, we’ve been able to have a tackle-box [graphic] on one of the hard cameras if there’s an intentional-grounding play, but we haven’t been able to do it from high and low SkyCam on first or second replay. That intentional-grounding [virtual graphic] right above the tackles on SkyCam is something we haven’t been able to do before, but now we are able to do pretty instantaneously.”
SMT demonstrated it for NBC Sports producer Fred Gaudelli on Friday when a high school football team was on the field, and NBC opted to move forward with the system for the game.
“We’re able to do backwards-pass line virtually in real space; we’re able to measure cushions, able to paint routes on the field, all very rapidly,” says Ben Hayes, senior account manager, SMT. “It’s pretty unique to this show and the first time we’re going to be doing it on-air.”
In addition to having the live 1st & Ten line on both SkyCams and the same six hard cameras available for NBC’s Thursday Night Football and Sunday Night Football telecasts, SMT has added it to the two goal-line cameras, the all-22 camera, and two more iso cameras.
SMT also added next-gen DMX switchboard connectivity to NBC’s scorebug, so on-field graphics will update in real time and list personnel and formations of both teams.
“From a crew standpoint, it was really nice for us to have both Thursday Night Football and Sunday Night Football this season because it gave us a second group of people that understood the expectations of this show and what Fred and [director] Drew [Esocoff] really want from the show,” says Hayes. “We were basically able to merge those two crews for this game and not miss a beat.”
On the Videoboards: 1st & Ten Line, Enhanced Next Gen Stats
Fans at the stadium will be able to see the 1st & Ten line system on the videoboards. For the first time at a Super Bowl, the yellow virtual line will be deployed on three cameras (at the 50-yard line and both 25-yard lines) for the in-venue videoboard production.
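The reason the virtual yellow line appears to run beneath the players is that the renderer draws it only on pixels that look like the playing surface. The snippet below is a simplified, hypothetical illustration of that keying step using numpy; the color thresholds are placeholders, and this is not SMT's production code.

```python
import numpy as np

def composite_first_down_line(frame, line_mask, line_color=(0, 220, 255)):
    """Draw a virtual line only where the underlying pixel looks like turf.

    frame:     H x W x 3 uint8 BGR video frame
    line_mask: H x W boolean array marking where the projected line falls
    """
    b = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    r = frame[..., 2].astype(int)
    # Crude "field color" key: green clearly dominant over red and blue (placeholder thresholds).
    is_field = (g > r + 15) & (g > b + 15)
    drawable = line_mask & is_field          # players and officials occlude the line naturally
    out = frame.copy()
    out[drawable] = line_color
    return out
```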
Also, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering real-time content not available on broadcasts.
“It’s amazing to be doing this here at Super Bowl,” says Ben Grafchik, business development manager, SMT. “Obviously, we can build upon the technology in the future, but this is our first step into it. And then I’m looking to try to continue that going forward.”
Fans inside U.S. Bank Stadium will have access to real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score–graphics publishing system that requires just a single operator.
“We knew the Minnesota Vikings were already doing virtual and NFL Next Gen Stats, so we started thinking about what we could do to spice it up for the Super Bowl,” says Grafchik. “We’re throwing a lot of things at this production in hopes of seeing what sticks and what makes sense going forward for other venues.”
In the lead-up to the game, SMT worked with the league to merge the NFL Game Statistics & Information System (GSIS) feed with NFL Next Gen Stats API to come up with a simple lower-thirds graphics interface. This will allow the graphics operator to easily create and deploy a host of new deep analytics graphics on the videoboard during the game.
“These additional NGS elements get viewers used to seeing traditional stats along with nontraditional stats when they are following the story of the game,” says Grafchik. “If Alshon Jeffery has a massive play, the operator can instantly go with the lower third for his average receptions per target. The whole plan was to speed up this process so that this individual isn’t [creating] true specialty graphics; they’re just creating traditional graphics with extra spice on top of it. By getting quick graphics in like that, it helps to tell a story to the viewer in-venue without much narration on top of it.”
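The merge Grafchik describes comes down to joining two feeds on a shared player identifier and packaging the result as a one-button graphic. The sketch below is a hypothetical illustration only; the field names (gsis_id, targets, top_speed), the payload format, and the numbers are assumptions, not the actual GSIS or Next Gen Stats schemas.

```python
def build_lower_third(gsis_rows, ngs_rows, player_id):
    """Join traditional stats (GSIS) with tracking-derived stats (NGS) for one player."""
    gsis = {row["gsis_id"]: row for row in gsis_rows}
    ngs = {row["gsis_id"]: row for row in ngs_rows}
    trad, track = gsis[player_id], ngs[player_id]
    targets = track.get("targets", 0)
    rec_per_target = trad["receptions"] / targets if targets else 0.0
    return {
        "name": trad["name"],
        "line1": f'{trad["receptions"]} REC, {trad["rec_yards"]} YDS',                        # traditional stat line
        "line2": f'{rec_per_target:.2f} rec/target, {track["top_speed"]:.1f} mph top speed',  # the "extra spice"
    }

# Made-up rows keyed on a shared player ID:
gsis_feed = [{"gsis_id": "00-0031234", "name": "A. Receiver", "receptions": 4, "rec_yards": 73}]
ngs_feed = [{"gsis_id": "00-0031234", "targets": 6, "top_speed": 19.8}]
print(build_lower_third(gsis_feed, ngs_feed, "00-0031234"))
```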
February 4, 2018
Sports Video Group
Since the first beam went up on this massive structure in Downtown Minneapolis, U.S. Bank Stadium has been building to this moment. Super Bowl LII is here, and an all-star team from Van Wagner Sports & Entertainment Productions, stadium manager SMG, and the Minnesota Vikings is ready to put on a Super Bowl videoboard production for the ages.
When 66,000-plus pack into the sparkling bowl, they’ll be treated to quite a few in-venue firsts on those boards, including the Super Bowl debut of SMT’s Yellow 1st & 10 line, a completely new Super Bowl LII graphics package, and an expanded arsenal of camera angles.
“Every Super Bowl, we’re tasked with moving the needle,” says Bob Becker, EVP, Van Wagner Sports & Entertainment (VWSE) Productions, which has designed the videoboard. “What can we do differently this Super Bowl that we haven’t done in the past? That’s our constant challenge. This is my 23rd [Super Bowl], and, every year, it gets bigger and bigger and bigger. When it’s over, you say, ‘Wow, what a great job,’ and then you start stressing about next year and wonder, ‘Well, how do we top that?’ That’s how I feel about that: you’ve got to always up your game.”
The stadium’s crown jewels are a pair of Daktronics video displays behind the end zones that measure 68 x 120 ft. and 50 x 88 ft., respectively. This year, for the first time at a Super Bowl, those boards will feature a full complement of the Yellow 1st & 10 line. SMG and the Vikings had a standing relationship with North Carolina-based SMT throughout the season, offering the yellow line encoded on their 50-yard-line camera. For the Super Bowl, they chose to expand it to include the other main cameras at each of the 20-yard lines. SMT’s Ben Grafchik will be sitting at the front of the control room, preparing specialty data-driven graphics, tickers, and data feeds for the control-room crew to call up as they desire.
Those advanced graphics are part of a completely fresh graphics package that Van Wagner has developed for this game. It’s the classic hard work done by the company: build a season’s worth of graphics to be used on a single night. Also, not only does Van Wagner come in and take over the U.S. Bank Stadium control room, but its team has basically torn it apart, pulling out gear and replacing it with specialty systems in order to take the videoboard show to that next level.
“It’s not because it’s not good,” says Becker, “but that’s how we make it bigger and better. Sometimes, you’ve got to bring technology in to make it bigger and better. And, to these guys’ credit, they have not only been there from Day One for us but have been open to allowing us to tear apart their room and integrate these new things. And it happens a lot that they go, Hey, you know something, I’d love to use that for a Vikings season next year. So there’s benefit on both sides.”
One of the vendors that has gone above and beyond for the control room is Evertz. The company has provided a crosspoint card for redundancy and the EQX router while also supplementing with some spare input cards, output cards, and frame syncs.
It’s a challenging effort to make temporary alterations to the control room, but SMG and the Vikings have welcomed the opportunity to expand with open arms.
“There’s a reason I took this job,” says Justin Lange, broadcast operations coordinator for U.S. Bank Stadium, SMG. “This is a prestigious event, and this is big for this city, the Vikings, and for us as a company. It’s been a great experience. It’s a great opportunity for us to showcase what we can do with this room, what we can do with these boards. The sightlines are great in this facility. The boards are great, the IPTV system is expansive, and we’re just excited to showcase what we have to offer as a facility.”
Normally, the control room features both Evertz IPX and baseband routing, an 8M/E Ross Acuity switcher with 4M/E and 2M/E control panels to cut secondary shows, and Ross XPression graphics systems. The all-EVS room houses a wide range of EVS products, including three 12-channel 1080p replay servers, one 4K replay server, IPDirector, Epsio Zoom, and MultiReview.
For the Super Bowl, the control room will have more cameras to choose from than it has ever had before. A total of 18 in-house cameras are deployed throughout the bowl (more than the normal eight for a Vikings game), including four RF handhelds, an RF Steadicam, and two robotics.
The crew is also an impressive sight to behold. Nearly 100 people are working on the videoboard show in the combined efforts between Van Wagner, SMG, and the Vikings. There’s also a handful of editors across the street in the 1010 Building (where many broadcasters have set up auxiliary offices) cutting highlight packages and team-specific content.
“This is the biggest event in the world,” says Becker, “and we and the NFL mean to acknowledge that. We’re willing to do what needs to be done to put on the biggest event in the world.”
February 2, 2018
NBC Sports
NASCAR will provide its teams with more data in real time this season, giving them access to publicly available steering, brake, throttle and RPM information as well as live Loop Data for the first time.
The information will be provided for every driver on every lap of every session on track.
The steering, brake, throttle and RPM information has been available through NASCAR.com’s RaceView application, which uses the information provided by the electronic control units used in the electronic fuel injection systems. Some teams have created labor-intensive programs that scraped the data from RaceView, so NASCAR decided to save time and effort for teams by directly providing the information.
No other engine data will be released. The ECU can record 200 channels of information (of a possible 1,000 parameters). NASCAR assigns about 60 channels (including the steering, brake, throttle, and RPM), and teams can select another 140 channels to log through practices and races. Those channels will remain at the teams’ discretion and won’t be distributed by NASCAR.
NASCAR’s real-time data pipeline to teams this season also will include Loop Data, which was created in 2005 and has spawned numerous advanced statistical categories that have been available to the news media. The information was born out of a safety initiative that installed scoring loops around tracks after NASCAR ended the practice of racing to the caution flag in ‘03.
Previously, teams had been provided only lap speeds/times; now they will have speeds in sectors around the track marked by the scoring loops.
Teams still won’t be given Loop Data for the pits, where the scoring loops are installed to maintain a speed limit for safety. If a scoring loop in the pits were to fail during a race, teams theoretically could take advantage of that by speeding through that loop (particularly those whose pit stall is in that sector). NASCAR does provide teams with pit speeds after races.
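Sector speeds of this kind fall directly out of the loop-crossing timestamps: divide the known distance between consecutive scoring loops by the time between crossings. The sketch below is a simplified illustration under assumed data shapes and units; it is not NASCAR's actual feed format.

```python
def sector_speeds(crossings, sector_lengths):
    """Average speed through each sector from loop-crossing timestamps.

    crossings:      loop name -> timestamp (seconds) when the car crossed that scoring loop
    sector_lengths: (loop_a, loop_b) -> distance in feet between the two loops
    Returns miles per hour for each sector.  (Assumed shapes; the real feed is richer.)
    """
    speeds = {}
    for (start_loop, end_loop), feet in sector_lengths.items():
        elapsed = crossings[end_loop] - crossings[start_loop]
        speeds[(start_loop, end_loop)] = (feet / elapsed) * 3600 / 5280  # ft/s -> mph
    return speeds

# Illustrative numbers only: two sectors, each roughly half a mile of a 1.5-mile lap.
crossings = {"loop1": 100.00, "loop2": 109.20, "loop3": 118.75}
lengths = {("loop1", "loop2"): 2640.0, ("loop2", "loop3"): 2640.0}
print(sector_speeds(crossings, lengths))   # ~195.7 mph and ~188.5 mph
```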
February 2, 2018
Stadium Business
The NFL’s popular Next Gen Stats data feed is getting a boost from real-time data delivery and graphics solutions firm SportsMEDIA Technology (SMT) for Super Bowl LII at U.S. Bank Stadium.
For the championship game this Sunday in Minneapolis, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts.
SMT’s in-stadium production integrates in-game stats displayed on the stadium’s two massive video boards, as well as on 2,000 in-concourse HD video displays.
U.S. Bank Stadium, home of the Minnesota Vikings, boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The 65,000 fans at Super Bowl LII will be presented with real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?).
“As an organisation, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, senior manager of production for the Vikings.
“For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
Ben Grafchik, SMT’s business development manager, said: “In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing.
“We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
This year’s Super Bowl pits the New England Patriots against the Philadelphia Eagles.
January 31, 2018
Business Wire
DURHAM, N.C.--(BUSINESS WIRE)--SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, today announced it is providing in-stadium solutions, including its Emmy-winning virtual 1st & Ten line system and the NFL’s new Next Gen Stats, for Super Bowl LII, to be held Feb. 4 at U.S. Bank Stadium.
For Super Bowl LII, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts. SMT’s in-stadium production combines in-game stats integrated into SMT-designed graphics packages that are displayed on the stadium’s two massive video boards, as well as on 2,000 in-concourse HD video displays, offering fans a chance to watch highlights and stay informed no matter where they are in the stadium. U.S. Bank Stadium boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The more than 65,000 football fans attending the Super Bowl will be treated to a variety of valuable real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score graphics publishing system that requires just a single operator.
“As an organization, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, Senior Manager of Production for the Minnesota Vikings. “For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
“In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing,” said Ben Grafchik, SMT Business Development Manager. “We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
In addition to in-stadium solutions, SMT will provide broadcast solutions for Super Bowl LII, including the virtual 1st and Ten system, data-driven graphics and tickers, and in-game data feeds to commentator touchscreens, among other services. SMT has supported Sunday Night Football on NBC since 2006.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events. The 31-time Emmy Award-winning company is headquartered in Durham, N.C.
January 30, 2018
Sports Video Group
With the Madden NFL 18 Club Championship Finals in full swing this week and the recent announcement of a new TV and streaming deal with Disney/ESPN, EA’s Madden NFL Championship Series is squarely in the esports spotlight. The series has been moving toward this moment for months, with 11 NFL teams hosting events in which fans competed to advance to the Finals in Minneapolis this week. In its first foray into competitive gaming, SMT’s Video Production Services (VPS) group produced events for the Arizona Cardinals, Buffalo Bills, and Jacksonville Jaguars throughout the end of 2017.
“SMT’s experience with supporting top football shows like the Super Bowl and Sunday Night Football makes us uniquely positioned to attract Madden gamers to the NFL through the medium they are most attracted to: esports,” says C.J. Bottitta, executive director, VPS, SMT. “With a worldwide fan audience now estimated at 280 million, approaching that of the NFL, SMT is excited to enter the growing market of competitive gaming.”
Although the level of services SMT provided varied from show to show, the base complement for all three productions comprised a full technical team of broadcast specialists operating six cameras, multiple replay machines, and a telestration system. SMT kept pace with Madden’s lightning-quick style of play for the three-hour shows streamed on the EA Sports YouTube channel, Twitch.TV/Madden, and the EA Sports Facebook page. In addition, SMT’s Creative Studio customized EA’s promotional trailer with team-specific elements for each of the three events.
“We started doing [Madden events] with teams last year, and there has been an evolution from wanting a [small-scale] podcast-level environment to almost a broadcast-level show,” says Bottitta. “What I loved about the three teams this year was how passionate and excited they were to be doing this. Teams were handling events very differently, but all of them had great people to work with and did a wonderful job.”
Inside the Production: University of Phoenix Stadium, Glendale, AZ
The Cardinals’ Madden NFL 18 Club Championship took place on Saturday, Nov. 11, soon after the team’s Thursday Night Football home game against the Seahawks, creating a quick turnaround for SMT and the team’s production staff. SMT provided the producer (Bottitta), director, tech manager, and lead camera operator and advised on what should be added for the production.
“We primarily provided leadership for the Cardinals,” says Bottitta. “They have a fantastic facility, so we reviewed with their tech group what they had and what they needed to add for [a competitive-gaming production] like this. They have a fantastic control room, and they used the crew that they normally use except for the producer, director, tech manager, and lead cameraman, which we provided.”
Inside the Production: New Era Field, Buffalo, NY
In Buffalo, SMT provided a similar level of services for the Bills’ event on Saturday, Dec. 2, the day before the team faced off against the New England Patriots. SMT worked with the Bills to manage other shows using the team’s studio at New Era Field: a simulcast radio show, a pre/postgame show for the Buffalo Sabres, and Bills GameDay on Sunday.
SMT once again used the team’s crew primarily but provided its own producer, director, tech manager, and camera ops and added a stage manager.
“Buffalo was on a real-time crunch,” says Bottitta, “so they told us the studio they wanted to use, the schedule of the studio, and asked us what was reasonable to expect. We guided them through what would make the most sense, so we could get in there, have a rehearsal and set day and then do the show while also allowing them to still do their normal duties.”
Inside the Production: Daily’s Place Amphitheater, Jacksonville, FL
SMT ramped up its role at the Jaguars’ event, which took place the morning of a home game against the Seahawks on Dec. 10. Since it was a game day, the Jaguars crew was occupied handling the in-venue production, so SMT essentially handled the entire Madden production at Daily’s Place Amphitheater, which is connected to EverBank Field. Since the two events were happening concurrently, the Jaguars provided SMT access to their router, allowing live camera views of warmups to be integrated into the Madden show throughout.
“The Jaguars [production] was the most unique of the three because it was on game day,” Bottitta explains. “They wanted to host it on the morning of what ended up being a very meaningful December football game for the Jaguars for the first time in a long time. Since the game-day crew was obviously busy, we did the whole show. We were taking Seattle and Jacksonville warming up on the field as bump-ins and bump-outs for our show, which was great and really captured the energy of the game.”
The Broadcast Mentality: Madden NFL Coverage Continues To Evolve
As the Madden NFL Club Championship grows (all 32 NFL franchises were involved for the first time this year, with prize money totaling $400,000 at this week’s Championship), the property has made an effort to boost its production value for live streams. Bottitta believes that SMT’s experience on A-level NFL productions, including Sunday Night Football and this weekend’s Super Bowl LII, was integral in the league’s selecting SMT: “I think that made a big difference: knowing that we weren’t just a group that’s doing one more esports tournament; this is a group that does professional sports production.”
He adds that VPS aims to leverage this broadcast-level expertise by bringing in such tools as replay systems and telestrators, which would be standard on an NFL telecast.
“We tried to bring a [broadcast] philosophy to these shows and want to make it more consumable for the viewers,” he says. “We brought telestrators and replay to all of the [productions], and that was not the norm when EA launched [the Club Championship] last year. I did that not only because SMT has a very portable, very easy-to-implement telestrator system but because it really adds to the show. If you went to a game and didn’t see replays or the key camera angles, you’d be in shock. So that became a big part of our production plan.”
January 19, 2018
Sports Video Group
As the Jacksonville Jaguars look to stymie the New England Patriots’ quest for a sixth Super Bowl victory, CBS Sports will cover this Sunday’s AFC Championship from every angle — including overhead.
CBS Sports will deploy 39 cameras in Foxborough, MA: seven super-slow-motion cameras, eight handhelds, and a Steadicam; pylon cams; and a collection of 4K, robotic, and Marshall cameras. The network will also have access to Intel 360 cameras for 360-degree replays. To give viewers an aerial view, CBS will rely on a dual SkyCam WildCat aerial camera system and fly a fixed-wing aircraft over Gillette Stadium.
The CBS Sports crew will work out of NEP SSCBS and have access to 152 channels of replay from 14 EVS servers — four eight-channel XT3’s and 10 12-channel XT3’s — plus a six-channel SpotBox and one 4K server.
CBS Sports’ lead announce team Jim Nantz, Tony Romo, and Tracy Wolfson will have plenty of storytelling tools at their fingertips, including SMT’s Next Gen Tele and play-marking systems with auto-render technology on both SkyCams. The lower SkyCam will focus on the actual game play at the line of scrimmage, including the quarterback’s point of view, while the upper SkyCam will provide a more tactical, “all-22” look at the field. During the AFC Championship, Romo will be able to use these tools to break down what he sees on the field for first and second replays.
Coverage begins at 2:00 p.m. ET with The NFL Today, featuring host James Brown and analysts Boomer Esiason, Phil Simms, Nate Burleson, and Bill Cowher at the CBS Broadcast Center in New York City; kickoff follows at 3:05 p.m. ET. Fans wanting to start their day even earlier can tune into The Other Pregame Show (TOPS) on CBS Sports Network, which runs from 10:00 a.m. to noon.
January 12, 2018
Sports Video Group
The Tennessee Titans travel to New England this weekend to take on the reigning Super Bowl champions in the AFC Divisional Round. To capture the action on the gridiron from every angle, CBS Sports will rely on dual SkyCam WildCat aerial camera systems with SMT’s Next Gen Tele and play-marking systems, as well as its virtual 1st & Ten line.
The Next Gen Tele System, which debuted during last year’s AFC Divisional Round, channels the NFL’s Next Gen Stats (NGS) data into an enhanced player-tracking telestrator. Combined with SMT’s proprietary play-marking system, which enables rendering of four virtual-player routes on the SkyCam video and its virtual 1st & Ten line, Next Gen Tele System provides a multitude of options for on-screen graphics that CBS Sports talent can leverage to better tell the story of the game.
“From a production standpoint, everything is about storytelling and conveying the story behind the game,” says Robbie Louthan, VP, client services and systems, SMT. “It’s handled in many different ways, but one way is obviously graphics. The advantage there is, you’re able to tell relevant, compelling information in a quick and succinct way without having to have the talent verbalize it to [viewers]. When you can get it reduced down to a graphic that is relevant to the viewer, you’re guaranteeing that the information you want to convey is being handled in a very quick, succinct manner, because there’s a very short time frame between plays.”
During Saturday’s game, SkyCam will focus the lower camera system on the actual game play at the line of scrimmage, showing the quarterback’s point of view. The upper system will provide more of a tactical, “all 22” look at the field. Both systems will feature SMT graphics that enhance their respective camera angles and roles.
“Our camera angle creates a view that helps tell the story better than other camera angles,” explains Stephen Wharton, CTO, SkyCam. “Our view just establishes the storytelling for those graphics better than any other camera can, and then, when you add the motion that our camera brings with it, it makes those graphics — whether NGS routes and lines or first-down markers — get placed very well within the angle of the shot, so that the story is being told.”
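Painting tracked routes onto SkyCam video ultimately means mapping positions on the field plane into image pixels for the camera's current pose. The snippet below is a bare-bones sketch of that ground-plane mapping using a homography; the matrix values are placeholders, the per-frame calibration is assumed to exist, and none of this is SMT's or SkyCam's actual pipeline.

```python
import numpy as np

def project_route(field_xy, H):
    """Map N field-plane points to pixel coordinates with a 3x3 homography H.

    field_xy: N x 2 array of (x, y) positions on the field plane (e.g. yards)
    H:        3 x 3 homography for the camera's current pan/tilt/zoom (recomputed every frame)
    """
    pts = np.hstack([field_xy, np.ones((len(field_xy), 1))])  # to homogeneous coordinates
    img = (H @ pts.T).T
    return img[:, :2] / img[:, 2:3]                           # back to pixel (u, v)

# Illustrative only: a receiver's tracked route in field yards and a placeholder homography.
route = np.array([[25.0, 26.6], [30.0, 24.0], [35.0, 20.0], [35.0, 12.0]])
H = np.array([[12.0, 0.5, 300.0],
              [0.2, -9.0, 950.0],
              [0.0, 0.001, 1.0]])
pixels = project_route(route, H)   # a polyline ready to be drawn over the SkyCam frame
```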
SMT will deploy four staffers to Gillette Stadium to support the graphics on the dual Skycam system: one operator to support the Next Gen Tele System, a dedicated operator for each of the camera systems, and one to oversee the operation and help produce the content. SkyCam will have a team of nine on the ground in New England, including five operators on the lower camera system (an engineer in charge, an assistant, a rigger, a pilot, and an operator responsible for the camera’s pan/tilt/zoom) and four on the upper camera system (an EIC, rigger, pilot, and PTZ operator).
The same system will return the following week during the AFC Championship Game, and similar systems will appear in other games throughout the NFL playoffs. And, while the action on the gridiron is sure to excite throughout the playoffs, the graphics overlaid on the dual Skycam system will only increase the level of storytelling that the talent can deliver and fans can expect.
“We’re excited about showing off a new way of using Next Gen Stats and really focusing on where the players are running, where the routes are, and creating that sort of Madden look, if you will,” says Wharton. “If you [look at the broadcasters, they’re] usually telestrating: they’re saying, Here’s this guy, and they draw the little yellow line of where he ran. Now we’re leveraging the NFL’s Next Gen Stats system to get that data to create the graphics with SMT and then overlay that from our angle. It creates a very compelling shot.”
Echoes Louthan, “It’s another tool in the toolkit for the announcers — in this case, for [analyst] Tony Romo to use graphics to help tell the story of what he sees. It has been exciting for us to work with Tony on fine-tuning these graphics to [enable] him to use his incredible insight into the game to tell the story.”
SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for sports broadcasts, and SkyCam, the company that specializes in cable-suspended aerial camera systems, are continuing to deliver technological innovations to CBS Sports’ broadcasts of the AFC playoff games, including Saturday’s Tennessee Titans vs. New England Patriots contest, Sunday’s Jacksonville Jaguars vs. Pittsburgh Steelers game and the AFC Championship on Jan. 21.
SMT will provide its Next Gen Tele system, an enhanced player-tracking telestrator that harnesses the power of NFL's Next Gen Stats data and SMT’s proprietary play-marking system to instantly render four virtual player routes on SkyCam video that’s available to the producer and talent at the end of every play. This “first-replay series, every replay” availability makes SMT’s system a true breakthrough in which NFL's Next Gen Stats data is able to drive meaningful content as an integral component of live NFL game production. The system debuted last year for the AFC divisional playoffs.
Using dual SkyCam WildCat aerial camera systems to enhance its broadcast, CBS Sports has made standard the “Madden-like” experience that gives football fans a more active and dynamic viewing experience behind the offense, revealing blocking schemes, defensive fronts and throwing windows and providing a deeper understanding of plays. Combined with SMT’s virtual 1st & Ten line solution placed from SkyCam images, viewers are experiencing the new, modernized look of NFL games. SMT, through its offices in Durham and Fremont, has supported CBS NFL broadcasts since 1996.
“Used in conjunction with SMT’s virtual technology, fans have embraced the enhanced coverage made possible with dual SkyCam systems, a look that younger viewers have come to expect in their games,” said Stephen Wharton, CTO, SkyCam. “With SkyCam, fans get the benefit of a more complete view of the action and play development – we place them right into the action in real-time. Sideline cameras force fans to wait for replays to get a sense of what receivers and quarterback were seeing. With SkyCam, no other camera angle is as immersive or engaging.”
“SMT’s ability to place virtual graphics from SkyCam opens up a plethora of possibilities for broadcasts in terms of augmented reality applications with advertising content, player introductions on the field, or a whole host of possibilities,” said Gerard J. Hall, CEO, SMT. “The potential with our technology is limitless.”
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious live events, including the Super Bowl, NBC Sunday Night Football, major golf and tennis events, the Indianapolis 500, the NCAA Tournament, the World Series, ESPN X Games, NBA on TNT, NASCAR events, and NHL games. SMT’s clients include major US and international broadcasters as well as regional and specialty networks, organizing bodies, event operators, sponsors and teams. The 31-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
Headquartered in Fort Worth, Texas, SkyCam is a leading designer, manufacturer and operator of mobile aerial camera systems. SkyCam plays a significant role in changing the way sporting events are broadcast in the world, appearing at marquee broadcast events, such as The NFL Super Bowl, NCAA Final Four, NBA Finals, Thursday Night Football, Sunday Night Football, NCAA College Football, 2015 CONCACAF Gold Cup and 2014 FIFA World Cup. SkyCam is a division of KSE Media Ventures, LLC
January 08, 2018
TV Technology
NEW ORLEANS—New Orleans Saints and Carolina Panthers receivers and quarterbacks weren’t the only ones concerned about what was in and out of bounds Sunday (Jan. 7) in New Orleans during the NFC Wildcard game.
Fox Sports, which telecast the game, walked a different sort of line with its playoff coverage—one that separates delivering the great shots needed to present game action from deploying new technology that actually gets in the way of coverage.
“We don’t want to make things all that different for the production team and give them a whole bunch of stuff that they haven’t had before for the big games,” says Mike Davies, SVP of Field and Technical Operations at Fox Sports. Rather, the strategy is to start with a “base layer” of production technology used throughout the 17 weeks of the regular season and then deploy choice pieces of technology that will have the biggest impact on game production and allow Fox Sports to tell the best story, he says.
“A lot of this stuff we’ve used before and some just this year,” says Davies. “We just pick the best of the best to represent us.”
For example, for the three NFL playoff games Fox Sports is covering, the broadcaster will add a second, higher SkyCam to deliver a drone’s-eye view of plays that captures all 22 players on the field. “Although you think of how over the top two SkyCams might sound, it turns out to be very useful,” says Davies. Fox Sports first used the dual SkyCam setup during the preseason and then again in Week 5 for the Packers vs. Cowboys game. “I think that camera angle is new enough that we are still learning what it can do,” he says.
The broadcaster recognized the upper SkyCam “was something special” in Week 5 during a play involving Cowboys running back Ezekiel Elliott. “He jumped over that pile and no camera, including the lower SkyCam, saw that he had reached out over the first-down line [except for the new upper SkyCam],” he says. “At least for that moment, we were sold that this is something special and something we wanted to offer.”
However, camera enhancements—both in terms of numbers and applications—aren’t limited to the second SkyCam. For its NFL playoff coverage, Fox Sports will deploy seven 8x Super Mo cameras, rather than the typical five. Fox also will use 6x Super Mo for its SkyCams, which it first did for its Super Bowl LI coverage in February 2017.
“There are so many replay opportunities in football, and the Super Mo gives this crisp—almost cinematic—look at the action,” says Davies.
The sports broadcaster also will take advantage of work it has done this year with SportsMEDIA Technology (SMT), SkyCam and Vizrt “to cobble together a recipe” to do augmented reality with the SkyCam, he says. Not only does the setup allow Fox Sports to put a live yellow line on the field of play with its SkyCam shots, but also to put graphic billboards and other 2-D graphics on the field and to fly around them with the SkyCam as if they were real objects.
“It’s a bit of an orchestration because the pilot of the SkyCam needs to be flying around the object as if it were an object on the field. If you break through it, it’s not going to look real,” says Davies.
Another enhancement is how Fox Sports will use its pylon cameras. Rather than pointing the pylon cams positioned at the front of the end zone down the field, Fox will rotate them so they look down the field at a 45-degree angle, says Davies.
“That gives you a way to cover a play where the camera is actually looking. Yes, you have the goal line, but you also have the out-of-bounds line as well,” he says. As a result, there are more game situations in which the pylon cameras can contribute to coverage. “The pylon cameras are a lot like catching lightning in a bottle. They are great, but you don’t want to use them unless you’ve got something that is really compelling,” says Davies.
While it is too soon to tell if the drop in viewership plaguing the league this season will carry over to the playoffs, Davies is confident that the right technology and production techniques have the potential to help fans reconnect with the game.
“I feel that what we are able to do using all of this incredible technology—the dual SkyCams, the Super Mo’s and the pylons—is that we are able to deliver that kind of experience in replay right after the play that also shows the emotions of players, not just what happens between the whistles,” he says.
Harkening back to his stint at HBO, Davies recalls the connection the cinematic style used for “Inside the NFL” created as “you watched a game that happened three or four days prior.” Today’s production tools give broadcasters that same opportunity to create that connection, he says. “I can’t help but think that these kind of storytelling tools, honestly, can only help,” says Davies.
The 2019 College Football Playoff National Championship concludes tonight at Levi’s Stadium in Santa Clara, CA. Like every other football game, it will feature two teams — in this case, Alabama and Clemson — and one broadcaster. For its part, ESPN is once again all-in for the big game, deploying more than 310 cameras to cover all the action and providing 17 viewing options via the MegaCast over 11 TV and radio networks and via the ESPN app.
“The thing that makes this event is the volume and magnitude of what we put behind it but also the time frame,” says John LaChance, director, remote production operations, ESPN. “[There are] other marquee events, which stand alone, but, with the volume and viewer enhancements being done here in a 72-hour window to get everything installed, this event [is] in a unique classification. Trying to integrate everything into place was a herculean effort.”
The game wraps up a season in which ESPN’s production team delivered more than 160 games to ABC and ESPN and more than 1,000 games to various other ESPN platforms.
“To watch that volume and make sure all the pieces are in place is a highlight for all of us, [seeing] it go from plan to working,” says LaChance. “You always have things that are challenges, but it’s about how quickly you can recover, and I think we’ve done it well.”
The core of ESPN’s production efforts will be done out of Game Creek Video’s 79 A and B units with Nitro A and B handling game submix, EVS overflow, 360 replay, robo ops, and tape release. ESPN’s team creating 17 MegaCast offerings is onsite, housed in Nitro, Game Creek’s Edit 3 and Edit 4 trailers, and TVTruck.tv’s Sophie HD. Game Creek Video’s Yogi, meanwhile, is on hand for studio operations, and Maverick is also in the compound. All told, 70 transmission paths (50 outbound, 20 inbound) will be flowing through the compound, and 40 miles of fiber and cable has been deployed to supplement what already exists at Levi’s Stadium.
Also on hand are Fletcher, which is providing robotics; BSI, handling wired pylons and RF audio and video; 3G, which is in charge of the line-to-gain PylonCam and the first-and-10–marker camera; Vicareo, with the Ref Cams; and CAT Entertainment, for UPS and power. SMT is on board for the 1st & Ten lines; PSSI, for uplink; Bexel, for RF audio and other gear; and Illumination Dynamics, for lighting.
“It’s a team effort,” says LaChance. “I couldn’t be prouder of the team we assembled here and the vendors, technicians, leads, and staff that have, over the course of the last several months and weeks when it gets to a fever pitch, put it all together.”
The Camera Contingent
A large part of the 300-camera arsenal consists of 160 4K DSLR cameras deployed for the 4D Replay system, which will provide definitive looks at every play from every angle. Those cameras are mounted around the stadium and, combined, provide images that can be merged on computers and enable an operator to zoom around a play and show any angle.
One place where the 4D system is poised to shine is the Red Zone. The 4D Replay team and ESPN have created templates that can cut the time needed to synthesize the images for plays around the goal line and pylons to eight seconds.
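Conceptually, a frozen-moment orbit is built by taking the frame closest to a chosen instant from each camera and stepping through those frames in angular order; a red-zone template then amounts to preselecting which arc of cameras to sweep. The sketch below illustrates that idea under assumed data structures and is not 4D Replay's actual software.

```python
from bisect import bisect_left

def orbit_frames(buffers, t, camera_arc):
    """Build a frozen-moment orbit: the frame nearest to time t from each camera in the arc.

    buffers:    camera id -> list of (timestamp, frame) tuples, sorted by timestamp
    camera_arc: camera ids in the angular order to sweep (a 'template' for, say, a goal line)
    """
    orbit = []
    for cam in camera_arc:
        stamps = [ts for ts, _ in buffers[cam]]
        i = bisect_left(stamps, t)
        # step back if the previous frame is closer to t (or if t is past the last frame)
        if i == len(stamps) or (i > 0 and t - stamps[i - 1] <= stamps[i] - t):
            i -= 1
        orbit.append(buffers[cam][i][1])
    return orbit

# Hypothetical template: the 60-camera arc covering the north goal line and its pylons.
RED_ZONE_NORTH = list(range(20, 80))
```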
Besides the 160 4D replay cameras, plenty of cameras are focused on the game action, including 90 dedicated to game coverage. Among those are 10 super-slo-mo cameras, nine 4K game cameras, 15 RF cameras, two SkyCams, and two aerial cameras in a blimp and fixed-wing aircraft. The vast majority of cameras are Sony models (mostly Sony HDC-2500 and HDC-4300 with one HDC-4800 in 4K mode) coupled with Canon lenses, including five 100X, two 95X, 21 wide-angle, and 14 22X and 24X lenses. Seven 86X lenses and a 27X lens are also in use.
The game-coverage cameras are complemented by specialty cameras. Four Vicario Ref Cams will be worn by the officials; a line-to-gain RF PylonCam will move up and down the sideline with the first-and-10 marker, which also has a camera; and eight PylonCams around the end zones provide a total of 28 cameras.
The RefCam is new this year, having been tested during last year’s final in Atlanta. The MarkerCam debuted last year, and LaChance says it has been improved: “It has a c360 Live camera in the target portion of the marker to give a 180-degree perspective in 4K. The operator can push in and get a great perspective; we are taking it to another level with the push in.”
A second c360 camera will also be in use on the second SkyCam, again giving the ESPN team the ability to zoom in and capture images.
Another exciting new offering is AllCam, a system designed by ESPN’s in-house team and ChyronHego. It stitches images from three 4K cameras placed alongside the all-22 camera position and gives the production team the ability to zoom in anywhere on the field to capture events that might have taken place away from the action. For example, in a test at a bowl game, the system was used to show an unnecessary-roughness violation that took place during a kickoff far from the other players, who were focused on the run-back.
“It’s another good example of the partnerships we have and working for a common goal,” says LaChance.
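At its simplest, a system like AllCam treats the three aligned 4K feeds as one very wide panorama and cuts a virtual-camera window out of it. The snippet below is a toy illustration of that crop-and-scale step; it assumes the feeds are already geometrically aligned (which the real system has to solve) and is not ChyronHego's or ESPN's actual implementation.

```python
import numpy as np
import cv2  # OpenCV, used here only for the final resize

def virtual_zoom(cams, center_x, center_y, window_w, window_h, out_size=(1920, 1080)):
    """Crop a zoom window from a panorama of pre-aligned, side-by-side camera frames."""
    panorama = np.hstack(cams)                     # e.g. three 2160x3840x3 frames -> one wide frame
    h, w = panorama.shape[:2]
    x0 = int(np.clip(center_x - window_w // 2, 0, w - window_w))
    y0 = int(np.clip(center_y - window_h // 2, 0, h - window_h))
    crop = panorama[y0:y0 + window_h, x0:x0 + window_w]
    return cv2.resize(crop, out_size, interpolation=cv2.INTER_LINEAR)

# Hypothetical use: zoom toward an incident far from where the main cameras are looking.
frames = [np.zeros((2160, 3840, 3), dtype=np.uint8) for _ in range(3)]
away_from_the_ball = virtual_zoom(frames, center_x=9800, center_y=600, window_w=1280, window_h=720)
```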
Beyond the game-coverage cameras, there are 20 cameras dedicated to the various MegaCast feeds, 29 for ESPN College GameDay, and nine for the SEC Network. ESPN Deportes also has two dedicated cameras.
All told, the production team will have access to 320 sources via 170 channels of EVS playback as well as 32 channels of Evertz Dreamcatcher playback. There are also two Sony PVW-4500 servers in use, a Sony BPU-4800 4K record server, and two c360 record servers.
Non-Stop Action — for the Production Team
The game wraps up a busy time for the production team as well as for those who work at Levi’s Stadium. LaChance credits Jim Mercurio, VP, stadium operations/GM, Levi’s Stadium, and Nelson Ferreira, director, technical operations, San Francisco 49ers, with being an important part of the process during the past year.
“It’s a solid venue and great group of folks to work with, and that helps,” says LaChance. “They have done the Super Bowl here, and they do a lot of great events, so they are well-equipped. We had to supplement with some fiber, but they had a great infrastructure to start with.”
As for the ESPN team, everybody worked on one of the two semifinals as well as an additional bowl game.
“Folks that did the Cotton Bowl headed on to the Sugar Bowl, and those that did the Orange Bowl headed to the Rose Bowl,” says LaChance. “A lot of the people here have been non-stop since the Christmas Day offerings for the NBA, then right into a semifinal assignment, then the second of the New Year’s bowl offerings, and then making their way here to Santa Clara for one of the largest events the company does every year.”
For anyone looking to see what the new toys will bring to the show, LaChance recommends tuning into the TechCast, which will have a sampling of everything that will be used, including 4D Replay, C360, and the RefCam.
“Besides the game itself,” he says, “tune into the TechCast. Hopefully, the weather is good for us, and we can offer the BlimpCast from the Goodyear airship, which is another opportunity to provide a unique look for viewers at home.”
2018 was one of the most eventful years for sports production in recent memory, with the 2018 PyeongChang Olympics and 2018 FIFA World Cup capturing the nation’s attention and annual events like the College Football Playoff National Championship Game, Super Bowl, NFL Draft, and others breaking production records and test-driving new technologies and workflows. As if there weren’t enough going on stateside, this year’s Road Warriors features an expanded look at what went on across the Atlantic. Here is Part 2 of SVG’s look at some of the sports-production highlights from the past year.
US OPEN
USTA Billie Jean King National Tennis Center, Flushing Meadows, NY
August 27–September 9
For ESPN, it simply doesn’t get bigger than US Open tennis. In the network’s fourth year as host broadcaster and sole domestic-rights holder — part of an 11-year rights deal — the technical and operations teams continued to evolve production workflows and add elements. Highlights this year included the debut of a Fletcher Tr-ACE/SimplyLive ViBox automated production system covering the nine outer courts and several new camera systems.
“This truly is the largest event that ESPN produces out of the thousands of events that we do all year,” said ESPN Director, Remote Operations, Dennis Cleary, “and it’s all done in a 3½-week span.”
For the first time, ESPN covered all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts. Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system was deployed on each court (with four robos per court) and relied on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic-camera operator and one ViBox director/producer covered each of the nine courts.
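One way to picture the single-operator workflow on the outer courts is an auto-director that cuts to whichever robotic camera sees the most motion and lets the scoring feed trigger graphics. The sketch below is purely an illustration of that idea; it is not how the Fletcher Tr-ACE, SimplyLive ViBox, or SMT graphics systems actually work, and the payload fields are made up.

```python
import numpy as np

def pick_camera(prev_frames, cur_frames):
    """Cut to the robo camera whose picture changed the most (a crude motion score)."""
    scores = [float(np.mean(np.abs(cur.astype(int) - prev.astype(int))))
              for prev, cur in zip(prev_frames, cur_frames)]
    return int(np.argmax(scores))

def on_score_update(event):
    """Turn a scoring-feed event into an automated graphics command (hypothetical fields)."""
    return {
        "template": "scorebug",
        "text": f'{event["server"]} to serve | Games {event["games"]} | Points {event["points"]}',
    }
```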
New this year was a two-point aerial CineLine system (provided by Picture Factory) running between Louis Armstrong Stadium and Court 10, a run of roughly 1,000 ft. After a successful debut at Wimbledon in June and the Australian Open in January, Telstra Broadcast Services’ NetCam made its US Open debut. The Globecam HD 1080i/50 POV miniature robotic camera was deployed on each side of the net for singles matches at Arthur Ashe Stadium, Armstrong, and the Grandstand, providing viewers with a close-up look at the action on the court. In addition, both Intel’s True View 360-degree camera system and the SpiderCam four-point aerial system returned to Ashe.
The US Open production compound was almost unrecognizable from five years ago, prior to ESPN’s taking over as host broadcaster. What had been a caravan of production trucks became two permanent structures housing ESPN’s NTC broadcast center and production/operations offices, along with two ultra-organized stacks of temporary work pods housing the TOC, vendors, international broadcasters, and ESPN’s automated production operation for the outer courts. NEP’s NCP8 was on hand for ESPN’s ITV operation (serving AT&T/DirecTV’s US Open Mix Channel), and NEP’s Chromium and Nickel were home to the USTA’s world-feed production. — JD
U.S. OPEN
Shinnecock Hills Golf Club, Shinnecock Hills, NY
June 14-17
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
Game Creek Video’s Encore production unit was at the center of the coverage for Fox and FS1, with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit handled production control for one of the featured groups, Edit 4 supported all iso audio mixes, and Edit 2 was home to five edit bays with equipment and support provided by Creative Mobile Solutions Inc. (CMSI). There was also the 4K HDR show, which was produced out of Game Creek Maverick.
“All the Sony HDC-4300 cameras on the 7th through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” noted Brad Cheney, VP, field operations and engineering, Fox Sports, during the tournament. There were also six Sony PXW-Z450’s for the featured holes and featured groups, the output of two of them delivered via 5G wireless.
In terms of numbers, Fox Sports had 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless units, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps), and 218 microphones. Tons of data was passed around: 3 Gbps of internet data was managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
Each course provides its unique challenges. At Shinnecock Hills, they included the roads running through the course, not to mention the hilly terrain, which also had plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch. RF wireless coverage was provided by CP Communications. There were 26 wireless cameras on the course, along with 18 wireless parabolic mics and nine wireless mics for on-course talent. CP Communications also provided all the fiber on the course. — KK
MLB ALL-STAR GAME
Nationals Park, Washington, DC
July 17
With its biggest summer drawing to a close with the MLB All-Star Game, Fox certainly showed no sign of fatigue technologically. Not only did the network roll out a SkyCam system for actual game coverage for the first time in MLB history, but Fox also deployed its largest high-speed–camera complement (including all 12 primary game cameras), two C360 360-degree camera systems, and ActionStreamer POV-style HelmetCams on the bullpen catcher, first-base coach, and Minnesota Twins pitcher José Berríos.
“People always used to say Fox owned the fall with NFL and MLB Postseason, but, this year, we owned May through July, too, with the U.S. Open, World Cup, and now All-Star,” said Brad Cheney, VP, field operations and engineering, Fox Sports. “The capabilities of our [operations] team here are just unsurpassed. For big events, we used to throw everything we had at it, and it was all hands on deck. That’s still the case, but now, when we have big events, everybody’s [scattered] across the globe. Yet we’re still figuring out ways to raise the bar with every show.”
Between game coverage and studio shows, Fox Sports deployed a total of 36 cameras (up from 33 in 2017) at Nationals Park, highlighted by its largest high-speed–camera complement yet for an All-Star Game. Building on the efforts of Fox-owned RSN YES Network, all 12 of Fox’s Sony HDC-4300 primary game cameras were licensed for high-speed: six at 6X slo-mo, six at 2X slo-mo. This was made possible by the ultra-robust infrastructure of Game Creek Video’s Encore mobile unit.
Fox also had two Phantom cameras running at roughly 2,000 fps (at low first and low third) provided by Inertia Unlimited and a pair of Sony P43 6X-slo-mo robos at low-home left and low-home right provided by Fletcher. Fletcher provided nine robos in all — including low-home Pan Bar robo systems that debuted at the 2017 World Series — and Inertia Unlimited provided a Marshall POV in both teams’ bullpen and batting cage.
CP Communications supplied a pair of wireless RF cameras: a Sony P1r mounted on a MōVI three-axis gimbal and a Sony HDC-2500 handheld. An aerial camera provided by AVS was used for beauty shots — no easy task in security-conscious Washington.
Inside the compound, a reshuffling of USGA golf events allowed Game Creek Video’s Encore mobile unit (A, B, and C units), home to Fox’s U.S. Open and NFL A-game productions, to make its first All-Star appearance.
The primary control room inside the Encore B unit handled the game production, and a second production area was created in the B unit to serve the onsite studio shows. — JD
The Open Championship
Carnoustie Golf Links, Angus, UK
July 19-22
Sky Sports used its Open Zone in new ways to get closer to both players and the public in its role as the UK live broadcaster from Carnoustie. On Thursday and Friday, Sky Sports The Open channel was on the air from 6:30 a.m. to 9:00 p.m. Featured Group coverage of the 147th Championships was available each day via the red button and on the Sky Sports website. Viewers could also track players’ progress in Featured Hole coverage on the red button, with cameras focusing on the 8th, 9th, and 10th holes. Sky Sports had a team of 186 people onsite in Carnoustie for The Open, which included Sky production and technical staff and the team from OB provider Telegenic. — Fergal Ringrose
WIMBLEDON
All England Lawn Tennis and Croquet Club, Wimbledon, UK
July 2-15
At 11:30 a.m. on Monday, July 2, coverage of the Wimbledon Championships went live from the AELTC, produced for the first time by a new host broadcaster. After more than 80 years under the BBC’s expert guidance, the host baton was passed to Wimbledon Broadcast Services (WBS), bringing production of the Championships in-house. Going live on that Monday was the culmination of two years of planning, preparation, and testing: a process that has allowed the AELTC to “take control” of the event coverage and provide international rightsholders with a better service as well as add some new twists, such as Ultra High Definition (UHD), a NetCam on both Centre Court and No.1 Court, and multicamera coverage of all 18 courts. — Will Strauss
FRENCH OPEN
Stade Roland-Garros, Paris
May 27–June 10
Tennis Channel was once again on hand in a big way at the French Open. The expanded coverage this year meant more than 300 hours of televised coverage for fans in the U.S. as well as 700 hours of court coverage via Tennis Channel Plus. The Fédération Française de Tennis (FFT) increased overall court coverage this year, and Tennis Channel made sure all of that additional coverage made it to viewers. Tennis Channel had approximately 175 crew members onsite, working across the grounds as well as in a main production-control room, an asset-management area, six announce booths, and a main set on Place des Mousquetaires. The production facilities were provided by VER for the fifth year. CenturyLink provided fiber transport to the U.S. via 10-Gbps circuits. — KK
The Professional Fighters League (PFL) and SMT (SportsMEDIA Technology) announced an exclusive, long-term technology partnership. Under the terms of the agreement, SMT will partner with the PFL to create proprietary technology that will measure real-time MMA fighter performance analytics along with biometric and positional data that will provide fans a live event experience across all platforms.
Starting in 2019, SMT will help power the PFL’s vision of the first-ever SmartCage. The SmartCage will utilize biometric sensors and proprietary technology that will enable the PFL to measure and deliver real-time fighter performance data and analytics, what the PFL is dubbing: Cagenomics. PFL fans watching linear and digital broadcasts of the league’s Regular Season, Playoff, and Championship events will experience a new dimension of MMA fight action with integration of live athlete performance and tracking measurements including: speed (mph) of punches and kicks, power ratings, heart rate tracking, energy exerted, and more.
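The announcement does not specify a data format, but the list of measurements suggests a simple per-strike record feeding the broadcast graphics. A minimal sketch, with every field name and unit assumed for illustration:

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class StrikeSample:
    """One hypothetical SmartCage reading for a single strike.
    Field names and units are assumptions for illustration only."""
    fighter_id: str
    strike_type: str     # e.g. "punch" or "kick"
    speed_mph: float     # strike speed, as cited in the announcement
    power_rating: float  # unitless power score
    heart_rate_bpm: int  # biometric reading at the moment of the strike

def round_summary(samples: list[StrikeSample]) -> dict:
    """Aggregate the per-fighter numbers a broadcast graphic might display."""
    summary: dict = {}
    for s in samples:
        entry = summary.setdefault(s.fighter_id, {"strikes": 0, "max_speed_mph": 0.0})
        entry["strikes"] += 1
        entry["max_speed_mph"] = max(entry["max_speed_mph"], s.speed_mph)
    return summary

if __name__ == "__main__":
    samples = [
        StrikeSample("red", "punch", 18.4, 71.0, 152),
        StrikeSample("red", "kick", 23.1, 84.5, 158),
        StrikeSample("blue", "punch", 17.2, 65.0, 149),
    ]
    print(round_summary(samples))
```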
“The Professional Fighters League is excited to be partnering with SMT to advance the sport of MMA. The PFL’s new SmartCage will revolutionize the way MMA fans experience watching live fights as next year every PFL fight will deliver unprecedented, real-time fighter performance data and analytics, biometric tracking, and an enhanced visual presentation of this great sport,” says Peter Murray, CEO, Professional Fighters League. “Not only will PFL fans benefit from our SmartCage™ innovation, but our pro fighters will now have access to new performance measurement data, analysis, and tools to help them train and compete. The PFL’s vision has always been two-fold: deliver the absolute best experience to fans and be a fighters first organization and with the SmartCage we will accomplish both.”
“SMT is thrilled to be collaborating with the Professional Fighters League’s forward-thinking innovation team to bring our latest and greatest technology to PFL events,” says Gerard J. Hall, Founder & CEO, SMT. “Starting in 2019, PFL fans will begin to see real-time, live, innovative technology that is unique to the PFL in the MMA space. SMT’s OASIS Platform will provide the PFL with a seamlessly integrated system that combines live scoring with real-time biometric and positional data to enhance the analysis, storytelling and graphic presentation of the PFL’s Regular Season, Playoffs and Championship events next season.”
The PFL 2018 Championship takes place on New Year’s Eve live from The Hulu Theater at Madison Square Garden and consists of the 6 world title fights in 6 weight classes of the PFL 2018 Season. Winners of each title bout will be crowned PFL World Champion of their respective weight class and earn $1M. The PFL Championship can be viewed live on Monday, December 31 on NBC Sports Network (NBCSN) from 7 to 11 pm ET in the U.S. and on Facebook Watch in the rest of the world.
Professional Fighters League
The Professional Fighters League (PFL) presents MMA for the first time in the sport format where individual fighters compete in a regular season, playoffs, and championship. The PFL Season features 72 elite MMA athletes across six weight classes, with each fighting twice in the PFL Regular Season in June, July, and August. The top eight fighters in each weight class advance to the single-elimination PFL Playoffs in October. The PFL Championship takes place on New Year’s Eve at Madison Square Garden, with the finalists in each of the six weight classes competing for the $10 million prize pool. The PFL is broadcast live on NBC Sports Network (NBCSN) and streamed live worldwide on Facebook Watch. Founded in 2017, the PFL is backed by a group of sports, media, and business titans. For more info, visit PFLmma.com.
SMT
SMT (SportsMEDIA Technology) is the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, providing clients with scoring, statistics, virtual insertion, and messaging for broadcasts and live events. For the past 30 years, SMT’s solutions have been used at the world’s most prestigious live sports events, including the Super Bowl, Indy 500, Triple Crown, major golf and tennis events, MLB’s World Series, the Tour de France, and the Olympics. SMT’s clients include sports governing bodies; major, regional, and specialty broadcast networks; event operators; sponsors; and teams. The 32-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
SMT is once again one of the busiest vendors on hand at the US Open, providing a cavalcade of technology to serve the USTA, broadcasters, spectators, athletes, and media onsite at the USTA Billie Jean King National Tennis Center (NTC). In addition to providing the much discussed serve clock, SMT — now in its 25th year at the Open — is providing scoring systems, scoring and stats data feeds, LED scoreboards, TV interfaces, IPTV systems, and match analysis.
“This event, just like any Grand Slam, is becoming a three-week event,” says Olivier Lorin, business development manager, SMT. “We have more and more recipients asking for data. Today, we’re actually sending 19 different data feeds to recipients for their own platform. Obviously, we have to get the authorization from the USTA, but then they use that for whatever.”
Countdown to the Serve
An on-court digital clock, similar to the shot clock in basketball and the play clock in football, counts down the allotted 25 seconds before a player must begin the serve (previously, the 20-second clock was visible only to the chair umpire).
After the USTA announced plans to display a countdown clock for this year’s tournament, SMT introduced the clock at ATP and WTA events leading up to the Open — most recently, in Winston-Salem, NC, and Cincinnati — to help players acclimate to it.
“The USTA has been looking to do the serve clock at the US Open for a few years, starting in 2016 with the Juniors and then the qualifiers as an experiment, which all went very well,” says Lorin. “The Australian Open and the French Open also did it in quallies, but the US Open wanted to be the first [Grand Slam] to do this for all events, and we were able to work with them to make that happen.”
The clock, visible to players and spectators alike, begins to tick down immediately after the chair umpire announces the score. The umpire will issue a time violation if the player has not started the service motion at the end of the countdown. The first time the clock hits zero before a player begins the motion, the player receives a warning. For every subsequent time, the player loses a first serve. SMT is driving umpire scoring on all 16 courts and offsite for Junior Qualifying (eight courts).
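The escalation rule described above (a warning on the first expiry, loss of a first serve on each one after that) is simple enough to sketch. The snippet below is illustrative only and is not SMT's scoring software:

```python
class ServeClock:
    """Illustrative tracker for the 25-second serve clock: the first expiry
    for a player draws a warning; every later expiry costs a first serve."""

    def __init__(self, limit_seconds: int = 25):
        self.limit_seconds = limit_seconds
        self.violations = {}  # player name -> number of prior expiries

    def clock_expired(self, player: str) -> str:
        """Return the penalty assessed when the clock hits zero for `player`."""
        count = self.violations.get(player, 0)
        self.violations[player] = count + 1
        return "warning" if count == 0 else "loss of first serve"

if __name__ == "__main__":
    clock = ServeClock()
    print(clock.clock_expired("Player A"))  # -> warning
    print(clock.clock_expired("Player A"))  # -> loss of first serve
```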
Lorin sees a benefit to TV in the five-minute warmup clock and the serve clock: “At least seven minutes [is saved], so the match is going to [end] on time more often.”
Serving the Media: IPTV and CCTV
SMT is also responsible for the infrastructure for the USTA’s CCTV, IPTV, and Media Room. The IPTV system for the Media Center at this year’s US Open is now “browser-independent.” It allows users to select and view up to five streams/videos at one time from any of the digitally encoded channels available on the 13-channel CCTV system. In addition, the system allows access to archived player interviews. The IPTV system also includes real-time scores, match stats, draws, schedule, results, tournament stat leaders, US Open history, and WTA/ATP player bio information.
“It’s a very slick interface, and the USTA has been very positive about it,” says Lorin. “Today, it is still under a controlled environment here at the US Open, but, if the US Open wanted to make this open to anybody on the outside, we could easily provide a solution for them to log in and have the same information, with the exception of live video.”
Automation Is Key to New Outer-Courts Coverage
A fixture at live-sports-broadcast compounds, SMT is also providing a variety of services to domestic-rights holder and host broadcaster ESPN, as well as other broadcasters onsite. ESPN is deploying an SMT automated-graphics interface as part of its new automated-production system for outer-court coverage, which relies on a Fletcher Tr-ACE motion-detecting robotic camera system and SimplyLive’s ViBox all-in-one production system.
An SMT touchpad at each of the 16 workstations is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The beauty of our system is that everything is automated and driven by the score notification of the umpire’s tablet,” says Lorin. “We have built up prematch graphics so we know that, when the umpire hits warmup on the tablet, a bio page for both players and a head-to-head graphic will appear, and then they’ll go to the match. When the match starts, the system is just listening to the score notifications, and we have built-in notifications for five aces and things like that. The only thing that is manual and left to the producer for that court is the set summary and the match summary for statistics.”
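SMT has not published its trigger logic, but the workflow Lorin describes amounts to listening for score notifications and firing a graphic whenever a preconfigured threshold is crossed. A rough sketch under that assumption; the notification fields, thresholds, and graphic labels here are invented for illustration:

```python
# Illustrative trigger loop for the automated graphics workflow described
# above. The event fields, thresholds, and graphic labels are assumptions,
# not SMT's actual feed schema.
THRESHOLD_GRAPHICS = {
    ("aces", 5): "5 ACES",
    ("aces", 10): "10 ACES",
    ("unforced_errors", 10): "10 UNFORCED ERRORS",
}

def handle_notification(event, stats, fired):
    """Process one umpire-tablet notification; return graphics to push to air."""
    graphics = []
    if event["type"] == "warmup":
        # Warmup notification triggers the prematch bio and head-to-head pages.
        graphics += [f"PLAYER BIO: {p}" for p in event["players"]]
        graphics.append("HEAD TO HEAD")
    elif event["type"] == "point":
        for stat in ("aces", "unforced_errors"):
            if event.get(stat):
                player = event["player"]
                stats[(player, stat)] = stats.get((player, stat), 0) + 1
                trigger = (stat, stats[(player, stat)])
                if trigger in THRESHOLD_GRAPHICS and (player, trigger) not in fired:
                    fired.add((player, trigger))
                    graphics.append(f"{player}: {THRESHOLD_GRAPHICS[trigger]}")
    return graphics

if __name__ == "__main__":
    stats, fired = {}, set()
    print(handle_notification({"type": "warmup", "players": ["A", "B"]}, stats, fired))
    for _ in range(5):
        out = handle_notification({"type": "point", "player": "A", "aces": True}, stats, fired)
    print(out)  # the fifth ace fires "A: 5 ACES"
```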
Also From SMT: Prize Money Report, LED Superwall, More
This year, SMT has updated its Official Prize Money Report, in which prize money is calculated and a report generated at the end of the tournament and distributed to media officials.
SMT also provides content for the massive outdoor LED Superwall at the main entrance of Arthur Ashe Stadium, displaying scoring-system content: schedules, results, matches-in-progress scores, custom information messages (for example, weather announcements). SMT designs the scoring graphics and provides live updates.
“One of the big things is, we rebranded the US Open package for 2018 with a new logo, a new font, and a new background,” says Lorin. “As a result, we had to apply those design changes across all the platforms we are serving. One of the things we try to do more and more in the video production is, instead of having the typical headshot of a player, to integrate more action shots and motion shots, which are a lot more appealing to the design.”
Other services SMT provides to the US Open on behalf of USTA include stats entry on seven courts; serve-speed systems and content on seven courts; playback controls, including lap selector and data-point scrubbing; draw creation and ceremony; and match scheduling.
For the first time, ESPN is covering all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts at the USTA Billie Jean King National Tennis Center (NTC). Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system has been deployed on each court (with four robos per court) and relies on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic camera operator and one ViBox director/producer are covering each of the nine courts.
“With one production room and one rack room here, we are essentially replacing what would have traditionally been nine mobile units,” notes ESPN Director, Remote Operations, Dennis Cleary. “We’ve been working on this plan for a long time, and there is just no way we would have been able to cover all these courts in a traditional [production model]. SimplyLive has been used at other [Grand Slams], and it was used with Fletcher Tr-ACE at Wimbledon but not really to this extent. We feel that we have taken it to the next level [and] are integrating it with our overall [show] and adding elements like electronic line calling and media management.”
With all 16 courts now accessible, ESPN can present true “first ball to last ball” live coverage across its linear networks and the streaming platforms (a total of 130 TV hours and 1,300 more streaming on the ESPN app via ESPN3 and ESPN+). Moreover, ESPN was able to provide the USTA with live coverage of last week’s qualifying rounds for the first time, deploying the Tr-ACE/ViBox system on five courts.
In addition, ESPN, which serves as the US Open host broadcaster, has been able to provide any rightsholder with a live feed of a player from its country — regardless of the court and including qualifying rounds.
On the Outer Courts: LiDAR Drives Fletcher Tr-ACE System
Four Fletcher robotic systems with Sony HDC-P1 cameras have been deployed on each of the nine outer courts: two standard robos (traditional high play-by-play and reverse-slash positions) and two Tr-ACE automated robos (to the left and right of the net).
“From the beginning, one of ESPN’s big focuses was increasing the camera quality of what was being done on the outer courts,” says Fletcher Sports Program Manager Ed Andrzejewski. “So we built everything around the Sony P1’s to increase the camera quality to match the main [TV courts]. When they send a feed to the rightsholder in Australia and the player they are interested in is on one of those outer courts, they want the basic quality to be the same as in the bigger stadiums. I think we’ve been able to accomplish that.”
Between the two Tr-ACE cameras is “the puck,” which powers the Tr-ACE system at each court via a custom-designed LiDAR (Light Detection and Ranging) image-recognition and -tracking system. The LiDAR tracks every moving object on the court (the ball, players, ball kids, judges) and provides the two Tr-ACE cameras with necessary data to automatically follow the action on the court. The LiDAR can also sense fine details on each player (such as skin tone or clothing color), allowing the cameras to tell the difference between a player and other moving objects.
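Fletcher has not detailed the tracking pipeline, but conceptually the puck's job is to classify the tracked objects and hand each Tr-ACE camera a target to frame. A simplified sketch of that idea, with the class names, coordinate convention, and framing rule all assumed:

```python
# Simplified, assumed model of how a court-level tracker might hand targets
# to two automated net-side cameras; this is not Fletcher's actual Tr-ACE logic.
from __future__ import annotations
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackedObject:
    object_id: int
    kind: str   # "player", "ball", "ball_kid", "official"
    x: float    # meters along the baseline
    y: float    # meters along the sideline; convention here: net at y == 0

def camera_targets(objects: list[TrackedObject]) -> dict[str, Optional[TrackedObject]]:
    """Assign each net-side camera the player closest to the net on its half of the court."""
    players = [o for o in objects if o.kind == "player"]

    def closest_to_net(side: list[TrackedObject]) -> Optional[TrackedObject]:
        return min(side, key=lambda p: abs(p.y), default=None)

    near_side = [p for p in players if p.y < 0]
    far_side = [p for p in players if p.y >= 0]
    return {"net_cam_left": closest_to_net(near_side),
            "net_cam_right": closest_to_net(far_side)}

if __name__ == "__main__":
    frame = [
        TrackedObject(1, "player", 2.0, -8.0),
        TrackedObject(2, "player", -1.5, 9.5),
        TrackedObject(3, "ball_kid", 5.0, -12.0),
        TrackedObject(4, "ball", 0.3, 1.1),
    ]
    print(camera_targets(frame))
```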
A Room of Its Own: Nine Mobile Units in a Single Room
ESPN has erected a dedicated production room for the Tr-ACE/ViBox operation across from its NTC Broadcast Center. Inside this room are nine workstations featuring one Fletcher Tr-ACE camera operator and one ViBox director/producer each.
The Tr-ACE operator monitors the camera coverage and can take control of any of the four cameras at any point during the match. Meanwhile, the ViBox operator cuts cameras and rolls replays. An SMT touchpad at the workstation is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The camera op and director are constantly communicating,” Andrzejewski explains. “ESPN put a lot of trust in us with this, so we brought out the best people we could and have some of the best [robo operators] in the business here. There was a lot of onsite learning, but we were able to give everyone lots of time on the system during setup and qualifying.”
The coverage does not feature commentary, so all nine courts are being submixed out of a single audio room using a single Calrec audio console and operator.
Also inside the automated production room are a video area to shade all 36 cameras, an SMT position to manage the automated graphics systems deployed at each workstation, an electronic line-calling position (which was not available for the systems at Wimbledon), and a media-management area, which was used during qualifying to record all five courts (this operation moved to the NTC Broadcast Center once draw play began on Monday).
Since the automated-production systems had to be up and running for qualifying rounds last week, ESPN built the operation on an island entirely separate from the Broadcast Center.
“It was just too costly and just not sensible to bring the full broadcast center up a week early,” notes Cleary. “So this entire operation is all standalone. All the equipment from Fletcher, SimplyLive, Gearhouse, and even transmission is all separate and on its own.”
Two-Plus Years of Development Pays Off
Although automated production is nothing new for the US Open — Sony Hawk-Eye technology had been used for several years to produce coverage from five outside courts — this new system has expanded the ability to truly cover every ball of the tournament.
Use of the Tr-ACE/ViBox system at Wimbledon in June and now at the US Open was a long time coming. Fletcher has been developing the Tr-ACE system for 2½ years and demonstrated it offline on one court at the NTC last year. In addition to the Fletcher and SimplyLive teams, ESPN Senior Remote Operations Specialist Steve Raymond, Senior Operations Specialist Chris Strong, and Remote Operations Specialist Sam Olsen played key roles in development of the system and its implementation this week.
“This is certainly a new workflow for us, so a lot of thought and time went into it before we deployed it,” says Olsen. “We felt that the ViBox and the Tr-ACE would certainly give us the ability to produce a high level of content using an automated [workflow], and it’s worked out really well thus far. Having it for the qualifying rounds for the first few days also served as a great test bed. I think the best way to put it is, we’ve grown into it, and we’ll develop it and take it to a higher level each time we use it.”
By Jason Dachman, Chief Editor, SVG
Thursday, August 2, 2018 - 2:52 pm
After a move from Los Angeles to Madison, WI, prior to last year’s event, the CrossFit Games production operation has continued to grow prodigiously. The “Woodstock of Fitness” has grown from a production comprising 35 crew members working out of a single mobile unit just six years ago to one of the largest live productions on the annual sports calendar: more than 10 NEP mobile units, a crew of more than 300, and 50-plus cameras. Add in the fact that the CrossFit competitions change from year to year, and it becomes clear just how challenging the event can be for the production team.
This year’s CrossFit Games — Aug. 1-5 at the Alliant Energy Center in Madison — are being streamed on Facebook, CBSSports.com, and the CBS Sports App and televised live on CBS (one-hour live look-ins on Saturday and Sunday plus a recap show) with a daily highlights show on CBS Sports Network.
CrossFit has its own live-streaming team onsite and handles in-house production for the videoboards at Alliant Energy Center. SMT, which is CrossFit’s scoring partner, provides a wealth of presentation options for the boards as well.
CrossFit has used TVU Networks bonded-cellular and IP systems for several years for point-to-point transmission. This year, CBS Digital also used a TVU system to take in streams from the CrossFit Regionals earlier this summer. That success led to a similar partnership for the Games, with CBS Digital receiving all the live competitions on two streams via TVU receivers.
As CrossFit Games’ Footprint Grows, So Does the Live Production
The Games themselves have expanded and become more complex. The production team is tasked with covering multiple venues throughout Alliant Energy Center, primarily The Coliseum and North Park Stadium. This year, the stadium has been expanded to hold 10,000 people (nearly 50% more than for the 2017 edition) and has added a new videoboard.
July 18, 2018
Sports Video Group
SMT was back at MLB All-Star in Washington, providing Fox Sports its live virtual–strike-zone system and, for the 14th consecutive year, virtual signage.
SMT rendered the virtual–strike-zone graphic, as well as the watermarks when viewers saw the ball cross the plate.
SMT’s Peter Frank was on hand at 2018 MLB All-Star to support Fox Sports’ virtual efforts.
SMT handled virtual signage behind the plate for Fox’s Camera 4 (the primary pitcher/batter camera) and tight center field. For the third year in a row, the company also integrated its system with the high-home position, inserting virtual signage on the batter’s eye in center field.
“We use stabilization for virtual signage on the main camera, which is used for the virtual strike zone, so that helps out with the stability of both graphics,” said SMT Media Production Manager Peter Frank. “Two years ago at MLB All-Star in San Diego was the first time we did [virtual signage on] the batter’s eye, and Fox was really happy with it. So we also brought it back in
July 5, 2018
Sports Video Group
After a successful pilot game last year, the American Flag Football League (AFFL) is back in action this summer with the U.S. Open of Football (USOF) Tournament. The final 11 games of the tournament kick off NFL Network’s AFFL coverage, and the network is embracing the “Madden-style” coverage and the production elements it debuted last year, including using a SkyCam as the primary game angle, deploying RF Steadicams inside the huddle, rolling out customized SMT virtual graphics across the field, and miking players throughout the game.
“After last year’s pilot show, there was a lot of great feedback. Everybody liked the football on the field and the direction the technology was going,” says Johnathan Evans, who served as executive producer and director of last year’s production and is directing the NFL Network telecasts this year. “So our coverage is going to be almost exactly the same as last year, with a few differences since we are doing 11 games instead of just one. We have come up with a great formula that hasn’t been tried on a consistent basis before and offers a different perspective from watching a [traditional] football broadcast. With [AFFL], you’re watching from the quarterback perspective; you’re watching it just like you’re playing a Madden NFL [videogame].”
How It Works: Breaking Down the AFFL Format
The 12 teams featured in the USOF Playoffs are composed of eight amateur squads in the America’s Bracket (derived from four rounds of play that began with 128 teams) and four teams captained by celebrities in the Pro Championship Bracket. NFL Network’s USOF coverage began with the America’s Bracket Quarterfinal last weekend from Pittsburgh’s Highmark Stadium and continues with the semifinals this weekend at Atlanta’s Fifth Third Bank Stadium, the America’s Bracket Final and Pro Bracket Final on July 14 at Indianapolis’s Butler Bowl, and the $1 million Ultimate Final (featuring both bracket champions) on July 19 at Houston’s BBVA Compass Stadium.
The 11 AFFL telecasts on NFL Network will feature SMT virtual graphics, including the Go Clock.
The 7-on-7, no-contact 60-minute AFFL games feature many of the same rules that average Americans know from their backyard games. The same players are on the field for both offense and defense, and a team must go 25 yards for a first down. There is no blocking; instead, a “Go Clock” indicates when the defense can rush the QB (after two seconds) and when the QB must release the ball or cross the line of scrimmage (four seconds). There are also no field goals (or uprights, for that matter), and kickoffs are replaced with throw-offs.
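Functionally, the Go Clock reduces to two thresholds on a single timer. A minimal sketch of that logic, assuming only the elapsed time since the snap is available; this is an illustration of the rule, not SMT's implementation:

```python
def go_clock_state(seconds_since_snap: float) -> str:
    """Map time since the snap to the Go Clock states described above:
    the defense may rush after 2 seconds, and the quarterback must throw
    or cross the line of scrimmage within 4 seconds."""
    if seconds_since_snap < 2.0:
        return "protected: defense may not rush yet"
    if seconds_since_snap < 4.0:
        return "rush on: defense may cross the line"
    return "expired: QB must have thrown or run"

if __name__ == "__main__":
    for t in (0.5, 2.5, 4.2):
        print(f"{t:.1f}s -> {go_clock_state(t)}")
```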
“This is not only a sport that creates a lot of intensity and energy; it’s also a sport that you as an average person can relate to because you’re watching an average person play the game,” says Evans. “You’re not watching professional athletes. You’re watching amateurs playing a sport that you can play at home. That is something that every single viewer can relate to.”
Inside the Production: It’s All About Access
By using the SkyCam for play-by-play, RF Steadicams on the field, and player mics, the AFFL and NFL Network are focused on providing fans unprecedented up-close-and-personal access to the action on the field.
“We’re most excited about having SkyCam as our game camera, which really adds a different perspective, and also having everybody miked up so we can hear everything that’s going on and listen in,” says producer Tom McNeely. “We’re focused on making [viewers] feel like they’re right there on the field with these guys. Bringing them into the huddle with our cameras and microphones — we will have somebody sitting in the truck with a mute button in case the players are a little rambunctious — is going to make this really appealing and fun.”
The upcoming NFL Network AFFL productions will deploy Game Creek Video mobile units and feature an average of eight cameras: the SkyCam system, two traditional 25-yard-line angles for isos, a mid-level end-zone angle, one handheld high-speed camera, a jib on a cart roving the sidelines, and two RF cameras (Steadicam and a MōVI).
“The only new camera we are adding is a second [RF camera] so we can cover both sides of the football,” says Evans. “Last year, we had only one Steadicam, which was great, but I realized that we were losing the intimacy on both sides of the ball. Before you get to the red zone, it’s great to be inside the huddle and see from behind the quarterback on the offensive side of the ball. But, once you get to the red zone, you need to get ready for a touchdown, so you have to switch your Steadicam to the defensive side of the ball, and you hope to get a touchdown in the end zone. This time, in Indianapolis and in Houston, we’re going to have a Steadicam on both sides of the ball to retain the potential atmosphere for every single play. Before the snap, during the snap, and after the snap, you’re going to have that great intensity right in your face the entire time.”
Go Clock Returns; Interactive Line of Scrimmage Debuts
The Go Clock, designed by SMT specifically for the fast-paced AFFL, is also back after playing a major role in defining the league’s production style during its pilot game. The system synchronizes with in-stadium displays to indicate when the defense can rush the quarterback.
“The Go Clock was a big success, and we’re bringing it back this year,” says Evans. “We’re also introducing a line of scrimmage that will change color when [the defense] is able to rush. So the virtual graphics are still there and play a big role [in the production].”
The same SMT virtual 1st & Ten line used in NFL broadcasts will be deployed from the company’s Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used several times by NBC Sports last NFL season.
SMT’s Design Studio also designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package.
SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats (the same system used for NCAA and NFL games).
In addition to the virtual elements, the AFFL has enhanced the physical first-down marker used on the field, so that it digitally displays the down, play clock, game clock, and possession arrow. The system also emits an audible alert when the rusher can break the line of scrimmage after two seconds and when the quarterback has to throw the ball after four seconds.
Beyond the Tech: Storytelling, NFL Network Integration
Aside from the production elements, the AFFL also offers a host of great storytelling opportunities surrounding the squads of Average Joes on the field. McNeely, who knows a thing or two about telling the stories of unknowns on the field, having produced a dozen Little League World Series for ESPN, sees the AFFL as a one-of-a-kind storytelling opportunity.
“These aren’t pro names or pro teams; you’re starting from scratch telling those stories. There are a lot of great stories and personalities with layers — [such as] a 50-year-old, 5-ft.-8 quarterback with a potbelly leading the team from Tennessee or one of the amazing athletes who fell short of the NFL but played in the CFL or the Arena League,” says McNeely. “When I first met [AFFL CEO/founder] Jeff Lewis, who has worked so closely with Jonathan and all of us to develop this, he mentioned what a huge fan he was of the Little League World Series. And he promised us all the access we needed so that we would be able to introduce these players and tell their stories.”
NFL Network’s commitment to the AFFL goes well beyond just televising 11 games, however. Not only do the telecasts feature NFL Network talent like Good Morning Football’s Kay Adams (serving as sideline reporter throughout the tournament) and NFL Total Access host Cole Wright (calling play-by-play on July 14), the network is also incorporating AFFL segments into its daily studio programming, social-media channels, and digital outlets in an effort to appeal to football-hungry fans during the NFL offseason.
“We really feel like there’s a huge opportunity here during the summer, when the NFL really has nothing going on,” says McNeely. “We’re excited to see some traction with social media and on the NFL Network. They are doing a lot to promote [the AFFL] on their studio shows, and we’re hoping it takes off. I think there will be a grassroots push for this similar to what you’ve seen with the Little League World Series.”
June 29, 2018
Sports Video Group
While the broadcast debut of Dale Earnhardt Jr. in the NASCAR on NBC booth is creating plenty of buzz around NBC’s first races of the season this weekend at Chicagoland Speedway, the uber-popular retired driver isn’t the only new addition to the network’s NASCAR coverage this year. Echoing its rink-side “Inside the Glass” position on NHL coverage, NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box frame that will be located along pit road for pre- and post-race coverage at each speedway throughout the season.
NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box located on pit road – for its NASCAR pre/post-game shows
“The Peacock Pit Box is going to put us in the middle of the action,” says NBC Sports Group Executive Producer Sam Flood. “We’ve had the big set down on the grid for the first three years of [our NASCAR rights] contract. We realized that sometimes the fans departed from that area as we got closer to race time and took away some of the sense of place. So the idea was to have a real sense of place throughout the day, starting with the pre-race show. And most importantly, it gives us a place inside that mayhem that is pit road, which has become one of the most exciting places at the racetrack each week.”
Inside the Peacock Pit Box: Two Levels With Plenty of Tech Firepower
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road. In addition to serving as the home to NASCAR on NBC’s pre-race coverage throughout the season, the structure also features an arsenal of robotic cameras that will aid in NBC’s coverage of pit road throughout each race.
“Sam [Flood] and Jeff [Behnke, VP, NASCAR production, NBC Sports Group] first had the vision, and then there were a lot of great creative and technical people that helped to bring it to life,” says NBC Sports Technical Manager Eric Thomas. “They wanted to give our announcers a unique vantage point of the field of play – and that’s obviously pit lane. It’s like the 50-yard line in football or center ice in hockey. Our [announcers] will have an elevated position between all the teams right in the middle of the action, so they not only can see the racetrack but also see the competitors on either side of them.”
The NASCAR on NBC team worked with the NBC Sports Group design team in Stamford, CT, to design the Peacock Pit Box, while Nitro Manufacturing built the structure and Game Creek Video provided technical support and equipment.
The top level of the Peacock Pit Box will serve as the primary home for NBC Sports’ Monster Energy NASCAR Cup Series and Xfinity Series pre- and post-race coverage, with host Krista Voda and analysts Kyle Petty and Dale Jarrett occupying the desk. One handheld and three robotic cameras will be on hand for pre/post-race shows.
“It’s a nice dance floor that can support our announcers and various different configurations,” says Thomas. “We have to work within the space of the pit stall, which depends on the track. We have neighbors on either side of us, so we want to really be respectful of the teams and not interfere with them whatsoever. So we’re going to fit in our space very neatly and very cleanly without having an impact on the actual event. We wanted to make it as big as we could to make our announcers as comfortable as possible and also provide the technical equipment to produce a quality show.”
Meanwhile, the lower level of the Pit Box will provide additional broadcast positions with two wired cameras and, occasionally, an RF camera and/or a small jib (depending on the size of the pit box at each track). The space features interactive displays and a show-and-tell position for analysts like Daytona 500-winning crew chief Steve Letarte to deliver deeper analysis of the track action.
“The technology will be there for Steve to [provide deeper analysis], particularly in the Xfinity races, where he’s going to be hanging down on pit road in a pit box, restarting his old career of looking at the race when you only can see half the racetrack on pit road,” says Flood. “We think by [locating] Steve [there], it will give him more opportunity to focus that unique mind of his on what the heck all the other cars are doing on the track. So we see that as a huge advantage.”
The lower level also features a patio position where NBC will look to conduct interviews with drivers, pit crew chiefs, owners, and NASCAR officials throughout its race coverage.
All About Flexibility: Nine Robo Positions Give NBC Plenty of Options
Since NBC’s pre- and post-race setup will vary week to week depending on the track, Thomas and company were tasked with making the Peacock Pit Box as versatile as possible. With that in mind, the upper level features nine different robotic camera positions. Three robos can be deployed at a time and – thanks to the small, lightweight cameras and custom-developed camera mounts deployed on the Pit Box – the operations team can quickly swap camera positions at any time during NBC’s coverage.
Beloved NASCAR driver Dale Earnhardt Jr., who retired after last season, makes his broadcast debut as a NASCAR on NBC analyst this weekend at Chicagoland.
“If our director wants to change the shot or we want to totally rotate 180 degrees, we can do that in about 10 minutes,” says Thomas. “If we want to do a show with the track in the background first and then, a few minutes later, we want to look toward the garage with a different set of announcers, we can move the cameras quickly and make that happen. So it’s very flexible.”
In addition to being used for pre- and post-race studio coverage, these robos will be utilized for coverage of the action on pit road throughout NASCAR on NBC telecasts.
“The cameras are going to pull double duty because, if something’s going on in pit lane, those cameras are still going to physically be there. So they are going to give us some different angles that we haven’t seen very much of in the past,” says Thomas. “We’ve tried to create as much flexibility as possible so when Sam and Jeff ask, ‘can we do this?’, then we can say, ‘of course you can.’”
BatCam Returns: Aerial System Headlines NBC’s Army of Cameras
NBC Sports will deploy an average of 55 cameras – including the return of the BatCam point-to-point aerial system to cover the backstretch – on big races at Daytona, Indianapolis, and Homestead-Miami this season. Thomas also expects to use BatCam, which debuted last year and can hit speeds of over 100 mph, at the Watkins Glen road course this year. The BatCam also drew rave reviews throughout NBC’s Triple Crown coverage this past spring.
NBC Sports is bringing back the BatCam point-to-point aerial system to cover the backstretch at NASCAR races.
The bulk of NBC’s camera complement for NASCAR is made up of Sony HDC-4300’s along with a mix of robos (provided by Robovision) and roving RF cameras. BSI will once again be providing eight RF in-car-camera dual-path systems, which allow two angles to be transmitted from each car at any given moment. Thomas also says his NASCAR on NBC team is currently experimenting with several new camera positions, which he expects to roll out throughout the season.
Going Inside the Action With New Graphics, Analysis Tools
NBC is utilizing SMT’s tools for the fourth straight NASCAR season. This year, the SMT race crawl has been updated to show the live running order and driver statistics at the traditional position on top of the screen and in a new vertical pylon display on the left side. The multiple options provide production with a variety of ways to allow fans to track each driver.
Also new this year is the SMT GOTO interactive touchscreen display, which provides several tools NBC can use throughout each race weekend, giving on-air analysts the ability to telestrate highlights, compare drivers and statistics, and interact with fans on social media.
SMT’s new Broadcast Analytics system has also been added to help enhance the coverage. The system live-tracks all the cars during each session and allows production to show a virtual replay of any lap run by any driver during practice, qualifying, or the race. It can provide a combined display of how a single driver ran on different laps, showing the changes made during the session, and it can also show how different drivers ran the same lap. All of these options will allow fans to see the key moments of each session and better understand how they affected where each driver finished.
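SMT has not published how Broadcast Analytics stores its tracking data, but the comparisons described above imply time-and-distance samples keyed by driver and lap. A minimal Python sketch of that idea, with the sample format and numbers invented purely for illustration:

```python
# Assumed data shape for lap comparison: each lap is a list of
# (track_distance_m, elapsed_s) samples. Not SMT's actual format.
def time_at_distance(lap, distance_m):
    """Linearly interpolate elapsed time at a given distance around the lap."""
    for (d0, t0), (d1, t1) in zip(lap, lap[1:]):
        if d0 <= distance_m <= d1:
            return t0 + (distance_m - d0) / (d1 - d0) * (t1 - t0)
    return lap[-1][1]

def gap_at_distance(lap_a, lap_b, distance_m):
    """Time gap (seconds) between two laps at the same point on track,
    the basis of a 'different drivers, same lap' comparison graphic."""
    return time_at_distance(lap_b, distance_m) - time_at_distance(lap_a, distance_m)

if __name__ == "__main__":
    qualifying_lap = [(0, 0.0), (800, 11.8), (1600, 24.1), (2400, 36.5)]
    race_lap = [(0, 0.0), (800, 12.3), (1600, 25.0), (2400, 37.9)]
    print(f"gap at 1,600 m: {gap_at_distance(qualifying_lap, race_lap, 1600):+.2f} s")
```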
In the Compound and Back Home in Stamford
Game Creek Video’s PeacockOne (A and B units) will once again serve as the home to the NASCAR on NBC production team on-site, while an additional pair of Game Creek trucks will house mix effects and editing, as well as robo operations and tape release. In all, NASCAR truck compounds will be stocked with an average of 19 trailers (including BSI, Sportvision, NASCAR operations, and more).
“NASCAR does a great job setting up the compounds for us and providing a beautiful sandbox for us to play in,” says Thomas.
In addition, the NBC production team continues to rely more and more on file-sharing with the NBC Broadcast Center in Stamford, CT. AT&T and PSSI have partnered to establish fiber connectivity at the majority of the NASCAR tracks and will provide NBC with a circuit back to Stamford for file transfer, as well as for home-running individual cameras for at-home productions. Pre- and post-race shows from the Peacock Pit Box will regularly send cameras back to a control room in Stamford, where the show will be produced.
“We started [producing shows out of Stamford] last year and we will expand it more this year,” says Thomas. “It worked well last year and we’re making some improvements this year to make it even more seamless. With the increased support from AT&T and PSSI for network connectivity, I think it’s going to be even better this year. Obviously there are big cost savings on travel [as a result], but the product is of the same quality – so it’s really a win-win.”
SMT (SportsMEDIA Technology) continues its collaboration with the American Flag Football League (AFFL) to provide game management technology for the AFFL’s first U.S. Open of Football Tournament (USOF). The teams playing in the Ultimate Final at BBVA Compass Stadium in Houston will battle for a $1 million cash prize. SMT technical teams will be onsite at the USOF Tournament for every game, providing the customized virtual and clock-and score technology and graphics package that helped to define the league last year during its launch on June 27 at Avaya Stadium. Retired NFL stars return to the field to captain the teams, along with basketball legends and an Olympic gold medalist. SMT’s virtual 1st & Ten line system, used in NFL broadcasts, will be deployed from its Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used during NBC Sports’ 2017 season. SMT’s virtual Go Clock, designed specifically for the fast-paced AFFL, will synchronize with in-stadium displays to indicate when the defense can rush the quarterback.
SMT’s Design Studio designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package. SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats, the same system SMT uses for NCAA and NFL games. “SMT is proud to have helped the AFFL launch a new sports era, and we are thrilled to build on last year’s great success by offering flag football fans the same platform they’re used to when watching college and NFL games,” says Ben, SMT Business Development Manager. “With the debut of our dual-channel SportsCG, we can decrease the production bottleneck associated with rendering graphics on-air, allowing the quickly developing storylines to be told in a more dynamic way.”
June 17, 2018
Sports Video Group
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
The front-bench area in Game Creek Video’s Encore truck is at the center of Fox Sports’ U.S. Open coverage.
Game Creek Video’s Encore production unit is at the center of the coverage for Fox and FS1 with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit is handling production control for one of the featured groups, Edit 4 is handling all iso audio mixes, and Edit 2 is home to five edit bays with equipment and support provided by CMSI. And there is also the 4K HDR show, which is being produced out of Game Creek Maverick.
“All the Sony 4300 cameras on the seventh through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” says Brad Cheney, VP, field operations and engineering, Fox Sports. There are also six Sony PXW-Z450’s for the featured holes and featured group, the output of two of them delivered via 5G wireless.
“We are producing two 4K HDR shows out of one mobile unit with four RF-based 4K cameras,” he adds. “That is another big step forward.”
In terms of numbers, Fox Sports has 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps) and 218 microphones. Tons of data is being passed around: 3 Gbps of internet data is managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
A Second Compound
Each course provides its own unique challenges. At Shinnecock Hills, there are the roads running through the course, not to mention the hilly terrain, which also has plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
Director, Field Operations, Sarita Meinking (left) and VP, Field Operations and Engineering, Brad Cheney are tasked with keeping Fox Sports’ U.S. Open production running smoothly.
“We came out here 18 months ago,” says Cheney, “and, when we placed all of our trucks in the compound map, [they] didn’t fit, and that is without the world feed, Sky, TV Asahi, and others. At Erin Hills last year, we had a support tent, and that gave our camera crew more space, dry storage, and a place to work.”
The decision was made to expand on what was done at Erin Hills last year: move the production operations that most benefit from being close to the course to a large field tent located along the third hole. The field tent is about a half mile from the main compound and is home to the technology area (shot-tracing technologies, etc.); the camera, audio, and RF areas; and the robotic cameras provided by Fletcher. Inertia Unlimited President Jeff Silverman is also located in the tent, controlling X-Mo cameras as well as robotic cameras that can be moved around the course to provide different looks.
Cheney says the team took the field tent to a new level by providing an integrated source of distribution and monitoring so that it could effectively be an island to itself. “It has worked out well. People are comfortable there. It’s dry and offers direct access to the course.”
According to Michael Davies, SVP, technical and field operations, Fox Sports, some of the operations in the field tent, such as those related to enhancements like shot tracing and the Visual Eye, could ultimately move even farther from the main compound.
“Typically, they would be in the main compound,” he explains, “but, once we figured out how to connect the two compounds via fiber for a half mile, it [indicates] how far away you can put things [like the shot-tracking production]. It gets the mind going, especially for events like this that can be hard to get to.”
Fox Fiber Technician Bryce Boob (left) and Technical Producer Carlos Gonzalez inside the fiber cabin
Also located closer to the course is the fiber cabin, a move that allows the team to more quickly deal with any connectivity issues on the course. The 37 miles of fiber cable used across the course is monitored in the cabin, and Carlos Gonzalez, technical producer, Fox Sports, and the team troubleshoot and solve any issues.
“We’re isolated from the compound, which can make it a challenge,” he notes, “but we are actually liking it.”
Cheney says that placing the cabin closer to the course means a reduction in the amount of outbound fiber and also makes the operation a true headend. “It’s something that we will continue to do at Pebble next year [for the 2019 U.S. Open] because of the setup there. This has been another good learning experience for us.”
Steps Forward
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch.
“All of the programming in the router was there since day one [in 2015], and we have found new ways to do things,” says Cheney. “To strategically try to pull things out of it just wasn’t worth it. So we started from zero, and it paid off in terms of how quickly we could get up and running.”
Also playing an important part in enhancing the workflows was CMSI and Beagle Networks, which made sure networks and editing systems were all ready to go.
“The team from CMSI and Beagle Networks has been phenomenal in wiring up our networks and making sure it’s robust and all-encompassing,” says Cheney. “We also figured out new ways with IP to control things, move signals, and offer better control for our operators no matter where they are.”
RF wireless coverage this year is being provided completely by CP Communications. There are 26 wireless cameras on the course plus 18 wireless parabolic mics and nine wireless mics for talent on the course. All the signals are run via IP Mesh control systems, and CP Communications also provided all the fiber on the course.
The 5G setup includes a 5G cell mounted on the tower connected to processing gear on the back of a buggy.
Fox Sports is at the forefront of wireless innovation, working with Ericsson, Intel, and AT&T on using next-generation 5G wireless technology to transmit 4K HDR signals from Sony PXW-Z450 cameras to the compound. The 4K cameras are wired into an Ericsson AVP encoder, which sends an IP signal to an Intel 5G MTP (Mobile Trial Platform), which transmits the signal in millimeter wave spectrum via a 28-GHz link to a 5G cell site mounted to a camera tower. That cell site is connected to the Fox IP Network and, in the production truck, to an Ericsson AVP that converts the signal back to baseband 4K.
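For reference, the hop-by-hop path described above can be written down as simple data. The sketch below only mirrors the article's description of the chain; it does not touch any real device APIs:

```python
# The 5G contribution path described above, listed as ordered hops
# purely for reference; no real device APIs are involved.
FIVE_G_CHAIN = [
    ("Sony PXW-Z450 camera", "4K HDR baseband"),
    ("Ericsson AVP encoder", "compressed IP stream"),
    ("Intel 5G MTP (Mobile Trial Platform)", "28-GHz millimeter-wave link"),
    ("5G cell site on camera tower", "IP over the Fox network"),
    ("Ericsson AVP in the truck", "decoded 4K baseband"),
]

def describe_chain(chain):
    """Render the hop list as a one-line signal path."""
    return " -> ".join(f"{device} [{signal}]" for device, signal in chain)

if __name__ == "__main__":
    print(describe_chain(FIVE_G_CHAIN))
```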
The potential of 5G is promising, according to Cheney. First, the delay is less than 10 ms, and, conceptually, a 10-Gbps (or even 20-Gbps) 5G node could be placed in a venue and the bandwidth parsed out to different devices, such as cameras, removing the need for cabling.
“You can fully control the system as a whole versus allowing direct management on the device level,” he says.
And, although the current setup requires a couple of racks of equipment, the form factor is expected to get down to the size of a chip within a year.
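For a rough sense of the bandwidth-parsing idea Cheney describes, a back-of-the-envelope sketch: the 10- and 20-Gbps node capacities come from his comments above, while the per-feed bitrate is an assumption chosen purely for illustration, since contribution encoders vary widely.

ASSUMED_FEED_MBPS = 200          # hypothetical 4K HDR contribution bitrate
NODE_CAPACITIES_GBPS = [10, 20]  # figures cited above

for capacity_gbps in NODE_CAPACITIES_GBPS:
    capacity_mbps = capacity_gbps * 1000
    feeds = capacity_mbps // ASSUMED_FEED_MBPS
    print(f"a {capacity_gbps}-Gbps node could carry roughly {feeds} such feeds")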
Expanding Innovation
In terms of production elements, Fox Sports’ commitment to ball-tracing on all 18 holes continues in 2018, with the network equipping each tee box with Trackman radar technology. Eight holes are equipped to show viewers a standard ball trace over live video, with enhanced club and ball data. The other 10 holes have Fox FlightTrack, a live trace over a graphic representation of the golf hole, offering more perspective to the viewer.
Beyond tee-shot tracing, three roaming RF wireless cameras are equipped with Toptracer technology, providing trace on approach shots. And new this year is FlightTrack for fairway shots on two holes, Nos. 5 and 16.
Zac Fields, SVP, graphic tech and innovation, Fox Sports, says the goal next year is to expand the use on fairways. “We want to do more next year and also find a way to use that on taped shots as well.”
Virtual Eye, the system at the core of FlightTrack that takes a 3D model of a hole and uses shot data from SMT as well as from the Trackman and Toptracer shot-tracking systems to show the ball flight within the 3D model, has also been expanded. The Virtual Eye production team began its U.S. Open preparation a couple of months back by flying a plane over the course and capturing photos to map the topography. Then, a few weeks ago, a helicopter shot video of the course, and pictures were extracted from the video and laid over the topographical images.
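Conceptually, a system of this kind has to fold shot samples from more than one tracker into a single time-ordered flight path before it can be drawn inside the 3D hole model. The sketch below illustrates that merge step only; the field names and sample values are hypothetical, since the actual SMT, Trackman, and Toptracer feed formats are not public.

from typing import Dict, List

def merge_flight_samples(*sources: List[Dict]) -> List[Dict]:
    """Merge per-source samples and sort by timestamp (seconds from impact)."""
    merged = [sample for source in sources for sample in source]
    return sorted(merged, key=lambda s: s["t"])

trackman_samples = [{"t": 0.0, "x": 0.0, "y": 0.0, "z": 0.0},
                    {"t": 1.0, "x": 55.0, "y": 2.0, "z": 18.0}]
toptracer_samples = [{"t": 2.0, "x": 110.0, "y": 4.0, "z": 25.0},
                     {"t": 3.5, "x": 165.0, "y": 5.0, "z": 12.0}]

flight_path = merge_flight_samples(trackman_samples, toptracer_samples)
print([round(s["x"], 1) for s in flight_path])  # down-range positions, in order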
The FlightTrack team is located inside the field tent, making it easier to hit the course and fix any issues related to shot-tracking technology.
One of the goals, says Ben Taylor, operations manager, Virtual Eye, has been to make the system more automated and to allow it to be used on taped shots. For example, the EVS-replay users themselves can now trigger Virtual Eye to be active with the push of a button. And, when the ball comes to a rest, the graphic slides off the screen.
“The system will reset in the background after the shot,” he notes.
Fields and the Fox team have been happy with the performance, particularly the ability for EVS operators to control the graphic overlay. “It’s pretty slick,” he says. “The system takes the EVS feed and runs it through the graphics compositor and then back into the EVS, so the EVS system is recording itself. It seems complex, but, once the operator gets used to it, it’s easy. And now they can do FlightTrack a lot more.”
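A minimal sketch of the replay loop Fields describes, with frames reduced to placeholder strings: the EVS output passes through a compositor that keys on the trace, and the composited result is re-ingested, so the recorded clip already carries the overlay. This illustrates the routing only, not the actual EVS or compositor interfaces.

def compositor(frame: str, overlay: str) -> str:
    """Key the trace overlay onto a replay frame (placeholder for real keying)."""
    return f"{frame}+{overlay}"

evs_playout = [f"replay_frame_{i}" for i in range(3)]   # EVS output
evs_record = []                                         # EVS re-ingest

for frame in evs_playout:
    evs_record.append(compositor(frame, "flighttrack_trace"))

print(evs_record)  # the EVS is effectively recording its own composited output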
When Fox Sports took on the challenge of the U.S. Open in 2015, the industry watched to see how it would change the perception of golf coverage. Four U.S. Opens later, it is clear that the innovative spirit that has been part of Fox Sports since its early days continues unabated, especially as the era of sports data takes hold of the visualization side.
“We want to bring the CG world into our coverage and create animations to tell stories like comparing every tee shot a player took on a certain hole or comparing Dustin Johnson’s fade with another player’s draw,” says Fields. “And now we can show how the wind will affect a shot.”
June 8, 2018
Sports Video Group
With the second Triple Crown in just four years on the line, NBC Sports Group is pulling out all the stops for coverage of this weekend’s 150th Belmont Stakes. With Justify poised to capture the final gem of the Triple Crown, NBC Sports Group has boosted its production complement, adding a second onsite studio set, live pointer graphics to identify Justify on the track, and five additional cameras, including the Bat Cam aerial system that drew rave reviews at both the Kentucky Derby and the Preakness Stakes.
“Once Justify won Preakness, we knew what we were in for, and we started putting everything in motion right away,” says Tim Dekime, VP, operations, NBC Sports Group. “The [equipment levels] were increased a good bit, and we added all the bells and whistles. It means a lot more work and preparation, but it’s very exciting for us, and we are very well-prepared.”
All Eyes on Justify: More Cameras and Virtual Tracking Graphics
NEP’s ND1 (A, B, C, and D units) mobile unit will once again be on hand to run the show, with a total of 43 cameras deployed — up from 33 for last year’s non-Triple-Crown race. Besides the Bat Cam aerial system covering the backstretch, the camera arsenal includes a Sony HDC-4800 4K camera (outfitted with a Canon UHD 86X lens) on the finish line, five HDC-4300s running at 6X slo-mo and five more running at 60 fps, 14 HDC-2500s (eight hard, six handheld), five HDC-1500s in a wireless RF configuration (provided by BSI), a bevy of robos (provided by Fletcher) and POVs, and an aerial helicopter (provided by AVS, weather permitting).
Ready for a Triple Crown effort at Belmont: (from left) NEP’s John Roché and NBC Sports Group’s Keith Kice and Tim Dekime
Five other cameras have been added because of the Triple Crown possibility: a POV camera at Justify’s gate and one in the PA booth with announcer Larry Collmus (which will be streamed live on the NBC Sports App), a robo to capture a 360° view of the paddock, an additional RF camera roaming the grounds, and, most notably, the Bat Cam system.
In addition to more cameras, NBC plans to use SMT’s ISO Track system to identify Justify with a virtual pointer graphic live during the race. The system will incorporate real-time data — speed, current standing, and distance from finish line — into the on-air pointer graphic, helping viewers follow Justify and other key horses throughout the day’s races.
“We’ll have a live pointer that tracks Justify during the race that our director [Drew Esocoff] will insert, if needed, [so] the horse will be tracked for the viewers watching at home,” says Coordinating Producer Rob Hyland. “It will have a little arrow pointing to where he is at certain points in the race.”
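The pointer graphic described here boils down to reducing per-horse tracking data to a handful of on-screen fields. The sketch below shows that reduction under assumed numbers; the field names, race distance, and values are hypothetical rather than SMT’s actual data format.

TRACK_LENGTH_M = 2400  # assumed race distance for illustration

horses = [
    {"name": "Justify",      "distance_covered_m": 1800.0, "speed_mps": 16.5},
    {"name": "Hypothetical", "distance_covered_m": 1792.0, "speed_mps": 16.2},
]

# standing: 1 = leader, ordered by distance covered so far
ordered = sorted(horses, key=lambda h: h["distance_covered_m"], reverse=True)

for standing, horse in enumerate(ordered, start=1):
    pointer = {
        "label": horse["name"],
        "speed_mph": round(horse["speed_mps"] * 2.23694, 1),
        "standing": standing,
        "to_finish_m": round(TRACK_LENGTH_M - horse["distance_covered_m"], 1),
    }
    print(pointer)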
Bat Cam Covers the Back Stretch
The Bat Cam was a hit at both Churchill Downs and Pimlico, providing a never-before-seen view of the backstretch and also coming in handy when rain and fog complicated matters for NBC at both the Derby and the Preakness. The two-point cable-cam system can travel 80 mph along the backstretch, running 15-18 ft. above the ground.
“NBC had already used the Bat Cam on NASCAR, so we knew what to expect at the Derby, and it was just a matter of figuring out how to implement it into our show,” says Keith Kice, senior technical manager, NBC Sports. “It’s turned out to be a great [tool for us], especially at [the Preakness]. Even if it wasn’t for all the fog, the infield [at Pimlico] with all the tents and stages and infrastructure makes it very difficult; you really need the Bat Cam just to cover the backstretch because you can’t see it otherwise.”
Given the massive size of the Belmont track, the Bat Cam will cover more ground than at either of the two prior races but will not cover the entire backstretch. The system will run 2,750 ft. — more than 700 ft. longer than at the Kentucky Derby, 500 ft. longer than at the Preakness Stakes — of the 3,000-ft. backstretch.
“The length of the backstretch was definitely a challenge in getting the Bat Cam unit [installed],” says Dekime. “But the benefit here as opposed to Preakness is that there’s nothing in the infield the way that it’s one big party at Pimlico. We are unencumbered, so that’s a positive.”
Although NBC and the Bat Cam team were forced to bring in larger cranes at Belmont in order to install the longer system, setup and operation of the Bat Cam have improved significantly since the Derby, says NEP Technical Director John Roché.
“It’s no longer a science experiment like it was before,” he says. “We’re able to get [Bat Cam owner/operator] Kevin Chase all the gear that they need, and they are able to give us what we need pretty easily in terms of terminal gear, intercoms, and everything. It’s pretty much plug-and-play now.”
Hyland adds that the Bat Cam “will not only cover the backstretch of the race but will also provide dramatic reset shots of this vast facility. When the Triple Crown is on the line at Belmont, the energy in this venue is electric, and we want to capture the sense of place.”
Triple Crown Chance Warrants Double the Sets
Besides additional cameras because of the Triple Crown potential, NBC Sports has also added a second studio set. Host Mike Tirico and analysts Randy Moss and Jerry Bailey will man the 18- x 18-ft. set at the finish line, and a secondary 24- x 24-ft. stage located near Turn 2 will feature host Bob Costas and other on-air talent.
“If it was not going to be a Triple Crown, we would likely be down to just the finish-line set,” says Dekime, “but, now that it is, we’ve put the Turn 2 set back into operation.”
SMT’s Betting and Social Media GOTO videoboard will also be situated at the main set for handicapper Eddie Olczyk, who will use the interactive touchscreen for real-time odds and bet payouts for all races throughout the day. The betting touchscreen will enable him to explain to viewers how he handicaps specific races.
In addition to the onsite sets, NBC plans to incorporate several live remote feeds into the telecast, including from Churchill Downs.
“We brought out all of the tools to showcase the Triple Crown attempt, including a number of remotes that will carry live shots from Churchill Downs, where it all began five weeks ago,” says Hyland. “There will be hundreds of people gathered watching the race. We may have a live remote shot from a Yankees-Mets game just a few miles away. We’re working on a couple other fun ones as well, just to showcase this day and this athletic achievement, should it happen.”
Looking Back at a Wet and Wild Triple Crown Campaign
Although the horse-racing gods have granted NBC the potential for a Triple Crown this weekend — and the big ratings that go along with it — the weather gods have not been so kind. After the wettest Kentucky Derby on record and the foggiest Preakness Stakes in recent memory, a chance of rain remains in the forecast for Saturday. However, Roché notes that the proliferation of fiber and the elimination of most copper cabling onsite have significantly reduced weather-related issues.
“Despite torrential downpours on the first two races, we’ve been really fortunate,” says Roché. “And no matter what happens here [in terms of rain], we’re getting a little spoiled having two Triple Crowns in [four] years after a 37-year drought. For us to be able to have an opportunity to show the public how we cover racing, especially with the addition of Bat Cam, in a Triple Crown situation is really an honor.”
Kice seconds that notion: “Having a Triple Crown [in play] makes all the hard work and troubles we went through with the weather and logistics on the first two races even more worthwhile.”
June 6, 2018
Sports Video Group
SMT will provide fan-engagement technology solutions for NBC Sports Group’s broadcast of the 150th Belmont Stakes. This year marks the eighth consecutive Triple Crown collaboration between SMT and NBC Sports Group and is particularly exciting as Justify seeks to become only the second horse since 1978 to win a Triple Crown.
Much like the Preakness Stakes and the Kentucky Derby, SMT’s suite of products will engage viewers from gate to finish with real-time, data-driven graphics, up-to-the-second odds, and commentator analysis.
SMT’s Live Leaderboard System highlights the running order of the top six horses using positional data updated 30 times per second per horse, ensuring accuracy and speed for SMT’s on-air graphic presentation.
SMT’s ISO Track system identifies the horses and incorporates real-time data such as speed, current standing, and distance from finish line into an on-air pointer graphic, helping viewers follow the action during the race.
SMT’s ticker produces an on-air display of real-time odds and bet payouts using live data from the race’s Tote provider (in-house wagering system). The ticker also curates and visually displays social media feeds that give followers an inside look at happenings at the track.
SMT’s Track Map System gives viewers a display of the lead horse’s real-time position and split times via an on-screen graphic.
SMT’s Betting and Social Media GOTO video board features real-time odds and bet payouts for all the races throughout the day. The system provides an interactive system for talent to explain the process of horse wagering.
The Data Matrix Switchboard (DMX) provides a customized solution for each Triple Crown race, absorbing, collating, and synchronizing live data feeds into SMT’s proprietary horse racing database. The DMX integrates live data for on-air and off-air graphics in real-time and replay modes, enhancing NBC’s live race presentation and pre- and post-race analysis. These displays also feature real-time advanced odds and minutes-to-post countdowns.
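A minimal sketch of the collate-and-synchronize job a switchboard of this kind performs: timestamped updates from several live feeds are folded into one per-horse state that downstream graphics read on a fixed tick (the leaderboard above is described as updating 30 times per second). The feed names, fields, and tick handling here are illustrative assumptions, not the DMX’s actual design.

TICK_HZ = 30

feeds = {
    "positional": [{"t": 0.01, "horse": "Justify", "pos_m": 1201.0}],
    "tote":       [{"t": 0.02, "horse": "Justify", "odds": "4-5"}],
}

state = {}  # latest known values per horse

def absorb(feeds, state):
    """Fold every pending update into per-horse state, oldest first so newer values win."""
    updates = sorted(
        (u for feed in feeds.values() for u in feed), key=lambda u: u["t"]
    )
    for update in updates:
        horse = update["horse"]
        state.setdefault(horse, {}).update(
            {k: v for k, v in update.items() if k not in ("t", "horse")}
        )
    return state

print(absorb(feeds, state))             # {'Justify': {'pos_m': 1201.0, 'odds': '4-5'}}
print(f"graphics read this state every {1000 / TICK_HZ:.1f} ms")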
“With a Triple Crown in play for the second time in four years, SMT has another unique chance to help document a historic moment,” says Ben Hayes, Manager, Client Services, SMT. “Our systems help novice race fans understand the core aspects of the sport, while also providing in-depth betting and live race analysis for racing aficionados.”
April 24, 2018
Golf Channel
World No. 1 Justin James, Defending Champion Ryan Reisbeck & 2013 Volvik World Long Drive Champion Heather Manfredda Headline First Televised Event of 2018 from Long Drive’s Most Storied Venue
Veteran Sports Broadcaster Jonathan Coachman Making Golf Channel Debut; Will Conduct Play-by-Play at Each of the Five Televised WLDA Events in 2018
Eight men and four women have advanced to compete in tonight’s live telecast of the Clash in the Canyon World Long Drive Association (WLDA) event, airing in primetime from Mesquite, Nevada, at 7 p.m. ET on Golf Channel. In partnership with Golf Mesquite Nevada and taking place at the Mesquite Regional Sports and Event Complex, the group of competitors headlining the first televised WLDA event of 2018 includes World No. 1 Justin James (Jacksonville, Fla.), defending Clash in the Canyon champion Ryan Reisbeck (Layton, Utah), and 2013 Volvik World Long Drive champion Heather Manfredda (Shelbyville, Ky.).
A familiar setting in World Long Drive, Mesquite previously hosted the Volvik World Long Drive Championship and a number of qualifying events dating back to 1997; the World Championship was staged at the same venue as the Clash in the Canyon from 2008 to 2012.
FORMAT: The eight men advanced from Monday’s preliminary rounds, which featured a 36-man field, and will compete in a single-elimination match-play bracket during tonight’s live telecast. The four women advancing from this morning’s preliminary rounds (18-person field) also will compete in a single-elimination match-play bracket this evening to crown a champion.
COVERAGE: Live coverage of the Clash in the Canyon will air in primetime on Golf Channel from 7-9 p.m. ET tonight, with Golf Central previewing the event from 6-7 p.m. ET. An encore telecast also is scheduled to air later this evening on Golf Channel from 11 p.m.-1 a.m. ET. Fans also can stream the event live using the Golf Channel Mobile App, or on GolfChannel.com.
The live-coverage production will utilize six dedicated cameras, capturing all angles from the hitting platform and the landing grid, including a SuperMo camera as well as two crane-positioned cameras that will track the ball in flight once it leaves the competitor’s clubface. New to 2018 will be an overlaid graphic line on the grid, the “DXL Big Drive to Beat” (similar to the “1st & 10 line” made popular in football), displaying the longest drive during a given match to signify the driving distance an opposing competitor will need to surpass to take the lead. The telecast also will feature a custom graphics package suited to the anomalous swing data typically generated by Long Drive competitors, tracking club speed, ball speed, and apex in real time via Trackman. Trackman technology also will provide viewers with a sense of ball flight, tracing the arc of each drive from the moment of impact.
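The “Big Drive to Beat” logic itself is simple to sketch: the longest completed drive in a match becomes the line an opposing competitor has to surpass. The distances below are hypothetical, and the on-air version keys a graphic line onto the landing grid rather than printing text.

def drive_to_beat(completed_drives_yards):
    """Return the longest drive so far, or None before anyone has hit."""
    return max(completed_drives_yards, default=None)

match_drives = []
for new_drive in [356, 402, 389, 411]:   # hypothetical drive distances (yards)
    match_drives.append(new_drive)
    print(f"drive: {new_drive} yd  ->  big drive to beat: {drive_to_beat(match_drives)} yd")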
BROADCAST TEAM: A new voice to World Long Drive, veteran sports broadcaster Jonathan Coachman will conduct play-by-play at each of the five WLDA televised events on Golf Channel in 2018, beginning with the Clash in the Canyon. Art Sellinger – World Long Drive pioneer and two-time World champion – will provide analysis, and Golf Channel’s Jerry Foltz will offer reports from the teeing platform and conduct interviews with competitors in the field.
DIGITAL & SOCIAL MEDIA COVERAGE: Fans can stay up-to-date on all of the action surrounding the Clash in the Canyon by following @GolfChannel and @WorldLongDrive on social media. Golf Channel social media host Alexandra O’Laughlin is on-site, contributing to the social conversation as the event unfolds, and the telecast will integrate social media-generated content using the hashtag #WorldLongDrive.
In addition to the latest video and highlights from on-site in Mesquite, www.WorldLongDrive.com will feature real-time scoring. Golf Channel Digital also will feature content from the Clash in the Canyon leading up to and immediately following the live telecast.
Coming off record viewership in 2017 and a season fueled by emergent dynamic personalities, the Clash in the Canyon is the second official event of the 2018 World Long Drive season, following Justin Moose’s win at the East Coast Classic in Columbia, South Carolina, last month.
Showcasing the truly global nature of World Long Drive, several events will be staged in 2018 through officially sanctioned WLDA international partners, including stops in Germany, Japan, New Zealand and the United Kingdom. Additionally, an all-encompassing international qualifier will be staged (late summer) featuring a minimum of four exemptions into the Open Division of the Volvik World Long Drive Championship in September.
April 15, 2018
Boston.com
The light at the end of the tunnel for Boston Marathon runners making the final turn onto Boylston Street will be shining a little brighter this year. One of the changes the Boston Athletic Association made to the finish line for Monday’s 122nd running of the race is a new digital display board, affixed to the photo bridge above the finish line, that will be visible even if the forecasted rain falls.
“The finish times are going to be displayed big and bright and in color on that video board so that the participants and the spectators on Boylston Street will be able to see from afar what the time is,” said Jack Fleming, Chief Operating Officer of the B.A.A.
For their first year with the new board, which is similar to those that ring Gillette Stadium or TD Garden, the race organizers intend to go with a conservative approach and minimal animation. On Friday, it displayed a countdown clock for Saturday’s 5K and on Sunday it will show a tribute to One Boston Day. But the digital display opens up a new path forward for the finish line, and Fleming said that the B.A.A. could use lights and sound to enhance the spectator experience in the years to come.
“Boylston Street is like the home stretch of the Kentucky Derby or when the team comes out of the tunnel in Gillette Stadium,” he said. “We want our participants to feel that same way.”
In 2021, during the 125th Boston Marathon, don’t be surprised if the roar of the crowd over the final 500 meters is set to a background beat. But Fleming said the aesthetic changes will be made in keeping with the tradition of the event. Of course, no matter what sounds are added, the loudest noise in the runners’ heads will always be the ticking of the clock.
To that end, the organizers swapped the old clock — suspended by cable and beam above the street — for two consoles with double-sided clocks facing the oncoming runners on one side and the world’s media on the other. The race tape will be suspended in between the two consoles, and after the elite runners break the tape it will be wheeled out of the way.
Dave McGillivray, the race director, said that runners will notice some changes this year and a few more next year, building towards 2021 when the B.A.A. plans to showcase the finish line as part of the quasquicentennial celebrations. For that race, the organizers are also considering a request for an increased field size or more ancillary events around the Marathon.
The Boston Marathon finish line: a painted strip across a city street that’s taken on a meaning far beyond that.
“Everything to do with 2013 showed us just how loved Boylston Street is by our participants, by our fans, by the neighborhood, by the community,” Fleming said. “So that was sort of the inspiration for taking some actions on it.”
March 23, 2018
Sports Video Group
Although augmented reality is nothing new to sports production — the 1st & Ten line celebrates its 20th anniversary this year — AR has taken a giant leap in the past three years and is dramatically changing the way stories are told, both on the field and in the studio.
From left: Turner Studios’ Zach Bell, Fox Sports’ Zac Fields, Vizrt’s Isaac Hersly, SMT’s John Howell, and ChyronHego’s Bradley Wasilition
At SVG’s Sports Graphics Forum this month, a panel featuring executives from Fox Sports, Turner Sports, The Future Group, ChyronHego, SMT, and Vizrt discussed best-use cases, platforms, and workflows for AR, as well as how its use within live sports coverage is evolving. The one principle the entire panel agreed on was that AR cannot be used for technology’s sake alone: these elements must be used to further the story and provide valuable information to fans.
“Our philosophy has always been to use [AR] as a storytelling tool. We try not to use it for technology’s sake – whether that is in a live event or in the studio,” said Zac Fields, SVP, graphic technology and innovation, Fox Sports. “The interesting thing is that people can interact with [AR] on their phones and are familiar with what AR is now. That puts the onus on us to present those elements at an even higher quality now. [AR has] become the norm now, and it’s just going to continue to grow. The tools are there for people to come up with new ideas. The one thing that I would hope is that we can make it easier [to use] moving forward.”
Fields’s desire for more user-friendly AR creation and integration was echoed throughout the panel by both users and vendors. Although a bleeding-edge AR project may be exciting and create a new experience for the fan, the goal is to create a solution that can be set up and used simply for every game.
“We’re trying to make sure that customers have ease of usability and repeatability every day,” said Isaac Hersly, director, business development, Vizrt. “It is an issue, and we are always looking for tools that are going to make it easier to set up and not need a rocket scientist. You [need to be able to] have someone that can operate the system very simply. That is our challenge, and we are always looking to come up with solutions to solve that.”
Turner Sports Brings Videogame Characters to Life With AR
Last year, Turner Sports teamed with The Future Group to introduce augmented reality to its ELEAGUE coverage. The two companies worked with Ross Video to create lifelike incarnations of videogame characters, allowing fans tuning in to watch games like Street Fighter V or Injustice 2 to see these characters brought to life in the studio.
“I think creating AR characters from the games and bringing them to the audience adds an enormous amount of value for the fans and the viewing experience,” said Zach Bell, senior CG artist, Turner Studios. “If you can take characters or aspects of the game and have them as dimensional elements within that environment, it creates a much richer experience and allows fans of the game to visualize these characters in a new way. That in itself adds an enormous amount of connection to the experience for the viewer.”
Although esports presents a different case from a live game taking place on a field, Bell said, he believes similar AR elements will soon be making their way into live sports content (for example, NBC’s 3D AR elements from player scans during Super Bowl LII).
More Than Just a Game: Bringing AR to the Masses
It was only a couple years ago that high-end AR elements were reserved for the highest-profile sports events, such as NFL A games. However, with the technology’s rapid advance in recent years, AR has become ubiquitous for most national-level live sports productions and is making its way into even lower-tier properties. In addition, AR elements are becoming available on multiple cameras rather than just the main play-by-play camera (such as the SkyCam), and these systems can even be remotely controlled from offsite.
“The technology is allowing us to drive the next generation of this [content],” noted John Howell, creative strategist, SMT. “We have done the yellow [1st & Ten] line for 20 years, but, two years ago, SMT helped to create a technology that allowed us to do it on the SkyCam. Having that optical vision tracking to create the pan-tilt information off a $30,000 camera head for an image has enabled us not only to do this off the SkyCam but also to do it remotely.
“[That allows us to deploy AR] on more shows [more cheaply],” he continued, “and that technology will then trickle down to more shows. It won’t be just on Fox’s 4 p.m. Sunday NFL game or ESPN’s MNF or NBC’s SNF; now this [technology] gets to go on a lot more shows.”
What’s Next?: Getting More From Player-Tracking Chips, Customizing AR
The use of AR and the technology driving it have evolved rapidly over the past few years, raising the question, What’s next? The panel had plenty of predictions regarding the next great leap forward, but the primary point of excitement revolved around the continued advance of player-tracking RFID chips, especially the NFL’s Next-Gen Stats system.
“With the emergence of Zebra [Technologies] chips on players and [the NFL] looking at instrumenting the football [with a chip], you could see how that can tie to your first-down–line [graphic],” said Bradley Wasilition, director, sports analysis/lead sports analyst, ChyronHego. “The first-down line could actually dynamically change color, for example, when the first down is reached. Now, when that chip crosses that line, you can [definitively] say whether it is a first down or a player was out of bounds [on the sideline].
“Or think of a dynamic strike zone in baseball or a dynamic offside line in soccer,” he continued. “These are all different things that don’t necessarily reinvent the wheel, but they take baseline AR and move it into the 21st century.”
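The chip-driven first-down line Wasilition imagines reduces to a small rule: once the ball’s tracked position reaches the line to gain, the virtual line changes state. A minimal sketch, with the yard line, readings, and color scheme all assumed for illustration and no real Zebra or Next Gen Stats integration implied.

FIRST_DOWN_YARD_LINE = 45.0   # hypothetical line to gain (yards from own goal)

def line_color(ball_yard_line: float) -> str:
    """Yellow until the line to gain is reached, then switch to signal a first down."""
    return "green" if ball_yard_line >= FIRST_DOWN_YARD_LINE else "yellow"

for ball_position in [38.0, 44.5, 45.2]:   # hypothetical chip readings
    print(f"ball at {ball_position:4.1f} yd -> first-down line drawn {line_color(ball_position)}")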
Fields predicted that, as multiplatform content and OTT outlets grow, fans will someday be able to customize their own AR elements within the sports coverage they are watching: “Eventually, it will get to a point where we can put this data in the hands of the viewer on an OTT offering. Once that happens, they can choose to turn off the strike zone over the plate. That is when we’ll really get some flexibility and customization to people so [viewers] can enhance [their experience].”
March 16, 2018
Avixa
Sports. The great common denominator of all conversation. Even if you don’t like sports, you know enough to be able to talk about it, at least for a minute. And sports, by convenient association, is actually one of my favorite ways to talk about what it is that AVIXA members do.
We tell sports stories. Through gigantic video boards (forever “Jumbotrons” to the layman, and hey, that’s alright), humongous speaker systems, tiny microphones, variably-sized digital signage displays and perceptually invisible but actually ridiculously huge lighting systems and projection mapping, AV experience designers make the live event into a highlight reel. Everything has impact, in real-time.
So it happens to be that I’m forever on the lookout for evolving ways to tell sports stories in venues. In reading Sports Video Group’s coverage of the Super Bowl, I found another great angle on stadium storytelling. Most sports fans know that we are in the age of abundant sports data analytics, but what I didn’t know is that we are also in the era where those next-gen stats are changing the in-house show on the big screens at stadiums.
In a first for the Super Bowl, the 2018 game brought some television broadcast features to the in-house displays at U.S. Bank Stadium. And on top of that, they challenged audiences with a whole new graphics package featuring next-gen stats (“NGS” if you’re savvy).
With production tools by SportsMEDIA Technology (SMT), the virtual yellow line and some cool new NGS factoids made it to the big-time on the live-game displays. The latter of these came from SMT’s tapping into the NFL Next Gen Stats API to go deeper with the data.
SMT’s goal to delight fans with even more details to obsess over during the game seems like a good one. Especially because, well, “NFL fans are insatiable — they want data,” said Ben Grafchik, Business Development Manager for SMT.
To meet that need, SMT is exploring ways to tie traditional data points in with NGS in a visual format that fans can easily consume during a game. The objectivity and analytical depth of these additions to video-board storytelling are compelling to all diehard fans, but, in particular, the next-gen stats appeal to next-gen fans, Grafchik added.
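In practice, tying a traditional box-score line together with a tracking-derived metric can be as simple as merging two records into one card for the video board. The sketch below assumes hypothetical field names and values; it is not the NFL Next Gen Stats API, whose schema is not public, and not SMT’s implementation.

traditional = {"player": "WR 1", "receptions": 7, "yards": 102}
ngs_style   = {"player": "WR 1", "max_speed_mph": 21.3, "avg_separation_yd": 3.1}

def board_card(box, tracking):
    """Merge the two sources into the fields an in-house graphic might show."""
    assert box["player"] == tracking["player"]
    return {
        "player": box["player"],
        "line": f'{box["receptions"]} rec, {box["yards"]} yds',
        "speed": f'{tracking["max_speed_mph"]} mph top speed',
        "separation": f'{tracking["avg_separation_yd"]} yd avg separation',
    }

print(board_card(traditional, ngs_style))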
These new graphics may have been a first for the Super Bowl, but actually, Vikings fans enjoyed them for the entire season at home at U.S. Bank Stadium. SMT worked with the in-house production team there to add all sorts of visual spice to the show, gradually going more complex with the offerings as the season went on and fans became accustomed to the new depths of data exploration.
But football isn’t the only sport that’s receiving the NGS upgrade. SMT happens to provide video enhancement and virtual insertion graphics for hundreds of major U.S. and international sporting events and broadcasters. So watch for a lot more variety to come both in house and wherever else you consume your sports content. It will certainly give us all a lot more to talk about when we talk about sports.
March 14, 2018
Sportstar Live
For more than 100 years, tennis, unlike team sports, used statistics sparingly. Basketball, baseball and football needed a plethora of stats, such as shooting percentages, batting averages and touchdowns scored, to measure the performances of their athletes and teams. But tennis players were measured chiefly by their wins, losses, titles and rankings. After all, few cared if the Wimbledon champion made 64% of his first serves or the No. 1 player averaged 77 miles per hour on her backhand.
All that changed in the Computer Age. With more information than they ever dreamed possible, tennis coaches, players, media and fans suddenly craved all sorts of revealing match data, not to mention astute analysis of it. No longer was it just whether you won or lost that mattered, but how and why you won or lost — points, games, sets and matches. Training methods, stroke production, tactics and equipment were also dissected and analysed in much greater depth and detail than ever before.
As the demand for data burgeoned, new technologies, such as sophisticated virtual graphics, tracking technology, statistical applications and telestration, have provided yet more valuable services and information to give athletes that “extra edge.”
Like any prescient, enterprising pioneer, Leo Levin seized the opportunity, developing the first computerised stats system for tennis in 1982. Levin’s seminal work included creating the concept of the “unforced error” and coining the term, which is now used in most sports and even by pundits to describe a politician’s self-inflicted blunder.
Since then, the genial 59-year-old, based in Jacksonville, Florida, has covered more than 120 Grand Slam events and countless other tournaments to provide the Association of Tennis Professionals (ATP) and other businesses with match statistics. Levin, dubbed “The Doctor” by broadcaster Mary Carillo for his incisive diagnoses of players’ games, is currently director of sports analytics at SportsMEDIA Technology (SMT), a company that provides custom technology solutions for sporting events.
In this wide-ranging interview, Levin explains his many roles in the exciting, fast-growing field of analytics and how it has changed tennis for the better.
What is sports data analytics?
Sports data analytics is a combination of gathering and analysing data that focuses on performance. The difference between analysis and analytics is that analysis is just gathering the basic data and looking at what happened. Analytics is trying to figure out why and how the basic performance analysis works with other factors to determine the overall performance of the athlete or the team.
When and how did this field start changing amateur and pro tennis? And who were the pioneers?
Honestly, I was. At the end of 1981, the first IBM personal computer hit the market for general consumer use. By the middle of 1982, I was working with a company in California to develop the very first computerised stats system for tennis. The key factor was the way we decided to describe the results of a tennis point in three basic areas. The point had to end with a winner, a forced error, or an unforced error. That created the foundation for how we look at tennis today.
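That three-way taxonomy is easy to picture as a data model. A minimal sketch, with a hypothetical point sequence:

from collections import Counter
from enum import Enum

class PointEnd(Enum):
    WINNER = "winner"
    FORCED_ERROR = "forced error"
    UNFORCED_ERROR = "unforced error"

# hypothetical sequence of point endings for one player
points = [PointEnd.WINNER, PointEnd.UNFORCED_ERROR, PointEnd.WINNER,
          PointEnd.FORCED_ERROR, PointEnd.UNFORCED_ERROR]

tally = Counter(p.value for p in points)
print(dict(tally))   # e.g. {'winner': 2, 'unforced error': 2, 'forced error': 1}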
How and when did you become interested in tennis analytics?
I was playing on the tennis team at Foothill College in Los Altos, California, about five miles from Stanford University. When I wasn’t playing matches, I was actually charting matches for my team-mates and then providing that information to the coach and the players to try to help them improve their games.
Brad Gilbert, a former world No. 4 and later the coach of Andre Agassi and Andy Murray, played on your Foothill team. Did you help him?
Brad was on that team, and it was interesting because in his first year, he played No. 2. The player who played No. 1 came to me before the state finals where he had to play Brad in the final, and asked me, ‘How do I beat Brad?’ I was able to give him specific information on strategy and tactics that helped him win the state title.
That was the year Brad took his runner-up trophy and smashed it against a tree and vowed never to lose a match the following year. And the following year, Brad didn’t lose a match.
SportsMEDIA Technology’s (SMT) products and services have evolved from a clock-and-score graphic in 1994 to innovative and sophisticated virtual graphics, tracking technology, statistical applications, and telestration. How do you and your team at SMT use these four methods to analyse statistical data at tennis’ four Grand Slams to provide valuable insight that helps players, coaches, broadcasters and the print media determine how and why a match was won or lost?
One of the challenges with tennis, more so than with any other major sport, is the lack of data. When we started doing this, there really wasn’t any consistent gathering of data from matches. So the first piece we developed was simply a system now known as Match Facts. It pulled factual statistical data directly from the chair umpire. That started with the ATP back in the early 1990s. We were then able to create a base for year-round information on the players. It allowed for the next level of analysis. It has expanded from there. We developed the very first serve speed system to start adding additional data and how players were winning or losing based on the serve speeds. As the technology improved, we’ve been able to harness the new generation — tracking video technology and then on the presentation side, using virtual graphics as a way to be able to place data directly into the field of play to help illuminate what is actually going on. Telestration is a tool that allows the broadcasters to get inside the points and help the fans understand the combinations of shots and strategies the players are using.
Your website (www.smt.com) has a section titled “Visual Data Intelligence” with the subtitle, “SMT delivers the world’s most innovative solutions for live sports and entertainment events across the globe.” What is Visual Data Intelligence? And what are its most important, innovative solutions for live sports and entertainment events?
Visual Data Intelligence goes to the heart of what we try to do as a company. In a lot of different sports, there is a lot of information available. But making it useful to the broadcasters, and specifically to the fans, to help them understand the game is a huge part of what we’re providing. That entails simple things like the first-and-10 line in football. That provides the visual set of information for the commentators and fans that really helps them understand where the teams are and how much yardage they need (to get a first down). It’s gotten to the point where fans in the football stadium are yelling, “Where’s the yellow line?” So we’re expanding that to provide the service to the large screens displayed inside the stadium so teams have their own system to be able to show that to the fans.
How does Visual Data Intelligence apply to tennis?
In tennis where you have a lot of data, the challenge is: how do you provide all that data to the fans and the commentators? We do that through a series of different systems. We have what we call our “open vision system,” which is an IPTV solution that has real-time scoring, stats and video as well as historical data. And it ties it all together and puts it in one place so it provides a true research tool for the commentators and the (print and online) media. Along with that, we have a product we call our “television interface,” which is really a system which drives graphics on air for the broadcasters. This tool allows them to look at the data and see where the trends are. Hit the button and have that information directly on the screen.
Please tell me about the new technology service partnership between Infosys and the ATP, and the analytics and metrics this partnership brings to the tennis world.
I’m not really that aware of what Infosys and the ATP are doing. But I do know that a lot of that hinges on the technology we created for Match Facts. One of the unique things about tennis is the scoring system. Unlike other sports, the player or team that wins the most points doesn’t necessarily win the match. That’s not how our scoring system works. I think they are trying to take a deeper look into the individual points, and how winning or losing specific points in key situations impacts a player’s ability to win or lose matches. The same is true for total games. That’s one of the challenges when you’re trying to do analysis of tennis. In a lot of other sports, you’re just looking at the raw numbers and saying how many points did he score or how many rebounds did she get or how many yards did they gain. But in tennis, it has to be compartmentalised into specific performances in specific situations.
How do insights from game and training data analytics improve coaching?
The key to coaching and player improvement is first to understand what is going on out on the court. It’s a matter of gathering data. One of the challenges tennis has faced because of its late start in the world of statistics and data analysis has been a reluctance by a lot of coaches and players to rely on anything other than what they see and feel. So the real challenge and the real key is to be able to relate the data to what coaches see and what players feel out on the court. When you can make that connection, you have a real chance for improvement.
What are one or two insights that have improved coaching?
The challenge is that every player is different. What the data analysis allows you to do is to customise those things and focus not on what a player does, but what your player does, and how you can get the most out of your player’s game. A simple example of this was when we first started doing detailed statistics and analysis, we worked with the Stanford University tennis programme. Their No. 1 woman player, Linda Gates, was struggling, and the coaches couldn’t figure out where or why. We did an analysis of her game, and we found out that she was dominating her service games on her service points in the deuce court, but she was struggling in the ad court. It wasn’t visually obvious. The coaches couldn’t put their finger on what the problem was. But once we started looking at the numbers and the data, it allowed them to focus in practices on her ad-court shot patterns. Linda went on to win the NCAA Championships that year, 1985, in singles and doubles (with Leigh Anne Eldredge).
An Infosys ATP “Beyond The Numbers” analysis of Rafael Nadal’s resurgence to No. 1 in the Emirates ATP Rankings showed that Nadal ranked No. 1 on tour in 2017 for winning return points against first serves, at 35.2 percent (971/2761). That metric shoots up to an astounding 43.4 percent (454/1045) for his clay-court matches. Which other stunning statistics help explain why other players have had outstanding years this decade?
This goes to the basics of looking at players’ strengths and weaknesses. One stat I always look at is serve and return performance because I still split the game up that way. It’s interesting that when you look at a player like Nadal, you see that he is not only dominant on return of serve. He’s also dominant on his own second serve.
Even with all the analytics we have, an old maxim still holds true: “You’re only as good as your second serve.” You’ll find the players at the top of the rankings for the last four or five years were also at the top of both second serve points won and return of second serve points. Despite all the focus on power and big serves, second serve performance is really a huge key to understanding a player’s overall strengths and weaknesses.
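The two second-serve measures Levin keeps coming back to are straightforward ratios: points won on your own second serve, and points won returning the opponent’s second serve. A minimal sketch with hypothetical match totals:

def pct(won, played):
    """Simple won / played percentage, guarded against an empty denominator."""
    return round(100 * won / played, 1) if played else 0.0

second_serve_pts  = {"won": 31, "played": 55}   # hypothetical match totals
second_return_pts = {"won": 27, "played": 49}

print(f"2nd-serve points won:        {pct(**second_serve_pts)}%")
print(f"2nd-serve return points won: {pct(**second_return_pts)}%")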
How much do the Women’s Tennis Association tour and its players take advantage of analytics?
Although the WTA was a little behind the ATP curve in terms of gathering and storing match data, the good news is that now they’ve caught up. Their association with SAP, and the fact that they’re also now using a Match Facts system to provide data for the players on a match-by-match basis, has moved them up the curve.
Which pro players have benefited most from tennis analytics so far? And in what specific ways?
That’s a tough question. Because I don’t work directly with the players and coaches as I used to, I don’t know who is utilising the data more so than others. You can tell just by looking at Roger Federer’s improvement over the last year that his team used analytics to determine that he needed to be more aggressive on his backhand. He’s now hitting a much higher percentage of topspin backhands than he did in previous years and that change has made his game more balanced and puts a lot more pressure on his opponents. Playing to Roger’s backhand used to be the safe play — it’s not any more.
Another area of Federer’s game that came to light using analytics was the difference between his winning and losing matches at Wimbledon. When you compare his final match wins to his matches lost since he won his first Wimbledon in 2003 — 8 titles, 7 matches lost — the numbers that jump out are all about his return of serve, and specifically, his performance on break points. Federer’s serving performance barely changed, but his return game fell dramatically in his losses. In his Wimbledon final wins, Federer converted 30 of 69 break points for 44%. In his losses, he converted only 9 of 53 for 17%. In both cases, he averaged around 8 break points per match. In his wins, he converted almost 4 per match, but in his losses he converted just over once per match. His team looked at that crucial data and added in that nearly all his opponents served and volleyed 2% or less of their service points and concluded that Roger needed to work on hitting his returns deep and not worry about his opponents coming in behind their serves.
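Re-running the break-point arithmetic from those Wimbledon figures (8 final wins, 7 final losses) shows how the per-match numbers fall out:

wins   = {"finals": 8, "bp_converted": 30, "bp_faced": 69}
losses = {"finals": 7, "bp_converted": 9,  "bp_faced": 53}

for label, d in (("wins", wins), ("losses", losses)):
    rate = 100 * d["bp_converted"] / d["bp_faced"]
    per_match_faced = d["bp_faced"] / d["finals"]
    per_match_conv = d["bp_converted"] / d["finals"]
    print(f"{label}: {rate:.1f}% converted, "
          f"{per_match_faced:.1f} faced and {per_match_conv:.1f} converted per final")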
Younger players are taking most advantage of the information because they’ve grown up in that world. They’re used to the electronics and the digital experience and having all that information available to them.
How do these insights enhance the fan experience?
I credit (renowned former NFL analyst) John Madden for being one of the very first TV commentators who would take fans inside the game to explain to them things they didn’t necessarily see. Madden would explain to women football fans what the centre or guard was doing on a particular play and how that back’s 50-yard run was all because of this really good block.
What we’re trying to do in tennis and what these insights have provided is to do the same kind of things for tennis fans. Help get them inside the game so they understand the nuances of what’s happening on the court, and they’re not just watching two guys running around hitting the ball.
What is radar-based tracking, which is now used by the United States Olympic Committee (USOC) for every throw an Olympic athlete makes? Is it being used in tennis?
Radar-based tracking is simply tracking the speed and location of the ball or object that is being thrown or hit. Radar-based tracking has been typically used for service speeds in tennis. That is something we pioneered in the late 1980s. The tracking used in tennis has been video-based, as opposed to radar. The advantage of that is that you can track movement of the players as well as the movement of the ball and from a variety of positions and angles.
Can analytics predict which junior players will someday become world-class players or even champions? And if so, can it guide their coaches and national federations to increase the odds that will happen?
Not yet. The challenge is that prediction is different from analysis. You’re trying to draw conclusions from the data, and we don’t have a complete set of data. If you wanted to predict which junior players will become world-class players, sure you can do that if we have genetics, biomechanics, all the physical characteristics measured as well as using analytics to measure the player’s overall performance on the court. We can see whether or not they have specific markers that indicate they will make that jump. But the bottom line is that there are so many factors involved. And a lot of it has to do with the physical side that you can’t necessarily determine from data.
What is bioanalytics? And why is measuring and analysing an elite athlete’s perspiration important?
We’re pioneering bioanalytics in football now. We’re taking biometric readings from players at the university level. The players are equipped with motion sensors and full biometric readers, which are reading things like heart rate, body temperature and respiration. And they’re combining that with the movement data from the tracking information. With that, we’re able to measure the physical output of the players. The sensors in the helmet measure impacts (from collisions).
We’ve been working on this project for a few years. It’s been used for the football programme at Duke University. We’re in the process of adding a couple more universities to this project. At this stage, it’s being used for medical purposes. So when a player is on the practice field, the staff knows immediately if his heart rate starts racing or his body temperature goes up too high, and they can immediately pull him out of practice and get him more electrolytes and hydration. They also weigh the players before and after every practice so they know how much fluid the player has lost during their practice times.
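The practice-field use Levin describes amounts to threshold monitoring on live readings. A minimal sketch, with thresholds and readings that are purely illustrative: they are not medical guidance and not the Duke system’s actual values.

ASSUMED_LIMITS = {"heart_rate_bpm": 195, "body_temp_f": 103.0}

def flag_player(reading: dict) -> bool:
    """Return True if any monitored value exceeds its assumed limit."""
    return any(reading[k] > limit for k, limit in ASSUMED_LIMITS.items())

readings = [
    {"player": "LB 52", "heart_rate_bpm": 172, "body_temp_f": 100.8},
    {"player": "DT 98", "heart_rate_bpm": 198, "body_temp_f": 103.4},
]

for r in readings:
    if flag_player(r):
        print(f'{r["player"]}: pull from practice, hydrate and re-check')
    else:
        print(f'{r["player"]}: within assumed limits')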
How is bioanalytics used in tennis?
Unlike a team sport where a team can outfit all its players with this equipment, tennis players are all independent contractors. So it’s going to take more of a nationalistic approach — something like what the USTA is doing — to step in and say, “For our junior players, we’re going to outfit some courts and we’re going to provide this level of analysis on the physical side.”
Does analytics apply to tennis equipment and court surfaces? And if so, how?
Sure, it can. Analytics can identify how well players perform using different types of equipment and on different surfaces. For instance, if you’re using some tracking technology to determine what racquet and string combination allows a player to have the most amount of power, that’s a relatively simple exercise. You run a player through a set of drills, hitting particular shots, and measuring the speed of the ball coming off the racquet.
For surfaces, analytics can really help with identifying the type of shots that have an effect on particular surfaces or areas where players’ games break down. For example, you have players who have a long backswing, and that works really well on a slower surface where they have time to take a big backswing. But when you put them on a faster court, where the ball bounces lower and faster, it upsets their timing, and it makes it more difficult for them to adjust. Analytics measures the court’s bounce speed and bounce trajectory. So you can take a player and modify his game on a particular surface taking into account how the ball reacts to it.
You’ve analysed thousands of matches. Which factors influence the outcome of matches the most in men’s tennis and women’s tennis? And why?
The No. 1 factor typically is unforced errors. If you’re making mistakes, you’re basically giving the match to your opponent. Being able to measure and quantify that is a huge factor for player improvement. That entails understanding where you’re making your mistakes — which shots and what situations. The caveat to that is that there are certain players whose games are based on absolutely controlling the pace and tempo of the match. And they have the tools to do that. Two of the best players ever to do that are Steffi Graf and Serena Williams.
What are the disadvantages of and dangers involved with analytics? Will some number crunchers and coaches go overboard with analytics and be guilty of violating Occam’s razor?
The simple danger is to rely on data alone. The challenge is that you have to make the data relatable to what the player is doing physically and mentally on the court. Analytics doesn’t necessarily measure the mental side of the game, at least not yet. If you’re focusing so much on the analytics of certain shots and not looking at the big picture of their mental focus and how they’re preparing for matches, you can get into trouble.
Since tennis players vary greatly in temperament, talent, current form and other variables, do practitioners of analytics risk over-concluding from their numbers? And what mistakes have you and others made in this regard?
There is always a risk. Data can provide you with valuable information. Then you make that next leap that says, “This information says this, and therefore we have to do this, or therefore we have an issue.” I’ll give you a simple story from a few years ago. Jim Grabb, who was the No. 1 doubles player in the world then, came up to me at a tournament before the US Open and said, “I’m struggling with my first volley in singles. I can’t make a first volley.” And I told him, “You’re the No. 1 doubles player in the world. You have great volleys. And you’re saying you can’t make a first volley in singles.” He says, “Yeah.”
A lot of coaches would say, “How are you hitting it? Let’s analyse the stroke.” I asked, “When you step to the baseline to hit the serve, where is your first volley going?” Jim looked at me like I was speaking a foreign language. So I asked again, “Before you hit your first serve, where are you going to hit your first volley?” He said, “I just react to the ball. I don’t know what you’re talking about.”
So I suggested, “Do this. Every first volley goes to the open court. You serve wide in the deuce court and you volley wide into the ad court. You serve wide in the ad court and volley wide into the deuce court. Just for your first volleys.”
Jim goes out to play and comes back and says, “I didn’t miss a first volley.” The next week he got to the fourth round of the US Open, his best result at a Grand Slam (event) ever in singles. That had to do with the fact that all it really required was a little bit of focus by the player. It didn’t require a level of analysis and stroke production changes. It was simply eliminating decision-making.
What is the connection between analytics and the established field of biomechanics?
Analytics can tell you how a player is performing or how a stroke is performing in key situations. That can then identify that we need to examine the biomechanics of the stroke, particularly if it is breaking down under pressure. Or we can determine that the errors are occurring when the ball is bouncing four feet in the air versus three feet in the air, so their contact point is a foot higher. Now we can look at the biomechanics and see what the player is doing when the ball is a foot higher.
What are player rating systems? And what is the connection between analytics and player rating systems? How valid is the Universal Tennis Ratings system?
I don’t think there is any now. But that’s a direction we can take in the future.
Which match statistic or statistics do you foresee becoming increasingly important as a result of analytics?
I think you’ll see more focus on key point performance as we do more and more analysis of players’ games in key pressure situations. Because you’re serving half of the time and receiving serve half of the time, analytics will look increasingly at each half of the game. We talk a lot about unforced errors, but are they occurring on your serve game or return game? We talk about aggressive play and taking control of the points, but when is that happening? And the serve or return games? On the first serve or second serve?
Data analytics is undeniably changing tennis. Do you think it will revolutionise tennis?
Absolutely! Because the game is always changing. The technology around tennis and all sports keeps changing. Analytics is going to make the athletes better. It’s going to provide them with insights about how they can be at their peak for the key matches. It will help them train better, prepare better, execute shots better under pressure. All those pieces and parts will be available for athletes. And all of their nutritional, sleep, and training regimens will also help tennis players to perform better.
March 9, 2018
Sports Video Group
The 2018 NASCAR season is underway, and with it comes a new remote-production workflow whereby camera and audio signals are sent from racetracks to NASCAR’s production center in Charlotte, NC. The effort began with the Rolex 24 at Daytona and will continue with the WeatherTech SportsCar Championship series next week and the ARCA Racing Series as the season progresses.
“We have done a lot of testing at smaller events the past couple of years, but this year we wanted to push the limits and see what we can do,” says Steve Stum, VP, operations and technical production, NASCAR Productions.
The Rolex 24 Hour race used NEP’s NCP IV production unit to put out 12 hard cameras, two RF cameras for the pit announcers, and 14 in-car cameras around the track. RF was handled by 3G, and a tech manager and engineering team ensured that 28 video and 75 audio signals were sent to Charlotte via a single antenna from PSSI Global Services, which leveraged its C27 mobile teleport, equipped with cutting-edge Newtec modulators and GaN SSPB amplifiers from Advantech Wireless.
Rick Ball, Director of Broadcast Sports at PSSI Global Services, adds: “We’re not afraid to go where no one has gone before, and we’re proud that our efforts continue to create new possibilities in live television.”
Once the signals are back in Charlotte, the director, producer, TD, replay operators, SMT virtual-graphics operators, and announcers create the show.
“Round trip, the latency is 1.09 seconds, so we have camera returns and feeds for the screens for the fans in the stands,” adds Stum.
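To put that figure in perspective, a quick calculation assuming a symmetrical path and 30-fps video (both are assumptions; neither is stated in the article):

ROUND_TRIP_S = 1.09
FPS = 30  # assumed frame rate for illustration

one_way_s = ROUND_TRIP_S / 2
print(f"one-way delay (assumed symmetric): {one_way_s:.3f} s")
print(f"round trip in frames at {FPS} fps: {ROUND_TRIP_S * FPS:.0f}")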
With upwards of a third of production costs sunk into travel, Stum says the goal is to put more money into the production itself, acquire more specialized equipment, and build a production-truck unit that is better aligned with the needs of a remote production.
The efforts are part of a season that Stum says has gone well so far, and the testing prior to the Rolex race paid off: nerves at the beginning subsided as the workflow was proven out.
March 2, 2018
Sports Video Group
As the NFL Scouting Combine becomes an increasingly fan-focused event onsite, NFL Media is expanding its already sizeable coverage of the annual event in Indianapolis. Last year, the NFL added Combine events, including the bench press and press conferences, at the Indianapolis Convention Center next door to Lucas Oil Stadium and allowed a limited number of fans into the stadium’s upper bowl in an effort to boost the NFL Combine Experience. With that in mind, NFL Network and NFL Digital outlets are rolling out their biggest productions to date to cover the growing parade of events taking place at both locations.
“We attack this show with everything we have in order to cover it from every aspect,” says Dave Shaw, VP, production, NFL Media. “The league has continued to expand the fan-focused aspect of the Combine at the convention center. They started that last year and are putting even more events over there this year. So we’ve expanded our show to cover some of the more fan-friendly stuff.”
For its 14th Combine, NFL Media is delivering a whopping 52 hours of live coverage during the event (Feb. 28 – March 5), including 34 hours from Indianapolis: 26 hours of Combine coverage Friday-Monday and eight hours of press conferences Wednesday and Thursday.
“This event really didn’t become ‘an event’ until it was covered by NFL Network,” says Christine Mills, director, remote operations, NFL Media. “It’s grown and evolved, and now fans are becoming more involved [onsite]. It’s interesting how it’s grown from a very small intimate event essentially just for scouts to an event covered by NFL Network and NFL Digital and on social. It’s grown into a fan-facing event, but it has kept that intimate feel at its core.”
Onsite in Indy: Encore and Pride, Four Sets Drive Multiplatform Production
Despite the expansion, NFL Media has maintained the same footprint in the truck compound at Lucas Oil Stadium. Game Creek Video’s Encore is serving the NFL Network show, and Pride is handling the streaming coverage.
The trucks onsite are fully connected to NFL Media’s broadcast center in Culver City, CA, via diverse fiber circuits (with 12 muxed feeds going each way) to allow extensive file-transfer and backhaul of camera feeds.
“For our coverage, we treat this like we’re covering a high-end game,” notes Shaw. “It’s a very slick production that moves quickly. It is a bit of a marathon, but our production teams do an outstanding job of rolling in features and keeping the action moving. It’s an important show for the NFL Network and NFL Media group because it’s the baseline for what we are about, which is giving viewers the inside look and show fans what they should look for in the upcoming players.”
NFL Media has deployed a total of four sets — three at Lucas Oil (one on the field, two on the concourse level) and one at the convention center — to serve its 23-deep talent roster. Two of the three sets at the stadium are dedicated to the digital operation; NFL Network is manning the convention-center set, which is primarily for press-conference coverage.
“The setup we have at the convention center for NFL Network is very similar to [Super Bowl] Opening Night, where they have eight podium positions set up and we’re right in the middle of that room,” says Mills. “It ends up being a really fun and busy couple of days, especially with the fans more involved now [onsite].”
In addition to the four sets, NFL Network has a position in the traditional announce booth at Lucas Oil Stadium, as well as an interview location in a suite, where head coaches often stop by. For example, last year, NFL Media landed a rare interview with Patriots coach Bill Belichick in this location.
“Most of the head coaches are here in a casual atmosphere trying to pull something away from some of these players they’re evaluating,” says Shaw. “And the coaches have [free rein over] where they want to be in the building, so sometimes they will stop by the announce booth. Having Belichick stop by and do some time with our guys took us all off guard a little, but it was great and got a lot of attention. What’s exciting is, you don’t know what you’re going to pull off here since you have all the coaches and GMs. It’s a lot of fun trying to get in their minds and hearing what they have to say in this kind of atmosphere.”
The Camera Complement: SkyCam, Robos, and TeamCams
Between NFL Network and NFL Digital, the operation is deploying a combined 37 cameras at the two venues, including a SkyCam at the stadium and a large complement of robos (provided by Indy-based Robovision) at both locations. In addition, five ENG cameras are roving the grounds capturing content, which is being sprinkled into both the linear and the streaming coverage.
NFL Media will continue to spotlight the 40-yard-dash drill, with a high-speed camera capturing the smallest details. In addition, SMT is providing virtual graphics and overlays for visual comparison of prospects with one another or with current NFL players’ Combine performances: for example, projected top pick QB Sam Darnold vs. Pro Bowl QB Carson Wentz’s sprint.
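The mechanics of SMT’s comparison overlays aren’t detailed here; as a minimal sketch of the underlying idea, the snippet below interpolates each runner’s position over time from split times and reports the gap between two sprinters at any instant, the sort of data a side-by-side virtual comparison could draw on. The split times are made up for illustration and are not actual Combine results.

```python
import numpy as np

def position_at(t, splits):
    """Linearly interpolate distance (yards) covered at time t from split times.

    `splits` maps cumulative distance in yards to the (hypothetical) elapsed
    time in seconds at which the runner crossed that mark.
    """
    dists = np.array(sorted(splits), dtype=float)     # e.g. [0, 10, 20, 40]
    times = np.array([splits[d] for d in sorted(splits)], dtype=float)
    return float(np.interp(t, times, dists))

# Made-up split times, not real Combine results
runner_a = {0: 0.0, 10: 1.55, 20: 2.65, 40: 4.55}
runner_b = {0: 0.0, 10: 1.60, 20: 2.72, 40: 4.67}

for t in (1.0, 2.0, 3.0, 4.0):
    gap = position_at(t, runner_a) - position_at(t, runner_b)
    print(f"t={t:.1f}s  gap={gap:+.2f} yd")  # positive means runner A is ahead
```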
NFL Media is also leveraging its Azzurro TeamCam system to provide live shots throughout its press-conference coverage. The TeamCam system, which NFL Network has used for a variety of needs for several years, features a single camera and transports bidirectional HD signals via a public-internet connection — along with IFB, comms, and tally — between Indianapolis and Culver City. In addition to a show produced onsite during the first two days, all press conferences are fed to Culver City via the TeamCam system.
“It’s interesting what we do for our live shots with the TeamCam system,” says Shaw. “We can just do one-off cameras, or we can bring it back; we can do two-ways just with a single camera. It’s a great [tool] for our Wednesday and Thursday coverage.”
NFL Digital Bigger Than Ever at Combine
NFL Digital’s presence continues to grow at the Combine. NFL Now Live is streaming on NFL.com, the NFL app, and Yahoo.com Friday-Monday beginning at 9 a.m. ET. In addition, NFL Media is providing extensive social-media coverage across Twitter, Facebook, Instagram, and Snapchat. Twitter Amplify is being used to produce highlights, distribute on-the-ground original content of top achievements across social networks, and deliver original social content to all 32 NFL clubs. On top of that, for the first time, the NFL is coordinating with some of the top college football programs to share, create, and amplify social-media content from Indianapolis.
In addition to live coverage, each prospect goes through the “Car Wash” following his press conference at the convention center. Each player progresses through interviews with NFL Media’s features team, digital team, and social-media team.
“These [Car Wash] interviews help us build features and get footage for the Draft,” says Shaw. “It also helps us down the road, and we’ll use footage all the way through the season. This is an NFL Media-exclusive event, so we go out of our way to give the avid NFL fan that inside position they don’t usually get to see.”
February 28, 2018
Sports Video Group
NFL Network will produce and broadcast 11 live American Flag Football League (AFFL) games during its debut season, as well as distribute highlights from the AFFL’s upcoming 2018 U.S. Open of Football (USOF) Tournament. The agreement is the first-ever broadcast deal for professional flag football, and “provides a unique opportunity for the NFL to explore digital distribution of AFFL content,” according to the league’s announcement. The 11 game telecasts will be produced by NFL Network and feature NFL Network talent.
“Today marks great progress for football fans and players,” says AFFL CEO/founder Jeffrey Lewis. “As the first-ever broadcast and distribution deal focused on bringing the game of flag football to the broadest possible audience, we are thrilled to partner with NFL Network, the premier platform for football.”
The AFFL is set to launch this summer, and NFL Network is expected to build on the unique use of technology deployed for coverage of the AFFL’s first exhibition game on June 27, 2017, at Avaya Stadium in San Jose, CA. In an effort to create a wholly revamped football-viewing experience similar to the Madden NFL gaming look, the AFFL production team deployed SkyCam as the primary play-by-play angle (prior to NBC Sports’ decision to do so for several games during the 2017 NFL season), RF cameras inside the huddle, and SMT virtual graphics and augmented-reality elements all over the field.
The USOF is a 132-team, single-elimination tournament that will ultimately pit a team of elite former professionals against a team that has conquered a 128-team open national bracket. The tournament marks the AFFL’s first major competition, following an exhibition game in June 2017. NFL Network will televise 11 USOF games live June 29-July 19, concluding with the Ultimate Final, where America’s Champion and the Pros’ Champion will meet in a winner-take-all contest for $1 million.
The broadcasts are currently scheduled for the following dates:
The four Pro teams are expected to be led by Michael Vick, Chad “Ochocinco” Johnson, basketball duo Nate Robinson and Carlos Boozer, Justin Forsett, and Olympic champion Michael Johnson. Airtimes and broadcast talent for USOF games on NFL Network will be announced at a later date.
“Football fans are passionate about having continuous access to entertaining football content all year round,” said Mark Quenzel, SVP, programming and production, NFL. “AFFL games on NFL Network will give viewers a chance to experience a new kind of football competition in the summer months, and we’re excited for the opportunity to deliver more live programming that fans enjoy.”
The AFFL is extending the application deadline for the USOF from March 1 to March 8. Interested applicants can apply to play in the USOF here. Those selected will play in America’s Bracket, which comprises 128 teams.
February 19, 2018
Sports Video Group
One of the highlights of Turner’s NBA All-Star Saturday Night coverage was the debut of a shot-tracking technology developed by Israeli startup RSPCT. Deployed for the Three-point Contest, RSPCT’s system, which uses a sensor attached to the backboard to identify exactly where the ball hits the rim/basket, was integrated with SMT’s graphics system to offer fans a deeper look at each competitor’s shooting accuracy and patterns.
“There is a story behind shooting, and we believe it’s time to tell it. Shooting is more than just a make or a miss,” says RSPCT CEO Oren Moravtchik. “Turner and the NBA immediately understood that the first time they ever saw [our system] and said, Let’s do it.”
During Saturday night’s telecast, Turner featured an integrated scorebug-like graphic showing a circle representing the rim for each of the five racks of balls in the competition. As a player took a shot, a marker indicating where the ball hit the rim or landed inside the basket was inserted in real time, building up a visible grouping for each rack.
“It’s a bridge between the deep analytics that teams are using and the average fan,” says RSPCT COO Leo Moravtchik. “Viewers can understand shooting accuracy faster and better without having to dive into analytics; they clearly see groupings of shots and why a shot is made or missed. Last night, if a player missed all five shots of a rack, you could see why: if they are all going right or all going left.”
The system, which can be set up in just 30 minutes, consists of a small Intel RealSense Depth Camera mounted behind the top of the backboard and connected wirelessly to a small computing unit.
“We have some very sophisticated proprietary algorithms on the sensor,” says Oren Moravtchik. “The ball arrives at a high speed from the three-point line at various angles. We can [capture] the entire trajectory of the ball: where it came from, how it flew in the air, where it hit the basket — everything. We know the height of the player, the release point, and where it hit the basket, and then we can extrapolate back from there.”
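RSPCT has not published its algorithms; the sketch below is only a generic illustration of the idea Moravtchik describes: fit a simple ballistic model to a few 3D ball samples (the kind a depth camera could provide) and extrapolate where the ball reaches rim height. The sample format, units, and 10-ft rim height are assumptions for illustration.

```python
import numpy as np

RIM_HEIGHT_FT = 10.0  # standard rim height, used as the extrapolation target

def fit_trajectory(samples):
    """Fit x(t), y(t) as lines and z(t) as a parabola via least squares.

    `samples` is a hypothetical list of (t, x, y, z) tuples in seconds/feet,
    captured while the ball is in flight.
    """
    t, x, y, z = (np.array(col, dtype=float) for col in zip(*samples))
    ax, bx = np.polyfit(t, x, 1)        # x ~ ax*t + bx
    ay, by = np.polyfit(t, y, 1)        # y ~ ay*t + by
    cz, bz, az = np.polyfit(t, z, 2)    # z ~ cz*t^2 + bz*t + az
    return (ax, bx), (ay, by), (cz, bz, az)

def point_at_height(fit, height, after=0.0):
    """Return (x, y, t) where the fitted arc descends through `height` after `after`."""
    (ax, bx), (ay, by), (cz, bz, az) = fit
    roots = np.roots([cz, bz, az - height])
    real = sorted(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > after)
    if not real:
        return None
    t_hit = real[-1]  # latest qualifying root lies on the descending branch
    return (ax * t_hit + bx, ay * t_hit + by, t_hit)

# Made-up in-flight samples (t, x, y, z) in seconds and feet
samples = [(0.0, 0.0, 0.0, 7.0), (0.2, 3.0, 0.2, 10.4),
           (0.4, 6.0, 0.4, 12.4), (0.6, 9.0, 0.6, 13.1)]
fit = fit_trajectory(samples)
print(point_at_height(fit, RIM_HEIGHT_FT, after=0.6))  # (x, y, t) at rim height
```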
Although Saturday night marked the debut of the RSPCT system for the NBA, Leo Moravtchik sees far more potential once complete data sets on players can be captured — such as a full playoff series or even a full season.
“There may be an amazing player shooting 18 out of 20 from every [three-point] location, but there are differences between locations beyond just field-goal percentage,” he says. “Based on our data, we not only can show them [their] shooting [tendencies], we can actually project their field goals for the next 100 shots. We can tell them, If you are about to take the last shot to win the game, don’t take it from the top of the key because your best location is actually the right corner.”
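As a back-of-the-envelope sketch of the per-location projection Moravtchik describes, the snippet below estimates a make probability for each shooting location with simple smoothing and projects expected makes over the next 100 attempts. The numbers and the smoothing choice are illustrative assumptions, not RSPCT’s actual model.

```python
def project_next_shots(history, n_next=100, prior_makes=1, prior_attempts=2):
    """Project expected makes over the next `n_next` shots per location.

    `history` maps a location label to (makes, attempts); the Laplace-style
    prior keeps small samples from producing extreme estimates.
    """
    projections = {}
    for loc, (makes, attempts) in history.items():
        p = (makes + prior_makes) / (attempts + prior_attempts)
        projections[loc] = {"est_pct": round(p, 3),
                            "proj_makes_next": round(p * n_next, 1)}
    return projections

# Made-up shooting history by three-point location
history = {"left corner": (18, 40), "top of key": (30, 90), "right corner": (26, 50)}
for loc, proj in project_next_shots(history).items():
    print(loc, proj)
```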
RSPCT is not only focusing on sports broadcast and media clients but marketing the system as a scouting and player-development tool.
“We’re [targeting] NBA teams, college teams, and even high school and amateur teams,” says Leo Moravtchik. “Wherever there is a basket — camps, gyms, schools — people want to see how they are shooting. We can bring it there because it’s a 30-minute installation and very cost-effective.”
February 16, 2018
Sports Video Group
The 60th running of the Daytona 500 takes place this Sunday, and Fox Sports, as it has done every year, again has found a way to push the technological envelope and expand on the resources dedicated to broadcasting the Great American Race. Coverage of this year’s race includes the introduction of Visor Cam, the return (and refinement) of the dedicated Car Channels on Fox Sports GO, and — in an industry first — a tethered drone that will provide live coverage from behind the backstretch at Daytona International Speedway.
“Every year, there’s something new,” says Mike Davies, SVP, field and technical operations, Fox Sports. “The Daytona 500 is always a great way to kick off the first part of the year in terms of technological testing: a lot of the things that we bring down to Daytona to look at, to test, and to try are things that manifest themselves later and in other sports. It’s a lot of fun to dream these things up.”
A Unique Point of View
This weekend’s race will feature all the camera angles that racing fans have come to expect, plus a few new views that promise to enhance the broadcast. Fans have grown accustomed to seeing their favorite drivers up close thanks to in-car cameras, but, on Sunday, they’ll be able to see what the driver sees.
Visor Cam, which first appeared at the Eldora NASCAR Camping World Truck Series race last year, makes its Daytona 500 debut this weekend. The small camera, developed by BSI, will be clipped to the helmets of Kurt Busch (last year’s Daytona 500 champion) and Daniel Suarez.
“You can try to put cameras everywhere you can, but seeing what the driver is seeing through a camera placed just above his eye line on his visor is pretty cool,” says Davies. “We’re looking forward to having that at our disposal.”
Fox Sports worked closely with NASCAR and ISC to provide aerial drone coverage of the Daytona 500. The drone, which will be tethered to allow longer periods of flight time, will move around behind the backstretch — outside of the racing area — to cover the race from a new angle.
Gopher Cam, provided by Inertia Unlimited, returns for its 10th year with enhanced lens quality for a wider, clearer field of view. Three cameras will be placed in the track, including one in Turn 4 and another on the backstretch.
Cameras, Cameras Everywhere
Fox Sports will deploy a record number of in-car cameras during the Daytona 500. Sunday’s broadcast will feature 14 in-car cameras, including one in the pace car — more than in any NASCAR race in the past 15 years. Each car will be outfitted with three cameras for three viewing angles.
Last year, Fox Sports launched two dedicated Car Channels on the Fox Sports GO app, each focusing on a single driver. For this year’s race, Fox Sports has opted for a team approach, showing multiple drivers, cars, and telemetry data on the channel.
In total, Fox Sports will deploy 20 manned cameras, including three Sony HDC-4300s operating in 6X super-slo-mo, one Sony HDC-4800 operating in 16X HD slo-mo, and an Inertia Unlimited X-Mo capturing 1,000 frames per second. Fox Sports will outfit its Sony cameras with a variety of Canon lenses, ranging from handheld ENG to the DIGISUPER 100. The network will also have four wireless roving pit/garage camera crews, 10 robotic cameras around the track (plus three robotic Hollywood Hotel cameras), and a jib camera with Stype augmented-reality enhancement. The Goodyear Blimp will provide aerial coverage.
Not to be forgotten, viewers will be treated to all the sounds of the race as well, thanks to more than 100 microphones surrounding the track. Fox Sports plans to make use of in-car radios throughout the broadcast, both in real time (having the drivers and crew chiefs narrate the race) and after the fact (using the audio to tell a story).
A Compound Fit for the Super Bowl of Racing
For the first time in 12 years, Game Creek Video’s FX mobile unit will not handle Fox Sports’ Daytona 500 production. Instead, Game Creek’s Cleatus (known by another network as PeacockOne) will be responsible for the main race broadcast and will be joined in the compound by 11 additional units for digital production, editing, RF cameras and audio (BSI), telemetry and graphics (SMT), and studio production. Two satellite uplink trucks will be onsite, as well as a set of mobile generators that will provide nearly 2 MW of power independent of the local power source.
Fox Sports is shaking up its transmission as well, relying on an AT&T gigabit circuit capable of transmitting eight video signals (and receiving four) via fiber by way of its Charlotte, NC, facility to Fox Sports’ Pico Blvd. Broadcast Center in Los Angeles.
“Based on some of the things that we’re doing for the World Cup in Moscow as well as home-run productions for MLS and college basketball, we’ve taken some of that knowledge and leveraged it [for] full-on contribution for NASCAR,” Davies explains. “It’s exciting, it’s scalable, and we’re looking forward to doing it. AT&T put in [a] circuit at every track or is in the process of doing so, so this is a first foray into IP transmission as it relates to NASCAR.”
The benefit of transitioning to IP transmission, according to Davies, is the volume of content that Fox Sports will be able to send from tracks that notoriously lack connectivity. “At the end of the day,” he says, “we’ll be able to leverage resources from Charlotte and Pico to do more things. Right now, we’re able to contribute more to our Charlotte shows via fiber, but, like everything in technology, the more we get used to it and the more we know how to use it, the more useful it’s going to be.”
Daytona 500 Gets a Graphics Makeover
The on-air graphics package for the Daytona 500 will be new, featuring much of the look and feel of Fox Sports’ football, basketball, and baseball graphics with all the data that NASCAR fans expect.
Fox Sports will up the ante on virtual graphics and augmented reality, deploying Stype camera-tracking technology (with a Vizrt backend) on a jib between Turns 3 and 4 in order to place 3D graphics within the broadcast. For example, the system can be used to create virtual leaderboards, sponsor enhancements, and race summaries that are placed on Turn 3 as virtual billboards.
“Where that jib is between Turns 3 and 4, you can place graphics [on screen in] such a way that you don’t necessarily have to leave the track in order to get information across,” Davies explains. “In the past, we might have used full-screen graphics, but now, we can put the graphics in space, and it looks pretty cool. It’s the third year that we’ve been doing that, and we seem to get better at it each year.”
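Stype’s tracking data and the Vizrt backend handle this in the actual broadcast; purely to illustrate the core idea behind camera-tracked virtual graphics, the sketch below projects a 3D world-space point (say, a corner of a virtual billboard near Turn 3) into screen coordinates with a basic pinhole-camera model. The rotation, camera position, and intrinsics are made-up stand-ins for the calibrated per-frame pose a tracking system would supply.

```python
import numpy as np

def project_point(world_pt, cam_pos, R, focal_px, cx, cy):
    """Project a 3D world point to pixel coordinates with a pinhole model.

    R is the 3x3 world-to-camera rotation and cam_pos the camera position in
    world space; focal_px, cx, cy are illustrative intrinsics.
    """
    p_cam = R @ (np.asarray(world_pt, dtype=float) - np.asarray(cam_pos, dtype=float))
    if p_cam[2] <= 0:
        return None  # point is behind the camera
    u = focal_px * p_cam[0] / p_cam[2] + cx
    v = focal_px * p_cam[1] / p_cam[2] + cy
    return u, v

# Made-up example: identity rotation, billboard corner 50 m in front of the camera
R = np.eye(3)
corner_world = [10.0, 2.0, 60.0]   # meters, arbitrary world frame
print(project_point(corner_world, cam_pos=[0, 0, 10], R=R,
                    focal_px=1800, cx=960, cy=540))   # -> (1320.0, 612.0)
```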
The network has also enhanced its 3D-cutaway car, putting these graphics in the hands of the broadcast team. And, in the booth, Fox Sports NASCAR analyst Larry McReynolds will have his own dedicated touchscreen, allowing him to enhance any technical story and give the viewer clear illustrative explanations during the race.
A Company-Wide Effort
Between the production personnel, camera operators, engineers, on-air talent, and many more, Fox Sports currently has 300 people onsite at the Daytona International Speedway. In addition, Fox Sports’ Pico and Charlotte facilities, as well as its network-operations center in The Woodlands, TX, are very much a part of the action. And, when the Daytona 500 starts on Sunday, all will be ready to deliver this year’s race to NASCAR fans everywhere.
“Between everything that you’re going to see on-screen and everything under the hood, these are all things that are going to help the company as a whole,” says Davies. “We’ve been able to bring together all of the resources across the company, and it’s particularly exciting to get everybody working as one on this event.”
February 8, 2018
Digital Journal
DURHAM, N.C.--(Business Wire)--NBC Olympics, a division of the NBC Sports Group, has selected SMT to provide real-time, final results and timing interfaces for its production of the XXIII Olympic Winter Games, which take place in PyeongChang, South Korea, from February 8 - February 25. The announcement was made today by Dan Robertson, Vice President, Information Technology, NBC Olympics, and Gerard J. Hall, Founder and CEO, SMT.
Since 2000, SMT has been a key contributor to NBC Olympics’ productions by providing results integration solutions that have enhanced NBC’s presentations of the Games via on-air graphics, scheduling, and searches for content in the media-asset–management (MAM) system.
For the 2018 Olympic Winter Games, SMT will deliver TV graphics interfaces for NBC Olympics’ Chyron Mosaic systems in its coverage of alpine skiing, freestyle skiing, snowboarding, figure skating, short track speed skating, speed skating, bobsled, luge, skeleton, ski jumping and the ski jumping portion of Nordic combined.
SMT’s Point-in-Time software system integrates live results to allow commentators to locate a specific moment during a competition in both live and recorded coverage. The software graphically shows key events on a unified timeline so that NBC Olympics commentators can quickly see how a race began, when a lead changed, where an athlete’s performance improved, and other details that dramatically enhance the stories of triumph and defeat intrinsic to the 2018 Winter Games.
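SMT has not described how Point-in-Time is built; as a conceptual sketch only, the snippet below keeps timestamped results on a sorted timeline and answers the kinds of queries mentioned above, such as when the lead changed or what had just happened at a given timecode. The event format is hypothetical.

```python
import bisect

class Timeline:
    """A minimal sorted timeline of (timecode_seconds, leader, note) events."""

    def __init__(self):
        self.events = []

    def add(self, timecode, leader, note=""):
        bisect.insort(self.events, (timecode, leader, note))

    def lead_changes(self):
        """Return events where the leader differs from the previous event."""
        changes, prev = [], None
        for t, leader, note in self.events:
            if prev is not None and leader != prev:
                changes.append((t, leader, note))
            prev = leader
        return changes

    def at_or_before(self, timecode):
        """Return the most recent event at or before the given timecode."""
        times = [t for t, _, _ in self.events]
        i = bisect.bisect_right(times, timecode) - 1
        return self.events[i] if i >= 0 else None

# Hypothetical race events
tl = Timeline()
tl.add(12.4, "Skater A", "leads at first split")
tl.add(71.0, "Skater B", "takes the lead")
tl.add(98.6, "Skater B", "crosses the line first")
print(tl.lead_changes())      # [(71.0, 'Skater B', 'takes the lead')]
print(tl.at_or_before(80.0))  # (71.0, 'Skater B', 'takes the lead')
```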
“The complexity and sheer amount of scoring, tracking, and judging data that comes with an event of this size, both real-time and post production, is beyond compare,” said Robertson. “The ability to organize and deliver it aids NBC’s production in presenting the stories of these amazing athletes, and requires nothing short of the capabilities, innovation and track record of SMT.”
“It is our privilege to provide our expertise, experience, and results reporting technology for NBC Olympics’ production of the 2018 Olympic Winter Games, SMT’s 10th straight Olympics,” said Hall. “Our team of 10 on-site engineers have rigorously prepared for PyeongChang with a tremendous amount of testing and behind-the-scenes work, ensuring SMT delivers seamless services of a scope and scale unprecedented in a sports production.”
SMT’s partnership with NBC Olympics began with the 2000 Sydney Games and has included providing graphics interfaces as well as NBC’s digital asset management interface that helped the network receive Emmy Awards for “Outstanding Team Technical Remote,” following the 2008 and 2016 Games.
About NBC Olympics
A division of the NBC Sports Group, NBC Olympics is responsible for producing, programming and promoting NBCUniversal's Olympic coverage. It is renowned for its unsurpassed Olympic heritage, award-winning production, and ability to aggregate the largest audiences in U.S. television history.
For more information on NBC Olympics’ coverage of the PyeongChang Olympics, please visit: http://nbcsportsgrouppressbox.com/.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events, including the Super Bowl, major golf and tennis events, the Indianapolis 500 and the World Series. The 31-time Emmy Award-winning company is headquartered in Durham, N.C. For more information, visit smt.com.
February 5, 2018
Sports Video Group
To put it mildly, the 2017-18 NFL campaign has been a memorable one for SkyCam. In a matter of months, the dual-SkyCam model — an unheard-of proposition just a season ago — has become the norm on high-profile A-game productions. The company also unveiled its SkyCommand at-home–production control system in conjunction with The Switch, with plans to continue to grow this central-control model. And last year, SkyCam worked with SMT to debut the 1st & Ten line and other virtual graphics on the SkyCam system; today, that is standard practice on almost any show using a SkyCam.
At Super Bowl LII, SkyCam once again deployed dual SkyCams, with the high angle focusing on an all-22 look and the lower SkyCam focusing on play-by-play. SVG sat down with Chief Technology Officer Stephen Wharton at U.S. Bank Stadium during Super Bowl Week to discuss SkyCam’s role in NBC’s game production, the rapidly growing use of dual SkyCams by broadcasters, NBC’s use of the system as the primary play-by-play game camera on a handful of Thursday Night Football games this season, and an update on the company’s SkyCommand at-home–production control system, which was unveiled at NAB 2017.
Tell us a bit about your presence at U.S. Bank Stadium and the role SkyCam will play in NBC’s Super Bowl LII production?
We were fortunate enough to be here with Fox for the Wild Card Game, and that allowed us to keep a majority of our infrastructure in place. Also, when the stadium was built, they built in a booth for SkyCam and cabled the building, so that obviously helped us quite a bit. But we’ve been here since Sunday working with the halftime show to make sure that our rigging isn’t in the way of them and they’re not in the way of us. And then, Monday, full crew in for Tuesday first-day rehearsal, and then all the way through the week.
In a matter of months, several major NFL broadcasters have adopted the dual-SkyCam model. What are the benefits of two SkyCams?
We used to say you knew you had a big show when you had SkyCam on it. Now you have a big show when you have two SkyCams on it. I think one of the key driving factors for [the increased use of] dual SkyCam was working with the NFL and the broadcasters to better highlight Next Gen Stats. And, working with SMT on their auto render system, one of the big values that we now bring is this ability to show you the routes and what’s going on with each player as the play develops from the overhead all-22 position.
It just so happened that, as the dual systems started to evolve, we got this amazing opportunity in Gillette Stadium when the fog came in and no other cameras could be used. Typically, you think of SkyCam as being used for the first replay camera; we’re not necessarily live. But, in that instance, we had to go live with SkyCam, and the first replay became the high SkyCam. That opportunity changed how we are seen and used. It demonstrated what you could do with SkyCam, and that obviously penetrated all the other networks. You get two totally different angles, one more tactical and one play-by-play, and there’s really no sacrifice. You’re not giving anything up on the lower system; you’re actually helping because you don’t have to chase down beauty shots and comebacks since the upper system can do that. The lower system can just focus on play-by-play.
Do you expect the use of dual SkyCams for NFL coverage to continue to grow next season?
I think that you’ll continue to see the dual SkyCams become more of the norm, not just for the playoff games but for most A-level shows, because it brings such a value for both Next Gen Stats and the broadcasters. We’re obviously super excited about that.
I think there’s a bifurcation between audiences in terms of [SkyCam] as a primary angle: some really love it, and some don’t like it. But what you’re seeing in broadcast today with the growth of technology and evolving media is that people end up with a buffet of options to choose from: OTT, streaming, mobile, television, or something else. And there is a market for all of it. I think, at the national level, you’ll see more play-by-play action live from SkyCam because broadcasters will be able to use it and distribute it however they like.
At NAB 2017, you introduced SkyCommand, an at-home–production tool that allows SkyCam operators to be located remotely. Do you have any update on this platform, and are broadcasters using it already?
We have seen tremendous interest. People are asking where and when they can do this, but there are obviously a couple different challenges we have to address: one, since it’s a cost-saving model, you’re looking at lower-tier shows in venues that don’t have much infrastructure in most cases. That said, when you take lower-tier games that happen to take place in venues that [have the necessary infrastructure], it becomes very appealing. Most of our network partners have been very interested in finding ways of utilizing SkyCommand for [at-home] production. [Our partners] Sneaky Big Studios and SMT are on board, and we’re looking at doing a lot more of it in 2018. We’ve actually got some pilot programs already.
Just a couple weeks ago, we relocated SkyCam into an 80,000-sq.-ft. facility a few miles down the road from our old facility. It’s a brand-new facility, built from the ground up, that’s tailored to our needs. We’ve got two entire broadcast booths with SkyCommand in mind. One is a network-operation center with full streaming capabilities and data connectivity to the games that we’re doing. Beyond SkyCommand, when our operators are onsite, we will have a guy in Fort Worth who is basically at NOC watching the game. This person will be looking at the responses coming out of the computer systems and will be on PLs with the [on-site operators]. And then we can send that video back to the NOC and address any type of issues that we have; it gives us a great ability to manage that. The second booth is where we can actually put an operator and a pilot.
We’re continuing to work with the network vendors —The Switch, CenturyLink, and others — but we’ve already got full 10-gig fiber to the facility. So we’re working now to put all that in place for SkyCommand. I think you’ll see that more in 2018.
In what other sectors is SkyCam looking to grow in the near future?
We’re also trying to expand [permanent SkyCam installations] throughout the NFL. I expect that we will have some other announcements coming out shortly about additional teams building on what we did with the Baltimore Ravens last year. Those team SkyCams will continue to grow in 2018, and we’re looking at leveraging SkyCommand specifically for those cases.
February 5, 2018
Sports Video Group
SMT (SportsMEDIA Technology) is bringing a number of Super Bowl firsts to Minneapolis on both the broadcast and the in-venue production side. On NBC’s Super Bowl LII broadcast, SMT will deploy a telestrator on the high SkyCam for the first time and also will have the 1st & Ten line available on additional cameras. The in-venue production will offer the 1st & Ten line on the videoboards for the first time in a Super Bowl and will also feature enhanced NFL Next Gen Stats integration.
“It’s always exciting to do something brand new for the first time,” says SMT Coordinating Producer Tommy Gianakos, who leads the NBC SNF/TNF team. “And it’s even better when you’re doing it on the biggest show of the year with a lot of extra pieces added on top.”
In addition, during the Super Bowl LII telecast, NBC Sports’ production team will have access to a new telestration system on the high SkyCam for first replays.
“We’re now adding some telestration elements on SkyCam,” Gianakos explains. “In the past, we’ve been able to have a tackle-box [graphic] on one of the hard cameras if there’s an intentional-grounding play, but we haven’t been able to do it from high and low SkyCam on first or second replay. That intentional-grounding [virtual graphic] right above the tackles on SkyCam is something we haven’t been able to do before, but now we are able to do pretty instantaneously.”
SMT demonstrated it for NBC Sports producer Fred Gaudelli on Friday when a high school football team was on the field, and NBC opted to move forward with the system for the game.
“We’re able to do backwards-pass line virtually in real space; we’re able to measure cushions, able to paint routes on the field, all very rapidly,” says Ben Hayes, senior account manager, SMT. “It’s pretty unique to this show and the first time we’re going to be doing it on-air.”
In addition to having the live 1st & Ten line on both SkyCams and the same six hard cameras available for NBC’s Thursday Night Football and Sunday Night Football telecasts, SMT has added it to the two goal-line cameras, the all-22 camera, and two more iso cameras.
SMT also added next-gen DMX switchboard connectivity to NBC’s scorebug, so on-field graphics will update in real time and list personnel and formations of both teams.
“From a crew standpoint, it was really nice for us to have both Thursday Night Football and Sunday Night Football this season because it gave us a second group of people that understood the expectations of this show and what Fred and [director] Drew [Esocoff] really want from the show,” says Hayes. “We were basically able to merge those two crews for this game and not miss a beat.”
On the Videoboards: 1st & Ten Line, Enhanced Next Gen Stats
Fans at the stadium will be able to see the 1st & Ten line system on the videoboards. For the first time at a Super Bowl, the yellow virtual line will be deployed on three cameras — at the 50-yard line and both 25-yard lines — for the in-venue videoboard production.
Also, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering real-time content not available on broadcasts.
“It’s amazing to be doing this here at Super Bowl,” says Ben Grafchik, business development manager, SMT. “Obviously, we can build upon the technology in the future, but this is our first step into it. And then I’m looking to try to continue that going forward.”
Fans inside U.S. Bank Stadium will have access to real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score–graphics publishing system that requires just a single operator.
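The zone definitions behind the in-stadium passing grids aren’t spelled out here; the sketch below shows one plausible way such a grid could be tallied from a hypothetical list of pass attempts, bucketing each throw by depth band and field third. The field names and thresholds are illustrative assumptions, not the league’s actual schema.

```python
from collections import defaultdict

def passing_grid(attempts):
    """Tally completions/attempts in a grid of (depth band, field third).

    `attempts` is a hypothetical list of dicts with 'air_yards',
    'field_third' ('left' | 'middle' | 'right'), and 'complete' (bool).
    """
    def depth_band(air_yards):
        if air_yards < 10:
            return "short"
        if air_yards < 20:
            return "intermediate"
        return "deep"

    grid = defaultdict(lambda: [0, 0])  # zone -> [completions, attempts]
    for a in attempts:
        zone = (depth_band(a["air_yards"]), a["field_third"])
        grid[zone][1] += 1
        grid[zone][0] += int(a["complete"])
    return {zone: f"{c}/{n}" for zone, (c, n) in grid.items()}

# Made-up attempts purely for illustration
attempts = [
    {"air_yards": 6, "field_third": "left", "complete": True},
    {"air_yards": 24, "field_third": "right", "complete": False},
    {"air_yards": 12, "field_third": "middle", "complete": True},
]
print(passing_grid(attempts))
```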
“We knew the Minnesota Vikings were already doing virtual and NFL Next Gen Stats, so we started thinking about what we could do to spice it up for the Super Bowl,” says Grafchik. “We’re throwing a lot of things at this production in hopes of seeing what sticks and what makes sense going forward for other venues.”
In the lead-up to the game, SMT worked with the league to merge the NFL Game Statistics & Information System (GSIS) feed with NFL Next Gen Stats API to come up with a simple lower-thirds graphics interface. This will allow the graphics operator to easily create and deploy a host of new deep analytics graphics on the videoboard during the game.
“These additional NGS elements get viewers used to seeing traditional stats along with nontraditional stats when they are following the story of the game,” says Grafchik. “If Alshon Jeffery has a massive play, the operator can instantly go with the lower third for his average receptions per target. The whole plan was to speed up this process so that this individual isn’t [creating] true specialty graphics; they’re just creating traditional graphics with extra spice on top of it. By getting quick graphics in like that, it helps to tell a story to the viewer in-venue without much narration on top of it.”
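The actual GSIS and Next Gen Stats schemas aren’t given here; as a conceptual sketch of the merge described above, the snippet below joins two stand-in feeds on a player ID to assemble a lower-third payload that pairs a traditional stat line with an NGS extra. All field names and values are hypothetical.

```python
def build_lower_third(player_id, gsis_feed, ngs_feed):
    """Join hypothetical GSIS and Next Gen Stats records on a player ID.

    Both feeds are stand-in dicts keyed by player ID; the field names are
    illustrative, not the leagues' actual schemas.
    """
    gsis = gsis_feed.get(player_id, {})
    ngs = ngs_feed.get(player_id, {})
    if not gsis:
        return None
    return {
        "name": gsis.get("name", "Unknown"),
        "traditional": f'{gsis.get("receptions", 0)} REC, {gsis.get("rec_yards", 0)} YDS',
        "ngs_extra": f'{ngs.get("avg_yards_per_target", 0.0):.1f} avg yds/target',
    }

# Stand-in feed snapshots with made-up values
gsis_feed = {"00-0032": {"name": "WR Example", "receptions": 5, "rec_yards": 78}}
ngs_feed = {"00-0032": {"avg_yards_per_target": 11.4}}
print(build_lower_third("00-0032", gsis_feed, ngs_feed))
```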
February 4, 2018
Sports Video Group
Since the first beam went up on this massive structure in Downtown Minneapolis, U.S. Bank Stadium has been building to this moment. Super Bowl LII is here, and an all-star team from Van Wagner Sports & Entertainment Productions, stadium manager SMG, and the Minnesota Vikings is ready to put on a Super Bowl videoboard production for the ages.
When 66,000-plus pack into the sparkling bowl, they’ll be treated to quite a few in-venue firsts on those boards, including the Super Bowl debut of SMT’s Yellow 1st & 10 line, a completely new Super Bowl LII graphics package, and an expanded arsenal of camera angles.
“Every Super Bowl, we’re tasked with moving the needle,” says Bob Becker, EVP, Van Wagner Sports & Entertainment (VWSE) Productions, which has designed the videoboard. “What can we do differently this Super Bowl that we haven’t done in the past? That’s our constant challenge. This is my 23rd [Super Bowl], and, every year, it gets bigger and bigger and bigger. When it’s over, you say, ‘Wow, what a great job,’ and then you start stressing about next year and wonder, ‘Well, how do we top that?’ That’s how I feel about that: you’ve got to always up your game.”
The stadium’s crown jewels are a pair of Daktronics video displays behind the end zones that measure 68 x 120 ft. and 50 x 88 ft., respectively. This year, for the first time at a Super Bowl, those boards will feature a full complement of the Yellow 1st & 10 line. SMG and the Vikings had a standing relationship with North Carolina-based SMT throughout the season, offering the yellow line encoded on their 50-yard-line camera. For the Super Bowl, they chose to expand it to include the other main cameras at each of the 20-yard lines. SMT’s Ben Grafchik will be sitting at the front of the control room, preparing specialty data-driven graphics, tickers, and data feeds for the control-room crew to call up as they desire.
Those advanced graphics are part of a completely fresh graphics package that Van Wagner has developed for this game. It’s the classic hard work done by the company: build a season’s worth of graphics to be used on a single night. Also, not only does Van Wagner come in and take over the U.S. Bank Stadium control room, but its team has basically torn it apart, pulling out gear and replacing it with specialty systems in order to take the videoboard show to that next level.
“It’s not because it’s not good,” says Becker, “but that’s how we make it bigger and better. Sometimes, you’ve got to bring technology in to make it bigger and better. And, to these guys’ credit, they have not only been there from Day One for us but have been open to allowing us to tear apart their room and integrate these new things. And it happens a lot that they go, Hey, you know something, I’d love to use that for a Vikings season next year. So there’s benefit on both sides.”
One of the vendors that has gone above and beyond for the control room is Evertz. The company has provided a crosspoint card for redundancy and the EQX router while also supplementing with some spare input cards, output cards, and frame syncs.
It’s a challenging effort to make temporary alterations to the control room, but SMG and the Vikings have welcomed the opportunity to expand with open arms.
“There’s a reason I took this job,” says Justin Lange, broadcast operations coordinator for U.S. Bank Stadium, SMG. “This is a prestigious event, and this is big for this city, the Vikings, and for us as a company. It’s been a great experience. It’s a great opportunity for us to showcase what we can do with this room, what we can do with these boards. The sightlines are great in this facility. The boards are great, the IPTV system is expansive, and we’re just excited to showcase what we have to offer as a facility.”
Normally, the control room features both Evertz IPX and baseband routing, an 8M/E Ross Acuity switcher with 4M/E and 2M/E control panels to cut secondary shows, and Ross XPression graphics systems. The all-EVS room houses a wide range of EVS products, including three 12-channel 1080p replay servers, one 4K replay server, IPDirector, Epsio Zoom, and MultiReview.
For the Super Bowl, the control room will have more cameras to choose from than it has ever had before: a total of 18 in-house cameras deployed throughout the bowl (more than the normal eight for a Vikings game), including four RF handhelds, an RF Steadicam, and two robotics.
The crew is also an impressive sight to behold. Nearly 100 people are working on the videoboard show in the combined efforts between Van Wagner, SMG, and the Vikings. There’s also a handful of editors across the street in the 1010 Building (where many broadcasters have set up auxiliary offices) cutting highlight packages and team-specific content.
“This is the biggest event in the world,” says Becker, “and we and the NFL mean to acknowledge that. We’re willing to do what needs to be done to put on the biggest event in the world.”
February 2, 2018
NBC Sports
NASCAR will provide its teams with more data in real time this season, giving them access to publicly available steering, brake, throttle and RPM information as well as live Loop Data for the first time.
The information will be provided for every driver on every lap of every session on track.
The steering, brake, throttle and RPM information has been available through NASCAR.com’s RaceView application, which uses the information provided by the electronic control units used in the electronic fuel injection systems. Some teams have created labor-intensive programs that scraped the data from RaceView, so NASCAR decided to save time and effort for teams by directly providing the information.
No other engine data will be released. The ECU can record 200 channels of information (of a possible 1,000 parameters). NASCAR assigns about 60 channels (including the steering, brake, throttle, and RPM), and teams can select another 140 channels to log through practices and races. Those channels will remain at the teams’ discretion and won’t be distributed by NASCAR.
NASCAR’s real-time data pipeline to teams this season also will include Loop Data, which was created in 2005 and has spawned numerous advanced statistical categories that have been available to the news media. The information was born out of a safety initiative that installed scoring loops around tracks after NASCAR ended the practice of racing to the caution flag in ‘03.
Previously, teams had been provided only lap speeds/times; now they will have speeds in sectors around the track marked by the scoring loops.
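NASCAR’s timing-feed format isn’t described in the article; purely as an illustration of what sector data adds over whole-lap times, the sketch below converts loop-crossing timestamps into per-sector average speeds. The loop names, positions, and timestamps are made up.

```python
def sector_speeds(crossings, loop_positions_m):
    """Compute average speed in each sector from loop-crossing timestamps.

    `crossings` maps loop name -> timestamp in seconds for one lap;
    `loop_positions_m` maps loop name -> distance along the track in meters.
    Both are illustrative stand-ins for the real timing feed.
    """
    ordered = sorted(loop_positions_m, key=loop_positions_m.get)
    speeds = {}
    for a, b in zip(ordered, ordered[1:]):
        dist = loop_positions_m[b] - loop_positions_m[a]
        dt = crossings[b] - crossings[a]
        speeds[f"{a}->{b}"] = round((dist / dt) * 2.23694, 1)  # m/s -> mph
    return speeds

# Made-up loops and timestamps on a roughly 2.5-mile oval
loop_positions_m = {"SF": 0, "T1": 700, "BS": 2000, "T3": 2700}
crossings = {"SF": 0.0, "T1": 9.2, "BS": 26.0, "T3": 35.4}
print(sector_speeds(crossings, loop_positions_m))
```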
Teams still won’t be given Loop Data for the pits, where the scoring loops are installed to maintain a speed limit for safety. If a scoring loop in the pits were to fail during a race, teams theoretically could take advantage of that by speeding through that loop (particularly those whose pit stall is in that sector). NASCAR does provide teams with pit speeds after races.
February 2, 2018
Stadium Business
The NFL’s popular Next Gen Stats data feed is getting a boost from real-time data delivery and graphics solutions firm SportsMEDIA Technology (SMT) for Super Bowl LII at U.S. Bank Stadium.
For the championship game this Sunday in Minneapolis, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts.
SMT’s in-stadium production integrates in-game stats into graphics displayed on the stadium’s two massive video boards, as well as on 2,000 in-concourse HD video displays.
U.S. Bank Stadium, home of the Minnesota Vikings, boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The 65,000 fans at Super Bowl LII will be presented with real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?).
“As an organisation, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, senior manager of production for the Vikings.
“For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
Ben Grafchik, SMT’s business development manager, said: “In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing.
“We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
This year’s Super Bowl pits the New England Patriots against the Philadelphia Eagles.
January 31, 2018
Business Wire
DURHAM, N.C.--(BUSINESS WIRE)--SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, today announced it is providing in-stadium solutions, including its Emmy-winning virtual 1st & Ten line system and the NFL’s new Next Gen Stats, for Super Bowl LII, to be held Feb. 4 at U.S. Bank Stadium.
For Super Bowl LII, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts. SMT’s in-stadium production combines in-game stats integrated into SMT-designed graphics packages that are displayed on the stadium’s two massive video boards, as well as 2,000 in-concourse HD video displays, offering fans a chance to watch highlights and stay informed no matter where they are in the stadium. U.S. Bank Stadium boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The more than 65,000 football fans attending the Super Bowl will be treated to a variety of valuable real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score graphics publishing system that requires just a single operator.
“As an organization, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, Senior Manager of Production for the Minnesota Vikings. “For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
“In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing,” said Ben Grafchik, SMT Business Development Manager. “We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
In addition to in-stadium solutions, SMT will provide broadcast solutions for Super Bowl LII, including the virtual 1st and Ten system, data-driven graphics and tickers, and in-game data feeds to commentator touchscreens, among other services. SMT has supported Sunday Night Football on NBC since 2006.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events. The 31-time Emmy Award-winning company is headquartered in Durham, N.C.
January 30, 2018
Sports Video Group
With the Madden NFL 18 Club Championship Finals in full swing this week and the recent announcement of a new TV and streaming deal with Disney/ESPN, EA’s Madden NFL Championship Series is squarely in the esports spotlight. The series has been moving toward this moment for months, with 11 NFL teams hosting events in which fans competed to advance to the Finals in Minneapolis this week. In its first foray into competitive gaming, SMT’s Video Production Services (VPS) group produced events for the Arizona Cardinals, Buffalo Bills, and Jacksonville Jaguars throughout the end of 2017.
“SMT’s experience with supporting top football shows like the Super Bowl and Sunday Night Football makes us uniquely positioned to attract Madden gamers to the NFL through the medium they are most attracted to: esports,” says C.J. Bottitta, executive director, VPS, SMT. “With a worldwide fan audience now estimated at 280 million, approaching that of the NFL, SMT is excited to enter the growing market of competitive gaming.”
Although the level of services SMT provided varied from show to show, the base complement for all three productions comprised a full technical team of broadcast specialists operating six cameras, multiple replay machines, and a telestration system. SMT kept pace with Madden’s lightning-quick style of play for the three-hour shows streamed on the EA Sports YouTube channel, Twitch.TV/Madden, and the EA Sports Facebook page. In addition, SMT’s Creative Studio customized EA’s promotional trailer with team-specific elements for each of the three events.
“We started doing [Madden events] with teams last year, and there has been an evolution from wanting a [small-scale] podcast-level environment to almost a broadcast-level show,” says Bottitta. “What I loved about the three teams this year was how passionate and excited they were to be doing this. Teams were handling events very differently, but all of them had great people to work with and did a wonderful job.”
Inside the Production: University of Phoenix Stadium, Glendale, AZ
The Cardinals’ Madden NFL 18 Club Championship took place on Saturday, Nov. 11, soon after the team’s Thursday Night Football home game against the Seahawks, creating a quick turnaround for SMT and the team’s production staff. SMT provided the producer (Bottitta), director, tech manager, and lead camera operator and advised on what should be added for the production.
“We primarily provided leadership for the Cardinals,” says Bottitta. “They have a fantastic facility, so we reviewed with their tech group what they had and what they needed to add for [a competitive-gaming production] like this. They have a fantastic control room, and they used the crew that they normally use except for the producer, director, tech manager, and lead cameraman, which we provided.”
Inside the Production: New Era Field, Buffalo, NY
In Buffalo, SMT provided a similar level of services for the Bills’ event on Saturday, Dec. 2, the day before the team faced off against the New England Patriots. SMT worked with the Bills to schedule the production around other shows using the team’s studio at New Era Field: a simulcast radio show, a pre/postgame show for the Buffalo Sabres, and Bills GameDay on Sunday.
SMT once again used the team’s crew primarily but provided its own producer, director, tech manager, and camera ops and added a stage manager.
“Buffalo was on a real-time crunch,” says Bottitta, “so they told us the studio they wanted to use, the schedule of the studio, and asked us what was reasonable to expect. We guided them through what would make the most sense, so we could get in there, have a rehearsal and set day and then do the show while also allowing them to still do their normal duties.”
Inside the Production: Daily’s Place Amphitheater, Jacksonville, FL
SMT ramped up its role at the Jaguars’ event, which took place the morning of a home game against the Seahawks on Dec. 10. Since it was a game day, the Jaguars crew was occupied handling the in-venue production, so SMT essentially handled the entire Madden production at Daily’s Place Amphitheater, which is connected to EverBank Field. Since the two events were happening concurrently, the Jaguars provided SMT access to their router, allowing live camera views of warmups to be integrated into the Madden show throughout.
“The Jaguars [production] was the most unique of the three because it was on game day,” Bottitta explains. “They wanted to host it on the morning of what ended up being a very meaningful December football game for the Jaguars for the first time in a long time. Since the game-day crew was obviously busy, we did the whole show. We were taking Seattle and Jacksonville warming up on the field as bump-ins and bump-outs for our show, which was great and really captured the energy of the game.”
The Broadcast Mentality: Madden NFL Coverage Continues To Evolve
As the Madden NFL Club Championship grows (all 32 NFL franchises were involved for the first time this year, with prize money totaling $400,000 at this week’s Championship), the property has made an effort to boost its production value for live streams. Bottitta believes that SMT’s experience on A-level NFL productions, including Sunday Night Football and this weekend’s Super Bowl LII, was integral in the league’s selecting SMT: “I think that made a big difference: knowing that we weren’t just a group that’s doing one more esports tournament; this is a group that does professional sports production.”
He adds that VPS aims to leverage this broadcast-level expertise by bringing in such tools as replay systems and telestrators, which would be standard on an NFL telecast.
“We tried to bring a [broadcast] philosophy to these shows and want to make it more consumable for the viewers,” he says. “We brought telestrators and replay to all of the [productions], and that was not the norm when EA launched [the Club Championship] last year. I did that not only because SMT has a very portable, very easy-to-implement telestrator system but because it really adds to the show. If you went to a game and didn’t see replays or the key camera angles, you’d be in shock. So that became a big part of our production plan.”
January 19, 2018
Sports Video Group
As the Jacksonville Jaguars look to stymie the New England Patriots’ quest for a sixth Super Bowl victory, CBS Sports will cover this Sunday’s AFC Championship from every angle — including overhead.
CBS Sports will deploy 39 cameras in Foxborough, MA: seven super-slow-motion cameras, eight handhelds, and a Steadicam; pylon cams; and a collection of 4K, robotic, and Marshall cameras. The network will also have access to Intel 360 cameras for 360-degree replays. To give viewers an aerial view, CBS will rely on a dual SkyCam WildCat aerial camera system and fly a fixed-wing aircraft over Gillette Stadium.
The CBS Sports crew will work out of NEP SSCBS and have access to 152 channels of replay from 14 EVS servers — four eight-channel XT3’s and 10 12-channel XT3’s — plus a six-channel SpotBox and one 4K server.
CBS Sports’ lead announce team Jim Nantz, Tony Romo, and Tracy Wolfson will have plenty of storytelling tools at their fingertips, including SMT’s Next Gen Tele and play-marking systems with auto-render technology on both SkyCams. The lower SkyCam will focus on the actual game play at the line of scrimmage, including the quarterback’s point of view, while the upper SkyCam will provide a more tactical, “all-22” look at the field. During the AFC Championship, Romo will be able to use these tools to break down what he sees on the field for first and second replays.
Coverage begins at 2:00 p.m. ET with The NFL Today, featuring host James Brown and analysts Boomer Esiason, Phil Simms, Nate Burleson, and Bill Cowher at the CBS Broadcast Center in New York City; kickoff follows at 3:05 p.m. ET. Fans wanting to start their day even earlier can tune in to The Other Pregame Show (TOPS) on CBS Sports Network, which runs from 10:00 a.m. to noon.
January 12, 2018
Sports Video Group
The Tennessee Titans travel to New England this weekend to take on the reigning Super Bowl champions in the AFC Divisional Round. To capture the action on the gridiron from every angle, CBS Sports will rely on dual SkyCam WildCat aerial camera systems with SMT’s Next Gen Tele and play-marking systems, as well as its virtual 1st & Ten line.
The Next Gen Tele System, which debuted during last year’s AFC Divisional Round, channels the NFL’s Next Gen Stats (NGS) data into an enhanced player-tracking telestrator. Combined with SMT’s proprietary play-marking system, which enables rendering of four virtual-player routes on the SkyCam video and its virtual 1st & Ten line, Next Gen Tele System provides a multitude of options for on-screen graphics that CBS Sports talent can leverage to better tell the story of the game.
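To make the data flow concrete, here is a minimal, hypothetical sketch of the general idea behind projecting tracking data into a camera view; it is not SMT's system. It assumes NGS supplies per-player (x, y) field positions for a play, that a 3x3 field-to-image homography H is available for the current SkyCam frame, and that draw_polyline is a caller-supplied drawing helper.

```python
# Illustrative only: project per-player NGS field tracks into the current
# SkyCam frame and draw them as route polylines. H, the route data, and the
# draw_polyline helper are assumptions for this sketch, not SMT's API.
import numpy as np

def project_route(H, route_xy):
    """Map (x, y) field points (yards) to (u, v) pixel points via homography H."""
    pts = np.array([[x, y, 1.0] for x, y in route_xy]).T   # 3 x N homogeneous points
    img = H @ pts
    img = img / img[2]                                      # perspective divide
    return list(zip(img[0], img[1]))

def render_routes(H, routes, draw_polyline):
    """Overlay up to four route polylines for the just-completed play."""
    colors = ["yellow", "cyan", "magenta", "white"]
    for route, color in zip(routes[:4], colors):            # four routes, per the article
        draw_polyline(project_route(H, route), color=color, width=3)
```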
“From a production standpoint, everything is about storytelling and conveying the story behind the game,” says Robbie Louthan, VP, client services and systems, SMT. “It’s handled in many different ways, but one way is obviously graphics. The advantage there is, you’re able to tell relevant, compelling information in a quick and succinct way without having to have the talent verbalize it to [viewers]. When you can get it reduced down to a graphic that is relevant to the viewer, you’re guaranteeing that the information you want to convey is being handled in a very quick, succinct manner, because there’s a very short time frame between plays.”
During Saturday’s game, SkyCam will focus the lower camera system on the actual game play at the line of scrimmage, showing the quarterback’s point of view. The upper system will provide more of a tactical, “all 22” look at the field. Both systems will feature SMT graphics that enhance their respective camera angles and roles.
“Our camera angle creates a view that helps tell the story better than other camera angles,” explains Stephen Wharton, CTO, SkyCam. “Our view just establishes the storytelling for those graphics better than any other camera can, and then, when you add the motion that our camera brings with it, it makes those graphics — whether NGS routes and lines or first-down markers — get placed very well within the angle of the shot, so that the story is being told.”
SMT will deploy four staffers to Gillette Stadium to support the graphics on the dual Skycam system: one operator to support the Next Gen Tele System, a dedicated operator for each of the camera systems, and one to oversee the operation and help produce the content. SkyCam will have a team of nine on the ground in New England, including five operators on the lower camera system (an engineer in charge, an assistant, a rigger, a pilot, and an operator responsible for the camera’s pan/tilt/zoom) and four on the upper camera system (an EIC, rigger, pilot, and PTZ operator).
The same system will return the following week during the AFC Championship Game, and similar systems will appear in other games throughout the NFL playoffs. And, while the action on the gridiron is sure to excite throughout the playoffs, the graphics overlaid on the dual Skycam system will only increase the level of storytelling that the talent can deliver and fans can expect.
“We’re excited about showing off a new way of using Next Gen Stats and really focusing on where the players are running, where the routes are, and creating that sort of Madden look, if you will,” says Wharton. “If you [look at the broadcasters, they’re] usually telestrating: they’re saying, Here’s this guy, and they draw the little yellow line of where he ran. Now we’re leveraging the NFL’s Next Gen Stats system to get that data to create the graphics with SMT and then overlay that from our angle. It creates a very compelling shot.”
Echoes Louthan, “It’s another tool in the toolkit for the announcers — in this case, for [analyst] Tony Romo to use graphics to help tell the story of what he sees. It has been exciting for us to work with Tony on fine-tuning these graphics to [enable] him to use his incredible insight into the game to tell the story.”
SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for sports broadcasts, and SkyCam, the company that specializes in cable-suspended aerial camera systems, are continuing to deliver technological innovations to CBS Sports’ broadcasts of the AFC playoff games, including Saturday’s Tennessee Titans vs. New England Patriots contest, Sunday’s Jacksonville Jaguars vs. Pittsburgh Steelers game, and the AFC Championship on Jan. 21.
SMT will provide its Next Gen Tele system, an enhanced player-tracking telestrator that harnesses the power of NFL's Next Gen Stats data and SMT’s proprietary play-marking system to instantly render four virtual player routes on SkyCam video that’s available to the producer and talent at the end of every play. This “first-replay series, every replay” availability makes SMT’s system a true breakthrough in which NFL's Next Gen Stats data is able to drive meaningful content as an integral component of live NFL game production. The system debuted last year for the AFC divisional playoffs.
Using dual SkyCam WildCat aerial camera systems to enhance its broadcast, CBS Sports has made standard the “Madden-like” experience that gives football fans a more active and dynamic viewing experience behind the offense, revealing blocking schemes, defensive fronts, and throwing windows and providing a deeper understanding of plays. Combined with SMT’s virtual 1st & Ten line solution placed from SkyCam images, viewers are experiencing the new, modernized look of NFL games. SMT, through its offices in Durham and Fremont, has supported CBS NFL broadcasts since 1996.
“Used in conjunction with SMT’s virtual technology, fans have embraced the enhanced coverage made possible with dual SkyCam systems, a look that younger viewers have come to expect in their games,” said Stephen Wharton, CTO, SkyCam. “With SkyCam, fans get the benefit of a more complete view of the action and play development – we place them right into the action in real-time. Sideline cameras force fans to wait for replays to get a sense of what receivers and quarterback were seeing. With SkyCam, no other camera angle is as immersive or engaging.”
“SMT’s ability to place virtual graphics from SkyCam opens up a plethora of possibilities for broadcasts in terms of augmented reality applications with advertising content, player introductions on the field, or a whole host of possibilities,” said Gerard J. Hall, CEO, SMT. “The potential with our technology is limitless.”
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious live events, including the Super Bowl, NBC Sunday Night Football, major golf and tennis events, the Indianapolis 500, the NCAA Tournament, the World Series, ESPN X Games, NBA on TNT, NASCAR events, and NHL games. SMT’s clients include major US and international broadcasters as well as regional and specialty networks, organizing bodies, event operators, sponsors and teams. The 31-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
Headquartered in Fort Worth, Texas, SkyCam is a leading designer, manufacturer and operator of mobile aerial camera systems. SkyCam plays a significant role in changing the way sporting events are broadcast in the world, appearing at marquee broadcast events, such as The NFL Super Bowl, NCAA Final Four, NBA Finals, Thursday Night Football, Sunday Night Football, NCAA College Football, 2015 CONCACAF Gold Cup and 2014 FIFA World Cup. SkyCam is a division of KSE Media Ventures, LLC
January 08, 2018
TV Technology
NEW ORLEANS—New Orleans Saints and Carolina Panthers receivers and quarterbacks weren’t the only ones concerned about what was in and out of bounds Sunday (Jan. 7) in New Orleans during the NFC Wildcard game.
Fox Sports, which telecast the game, walked a different sort of line with its playoff coverage—one that delineates between delivering the great shots needed to present game action and deploying new technology that actually gets in the way of coverage.
“We don’t want to make things all that different for the production team and give them a whole bunch of stuff that they haven’t had before for the big games,” says Mike Davies, SVP of Field and Technical Operations at Fox Sports. Rather, the strategy is to start with a “base layer” of production technology used throughout the 17 weeks of the regular season and then deploy choice pieces of technology that will have the biggest impact on game production and allow Fox Sports to tell the best story, he says.
“A lot of this stuff we’ve used before and some just this year,” says Davies. “We just pick the best of the best to represent us.”
For example, for the three NFL playoff games Fox Sports is covering, the broadcaster will add a second, higher SkyCam to deliver a drone’s-eye view of plays that captures all 22 players on the field. “Although you think of how over the top two SkyCams might sound, it turns out to be very useful,” says Davies. Fox Sports first used the dual SkyCam setup during the preseason and then again in Week 5 for the Packers vs. Cowboys game. “I think that camera angle is new enough that we are still learning what it can do,” he says.
The broadcaster recognized the upper SkyCam “was something special” in Week 5 during a play involving Cowboys running back Ezekiel Elliott. “He jumped over that pile and no camera, including the lower SkyCam, saw that he had reached out over the first down line [except for the new upper SkyCam],” he says. “At least for that moment, we were sold that this is something special and something we wanted to offer.”
However, camera enhancements—both in terms of numbers and applications—aren’t limited to the second SkyCam. For its NFL playoff coverage, Fox Sports will deploy seven 8x Super Mo cameras, rather than the typical five. Fox also will use 6x Super Mo for its SkyCams, which it first did for its Super Bowl LI coverage in February 2017.
“There are so many replay opportunities in football, and the Super Mo gives this crisp—almost cinematic—look at the action,” says Davies.
The sports broadcaster also will take advantage of work it has done this year with SMT (SportsMEDIA Technology), SkyCam, and Vizrt “to cobble together a recipe” to do augmented reality with the SkyCam, he says. Not only does the setup allow Fox Sports to put a live yellow line on the field of play with its SkyCam shots, but also to put graphic billboards and other 2-D graphics on the field and to fly around them with the SkyCam as if they were real objects.
“It’s a bit of an orchestration because the pilot of the SkyCam needs to be flying around the object as if it were an object on the field. If you break through it, it’s not going to look real,” says Davies.
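As a rough illustration of how a flat graphic can stay locked to the turf while the camera moves (a sketch of the general technique, not the Fox/SMT/Vizrt pipeline), the billboard can be defined once in field coordinates and re-projected every frame with that frame's field-to-image homography. The coordinates, sizes, and compositing below are assumptions.

```python
# Hypothetical sketch: keep a 2-D billboard "glued" to the field by warping it
# with each frame's field-to-image homography. Uses OpenCV; no keying or
# occlusion handling, and the billboard placement is made up for illustration.
import cv2
import numpy as np

# Billboard corners on the field, in yards (a 10 x 5-yard panel near midfield).
BILLBOARD_FIELD = np.float32([[45, 20], [55, 20], [55, 25], [45, 25]])

def composite_billboard(frame, graphic, H_field_to_image):
    h, w = graphic.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Where the billboard's field corners land in this frame.
    dst = cv2.perspectiveTransform(BILLBOARD_FIELD.reshape(-1, 1, 2),
                                   H_field_to_image).reshape(-1, 2)
    warp = cv2.getPerspectiveTransform(src, dst.astype(np.float32))
    warped = cv2.warpPerspective(graphic, warp,
                                 (frame.shape[1], frame.shape[0]))
    mask = warped.sum(axis=2) > 0            # naive mask: any non-black pixel
    out = frame.copy()
    out[mask] = warped[mask]
    return out
```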
Another enhancement is how Fox Sports will use its pylon cameras. Rather than pointing the pylon cams positioned at the front of the end zone straight across the goal line, Fox will rotate them so they look down the field at a 45-degree angle, says Davies.
“That gives you a way to cover a play where the camera is actually looking. Yes, you have the goal line, but you also have the out-of-bounds line as well,” he says. As a result, there are more game situations in which the pylon cameras can contribute to coverage. “The pylon cameras are a lot like catching lightning in a bottle. They are great, but you don’t want to use them unless you’ve got something that is really compelling,” says Davies.
While it is too soon to tell if the drop in viewership plaguing the league this season will carry over to the playoffs, Davies is confident that the right technology and production techniques have the potential to help fans reconnect with the game.
“I feel that what we are able to do using all of this incredible technology—the dual SkyCams, the Super Mo’s and the pylons—is that we are able to deliver that kind of experience in replay right after the play that also shows the emotions of players, not just what happens between the whistles,” he says.
Harkening back to his stint at HBO, Davies recalls the connection the cinematic style used for “Inside the NFL” created as “you watched a game that happened three or four days prior.” Today’s production tools give broadcasters that same opportunity to create that connection, he says. “I can’t help but think that these kind of storytelling tools, honestly, can only help,” says Davies.
The 2019 College Football Playoff National Championship concludes tonight at Levi’s Stadium in Santa Clara, CA. Like every other football game, it will feature two teams — in this case, Alabama and Clemson — and one broadcaster. For its part, ESPN is once again all-in for the big game, deploying more than 310 cameras to cover all the action and providing 17 viewing options via the MegaCast over 11 TV and radio networks and via the ESPN app.
“The thing that makes this event is the volume and magnitude of what we put behind it but also the time frame,” says John LaChance, director, remote production operations, ESPN. “[There are] other marquee events, which stand alone, but, with the volume and viewer enhancements being done here in a 72-hour window to get everything installed, this event [is] in a unique classification. Trying to integrate everything into place was a herculean effort.”
The game wraps up a season in which ESPN’s production team delivered more than 160 games to ABC and ESPN and more than 1,000 games to various other ESPN platforms.
“To watch that volume and make sure all the pieces are in place is a highlight for all of us, [seeing] it go from plan to working,” says LaChance. “You always have things that are challenges, but it’s about how quickly you can recover, and I think we’ve done it well.”
The core of ESPN’s production efforts will be done out of Game Creek Video’s 79 A and B units with Nitro A and B handling game submix, EVS overflow, 360 replay, robo ops, and tape release. ESPN’s team creating 17 MegaCast offerings is onsite, housed in Nitro and Game Creek’s Edit 3 and Edit 4 trailers and TVTruck.tv’s Sophie HD. Game Creek Video’s Yogi, meanwhile, is on hand for studio operations, and Maverick is also in the compound. All told, 70 transmission paths (50 outbound, 20 inbound) will be flowing through the compound, and 40 miles of fiber and cable has been deployed to supplement what already exists at Levi’s Stadium.
Also on hand are Fletcher, which is providing robotics; BSI, handling wired pylons and RF audio and video; 3G, which is in charge of the line-to-gain PylonCam and the first-and-10–marker camera; Vicareo, with the Ref Cams; and CAT Entertainment, for UPS and power. SMT is on board for the 1st & Ten lines; PSSI, for uplink; Bexel, for RF audio and other gear; and Illumination Dynamics, for lighting.
“It’s a team effort,” says LaChance. “I couldn’t be prouder of the team we assembled here and the vendors, technicians, leads, and staff that have, over the course of the last several months and weeks when it gets to a fever pitch, put it all together.”
The Camera Contingent
A large part of the 300-camera arsenal consists of the 160 4K DSLR cameras deployed for the 4D Replay system, which will provide definitive looks at every play from every angle. Those cameras are mounted around the stadium and, combined, provide images that can be merged on computers and enable an operator to zoom around a play and show any angle.
One place where the 4D system is poised to shine is the Red Zone. The 4D Replay team and ESPN have created templates that can cut the time needed to synthesize the images for plays around the goal line and pylons to eight seconds.
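The "zoom around a play" effect boils down to ordering synchronized frames from cameras that ring the bowl. A small, hypothetical sketch of that ordering step follows; the camera count comes from the article, but the angular spacing and data layout are assumptions, not 4D Replay's software.

```python
# Illustrative sketch: for a frozen moment, pick the ring cameras spanning the
# requested arc and return their frames in angular order to create the orbit.
NUM_CAMERAS = 160                                   # per the article

def orbit_sequence(start_deg, sweep_deg, frames_at_t):
    """frames_at_t[i] is camera i's frame at the frozen timestamp."""
    step = 360.0 / NUM_CAMERAS                      # assumed even spacing (~2.25 deg)
    count = max(1, int(round(sweep_deg / step)))
    first = int(round(start_deg / step)) % NUM_CAMERAS
    return [frames_at_t[(first + k) % NUM_CAMERAS] for k in range(count)]

# Example: a 90-degree sweep starting behind the near goal line.
# clip = orbit_sequence(start_deg=180, sweep_deg=90, frames_at_t=frames)
```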
Besides the 160 4D replay cameras, plenty of cameras are focused on the game action, including 90 dedicated to game coverage. Among those are 10 super-slo-mo cameras, nine 4K game cameras, 15 RF cameras, two SkyCams, and two aerial cameras in a blimp and fixed-wing aircraft. The vast majority of cameras are Sony models (mostly Sony HDC-2500 and HDC-4300 with one HDC-4800 in 4K mode) coupled with Canon lenses, including five 100X, two 95X, 21 wide-angle, and 14 22X and 24X lenses. Seven 86X lenses and a 27X lens are also in use.
The game-coverage cameras are complemented by specialty cameras. Four Vicario Ref Cams will be worn by the officials; a line-to-gain RF PylonCam will move up and down the sideline with the first-and-10 marker, which also has a camera; and eight PylonCams around the end zones provide a total of 28 cameras.
The RefCam is new this year, having been tested during last year’s final in Atlanta. The MarkerCam did debut last year, and LaChance says it has been improved: “It has a c360 Live camera in the target portion of the marker to give a 180-degree perspective in 4K. The operator can push in and get a great perspective; we are taking it to another level with the push in.”
A second c360 camera will also be in use on the second SkyCam, again giving the ESPN team the ability to zoom in and capture images.
Another exciting new offering is AllCam, a system designed by ESPN’s in-house team and ChyronHego. It stitches images from three 4K cameras placed alongside the all-22 camera position and gives the production team the ability to zoom in anywhere on the field to capture events that might have taken place away from the action. For example, in a test at a bowl game, the system was used to show an unnecessary-roughness violation that took place during a kickoff far from the other players, who were focused on the run-back.
“It’s another good example of the partnerships we have and working for a common goal,” says LaChance.
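Conceptually, the AllCam approach described above pairs a stitched panorama with a "virtual camera" crop. The sketch below shows that idea in miniature using OpenCV's generic stitcher; it is an assumption-laden illustration, not ESPN's or ChyronHego's implementation.

```python
# Rough sketch: stitch synchronized wide shots into one panorama, then cut a
# virtual-camera window anywhere on the field and scale it to output size.
import cv2

def build_panorama(frames):
    stitcher = cv2.Stitcher_create()
    status, pano = stitcher.stitch(frames)          # frames: list of overlapping camera images
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitch failed with status {status}")
    return pano

def virtual_zoom(pano, center_xy, crop_w, crop_h, out_size=(1920, 1080)):
    """Crop a crop_w x crop_h window centered on center_xy and scale to out_size."""
    h, w = pano.shape[:2]
    cx, cy = center_xy
    x0 = min(max(0, int(cx - crop_w // 2)), max(0, w - crop_w))
    y0 = min(max(0, int(cy - crop_h // 2)), max(0, h - crop_h))
    crop = pano[y0:y0 + crop_h, x0:x0 + crop_w]
    return cv2.resize(crop, out_size, interpolation=cv2.INTER_LINEAR)
```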
Beyond the game coverage cameras there are 20 cameras dedicated to the various MegaCast feeds, 29 for ESPN College GameDay, and nine for the SEC Network. ESPN Deportes also has two dedicated cameras.
All told, the production team will have access to 320 sources via 170 channels of EVS playback as well as 32 channels of Evertz DreamCatcher playback. There are also two Sony PWS-4500 servers in use, a Sony BPU-4800 4K record server, and two c360 record servers.
Non-Stop Action — for the Production Team
The game wraps up a busy time for the production team as well as for those who work at Levi’s Stadium. LaChance credits Jim Mercurio, VP, stadium operations/GM, Levi’s Stadium, and Nelson Ferreira, director, technical operations, San Francisco 49ers, with being an important part of the process during the past year.
“It’s a solid venue and great group of folks to work with, and that helps,” says LaChance. “They have done the Super Bowl here, and they do a lot of great events, so they are well-equipped. We had to supplement with some fiber, but they had a great infrastructure to start with.”
As for the ESPN team, everybody worked on one of the two semifinals as well as an additional bowl game.
“Folks that did the Cotton Bowl headed on to the Sugar Bowl, and those that did the Orange Bowl headed to the Rose Bowl,” says LaChance. “A lot of the people here have been non-stop since the Christmas Day offerings for the NBA, then right into a semifinal assignment, then the second of the New Year’s bowl offerings, and then making their way here to Santa Clara for one of the largest events the company does every year.”
For anyone looking to see what the new toys will bring to the show, LaChance recommends tuning into the TechCast, which will have a sampling of everything that will be used, including 4D Replay, C360, and the RefCam.
“Besides the game itself,” he says, “tune into the TechCast. Hopefully, the weather is good for us, and we can offer the BlimpCast from the Goodyear airship, which is another opportunity to provide a unique look for viewers at home.”
2018 was one of the most eventful years for sports production in recent memory, with the 2018 PyeongChang Olympics and 2018 FIFA World Cup capturing the nation’s attention and annual events like the College Football Playoff National Championship Game, Super Bowl, NFL Draft, and others breaking production records and test-driving new technologies and workflows. As if there weren’t enough going on stateside, this year’s Road Warriors features an expanded look at what went on across the Atlantic. Here is Part 2 of SVG’s look at some of the sports-production highlights from the past year.
US OPEN
USTA Billie Jean King National Tennis Center, Flushing Meadows, NY
August 27–September 9
For ESPN, it simply doesn’t get bigger than US Open tennis. In the network’s fourth year as host broadcaster and sole domestic-rights holder — part of an 11-year rights deal — the technical and operations teams continued to evolve production workflows and add elements. Highlights this year included the debut of a Fletcher Tr-ACE/SimplyLive ViBox automated production system covering the nine outer courts and several new camera systems.
“This truly is the largest event that ESPN produces out of the thousands of events that we do all year,” said ESPN Director, Remote Operations, Dennis Cleary, “and it’s all done in a 3½-week span.”
For the first time, ESPN covered all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts. Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system was deployed on each court (with four robos per court) and relied on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic-camera operator and one ViBox director/producer covered each of the nine courts.
New this year was a two-point aerial CineLine system (provided by Picture Factory) running between Louis Armstrong Stadium and Court 10, a run of roughly 1,000 ft. After a successful debut at Wimbledon in June and the Australian Open in January, Telstra Broadcast Services’ NetCam made its US Open debut. The Globecam HD 1080i/50 POV miniature robotic camera was deployed on each side of the net for singles matches at Arthur Ashe Stadium, Armstrong, and the Grandstand, providing viewers with a close-up look at the action on the court. In addition, both Intel’s Tru View 360-degree camera system and the SpiderCam four-point aerial system returned to Ashe.
The US Open production compound was almost unrecognizable from five years ago, prior to ESPN’s taking over as host broadcaster. What had been a caravan of production trucks became two permanent structures housing ESPN’s NTC broadcast center and production/operations offices, along with two ultra-organized stacks of temporary work pods housing the TOC, vendors, international broadcasters, and ESPN’s automated production operation for the outer courts. NEP’s NCP8 was on hand for ESPN’s ITV operation (serving AT&T/DirecTV’s US Open Mix Channel), and NEP’s Chromium and Nickel were home to the USTA’s world-feed production. — JD
U.S. OPEN
Shinnecock Hills Golf Club, Shinnecock Hills, NY
June 14-17
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
Game Creek Video’s Encore production unit was at the center of the coverage for Fox and FS1, with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit handled production control for one of the featured groups, Edit 4 supported all iso audio mixes, and Edit 2 was home to five edit bays with equipment and support provided by Creative Mobile Solutions Inc. (CMSI). There was also the 4K HDR show, which was produced out of Game Creek Maverick.
“All the Sony HDC-4300 cameras on the 7th through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” noted Brad Cheney, VP, field operations and engineering, Fox Sports, during the tournament. There were also six Sony PXW-Z450’s for the featured holes and featured groups, the output of two of them delivered via 5G wireless.
In terms of numbers, Fox Sports had 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless units, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps), and 218 microphones. Tons of data was passed around: 3 Gbps of internet data was managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
Each course provides its unique challenges. At Shinnecock Hills, they included the roads running through the course, not to mention the hilly terrain, which also had plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch. RF wireless coverage was provided by CP Communications. There were 26 wireless cameras on the course, along with 18 wireless parabolic mics and nine wireless mics for on-course talent. CP Communications also provided all the fiber on the course. — KK
MLB ALL-STAR GAME
Nationals Park, Washington, DC
July 17
With its biggest summer drawing to a close with the MLB All-Star Game, Fox certainly showed no sign of fatigue technologically. Not only did the network roll out a SkyCam system for actual game coverage for the first time in MLB history, but Fox also deployed its largest high-speed–camera complement (including all 12 primary game cameras), two C360 360-degree camera systems, and ActionStreamer POV-style HelmetCams on the bullpen catcher, first-base coach, and Minnesota Twins pitcher José Berríos.
“People always used to say Fox owned the fall with NFL and MLB Postseason, but, this year, we owned May through July, too, with the U.S. Open, World Cup, and now All-Star,” said Brad Cheney, VP, field operations and engineering, Fox Sports. “The capabilities of our [operations] team here are just unsurpassed. For big events, we used to throw everything we had at it, and it was all hands on deck. That’s still the case, but now, when we have big events, everybody’s [scattered] across the globe. Yet we’re still figuring out ways to raise the bar with every show.”
Between game coverage and studio shows, Fox Sports deployed a total of 36 cameras (up from 33 in 2017) at Nationals Park, highlighted by its largest high-speed–camera complement yet for an All-Star Game. Building on the efforts of Fox-owned RSN YES Network, all 12 of Fox’s Sony HDC-4300 primary game cameras were licensed for high-speed: six at 6X slo-mo, six at 2X slo-mo. This was made possible by the ultra-robust infrastructure of Game Creek Video’s Encore mobile unit.
Fox also had two Phantom cameras running at roughly 2,000 fps (at low first and low third) provided by Inertia Unlimited and a pair of Sony P43 6X-slo-mo robos at low-home left and low-home right provided by Fletcher. Fletcher provided nine robos in all — including low-home Pan Bar robo systems that debuted at the 2017 World Series — and Inertia Unlimited provided a Marshall POV in both teams’ bullpen and batting cage.
CP Communications supplied a pair of wireless RF cameras: a Sony P1r mounted on a MōVI three-axis gimbal and a Sony HDC-2500 handheld. An aerial camera provided by AVS was used for beauty shots — no easy task in security-conscious Washington.
Inside the compound, a reshuffling of USGA golf events allowed Game Creek Video’s Encore mobile unit (A, B, and C units), home to Fox’s U.S. Open and NFL A-game productions, to make its first All-Star appearance.
The primary control room inside the Encore B unit handled the game production, and a second production area was created in the B unit to serve the onsite studio shows. — JD
The Open Championship
Carnoustie Golf Links, Angus, UK
July 19-22
Sky Sports used its Open Zone in new ways to get closer to both players and the public in its role as the UK live broadcaster from Carnoustie. On Thursday and Friday, Sky Sports The Open channel was on the air from 6:30 a.m. to 9:00 p.m. Featured Group coverage of the 147th Championships was available each day via the red button and on the Sky Sports website. Viewers could also track players’ progress in Featured Hole coverage on the red button, with cameras focusing on the 8th, 9th, and 10th holes. Sky Sports had a team of 186 people onsite in Carnoustie for The Open, which included Sky production and technical staff and the team from OB provider Telegenic. — Fergal Ringrose
WIMBLEDON
All England Lawn Tennis and Croquet Club, Wimbledon, UK
July 2-15
At 11:30 a.m. on Monday, July 2, coverage of the Wimbledon Championships went live from the AELTC, produced for the first time by a new host broadcaster. After more than 80 years under the BBC’s expert guidance, the host baton was passed to Wimbledon Broadcast Services (WBS), bringing production of the Championships in-house. Going live on that Monday was the culmination of two years of planning, preparation, and testing: a process that has allowed the AELTC to “take control” of the event coverage and provide international rightsholders with a better service as well as add some new twists, such as Ultra High Definition (UHD), a NetCam on both Centre Court and No.1 Court, and multicamera coverage of all 18 courts. — Will Strauss
FRENCH OPEN
Stade Roland-Garros, Paris
May 27–June 10
Tennis Channel was once again on hand in a big way at the French Open. The expanded coverage this year meant more than 300 hours of televised coverage for fans in the U.S. as well as 700 hours of court coverage via Tennis Channel Plus. The Fédération Française de Tennis (FFT) increased overall court coverage this year, and Tennis Channel made sure all of that additional coverage made it to viewers. Tennis Channel had approximately 175 crew members onsite, working across the grounds as well as in a main production-control room, an asset-management area, six announce booths, and a main set on Place des Mousquetaires. The production facilities were provided by VER for the fifth year. CenturyLink provided fiber transport to the U.S. via 10-Gbps circuits. — KK
The Professional Fighters League (PFL) and SMT (SportsMEDIA Technology) announced an exclusive, long-term technology partnership. Under the terms of the agreement, SMT will partner with the PFL to create proprietary technology that will measure real-time MMA fighter performance analytics along with biometric and positional data that will provide fans a live event experience across all platforms.
Starting in 2019, SMT will help power the PFL’s vision of the first-ever SmartCage. The SmartCage will utilize biometric sensors and proprietary technology that will enable the PFL to measure and deliver real-time fighter performance data and analytics, what the PFL is dubbing: Cagenomics. PFL fans watching linear and digital broadcasts of the league’s Regular Season, Playoff, and Championship events will experience a new dimension of MMA fight action with integration of live athlete performance and tracking measurements including: speed (mph) of punches and kicks, power ratings, heart rate tracking, energy exerted, and more.
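To give a sense of how one SmartCage-style metric could be derived, here is a hedged sketch that estimates peak hand speed in mph from timestamped glove-sensor positions; the sample format and the calculation are illustrative assumptions, not the PFL/SMT implementation.

```python
# Hypothetical sketch: peak glove speed from positional samples.
def peak_speed_mph(samples):
    """samples: list of (t_seconds, x_m, y_m, z_m) tuples for one glove sensor."""
    peak = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue                                  # skip bad or duplicate timestamps
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        peak = max(peak, dist / dt)                   # meters per second
    return peak * 2.23694                             # convert m/s to mph
```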
“The Professional Fighters League is excited to be partnering with SMT to advance the sport of MMA. The PFL’s new SmartCage will revolutionize the way MMA fans experience watching live fights as next year every PFL fight will deliver unprecedented, real-time fighter performance data and analytics, biometric tracking, and an enhanced visual presentation of this great sport,” says Peter Murray, CEO, Professional Fighters League. “Not only will PFL fans benefit from our SmartCage™ innovation, but our pro fighters will now have access to new performance measurement data, analysis, and tools to help them train and compete. The PFL’s vision has always been two-fold: deliver the absolute best experience to fans and be a fighters-first organization, and with the SmartCage we will accomplish both.”
“SMT is thrilled to be collaborating with the Professional Fighters League’s forward-thinking innovation team to bring our latest and greatest technology to PFL events,” says Gerard J. Hall, Founder & CEO, SMT. “Starting in 2019, PFL fans will begin to see real-time, live, innovative technology that is unique to the PFL in the MMA space. SMT’s OASIS Platform will provide the PFL with a seamlessly integrated system that combines live scoring with real-time biometric and positional data to enhance the analysis, storytelling and graphic presentation of the PFL’s Regular Season, Playoffs and Championship events next season.”
The PFL 2018 Championship takes place on New Year’s Eve live from The Hulu Theater at Madison Square Garden and consists of the 6 world title fights in 6 weight classes of the PFL 2018 Season. Winners of each title bout will be crowned PFL World Champion of their respective weight class and earn $1M. The PFL Championship can be viewed live on Monday, December 31 on NBC Sports Network (NBCSN) from 7 to 11 pm ET in the U.S. and on Facebook Watch in the rest of the world.
Professional Fighters League
The Professional Fighters League (PFL) presents MMA for the first time in a format where individual fighters compete in a regular season, playoffs, and championship. The PFL Season has 72 elite MMA athletes across six weight classes, with each fighting twice in the PFL Regular Season in June, July, and August. The top eight fighters in each weight class advance to the single-elimination PFL Playoffs in October. The PFL Championship is New Year’s Eve at Madison Square Garden, with the finalists in each of six weight classes competing for the $10 million prize pool. The PFL is broadcast live on NBC Sports Network (NBCSN) and streamed live worldwide on Facebook Watch. Founded in 2017, the PFL is backed by a group of sports, media, and business titans. For more info, visit PFLmma.com.
SMT
SMT (SportsMEDIA Technology) is the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, providing clients with scoring, statistics, virtual insertion, and messaging for broadcasts and live events. For the past 30 years, SMT’s solutions have been used at the world’s most prestigious live sports events, including the Super Bowl, Indy 500, Triple Crown, major golf and tennis events, MLB’s World Series, Tour de France, and the Olympics. SMT’s clients include sports governing bodies; major, regional, and specialty broadcast networks; event operators; sponsors; and teams. The 32-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
SMT is once again one of the busiest vendors on hand at the US Open, providing a cavalcade of technology to serve the USTA, broadcasters, spectators, athletes, and media onsite at the USTA Billie Jean King National Tennis Center (NTC). In addition to providing the much discussed serve clock, SMT — now in its 25th year at the Open — is providing scoring systems, scoring and stats data feeds, LED scoreboards, TV interfaces, IPTV systems, and match analysis.
“This event, just like any Grand Slam, is becoming a three-week event,” says Olivier Lorin, business development manager, SMT. “We have more and more recipients asking for data. Today, we’re actually sending 19 different data feeds to recipients for their own platform. Obviously, we have to get the authorization from the USTA, but then they use that for whatever.”
Countdown to the Serve
An on-court digital clock, similar to the shot clock in basketball and the play clock in football, counts down the allotted 25 seconds before a player must begin the serve (previously, the 20-second limit was tracked only by the chair umpire, with no clock visible to players or fans).
After the USTA announced plans to display a countdown clock for this year’s tournament, SMT introduced the clock at ATP and WTA events leading up to the Open — most recently, in Winston-Salem, NC, and Cincinnati — to help players acclimate to it.
“The USTA has been looking to do the serve clock at the US Open for a few years, starting in 2016 with the Juniors and then the qualifiers as an experiment, which all went very well,” says Lorin. “The Australian Open and the French Open also did it in quallies, but the US Open wanted to be the first [Grand Slam] to do this for all events, and we were able to work with them to make that happen.”
The clock, visible to players and spectators alike, begins to tick down immediately after the chair umpire announces the score. The umpire will issue a time violation if the player has not started the service motion at the end of the countdown. The first time the clock hits zero before a player begins the motion, the player receives a warning. For every subsequent time, the player loses a first serve. SMT is driving umpire scoring on all 16 courts and offsite for Junior Qualifying (eight courts).
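The violation logic described above is simple enough to state as a small state machine. The sketch below encodes just those rules; the class and method names are illustrative, not SMT's scoring API.

```python
# Sketch of the serve-clock rules: the 25-second clock starts when the umpire
# announces the score; the first expiry without a service motion is a warning,
# and every subsequent expiry costs the server a first serve.
SERVE_CLOCK_SECONDS = 25

class ServeClock:
    def __init__(self):
        self.remaining = SERVE_CLOCK_SECONDS
        self.violations = 0

    def on_score_announced(self):
        self.remaining = SERVE_CLOCK_SECONDS          # clock begins counting down

    def on_expired(self, service_motion_started):
        if service_motion_started:
            return None                               # no violation
        self.violations += 1
        return "warning" if self.violations == 1 else "lose_first_serve"
```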
Lorin sees a benefit to TV in the five-minute warmup clock and the serve clock: “At least seven minutes [is saved], so the match is going to [end] on time more often.”
Serving the Media: IPTV and CCTV
SMT is also responsible for the infrastructure for the USTA’s CCTV, IPTV, and Media Room. The IPTV system for the Media Center at this year’s US Open is now “browser-independent.” It allows users to select and view up to five streams/videos at one time from any of the digitally encoded channels available on the 13-channel CCTV system. In addition, the system allows access to archived player interviews. The IPTV system also includes real-time scores, match stats, draws, schedule, results, tournament stat leaders, US Open history, and WTA/ATP player bio information.
“It’s a very slick interface, and the USTA has been very positive about it,” says Lorin. “Today, it is still under a controlled environment here at the US Open, but, if the US Open wanted to make this open to anybody on the outside, we could easily provide a solution for them to log in and have the same information, with the exception of live video.”
Automation Is Key to New Outer-Courts Coverage
A fixture at live-sports-broadcast compounds, SMT is also providing a variety of services to domestic-rights holder and host broadcaster ESPN, as well as other broadcasters onsite. ESPN is deploying an SMT automated-graphics interface as part of its new automated-production system for outer-court coverage, which relies on a Fletcher Tr-ACE motion-detecting robotic camera system and SimplyLive’s ViBox all-in-one production system.
An SMT touchpad at each of the 16 workstations is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The beauty of our system is that everything is automated and driven by the score notification of the umpire’s tablet,” says Lorin. “We have built up prematch graphics so we know that, when the umpire hits warmup on the tablet, a bio page for both players and a head-to-head graphic will appear, and then they’ll go to the match. When the match starts, the system is just listening to the score notifications, and we have built-in notifications for five aces and things like that. The only thing that is manual and left to the producer for that court is the set summary and the match summary for statistics.”
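The automation Lorin describes is essentially event-driven: each score notification updates running stats, and preconfigured thresholds queue informational graphics. A minimal sketch of that trigger pattern follows; the feed fields, thresholds, and trigger names are assumptions, not SMT's schema.

```python
# Illustrative trigger logic: fire a graphic the moment a running stat crosses
# a preconfigured threshold (e.g., 10 aces or 10 unforced errors).
TRIGGERS = {("aces", 10): "ACES_MILESTONE",
            ("unforced_errors", 10): "UNFORCED_ERRORS_MILESTONE"}

def on_score_notification(stats, update, graphics_queue):
    """stats: running totals per player; update: one event from the umpire feed."""
    player, stat = update["player"], update["stat"]
    before = stats.setdefault(player, {}).get(stat, 0)
    after = before + update.get("delta", 1)
    stats[player][stat] = after
    for (name, threshold), graphic in TRIGGERS.items():
        if stat == name and before < threshold <= after:
            graphics_queue.append((graphic, player))   # e.g., queue a lower-third
```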
Also From SMT: Prize Money Report, LED Superwall, More
This year, SMT has updated its Official Prize Money Report, in which prize money is calculated and a report generated at the end of the tournament and distributed to media officials.
SMT also provides content for the massive outdoor LED Superwall at the main entrance of Arthur Ashe Stadium, displaying scoring-system content: schedules, results, matches-in-progress scores, custom information messages (for example, weather announcements). SMT designs the scoring graphics and provides live updates.
“One of the big things is, we rebranded the US Open package for 2018 with a new logo, a new font, and a new background,” says Lorin. “As a result, we had to apply those design changes across all the platforms we are serving. One of the things we try to do more and more in the video production is, instead of having the typical headshot of a player, to integrate more action shots and motion shots, which are a lot more appealing to the design.”
Other services SMT provides to the US Open on behalf of USTA include stats entry on seven courts; serve-speed systems and content on seven courts; playback controls, including lap selector and data-point scrubbing; draw creation and ceremony; and match scheduling.
For the first time, ESPN is covering all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts at the USTA Billie Jean King National Tennis Center (NTC). Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system has been deployed on each court (with four robos per court) and relies on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic camera operator and one ViBox director/producer are covering each of the nine courts.
“With one production room and one rack room here, we are essentially replacing what would have traditionally been nine mobile units,” notes ESPN Director, Remote Operations, Dennis Cleary. “We’ve been working on this plan for a long time, and there is just no way we would have been able to cover all these courts in a traditional [production model]. SimplyLive has been used at other [Grand Slams], and it was used with Fletcher Tr-ACE at Wimbledon but not really to this extent. We feel that we have taken it to the next level [and] are integrating it with our overall [show] and adding elements like electronic line calling and media management.”
With all 16 courts now accessible, ESPN can present true “first ball to last ball” live coverage across its linear networks and the streaming platforms (a total of 130 TV hours and 1,300 more streaming on the ESPN app via ESPN3 and ESPN+). Moreover, ESPN was able to provide the USTA with live coverage of last week’s qualifying rounds for the first time, deploying the Tr-ACE/ViBox system on five courts.
In addition, ESPN, which serves as the US Open host broadcaster, has been able to provide any rightsholder with a live feed of a player from its country — regardless of the court and including qualifying rounds.
On the Outer Courts: LiDAR Drives Fletcher Tr-ACE System
Four Fletcher robotic systems with Sony HDC-P1 cameras have been deployed on each of the nine outer courts: two standard robos (traditional high play-by-play and reverse-slash positions) and two Tr-ACE automated robos (to the left and right of the net).
“From the beginning, one of ESPN’s big focuses was increasing the camera quality of what was being done on the outer courts,” says Fletcher Sports Program Manager Ed Andrzejewski. “So we built everything around the Sony P1’s to increase the camera quality to match the main [TV courts]. When they send a feed to the rightsholder in Australia and the player they are interested in is on one of those outer courts, they wanted the basic quality to be the same as in the bigger stadiums. I think we’ve been able to accomplish that.”
Between the two Tr-ACE cameras is “the puck,” which powers the Tr-ACE system at each court via a custom-designed LiDAR (Light Detection and Ranging) image-recognition and -tracking system. The LiDAR tracks every moving object on the court (the ball, players, ball kids, judges) and provides the two Tr-ACE cameras with necessary data to automatically follow the action on the court. The LiDAR can also sense fine details on each player (such as skin tone or clothing color), allowing the cameras to tell the difference between a player and other moving objects.
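The hand-off from tracking to camera control comes down to geometry: given the tracked target's position and a camera's mounting point, compute the pan and tilt that center the target. The sketch below shows only that math; it is not Fletcher's Tr-ACE control code, and the coordinate convention is an assumption.

```python
# Illustrative pan/tilt solve: aim a fixed-position robotic camera at a tracked
# target. Coordinates are meters in a court-aligned frame with z pointing up.
import math

def pan_tilt_to_target(cam_xyz, target_xyz):
    dx = target_xyz[0] - cam_xyz[0]
    dy = target_xyz[1] - cam_xyz[1]
    dz = target_xyz[2] - cam_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))                    # rotation about the vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above horizontal
    return pan, tilt

# Example: camera mounted 6 m up near the net post, target at mid-court shoulder height.
# pan, tilt = pan_tilt_to_target((0.0, -5.0, 6.0), (4.0, 6.0, 1.5))
```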
A Room of Its Own: Nine Mobile Units in a Single Room
ESPN has erected a dedicated production room for the Tr-ACE/ViBox operation across from its NTC Broadcast Center. Inside this room are nine workstations featuring one Fletcher Tr-ACE camera operator and one ViBox director/producer each.
The Tr-ACE operator monitors the camera coverage and can take control of any of the four cameras at any point during the match. Meanwhile, the ViBox operator cuts cameras and rolls replays. An SMT touchpad at the workstation is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The camera op and director are constantly communicating,” Andrzejewski explains. “ESPN put a lot of trust in us with this, so we brought out the best people we could and have some of the best [robo operators] in the business here. There was a lot of onsite learning, but we were able to give everyone lots of time on the system during setup and qualifying.”
The coverage does not feature commentary, so all nine courts are being submixed out of a single audio room using a single Calrec audio console and operator.
Also inside the automated production room are a video area to shade all 36 cameras, an SMT position to manage the automated graphics systems deployed at each workstation, an electronic line-calling position (which was not available for the systems at Wimbledon), and a media-management area, which was used during qualifying to record all five courts (this operation moved to the NTC Broadcast Center once draw play began on Monday).
Since the automated-production systems had to be up and running for qualifying rounds last week, ESPN built the operation on an island entirely separate from the Broadcast Center.
“It was just too costly and just not sensible to bring the full broadcast center up a week early,” notes Cleary. “So this entire operation is all standalone. All the equipment from Fletcher, SimplyLive, Gearhouse, and even transmission is all separate and on its own.”
Two-Plus Years of Development Pays Off
Although automated production is nothing new for the US Open — Sony Hawk-Eye technology had been used for several years to produce coverage from five outside courts — this new system has expanded the ability to truly cover every ball of the tournament.
Use of the Tr-ACE/ViBox system at Wimbledon in June and now at the US Open was a long time coming. Fletcher has been developing the Tr-ACE system for 2½ years and demonstrated it offline on one court at the NTC last year. In addition to the Fletcher and SimplyLive teams, ESPN Senior Remote Operations Specialist Steve Raymond, Senior Operations Specialist Chris Strong, and Remote Operations Specialist Sam Olsen played key roles in development of the system and its implementation this week.
“This is certainly a new workflow for us, so a lot of thought and time went into it before we deployed it,” says Olsen. “We felt that the ViBox and the Tr-ACE would certainly give us the ability to produce a high level of content using an automated [workflow], and it’s worked out really well thus far. Having it for the qualifying rounds for the first few days also served as a great test bed. I think the best way to put it is, we’ve grown into it, and we’ll develop it and take it to a higher level each time we use it.”
By Jason Dachman, Chief Editor, SVG
Thursday, August 2, 2018 - 2:52 pm
After a move from Los Angeles to Madison, WI, prior to last year’s event, the CrossFit Games production operation has continued to grow prodigiously. The “Woodstock of Fitness” has grown from a production comprising 35 crew members working out of a single mobile unit just six years ago to one of the largest live productions on the annual sports calendar: more than 10 NEP mobile units, a crew of more than 300, and 50-plus cameras. Add in the fact that the CrossFit competitions change from year to year, and it becomes clear just how challenging the event can be for the production team.
This year’s CrossFit Games — Aug. 1-5 at the Alliant Energy Center in Madison — are being streamed on Facebook, CBSSports.com, and the CBS Sports App and televised live on CBS (one-hour live look-ins on Saturday and Sunday plus a recap show) with a daily highlights show on CBS Sports Network.
CrossFit has its own live-streaming team onsite and handles in-house production for the videoboards at Alliant Energy Center. SMT, which is CrossFit’s scoring partner, provides a wealth of presentation options for the boards as well.
CrossFit has used TVU Networks bonded-cellular and IP systems for several years for point-to-point transmission. This year, CBS Digital also used a TVU system to take in streams from the CrossFit Regionals earlier this summer. That success led to a similar partnership for the Games, with CBS Digital receiving all the live competitions on two streams via TVU receivers.
As CrossFit Games’ Footprint Grows, So Does the Live Production
The Games themselves have expanded and become more complex. The production team is tasked with covering multiple venues throughout Alliant Energy Center, primarily The Coliseum and North Park Stadium. This year, the stadium has been expanded to hold 10,000 people (nearly 50% more than for the 2017 edition) and has added a new videoboard.
July 18, 2018
Sports Video Group
SMT was back at MLB All-Star in Washington, providing Fox Sports its live virtual–strike-zone system and, for the 14th consecutive year, virtual signage.
SMT rendered the virtual–strike-zone graphic, as well as the watermarks when viewers saw the ball cross the plate.
SMT’s Peter Frank was on hand at 2018 MLB All-Star to support Fox Sports’ virtual efforts.
SMT handled virtual signage behind the plate for Fox’s Camera 4 (the primary pitcher/batter camera) and tight center field. For the third year in a row, the company also integrated its system with the high-home position, inserting virtual signage on the batter’s eye in center field.
“We use stabilization for virtual signage on the main camera, which is used for the virtual strike zone, so that helps out with the stability of both graphics,” said SMT Media Production Manager Peter Frank. “Two years ago at MLB All-Star in San Diego was the first time we did [virtual signage on] the batter’s eye, and Fox was really happy with it. So we also brought it back in
July 5, 2018
Sports Video Group
After a successful pilot game last year, the American Flag Football League (AFFL) is back in action this summer with the U.S. Open of Football (USOF) Tournament. The final 11 games of the tournament kick off NFL Network’s AFFL coverage, and the network is embracing the “Madden-style” coverage and the production elements it debuted last year, including using a SkyCam as the primary game angle, deploying RF Steadicams inside the huddle, rolling out customized SMT virtual graphics across the field, and miking players throughout the game.
“After last year’s pilot show, there was a lot of great feedback. Everybody liked the football on the field and the direction the technology was going,” says Johnathan Evans, who served as executive producer and director of last year’s production and is directing the NFL Network telecasts this year. “So our coverage is going to be almost exactly the same as last year, with a few differences since we are doing 11 games instead of just one. We have come up with a great formula that hasn’t been tried on a consistent basis before and offers a different perspective from watching a [traditional] football broadcast. With [AFFL], you’re watching from the quarterback perspective; you’re watching it just like you’re playing a Madden NFL [videogame].”
How It Works: Breaking Down the AFFL Format
The 12 teams featured in the USOF Playoffs are composed of eight amateur squads in the America’s Bracket (derived from four rounds of play that began with 128 teams) and four teams captained by celebrities in the Pro Championship Bracket. NFL Network’s USOF coverage began with the America’s Bracket Quarterfinal last weekend from Pittsburgh’s Highmark Stadium and continues with the semifinals this weekend at Atlanta’s Fifth Third Bank Stadium, the America’s Bracket Final and Pro Bracket Final on July 14 at Indianapolis’s Butler Bowl, and the $1 million Ultimate Final (featuring both bracket champions) on July 19 at Houston’s BBVA Compass Stadium.
The 11 AFFL telecasts on NFL Network will feature SMT virtual graphics, including the Go Clock.
The 7-on-7, no-contact 60-minute AFFL games feature many of the same rules that average Americans know from their backyard games. The same players are on the field for both offense and defense, and a team must go 25 yards for a first down. There is no blocking; instead, a “Go Clock” indicates when the defense can rush the QB (after two seconds) and when the QB must release the ball or cross the line of scrimmage (four seconds). There are also no field goals (or uprights, for that matter), and kickoffs are replaced with throw-offs.
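The two- and four-second thresholds drive most of the AFFL’s presentation layer. As a rough illustration, that timing reduces to a tiny state machine; the function and state names below are invented for this sketch and are not SMT’s or the league’s actual implementation.

```python
# Illustrative go-clock logic for the AFFL timing rules described above.
# Names and structure are assumptions for this sketch, not SMT's real system.

RUSH_AT = 2.0      # defense may rush the quarterback after two seconds
RELEASE_AT = 4.0   # quarterback must throw or cross the line at four seconds


def go_clock_state(seconds_since_snap: float) -> str:
    """Return the presentation state for a given time since the snap."""
    if seconds_since_snap < RUSH_AT:
        return "PROTECTED"   # no rush allowed yet
    if seconds_since_snap < RELEASE_AT:
        return "RUSH_LIVE"   # defense may cross the line of scrimmage
    return "MUST_GO"         # QB must release the ball or start running


if __name__ == "__main__":
    for t in (0.5, 2.0, 3.9, 4.0):
        print(f"{t:>4.1f}s -> {go_clock_state(t)}")
```

The same three states map naturally onto the broadcast graphic, the in-stadium displays, and the color-changing line of scrimmage described below.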
“This is not only a sport that creates a lot of intensity and energy; it’s also a sport that you as an average person can relate to because you’re watching an average person play the game,” says Evans. “You’re not watching professional athletes. You’re watching amateurs playing a sport that you can play at home. That is something that every single viewer can relate to.”
Inside the Production: It’s All About Access
By using the SkyCam for play-by-play, RF Steadicams on the field, and player mics, the AFFL and NFL Network are focused on providing fans unprecedented up-close-and-personal access to the action on the field.
“We’re most excited about having SkyCam as our game camera, which really adds a different perspective, and also having everybody miked up so we can hear everything that’s going on and listen in,” says producer Tom McNeely. “We’re focused on making [viewers] feel like they’re right there on the field with these guys. Bringing them into the huddle with our cameras and microphones — we will have somebody sitting in the truck with a mute button in case the players are a little rambunctious — is going to make this really appealing and fun.”
The upcoming NFL Network AFFL productions will deploy Game Creek Video mobile units and feature an average of eight cameras: the SkyCam system, two traditional 25-yard-line angles for isos, a mid-level end-zone angle, one handheld high-speed camera, a jib on a cart roving the sidelines, and two RF cameras (Steadicam and a MōVI).
“The only new cameras we are adding is a second [RF camera] so we can cover both sides of the football,” says Evans. “Last year, we had only one Steadicam, which was great, but I realized that we were losing the intimacy on both sides of the ball. Before you get to the red zone, it’s great to be inside the huddle and see from behind the quarterback on the offensive side of the ball. But, once you get to the red zone, you need to get ready for a touchdown, so you have to switch your Steadicam to the defensive side of the ball, and you hope to get a touchdown in the end zone. This time, in Indianapolis and in Houston, we’re going to have a Steadicam on both sides of the ball to retain the potential atmosphere for every single play. Before the snap, during the snap, and after the snap, you’re going to have that great intensity right in your face the entire time.”
Go Clock Returns; Interactive Line of Scrimmage Debuts
The Go Clock, designed by SMT specifically for the fast-paced AFFL, is also back after playing a major role in defining the league’s production style during its pilot game. The system synchronizes with in-stadium displays to indicate when the defense can rush the quarterback.
“The Go Clock was a big success, and we’re bringing it back this year,” says Evans. “We’re also introducing a line of scrimmage that will change color when [the defense] is able to rush. So the virtual graphics are still there and play a big role [in the production].”
The same SMT virtual 1st & Ten line used in NFL broadcasts will be deployed from the company’s Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used several times by NBC Sports last NFL season.
SMT’s Design Studio also designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package.
SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats (the same system used for NCAA and NFL games).
In addition to the virtual elements, the AFFL has enhanced the physical first-down marker used on the field, so that it digitally displays the down, play clock, game clock, and possession arrow. The system also emits an audible alert when the rusher can break the line of scrimmage after two seconds and when the quarterback has to throw the ball after four seconds.
Beyond the Tech: Storytelling, NFL Network Integration
Aside from the production elements, the AFFL also offers a host of great storytelling opportunities surrounding the squads of Average Joes on the field. McNeely, who knows a thing or two about telling the stories of unknowns on the field, having produced a dozen Little League World Series for ESPN, sees the AFFL as a one-of-a-kind storytelling opportunity.
“These aren’t pro names or pro teams; you’re starting from scratch telling those stories. There are a lot of great stories and personalities with layers — [such as] a 50-year-old, 5-ft.-8 quarterback with a potbelly leading the team from Tennessee or one of the amazing athletes who fell short of the NFL but played in the CFL or the Arena League,” says McNeely. “When I first met [AFFL CEO/founder] Jeff Lewis, who has worked so closely with Jonathan and all of us to develop this, he mentioned what a huge fan he was of Little League World Series. And he promised us all the access we needed so that we would be able to introduce these players and tell their stories.”
NFL Network’s commitment to the AFFL goes well beyond just televising 11 games, however. Not only do the telecasts feature NFL Network talent like Good Morning Football’s Kay Adams (serving as sideline reporter throughout the tournament) and NFL Total Access host Cole Wright (calling play-by-play on July 14), but the network is also incorporating AFFL segments into its daily studio programming, social-media channels, and digital outlets in an effort to appeal to football-hungry fans during the NFL offseason.
“We really feel like there’s a huge opportunity here during the summer, when the NFL really has nothing going on,” says McNeely. “We’re excited to see some traction with social media and on the NFL Network. They are doing a lot to promote [the AFFL] on their studio shows, and we’re hoping it takes off. I think there will be a grassroots push for this similar to what you’ve seen with the Little League World Series.”
June 29, 2018
Sports Video Group
While the broadcast debut of Dale Earnhardt Jr. in the NASCAR on NBC booth is creating plenty of buzz around NBC’s first races of the season this weekend at Chicagoland Speedway, the uber-popular retired driver isn’t the only new addition to the network’s NASCAR coverage this year. Echoing its rink-side “Inside the Glass” position on NHL coverage, NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box frame that will be located along pit road for pre- and post-race coverage at each speedway throughout the season.
NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box located on pit road – for its NASCAR pre- and post-race shows
“The Peacock Pit Box is going to put us in the middle of the action,” says NBC Sports Group Executive Producer Sam Flood. “We’ve had the big set down on the grid for the first three years of [our NASCAR rights] contract. We realized that sometimes the fans departed from that area as we got closer to race time and took away some of the sense of place. So the idea was to have a real sense of place throughout the day, starting with the pre-race show. And most importantly, it gives us a place inside that mayhem that is pit road, which has become one of the most exciting places at the racetrack each week.”
Inside the Peacock Pit Box: Two Levels With Plenty of Tech Firepower
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road. In addition to serving as the home to NASCAR on NBC’s pre-race coverage throughout the season, the structure also features an arsenal of robotic cameras that will aid in NBC’s coverage of pit road throughout each race.
“Sam [Flood] and Jeff [Behnke, VP, NASCAR production, NBC Sports Group] first had the vision and then there were a lot of great creative and technical people that helped to bring it to life,” says NBC Sports Technical Manager Eric Thomas. “They wanted to give our announcers a unique vantage point of the field of play – and that’s obviously pit lane. It’s like the 50-yard line in football or center ice in hockey. Our [announcers] will have an elevated position between all the teams right in the middle of the action, so they not only can see the racetrack but also see the competitors on either side of them.”
The NASCAR on NBC team worked with the NBC Sports Group design team in Stamford, CT, to design the Peacock Pit Box, while Nitro Manufacturing built the structure and Game Creek Video provided technical support and equipment.
The top level of the Peacock Pit Box will serve as the primary home for NBC Sports’ Monster Energy NASCAR Cup Series and Xfinity Series pre- and post-race coverage, with host Krista Voda and analysts Kyle Petty and Dale Jarrett occupying the desk. One handheld and three robotic cameras will be on hand for pre/post-race shows.
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road.
“It’s a nice dance floor that can support our announcers and various different configurations,” says Thomas. “We have to work within the space of the pit stall, which depends on the track. We have neighbors on either side of us, so we want to really be respectful of the teams and not interfere with them whatsoever. So we’re going to fit in our space very neatly and very cleanly without having an impact on the actual event. We wanted to make it as big as we could to make our announcers as comfortable as possible and also provide the technical equipment to produce a quality show.”
Meanwhile, the lower level of the Pit Box will provide additional broadcast positions with two wired cameras and, occasionally, an RF camera and/or a small jib (depending on the size of the pit box at each track). The space features interactive displays and a show-and-tell position for analysts like Daytona 500-winning crew chief Steve Letarte to deliver deeper analysis of the track action.
“The technology will be there for Steve to [provide deeper analysis], particularly in the Xfinity races, where he’s going to be hanging down on pit road in a pit box, restarting his old career of looking at the race when you only can see half the racetrack on pit road,” says Flood. “We think by [locating] Steve [there], it will give him more opportunity to focus that unique mind of his on what the heck all the other cars are doing on the track. So we see that as a huge advantage.”
The lower level also features a patio position where NBC will look to conduct interviews with drivers, pit crew chiefs, owners, and NASCAR officials throughout its race coverage.
All About Flexibility: Nine Robo Positions Give NBC Plenty of Options
Since NBC’s pre- and post-race setup will vary week-to-week depending on the track, Thomas and company were tasked with making the Peacock Pit Box as versatile as possible. With that in mind, the upper level features nine different robotic camera positions. Three robos can be deployed at a time and – thanks to the small, lightweight cameras and custom-developed camera mounts deployed on the Pit Box – the operations team can quickly swap camera positions at any time during NBC’s coverage.
Beloved NASCAR driver Dale Earnhardt Jr., who retired after last season, makes his broadcast debut as a NASCAR on NBC analyst this weekend at Chicagoland.
“If our director wants to change the shot or we want to totally rotate 180 degrees, we can do that in about 10 minutes,” says Thomas. “If we want to do a show with the track in the background first and then, a few minutes later, we want to look toward the garage with a different set of announcers, we can move the cameras quickly and make that happen. So it’s very flexible.”
In addition to being used for pre- and post-race studio coverage, these robos will be utilized for coverage of the action on pit road throughout NASCAR on NBC telecasts.
“The cameras are going to pull double duty because, if something’s going on in pit lane, those cameras are still going to physically be there. So they are going to give us some different angles that we haven’t seen very much of in the past,” says Thomas. “We’ve tried to create as much flexibility as possible so when Sam and Jeff ask, ‘can we do this?’, then we can say, ‘of course you can.’”
BatCam Returns: Aerial System Headlines NBC’s Army of Cameras
NBC Sports will deploy an average of 55 cameras – including the return of the BatCam point-to-point aerial system to cover the backstretch – on big races at Daytona, Indianapolis, and Homestead-Miami this season. Thomas also expects to use BatCam, which debuted last year and can hit speeds of more than 100 mph, at the Watkins Glen road course this year. The BatCam also drew rave reviews throughout NBC’s Triple Crown coverage this past spring.
NBC Sports is bringing back the BatCam point-to-point aerial system to cover the backstretch at NASCAR races
The bulk of NBC’s camera complement for NASCAR is made up of Sony HDC-4300’s along with a mix of robos (provided by Robovision) and roving RF cameras. BSI will once again be providing eight RF in-car-camera dual-path systems, which allow two angles to be transmitted from each car at any given moment. Thomas also says his NASCAR on NBC team is currently experimenting with several new camera positions, which he expects to roll out throughout the season.
Going Inside the Action With New Graphics, Analysis Tools
NBC is utilizing SMT’s tools for the fourth straight NASCAR season. This year, the SMT race crawl has been updated to show the live running order and driver statistics at the traditional position on top of the screen and in a new vertical pylon display on the left side. The multiple options provide production with a variety of ways to allow fans to track each driver.
Also new this year is the SMT GOTO interactive touchscreen display, which provides several tools NBC can use throughout each race weekend, giving on-air analysts the ability to telestrate highlights, compare drivers and statistics, and interact with fans on social media.
SMT’s new Broadcast Analytics system has also been added to help enhance the coverage. The system live-tracks all the cars during each session and allows production to show a virtual replay of any lap run during practice, qualifying, or the race. It can provide a combined display of how a single driver ran on different laps, showing changes made during the session, or compare how different drivers ran the same lap. These options allow fans to see the key moments of each session and better understand how they affected where each driver finished.
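One way to picture the lap-comparison feature is to keep timestamped track positions for every driver and resample each lap onto a common distance axis, so any two laps can be replayed side by side. The sketch below is a simplified illustration of that idea with invented data shapes; it is not SMT’s actual data model.

```python
# Simplified lap comparison from live position data.
# Data shapes and sample values are assumptions for this sketch.
from bisect import bisect_left


def resample_lap(samples, step=10.0):
    """samples: list of (distance_m, elapsed_s) points for one lap.
    Returns elapsed time interpolated every `step` metres."""
    distances = [d for d, _ in samples]
    out, d = [], 0.0
    while d <= distances[-1]:
        i = bisect_left(distances, d)
        if i == 0:
            out.append((d, samples[0][1]))
        else:
            (d0, t0), (d1, t1) = samples[i - 1], samples[i]
            out.append((d, t0 + (d - d0) / (d1 - d0) * (t1 - t0)))
        d += step
    return out


def gap_trace(lap_a, lap_b, step=10.0):
    """Time gap (lap_b minus lap_a) at each resampled distance."""
    a, b = resample_lap(lap_a, step), resample_lap(lap_b, step)
    return [(d, tb - ta) for (d, ta), (_, tb) in zip(a, b)]


if __name__ == "__main__":
    lap1 = [(0, 0.0), (500, 9.8), (1000, 19.5), (1500, 29.7)]
    lap2 = [(0, 0.0), (500, 9.9), (1000, 19.4), (1500, 29.4)]
    for d, gap in gap_trace(lap1, lap2, step=500):
        print(f"{d:>6.0f} m  gap {gap:+.2f}s")
```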
In the Compound and Back Home in Stamford
Game Creek Video’s PeacockOne (A and B units) will once again serve as the home to the NASCAR on NBC production team on-site, while an additional pair of Game Creek trucks will house mix effects and editing, as well as robo operations and tape release. In all, NASCAR truck compounds will be stocked with an average of 19 trailers (including BSI, Sportvision, NASCAR operations, and more).
“NASCAR does a great job setting up the compounds for us and providing a beautiful sandbox for us to play in,” says Thomas.
In addition, the NBC production team continues to rely more and more on file-sharing with the NBC Broadcast Center in Stamford, CT. AT&T and PSSI have partnered to establish fiber connectivity at the majority of the NASCAR tracks and will provide NBC with a circuit back to Stamford for file transfer, as well as home-running individual cameras for at-home productions. Pre- and post-race shows from the Peacock Pit Box will regularly send cameras back to a control room in Stamford, where the show will be produced.
“We started [producing shows out of Stamford] last year and we will expand it more this year,” says Thomas. “It worked well last year and we’re making some improvements this year to make it even more seamless. With the increased support from AT&T and PSSI for network connectivity, I think it’s going to be even better this year. Obviously there are big cost savings on travel [as a result], but the product is of the same quality – so it’s really a win-win.”
SMT (SportsMEDIA Technology) continues its collaboration with the American Flag Football League (AFFL) to provide game management technology for the AFFL’s first U.S. Open of Football Tournament (USOF). The teams playing in the Ultimate Final at BBVA Compass Stadium in Houston will battle for a $1 million cash prize. SMT technical teams will be onsite at the USOF Tournament for every game, providing the customized virtual and clock-and-score technology and graphics package that helped to define the league last year during its launch on June 27 at Avaya Stadium. Retired NFL stars return to the field to captain the teams, along with basketball legends and an Olympic gold medalist. SMT’s virtual 1st & Ten line system, used in NFL broadcasts, will be deployed from its Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used during NBC Sports’ 2017 season. SMT’s virtual Go Clock, designed specifically for the fast-paced AFFL, will synchronize with in-stadium displays to indicate when the defense can rush the quarterback.
SMT’s Design Studio designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package. SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats, the same system SMT uses for NCAA and NFL games. “SMT is proud to have helped the AFFL launch a new sports era, and we are thrilled to build on last year’s great success by offering flag football fans the same platform they’re used to when watching college and NFL games,” says Ben Grafchik, SMT Business Development Manager. “With the debut of our dual-channel SportsCG, we can decrease the production bottleneck associated with rendering graphics on-air, allowing the quickly developing storylines to be told in a more dynamic way.”
June 17, 2018
Sports Video Group
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
The front-bench area in Game Creek Video’s Encore truck is at the center of Fox Sports’ U.S. Open coverage.
Game Creek Video’s Encore production unit is at the center of the coverage for Fox and FS1 with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit is handling production control for one of the featured groups, Edit 4 is handling all iso audio mixes, and Edit 2 is home to five edit bays with equipment and support provided by CMSI. And there is also the 4K HDR show, which is being produced out of Game Creek Maverick.
“All the Sony 4300 cameras on the seventh through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” says Brad Cheney, VP, field operations and engineering, Fox Sports. There are also six Sony PXW-Z450’s for the featured holes and featured group, the output of two of them delivered via 5G wireless.
“We are producing two 4K HDR shows out of one mobile unit with four RF-based 4K cameras,” he adds. “That is another big step forward.”
In terms of numbers, Fox Sports has 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps) and 218 microphones. Tons of data is being passed around: 3 Gbps of internet data is managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
A Second Compound
Each course provides its own unique challenges. At Shinnecock Hills, there is the presence of roads running through the course, not to mention the hilly terrain, which also has plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
Director, Field Operations, Sarita Meinking (left) and VP, Field Operations and Engineering, Brad Cheney are tasked with keeping Fox Sports’ U.S. Open production running smoothly.
“We came out here 18 months ago,” says Cheney, “and, when we placed all of our trucks in the compound map, [they] didn’t fit, and that is without the world feed, Sky, TV Asahi, and others. At Erin Hills last year, we had a support tent, and that gave our camera crew more space, dry storage, and a place to work.”
The decision was made to expand on what was done at Erin Hills last year: move the production operations that most benefit from being close to the course to a large field tent located along the third hole. The field tent is about a half mile from the main compound and is home to the technology area (shot-tracing technologies, etc.); the camera, audio, and RF areas; and the robotic cameras provided by Fletcher. Inertia Unlimited President Jeff Silverman is also located in the tent, controlling X-Mo cameras as well as robotic cameras that can be moved around the course to provide different looks.
Cheney says the team took the field tent to a new level by providing an integrated source of distribution and monitoring so that it could effectively be an island to itself. “It has worked out well. People are comfortable there. It’s dry and offers direct access to the course.”
According to Michael Davies, SVP, technical and field operations, Fox Sports, some of the operations in the field tent, such as those related to enhancements like shot tracing and the Visual Eye, could ultimately move even farther from the main compound.
“Typically, they would be in the main compound,” he explains, “but, once we figured out how to connect the two compounds via fiber for a half mile, it [indicates] how far away you can put things [like the shot-tracking production]. It gets the mind going, especially for events like this that can be hard to get to.”
Fox Fiber Technician Bryce Boob (left) and Technical Producer Carlos Gonzalez inside the fiber cabin
Also located closer to the course is the fiber cabin, a move that allows the team to more quickly deal with any connectivity issues on the course. The 37 miles of fiber cable used across the course is monitored in the cabin, where Carlos Gonzalez, technical producer, Fox Sports, and his team troubleshoot and solve any issues.
“We’re isolated from the compound, which can make it a challenge,” he notes, “but we are actually liking it.”
Cheney says that placing the cabin closer to the course means a reduction in the amount of outbound fiber and also makes the operation a true headend. “It’s something that we will continue to do at Pebble next year [for the 2019 U.S. Open] because of the setup there. This has been another good learning experience for us.”
Steps Forward
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch.
“All of the programming in the router was there since day one [in 2015], and we have found new ways to do things,” says Cheney. “To strategically try to pull things out of it just wasn’t worth it. So we started from zero, and it paid off in terms of how quickly we could get up and running.”
Also playing an important part in enhancing the workflows were CMSI and Beagle Networks, which made sure networks and editing systems were all ready to go.
“The team from CMSI and Beagle Networks has been phenomenal in wiring up our networks and making sure it’s robust and all-encompassing,” says Cheney. “We also figured out new ways with IP to control things, move signals, and offer better control for our operators no matter where they are.”
RF wireless coverage this year is being provided completely by CP Communications. There are 26 wireless cameras on the course plus 18 wireless parabolic mics and nine wireless mics for talent on the course. All the signals are run via IP Mesh control systems, and CP Communications also provided all the fiber on the course.
The 5G setup includes a 5G cell mounted on the tower connected to processing gear on the back of a buggy.
Fox Sports is at the forefront of wireless innovation, working with Ericsson, Intel, and AT&T on using next-generation 5G wireless technology to transmit 4K HDR signals from Sony PXW-Z450 cameras to the compound. The 4K cameras are wired into an Ericsson AVP encoder, which sends an IP signal to an Intel 5G MTP (Mobile Trial Platform), which transmits the signal in millimeter wave spectrum via a 28-GHz link to a 5G cell site mounted to a camera tower. That cell site is connected to the Fox IP Network and, in the production truck, to an Ericsson AVP that converts the signal back to baseband 4K.
The potential of 5G is promising, according to Cheney. First, the delay is less than 10 ms, and, conceptually, a 10-Gbps (or even 20-Gbps) 5G node could be placed in a venue and the bandwidth parsed out to different devices, such as cameras, removing the need for cabling.
“You can fully control the system as a whole versus allowing direct management on the device level,” he says.
And, although the current setup requires a couple of racks of equipment, the form factor is expected to get down to the size of a chip within a year.
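As described, the signal chain runs camera, encoder, 5G radio, cell site, IP network, decoder. A toy latency-budget check along those lines is sketched below; the per-hop millisecond figures are invented placeholders for illustration, not measured values from the Fox/Ericsson/Intel/AT&T trial.

```python
# Toy latency budget for the camera-to-truck 5G chain described above.
# Hop names follow the article; the millisecond figures are placeholders.

CHAIN_MS = [
    ("Sony PXW-Z450 -> Ericsson AVP encode", 3.0),
    ("AVP -> Intel 5G MTP (IP handoff)",     0.5),
    ("28-GHz mmWave link to cell site",      1.0),
    ("Cell site -> Fox IP network",          1.5),
    ("Truck AVP decode to baseband 4K",      3.0),
]
BUDGET_MS = 10.0  # the sub-10 ms end-to-end delay cited above

total = sum(ms for _, ms in CHAIN_MS)
for hop, ms in CHAIN_MS:
    print(f"{hop:<40} {ms:4.1f} ms")
print(f"{'total':<40} {total:4.1f} ms "
      f"({'within' if total <= BUDGET_MS else 'over'} the {BUDGET_MS} ms budget)")
```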
Expanding Innovation
In terms of production elements, Fox Sports’ commitment to ball-tracing on all 18 holes continues in 2018, with the network equipping each tee box with Trackman radar technology. Eight holes are equipped to show viewers a standard ball trace over live video, with enhanced club and ball data. The other 10 holes have Fox FlightTrack, a live trace over a graphic representation of the golf hole, offering more perspective to the viewer.
Beyond tee-shot tracing, three roaming RF wireless cameras are equipped with Toptracer technology, providing trace on approach shots. And new this year is FlightTrack for fairway shots on two holes, Nos. 5 and 16.
Zac Fields, SVP, graphic tech and innovation, Fox Sports, says the goal next year is to expand the use on fairways. “We want to do more next year and also find a way to use that on taped shots as well.”
Virtual Eye, the system at the core of FlightTrack that takes a 3D model of a hole and uses shot data from SMT as well as from the Trackman and Toptracer shot-tracking systems to show the ball flight within the 3D model, has also been expanded. The Virtual Eye production team began its U.S. Open preparation a couple of months back by flying a plane over the course and capturing photos to map the topography. Then, a few weeks ago, a helicopter shot video of the course, and pictures were extracted from the video and laid over the topographical images.
The FlightTrack team is located inside the field tent, making it easier to hit the course and fix any issues related to shot-tracking technology.
One of the goals, says Ben Taylor, operations manager, Virtual Eye, has been to make the system more automated and to allow it to be used on taped shots. For example, the EVS-replay users themselves can now trigger Virtual Eye to be active with the push of a button. And, when the ball comes to rest, the graphic slides off the screen.
“The system will reset in the background after the shot,” he notes.
Fields and the Fox team have been happy with the performance, particularly the ability for EVS operators to control the graphic overlay. “It’s pretty slick,” he says. “The system takes the EVS feed and runs it through the graphics compositor and then back into the EVS, so the EVS system is recording itself. It seems complex, but, once the operator gets used to it, it’s easy. And now they can do FlightTrack a lot more.”
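Conceptually, the loop Fields describes is a replay feed passing through a compositor and back to the recorder, with the trace active only between the operator’s trigger and the ball coming to rest. The classes and flow below are hypothetical stand-ins used to illustrate that lifecycle; they are not the EVS or Virtual Eye APIs.

```python
# Conceptual overlay lifecycle for a replayed shot, per the description above.
# All names here are hypothetical, not vendor APIs.
from dataclasses import dataclass


@dataclass
class Frame:
    index: int
    ball_moving: bool


def overlay_loop(frames, triggered: bool):
    """Yield (frame_index, overlay_on) for each frame of a replayed shot."""
    active = triggered
    for frame in frames:
        yield frame.index, active
        if active and not frame.ball_moving:
            active = False  # ball at rest: slide the graphic off and reset


if __name__ == "__main__":
    shot = [Frame(i, ball_moving=(i < 5)) for i in range(8)]
    for idx, on in overlay_loop(shot, triggered=True):
        print(f"frame {idx}: trace {'on' if on else 'off'}")
```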
When Fox Sports took on the challenge of the U.S. Open in 2015, the industry watched to see how it would change the perception of golf coverage. Four U.S. Opens later, it is clear that the innovative spirit that has been part of Fox Sports since its early days continues unabated, especially as the era of sports data takes hold of the visualization side.
“We want to bring the CG world into our coverage and create animations to tell stories like comparing every tee shot a player took on a certain hole or comparing Dustin Johnson’s fade with another player’s draw,” says Fields. “And now we can show how the wind will affect a shot.”
June 8, 2018
Sports Video Group
With the second Triple Crown in just four years on the line, NBC Sports Group is pulling out all the stops for coverage of this weekend’s 150th Belmont Stakes. With Justify poised to capture the final gem of the Triple Crown, NBC Sports Group has boosted its production complement, adding a second onsite studio set, live pointer graphics to identify Justify on the track, and five additional cameras, including the Bat Cam aerial system that drew rave reviews at both the Kentucky Derby and the Preakness Stakes.
“Once Justify won Preakness, we knew what we were in for, and we started putting everything in motion right away,” says Tim Dekime, VP, operations, NBC Sports Group. “The [equipment levels] were increased a good bit, and we added all the bells and whistles. It means a lot more work and preparation, but it’s very exciting for us, and we are very well-prepared.”
All Eyes on Justify: More Cameras and Virtual Tracking Graphics
NEP’s ND1 (A, B, C, and D units) mobile unit will once again be on hand to run the show, with a total of 43 cameras deployed — up from 33 for last year’s non-Triple-Crown race. Besides the Bat Cam aerial system covering the backstretch, the camera arsenal includes a Sony HDC-4800 4K camera (outfitted with a Canon UHD 86X lens) on the finish line, five HDC-4300’s running at 6X slo-mo and five more running at 60 fps, 14 HDC-2500’s (eight hard, six handheld), five HDC-1500’s in a wireless RF configuration (provided by BSI), a bevy of robos (provided by Fletcher) and POVs, and an aerial helicopter (provided by AVS, weather permitting).
Ready for a Triple Crown effort at Belmont: (from left) NEP’s John Roché and NBC Sports Group’s Keith Kice and Tim Dekime
Five other cameras have been added because of the Triple Crown possibility: a POV camera at Justify’s gate and one in the PA booth with announcer Larry Collmus (which will be streamed live on the NBC Sports App), a robo to capture a 360° view of the paddock, an additional RF camera roaming the grounds, and, most notably, the Bat Cam system.
In addition to more cameras, NBC plans to use SMT’s ISO Track system to identify Justify with a virtual pointer graphic live during the race. The system will incorporate real-time data — speed, current standing, and distance from finish line — into the on-air pointer graphic, helping viewers follow Justify and other key horses throughout the day’s races.
“We’ll have a live pointer that tracks Justify during the race that our director [Drew Esocoff] will insert, if needed, [so] the horse will be tracked for the viewers watching at home,” says Coordinating Producer Rob Hyland. “It will have a little arrow pointing to where he is at certain points in the race.”
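In data terms, the pointer is a small per-horse record refreshed throughout the race and rendered next to the tracking arrow. The field names and sample values below are placeholders chosen for this sketch, not SMT’s ISO Track schema.

```python
# Illustrative composition of the on-air pointer readout described above.
from dataclasses import dataclass


@dataclass
class HorseTrack:
    name: str
    speed_mph: float
    standing: int            # current position in the field
    to_finish_m: float       # distance remaining to the finish line


def pointer_label(h: HorseTrack) -> str:
    """Build the text block rendered alongside the tracking arrow."""
    return (f"{h.name} | {h.speed_mph:.1f} mph | "
            f"P{h.standing} | {h.to_finish_m:.0f} m to finish")


if __name__ == "__main__":
    print(pointer_label(HorseTrack("Justify", 37.4, 1, 402.0)))
```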
Bat Cam Covers the Backstretch
The Bat Cam was a hit at both Churchill Downs and Pimlico, providing a never-before-seen view of the backstretch and also coming in handy when rain and fog complicated matters for NBC at both the Derby and the Preakness. The two-point cable-cam system can travel 80 mph along the backstretch, running 15-18 ft. above the ground.
“NBC had already used the Bat Cam on NASCAR, so we knew what to expect at the Derby, and it was just a matter of figuring out how to implement it into our show,” says Keith Kice, senior technical manager, NBC Sports. “It’s turned out to be a great [tool for us], especially at [the Preakness]. Even if it wasn’t for all the fog, the infield [at Pimlico] with all the tents and stages and infrastructure makes it very difficult; you really need the Bat Cam just to cover the backstretch because you can’t see it otherwise.”
Given the massive size of the Belmont track, the Bat Cam will cover more ground than at either of the two prior races but will not cover the entire backstretch. The system will run 2,750 ft. — more than 700 ft. longer than at the Kentucky Derby, 500 ft. longer than at the Preakness Stakes — of the 3,000-ft. backstretch.
“The length of the backstretch was definitely a challenge in getting the Bat Cam unit [installed],” says Dekime. “But the benefit here as opposed to Preakness is that there’s nothing in the infield the way that it’s one big party at Pimlico. We are unencumbered, so that’s a positive.”
Although NBC and the Bat Cam team were forced to bring in larger cranes at Belmont in order to install the longer system, says NEP Technical Director John Roché, setup and operation of the Bat Cam has improved significantly since the Derby.
“It’s no longer a science experiment like it was before,” he says. “We’re able to get [Bat Cam owner/operator] Kevin Chase all the gear that they need, and they are able to give us what we need pretty easily in terms of terminal gear, intercoms, and everything. It’s pretty much plug-and-play now.”
Hyland adds that the Bat Cam “will not only cover the backstretch of the race but will also provide dramatic reset shots of this vast facility. When the Triple Crown is on the line at Belmont, the energy in this venue is electric, and we want to capture the sense of place.”
Triple Crown Chance Warrants Double the Sets
Besides additional cameras because of the Triple Crown potential, NBC Sports has also added a second studio set. Host Mike Tirico and analysts Randy Moss and Jerry Bailey will man the 18- x 18-ft. set at the finish line, and a secondary 24- x 24-ft. stage located near Turn 2 will feature host Bob Costas and other on-air talent.
“If it was not going to be a Triple Crown, we would likely be down to just the finish-line set,” says Dekime, “but, now that it is, we’ve put the Turn 2 set back into operation.”
SMT’s Betting and Social Media GOTO videoboard will also be situated at the main set for handicapper Eddie Olczyk, who will use the interactive touchscreen for real-time odds and bet payouts for all races throughout the day. The betting touchscreen will enable him to explain to viewers how he handicaps specific races.
In addition to the onsite sets, NBC plans to incorporate several live remote feeds into the telecast, including from Churchill Downs.
“We brought out all of the tools to showcase the Triple Crown attempt, including a number of remotes that will carry live shots from Churchill Downs, where it all began five weeks ago,” says Hyland. “There will be hundreds of people gathered watching the race. We may have a live remote shot from a Yankees-Mets game just a few miles away. We’re working on a couple other fun ones as well, just to showcase this day and this athletic achievement, should it happen.”
Looking Back at a Wet and Wild Triple Crown Campaign
Although the horse-racing gods have granted NBC the potential for a Triple Crown this weekend — and the big ratings that go along with it — the weather gods have not been so kind. After the wettest Kentucky Derby on record and the foggiest Preakness Stakes in recent memory, a chance of rain remains in the forecast for Saturday. However, Roché notes that the proliferation of fiber and the elimination of most copper cabling onsite have significantly reduced weather-related issues.
“Despite torrential downpours on the first two races, we’ve been really fortunate,” says Roché. “And no matter what happens here [in terms of rain], we’re getting a little spoiled having two Triple Crowns in [four] years after a 37-year drought. For us to be able to have an opportunity to show the public how we cover racing, especially with the addition of Bat Cam, in a Triple Crown situation is really an honor.”
Kice seconds that notion: “Having a Triple Crown [in play] makes all the hard work and troubles we went through with the weather and logistics on the first two races even more worthwhile.”
June 6, 2018
Sports Video Group
SMT will provide fan-engagement technology solutions for NBC Sports Group’s broadcast of the 150th Belmont Stakes. This year marks the eighth consecutive Triple Crown collaboration between SMT and NBC Sports Group and is particularly exciting as Justify seeks to become only the second horse since 1978 to win a Triple Crown.
Much as it did for the Preakness Stakes and the Kentucky Derby, SMT’s suite of products will engage viewers from gate to finish with real-time, data-driven graphics, up-to-the-second odds, and commentator analysis.
SMT’s Live Leaderboard System highlights the running order of the top six horses using positional data updated 30 times per second per horse, ensuring accuracy and speed for SMT’s on-air graphic presentation.
SMT’s ISO Track system identifies the horses and incorporates real-time data such as speed, current standing, and distance from finish line into an on-air pointer graphic, helping viewers follow the action during the race.
SMT’s ticker produces an on-air display of real-time odds and bet payouts using live data from the race’s Tote provider (in-house wagering system). The ticker also curates and visually displays social media feeds that give followers an inside look at happenings at the track.
SMT’s Track Map System gives viewers a display of the lead horse’s real-time position and split times via an on-screen graphic.
SMT’s Betting and Social Media GOTO video board features real-time odds and bet payouts for all the races throughout the day. The system provides an interactive system for talent to explain the process of horse wagering.
The Data Matrix Switchboard (DMX) provides a customized solution for each Triple Crown race, absorbing, collating, and synchronizing live data feeds into SMT’s proprietary horse-racing database. The DMX integrates live data for on-air and off-air graphics in real-time and replay modes, enhancing NBC’s live race presentation and pre- and post-race analysis. These displays also feature real-time advanced odds and minutes-to-post countdowns.
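At its core, the collating job described here can be thought of as merging several independently timestamped feeds into a single time-ordered stream that graphics can read from. The sketch below illustrates only that merge step, with invented feed names and payloads; it is not SMT’s proprietary database.

```python
# Simplified merge of multiple live feeds onto one timeline, in the spirit
# of the Data Matrix Switchboard described above. Feed names are invented.
import heapq


def merge_feeds(feeds):
    """feeds: dict of feed_name -> list of (timestamp_s, payload), each list
    sorted by timestamp. Yields one merged, time-ordered stream."""
    heap = []
    for name, events in feeds.items():
        it = iter(events)
        first = next(it, None)
        if first:
            heapq.heappush(heap, (first[0], name, first[1], it))
    while heap:
        ts, name, payload, it = heapq.heappop(heap)
        yield ts, name, payload
        nxt = next(it, None)
        if nxt:
            heapq.heappush(heap, (nxt[0], name, nxt[1], it))


if __name__ == "__main__":
    feeds = {
        "tote_odds":   [(12.0, {"Justify": "4-5"}), (45.0, {"Justify": "3-5"})],
        "positions":   [(30.0, {"leader": "Justify"})],
        "split_times": [(30.5, {"half_mile": "46.2"})],
    }
    for ts, name, payload in merge_feeds(feeds):
        print(f"t={ts:5.1f}s  {name:<11} {payload}")
```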
“With a Triple Crown in play for the second time in four years, SMT has another unique chance to help document a historic moment,” says Ben Hayes, Manager, Client Services, SMT. “Our systems help novice race fans understand the core aspects of the sport, while also providing in-depth betting and live race analysis for racing aficionados.”
April 24, 2018
Golf Channel
World No. 1 Justin James, Defending Champion Ryan Reisbeck & 2013 Volvik World Long Drive Champion Heather Manfredda Headline First Televised Event of 2018 from Long Drive’s Most Storied Venue
Veteran Sports Broadcaster Jonathan Coachman Making Golf Channel Debut; Will Conduct Play-by-Play at Each of the Five Televised WLDA Events in 2018
Eight men and four women have advanced to compete in tonight’s live telecast of the Clash in the Canyon World Long Drive Association (WLDA) event, airing in primetime from Mesquite, Nevada, at 7 p.m. ET on Golf Channel. Staged in partnership with Golf Mesquite Nevada at the Mesquite Regional Sports and Event Complex, the first televised WLDA event of 2018 is headlined by World No. 1 Justin James (Jacksonville, Fla.), defending Clash in the Canyon champion Ryan Reisbeck (Layton, Utah), and 2013 Volvik World Long Drive champion Heather Manfredda (Shelbyville, Ky.).
A familiar setting in World Long Drive, Mesquite previously hosted the Volvik World Long Drive Championship and a number of qualifying events dating back to 1997, including the World Championship, which was staged at the same venue as the Clash in the Canyon from 2008 to 2012.
FORMAT: The eight men advanced from Monday’s preliminary rounds that featured a 36-man field and will compete within a single-elimination match play bracket during tonight’s live telecast. The four women advancing from this morning’s preliminary rounds (18-person field) also will utilize a single elimination match play bracket this evening to crown a champion.
COVERAGE: Live coverage of the Clash in the Canyon will air in primetime on Golf Channel from 7-9 p.m. ET tonight, with Golf Central previewing the event from 6-7 p.m. ET. An encore telecast also is scheduled to air later this evening on Golf Channel from 11 p.m.-1 a.m. ET. Fans also can stream the event live using the Golf Channel Mobile App, or on GolfChannel.com.
The production centering around live coverage of the competition will utilize six dedicated cameras, capturing all angles from the hitting platform and the landing grid, including a SuperMo camera as well as two crane-positioned cameras that will track the ball in flight once it leaves the competitor’s clubface. New for 2018 will be an overlaid graphic line on the grid, the “DXL Big Drive to Beat” (similar to the “1st & 10 line” made popular in football), displaying the longest drive during a given match to signify the driving distance an opposing competitor will need to surpass to take the lead. The telecast also will feature a custom graphics package suited to the outsized swing data typically generated by Long Drive competitors, tracking club speed, ball speed, and apex in real time via Trackman. Trackman technology also will provide viewers with a sense of ball flight, tracing the arc of each drive from the moment of impact.
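The “Big Drive to Beat” line reduces to a running maximum over the drives recorded so far in a match. The sketch below shows that logic with made-up distances; apart from the overlay’s name, nothing here reflects the broadcast’s actual graphics code.

```python
# Running "line to beat" for a long-drive match, per the overlay described above.
# Sample distances are invented for illustration.

def drive_to_beat(drives_yds):
    """Return the longest drive seen so far after each attempt."""
    best, lines = 0.0, []
    for d in drives_yds:
        best = max(best, d)
        lines.append(best)
    return lines


if __name__ == "__main__":
    match = [356.0, 371.5, 344.0, 402.3]
    for drive, line in zip(match, drive_to_beat(match)):
        print(f"drive {drive:6.1f} yds -> line to beat {line:6.1f} yds")
```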
BROADCAST TEAM: A new voice to World Long Drive, veteran sports broadcaster Jonathan Coachman will conduct play-by-play at each of the five WLDA televised events on Golf Channel in 2018, beginning with the Clash in the Canyon. Art Sellinger – World Long Drive pioneer and two-time World champion – will provide analysis, and Golf Channel’s Jerry Foltz will offer reports from the teeing platform and conduct interviews with competitors in the field.
DIGITAL & SOCIAL MEDIA COVERAGE: Fans can stay up-to-date on all of the action surrounding the Clash in the Canyon by following @GolfChannel and @WorldLongDrive on social media. Golf Channel social media host Alexandra O’Laughlin is on-site, contributing to the social conversation as the event unfolds, and the telecast will integrate social media-generated content using the hashtag #WorldLongDrive.
In addition to the latest video and highlights from on-site in Mesquite, www.WorldLongDrive.com will feature real-time scoring. Golf Channel Digital also will feature content from the Clash in the Canyon leading up to and immediately following the live telecast.
Coming off record viewership in 2017 and a season fueled by emergent dynamic personalities, the Clash in the Canyon is the second official event of the 2018 World Long Drive season, following Justin Moose’s win at the East Coast Classic in Columbia, South Carolina, last month.
Showcasing the truly global nature of World Long Drive, several events will be staged in 2018 through officially sanctioned WLDA international partners, including stops in Germany, Japan, New Zealand and the United Kingdom. Additionally, an all-encompassing international qualifier will be staged (late summer) featuring a minimum of four exemptions into the Open Division of the Volvik World Long Drive Championship in September.
April 15, 2018
Boston.com
The light at the end of the tunnel for Boston Marathon runners making the final turn onto Boylston Street will be shining a little brighter this year. One of the changes the Boston Athletic Association made to the finish line for Monday’s 122nd running of the race is a new digital display board, affixed to the photo bridge above the finish line, that will be visible even if the forecasted rain falls.
“The finish times are going to be displayed big and bright and in color on that video board so that the participants and the spectators on Boylston Street will be able to see from afar what the time is,” said Jack Fleming, Chief Operating Officer of the B.A.A.
For their first year with the new board, which is similar to those that ring Gillette Stadium or TD Garden, the race organizers intend to go with a conservative approach and minimal animation. On Friday, it displayed a countdown clock for Saturday’s 5K and on Sunday it will show a tribute to One Boston Day. But the digital display opens up a new path forward for the finish line, and Fleming said that the B.A.A. could use lights and sound to enhance the spectator experience in the years to come.
“Boylston Street is like the home stretch of the Kentucky Derby or when the team comes out of the tunnel in Gillette Stadium,” he said. “We want our participants to feel that same way.”
In 2021, during the 125th Boston Marathon, don’t be surprised if the roar of the crowd over the final 500 meters is set to a background beat. But Fleming said the aesthetic changes will be made in keeping with the tradition of the event. Of course, no matter what sounds are added, the loudest noise in the runners’ heads will always be the ticking of the clock.
To that end, the organizers swapped the old clock — suspended by cable and beam above the street — for two consoles with double-sided clocks facing the oncoming runners on one side and the world’s media on the other. The race tape will be suspended in between the two consoles, and after the elite runners break the tape it will be wheeled out of the way.
Dave McGillivray, the race director, said that runners will notice some changes this year and a few more next year, building towards 2021 when the B.A.A. plans to showcase the finish line as part of the quasquicentennial celebrations. For that race, the organizers are also considering a request for an increased field size or more ancillary events around the Marathon.
The Boston Marathon finish line: a painted strip across a city street that’s taken on a meaning far beyond that.
“Everything to do with 2013 showed us just how loved Boylston Street is by our participants, by our fans, by the neighborhood, by the community,” Fleming said. “So that was sort of the inspiration for taking some actions on it.”
March 23, 2018
Sports Video Group
Although augmented reality is nothing new to sports production — the 1st & Ten line celebrates its 20th anniversary this year — AR has taken a giant leap in the past three years and is dramatically changing the way stories are told, both on the field and in the studio.
From left: Turner Studios’ Zach Bell, Fox Sports’ Zac Fields, Vizrt’s Isaac Hersly, SMT’s John Howell, and ChyronHego’s Bradley Wasilition
At SVG’s Sports Graphics Forum this month, a panel featuring executives from Fox Sports, Turner Sports, The Future Group, ChyronHego, SMT, and Vizrt discussed best-use cases, platforms, and workflows for AR, as well as how its use within live sports coverage is evolving. The one principle the entire panel agreed on was that AR cannot be used for technology’s sake alone: these elements must be used to further the story and provide valuable information to fans.
“Our philosophy has always been to use [AR] as a storytelling tool. We try not to use it for technology’s sake – whether that is in a live event or in the studio,” said Zac Fields, SVP, graphic technology and innovation, Fox Sports. “The interesting thing is that people can interact with [AR] on their phones and are familiar with what AR is now. That puts the onus on us to present those elements at an even higher quality now. [AR has] become the norm now, and it’s just going to continue to grow. The tools are there for people to come up with new ideas. The one thing that I would hope is that we can make it easier [to use] moving forward.”
Fields’s desire for more user-friendly AR creation and integration was echoed throughout the panel by both users and vendors. Although a bleeding-edge AR project may be exciting and create a new experience for the fan, the goal is to create a solution that can be set up and used simply for every game.
“We’re trying to make sure that customers have ease of usability and repeatability every day,” said Isaac Hersly, director, business development, Vizrt. “It is an issue, and we are always looking for tools that are going to make it easier to set up and not need a rocket scientist. You [need to be able to] have someone that can operate the system very simply. That is our challenge, and we are always looking to come up with solutions to solve that.”
Turner Sports Brings Videogame Characters to Life With AR
Last year, Turner Sports teamed with The Future Group to introduce augmented reality to its ELEAGUE coverage. The two companies worked with Ross Video to create lifelike incarnations of videogame characters, allowing fans tuning in to watch games like Street Fighter V or Injustice 2 to see these characters brought to life in the studio.
“I think creating AR characters from the games and bringing them to the audience adds an enormous amount of value for the fans and the viewing experience,” said Zach Bell, senior CG artist, Turner Studios. “If you can take characters or aspects of the game and have them as dimensional elements within that environment, it creates a much richer experience and allows fans of the game to visualize these characters in a new way. That in itself adds an enormous amount of connection to the experience for the viewer.”
Although esports presents a different case from a live game taking place on a field, Bell said, he believes similar AR elements will soon be making their way into live sports content (for example, NBC’s 3D AR elements from player scans during Super Bowl LII).
More Than Just a Game: Bringing AR to the Masses
It was only a couple years ago that high-end AR elements were reserved for the highest-profile sports events, such as NFL A games. However, with the technology’s rapid advance in recent years, AR has become ubiquitous for most national-level live sports productions and is making its way into even lower-tier properties. In addition, AR elements are becoming available on multiple cameras rather than just the main play-by-play camera (such as the SkyCam), and these systems can even be remotely controlled from offsite.
“The technology is allowing us to drive the next generation of this [content],” noted John Howell, creative strategist, SMT. “We have done the yellow [1st & Ten] line for 20 years, but, two years ago, SMT helped to create a technology that allowed us to do it on the SkyCam. Having that optical vision tracking to create the pan-tilt information off a $30,000 camera head for an image has enabled us not only to do this off the SkyCam but also to do it remotely.
“[That allows us to deploy AR] on more shows [more cheaply],” he continued, “and that technology will then trickle down to more shows. It won’t be just on Fox’s 4 p.m. Sunday NFL game or ESPN’s MNF or NBC’s SNF; now this [technology] gets to go on a lot more shows.”
What’s Next?: Getting More From Player-Tracking Chips, Customizing AR
The use of AR and the technology driving it has evolved rapidly over the past few years, raising the question, What’s next? The panel had plenty of predictions regarding the next great leap forward, but the primary point of excitement revolved around the continued advance of player-tracking RFID chips, especially the NFL’s Next-Gen Stats system.
“With the emergence of Zebra [Technologies] chips on players and [the NFL] looking at instrumenting the football [with a chip], you could see how that can tie to your first-down line [graphic],” said Bradley Wasilition, director, sports analysis/lead sports analyst, ChyronHego. “The first-down line could actually dynamically change color, for example, when the first down is reached. Now, when that chip crosses that line, you can [definitively] say whether it is a first down or a player was out of bounds [on the sideline].
“Or think of a dynamic strike zone in baseball or a dynamic offside line in soccer,” he continued. “These are all different things that don’t necessarily reinvent the wheel, but they take baseline AR and move it into the 21st century.”
Fields predicted that, as multiplatform content and OTT outlets grow, fans will someday be able to customize their own AR elements within the sports coverage they are watching: “Eventually, it will get to a point where we can put this data in the hands of the viewer on an OTT offering. Once that happens, they can choose to turn off the strike zone over the plate. That is when we’ll really get some flexibility and customization to people so [viewers] can enhance [their experience].”
March 16, 2018
Avixa
Sports. The great common denominator of all conversation. Even if you don’t like sports, you know enough to be able to talk about it, at least for a minute. And sports, by convenient association, is actually one of my favorite ways to talk about what it is that AVIXA members do.
We tell sports stories. Through gigantic video boards (forever “Jumbotrons” to the layman, and hey, that’s alright), humongous speaker systems, tiny microphones, variably-sized digital signage displays and perceptually invisible but actually ridiculously huge lighting systems and projection mapping, AV experience designers make the live event into a highlight reel. Everything has impact, in real-time.
So it happens to be that I’m forever on the lookout for evolving ways to tell sports stories in venues. In reading Sports Video Group’s coverage of the Super Bowl, I found another great angle on stadium storytelling. Most sports fans know that we are in the age of abundant sports data analytics, but what I didn’t know is that we are also in the era where those next-gen stats are changing the in-house show on the big screens at stadiums.
In a first for the Super Bowl, the 2018 game brought some television broadcast features to the in-house displays at U.S. Bank Stadium. And on top of that, they challenged audiences with a whole new graphics package featuring next-gen stats (“NGS” if you’re savvy).
With production tools by SportsMEDIA Technology (SMT), the virtual yellow line and some cool new NGS factoids made it to the big-time on the live-game displays. The latter of these came from SMT’s tapping into the NFL Next Gen Stats API to go deeper with the data.
SMT’s goal to delight fans with even more details to obsess over during the game seems like a good one. Especially because, well, “NFL fans are insatiable — they want data,” said Ben Grafchik, Business Development Manager for SMT.
To meet that need, SMT is exploring ways to tie in traditional data points with NGS in a visual format that fans can easily consume during a game. The objectivity and analytical depth of these additions to videoboard storytelling are compelling to all diehard fans, but, in particular, the next-gen stats appeal to next-gen fans, Grafchik added.
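As a rough picture of what tying traditional data points to tracking data can look like in practice, the sketch below turns a handful of tracking samples into a single videoboard-friendly stat. The JSON payload shape is hypothetical and is not the actual NFL Next Gen Stats API.

```python
# Hedged sketch: deriving a videoboard stat from tracking samples.
# The payload shape is hypothetical, not the NFL Next Gen Stats API.
import json

sample_payload = json.dumps({
    "player": "WR 18",
    "samples": [  # [seconds, yards travelled] during one play
        [0.0, 0.0], [1.0, 7.5], [2.0, 16.9], [3.0, 27.2], [4.0, 36.1]
    ],
})


def top_speed_mph(payload: str) -> float:
    """Max speed over any one-second window, converted to mph."""
    pts = json.loads(payload)["samples"]
    best_yps = max((d1 - d0) / (t1 - t0)
                   for (t0, d0), (t1, d1) in zip(pts, pts[1:]))
    return best_yps * 3 * 3600 / 5280  # yards/s -> feet/s -> mph


if __name__ == "__main__":
    print(f"Videoboard graphic: top speed {top_speed_mph(sample_payload):.1f} mph")
```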
These new graphics may have been a first for the Super Bowl, but actually, Vikings fans enjoyed them for the entire season at home at U.S. Bank Stadium. SMT worked with the in-house production team there to add all sorts of visual spice to the show, gradually going more complex with the offerings as the season went on and fans became accustomed to the new depths of data exploration.
But football isn’t the only sport that’s receiving the NGS upgrade. SMT happens to provide video enhancement and virtual insertion graphics for hundreds of major U.S. and international sporting events and broadcasters. So watch for a lot more variety to come both in house and wherever else you consume your sports content. It will certainly give us all a lot more to talk about when we talk about sports.
March 14, 2018
Sportstar Live
For more than 100 years, tennis, unlike team sports, used statistics sparingly. Basketball, baseball and football needed a plethora of stats, such as shooting percentages, batting averages and touchdowns scored, to measure the performances of their athletes and teams. But tennis players were measured chiefly by their wins, losses, titles and rankings. After all, few cared if the Wimbledon champion made 64% of his first serves or the No. 1 player averaged 77 miles per hour on her backhand.
All that changed in the Computer Age. With more information than they ever dreamed possible, tennis coaches, players, media and fans suddenly craved all sorts of revealing match data, not to mention astute analysis of it. No longer was it just whether you won or lost that mattered, but how and why you won or lost — points, games, sets and matches. Training methods, stroke production, tactics and equipment were also dissected and analysed in much greater depth and detail than ever before.
As the demand for data burgeoned, new technologies, such as sophisticated virtual graphics, tracking technology, statistical applications and telestration, have provided yet more valuable services and information to give athletes that “extra edge.”
Like any prescient, enterprising pioneer, Leo Levin seized the opportunity by developing the first computerised stats system for tennis in 1982. Levin’s seminal work included creating the concept of the “unforced error” and coining the term, which is now used in most sports and even by pundits to describe a politician’s self-inflicted blunder.
Since then, the genial 59-year-old, based in Jacksonville, Florida, has covered more than 120 Grand Slam events and countless other tournaments to provide the Association of Tennis Professionals (ATP) and other businesses with match statistics. Levin, dubbed “The Doctor” by broadcaster Mary Carillo for his incisive diagnoses of players’ games, is currently director of sports analytics at SportsMEDIA Technology (SMT), a company that provides custom technology solutions for sporting events.
In this wide-ranging interview, Levin explains his many roles in the exciting, fast-growing field of analytics and how it has changed tennis for the better.
What is sports data analytics?
Sports data analytics is a combination of gathering and analysing data that focuses on performance. The difference between analysis and analytics is that analysis is just gathering the basic data and looking at what happened. Analytics is trying to figure out why and how the basic performance analysis works with other factors to determine the overall performance of the athlete or the team.
When and how did this field start changing amateur and pro tennis? And who were the pioneers?
Honestly, I was. At the end of 1981, the first IBM personal computer hit the market for general consumer use. By the middle of 1982, I was working with a company in California to develop the very first computerised stats system for tennis. The key factor was the way we decided to describe the results of a tennis point in three basic areas. The point had to end with a winner, a forced error, or an unforced error. That created the foundation for how we look at tennis today.
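A minimal sketch of that three-way point classification and the sort of tally a charting system could build from it follows; the data and field names are illustrative, not SMT's actual Match Facts schema.

```python
# Sketch of the winner / forced error / unforced error classification
# Levin describes, with a simple per-player tally. Data is illustrative.
from collections import Counter
from enum import Enum


class PointEnding(Enum):
    WINNER = "winner"
    FORCED_ERROR = "forced error"
    UNFORCED_ERROR = "unforced error"


def summarize(points):
    """points: list of (player_who_ended_the_point, PointEnding)."""
    totals = Counter()
    for player, ending in points:
        totals[(player, ending)] += 1
    return totals


charted = [
    ("Player A", PointEnding.WINNER),
    ("Player A", PointEnding.UNFORCED_ERROR),
    ("Player B", PointEnding.FORCED_ERROR),
    ("Player A", PointEnding.WINNER),
]
for (player, ending), count in summarize(charted).items():
    print(f"{player}: {count} x {ending.value}")
```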
How and when did you become interested in tennis analytics?
I was playing on the tennis team at Foothill College in Los Altos, California, about five miles from Stanford University. When I wasn’t playing matches, I was actually charting matches for my team-mates and then providing that information to the coach and the players to try to help them improve their games.
Brad Gilbert, a former world No. 4 and later the coach of Andre Agassi and Andy Murray, played on your Foothill team. Did you help him?
Brad was on that team, and it was interesting because in his first year, he played No. 2. The player who played No. 1 came to me before the state finals where he had to play Brad in the final, and asked me, ‘How do I beat Brad?’ I was able to give him specific information on strategy and tactics that helped him win the state title.
That was the year Brad took his runner-up trophy and smashed it against a tree and vowed never to lose a match the following year. And the following year, Brad didn’t lose a match.
SportsMEDIA Technology’s (SMT) products and services have evolved from a clock-and-score graphic in 1994 to innovative and sophisticated virtual graphics, tracking technology, statistical applications, and telestration. How do you and your team at SMT use these four methods to analyse statistical data at tennis’ four Grand Slams to provide valuable insight that helps players, coaches, broadcasters and the print media determine how and why a match was won or lost?
One of the challenges with tennis, more so than with any other major sport, is the lack of data. When we started doing this, there really wasn’t any consistent gathering of data from matches. So the first piece we developed was simply a system now known as Match Facts. It pulled factual statistical data directly from the chair umpire. That started with the ATP back in the early 1990s. We were then able to create a base for year-round information on the players. It allowed for the next level of analysis. It has expanded from there. We developed the very first serve speed system to start adding additional data and how players were winning or losing based on the serve speeds. As the technology improved, we’ve been able to harness the new generation — tracking video technology and then on the presentation side, using virtual graphics as a way to be able to place data directly into the field of play to help illuminate what is actually going on. Telestration is a tool that allows the broadcasters to get inside the points and help the fans understand the combinations of shots and strategies the players are using.
Your website (www.smt.com) has a section titled “Visual Data Intelligence” with the subtitle, “SMT delivers the world’s most innovative solutions for live sports and entertainment events across the globe.” What is Visual Data Intelligence? And what are its most important, innovative solutions for live sports and entertainment events?
Visual Data Intelligence goes to the heart of what we try to do as a company. In a lot of different sports, there is a lot of information available. But making it useful to the broadcasters, and specifically to the fans, to help them understand the game is a huge part of what we’re providing. That entails simple things like the first-and-10 line in football. That provides the visual set of information for the commentators and fans that really helps them understand where the teams are and how much yardage they need (to get a first down). It’s gotten to the point where fans in the football stadium are yelling, “Where’s the yellow line?” So we’re expanding that to provide the service to the large screens displayed inside the stadium so teams have their own system to be able to show that to the fans.
How does Visual Data Intelligence apply to tennis?
In tennis where you have a lot of data, the challenge is: how do you provide all that data to the fans and the commentators? We do that through a series of different systems. We have what we call our “open vision system,” which is an IPTV solution that has real-time scoring, stats and video as well as historical data. And it ties it all together and puts it in one place so it provides a true research tool for the commentators and the (print and online) media. Along with that, we have a product we call our “television interface,” which is really a system which drives graphics on air for the broadcasters. This tool allows them to look at the data and see where the trends are. Hit the button and have that information directly on the screen.
Please tell me about the new technology service partnership between Infosys and the ATP, and the analytics and metrics this partnership brings to the tennis world.
I’m not really that aware of what Infosys and the ATP are doing. But I do know that a lot of that hinges on the technology we created for Match Facts. One of the unique things about tennis is the scoring system. Unlike other sports, the player or team that wins the most points doesn’t necessarily win the match. That’s not how our scoring system works. I think they are trying to take a deeper look into the individual points, and how winning or losing specific points in key situations impacts a player’s ability to win or lose matches. The same is true for total games. That’s one of the challenges when you’re trying to do analysis of tennis. In a lot of other sports, you’re just looking at the raw numbers and saying how many points did he score or how many rebounds did she get or how many yards did they gain. But in tennis, it has to be compartmentalised into specific performances in specific situations.
How do insights from game and training data analytics improve coaching?
The key to coaching and player improvement is first to understand what is going on out on the court. It’s a matter of gathering data. One of the challenges tennis has faced because of its late start in the world of statistics and data analysis has been a reluctance by a lot of coaches and players to rely on anything other than what they see and feel. So the real challenge and the real key is to be able to relate the data to what coaches see and what players feel out on the court. When you can make that connection, you have a real chance for improvement.
What are one or two insights that have improved coaching?
The challenge is that every player is different. What the data analysis allows you to do is to customise those things and focus not on what a player does, but what your player does, and how you can get the most out of your player’s game. A simple example of this was when we first started doing detailed statistics and analysis, we worked with the Stanford University tennis programme. Their No. 1 woman player, Linda Gates, was struggling, and the coaches couldn’t figure out where or why. We did an analysis of her game, and we found out that she was dominating her service games on her service points in the deuce court, but she was struggling in the ad court. It wasn’t visually obvious. The coaches couldn’t put their finger on what the problem was. But once we started looking at the numbers and the data, it allowed them to focus in practices on her ad-court shot patterns. Linda went on to win the NCAA Championships that year, 1985, in singles and doubles (with Leigh Anne Eldredge).
An Infosys ATP “Beyond The Numbers” analysis of Rafael Nadal’s resurgence to No. 1 in the Emirates ATP Rankings showed that Nadal ranked No. 1 on tour in 2017 for winning return points against first serves, at 35.2 percent (971/2761). That metric shoots up to an astounding 43.4 percent (454/1045) for his clay-court matches. Which other stunning statistics help explain why other players have had outstanding years this decade?
This goes to the basics of looking at players’ strengths and weaknesses. One stat I always look at is serve and return performance because I still split the game up that way. It’s interesting that when you look at a player like Nadal, you see that he is not only dominant on return of serve. He’s also dominant on his own second serve.
Even with all the analytics we have, an old maxim still holds true: “You’re only as good as your second serve.” You’ll find the players at the top of the rankings for the last four or five years were also at the top of both second serve points won and return of second serve points. Despite all the focus on power and big serves, second serve performance is really a huge key to understanding a player’s overall strengths and weaknesses.
How much do the Women’s Tennis Association tour and its players take advantage of analytics?
Although the WTA was a little behind the ATP curve in terms of gathering and storing match data, the good news is that they’ve now caught up. Their association with SAP, and the fact that they’re now using a Match Facts system to provide data to the players on a match-by-match basis, has moved them up the curve.
Which pro players have benefited most from tennis analytics so far? And in what specific ways?
That’s a tough question. Because I don’t work directly with the players and coaches as I used to, I don’t know who is utilising the data more so than others. You can tell just by looking at Roger Federer’s improvement over the last year that his team used analytics to determine that he needed to be more aggressive on his backhand. He’s now hitting a much higher percentage of topspin backhands than he did in previous years and that change has made his game more balanced and puts a lot more pressure on his opponents. Playing to Roger’s backhand used to be the safe play — it’s not any more.
Another area of Federer’s game that came to light using analytics was the difference between his winning and losing matches at Wimbledon. When you compare his final match wins to his matches lost since he won his first Wimbledon in 2003 — 8 titles, 7 matches lost — the numbers that jump out are all about his return of serve, and specifically, his performance on break points. Federer’s serving performance barely changed, but his return game fell dramatically in his losses. In his Wimbledon final wins, Federer converted 30 of 69 break points for 44%. In his losses, he converted only 9 of 53 for 17%. In both cases, he averaged around 8 break points per match. In his wins, he converted almost 4 per match, but in his losses he converted just over once per match. His team looked at that crucial data and added in that nearly all his opponents served and volleyed 2% or less of their service points and concluded that Roger needed to work on hitting his returns deep and not worry about his opponents coming in behind their serves.
Younger players are taking most advantage of the information because they’ve grown up in that world. They’re used to the electronics and the digital experience and having all that information available to them.
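For readers who want to check the break-point arithmetic above, here is a quick back-of-the-envelope calculation using the figures as reported (30 of 69 works out to roughly 43-44 percent; the eight-win, seven-loss match counts come from the text).

```python
# Back-of-the-envelope check of the Wimbledon break-point numbers cited above.
wins_converted, wins_chances, wins_matches = 30, 69, 8
losses_converted, losses_chances, losses_matches = 9, 53, 7

print(f"Wins:   {wins_converted}/{wins_chances} = "
      f"{wins_converted / wins_chances:.0%}, "
      f"{wins_chances / wins_matches:.1f} chances and "
      f"{wins_converted / wins_matches:.1f} conversions per match")
print(f"Losses: {losses_converted}/{losses_chances} = "
      f"{losses_converted / losses_chances:.0%}, "
      f"{losses_chances / losses_matches:.1f} chances and "
      f"{losses_converted / losses_matches:.1f} conversions per match")
```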
How do these insights enhance the fan experience?
I credit (renowned former NFL analyst) John Madden for being one of the very first TV commentators to take fans inside the game and explain things they didn’t necessarily see. Madden would explain to women football fans what the centre or guard was doing on a particular play, and why that back ran for 50 yards: it was all because of one really good block.
What we’re trying to do in tennis and what these insights have provided is to do the same kind of things for tennis fans. Help get them inside the game so they understand the nuances of what’s happening on the court, and they’re not just watching two guys running around hitting the ball.
What is radar-based tracking, which is now used by the United States Olympic Committee (USOC) for every throw an Olympic athlete makes? Is it being used in tennis?
Radar-based tracking is simply tracking the speed and location of the ball or object that is being thrown or hit. Radar-based tracking has been typically used for service speeds in tennis. That is something we pioneered in the late 1980s. The tracking used in tennis has been video-based, as opposed to radar. The advantage of that is that you can track movement of the players as well as the movement of the ball and from a variety of positions and angles.
Can analytics predict which junior players will someday become world-class players or even champions? And if so, can it guide their coaches and national federations to increase the odds that will happen?
Not yet. The challenge is that prediction is different from analysis. You’re trying to draw conclusions from the data, and we don’t have a complete set of data. If you wanted to predict which junior players will become world-class players, sure you can do that if we have genetics, biomechanics, all the physical characteristics measured as well as using analytics to measure the player’s overall performance on the court. We can see whether or not they have specific markers that indicate they will make that jump. But the bottom line is that there are so many factors involved. And a lot of it has to do with the physical side that you can’t necessarily determine from data.
What is bioanalytics? And why is measuring and analysing an elite athlete’s perspiration important?
We’re pioneering bioanalytics in football now. We’re taking biometric readings from players at the university level. The players are equipped with motion sensors and full biometric readers, which are reading things like heart rate, body temperature and respiration. And they’re combining that with the movement data from the tracking information. With that, we’re able to measure the physical output of the players. The sensors in the helmet measure impacts (from collisions).
We’ve been working on this project for a few years. It’s been used for the football programme at Duke University, and we’re in the process of adding a couple more universities to the project. At this stage, it’s being used for medical purposes. So when a player is on the practice field, if his heart rate starts racing or his body temperature goes up too high, they can immediately pull him out of practice and get him more electrolytes and hydration. They also weigh the players before and after every practice so they know how much fluid each player has lost during practice.
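A hypothetical sketch of the kind of real-time check described here, flagging a player when biometric readings cross safety thresholds; the thresholds, data format, and alert wording are illustrative assumptions rather than the actual Duke system.

```python
# Hypothetical sketch: flag a player when biometric readings cross
# illustrative safety thresholds. Not the actual Duke/SMT system.
HEART_RATE_MAX_BPM = 190        # illustrative threshold
CORE_TEMP_MAX_F = 103.0         # illustrative threshold


def check_player(reading):
    """reading: dict with 'player', 'heart_rate_bpm', 'core_temp_f'."""
    alerts = []
    if reading["heart_rate_bpm"] > HEART_RATE_MAX_BPM:
        alerts.append("heart rate high")
    if reading["core_temp_f"] > CORE_TEMP_MAX_F:
        alerts.append("core temperature high")
    if alerts:
        return f"PULL {reading['player']}: " + ", ".join(alerts)
    return None


sample = {"player": "WR 17", "heart_rate_bpm": 196, "core_temp_f": 103.4}
alert = check_player(sample)
if alert:
    print(alert)  # trainer pulls the player, pushes hydration and electrolytes
```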
How is bioanalytics used in tennis?
Unlike a team sport where a team can outfit all its players with this equipment, tennis players are all independent contractors. So it’s going to take more of a nationalistic approach — something like what the USTA is doing — to step in and say, “For our junior players, we’re going to outfit some courts and we’re going to provide this level of analysis on the physical side.”
Does analytics apply to tennis equipment and court surfaces? And if so, how?
Sure, it can. Analytics can identify how well players perform using different types of equipment and on different surfaces. For instance, if you’re using some tracking technology to determine what racquet and string combination allows a player to have the most amount of power, that’s a relatively simple exercise. You run a player through a set of drills, hitting particular shots, and measuring the speed of the ball coming off the racquet.
For surfaces, analytics can really help with identifying the type of shots that have an effect on particular surfaces or areas where players’ games break down. For example, you have players who have a long backswing, and that works really well on a slower surface where they have time to take a big backswing. But when you put them on a faster court, where the ball bounces lower and faster, it upsets their timing, and it makes it more difficult for them to adjust. Analytics measures the court’s bounce speed and bounce trajectory. So you can take a player and modify his game on a particular surface taking into account how the ball reacts to it.
You’ve analysed thousands of matches. Which factors influence the outcome of matches the most in men’s tennis and women’s tennis? And why?
The No. 1 factor typically is unforced errors. If you’re making mistakes, you’re basically giving the match to your opponent. Being able to measure and quantify that is a huge factor for player improvement. That entails understanding where you’re making your mistakes — which shots and what situations. The caveat to that is that there are certain players whose games are based on absolutely controlling the pace and tempo of the match. And they have the tools to do that. Two of the best players ever to do that are Steffi Graf and Serena Williams.
What are the disadvantages of and dangers involved with analytics? Will some number crunchers and coaches go overboard with analytics and be guilty of violating Occam’s razor?
The simple danger is to rely on data alone. The challenge is that you have to make the data relatable to what the player is doing physically and mentally on the court. Analytics doesn’t necessarily measure the mental side of the game, at least not yet. If you’re focusing so much on the analytics of certain shots and not looking at the big picture of their mental focus and how they’re preparing for matches, you can get into trouble.
Since tennis players vary greatly in temperament, talent, current form and other variables, do practitioners of analytics risk over-concluding from their numbers? And what mistakes have you and others made in this regard?
There is always a risk. Data can provide you with valuable information. Then you make that next leap that says, “This information says this, and therefore we have to do this, or therefore we have an issue.” I’ll give you a simple story from a few years ago. Jim Grabb, who was the No. 1 doubles player in the world then, came up to me at a tournament before the US Open and said, “I’m struggling with my first volley in singles. I can’t make a first volley.” And I told him, “You’re the No. 1 doubles player in the world. You have great volleys. And you’re saying you can’t make a first volley in singles.” He says, “Yeah.”
A lot of coaches would say, “How are you hitting it? Let’s analyse the stroke.” I asked, “When you step to the baseline to hit the serve, where is your first volley going?” Jim looked at me like I was speaking a foreign language. So I asked again, “Before you hit your first serve, where are you going to hit your first volley?” He said, “I just react to the ball. I don’t know what you’re talking about.”
So I suggested, “Do this. Every first volley goes to the open court. You serve wide in the deuce court and you volley wide into the ad court. You serve wide in the ad court and volley wide into the deuce court. Just for your first volleys.”
Jim goes out to play and comes back and says, “I didn’t miss a first volley.” The next week he got to the fourth round of the US Open, his best result at a Grand Slam (event) ever in singles. That had to do with the fact that all it really required was a little bit of focus by the player. It didn’t require a level of analysis and stroke production changes. It was simply eliminating decision-making.
What is the connection between analytics and the established field of biomechanics?
Analytics can tell you how a player is performing or how a stroke is performing in key situations. That can then identify that we need to examine the biomechanics of the stroke, particularly if it is breaking down under pressure. Or we can determine that the errors are occurring when the ball is bouncing four feet in the air versus three feet in the air, so their contact point is a foot higher. Now we can look at the biomechanics and see what the player is doing when the ball is a foot higher.
What are player rating systems? And what is the connection between analytics and player rating systems? How valid is the Universal Tennis Ratings system?
I don’t think there is any now. But that’s a direction we can take in the future.
Which match statistic or statistics do you foresee becoming increasingly important as a result of analytics?
I think you’ll see more focus on key-point performance as we do more and more analysis of players’ games in key pressure situations. Because you’re serving half of the time and receiving serve half of the time, analytics will look increasingly at each half of the game. We talk a lot about unforced errors, but are they occurring in your service games or your return games? We talk about aggressive play and taking control of points, but when is that happening? In service games or return games? On the first serve or the second serve?
Data analytics is undeniably changing tennis. Do you think it will revolutionise tennis?
Absolutely! Because the game is always changing. The technology around tennis and all sports keeps changing. Analytics is going to make the athletes better. It’s going to provide them with insights about how they can be at their peak for the key matches. It will help them train better, prepare better, execute shots better under pressure. All those pieces and parts will be available for athletes. And all of their nutritional, sleep, and training regimens will also help tennis players to perform better.
March 9, 2018
Sports Video Group
The 2018 NASCAR season is underway, and with it comes a new remote-production workflow whereby camera and audio signals are sent from racetracks to NASCAR’s production center in Charlotte, NC. The effort began with the Rolex 24 at Daytona and will continue with the WeatherTech SportsCar Championship series next week and the ARCA Racing Series as the season progresses.
“We have done a lot of testing at smaller events the past couple of years, but this year we wanted to push the limits and see what we can do,” says Steve Stum, VP of Operations and Technical Production, NASCAR Productions.
The Rolex 24 Hour race used NEP’s NCP IV production unit to put out 12 hard cameras, two RF cameras for the pit announcers, and 14 in-car cameras around the track. RF was handled by 3G, and a tech manager and engineering team ensured that 28 video and 75 audio signals were sent to Charlotte via a single antenna from PSSI Global Services. PSSI Global Services leveraged its C27 mobile teleport, equipped with cutting-edge Newtec modulators and GaN SSPB amplifiers from Advantech Wireless.
Rick Ball, Director of Broadcast Sports at PSSI Global Services, adds: “We’re not afraid to go where no one has gone before, and we’re proud that our efforts continue to create new possibilities in live television.”
Once the signals are back in Charlotte, the director, producer, TD, replay operators, SMT virtual-graphics operators, and announcers create the show.
“Round trip the latency is 1.09 seconds so we have camera returns and feeds for the screens for the fans in the stands,” adds Stum.
With upwards of a third of production costs sunk into travel, Stum says the goal is to put more money into the production itself, get more specialized equipment, and have a production-truck unit that is more aligned with the needs of a remote production.
The efforts are part of a season that Stum says has been going great so far, and all the testing prior to the Rolex race paid off: the nerves at the beginning subsided as the workflow was proven out.
March 2, 2018
Sports Video Group
As the NFL Scouting Combine becomes an increasingly fan-focused event onsite, NFL Media is expanding its already sizeable coverage of the annual event in Indianapolis. Last year, the NFL added Combine events, including the bench press and press conferences, at the Indianapolis Convention Center next door to Lucas Oil Stadium and allowed a limited number of fans into the stadium’s upper bowl in an effort to boost the NFL Combine Experience. With that in mind, NFL Network and NFL Digital outlets are rolling out their biggest productions to date to cover the growing parade of events taking place at both locations.
“We attack this show with everything we have in order to cover it from every aspect,” says Dave Shaw, VP, production, NFL Media. “The league has continued to expand the fan-focused aspect of the Combine at the convention center. They started that last year and are putting even more events over there this year. So we’ve expanded our show to cover some of the more fan-friendly stuff.”
For its 14th Combine, NFL Media is delivering a whopping 52 hours of live coverage during the event (Feb. 28 – March 5), including 34 hours from Indianapolis: 26 hours of Combine coverage Friday-Monday and eight hours of press conferences Wednesday and Thursday.
“This event really didn’t become ‘an event’ until it was covered by NFL Network,” says Christine Mills, director, remote operations, NFL Media. “It’s grown and evolved, and now fans are becoming more involved [onsite]. It’s interesting how it’s grown from a very small intimate event essentially just for scouts to an event covered by NFL Network and NFL Digital and on social. It’s grown into a fan-facing event, but it has kept that intimate feel at its core.”
Onsite in Indy: Encore and Pride, Four Sets Drive Multiplatform Production
Despite the expansion, NFL Media has maintained the same footprint in the truck compound at Lucas Oil Stadium. Game Creek Video’s Encore is serving the NFL Network show, and Pride is handling the streaming coverage.
The trucks onsite are fully connected to NFL Media’s broadcast center in Culver City, CA, via diverse fiber circuits (with 12 muxed feeds going each way) to allow extensive file-transfer and backhaul of camera feeds.
“For our coverage, we treat this like we’re covering a high-end game,” notes Shaw. “It’s a very slick production that moves quickly. It is a bit of a marathon, but our production teams do an outstanding job of rolling in features and keeping the action moving. It’s an important show for the NFL Network and NFL Media group because it’s the baseline for what we are about, which is giving viewers the inside look and showing fans what they should look for in the upcoming players.”
NFL Media has deployed a total of four sets — three at Lucas Oil (one on the field, two on the concourse level) and one at the convention center — to serve its 23-deep talent roster. Two of the three sets at the stadium are dedicated to the digital operation; NFL Network is manning the convention-center set, which is primarily for press-conference coverage.
“The setup we have at the convention center for NFL Network is very similar to [Super Bowl] Opening Night, where they have eight podium positions set up and we’re right in the middle of that room,” says Mills. “It ends up being a really fun and busy couple of days, especially with the fans more involved now [onsite].”
In addition to the four sets, NFL Network has a position in the traditional announce booth at Lucas Oil Stadium, as well as an interview location in a suite, where head coaches often stop by. For example, last year, NFL Media landed a rare interview with Patriots coach Bill Belichick in this location.
“Most of the head coaches are here in a casual atmosphere trying to pull something away from some of these players they’re evaluating,” says Shaw. “And the coaches have [free rein over] where they want to be in the building, so sometimes they will stop by the announce booth. Having Belichick stop by and do some time with our guys took us all off guard a little, but it was great and got a lot of attention. What’s exciting is, you don’t know what you’re going to pull off here since you have all the coaches and GMs. It’s a lot of fun trying to get in their minds and hearing what they have to say in this kind of atmosphere.”
The Camera Complement: SkyCam, Robos, and TeamCams
Between NFL Network and NFL Digital, the operation is deploying a combined 37 cameras at the two venues, including a SkyCam at the stadium and a large complement of robos (provided by Indy-based Robovision) at both locations. In addition, five ENG cameras are roving the grounds capturing content, which is being sprinkled into both the linear and the streaming coverage.
NFL Media will continue to spotlight the 40-yard–dash drill, with a high-speed camera capturing the smallest details. In addition, SMT is providing virtual graphics and graphics overlays for visual comparison of prospects with one another or with current NFL players’ Combine performances (for example, projected top pick QB Sam Darnold vs. Pro Bowl QB Carson Wentz’s sprint).
In addition, NFL Media is leveraging its Azzurro TeamCam system to provide live shots throughout its press-conference coverage. The TeamCam system, which NFL Network has used for a variety of needs for several years, features a single camera and transports bidirectional HD signals via a public-internet connection — along with IFB, comms, and tally — between Indianapolis and Culver City. In addition to a show produced onsite during the first two days, all press conferences are fed to Culver City via the TeamCam system.
“It’s interesting what we do for our live shots with the TeamCam system,” says Shaw. “We can just do one-off cameras, or we can bring it back; we can do two-ways just with a single camera. It’s a great [tool] for our Wednesday and Thursday coverage.”
NFL Digital Bigger Than Ever at Combine
NFL Digital’s presence continues to grow at the Combine. NFL Now Live is streaming on NFL.com, the NFL app, and Yahoo.com Friday-Monday beginning at 9 a.m. ET. In addition, NFL Media is providing extensive social-media coverage across Twitter, Facebook, Instagram, and Snapchat. Twitter Amplify is being used to produce highlights, distribute on-the-ground original content of top achievements across social networks, and deliver original social content to all 32 NFL clubs. On top of that, for the first time, the NFL is coordinating with some of the top college football programs to share, create, and amplify social-media content from Indianapolis.
In addition to live coverage, each prospect goes through the “Car Wash” following his press conference at the convention center. Each player progresses through interviews with NFL Media’s features team, digital team, and social-media team.
“These [Car Wash] interviews help us build features and get footage for the Draft,” says Shaw. “It also helps us down the road, and we’ll use footage all the way through the season. This is an NFL Media-exclusive event, so we go out of our way to give the avid NFL fan that inside position they don’t usually get to see.”
February 28, 2018
Sports Video Group
NFL Network will produce and broadcast 11 live American Flag Football League (AFFL) games during its debut season, as well as distribute highlights from the AFFL’s upcoming 2018 U.S. Open of Football (USOF) Tournament. The agreement is the first-ever broadcast deal for professional flag football and “provides a unique opportunity for the NFL to explore digital distribution of AFFL content,” according to the league’s announcement. The 11 game telecasts will be produced by NFL Network and feature NFL Network talent.
“Today marks great progress for football fans and players,” says AFFL CEO/founder Jeffrey Lewis. “As the first-ever broadcast and distribution deal focused on bringing the game of flag football to the broadest possible audience, we are thrilled to partner with NFL Network, the premier platform for football.”
The AFFL is set to launch this summer, and NFL Network is expected to build on the unique use of technology deployed for coverage of the AFFL’s first exhibition game on June 27, 2017, at Avaya Stadium in San Jose, CA. In an effort to create a wholly revamped football-viewing experience similar to the Madden NFL gaming look, the AFFL production team deployed SkyCam as the primary play-by-play angle (prior to NBC Sports’ decision to do so for several games during the 2017 NFL season), RF cameras inside the huddle, and SMT virtual graphics and augmented-reality elements all over the field.
The USOF is a 132-team, single-elimination tournament that will ultimately pit a team of elite former professionals against a team that has conquered a 128-team open national bracket. The tournament marks the AFFL’s first major competition, following an exhibition game in June 2017. NFL Network will televise 11 USOF games live June 29-July 19, concluding with the Ultimate Final, where America’s Champion and the Pros’ Champion will meet in a winner-take-all contest for $1 million.
The broadcasts are currently scheduled for the following dates:
The four Pro teams are expected to be led by Michael Vick, Chad “Ochocinco” Johnson, basketball duo Nate Robinson and Carlos Boozer, Justin Forsett, and Olympic champion Michael Johnson. Airtimes and broadcast talent for USOF games on NFL Network will be announced at a later date.
“Football fans are passionate about having continuous access to entertaining football content all year round,” said Mark Quenzel, SVP, programming and production, NFL. “AFFL games on NFL Network will give viewers a chance to experience a new kind of football competition in the summer months, and we’re excited for the opportunity to deliver more live programming that fans enjoy.”
The AFFL is extending the application deadline for the USOF from March 1 to March 8. Those selected will play in America’s Bracket, which comprises 128 teams.
February 19, 2018
Sports Video Group
One of the highlights of Turner’s NBA All-Star Saturday Night coverage was the debut of a shot-tracking technology developed by Israeli startup RSPCT. Deployed for the Three-point Contest, RSPCT’s system, which uses a sensor attached to the backboard to identify exactly where the ball hits the rim/basket, was integrated with SMT’s graphics system to offer fans a deeper look at each competitor’s shooting accuracy and patterns.
“There is a story behind shooting, and we believe it’s time to tell it. Shooting is more than just a make or a miss,” says RSPCT CEO Oren Moravtchik. “Turner and the NBA immediately understood that the first time they ever saw [our system] and said, Let’s do it.”
During Saturday night’s telecast, Turner featured an integrated scorebug-like graphic showing a circle representing the rim for each of the five racks of balls in the competition. As a player took a shot, markers indicating where the ball hit the rim or landed inside the basket were inserted in real time.
“It’s a bridge between the deep analytics that teams are using and the average fan,” says RSPCT COO Leo Moravtchik. “Viewers can understand shooting accuracy faster and better without having to dive into analytics; they clearly see groupings of shots and why a shot is made or missed. Last night, if a player missed all five shots of a rack, you could see why: if they are all going right or all going left.”
The system, which can be set up in just 30 minutes, consists of a small Intel RealSense Depth Camera mounted behind the top of the backboard and connected wirelessly to a small computing unit.
“We have some very sophisticated proprietary algorithms on the sensor,” says Oren Moravtchik. “The ball arrives at a high speed from the three-point line at various angles. We can [capture] the entire trajectory of the ball: where it came from, how it flew in the air, where it hit the basket — everything. We know the height of the player, the release point, and where it hit the basket, and then we can extrapolate back from there.”
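In simplified terms, the idea is to extrapolate the tracked trajectory to the plane of the rim and classify where the ball arrives relative to the rim center. The sketch below assumes linear interpolation between tracked samples and illustrative units; RSPCT's proprietary algorithms are certainly more sophisticated than this.

```python
# Simplified sketch: extrapolate a tracked ball trajectory to rim height
# and report where it arrives relative to the rim center. Illustrative only.
RIM_HEIGHT_FT = 10.0
RIM_RADIUS_FT = 0.75


def rim_arrival(samples):
    """samples: time-ordered (x, y, z) ball positions in feet,
    with the rim center at x = 0, y = 0."""
    # Find the segment where the descending ball crosses rim height.
    for (x0, y0, z0), (x1, y1, z1) in zip(samples, samples[1:]):
        if z0 >= RIM_HEIGHT_FT >= z1 and z0 != z1:
            t = (z0 - RIM_HEIGHT_FT) / (z0 - z1)   # linear interpolation
            x = x0 + t * (x1 - x0)
            y = y0 + t * (y1 - y0)
            inside = (x ** 2 + y ** 2) ** 0.5 < RIM_RADIUS_FT
            side = "left" if x < 0 else "right"
            depth = "short" if y < 0 else "long"
            return {"x": x, "y": y, "inside_rim": inside,
                    "tendency": f"{depth}/{side}"}
    return None


track = [(-0.4, -2.0, 11.2), (-0.3, -1.0, 10.6), (-0.2, 0.1, 9.9)]
print(rim_arrival(track))
```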
Although Saturday night marked the debut of the RSPCT system for the NBA, Leo Moravtchik sees far more potential once complete data sets on players can be captured — such as a full playoff series or even a full season.
“There may be an amazing player shooting 18 out of 20 from every [three-point] location, but there are differences between locations beyond just field-goal percentage,” he says. “Based on our data, we can not only show them [their] shooting [tendencies], we can actually project their field goals for the next 100 shots. We can tell them, If you are about to take the last shot to win the game, don’t take it from the top of the key, because your best location is actually the right corner.”
RSPCT is not only focusing on sports broadcast and media clients but marketing the system as a scouting and player-development tool.
“We’re [targeting] NBA teams, college teams, and even high school and amateur teams,” says Leo Moravtchik. “Wherever there is a basket — camps, gyms, schools — people want to see how they are shooting. We can bring it there because it’s a 30-minute installation and very cost-effective.”
February 16, 2018
Sports Video Group
The 60th running of the Daytona 500 takes place this Sunday, and Fox Sports, as it has done every year, again has found a way to push the technological envelope and expand on the resources dedicated to broadcasting the Great American Race. Coverage of this year’s race includes the introduction of Visor Cam, the return (and refinement) of the dedicated Car Channels on Fox Sports GO, and — in an industry first — a tethered drone that will provide live coverage from behind the backstretch at Daytona International Speedway.
“Every year, there’s something new,” says Mike Davies, SVP, field and technical operations, Fox Sports. “The Daytona 500 is always a great way to kick off the first part of the year in terms of technological testing: a lot of the things that we bring down to Daytona to look at, to test, and to try are things that manifest themselves later and in other sports. It’s a lot of fun to dream these things up.”
A Unique Point of View
This weekend’s race will feature all the camera angles that racing fans have come to expect, plus a few new views that promise to enhance the broadcast. Fans have grown accustomed to seeing their favorite drivers up close thanks to in-car cameras, but, on Sunday, they’ll be able to see what the driver sees.
Visor Cam, which first appeared at the Eldora NASCAR Camping World Truck Series race last year, makes its Daytona 500 debut this weekend. The small camera, developed by BSI, will be clipped to the helmets of Kurt Busch (last year’s Daytona 500 champion) and Daniel Suarez.
“You can try to put cameras everywhere you can, but seeing what the driver is seeing through a camera placed just above his eye line on his visor is pretty cool,” says Davies. “We’re looking forward to having that at our disposal.”
Fox Sports worked closely with NASCAR and ISC to provide aerial drone coverage of the Daytona 500. The drone, which will be tethered to allow longer periods of flight time, will move around behind the backstretch — outside of the racing area — to cover the race from a new angle.
Gopher Cam, provided by Inertia Unlimited, returns for its 10th year with enhanced lens quality for a wider, clearer field of view. Three cameras will be placed in the track, including one in Turn 4 and another on the backstretch.
Cameras, Cameras Everywhere
Fox Sports will deploy a record number of in-car cameras during the Daytona 500. In total, Sunday’s broadcast will feature 14 in-car cameras, including one in the pace car — more than in any NASCAR race in the past 15 years. Each car will be outfitted with three cameras for three viewing angles.
Last year, Fox Sports launched two dedicated Car Channels on the Fox Sports GO app, each focusing on a single driver. For this year’s race, Fox Sports has opted for a team approach, showing multiple drivers, cars, and telemetry data on the channel.
In total, Fox Sports will deploy 20 manned cameras, including three Sony HDC-4300s operating in 6X super-slo-mo, one Sony HDC-4800 operating in 16X HD slo-mo, and an Inertia Unlimited X-Mo capturing 1,000 frames per second. Fox Sports will outfit its Sony cameras with a variety of Canon lenses, ranging from handheld ENG to the DIGISUPER 100. The network will also have four wireless roving pit/garage camera crews, 10 robotic cameras around the track (plus three robotic Hollywood Hotel cameras), and a jib camera with Stype augmented-reality enhancement. The Goodyear Blimp will provide aerial coverage.
Not to be forgotten, viewers will be treated to all the sounds of the race as well, thanks to more than 100 microphones surrounding the track. Fox Sports plans to make use of in-car radios throughout the broadcast, both in real time (having the drivers and crew chiefs narrate the race) and after the fact (using the audio to tell a story).
A Compound Fit for the Super Bowl of Racing
For the first time in 12 years, Game Creek Video’s FX mobile unit will not handle Fox Sports’ Daytona 500 production. Instead, Game Creek’s Cleatus (known by another network as PeacockOne) will be responsible for the main race broadcast and will be joined in the compound by 11 additional units for digital production, editing, RF cameras and audio (BSI), telemetry and graphics (SMT), and studio production. Two satellite uplink trucks will be onsite, as well as a set of mobile generators that will provide nearly 2 MW of power independent of the local power source.
Fox Sports is shaking up its transmission as well, relying on an AT&T gigabit circuit capable of transmitting eight video signals (and receiving four) via fiber by way of its Charlotte, NC, facility to Fox Sports’ Pico Blvd. Broadcast Center in Los Angeles.
“Based on some of the things that we’re doing for the World Cup in Moscow as well as home-run productions for MLS and college basketball, we’ve taken some of that knowledge and leveraged it to do full-on contribution for NASCAR,” Davies explains. “It’s exciting, it’s scalable, and we’re looking forward to doing it. AT&T has put in a circuit at every track or is in the process of doing so, so this is a first foray into IP transmission as it relates to NASCAR.”
The benefit of transitioning to IP transmission, according to Davies, is the volume of content that Fox Sports will be able to send from tracks that notoriously lack connectivity. “At the end of the day,” he says, “we’ll be able to leverage resources from Charlotte and Pico to do more things. Right now, we’re able to contribute more to our Charlotte shows via fiber, but, like everything in technology, the more we get used to it and the more we know how to use it, the more useful it’s going to be.”
Daytona 500 Gets a Graphics Makeover
The on-air graphics package for the Daytona 500 will be new, featuring much of the look and feel of Fox Sports’ football, basketball, and baseball graphics with all the data that NASCAR fans expect.
Fox Sports will up the ante on virtual graphics and augmented reality, deploying Stype camera-tracking technology (with a Vizrt backend) on a jib between Turns 3 and 4 in order to place 3D graphics within the broadcast. For example, the system can be used to create virtual leaderboards, sponsor enhancements, and race summaries that are placed on Turn 3 as virtual billboards.
“Where that jib is between Turns 3 and 4, you can place graphics [on screen in] such a way that you don’t necessarily have to leave the track in order to get information across,” Davies explains. “In the past, we might have used full-screen graphics, but now, we can put the graphics in space, and it looks pretty cool. It’s the third year that we’ve been doing that, and we seem to get better at it each year.”
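The underlying idea of camera-tracked AR is that, given the camera's pose and lens parameters each frame, a fixed world position (say, a virtual leaderboard anchored near Turn 3) can be projected to a screen position and rendered there. The pinhole-camera sketch below is a heavily simplified illustration with made-up numbers, not the Stype or Vizrt calibration and rendering pipeline.

```python
# Much-simplified sketch: project a fixed world point into the image of a
# tracked camera using a pinhole model. All numbers are illustrative.
import numpy as np


def project(world_point, cam_pos, R, focal_px, cx, cy):
    """Project a 3D world point (meters) to pixel coordinates.
    R is the 3x3 world-to-camera rotation; cam_pos is the camera position."""
    p_cam = R @ (np.asarray(world_point, dtype=float) - np.asarray(cam_pos, dtype=float))
    if p_cam[2] <= 0:
        return None  # behind the camera
    u = cx + focal_px * p_cam[0] / p_cam[2]
    v = cy + focal_px * p_cam[1] / p_cam[2]
    return u, v


# Camera on the jib looking straight down the +Z axis (identity rotation).
R = np.eye(3)
billboard_anchor = (4.0, -1.5, 60.0)          # meters, in world space
print(project(billboard_anchor, cam_pos=(0, 0, 0), R=R,
              focal_px=1800.0, cx=960.0, cy=540.0))
```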
The network has also enhanced its 3D-cutaway car, putting these graphics in the hands of the broadcast team. And, in the booth, Fox Sports NASCAR analyst Larry McReynolds will have his own dedicated touchscreen, allowing him to enhance any technical story and give the viewer clear illustrative explanations during the race.
A Company-Wide Effort
Between the production personnel, camera operators, engineers, on-air talent, and many more, Fox Sports currently has 300 people onsite at the Daytona International Speedway. In addition, Fox Sports’ Pico and Charlotte facilities, as well as its network-operations center in The Woodlands, TX, are very much a part of the action. And, when the Daytona 500 starts on Sunday, all will be ready to deliver this year’s race to NASCAR fans everywhere.
“Between everything that you’re going to see on-screen and everything under the hood, these are all things that are going to help the company as a whole,” says Davies. “We’ve been able to bring together all of the resources across the company, and it’s particularly exciting to get everybody working as one on this event.”
February 8, 2018
Digital Journal
DURHAM, N.C.--(Business Wire)--NBC Olympics, a division of the NBC Sports Group, has selected SMT to provide real-time, final results and timing interfaces for its production of the XXIII Olympic Winter Games, which take place in PyeongChang, South Korea, from February 8 - February 25. The announcement was made today by Dan Robertson, Vice President, Information Technology, NBC Olympics, and Gerard J. Hall, Founder and CEO, SMT.
Since 2000, SMT has been a key contributor to NBC Olympics’ productions by providing results integration solutions that have enhanced NBC’s presentations of the Games via on-air graphics, scheduling, and searches for content in the media-asset–management (MAM) system.
For the 2018 Olympic Winter Games, SMT will deliver TV graphics interfaces for NBC Olympics’ Chyron Mosaic systems in its coverage of alpine skiing, freestyle skiing, snowboarding, figure skating, short track speed skating, speed skating, bobsled, luge, skeleton, ski jumping and the ski jumping portion of Nordic combined.
SMT’s Point-in-Time software system integrates live results to allow commentators to locate a specific time during a competition in both live and recorded coverage. The software graphically shows key events on a unified timeline so that NBC Olympics commentators can quickly see how a race began, when a lead changed, and where an athlete’s performance improved: the kinds of details that dramatically enhance the incredible stories of triumphs and defeats intrinsic to the 2018 Winter Games.
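Conceptually, this is a time-indexed event log that a commentator tool can query. The sketch below shows one way such a unified timeline could be structured; the event names and interface are illustrative assumptions, not SMT's Point-in-Time software.

```python
# Illustrative sketch of a unified timeline of key events, indexed by
# elapsed time so a commentator tool can jump to them.
import bisect


class Timeline:
    def __init__(self):
        self._times = []     # elapsed seconds, kept sorted
        self._events = []

    def add(self, elapsed_s, description):
        i = bisect.bisect(self._times, elapsed_s)
        self._times.insert(i, elapsed_s)
        self._events.insert(i, description)

    def events_up_to(self, elapsed_s):
        i = bisect.bisect_right(self._times, elapsed_s)
        return list(zip(self._times[:i], self._events[:i]))


race = Timeline()
race.add(0.0, "start")
race.add(41.3, "lead change: bib 7 overtakes bib 12")
race.add(78.9, "fastest split of the day, sector 3")
for t, event in race.events_up_to(60.0):
    print(f"{t:6.1f}s  {event}")
```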
“The complexity and sheer amount of scoring, tracking, and judging data that comes with an event of this size, both real-time and post production, is beyond compare,” said Robertson. “The ability to organize and deliver it aids NBC’s production in presenting the stories of these amazing athletes, and requires nothing short of the capabilities, innovation and track record of SMT.”
“It is our privilege to provide our expertise, experience, and results-reporting technology for NBC Olympics’ production of the 2018 Olympic Winter Games, SMT’s 10th straight Olympics,” said Hall. “Our team of 10 on-site engineers has rigorously prepared for PyeongChang with a tremendous amount of testing and behind-the-scenes work, ensuring SMT delivers seamless services of a scope and scale unprecedented in a sports production.”
SMT’s partnership with NBC Olympics began with the 2000 Sydney Games and has included providing graphics interfaces as well as NBC’s digital asset management interface that helped the network receive Emmy Awards for “Outstanding Team Technical Remote,” following the 2008 and 2016 Games.
About NBC Olympics
A division of the NBC Sports Group, NBC Olympics is responsible for producing, programming and promoting NBCUniversal's Olympic coverage. It is renowned for its unsurpassed Olympic heritage, award-winning production, and ability to aggregate the largest audiences in U.S. television history.
For more information on NBC Olympics’ coverage of the PyeongChang Olympics, please visit: http://nbcsportsgrouppressbox.com/.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events, including the Super Bowl, major golf and tennis events, the Indianapolis 500 and the World Series. The 31-time Emmy Award-winning company is headquartered in Durham, N.C. For more information, visit smt.com.
February 5, 2018
Sports Video Group
To put it mildly, the 2017-18 NFL campaign has been a memorable one for SkyCam. In a matter of months, the dual-SkyCam model — an unheard-of proposition just a season ago — has become the norm on high-profile A-game productions. The company also unveiled its SkyCommand at-home–production offering in conjunction with The Switch, with plans to continue to grow this central-control model. And last year, SkyCam worked with SMT to debut the 1st & Ten line and other virtual graphics on the SkyCam system; today, that is standard practice on almost any show using a SkyCam.
At Super Bowl LII, SkyCam once again deployed dual SkyCams, with the high-angle unit focusing on an all-22 look and the lower SkyCam focusing on play-by-play. SVG sat down with Chief Technology Officer Stephen Wharton at U.S. Bank Stadium during Super Bowl Week to discuss SkyCam’s role in NBC’s game production, the rapidly growing use of dual SkyCams by broadcasters, NBC’s use of the system as the primary play-by-play game camera on a handful of Thursday Night Football games this season, and an update on the company’s SkyCommand at-home–production control system, which was announced last year.
Tell us a bit about your presence at U.S. Bank Stadium and the role SkyCam will play in NBC’s Super Bowl LII production?
We were fortunate enough to be here with Fox for the Wild Card Game, and that allowed us to keep a majority of our infrastructure in place. Also, when the stadium was built, they built in a booth for SkyCam and cabled the building, so that obviously helped us quite a bit. But we’ve been here since Sunday working with the halftime show to make sure that our rigging isn’t in the way of them and they’re not in the way of us. And then, Monday, full crew in for Tuesday first-day rehearsal, and then all the way through the week.
In a matter of months, several major NFL broadcasters have adopted the dual-SkyCam model. What are the benefits of two SkyCams?
We used to say you knew you had a big show when you had SkyCam on it. Now you have a big show when you have two SkyCams on it. I think one of the key driving factors for [the increased use of] dual SkyCam was working with the NFL and the broadcasters to better highlight Next Gen Stats. And, working with SMT on their auto render system, one of the big values that we now bring is this ability to show you the routes and what’s going on with each player as the play develops from the overhead all-22 position.
It just so happened that, as the dual systems started to evolve, we got this amazing opportunity in Gillette Stadium when the fog came in and no other cameras could be used. Typically, you think of SkyCam as being used for the first replay camera; we’re not necessarily live. But, in that instance, we had to go live with SkyCam, and the first replay became the high SkyCam. That opportunity changed how we are seen and used. It demonstrated what you could do with SkyCam, and that obviously penetrated all the other networks. You get two totally different angles, one more tactical and one play-by-play, and there’s really no sacrifice. You’re not giving anything up on the lower system; you’re actually helping because you don’t have to chase down beauty shots and comebacks since the upper system can do that. The lower system can just focus on play-by-play.
Do you expect the use of dual SkyCams for NFL coverage to continue to grow next season?
I think that you’ll continue to see the dual SkyCams become more of the norm, not just for the playoff games but for most A-level shows, because it brings such a value for both Next Gen Stats and the broadcasters. We’re obviously super excited about that.
I think there’s a bifurcation between audiences in terms of [SkyCam] as a primary angle: some really love it, and some don’t like it. But what you’re seeing in broadcast today with the growth of technology and evolving media is that people end up with a buffet of options to choose from: OTT, streaming, mobile, television, or something else. And there is a market for all of it. I think, at the national level, you’ll see more play-by-play action live from SkyCam because broadcasters will be able to use it and distribute it however they like.
At NAB 2017, you introduced SkyCommand, an at-home–production tool that allows SkyCam operators to be located remotely. Do you have any update on this platform, and are broadcasters using it already?
We have seen tremendous interest. People are asking where and when we can do this, but there are obviously a couple different challenges we have to address: one, since it’s a cost-saving model, you’re looking at lower-tier shows in venues that don’t have much infrastructure in most cases. That said, when you take lower-tier games that happen to take place in venues that [have the necessary infrastructure], it becomes very appealing. Most of our network partners have been very interested in finding ways of utilizing SkyCommand for [at-home] production. [Our partners] Sneaky Big Studios and SMT are on board, and we’re looking at doing a lot more of it in 2018. We’ve actually got some pilot programs already.
Just a couple weeks ago, we relocated SkyCam into an 80,000-sq.-ft. facility a few miles down the road from our old facility. It’s a brand-new facility, built from the ground up, that’s tailored to our needs. We’ve got two entire broadcast booths built with SkyCommand in mind. One is a network-operations center with full streaming capabilities and data connectivity to the games that we’re doing. Beyond SkyCommand, when our operators are onsite, we will have a guy in Fort Worth who is basically at the NOC watching the game. This person will be looking at the responses coming out of the computer systems and will be on PLs with the [on-site operators]. And then we can send that video back to the NOC and address any type of issues that we have; it gives us a great ability to manage that. The second booth is where we can actually put an operator and a pilot.
We’re continuing to work with the network vendors — The Switch, CenturyLink, and others — but we’ve already got full 10-gig fiber to the facility. So we’re working now to put all that in place for SkyCommand. I think you’ll see that more in 2018.
In what other sectors is SkyCam looking to grow in the near future?
We’re also trying to expand [permanent SkyCam installations] throughout the NFL. I expect that we will have some other announcements coming out shortly about additional teams building on what we did with the Baltimore Ravens last year. Those team SkyCams will continue to grow in 2018, and we’re looking at leveraging SkyCommand specifically for those cases.
February 5, 2018
Sports Video Group
SMT (SportsMEDIA Technology) is bringing a number of Super Bowl firsts to Minneapolis on both the broadcast and the in-venue production side. On NBC’s Super Bowl LII broadcast, SMT will deploy a telestrator on the high SkyCam for the first time and also will have the 1st & Ten line available on additional cameras. The in-venue production will offer the 1st & Ten line on the videoboards for the first time in a Super Bowl and will also feature enhanced NFL Next Gen Stats integration.
“It’s always exciting to do something brand new for the first time,” says SMT Coordinating Producer Tommy Gianakos, who leads the NBC SNF/TNF team. “And it’s even better when you’re doing it on the biggest show of the year with a lot of extra pieces added on top.”
In addition, during the Super Bowl LII telecast, NBC Sports’ production team will have access to a new telestration system on the high SkyCam for first replays.
“We’re now adding some telestration elements on SkyCam,” Gianakos explains. “In the past, we’ve been able to have a tackle-box [graphic] on one of the hard cameras if there’s an intentional-grounding play, but we haven’t been able to do it from high and low SkyCam on first or second replay. That intentional-grounding [virtual graphic] right above the tackles on SkyCam is something we haven’t been able to do before, but now we are able to do pretty instantaneously.”
SMT demonstrated it for NBC Sports producer Fred Gaudelli on Friday when a high school football team was on the field, and NBC opted to move forward with the system for the game.
“We’re able to do backwards-pass line virtually in real space; we’re able to measure cushions, able to paint routes on the field, all very rapidly,” says Ben Hayes, senior account manager, SMT. “It’s pretty unique to this show and the first time we’re going to be doing it on-air.”
In addition to having the live 1st & Ten line on both SkyCams and the same six hard cameras available for NBC’s Thursday Night Football and Sunday Night Football telecasts, SMT has added it to the two goal-line cameras, the all-22 camera, and two more iso cameras.
SMT also added next-gen DMX switchboard connectivity to NBC’s scorebug, so on-field graphics will update in real time and list personnel and formations of both teams.
“From a crew standpoint, it was really nice for us to have both Thursday Night Football and Sunday Night Football this season because it gave us a second group of people that understood the expectations of this show and what Fred and [director] Drew [Esocoff] really want from the show,” says Hayes. “We were basically able to merge those two crews for this game and not miss a beat.”
On the Videoboards: 1st & Ten Line, Enhanced Next Gen Stats
Fans at the stadium will be able to see the 1st & Ten line system on the videoboards. For the first time at a Super Bowl, the yellow virtual line will be deployed on three cameras — on the 50- and both 25-yard lines — for the in-venue videoboard production.
Also, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering real-time content not available on broadcasts.
“It’s amazing to be doing this here at Super Bowl,” says Ben Grafchik, business development manager, SMT. “Obviously, we can build upon the technology in the future, but this is our first step into it. And then I’m looking to try to continue that going forward.”
Fans inside U.S. Bank Stadium will have access to real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score–graphics publishing system that requires just a single operator.
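To make the quarterback passing grid concrete, the sketch below shows one way such a graphic could be bucketed from play data; the zone layout, field names, and sample numbers are illustrative assumptions, not SMT's actual SportsCG feed.

```python
# A small sketch of how a quarterback passing grid ("How has this QB fared in
# these zones today?") might be bucketed from play data: each attempt is dropped
# into a 3x3 grid of target zones and completions/attempts are tallied. The zone
# layout and input fields are assumptions for illustration.
from collections import defaultdict
from typing import List, Dict, Tuple

def passing_grid(attempts: List[dict]) -> Dict[Tuple[str, str], str]:
    """attempts: dicts with 'air_yards', 'side' ('left'/'middle'/'right'), 'complete'."""
    tally = defaultdict(lambda: [0, 0])  # zone -> [completions, attempts]
    for a in attempts:
        depth = "short" if a["air_yards"] < 10 else "medium" if a["air_yards"] < 20 else "deep"
        comp, att = tally[(depth, a["side"])]
        tally[(depth, a["side"])] = [comp + a["complete"], att + 1]
    return {zone: f"{c}/{n}" for zone, (c, n) in tally.items()}

if __name__ == "__main__":
    plays = [
        {"air_yards": 6, "side": "left", "complete": 1},
        {"air_yards": 14, "side": "middle", "complete": 0},
        {"air_yards": 25, "side": "right", "complete": 1},
        {"air_yards": 7, "side": "left", "complete": 1},
    ]
    print(passing_grid(plays))
```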
“We knew the Minnesota Vikings were already doing virtual and NFL Next Gen Stats, so we started thinking about what we could do to spice it up for the Super Bowl,” says Grafchik. “We’re throwing a lot of things at this production in hopes of seeing what sticks and what makes sense going forward for other venues.”
In the lead-up to the game, SMT worked with the league to merge the NFL Game Statistics & Information System (GSIS) feed with NFL Next Gen Stats API to come up with a simple lower-thirds graphics interface. This will allow the graphics operator to easily create and deploy a host of new deep analytics graphics on the videoboard during the game.
“These additional NGS elements get viewers used to seeing traditional stats along with nontraditional stats when they are following the story of the game,” says Grafchik. “If Alshon Jeffery has a massive play, the operator can instantly go with the lower third for his average receptions per target. The whole plan was to speed up this process so that this individual isn’t [creating] true specialty graphics; they’re just creating traditional graphics with extra spice on top of it. By getting quick graphics in like that, it helps to tell a story to the viewer in-venue without much narration on top of it.”
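As a rough illustration of that workflow, the sketch below shows how a merged record of traditional and tracking-derived stats might be turned into a one-click lower-third payload. The field names, sample values, and output format are assumptions for illustration; they are not the actual GSIS or Next Gen Stats schema or SMT's interface.

```python
# Hypothetical sketch of how a merged GSIS + Next Gen Stats record might feed a
# one-click lower-third. Feed names, fields, and the payload format are
# illustrative only; they are not SMT's actual API.
from dataclasses import dataclass

@dataclass
class PlayerGameLine:
    name: str
    team: str
    receptions: int            # traditional stat (GSIS-style box score)
    targets: int               # traditional stat
    avg_separation_yds: float  # Next Gen Stats-style tracking metric

def lower_third(line: PlayerGameLine) -> dict:
    """Build a simple lower-third payload: traditional stat plus one NGS 'extra'."""
    catch_rate = line.receptions / line.targets if line.targets else 0.0
    return {
        "headline": f"{line.name} ({line.team})",
        "stat_line": f"{line.receptions} REC on {line.targets} TGT ({catch_rate:.0%})",
        "ngs_extra": f"Avg separation: {line.avg_separation_yds:.1f} yds",
    }

if __name__ == "__main__":
    jeffery = PlayerGameLine("A. Jeffery", "PHI", receptions=3, targets=5,
                             avg_separation_yds=2.8)
    print(lower_third(jeffery))
```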
February 4, 2018
Sports Video Group
Since the first beam went up on this massive structure in Downtown Minneapolis, U.S. Bank Stadium has been building to this moment. Super Bowl LII is here, and an all-star team from Van Wagner Sports & Entertainment Productions, stadium manager SMG, and the Minnesota Vikings is ready to put on a Super Bowl videoboard production for the ages.
When 66,000-plus pack into the sparkling bowl, they’ll be treated to quite a few in-venue firsts on those boards, including the Super Bowl debut of SMT’s Yellow 1st & 10 line, a completely new Super Bowl LII graphics package, and an expanded arsenal of camera angles.
“Every Super Bowl, we’re tasked with moving the needle,” says Bob Becker, EVP, Van Wagner Sports & Entertainment (VWSE) Productions, which has designed the videoboard show. “What can we do differently this Super Bowl that we haven’t done in the past? That’s our constant challenge. This is my 23rd [Super Bowl], and, every year, it gets bigger and bigger and bigger. When it’s over, you say, ‘Wow, what a great job,’ and then you start stressing about next year and wonder, ‘Well, how do we top that?’ That’s how I feel about that: you’ve got to always up your game.”
The stadium’s crown jewels are a pair of Daktronics video displays behind the end zones that measure 68 x 120 ft. and 50 x 88 ft., respectively. This year, for the first time at a Super Bowl, those boards will feature a full complement of the Yellow 1st & 10 line. SMG and the Vikings had a standing relationship with North Carolina-based SMT throughout the season, offering the yellow line encoded on their 50-yard-line camera. For the Super Bowl, they chose to expand it to include the other main cameras at each of the 20-yard lines. SMT’s Ben Grafchik will sit at the front of the control room, serving up specialty data-driven graphics, tickers, and data feeds for the control-room crew to call up as they desire.
Those advanced graphics are part of a completely fresh graphics package that Van Wagner has developed for this game. It’s the classic hard work done by the company: build a season’s worth of graphics to be used on a single night. Van Wagner has not only come in and taken over the U.S. Bank Stadium control room; its team has basically torn it apart, pulling out gear and replacing it with specialty systems in order to take the videoboard show to the next level.
“It’s not because it’s not good,” says Becker, “but that’s how we make it bigger and better. Sometimes, you’ve got to bring technology in to make it bigger and better. And, to these guys’ credit, they have not only been there from Day One for us but have been open to allowing us to tear apart their room and integrate these new things. And it happens a lot that they go, Hey, you know something, I’d love to use that for a Vikings season next year. So there’s benefit on both sides.”
One of the vendors that has gone above and beyond for the control room is Evertz. The company has provided a crosspoint card for redundancy in the EQX router while also supplementing with spare input cards, output cards, and frame syncs.
It’s a challenging effort to make temporary alterations to the control room, but SMG and the Vikings have welcomed the opportunity to expand with open arms.
“There’s a reason I took this job,” says Justin Lange, broadcast operations coordinator for U.S. Bank Stadium, SMG. “This is a prestigious event, and this is big for this city, the Vikings, and for us as a company. It’s been a great experience. It’s a great opportunity for us to showcase what we can do with this room, what we can do with these boards. The sightlines are great in this facility. The boards are great, the IPTV system is expansive, and we’re just excited to showcase what we have to offer as a facility.”
Normally, the control room features both Evertz IPX and baseband routing, an 8M/E Ross Acuity switcher with 4M/E and 2M/E control panels to cut secondary shows, and Ross XPression graphics systems. The all-EVS room houses a wide range of EVS products, including three 12-channel 1080p replay servers, one 4K replay server, IPDirector, Epsio Zoom, and MultiReview.
For the Super Bowl, the control room will have more cameras to choose from than it has ever had before. A total of 18 in-house cameras are deployed throughout the bowl (more than the normal eight for a Vikings game), including four RF handhelds, an RF Steadicam, and two robotic cameras.
The crew is also an impressive sight to behold. Nearly 100 people are working on the videoboard show in the combined efforts between Van Wagner, SMG, and the Vikings. There’s also a handful of editors across the street in the 1010 Building (where many broadcasters have set up auxiliary offices) cutting highlight packages and team-specific content.
“This is the biggest event in the world,” says Becker, “and we and the NFL mean to acknowledge that. We’re willing to do what needs to be done to put on the biggest event in the world.”
February 2, 2018
NBC Sports
NASCAR will provide its teams with more data in real time this season, giving them access to publicly available steering, brake, throttle and RPM information as well as live Loop Data for the first time.
The information will be provided for every driver on every lap of every session on track.
The steering, brake, throttle and RPM information has been available through NASCAR.com’s RaceView application, which uses the information provided by the electronic control units used in the electronic fuel injection systems. Some teams have created labor-intensive programs that scraped the data from RaceView, so NASCAR decided to save time and effort for teams by directly providing the information.
No other engine data will be released. The ECU can record 200 channels of information (of a possible 1,000 parameters). NASCAR assigns about 60 channels (including the steering, brake, throttle, and RPM), and teams can select another 140 channels to log through practices and races. Those channels will remain at the teams’ discretion and won’t be distributed by NASCAR.
NASCAR’s real-time data pipeline to teams this season also will include Loop Data, which was created in 2005 and has spawned numerous advanced statistical categories that have been available to the news media. The information was born out of a safety initiative that installed scoring loops around tracks after NASCAR ended the practice of racing to the caution flag in ‘03.
Previously, teams had been provided only lap speeds/times; now they will have speeds in sectors around the track marked by the scoring loops.
Teams still won’t be given Loop Data for the pits, where the scoring loops are installed to maintain a speed limit for safety. If a scoring loop in the pits were to fail during a race, teams with access to that data theoretically could notice the outage and take advantage of it by speeding through that loop (particularly those whose pit stall is in that sector). NASCAR does provide teams with pit speeds after races.
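For a sense of what sector data adds over plain lap times, here is a minimal sketch of deriving sector speeds from loop crossings, assuming each loop reports an ID and a timestamp and the distance between consecutive loops is known. The format and numbers are illustrative, not NASCAR's actual Loop Data feed.

```python
# A minimal sketch of how sector speeds could be derived from scoring-loop
# crossings, assuming each loop reports (loop_id, timestamp) and the distance
# between consecutive loops is known. Illustrative only, not NASCAR's format.
from typing import List, Tuple

def sector_speeds(crossings: List[Tuple[str, float]],
                  sector_lengths_ft: dict) -> List[Tuple[str, float]]:
    """Return (sector, mph) for each pair of consecutive loop crossings."""
    speeds = []
    for (loop_a, t_a), (loop_b, t_b) in zip(crossings, crossings[1:]):
        sector = f"{loop_a}->{loop_b}"
        feet = sector_lengths_ft[sector]
        mph = (feet / (t_b - t_a)) * 3600 / 5280  # ft/s -> mph
        speeds.append((sector, round(mph, 1)))
    return speeds

if __name__ == "__main__":
    # One car crossing three loops; distances are made up for illustration.
    crossings = [("L1", 100.0), ("L2", 112.4), ("L3", 125.1)]
    lengths = {"L1->L2": 3200.0, "L2->L3": 3100.0}
    print(sector_speeds(crossings, lengths))
```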
February 2, 2018
Stadium Business
The NFL’s popular Next Gen Stats data feed is getting a boost from real-time data delivery and graphics solutions firm SportsMEDIA Technology (SMT) for Super Bowl LII at U.S. Bank Stadium.
For the championship game this Sunday in Minneapolis, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts.
SMT’s in-stadium production combines in-game stats that are displayed on the stadium’s two massive video boards, as well as 2,000 in-concourse HD video displays.
U.S. Bank Stadium, home of the Minnesota Vikings, boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 ft. and the east end zone display at 88 by 51 ft.
The 65,000 fans at Super Bowl LII will be presented with real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?).
“As an organisation, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, senior manager of production for the Vikings.
“For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
Ben Grafchik, SMT’s business development manager, said: “In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing.
“We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
This year’s Super Bowl pits the New England Patriots against the Philadelphia Eagles.
January 31, 2018
Business Wire
DURHAM, N.C.--(BUSINESS WIRE)--SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, today announced it is providing in-stadium solutions, including its Emmy-winning virtual 1st & Ten line system and the NFL’s new Next Gen Stats, for Super Bowl LII, to be held Feb. 4 at U.S. Bank Stadium.
For Super Bowl LII, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts. SMT’s in-stadium production combines in-game stats integrated into SMT-designed graphics packages that are displayed on the stadium’s two massive video boards, as well as 2,000 in-concourse HD video displays, offering fans a chance to watch highlights and stay informed no matter where they are in the stadium. U.S. Bank Stadium boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 ft. and the east end zone display at 88 by 51 ft.
The more than 65,000 football fans attending the Super Bowl will be treated to a variety of valuable real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score graphics publishing system that requires just a single operator.
“As an organization, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, Senior Manager of Production for the Minnesota Vikings. “For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
“In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing,” said Ben Grafchik, SMT Business Development Manager. “We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
In addition to in-stadium solutions, SMT will provide broadcast solutions for Super Bowl LII, including the virtual 1st and Ten system, data-driven graphics and tickers, and in-game data feeds to commentator touchscreens, among other services. SMT has supported Sunday Night Football on NBC since 2006.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events. The 31-time Emmy Award-winning company is headquartered in Durham, N.C.
January 30, 2018
Sports Video Group
With the Madden NFL 18 Club Championship Finals in full swing this week and the recent announcement of a new TV and streaming deal with Disney/ESPN, EA’s Madden NFL Championship Series is squarely in the esports spotlight. The series has been moving toward this moment for months, with 11 NFL teams hosting events in which fans competed to advance to the Finals in Minneapolis this week. In its first foray into competitive gaming, SMT’s Video Production Services (VPS) group produced events for the Arizona Cardinals, Buffalo Bills, and Jacksonville Jaguars throughout the end of 2017.
“SMT’s experience with supporting top football shows like the Super Bowl and Sunday Night Football makes us uniquely positioned to attract Madden gamers to the NFL through the medium they are most attracted to: esports,” says C.J. Bottitta, executive director, VPS, SMT. “With a worldwide fan audience now estimated at 280 million, approaching that of the NFL, SMT is excited to enter the growing market of competitive gaming.”
Although the level of services SMT provided varied from show to show, the base complement for all three productions comprised a full technical team of broadcast specialists operating six cameras, multiple replay machines, and a telestration system. SMT kept pace with Madden’s lightning-quick style of play for the three-hour shows streamed on the EA Sports YouTube channel, Twitch.TV/Madden, and the EA Sports Facebook page. In addition, SMT’s Creative Studio customized EA’s promotional trailer with team-specific elements for each of the three events.
“We started doing [Madden events] with teams last year, and there has been an evolution from wanting a [small-scale] podcast-level environment to almost a broadcast-level show,” says Bottitta. “What I loved about the three teams this year was how passionate and excited they were to be doing this. Teams were handling events very differently, but all of them had great people to work with and did a wonderful job.”
Inside the Production: University of Phoenix Stadium, Glendale, AZ
The Cardinals’ Madden NFL 18 Club Championship took place on Saturday, Nov. 11, soon after the team’s Thursday Night Football home game against the Seahawks, creating a quick turnaround for SMT and the team’s production staff. SMT provided the producer (Bottitta), director, tech manager, and lead camera operator and advised on what should be added for the production.
“We primarily provided leadership for the Cardinals,” says Bottitta. “They have a fantastic facility, so we reviewed with their tech group what they had and what they needed to add for [a competitive-gaming production] like this. They have a fantastic control room, and they used the crew that they normally use except for the producer, director, tech manager, and lead cameraman, which we provided.”
Inside the Production: New Era Field, Buffalo, NY
In Buffalo, SMT provided a similar level of services for the Bills’ event on Saturday Dec. 2, the day before the team faced off against the New England Patriots. SMT worked with the Bills to manage other shows using the team’s studio at New Era Field: a simulcast radio show, pre/postgame show for the Buffalo Sabres, and Bills GameDay on Sunday.
SMT once again used the team’s crew primarily but provided its own producer, director, tech manager, and camera ops and added a stage manager.
“Buffalo was on a real-time crunch,” says Bottitta, “so they told us the studio they wanted to use, the schedule of the studio, and asked us what was reasonable to expect. We guided them through what would make the most sense, so we could get in there, have a rehearsal and set day and then do the show while also allowing them to still do their normal duties.”
Inside the Production: Daily’s Place Amphitheater, Jacksonville, FL
SMT ramped up its role at the Jaguars’ event, which took place the morning of a home game against the Seahawks on Dec. 10. Since it was a game day, the Jaguars crew was occupied handling the in-venue production, so SMT essentially handled the entire Madden production at Daily’s Place Amphitheater, which is connected to EverBank Field. Since the two events were happening concurrently, the Jaguars provided SMT access to their router, allowing live camera views of warmups to be integrated into the Madden show throughout.
“The Jaguars [production] was the most unique of the three because it was on game day,” Bottitta explains. “They wanted to host it on the morning of what ended up being a very meaningful December football game for the Jaguars for the first time in a long time. Since the game-day crew was obviously busy, we did the whole show. We were taking Seattle and Jacksonville warming up on the field as bump-ins and bump-outs for our show, which was great and really captured the energy of the game.”
The Broadcast Mentality: Madden NFL Coverage Continues To Evolve
As the Madden NFL Club Championship grows (all 32 NFL franchises were involved for the first time this year, with prize money totaling $400,000 at this week’s Championship), the property has made an effort to boost its production value for live streams. Bottitta believes that SMT’s experience on A-level NFL productions, including Sunday Night Football and this weekend’s Super Bowl LII, was integral in the league’s selecting SMT: “I think that made a big difference: knowing that we weren’t just a group that’s doing one more esports tournament; this is a group that does professional sports production.”
He adds that VPS aims to leverage this broadcast-level expertise by bringing in such tools as replay systems and telestrators, which would be standard on an NFL telecast.
“We tried to bring a [broadcast] philosophy to these shows and want to make it more consumable for the viewers,” he says. “We brought telestrators and replay to all of the [productions], and that was not the norm when EA launched [the Club Championship] last year. I did that not only because SMT has a very portable, very easy-to-implement telestrator system but because it really adds to the show. If you went to a game and didn’t see replays or the key camera angles, you’d be in shock. So that became a big part of our production plan.”
January 19, 2018
Sports Video Group
As the Jacksonville Jaguars look to stymie the New England Patriots’ quest for a sixth Super Bowl victory, CBS Sports will cover this Sunday’s AFC Championship from every angle — including overhead.
CBS Sports will deploy 39 cameras in Foxborough, MA: seven super-slow-motion cameras, eight handhelds, and a Steadicam; pylon cams; and a collection of 4K, robotic, and Marshall cameras. The network will also have access to Intel 360 cameras for 360-degree replays. To give viewers an aerial view, CBS will rely on a dual SkyCam WildCat aerial camera system and fly a fixed-wing aircraft over Gillette Stadium.
The CBS Sports crew will work out of NEP SSCBS and have access to 152 channels of replay from 14 EVS servers — four eight-channel XT3’s and 10 12-channel XT3’s — plus a six-channel SpotBox and one 4K server.
CBS Sports’ lead announce team Jim Nantz, Tony Romo, and Tracy Wolfson will have plenty of storytelling tools at their fingertips, including SMT’s Next Gen Tele and play-marking systems with auto-render technology on both SkyCams. The lower SkyCam will focus on the actual game play at the line of scrimmage, including the quarterback’s point of view, while the upper SkyCam will provide a more tactical, “all-22” look at the field. During the AFC Championship, Romo will be able to use these tools to break down what he sees on the field for first and second replays.
Coverage begins at 2:00 p.m. ET with The NFL Today, featuring host James Brown and analysts Boomer Esiason, Phil Simms, Nate Burleson, and Bill Cowher at the CBS Broadcast Center in New York City; kickoff follows at 3:05 p.m. ET. Fans wanting to start their day even earlier can tune into The Other Pregame Show (TOPS) on CBS Sports Network, which runs from 10:00 a.m. to noon.
January 12, 2018
Sports Video Group
The Tennessee Titans travel to New England this weekend to take on the reigning Super Bowl champions in the AFC Divisional Round. To capture the action on the gridiron from every angle, CBS Sports will rely on dual SkyCam WildCat aerial camera systems with SMT’s Next Gen Tele and play-marking systems, as well as its virtual 1st & Ten line.
The Next Gen Tele System, which debuted during last year’s AFC Divisional Round, channels the NFL’s Next Gen Stats (NGS) data into an enhanced player-tracking telestrator. Combined with SMT’s proprietary play-marking system, which enables rendering of four virtual-player routes on the SkyCam video and its virtual 1st & Ten line, Next Gen Tele System provides a multitude of options for on-screen graphics that CBS Sports talent can leverage to better tell the story of the game.
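The core idea behind rendering routes on SkyCam video can be sketched as projecting field-plane tracking points into the camera image. The snippet below assumes a per-frame homography from camera tracking; the matrix values and sample route are placeholders, and SMT's production pipeline is far more involved.

```python
# A simplified sketch of the idea behind route rendering: NGS-style tracking
# points on the field plane are projected into the camera image with a
# homography, then handed to a renderer as a polyline. The homography values
# are placeholders; the real system derives them from live camera tracking.
import numpy as np

def project_route(field_xy_yds: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Map Nx2 field coordinates (yards) to Nx2 pixel coordinates via homography H."""
    pts = np.hstack([field_xy_yds, np.ones((len(field_xy_yds), 1))])  # homogeneous coords
    img = (H @ pts.T).T
    return img[:, :2] / img[:, 2:3]  # perspective divide

if __name__ == "__main__":
    # A receiver's route sampled over a play (down-field, cross-field), in yards.
    route = np.array([[0, 0], [5, 0], [10, 2], [12, 8]], dtype=float)
    # Placeholder homography for one SkyCam frame (would come from camera tracking).
    H = np.array([[12.0, 1.0, 640.0],
                  [0.5, -9.0, 360.0],
                  [0.0, 0.001, 1.0]])
    print(project_route(route, H).round(1))  # pixel coords to hand to the renderer
```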
“From a production standpoint, everything is about storytelling and conveying the story behind the game,” says Robbie Louthan, VP, client services and systems, SMT. “It’s handled in many different ways, but one way is obviously graphics. The advantage there is, you’re able to tell relevant, compelling information in a quick and succinct way without having to have the talent verbalize it to [viewers]. When you can get it reduced down to a graphic that is relevant to the viewer, you’re guaranteeing that the information you want to convey is being handled in a very quick, succinct manner, because there’s very short time frame between plays.”
During Saturday’s game, SkyCam will focus the lower camera system on the actual game play at the line of scrimmage, showing the quarterback’s point of view. The upper system will provide more of a tactical, “all 22” look at the field. Both systems will feature SMT graphics that enhance their respective camera angles and roles.
“Our camera angle creates a view that helps tell the story better than other camera angles,” explains Stephen Wharton, CTO, SkyCam. “Our view just establishes the storytelling for those graphics better than any other camera can, and then, when you add the motion that our camera brings with it, it makes those graphics — whether NGS, routes, and lines or first-down markers — get placed very well within the angle of the shot, so that that story is being told.”
SMT will deploy four staffers to Gillette Stadium to support the graphics on the dual SkyCam system: one operator to support the Next Gen Tele System, a dedicated operator for each of the camera systems, and one to oversee the operation and help produce the content. SkyCam will have a team of nine on the ground in New England, including five operators on the lower camera system (an engineer in charge, an assistant, a rigger, a pilot, and an operator responsible for the camera’s pan/tilt/zoom) and four on the upper camera system (an EIC, rigger, pilot, and PTZ operator).
The same system will return the following week during the AFC Championship Game, and similar systems will appear in other games throughout the NFL playoffs. And, while the action on the gridiron is sure to excite throughout the playoffs, the graphics overlaid on the dual SkyCam system will only increase the level of storytelling that the talent can deliver and fans can expect.
“We’re excited about showing off a new way of using Next Gen Stats and really focusing on where the players are running, where the routes are, and creating that sort of Madden look, if you will,” says Wharton. “If you [look at the broadcasters, they’re] usually telestrating: they’re saying, Here’s this guy, and they draw the little yellow line of where he ran. Now we’re leveraging the NFL’s Next Gen Stats system to get that data to create the graphics with SMT and then overlay that from our angle. It creates a very compelling shot.”
Echoes Louthan, “It’s another tool in the toolkit for the announcers — in this case, for [analyst] Tony Romo to use graphics to help tell the story of what he sees. It has been exciting for us to work with Tony on fine-tuning these graphics to [enable] him to use his incredible insight into the game to tell the story.”
SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for sports broadcasts, and SkyCam, the company that specializes in cable-suspended aerial camera systems, are continuing to deliver technological innovations to CBS Sports’ broadcasts of the AFC playoff games, including Saturday’s Tennessee Titans vs. New England Patriots contest, Sunday’s Jacksonville Jaguars vs. Pittsburgh Steelers game, and the AFC Championship on Jan. 21.
SMT will provide its Next Gen Tele system, an enhanced player-tracking telestrator that harnesses the power of NFL's Next Gen Stats data and SMT’s proprietary play-marking system to instantly render four virtual player routes on SkyCam video that’s available to the producer and talent at the end of every play. This “first-replay series, every replay” availability makes SMT’s system a true breakthrough in which NFL's Next Gen Stats data is able to drive meaningful content as an integral component of live NFL game production. The system debuted last year for the AFC divisional playoffs.
Using dual SkyCam WildCat aerial camera systems to enhance its broadcast, CBS Sports has made standard the “Madden-like” experience that gives football fans a more active and dynamic viewing experience behind the offense, revealing blocking schemes, defensive fronts, and throwing windows and providing a deeper understanding of plays. Combined with SMT’s virtual 1st & Ten line solution placed from SkyCam images, viewers are experiencing the new, modernized look of NFL games. SMT, through its offices in Durham and Fremont, has supported CBS NFL broadcasts since 1996.
“Used in conjunction with SMT’s virtual technology, fans have embraced the enhanced coverage made possible with dual SkyCam systems, a look that younger viewers have come to expect in their games,” said Stephen Wharton, CTO, SkyCam. “With SkyCam, fans get the benefit of a more complete view of the action and play development – we place them right into the action in real-time. Sideline cameras force fans to wait for replays to get a sense of what receivers and quarterback were seeing. With SkyCam, no other camera angle is as immersive or engaging.”
“SMT’s ability to place virtual graphics from SkyCam opens up a plethora of possibilities for broadcasts in terms of augmented reality applications with advertising content, player introductions on the field, or a whole host of possibilities,” said Gerard J. Hall, CEO, SMT. “The potential with our technology is limitless.”
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious live events, including the Super Bowl, NBC Sunday Night Football, major golf and tennis events, the Indianapolis 500, the NCAA Tournament, the World Series, ESPN X Games, NBA on TNT, NASCAR events, and NHL games. SMT’s clients include major US and international broadcasters as well as regional and specialty networks, organizing bodies, event operators, sponsors and teams. The 31-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
Headquartered in Fort Worth, Texas, SkyCam is a leading designer, manufacturer, and operator of mobile aerial camera systems. SkyCam plays a significant role in changing the way sporting events are broadcast in the world, appearing at marquee broadcast events such as the NFL Super Bowl, NCAA Final Four, NBA Finals, Thursday Night Football, Sunday Night Football, NCAA College Football, the 2015 CONCACAF Gold Cup, and the 2014 FIFA World Cup. SkyCam is a division of KSE Media Ventures, LLC.
January 08, 2018
TV Technology
NEW ORLEANS—New Orleans Saints and Carolina Panthers receivers and quarterbacks weren’t the only ones concerned about what was in and out of bounds Sunday (Jan. 7) in New Orleans during the NFC Wild Card game.
Fox Sports, which telecast the game, walked a different sort of line with its playoff coverage—one that separates delivering the great shots needed to present game action from new tech implementations that actually get in the way of coverage.
“We don’t want to make things all that different for the production team and give them a whole bunch of stuff that they haven’t had before for the big games,” says Mike Davies, SVP of Field and Technical Operations at Fox Sports. Rather, the strategy is to start with a “base layer” of production technology used throughout the 17 weeks of the regular season and then deploy choice pieces of technology that will have the biggest impact on game production and allow Fox Sports to tell the best story, he says.
“A lot of this stuff we’ve used before and some just this year,” says Davies. “We just pick the best of the best to represent us.”
For example, for the three NFL playoff games Fox Sports is covering, the broadcaster will add a second, higher SkyCam to deliver a drone’s-eye view of plays that captures all 22 players on the field. “Although you think of how over the top two SkyCams might sound, it turns out to be very useful,” says Davies. Fox Sports first used the dual-SkyCam setup during the preseason and then again in Week 5 for the Packers vs. Cowboys game. “I think that camera angle is new enough that we are still learning what it can do,” he says.
The broadcaster recognized the upper SkyCam “was something special” in Week 5 during a play involving Cowboys running back Ezekiel Elliott. “He jumped over that pile and no camera, including the lower SkyCam, saw that he had reached out over the first down line [except for the new upper SkyCam],” he says. “At least for that moment, we were sold that this is something special and something we wanted to offer.”
However, camera enhancements—both in terms of numbers and applications—aren’t limited to the second SkyCam. For its NFL playoff coverage, Fox Sports will deploy seven 8x Super Mo cameras, rather than the typical five. Fox also will use 6x Super Mo for its SkyCams, which it first did for its Super Bowl LI coverage in February 2017.
“There are so many replay opportunities in football, and the Super Mo gives this crisp—almost cinematic—look at the action,” says Davies.
The sports broadcaster also will take advantage of work it has done this year with SMT (SportsMEDIA Technology), SkyCam, and Vizrt “to cobble together a recipe” to do augmented reality with the SkyCam, he says. Not only does the setup allow Fox Sports to put a live yellow line on the field of play with its SkyCam shots, but also to put graphic billboards and other 2-D graphics on the field and to fly around them with the SkyCam as if they were real objects.
“It’s a bit of an orchestration because the pilot of the SkyCam needs to be flying around the object as if it were an object on the field. If you break through it, it’s not going to look real,” says Davies.
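A toy way to picture that constraint: treat the virtual billboard as a plane anchored to the field and check whether the camera path ever crosses it. The coordinates and placement below are invented purely for illustration.

```python
# A toy illustration of why the SkyCam pilot has to respect a virtual object's
# position: if the camera crosses the plane of a field-anchored billboard, the
# inserted graphic can no longer be composited convincingly. Coordinates and the
# billboard placement are made up for this sketch.
import numpy as np

def crosses_billboard(cam_positions: np.ndarray, plane_point: np.ndarray,
                      plane_normal: np.ndarray) -> bool:
    """True if the camera path passes from one side of the billboard plane to the other."""
    signed = (cam_positions - plane_point) @ plane_normal
    return bool(np.any(signed[:-1] * signed[1:] < 0))  # sign change => crossed the plane

if __name__ == "__main__":
    # Camera path in field coordinates (x along sideline, y across field, z height), yards.
    path = np.array([[20, 10, 12], [30, 12, 11], [42, 15, 10]], dtype=float)
    billboard_point = np.array([40.0, 26.6, 0.0])   # billboard anchored near midfield
    billboard_normal = np.array([1.0, 0.0, 0.0])    # facing down the sideline
    print("Pilot broke the illusion:", crosses_billboard(path, billboard_point, billboard_normal))
```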
Another enhancement is how Fox Sports will use its pylon cameras, says Davies. Rather than pointing the pylon cams positioned at the front of the end zone straight down the field, Fox will rotate them so they look down the field at a 45-degree angle.
“That gives you a way to cover a play where the camera is actually looking. Yes, you have the goal line, but you also have the out-of-bounds line as well,” he says. As a result, there are more game situations in which the pylon cameras can contribute to coverage. “The pylon cameras are a lot like catching lightning in a bottle. They are great, but you don’t want to use them unless you’ve got something that is really compelling,” says Davies.
While it is too soon to tell if the drop in viewership plaguing the league this season will carry over to the playoffs, Davies is confident that the right technology and production techniques have the potential to help fans reconnect with the game.
“I feel that what we are able to do using all of this incredible technology—the dual SkyCams, the Super Mo’s and the pylons—is that we are able to deliver that kind of experience in replay right after the play that also shows the emotions of players, not just what happens between the whistles,” he says.
Harkening back to his stint at HBO, Davies recalls the connection the cinematic style used for “Inside the NFL” created as “you watched a game that happened three or four days prior.” Today’s production tools give broadcasters that same opportunity to create that connection, he says. “I can’t help but think that these kind of storytelling tools, honestly, can only help,” says Davies.
The 2019 College Football Playoff concludes tonight with the National Championship at Levi’s Stadium in Santa Clara, CA. Like every other football game, it will feature two teams — in this case, Alabama and Clemson — and one broadcaster. For its part, ESPN is once again all-in for the big game, deploying more than 310 cameras to cover all the action and providing 17 viewing options via the MegaCast over 11 TV and radio networks and via the ESPN app.
“The thing that makes this event is the volume and magnitude of what we put behind it but also the time frame,” says John LaChance, director, remote production operations, ESPN. “[There are] other marquee events, which stand alone, but, with the volume and viewer enhancements being done here in a 72-hour window to get everything installed, this event [is] in a unique classification. Trying to integrate everything into place was a herculean effort.”
The game wraps up a season in which ESPN’s production team delivered more than 160 games to ABC and ESPN and more than 1,000 games to various other ESPN platforms.
“To watch that volume and make sure all the pieces are in place is a highlight for all of us, [seeing] it go from plan to working,” says LaChance. “You always have things that are challenges, but it’s about how quickly you can recover, and I think we’ve done it well.”
The core of ESPN’s production efforts will be done out of Game Creek Video’s 79 A and B units, with Nitro A and B handling game submix, EVS overflow, 360 replay, robo ops, and tape release. ESPN’s team creating 17 MegaCast offerings is onsite, housed in Nitro, Game Creek’s Edit 3 and Edit 4 trailers, and TVTruck.tv’s Sophie HD. Game Creek Video’s Yogi, meanwhile, is on hand for studio operations, and Maverick is also in the compound. All told, 70 transmission paths (50 outbound, 20 inbound) will be flowing through the compound, and 40 miles of fiber and cable have been deployed to supplement what already exists at Levi’s Stadium.
Also on hand are Fletcher, which is providing robotics; BSI, handling wired pylons and RF audio and video; 3G, which is in charge of the line-to-gain PylonCam and the first-and-10–marker camera; Vicareo, with the Ref Cams; and CAT Entertainment, for UPS and power. SMT is on board for the 1st & Ten lines; PSSI, for uplink; Bexel, for RF audio and other gear; and Illumination Dynamics, for lighting.
“It’s a team effort,” says LaChance. “I couldn’t be prouder of the team we assembled here and the vendors, technicians, leads, and staff that have, over the course of the last several months and weeks when it gets to a fever pitch, put it all together.”
The Camera Contingent
A large part of the 300-camera arsenal comprises 160 4K DSLR cameras deployed for the 4D Replay system, which will provide definitive looks at every play from every angle. Those cameras are mounted around the stadium and, combined, provide images that can be merged on computers, enabling an operator to zoom around a play and show any angle.
One place where the 4D system is poised to shine is the Red Zone. The 4D Replay team and ESPN have created templates that can cut the time needed to synthesize the images for plays around the goal line and pylons to eight seconds.
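Conceptually, a fly-around from a ring of fixed cameras amounts to walking through the nearest physical camera for each requested viewing angle. The sketch below uses the article's 160-camera figure, but the spacing and selection logic are simplifications, not 4D Replay's actual method.

```python
# A rough sketch of the fly-around idea behind a multi-camera array: with N
# cameras ringed around the bowl, a "virtual" sweep is just an ordered walk
# through the nearest physical cameras for each requested angle. The selection
# logic here is purely illustrative.
NUM_CAMERAS = 160  # DSLRs ringed around the stadium

def nearest_camera(angle_deg: float) -> int:
    """Index of the physical camera closest to the requested viewing angle."""
    spacing = 360.0 / NUM_CAMERAS
    return round((angle_deg % 360.0) / spacing) % NUM_CAMERAS

def sweep(start_deg: float, end_deg: float, steps: int = 8) -> list:
    """Ordered camera indices for a smooth fly-around between two angles."""
    return [nearest_camera(start_deg + (end_deg - start_deg) * i / (steps - 1))
            for i in range(steps)]

if __name__ == "__main__":
    # A quick 90-degree orbit around a goal-line play.
    print(sweep(0, 90))
```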
Besides the 160 4D replay cameras, plenty of cameras are focused on the game action, including 90 dedicated to game coverage. Among those are 10 super-slo-mo cameras, nine 4K game cameras, 15 RF cameras, two SkyCams, and two aerial cameras in a blimp and fixed-wing aircraft. The vast majority of cameras are Sony models (mostly Sony HDC-2500 and HDC-4300 with one HDC-4800 in 4K mode) coupled with Canon lenses, including five 100X, two 95X, 21 wide-angle, and 14 22X and 24X lenses. Seven 86X lenses and a 27X lens are also in use.
The game-coverage cameras are complemented by specialty cameras. Four Vicario Ref Cams will be worn by the officials; a line-to-gain RF PylonCam will move up and down the sideline with the first-and-10 marker, which also has a camera; and eight PylonCams around the end zones provide a total of 28 cameras.
The RefCam is new this year, having been tested during last year’s final in Atlanta. The MarkerCam did debut last year, and LaChance says it has been improved: “It has a c360 Live camera in the target portion of the marker to give a 180-degree perspective in 4K. The operator can push in and get a great perspective; we are taking it to another level with the push in.”
A second c360 camera will also be in use on the second SkyCam, again giving the ESPN team the ability to zoom in and capture images.
Another exciting new offering is AllCam, a system designed by ESPN’s in-house team and ChyronHego. It stitches images from three 4K cameras placed alongside the all-22 camera position and gives the production team the ability to zoom in anywhere on the field to capture events that might have taken place away from the action. For example, in a test at a bowl game, the system was used to show an unnecessary-roughness violation that took place during a kickoff far from the other players, who were focused on the run-back.
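A simplified way to think about that capability is a stitched panorama from the three aligned cameras, with a 1080p window cut out wherever the producer wants to look. The sketch below shows only the cropping arithmetic; real stitching requires lens correction and seam blending, and the dimensions are assumptions.

```python
# A simplified sketch of how a stitched all-22 panorama lets a production "zoom
# in anywhere": three side-by-side 4K frames are treated as one wide canvas, and
# a requested region of interest is cut from it.
import numpy as np

FRAME_W, FRAME_H = 3840, 2160  # one 4K camera

def stitch(frames: list) -> np.ndarray:
    """Naively concatenate three aligned 4K frames into one panorama."""
    return np.concatenate(frames, axis=1)

def crop_roi(panorama: np.ndarray, center_x: int, center_y: int,
             out_w: int = 1920, out_h: int = 1080) -> np.ndarray:
    """Cut a 1080p window around a point of interest, clamped to the canvas."""
    x0 = int(np.clip(center_x - out_w // 2, 0, panorama.shape[1] - out_w))
    y0 = int(np.clip(center_y - out_h // 2, 0, panorama.shape[0] - out_h))
    return panorama[y0:y0 + out_h, x0:x0 + out_w]

if __name__ == "__main__":
    frames = [np.zeros((FRAME_H, FRAME_W, 3), dtype=np.uint8) for _ in range(3)]
    pano = stitch(frames)                       # 2160 x 11520 canvas
    roi = crop_roi(pano, center_x=9000, center_y=400)
    print(pano.shape, roi.shape)
```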
“It’s another good example of the partnerships we have and working for a common goal,” says LaChance.
Beyond the game-coverage cameras, there are 20 cameras dedicated to the various MegaCast feeds, 29 for ESPN College GameDay, and nine for the SEC Network. ESPN Deportes also has two dedicated cameras.
All told, the production team will have access to 320 sources via 170 channels of EVS playback as well as 32 channels of Evertz DreamCatcher playback. There are also two Sony PVW-4500 servers in use, a Sony BPU-4800 4K record server, and two c360 record servers.
Non-Stop Action — for the Production Team
The game wraps up a busy time for the production team as well as for those who work at Levi’s Stadium. LaChance credits Jim Mercurio, VP, stadium operations/GM, Levi’s Stadium, and Nelson Ferreira, director, technical operations, San Francisco 49ers, with being an important part of the process during the past year.
“It’s a solid venue and great group of folks to work with, and that helps,” says LaChance. “They have done the Super Bowl here, and they do a lot of great events, so they are well-equipped. We had to supplement with some fiber, but they had a great infrastructure to start with.”
As for the ESPN team, everybody worked on one of the two semifinals as well as an additional bowl game.
“Folks that did the Cotton Bowl headed on to the Sugar Bowl, and those that did the Orange Bowl headed to the Rose Bowl,” says LaChance. “A lot of the people here have been non-stop since the Christmas Day offerings for the NBA, then right into a semifinal assignment, then the second of the New Year’s bowl offerings, and then making their way here to Santa Clara for one of the largest events the company does every year.”
For anyone looking to see what the new toys will bring to the show, LaChance recommends tuning into the TechCast, which will have a sampling of everything that will be used, including 4D Replay, C360, and the RefCam.
“Besides the game itself,” he says, “tune into the TechCast. Hopefully, the weather is good for us, and we can offer the BlimpCast from the Goodyear airship, which is another opportunity to provide a unique look for viewers at home.”
2018 was one of the most eventful years for sports production in recent memory, with the 2018 PyeongChang Olympics and 2018 FIFA World Cup capturing the nation’s attention and annual events like the College Football Playoff National Championship Game, Super Bowl, NFL Draft, and others breaking production records and test-driving new technologies and workflows. As if there weren’t enough going on stateside, this year’s Road Warriors features an expanded look at what went on across the Atlantic. Here is Part 2 of SVG’s look at some of the sports-production highlights from the past year (CLICK HERE for Part 1).
US OPEN
USTA Billie Jean King National Tennis Center, Flushing Meadows, NY
August 27–September 9
For ESPN, it simply doesn’t get bigger than US Open tennis. In the network’s fourth year as host broadcaster and sole domestic-rights holder — part of an 11-year rights deal — the technical and operations teams continued to evolve production workflows and add elements. Highlights this year included the debut of a Fletcher Tr-ACE/SimplyLive ViBox automated production system covering the nine outer courts and several new camera systems.
“This truly is the largest event that ESPN produces out of the thousands of events that we do all year,” said ESPN Director, Remote Operations, Dennis Cleary, “and it’s all done in a 3½-week span.”
For the first time, ESPN covered all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts. Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system was deployed on each court (with four robos per court) and relied on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic-camera operator and one ViBox director/producer covered each of the nine courts.
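In spirit, a motion-driven robotic workflow pairs motion detection with simple switching logic. The toy auto-director below cuts to whichever camera reports the most motion, with a hold timer to avoid flicker; it is a generic illustration, not how Fletcher Tr-ACE or SimplyLive ViBox actually operate.

```python
# A toy auto-director sketch in the spirit of a motion-driven robotic workflow:
# given per-camera motion scores for the latest frame, cut to the camera seeing
# the most action, with a hold timer so the output doesn't flicker between angles.
class AutoDirector:
    def __init__(self, hold_frames: int = 50):
        self.hold_frames = hold_frames   # minimum frames before another cut
        self.current = 0                 # index of the camera on the program feed
        self.since_cut = 0

    def step(self, motion_scores: list) -> int:
        """Return the camera index to put on air for this frame."""
        self.since_cut += 1
        best = max(range(len(motion_scores)), key=lambda i: motion_scores[i])
        if best != self.current and self.since_cut >= self.hold_frames:
            self.current, self.since_cut = best, 0
        return self.current

if __name__ == "__main__":
    director = AutoDirector(hold_frames=2)
    frames = [[0.1, 0.9, 0.2, 0.1], [0.1, 0.8, 0.2, 0.1], [0.7, 0.1, 0.1, 0.1]]
    print([director.step(f) for f in frames])  # e.g. [0, 1, 1]
```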
New this year was a two-point aerial CineLine system (provided by Picture Factory) running between Louis Armstrong Stadium and Court 10, a run of roughly 1,000 ft. After a successful debut at Wimbledon in June and the Australian Open in January, Telstra Broadcast Services’ NetCam made its US Open debut. The Globecam HD 1080i/50 POV miniature robotic camera was deployed on each side of the net for singles matches at Arthur Ashe Stadium, Armstrong, and the Grandstand, providing viewers with a close-up look at the action on the court. In addition, both Intel’s True View 360-degree camera system and the SpiderCam four-point aerial system returned to Ashe.
The US Open production compound was almost unrecognizable from five years ago, prior to ESPN’s taking over as host broadcaster. What had been a caravan of production trucks became two permanent structures housing ESPN’s NTC broadcast center and production/operations offices, along with two ultra-organized stacks of temporary work pods housing the TOC, vendors, international broadcasters, and ESPN’s automated production operation for the outer courts. NEP’s NCP8 was on hand for ESPN’s ITV operation (serving AT&T/DirecTV’s US Open Mix Channel), and NEP’s Chromium and Nickel were home to the USTA’s world-feed production. — JD
U.S. OPEN
Shinnecock Hills Golf Club, Shinnecock Hills, NY
June 14-17
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
Game Creek Video’s Encore production unit was at the center of the coverage for Fox and FS1, with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit handled production control for one of the featured groups, Edit 4 supported all iso audio mixes, and Edit 2 was home to five edit bays with equipment and support provided by Creative Mobile Solutions Inc. (CMSI). There was also the 4K HDR show, which was produced out of Game Creek Maverick.
“All the Sony HDC-4300 cameras on the 7th through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” noted Brad Cheney, VP, field operations and engineering, Fox Sports, during the tournament. There were also six Sony PXW-Z450s for the featured holes and featured groups, the output of two of them delivered via 5G wireless.
In terms of numbers, Fox Sports had 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless units, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps), and 218 microphones. Tons of data was passed around: 3 Gbps of internet data was managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
Each course provides its unique challenges. At Shinnecock Hills, they included the roads running through the course, not to mention the hilly terrain, which also had plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
One big step taken in preparation for the 2018 events was that the IP router in Encore was rebuilt from scratch. RF wireless coverage was provided by CP Communications. There were 26 wireless cameras on the course, along with 18 wireless parabolic mics and nine wireless mics for on-course talent. CP Communications also provided all the fiber on the course. — KK
MLB ALL-STAR GAME
Nationals Park, Washington, DC
July 17
With its biggest summer drawing to a close with the MLB All-Star Game, Fox certainly showed no sign of fatigue technologically. Not only did the network roll out a SkyCam system for actual game coverage for the first time in MLB history, but Fox also deployed its largest high-speed–camera complement (including all 12 primary game cameras), two C360 360-degree camera systems, and ActionStreamer POV-style HelmetCams on the bullpen catcher, first-base coach, and Minnesota Twins pitcher José Berríos.
“People always used to say Fox owned the fall with NFL and MLB Postseason, but, this year, we owned May through July, too, with the U.S. Open, World Cup, and now All-Star,” said Brad Cheney, VP, field operations and engineering, Fox Sports. “The capabilities of our [operations] team here are just unsurpassed. For big events, we used to throw everything we had at it, and it was all hands on deck. That’s still the case, but now, when we have big events, everybody’s [scattered] across the globe. Yet we’re still figuring out ways to raise the bar with every show.”
Between game coverage and studio shows, Fox Sports deployed a total of 36 cameras (up from 33 in 2017) at Nationals Park, highlighted by its largest high-speed–camera complement yet for an All-Star Game. Building on the efforts of Fox-owned RSN YES Network, all 12 of Fox’s Sony HDC-4300 primary game cameras were licensed for high-speed: six at 6X slo-mo, six at 2X slo-mo. This was made possible by the ultra-robust infrastructure of Game Creek Video’s Encore mobile unit.
Fox also had two Phantom cameras running at roughly 2,000 fps (at low first and low third) provided by Inertia Unlimited and a pair of Sony P43 6X-slo-mo robos at low-home left and low-home right provided by Fletcher. Fletcher provided nine robos in all — including low-home Pan Bar robo systems that debuted at the 2017 World Series — and Inertia Unlimited provided a Marshall POV in both teams’ bullpen and batting cage.
CP Communications supplied a pair of wireless RF cameras: a Sony P1r mounted on a MōVI three-axis gimbal and a Sony HDC-2500 handheld. An aerial camera provided by AVS was used for beauty shots — no easy task in security-conscious Washington.
Inside the compound, a reshuffling of USGA golf events allowed Game Creek Video’s Encore mobile unit (A, B, and C units), home to Fox’s U.S. Open and NFL A-game productions, to make its first All-Star appearance.
The primary control room inside the Encore B unit handled the game production, and a second production area was created in the B unit to serve the onsite studio shows. — JD
The Open Championship
Carnoustie Golf Links, Angus, UK
July 19-22
Sky Sports used its Open Zone in new ways to get closer to both players and the public in its role as the UK live broadcaster from Carnoustie. On Thursday and Friday, Sky Sports The Open channel was on the air from 6:30 a.m. to 9:00 p.m. Featured Group coverage of the 147th Championships was available each day via the red button and on the Sky Sports website. Viewers could also track players’ progress in Featured Hole coverage on the red button, with cameras focusing on the 8th, 9th, and 10th holes. Sky Sports had a team of 186 people onsite in Carnoustie for The Open, which included Sky production and technical staff and the team from OB provider Telegenic. — Fergal Ringrose
WIMBLEDON
All England Lawn Tennis and Croquet Club, Wimbledon, UK
July 2-15
At 11:30 a.m. on Monday, July 2, coverage of the Wimbledon Championships went live from the AELTC, produced for the first time by a new host broadcaster. After more than 80 years under the BBC’s expert guidance, the host baton was passed to Wimbledon Broadcast Services (WBS), bringing production of the Championships in-house. Going live on that Monday was the culmination of two years of planning, preparation, and testing: a process that has allowed the AELTC to “take control” of the event coverage and provide international rightsholders with a better service as well as add some new twists, such as Ultra High Definition (UHD), a NetCam on both Centre Court and No.1 Court, and multicamera coverage of all 18 courts. — Will Strauss
FRENCH OPEN
Stade Roland-Garros, Paris
May 27–June 10
Tennis Channel was once again on hand in a big way at the French Open. The expanded coverage this year meant more than 300 hours of televised coverage for fans in the U.S. as well as 700 hours of court coverage via Tennis Channel Plus. The Fédération Française de Tennis (FFT) increased overall court coverage this year, and Tennis Channel made sure all of that additional coverage made it to viewers. Tennis Channel had approximately 175 crew members onsite, working across the grounds as well as in a main production-control room, an asset-management area, six announce booths, and a main set on Place des Mousquetaires. The production facilities were provided by VER for the fifth year. CenturyLink provided fiber transport to the U.S. via 10-Gbps circuits. — KK
In December 2018, the Professional Fighters League (PFL) and SMT (SportsMEDIA Technology) announced an exclusive, long-term technology partnership. Under the terms of the agreement, SMT will partner with the PFL to create proprietary technology that measures real-time MMA fighter performance analytics, along with biometric and positional data, to provide fans a game-changing live event experience across all platforms.
Starting in 2019, SMT will help power the PFL’s vision of the first-ever SmartCage. The SmartCage will utilize biometric sensors and proprietary technology that will enable the PFL to measure and deliver real-time fighter performance data and analytics, what the PFL is dubbing Cagenomics. PFL fans watching linear and digital broadcasts of the league’s Regular Season, Playoff, and Championship events will experience a new dimension of MMA fight action with the integration of live athlete performance and tracking measurements, including speed (mph) of punches and kicks, power ratings, heart-rate tracking, energy exerted, and more.
“The Professional Fighters League is excited to be partnering with SMT to advance the sport of MMA. The PFL’s new SmartCage will revolutionize the way MMA fans experience watching live fights as next year every PFL fight will deliver unprecedented, real-time fighter performance data and analytics, biometric tracking, and an enhanced visual presentation of this great sport,” says Peter Murray, CEO, Professional Fighters League. “Not only will PFL fans benefit from our SmartCage™ innovation, but our pro fighters will now have access to new performance measurement data, analysis, and tools to help them train and compete. The PFL’s vision has always been two-fold: deliver the absolute best experience to fans and be a fighters first organization and with the SmartCage we will accomplish both.”
“SMT is thrilled to be collaborating with the Professional Fighters League’s forward-thinking innovation team to bring our latest and greatest technology to PFL events,” says Gerard J. Hall, Founder & CEO, SMT. “Starting in 2019, PFL fans will begin to see real-time, live, innovative technology that is unique to the PFL in the MMA space. SMT’s OASIS Platform will provide the PFL with a seamlessly integrated system that combines live scoring with real-time biometric and positional data to enhance the analysis, storytelling and graphic presentation of the PFL’s Regular Season, Playoffs and Championship events next season.”
The PFL 2018 Championship takes place on New Year’s Eve live from The Hulu Theater at Madison Square Garden and consists of the 6 world title fights in 6 weight classes of the PFL 2018 Season. Winners of each title bout will be crowned PFL World Champion of their respective weight class and earn $1M. The PFL Championship can be viewed live on Monday, December 31 on NBC Sports Network (NBCSN) from 7 to 11 pm ET in the U.S. and on Facebook Watch in the rest of the world.
Professional Fighters League
The Professional Fighters League (PFL) presents MMA for the first time in a sport-season format, in which individual fighters compete in a regular season, playoffs, and championship. The PFL Season features 72 elite MMA athletes across six weight classes, with each fighting twice in the PFL Regular Season in June, July, and August. The top eight fighters in each weight class advance to the single-elimination PFL Playoffs in October. The PFL Championship takes place on New Year’s Eve at Madison Square Garden, with the finalists in each of the six weight classes competing for the $10 million prize pool. The PFL is broadcast live on NBC Sports Network (NBCSN) and streamed live worldwide on Facebook Watch. Founded in 2017, the PFL is backed by a group of sports, media, and business titans. For more info visit PFLmma.com.
SMT
SMT (SportsMEDIA Technology) is the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, providing clients with scoring, statistics, virtual insertion, and messaging for broadcasts and live events. For the past 30 years, SMT’s solutions have been used at the world’s most prestigious live sports events, including the Super Bowl, Indy 500, Triple Crown, major golf and tennis events, MLB’s World Series, Tour de France, and the Olympics. SMT’s clients include sports governing bodies; major, regional, and specialty broadcast networks; event operators; sponsors; and teams. The 32-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
SMT is once again one of the busiest vendors on hand at the US Open, providing a cavalcade of technology to serve the USTA, broadcasters, spectators, athletes, and media onsite at the USTA Billie Jean King National Tennis Center (NTC). In addition to providing the much discussed serve clock, SMT — now in its 25th year at the Open — is providing scoring systems, scoring and stats data feeds, LED scoreboards, TV interfaces, IPTV systems, and match analysis.
“This event, just like any Grand Slam, is becoming a three-week event,” says Olivier Lorin, business development manager, SMT. “We have more and more recipients asking for data. Today, we’re actually sending 19 different data feeds to recipients for their own platform. Obviously, we have to get the authorization from the USTA, but then they use that for whatever.”
Countdown to the Serve
An on-court digital clock, similar to the shot clock in basketball and the play clock in football, counts down the allotted 25 seconds before a player must begin the serve (previously, the 20-second clock was visible only to the chair umpire).
After the USTA announced plans to display a countdown clock for this year’s tournament, SMT introduced the clock at ATP and WTA events leading up to the Open — most recently, in Winston-Salem, NC, and Cincinnati — to help players acclimate to it.
“The USTA has been looking to do the serve clock at the US Open for a few years, starting in 2016 with the Juniors and then the qualifiers as an experiment, which all went very well,” says Lorin. “The Australian Open and the French Open also did it in quallies, but the US Open wanted to be the first [Grand Slam] to do this for all events, and we were able to work with them to make that happen.”
The clock, visible to players and spectators alike, begins to tick down immediately after the chair umpire announces the score. The umpire will issue a time violation if the player has not started the service motion at the end of the countdown. The first time the clock hits zero before a player begins the motion, the player receives a warning. For every subsequent time, the player loses a first serve. SMT is driving umpire scoring on all 16 courts and offsite for Junior Qualifying (eight courts).
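The enforcement sequence is simple enough to express as a small routine. Below is a minimal, purely illustrative sketch in Python (invented names, not SMT's actual scoring software) of the 25-second countdown and the escalating penalty:

```python
from dataclasses import dataclass, field

SERVE_CLOCK_SECONDS = 25  # countdown begins when the chair umpire announces the score

@dataclass
class ServeClock:
    # hypothetical model: per-player count of clock expirations
    violations: dict = field(default_factory=dict)

    def remaining(self, elapsed: float) -> float:
        """Seconds left on the on-court display."""
        return max(0.0, SERVE_CLOCK_SECONDS - elapsed)

    def on_clock_expired(self, player: str) -> str:
        """Called when the clock reaches zero before the service motion begins."""
        count = self.violations.get(player, 0) + 1
        self.violations[player] = count
        if count == 1:
            return f"Time violation: warning to {player}"
        # every subsequent expiration costs a first serve
        return f"Time violation: {player} loses a first serve"

clock = ServeClock()
print(clock.on_clock_expired("Player A"))  # first offense: warning
print(clock.on_clock_expired("Player A"))  # second offense: loss of first serve
```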
Lorin sees a benefit to TV in the five-minute warmup clock and the serve clock: “At least seven minutes [is saved], so the match is going to [end] on time more often.”
Serving the Media: IPTV and CCTV
SMT is also responsible for the infrastructure for the USTA’s CCTV, IPTV, and Media Room. The IPTV system for the Media Center at this year’s US Open is now “browser-independent.” It allows users to select and view up to five streams/videos at one time from any of the digitally encoded channels available on the 13-channel CCTV system. In addition, the system allows access to archived player interviews. The IPTV system also includes real-time scores, match stats, draws, schedule, results, tournament stat leaders, US Open history, and WTA/ATP player bio information.
“It’s a very slick interface, and the USTA has been very positive about it,” says Lorin. “Today, it is still under a controlled environment here at the US Open, but, if the US Open wanted to make this open to anybody on the outside, we could easily provide a solution for them to log in and have the same information, with the exception of live video.”
Automation Is Key to New Outer-Courts Coverage
A fixture at live-sports-broadcast compounds, SMT is also providing a variety of services to domestic-rights holder and host broadcaster ESPN, as well as other broadcasters onsite. ESPN is deploying an SMT automated-graphics interface as part of its new automated-production system for outer-court coverage, which relies on a Fletcher Tr-ACE motion-detecting robotic camera system and SimplyLive’s ViBox all-in-one production system.
An SMT touchpad at each of the 16 workstations is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The beauty of our system is that everything is automated and driven by the score notification of the umpire’s tablet,” says Lorin. “We have built up prematch graphics so we know that, when the umpire hits warmup on the tablet, a bio page for both players and a head-to-head graphic will appear, and then they’ll go to the match. When the match starts, the system is just listening to the score notifications, and we have built-in notifications for five aces and things like that. The only thing that is manual and left to the producer for that court is the set summary and the match summary for statistics.”
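That event-driven flow can be sketched roughly as follows. The event shapes and names here are hypothetical stand-ins (SMT's feed format is not public); the point is only to show score notifications driving the graphics and stat thresholds firing informational inserts automatically:

```python
# Hypothetical sketch of notification-driven graphics; not SMT's actual API.
# Umpire-tablet events arrive as dicts; running stats trigger inserts once a
# preconfigured threshold is reached.

THRESHOLDS = {"aces": 5, "unforced_errors": 10}  # example trigger points

def handle_event(event, stats, fired):
    """Return the graphics to air for one umpire notification."""
    graphics = []
    if event["type"] == "warmup":
        graphics += ["bio_page_player_1", "bio_page_player_2", "head_to_head"]
    elif event["type"] == "point":
        player, stat = event["player"], event.get("stat")
        if stat:
            stats.setdefault(player, {}).setdefault(stat, 0)
            stats[player][stat] += 1
            limit = THRESHOLDS.get(stat)
            # fire each threshold graphic only once per player
            if limit and stats[player][stat] >= limit and (player, stat) not in fired:
                fired.add((player, stat))
                graphics.append(f"insert_{stat}_{player}")
        graphics.append("update_scorebug")
    return graphics

stats, fired = {}, set()
print(handle_event({"type": "warmup"}, stats, fired))
for _ in range(5):
    out = handle_event({"type": "point", "player": "P1", "stat": "aces"}, stats, fired)
print(out)  # the fifth ace triggers the informational insert
```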
Also From SMT: Prize Money Report, LED Superwall, More
This year, SMT has updated its Official Prize Money Report, in which prize money is calculated and a report generated at the end of the tournament and distributed to media officials.
SMT also provides content for the massive outdoor LED Superwall at the main entrance of Arthur Ashe Stadium, displaying scoring-system content: schedules, results, matches-in-progress scores, custom information messages (for example, weather announcements). SMT designs the scoring graphics and provides live updates.
“One of the big things is, we rebranded the US Open package for 2018 with a new logo, a new font, and a new background,” says Lorin. “As a result, we had to apply those design changes across all the platforms we are serving. One of the things we try to do more and more in the video production is, instead of having the typical headshot of a player, to integrate more action shots and motion shots, which are a lot more appealing to the design.”
Other services SMT provides to the US Open on behalf of USTA include stats entry on seven courts; serve-speed systems and content on seven courts; playback controls, including lap selector and data-point scrubbing; draw creation and ceremony; and match scheduling.
For the first time, ESPN is covering all 16 courts at the US Open, thanks to a new automated production system deployed on the nine outer courts at the USTA Billie Jean King National Tennis Center (NTC). Having debuted at Wimbledon in June, the Fletcher Tr-ACE motion-detecting robotic camera system has been deployed on each court (with four robos per court) and relies on SimplyLive’s ViBox for switching and replay and an SMT automated graphics system. With this workflow, one robotic-camera operator and one ViBox director/producer are covering each of the nine courts.
“With one production room and one rack room here, we are essentially replacing what would have traditionally been nine mobile units,” notes ESPN Director, Remote Operations, Dennis Cleary. “We’ve been working on this plan for a long time, and there is just no way we would have been able to cover all these courts in a traditional [production model]. SimplyLive has been used at other [Grand Slams], and it was used with Fletcher Tr-ACE at Wimbledon but not really to this extent. We feel that we have taken it to the next level [and] are integrating it with our overall [show] and adding elements like electronic line calling and media management.”
With all 16 courts now accessible, ESPN can present true “first ball to last ball” live coverage across its linear networks and the streaming platforms (a total of 130 TV hours and 1,300 more streaming on the ESPN app via ESPN3 and ESPN+). Moreover, ESPN was able to provide the USTA with live coverage of last week’s qualifying rounds for the first time, deploying the Tr-ACE/ViBox system on five courts.
In addition, ESPN, which serves as the US Open host broadcaster, has been able to provide any rightsholder with a live feed of a player from its country — regardless of the court and including qualifying rounds.
On the Outer Courts: LiDAR Drives Fletcher Tr-ACE System
Four Fletcher robotic systems with Sony HDC-P1 cameras have been deployed on each of the nine outer courts: two standard robos (traditional high play-by-play and reverse-slash positions) and two Tr-ACE automated robos (to the left and right of the net).
“From the beginning, one of ESPN’s big focuses was increasing the camera quality of what was being done on the outer courts,” says Fletcher Sports Program Manager Ed Andrzejewski. “So we built everything around the Sony P1’s to increase the camera quality to match the main [TV courts]. When they send a feed to the rightsholder in Australia and the player they are interested [in] is on one of those outer courts, they wanted the basic quality to be the same as in the bigger stadiums. I think we’ve been able to accomplish that.”
Between the two Tr-ACE cameras is “the puck,” which powers the Tr-ACE system at each court via a custom-designed LiDAR (Light Detection and Ranging) image-recognition and -tracking system. The LiDAR tracks every moving object on the court (the ball, players, ball kids, judges) and provides the two Tr-ACE cameras with necessary data to automatically follow the action on the court. The LiDAR can also sense fine details on each player (such as skin tone or clothing color), allowing the cameras to tell the difference between a player and other moving objects.
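As a rough illustration of the kind of logic involved (simplified and hypothetical; Fletcher's Tr-ACE implementation is proprietary), the tracker's job reduces to filtering the tracked objects down to the players and handing the robotic heads an aim point:

```python
import math

# Hypothetical sketch: LiDAR-style tracks are filtered to players and turned
# into a pan target for a net-side robotic camera. Positions are court
# coordinates in meters; labels stand in for the appearance cues the real
# system uses to separate players from ball kids and officials.

def select_players(tracks):
    """Keep only the objects classified as players."""
    return [t for t in tracks if t["label"] == "player"]

def aim_point(players):
    """Crude framing stand-in: aim at the midpoint between the players."""
    xs = [p["pos"][0] for p in players]
    ys = [p["pos"][1] for p in players]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pan_angle(camera_pos, target):
    """Pan angle (degrees) needed to point the robo head at the target."""
    dx, dy = target[0] - camera_pos[0], target[1] - camera_pos[1]
    return math.degrees(math.atan2(dy, dx))

tracks = [
    {"label": "player", "pos": (2.0, 5.0)},
    {"label": "player", "pos": (2.5, 18.0)},
    {"label": "ball_kid", "pos": (-4.0, 0.0)},
]
target = aim_point(select_players(tracks))
print(round(pan_angle((-10.0, 11.9), target), 1))  # pan for a camera beside the net
```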
A Room of Its Own: Nine Mobile Units in a Single Room
ESPN has erected a dedicated production room for the Tr-ACE/ViBox operation across from its NTC Broadcast Center. Inside this room are nine workstations featuring one Fletcher Tr-ACE camera operator and one ViBox director/producer each.
The Tr-ACE operator monitors the camera coverage and can take control of any of the four cameras at any point during the match. Meanwhile, the ViBox operator cuts cameras and rolls replays. An SMT touchpad at the workstation is used only during prematch coverage. All other graphics elements, including the scorebug and lower-thirds, are fully automated, and informational elements are triggered by preconfigured settings in SMT’s data feed (for example, 10 total aces or 10 unforced errors).
“The camera op and director are constantly communicating,” Andrzejewski explains. “ESPN put a lot of trust in us with this, so we brought out the best people we could and have some of the best [robo operators] in the business here. There was a lot of onsite learning, but we were able to give everyone lots of time on the system during setup and qualifying.”
The coverage does not feature commentary, so all nine courts are being submixed out of a single audio room using a single Calrec audio console and operator.
Also inside the automated production room are a video area to shade all 36 cameras, an SMT position to manage the automated graphics systems deployed at each workstation, an electronic line-calling position (which was not available for the systems at Wimbledon), and a media-management area, which was used during qualifying to record all five courts (this operation moved to the NTC Broadcast Center once draw play began on Monday).
Since the automated-production systems had to be up and running for qualifying rounds last week, ESPN built the operation on an island entirely separate from the Broadcast Center.
“It was just too costly and just not sensible to bring the full broadcast center up a week early,” notes Cleary. “So this entire operation is all standalone. All the equipment from Fletcher, SimplyLive, Gearhouse, and even transmission is all separate and on its own.”
Two-Plus Years of Development Pays Off
Although automated production is nothing new for the US Open — Sony Hawk-Eye technology had been used for several years to produce coverage from five outside courts — this new system has expanded the ability to truly cover every ball of the tournament.
Use of the Tr-ACE/ViBox system at Wimbledon in June and now at the US Open was a long time coming. Fletcher has been developing the Tr-ACE system for 2½ years and demonstrated it offline on one court at the NTC last year. In addition to the Fletcher and SimplyLive teams, ESPN Senior Remote Operations Specialist Steve Raymond, Senior Operations Specialist Chris Strong, and Remote Operations Specialist Sam Olsen played key roles in development of the system and its implementation this week.
“This is certainly a new workflow for us, so a lot of thought and time went into it before we deployed it,” says Olsen. “We felt that the ViBox and the Tr-ACE would certainly give us the ability to produce a high level of content using an automated [workflow], and it’s worked out really well thus far. Having it for the qualifying rounds for the first few days also served as a great test bed. I think the best way to put it is, we’ve grown into it and we’ll develop it and take it to higher level each time we use it.”
By Jason Dachman, Chief Editor, SVG
Thursday, August 2, 2018 - 2:52 pm
After a move from Los Angeles to Madison, WI, prior to last year’s event, the CrossFit Games production operation has continued to grow prodigiously. The “Woodstock of Fitness” has grown from a production comprising 35 crew members working out of a single mobile unit just six years ago to one of the largest live productions on the annual sports calendar: more than 10 NEP mobile units, a crew of more than 300, and 50-plus cameras. Add in the fact that the CrossFit competitions change from year to year, and it becomes clear just how challenging the event can be for the production team.
This year’s CrossFit Games — Aug. 1-5 at the Alliant Energy Center in Madison — are being streamed on Facebook, CBSSports.com, and the CBS Sports App and televised live on CBS (one-hour live look-ins on Saturday and Sunday plus a recap show) with a daily highlights show on CBS Sports Network.
CrossFit has its own live-streaming team onsite and handles in-house production for the videoboards at Alliant Energy Center. SMT, which is CrossFit’s scoring partner, provides a wealth of presentation options for the boards as well.
CrossFit has used TVU Networks bonded-cellular and IP systems for several years for point-to-point transmission. This year, CBS Digital also used a TVU system to take in streams from the CrossFit Regionals earlier this summer. That success led to a similar partnership for the Games, with CBS Digital receiving all the live competitions on two streams via TVU receivers.
As CrossFit Games’ Footprint Grows, So Does the Live Production
The Games themselves have expanded and become more complex. The production team is tasked with covering multiple venues throughout Alliant Energy Center, primarily The Coliseum and North Park Stadium. This year, the stadium has been expanded to 10,000 people (nearly 50% more than for the 2017 edition) and has added a new videoboard.
July 18, 2018
Sports Video Group
SMT was back at MLB All-Star in Washington, providing Fox Sports its live virtual–strike-zone system and, for the 14th consecutive year, virtual signage.
SMT rendered the virtual–strike-zone graphic, as well as the watermarks when viewers saw the ball cross the plate.
SMT’s Peter Frank was on hand at 2018 MLB All-Star to support Fox Sports’ virtual efforts.
SMT handled virtual signage behind the plate for Fox’s Camera 4 (the primary pitcher/batter camera) and tight center field. For the third year in a row, the company also integrated its system with the high-home position, inserting virtual signage on the batter’s eye in center field.
“We use stabilization for virtual signage on the main camera, which is used for the virtual strike zone, so that helps out with the stability of both graphics,” said SMT Media Production Manager Peter Frank. “Two years ago at MLB All-Star in San Diego was the first time we did [virtual signage on] the batter’s eye, and Fox was really happy with it. So we also brought it back in
July 5, 2018
Sports Video Group
After a successful pilot game last year, the American Flag Football League (AFFL) is back in action this summer with the U.S. Open of Football (USOF) Tournament. The final 11 games of the tournament kick off NFL Network’s AFFL coverage, and the network is embracing the “Madden-style” coverage and the production elements it debuted last year, including using a SkyCam as the primary game angle, deploying RF Steadicams inside the huddle, rolling out customized SMT virtual graphics across the field, and miking players throughout the game.
“After last year’s pilot show, there was a lot of great feedback. Everybody liked the football on the field and the direction the technology was going,” says Johnathan Evans, who served as executive producer and director of last year’s production and is directing the NFL Network telecasts this year. “So our coverage is going to be almost exactly the same as last year, with a few differences since we are doing 11 games instead of just one. We have come up with a great formula that hasn’t been tried on a consistent basis before and offers a different perspective from watching a [traditional] football broadcast. With [AFFL], you’re watching from the quarterback perspective; you’re watching it just like you’re playing a Madden NFL [videogame].”
How It Works: Breaking Down the AFFL Format
The 12 teams featured in the USOF Playoffs are composed of eight amateur squads in the America’s Bracket (derived from four rounds of play that began with 128 teams) and four teams captained by celebrities in the Pro Championship Bracket. NFL Network’s USOF coverage began with the America’s Bracket Quarterfinal last weekend from Pittsburgh’s Highmark Stadium and continues with the semifinals this weekend at Atlanta’s Fifth Third Bank Stadium, the America’s Bracket Final and Pro Bracket Final on July 14 at Indianapolis’s Butler Bowl, and the $1 million Ultimate Final (featuring both bracket champions) on July 19 at Houston’s BBVA Compass Stadium.
The 11 AFFL telecasts on NFL Network will feature SMT virtual graphics, including the Go Clock. The 7-on-7, no-contact 60-minute AFFL games feature many of the same rules that average Americans know from their backyard games. The same players are on the field for both offense and defense, and a team must go 25 yards for a first down. There is no blocking; instead, a “Go Clock” indicates when the defense can rush the QB (after two seconds) and when the QB must release the ball or cross the line of scrimmage (four seconds). There are also no field goals (or uprights, for that matter), and kickoffs are replaced with throw-offs.
“This is not only a sport that creates a lot of intensity and energy; it’s also a sport that you as an average person can relate to because you’re watching an average person play the game,” says Evans. “You’re not watching professional athletes. You’re watching amateurs playing a sport that you can play at home. That is something that every single viewer can relate to.”
Inside the Production: It’s All About Access
By using the SkyCam for play-by-play, RF Steadicams on the field, and player mics, the AFFL and NFL Network are focused on providing fans unprecedented up-close-and-personal access to the action on the field.
“We’re most excited about having SkyCam as our game camera, which really adds a different perspective, and also having everybody miked up so we can hear everything that’s going on and listen in,” says producer Tom McNeely. “We’re focused on making [viewers] feel like they’re right there on the field with these guys. Bringing them into the huddle with our cameras and microphones — we will have somebody sitting in the truck with a mute button in case the players are a little rambunctious — is going to make this really appealing and fun.”
The upcoming NFL Network AFFL productions will deploy Game Creek Video mobile units and feature an average of eight cameras: the SkyCam system, two traditional 25-yard-line angles for isos, a mid-level end-zone angle, one handheld high-speed camera, a jib on a cart roving the sidelines, and two RF cameras (Steadicam and a MōVI).
“The only new cameras we are adding is a second [RF camera] so we can cover both sides of the football,” says Evans. “Last year, we had only one Steadicam, which was great, but I realized that we were losing the intimacy on both sides of the ball. Before you get to the red zone, it’s great to be inside the huddle and see from behind the quarterback on the offensive side of the ball. But, once you get to the red zone, you need to get ready for a touchdown, so you have to switch your Steadicam to the defensive side of the ball, and you hope to get a touchdown in the end zone. This time, in Indianapolis and in Houston, we’re going to have a Steadicam on both sides of the ball to retain the potential atmosphere for every single play. Before the snap, during the snap, and after the snap, you’re going to have that great intensity right in your face the entire time.”
Go Clock Returns; Interactive Line of Scrimmage Debuts
The Go Clock, designed by SMT specifically for the fast-paced AFFL, is also back after playing a major role in defining the league’s production style during its pilot game. The system synchronizes with in-stadium displays to indicate when the defense can rush the quarterback.
“The Go Clock was a big success, and we’re bringing it back this year,” says Evans. “We’re also introducing a line of scrimmage that will change color when [the defense] is able to rush. So the virtual graphics are still there and play a big role [in the production].”
The same SMT virtual 1st & Ten line used in NFL broadcasts will be deployed from the company’s Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used several times by NBC Sports last NFL season.
SMT’s Design Studio also designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package.
SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats (the same system used for NCAA and NFL games).
In addition to the virtual elements, the AFFL has enhanced the physical first-down marker used on the field, so that it digitally displays the down, play clock, game clock, and possession arrow. The system also emits an audible alert when the rusher can break the line of scrimmage after two seconds and when the quarterback has to throw the ball after four seconds.
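Taken together, the two-second rush window and four-second release deadline amount to a small timing routine. The sketch below (hypothetical field names and colors, not SMT's actual system) shows the kind of state a Go Clock driver could publish to the scorebug, the virtual line of scrimmage, and the on-field marker:

```python
def go_clock_state(elapsed):
    """State of the Go Clock at `elapsed` seconds after the snap.

    Before 2 s the defense must hold; at 2 s the rush is allowed (line color
    change, audible alert); at 4 s the QB must have thrown or crossed the
    line of scrimmage (second alert).
    """
    return {
        "seconds_remaining": max(0.0, 4.0 - elapsed),
        "rush_allowed": elapsed >= 2.0,
        "line_color": "red" if elapsed >= 2.0 else "blue",  # illustrative colors
        "audible_alert": elapsed in (2.0, 4.0),             # marker beep points
        "qb_must_release": elapsed >= 4.0,
    }

for t in (0.0, 2.0, 3.5, 4.0):
    print(t, go_clock_state(t))
```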
Beyond the Tech: Storytelling, NFL Network Integration
Aside from the production elements, the AFFL also offers a host of great storytelling opportunities surrounding the squads of Average Joes on the field. McNeely, who knows a thing or two about telling the stories of unknowns on the field, having produced a dozen Little League World Series for ESPN, sees the AFFL as a one-of-a-kind storytelling opportunity.
“These aren’t pro names or pro teams; you’re starting from scratch telling those stories. There are a lot of great stories and personalities with layers — [such as] a 50-year-old, 5-ft.-8 quarterback with a potbelly leading the team from Tennessee or one of the amazing athletes who fell short of the NFL but played in the CFL or the Arena League,” says McNeely. “When I first met [AFFL CEO/founder] Jeff Lewis, who has worked so closely with Jonathan and all of us to develop this, he mentioned what a huge fan he was of Little League World Series. And he promised us all the access we needed so that we would be able to introduce these players and tell their stories.”
NFL Network’s commitment to the AFFL goes well beyond just televising 11 games, however. Not only do the telecasts feature NFL Network talent like Good Morning Football’s Kay Adams (serving as sideline reporter throughout the tournament) and NFL Total Access host Cole Wright (calling play-by-play on July 14), the network is also incorporating AFFL segments into its daily studio programming, social-media channels, and digital outlets in an effort to appeal to football-hungry fans during the NFL offseason.
“We really feel like there’s a huge opportunity here during the summer, when the NFL really has nothing going on,” says McNeely. “We’re excited to see some traction with social media and on the NFL Network. They are doing a lot to promote [the AFFL] on their studio shows, and we’re hoping it takes off. I think there will be a grassroots push for this similar to what you’ve seen with the Little League World Series.”
June 29, 2018
Sports Video Group
While the broadcast debut of Dale Earnhardt Jr. in the NASCAR on NBC booth is creating plenty of buzz around NBC’s first races of the season this weekend at Chicagoland Speedway, the uber-popular retired driver isn’t the only new addition to the network’s NASCAR coverage this year. Echoing its rink-side “Inside the Glass” position on NHL coverage, NBC will debut the Peacock Pit Box – a remote studio set built within a traditional pit box frame that will be located along pit road for pre- and post-race coverage at each speedway throughout the season.
“The Peacock Pit Box is going to put us in the middle of the action,” says NBC Sports Group Executive Producer Sam Flood. “We’ve had the big set down on the grid for the first three years of [our NASCAR rights] contract. We realized that sometimes the fans departed from that area as we got closer to race time and took away some of the sense of place. So the idea was to have a real sense of place throughout the day, starting with the pre-race show. And most importantly, it gives us a place inside that mayhem that is pit road, which has become one of the most exciting places at the racetrack each week.”
Inside the Peacock Pit Box: Two Levels With Plenty of Tech Firepower
The 14-ft.-long x 12.5-ft.-wide Peacock Pit Box (a normal-sized NASCAR pit box is 10×8 ft.) features two levels and is located in a traditional pit box right along pit road. In addition to serving as the home to NASCAR on NBC’s pre-race coverage throughout the season, the structure also features an arsenal of robotic cameras that will aid in NBC’s coverage of pit road throughout each race.
“Sam [Flood] and Jeff [Behnke, VP, NASCAR production, NBC Sports Group] first had the vision and then there were a lot of great creative and technical people that helped to bring it to life,” says NBC Sports Technical Manager Eric Thomas. “They wanted to give our announcers a unique vantage point of the field of play – and that’s obviously pit lane. It’s like the 50-yard line in football or center ice in hockey. Our [announcers] will have an elevated position between all the teams right in the middle of the action, so they not only can see the racetrack but also see the competitors on either side of them.”
The NASCAR on NBC team worked with the NBC Sports Group design team in Stamford, CT, to design the Peacock Pit Box, while Nitro Manufacturing built the structure and Game Creek Video provided technical support and equipment.
The top level of the Peacock Pit Box will serve as the primary home for NBC Sports’ Monster Energy NASCAR Cup Series and Xfinity Series pre- and post-race coverage, with host Krista Voda and analysts Kyle Petty and Dale Jarrett occupying the desk. One handheld and three robotic cameras will be on hand for pre/post-race shows.
“It’s a nice dance floor that can support our announcers and various different configurations,” says Thomas. “We have to work within the space of the pit stall, which depends on the track. We have neighbors on either side of us, so we want to really be respectful of the teams and not interfere with them whatsoever. So we’re going to fit in our space very neatly and very cleanly without having an impact on the actual event. We wanted to make it as big as we could to make our announcers as comfortable as possible and also provide the technical equipment to produce a quality show.”
Meanwhile, the lower level of the Pit Box will provide additional broadcast positions with two wired cameras and, occasionally, an RF camera and/or a small jib (depending on the size of the pit box at each track). The space features interactive displays and a show-and-tell position for analysts like Daytona 500-winning crew chief Steve Letarte to deliver deeper analysis of the track action.
“The technology will be there for Steve to [provide deeper analysis], particularly in the Xfinity races, where he’s going to be hanging down on pit road in a pit box, restarting his old career of looking at the race when you only can see half the racetrack on pit road,” says Flood. “We think by [locating] Steve [there], it will give him more opportunity to focus that unique mind of his on what the heck all the other cars are doing on the track. So we see that as a huge advantage.”
The lower level also features a patio position where NBC will look to conduct interviews with drivers, pit crew chiefs, owners, and NASCAR officials throughout its race coverage.
All About Flexibility: Nine Robo Positions Give NBC Plenty of Options
Since NBC’s pre- and post-race setup will vary week-to-week depending on the track, Thomas and company were tasked with making the Peacock Pit Box as versatile as possible. With that in mind, the upper level features nine different robotic camera positions. Three robos can be deployed at a time and – thanks to the small, lightweight cameras and custom-developed camera mounts deployed on the Pit Box – the operations team can quickly swap camera positions at any time during NBC’s coverage.
“If our director wants to change the shot or we want to totally rotate 180 degrees, we can do that in about 10 minutes,” says Thomas. “If we want to do a show with the track in the background first and then, a few minutes later, we want to look toward the garage with a different set of announcers, we can move the cameras quickly and make that happen. So it’s very flexible.”
In addition to being used for pre- and post-race studio coverage, these robos will be utilized for coverage of the action on pit road throughout NASCAR on NBC telecasts.
“The cameras are going to pull double duty because, if something’s going on in pit lane, those cameras are still going to physically be there. So they are going to give us some different angles that we haven’t seen very much of in the past,” says Thomas. “We’ve tried to create as much flexibility as possible so when Sam and Jeff ask, ‘can we do this?’, then we can say, ‘of course you can.’”
BatCam Returns: Aerial System Headlines NBC’s Army of Cameras
NBC Sports will deploy an average of 55 cameras – including the return of the BatCam point-to-point aerial system to cover the backstretch – on big races at Daytona, Indianapolis, and Homestead-Miami this season. Thomas also expects to use BatCam, which debuted last year and can hit speeds of more than 100 mph, at the Watkins Glen road course this year. The BatCam also drew rave reviews throughout NBC’s Triple Crown coverage this past spring.
The bulk of NBC’s camera complement for NASCAR is made up of Sony HDC-4300’s along with a mix of robos (provided by Robovision) and roving RF cameras. BSI will once again be providing eight RF in-car-camera dual-path systems, which allow two angles to be transmitted from each car at any given moment. Thomas also says his NASCAR on NBC team is currently experimenting with several new camera positions, which he expects to roll out throughout the season.
Going Inside the Action With New Graphics, Analysis Tools
NBC is utilizing SMT’s tools for the fourth straight NASCAR season. This year, the SMT race crawl has been updated to show the live running order and driver statistics at the traditional position on top of the screen and in a new vertical pylon display on the left side. The multiple options provide production with a variety of ways to allow fans to track each driver.
Also new this year is the SMT GOTO interactive touchscreen display, which provides several tools NBC can use throughout each race weekend, giving on-air analysts the ability to telestrate highlights, compare drivers and statistics, and interact with fans on social media.
SMT’s new Broadcast Analytics system has also been added to help enhance the coverage. The system live-tracks all the cars during each session and allows the production to show a virtual replay of any lap run by any driver during practice, qualifying, and the race. It can provide a combined display of how a single driver ran on different laps, showing the changes made during the session, or show how different drivers ran the same lap. All of these options will allow fans to see the key moments of each session and better understand how they affected where each driver finished.
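A rough sketch of the underlying idea (hypothetical data layout; SMT's Broadcast Analytics is proprietary): once each car's position is logged against lap distance, any two laps, from the same driver or different drivers, can be resampled onto a common distance axis and compared point by point:

```python
# Hypothetical sketch of lap comparison from position logs; each log is a
# list of (lap_distance_m, elapsed_s) samples recorded for one lap.

def time_at(log, distance):
    """Linearly interpolate elapsed time at a given lap distance."""
    for (d0, t0), (d1, t1) in zip(log, log[1:]):
        if d0 <= distance <= d1:
            return t0 + (t1 - t0) * (distance - d0) / (d1 - d0)
    return log[-1][1]

def compare_laps(lap_a, lap_b, step=100.0, lap_length=2400.0):
    """Time gap (lap_b minus lap_a) at regular distance marks around the lap."""
    marks = [i * step for i in range(int(lap_length // step) + 1)]
    return [(d, time_at(lap_b, d) - time_at(lap_a, d)) for d in marks]

lap_a = [(0, 0.0), (1200, 15.2), (2400, 29.9)]  # e.g., a qualifying lap
lap_b = [(0, 0.0), (1200, 15.6), (2400, 30.4)]  # e.g., a race lap by another driver
for d, gap in compare_laps(lap_a, lap_b, step=1200):
    print(f"{d:>6.0f} m  gap {gap:+.2f} s")
```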
In the Compound and Back Home in Stamford
Game Creek Video’s PeacockOne (A and B units) will once again serve as the home to the NASCAR on NBC production team on-site, while an additional pair of Game Creek trucks will house mix effects and editing, as well as robo operations and tape release. In all, NASCAR truck compounds will be stocked with an average of 19 trailers (including BSI, Sportvision, NASCAR operations, and more).
“NASCAR does a great job setting up the compounds for us and providing a beautiful sandbox for us to play in,” says Thomas.
In addition, the NBC production team continues to rely more and more on file-sharing with the NBC Broadcast Center in Stamford, CT. AT&T and PSSI have partnered to establish fiber connectivity at the majority of the NASCAR tracks and will provide NBC with a circuit back to Stamford for file transfer, as well as for home-running individual cameras for at-home productions. Pre- and post-race shows from the Peacock Pit Box will regularly send cameras back to a control room in Stamford, where the show will be produced.
“We started [producing shows out of Stamford] last year and we will expand it more this year,” says Thomas. “It worked well last year and we’re making some improvements this year to make it even more seamless. With the increased support from AT&T and PSSI for network connectivity, I think it’s going to be even better this year. Obviously there are big cost savings on travel [as a result], but the product is of the same quality – so it’s really a win-win.”
SMT (SportsMEDIA Technology) continues its collaboration with the American Flag Football League (AFFL) to provide game management technology for the AFFL’s first U.S. Open of Football Tournament (USOF). The teams playing in the Ultimate Final at BBVA Compass Stadium in Houston will battle for a $1 million cash prize. SMT technical teams will be onsite at the USOF Tournament for every game, providing the customized virtual and clock-and-score technology and graphics package that helped to define the league last year during its launch on June 27 at Avaya Stadium. Retired NFL stars return to the field to captain the teams, along with basketball legends and an Olympic gold medalist. SMT’s virtual 1st & Ten line system, used in NFL broadcasts, will be deployed from its Camera Tracker system, working in tandem with SkyCam to give viewers the “Madden-style” play-by-play angle used during NBC Sports’ 2017 season. SMT’s virtual Go Clock, designed specifically for the fast-paced AFFL, will synchronize with in-stadium displays to indicate when the defense can rush the quarterback.
SMT’s Design Studio designed and implemented the AFFL graphics package — including show open and score bug — and the virtual-graphics package. SMT’s clock-and-score technology is made available via its dual-channel SportsCG, a turnkey graphics-publishing system that allows greater autonomy via a second-channel laptop that can be operated remotely. In addition to producing the score bug, the SportsCG offers real-time, in-game offensive and defensive statistics powered by SMT QB Stats, the same system SMT uses for NCAA and NFL games. “SMT is proud to have helped the AFFL launch a new sports era, and we are thrilled to build on last year’s great success by offering flag football fans the same platform they’re used to when watching college and NFL games,” says Ben, SMT Business Development Manager. “With the debut of our dual-channel SportsCG, we can decrease the production bottleneck associated with rendering graphics on-air, allowing the quickly developing storylines to be told in a more dynamic way.”
June 17, 2018
Sports Video Group
The 2018 U.S. Open from Shinnecock Hills Golf Club gave the Fox Sports team challenges in production planning that led to innovations, the opportunity to refresh old workflows and core infrastructure, and a chance to chart some new directions for golf coverage.
The front-bench area in Game Creek Video’s Encore truck is at the center of Fox Sports’ U.S. Open coverage.
Game Creek Video’s Encore production unit is at the center of the coverage for Fox and FS1 with Game Creek Pride handling RF-video control and submix and providing a backup emergency control room. Pride’s B unit is handling production control for one of the featured groups, Edit 4 is handling all iso audio mixes, and Edit 2 is home to five edit bays with equipment and support provided by CMSI. And there is also the 4K HDR show, which is being produced out of Game Creek Maverick.
“All the Sony 4300 cameras on the seventh through 18th greens are 4K HDR-native with a secondary output at 720p SDR,” says Brad Cheney, VP, field operations and engineering, Fox Sports. There are also six Sony PXW-Z450’s for the featured holes and featured group, the output of two of them delivered via 5G wireless.
“We are producing two 4K HDR shows out of one mobile unit with four RF-based 4K cameras,” he adds. “That is another big step forward.”
In terms of numbers, Fox Sports has 474 technicians onsite, making use of 38 miles of 24-strand fiber-optic cable to produce the event captured by 106 cameras (including 21 wireless 1080p, 21 4K HDR units, six 4K HDR wireless, three Inertia Unlimited X-Mo cameras shooting at 8,000 fps, a Sony HDC-4800 at 960 fps, and three Sony HDC-4300’s at 360 fps) and 218 microphones. Tons of data is being passed around: 3 Gbps of internet data is managed, along with 83 Gbps of broadcast data, 144 TB of real-time storage, and 512 TB of nearline storage.
A Second Compound
Each course provides its own unique challenges. At Shinnecock Hills, there are the roads running through the course, not to mention the hilly terrain, which also has plenty of deep fescue. But, from a production standpoint, the biggest issue was the small space available for the compound.
Director, Field Operations, Sarita Meinking (left) and VP, Field Operations and Engineering, Brad Cheney are tasked with keeping Fox Sports’ U.S. Open production running smoothly.
“We came out here 18 months ago,” says Cheney, “and, when we placed all of our trucks in the compound map, [they] didn’t fit, and that is without the world feed, Sky, TV Asahi, and others. At Erin Hills last year, we had a support tent, and that gave our camera crew more space, dry storage, and a place to work.”
The decision was made to expand on what was done at Erin Hills last year: move the production operations that most benefit from being close to the course to a large field tent located along the third hole. The field tent is about a half mile from the main compound and is home to the technology area (shot-tracing technologies, etc.); the camera, audio, and RF areas; and the robotic cameras provided by Fletcher. Inertia Unlimited President Jeff Silverman is also located in the tent, controlling X-Mo cameras as well as robotic cameras that can be moved around the course to provide different looks.
Cheney says the team took the field tent to a new level by providing an integrated source of distribution and monitoring so that it could effectively be an island to itself. “It has worked out well. People are comfortable there. It’s dry and offers direct access to the course.”
According to Michael Davies, SVP, technical and field operations, Fox Sports, some of the operations in the field tent, such as those related to enhancements like shot tracing and the Visual Eye, could ultimately move even farther from the main compound.
“Typically, they would be in the main compound,” he explains, “but, once we figured out how to connect the two compounds via fiber for a half mile, it [indicates] how far away you can put things [like the shot-tracking production]. It gets the mind going, especially for events like this that can be hard to get to.”
Fox Fiber Technician Bryce Boob (left) and Technical Producer Carlos Gonzalez inside the fiber cabin
Also located closer to the course is the fiber cabin, a move that allows the team to more quickly deal with any connectivity issues on the course. The 37 miles of fiber cable used across the course is monitored in the cabin, and Carlos Gonzalez, technical producer, Fox Sports, and the team troubleshoot and solve any issues.
“We’re isolated from the compound, which can make it a challenge,” he notes, “but we are actually liking it.”
Cheney says that placing the cabin closer to the course means a reduction in the amount of outbound fiber and also makes the operation a true headend. “It’s something that we will continue to do at Pebble next year [for the 2019 U.S. Open] because of the setup there. This has been another good learning experience for us.”
Steps Forward
One big step taken in preparation for the 2018 events was rebuilding the IP router in Encore from scratch.
“All of the programming in the router was there since day one [in 2015], and we have found new ways to do things,” says Cheney. “To strategically try to pull things out of it just wasn’t worth it. So we started from zero, and it paid off in terms of how quickly we could get up and running.”
Also playing an important part in enhancing the workflows were CMSI and Beagle Networks, which made sure networks and editing systems were all ready to go.
“The team from CMSI and Beagle Networks has been phenomenal in wiring up our networks and making sure it’s robust and all-encompassing,” says Cheney. “We also figured out new ways with IP to control things, move signals, and offer better control for our operators no matter where they are.”
RF wireless coverage this year is being provided completely by CP Communications. There are 26 wireless cameras on the course plus 18 wireless parabolic mics and nine wireless mics for talent on the course. All the signals are run via IP Mesh control systems, and CP Communications also provided all the fiber on the course.
The 5G setup includes a 5G cell mounted on the tower connected to processing gear on the back of a buggy.
Fox Sports is at the forefront of wireless innovation, working with Ericsson, Intel, and AT&T on using next-generation 5G wireless technology to transmit 4K HDR signals from Sony PXW-Z450 cameras to the compound. The 4K cameras are wired into an Ericsson AVP encoder, which sends an IP signal to an Intel 5G MTP (Mobile Trial Platform), which transmits the signal in millimeter wave spectrum via a 28-GHz link to a 5G cell site mounted to a camera tower. That cell site is connected to the Fox IP Network and, in the production truck, to an Ericsson AVP that converts the signal back to baseband 4K.
The potential of 5G is promising, according to Cheney. First, the delay is less than 10 ms, and, conceptually, a 10-Gbps (or even 20-Gbps) 5G node could be placed in a venue and the bandwidth parsed out to different devices, such as cameras, removing the need for cabling.
“You can fully control the system as a whole versus allowing direct management on the device level,” he says.
And, although the current setup requires a couple of racks of equipment, the form factor is expected to get down to the size of a chip within a year.
Expanding Innovation
In terms of production elements, Fox Sports’ commitment to ball-tracing on all 18 holes continues in 2018, with the network equipping each tee box with Trackman radar technology. Eight holes are equipped to show viewers a standard ball trace over live video, with enhanced club and ball data. The other 10 holes have Fox FlightTrack, a live trace over a graphic representation of the golf hole, offering more perspective to the viewer.
Beyond tee-shot tracing, three roaming RF wireless cameras are equipped with Toptracer technology, providing trace on approach shots. And new this year is FlightTrack for fairway shots on two holes, Nos. 5 and 16.
Zac Fields, SVP, graphic tech and innovation, Fox Sports, says the goal next year is to expand the use on fairways. “We want to do more next year and also find a way to use that on taped shots as well.”
Virtual Eye, the system at the core of FlightTrack that takes a 3D model of a hole and uses shot data from SMT as well as from the Trackman and Toptracer shot-tracking systems to show the ball flight within the 3D model, has also been expanded. The Virtual Eye production team began its U.S. Open preparation a couple of months back by flying a plane over the course and capturing photos to map the topography. Then, a few weeks ago, a helicopter shot video of the course, and pictures were extracted from the video and laid over the topographical images.
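Conceptually, the core operation is mapping measured launch data into the hole's 3D model coordinates. The toy sketch below uses idealized, drag-free ballistics and invented numbers (nothing like the real Virtual Eye renderer) just to illustrate sampling a ball flight in course coordinates:

```python
import math

# Toy sketch: sample a tee-shot trajectory as (x, y, z) points in course
# meters from launch data. Real systems use measured radar trajectories,
# not idealized physics.

def trajectory(ball_speed, launch_deg, azimuth_deg, g=9.81, steps=20):
    v_xy = ball_speed * math.cos(math.radians(launch_deg))
    vz = ball_speed * math.sin(math.radians(launch_deg))
    vx = v_xy * math.cos(math.radians(azimuth_deg))
    vy = v_xy * math.sin(math.radians(azimuth_deg))
    flight_time = 2 * vz / g
    return [
        (vx * t, vy * t, vz * t - 0.5 * g * t * t)
        for t in (flight_time * i / steps for i in range(steps + 1))
    ]

# a ~75 m/s drive launched at 11 degrees, aimed 2 degrees right of the target line
points = trajectory(75.0, 11.0, 2.0)
print(f"carry of roughly {points[-1][0]:.0f} m down the hole line")
```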
The FlightTrack team is located inside the field tent, making it easier to hit the course and fix any issues related to shot-tracking technology.
One of the goals, says Ben Taylor, operations manager, Virtual Eye, has been to make the system more automated and to allow it to be used on taped shots. For example, the EVS-replay users themselves can now trigger Virtual Eye to be active with the push of a button. And, when the ball comes to a rest, the graphic slides off the screen.
“The system will reset in the background after the shot,” he notes.
Fields and the Fox team have been happy with the performance, particularly the ability for EVS operators to control the graphic overlay. “It’s pretty slick,” he says. “The system takes the EVS feed and runs it through the graphics compositor and then back into the EVS, so the EVS system is recording itself. It seems complex, but, once the operator gets used to it, it’s easy. And now they can do FlightTrack a lot more.”
When Fox Sports took on the challenge of the U.S. Open in 2015, the industry watched to see how it would change the perception of golf coverage. Four U.S. Opens later, it is clear that the innovative spirit that has been part of Fox Sports since its early days continues unabated, especially as the era of sports data takes hold of the visualization side.
“We want to bring the CG world into our coverage and create animations to tell stories like comparing every tee shot a player took on a certain hole or comparing Dustin Johnson’s fade with another player’s draw,” says Fields. “And now we can show how the wind will affect a shot.”
June 8, 2018
Sports Video Group
With the second Triple Crown in just four years on the line, NBC Sports Group is pulling out all the stops for coverage of this weekend’s 150th Belmont Stakes. With Justify poised to capture the final gem of the Triple Crown, NBC Sports Group has boosted its production complement, adding a second onsite studio set, live pointer graphics to identify Justify on the track, and five additional cameras, including the Bat Cam aerial system that drew rave reviews at both the Kentucky Derby and the Preakness Stakes.
“Once Justify won Preakness, we knew what we were in for, and we started putting everything in motion right away,” says Tim Dekime, VP, operations, NBC Sports Group. “The [equipment levels] were increased a good bit, and we added all the bells and whistles. It means a lot more work and preparation, but it’s very exciting for us, and we are very well-prepared.”
All Eyes on Justify: More Cameras and Virtual Tracking Graphics
NEP’s ND1 (A, B, C, and D units) mobile unit will once again be on hand to run the show, with a total of 43 cameras deployed — up from 33 for last year’s non-Triple-Crown race. Besides the Bat Cam aerial system covering the backstretch, the camera arsenal includes a Sony HDC-4800 4K camera (outfitted with a Canon UHD 86X lens) on the finish line, five HDC-4300s running at 6X slo-mo and five more running at 60 fps, 14 HDC-2500s (eight hard, six handheld), five HDC-1500s in a wireless RF configuration (provided by BSI), a bevy of robos (provided by Fletcher) and POVs, and an aerial helicopter (provided by AVS, weather permitting).
Ready for a Triple Crown effort at Belmont: (from left) NEP’s John Roché and NBC Sports Group’s Keith Kice and Tim Dekime
Five other cameras have been added because of the Triple Crown possibility: a POV camera at Justify’s gate and one in the PA booth with announcer Larry Collmus (which will be streamed live on the NBC Sports App), a robo to capture a 360° view of the paddock, an additional RF camera roaming the grounds, and, most notably, the Bat Cam system.
In addition to more cameras, NBC plans to use SMT’s ISO Track system to identify Justify with a virtual pointer graphic live during the race. The system will incorporate real-time data — speed, current standing, and distance from finish line — into the on-air pointer graphic, helping viewers follow Justify and other key horses throughout the day’s races.
“We’ll have a live pointer that tracks Justify during the race that our director [Drew Esocoff] will insert, if needed, [so] the horse will be tracked for the viewers watching at home,” says Coordinating Producer Rob Hyland. “It will have a little arrow pointing to where he is at certain points in the race.”
Bat Cam Covers the Back Stretch
The Bat Cam was a hit at both Churchill Downs and Pimlico, providing a never-before-seen view of the backstretch and also coming in handy when rain and fog complicated matters for NBC at both the Derby and the Preakness. The two-point cable-cam system can travel 80 mph along the backstretch, running 15-18 ft. above the ground.
“NBC had already used the Bat Cam on NASCAR, so we knew what to expect at the Derby, and it was just a matter of figuring out how to implement it into our show,” says Keith Kice, senior technical manager, NBC Sports. “It’s turned out to be a great [tool for us], especially at [the Preakness]. Even if it wasn’t for all the fog, the infield [at Pimlico] with all the tents and stages and infrastructure makes it very difficult; you really need the Bat Cam just to cover the backstretch because you can’t see it otherwise.”
Given the massive size of the Belmont track, the Bat Cam will cover more ground than at either of the two prior races but will not cover the entire backstretch. The system will run 2,750 ft. of the 3,000-ft. backstretch — more than 700 ft. longer than at the Kentucky Derby and 500 ft. longer than at the Preakness Stakes.
“The length of the backstretch was definitely a challenge in getting the Bat Cam unit [installed],” says Dekime. “But the benefit here as opposed to Preakness is that there’s nothing in the infield the way that it’s one big party at Pimlico. We are unencumbered, so that’s a positive.”
Although NBC and the Bat Cam team were forced to bring in larger cranes at Belmont in order to install the longer system, says NEP Technical Director John Roché, setup and operation of the Bat Cam has improved significantly since the Derby.
“It’s no longer a science experiment like it was before,” he says. “We’re able to get [Bat Cam owner/operator] Kevin Chase all the gear that they need, and they are able to give us what we need pretty easily in terms of terminal gear, intercoms, and everything. It’s pretty much plug-and-play now.”
Hyland adds that the Bat Cam “will not only cover the backstretch of the race but will also provide dramatic reset shots of this vast facility. When the Triple Crown is on the line at Belmont, the energy in this venue is electric, and we want to capture the sense of place.”
Triple Crown Chance Warrants Double the Sets
Besides additional cameras because of the Triple Crown potential, NBC Sports has also added a second studio set. Host Mike Tirico and analysts Randy Moss and Jerry Bailey will man the 18- x 18-ft. set at the finish line, and a secondary 24- x 24-ft. stage located near Turn 2 will feature host Bob Costas and other on-air talent.
“If it was not going to be a Triple Crown, we would likely be down to just the finish-line set,” says Dekime, “but, now that it is, we’ve put the Turn 2 set back into operation.”
SMT’s Betting and Social Media GOTO videoboard will also be situated at the main set for handicapper Eddie Olczyk, who will use the interactive touchscreen for real-time odds and bet payouts for all races throughout the day. The betting touchscreen will also enable him to explain to viewers how he handicaps specific races.
In addition to the onsite sets, NBC plans to incorporate several live remote feeds into the telecast, including from Churchill Downs.
“We brought out all of the tools to showcase the Triple Crown attempt, including a number of remotes that will carry live shots from Churchill Downs, where it all began five weeks ago,” says Hyland. “There will be hundreds of people gathered watching the race. We may have a live remote shot from a Yankees-Mets game just a few miles away. We’re working on a couple other fun ones as well, just to showcase this day and this athletic achievement, should it happen.”
Looking Back at a Wet and Wild Triple Crown Campaign
Although the horse-racing gods have granted NBC the potential for a Triple Crown this weekend — and the big ratings that go along with it — the weather gods have not been so kind. After the wettest Kentucky Derby on record and the foggiest Preakness Stakes in recent memory, a chance of rain remains in the forecast for Saturday. However, Roché notes that the proliferation of fiber and the elimination of most copper cabling onsite have significantly reduced weather-related issues.
“Despite torrential downpours on the first two races, we’ve been really fortunate,” says Roché. “And no matter what happens here [in terms of rain], we’re getting a little spoiled having two Triple Crowns in [four] years after a 37-year drought. For us to be able to have an opportunity to show the public how we cover racing, especially with the addition of Bat Cam, in a Triple Crown situation is really an honor.”
Kice seconds that notion: “Having a Triple Crown [in play] makes all the hard work and troubles we went through with the weather and logistics on the first two races even more worthwhile.”
June 6, 2018
Sports Video Group
SMT will provide fan-engagement technology solutions for NBC Sports Group’s broadcast of the 150th Belmont Stakes. This year marks the eighth consecutive Triple Crown collaboration between SMT and NBC Sports Group and is particularly exciting as Justify seeks to become only the second horse since 1978 to win a Triple Crown.
Much like the Preakness Stakes and the Kentucky Derby, SMT’s suite of products will engage viewers from gate to finish with real-time, data-driven graphics, up-to-the-second odds, and commentator analysis.
SMT’s Live Leaderboard System highlights the running order of the top six horses using positional data updated 30 times per second per horse, ensuring accuracy and speed for SMT’s on-air graphic presentation.
SMT’s ISO Track system identifies the horses and incorporates real-time data such as speed, current standing, and distance from finish line into an on-air pointer graphic, helping viewers follow the action during the race.
SMT’s ticker produces an on-air display of real-time odds and bet payouts using live data from the race’s Tote provider (in-house wagering system). The ticker also curates and visually displays social media feeds that give followers an inside look at happenings at the track.
SMT’s Track Map System gives viewers a display of the lead horse’s real-time position and split times via an on-screen graphic.
SMT’s Betting and Social Media GOTO video board features real-time odds and bet payouts for all the races throughout the day. The system provides an interactive system for talent to explain the process of horse wagering.
The Data Matrix Switchboard (DMX) provides a customized solution for each Triple Crown race, absorbing, collating, and synchronizing live data feeds into SMT’s proprietary horse-racing database. The DMX integrates live data for on-air and off-air graphics in real-time and replay modes, enhancing NBC’s live race presentation and pre- and post-race analysis. These displays also feature real-time advanced odds and minutes-to-post countdowns.
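As a rough illustration of the kind of per-horse data those systems consume (positions arriving roughly 30 times per second), here is a minimal Python sketch that assembles the data behind an ISO Track-style pointer: speed, current standing, and distance to the finish. The field names, course length, and sample values are assumptions for illustration, not SMT's actual schema.

# Illustrative sketch of per-horse tracking data feeding a pointer graphic.
# Field names, course length, and sample values are assumptions, not SMT's schema.
from dataclasses import dataclass

@dataclass
class HorseSample:
    name: str
    distance_covered_m: float  # distance run so far along the course
    speed_mph: float

RACE_LENGTH_M = 2400.0  # placeholder; the Belmont Stakes is run over 1.5 miles

def pointer_payload(samples: list, tracked: str) -> dict:
    """Assemble speed, current standing, and distance to finish for the tracked horse."""
    order = sorted(samples, key=lambda s: s.distance_covered_m, reverse=True)
    standing = next(i for i, s in enumerate(order, start=1) if s.name == tracked)
    horse = next(s for s in samples if s.name == tracked)
    return {
        "horse": tracked,
        "speed_mph": horse.speed_mph,
        "standing": standing,
        "distance_to_finish_m": max(RACE_LENGTH_M - horse.distance_covered_m, 0.0),
    }

samples = [HorseSample("Justify", 1800.0, 37.5), HorseSample("Bravazo", 1790.0, 36.8)]
print(pointer_payload(samples, "Justify"))  # standing 1, 600 m to the finish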
“With a Triple Crown in play for the second time in four years, SMT has another unique chance to help document a historic moment,” says Ben Hayes, Manager, Client Services, SMT. “Our systems help novice race fans understand the core aspects of the sport, while also providing in-depth betting and live race analysis for racing aficionados.”
April 24, 2018
Golf Channel
World No. 1 Justin James, Defending Champion Ryan Reisbeck & 2013 Volvik World Long Drive Champion Heather Manfredda Headline First Televised Event of 2018 from Long Drive’s Most Storied Venue
Veteran Sports Broadcaster Jonathan Coachman Making Golf Channel Debut; Will Conduct Play-by-Play at Each of the Five Televised WLDA Events in 2018
Eight men and four women have advanced to compete in tonight’s live telecast of the Clash in the Canyon World Long Drive Association (WLDA) event, airing in primetime from Mesquite, Nevada, at 7 p.m. ET on Golf Channel. In partnership with Golf Mesquite Nevada and taking place at the Mesquite Regional Sports and Event Complex, the competitors headlining the first televised WLDA event of 2018 are World No. 1 Justin James (Jacksonville, Fla.), defending Clash in the Canyon champion Ryan Reisbeck (Layton, Utah), and 2013 Volvik World Long Drive champion Heather Manfredda (Shelbyville, Ky.).
A familiar setting in World Long Drive, Mesquite previously hosted the Volvik World Long Drive Championship and a number of qualifying events dating back to 1997; the World Championship was staged at the same venue as the Clash in the Canyon from 2008 to 2012.
FORMAT: The eight men advanced from Monday’s preliminary rounds, which featured a 36-man field, and will compete in a single-elimination match-play bracket during tonight’s live telecast. The four women advancing from this morning’s preliminary rounds (18-person field) will also compete in a single-elimination match-play bracket this evening to crown a champion.
COVERAGE: Live coverage of the Clash in the Canyon will air in primetime on Golf Channel from 7-9 p.m. ET tonight, with Golf Central previewing the event from 6-7 p.m. ET. An encore telecast also is scheduled to air later this evening on Golf Channel from 11 p.m.-1 a.m. ET. Fans also can stream the event live using the Golf Channel Mobile App, or on GolfChannel.com.
The production centering on live coverage of the competition will utilize six dedicated cameras, capturing all angles from the hitting platform and the landing grid, including a SuperMo camera as well as two crane-positioned cameras that will track the ball in flight once it leaves the competitor’s clubface. New for 2018 will be an overlaid graphic line on the grid, the “DXL Big Drive to Beat” (similar to the “1st & 10 line” made popular in football), displaying the longest drive of a given match to signify the distance an opposing competitor will need to surpass to take the lead. The telecast also will feature a custom graphics package suited to the outsized swing data typically generated by Long Drive competitors, tracking club speed, ball speed, and apex in real time via Trackman. Trackman technology also will provide viewers with a sense of ball flight, tracing the arc of each drive from the moment of impact.
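For a sense of what “apex” means in that data, here is a first-order sketch that estimates peak height from ball speed and launch angle alone. It ignores drag and lift, which real launch monitors such as Trackman account for, and the input numbers are hypothetical.

# First-order apex estimate from ball speed and launch angle, ignoring drag and lift.
# Real launch monitors model aerodynamics; this is a simplified illustration only.
import math

def apex_height_ft(ball_speed_mph: float, launch_angle_deg: float) -> float:
    """Peak height of an idealized projectile: (v * sin(theta))^2 / (2 * g)."""
    v_ms = ball_speed_mph * 0.44704                       # mph -> m/s
    vz = v_ms * math.sin(math.radians(launch_angle_deg))  # vertical launch speed
    return (vz ** 2 / (2 * 9.81)) * 3.28084               # apex in meters -> feet

# Hypothetical long-drive numbers: 210 mph ball speed, 12-degree launch angle.
print(f"{apex_height_ft(210, 12):.0f} ft")  # ~64 ft under these simplified assumptions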
BROADCAST TEAM: A new voice to World Long Drive, veteran sports broadcaster Jonathan Coachman will conduct play-by-play at each of the five WLDA televised events on Golf Channel in 2018, beginning with the Clash in the Canyon. Art Sellinger – World Long Drive pioneer and two-time World champion – will provide analysis, and Golf Channel’s Jerry Foltz will offer reports from the teeing platform and conduct interviews with competitors in the field.
DIGITAL & SOCIAL MEDIA COVERAGE: Fans can stay up-to-date on all of the action surrounding the Clash in the Canyon by following @GolfChannel and @WorldLongDrive on social media. Golf Channel social media host Alexandra O’Laughlin is on-site, contributing to the social conversation as the event unfolds, and the telecast will integrate social media-generated content using the hashtag #WorldLongDrive.
In addition to the latest video and highlights from on-site in Mesquite, www.WorldLongDrive.com will feature real-time scoring. Golf Channel Digital also will feature content from the Clash in the Canyon leading up to and immediately following the live telecast.
Coming off record viewership in 2017 and a season fueled by dynamic emerging personalities, the Clash in the Canyon is the second official event of the 2018 World Long Drive season; Justin Moose claimed the first, the East Coast Classic in Columbia, South Carolina, last month.
Showcasing the truly global nature of World Long Drive, several events will be staged in 2018 through officially sanctioned WLDA international partners, including stops in Germany, Japan, New Zealand, and the United Kingdom. Additionally, an all-encompassing international qualifier will be staged in late summer, offering a minimum of four exemptions into the Open Division of the Volvik World Long Drive Championship in September.
April 15, 2018
Boston.com
The light at the end of the tunnel for Boston Marathon runners making the final turn onto Boylston Street will be shining a little brighter this year. One of the changes the Boston Athletic Association made to the finish line for Monday’s 122nd running of the race is a new digital display board, affixed to the photo bridge above the finish line, that will be visible even if the forecasted rain falls.
“The finish times are going to be displayed big and bright and in color on that video board so that the participants and the spectators on Boylston Street will be able to see from afar what the time is,” said Jack Fleming, Chief Operating Officer of the B.A.A.
For their first year with the new board, which is similar to those that ring Gillette Stadium or TD Garden, the race organizers intend to go with a conservative approach and minimal animation. On Friday, it displayed a countdown clock for Saturday’s 5K and on Sunday it will show a tribute to One Boston Day. But the digital display opens up a new path forward for the finish line, and Fleming said that the B.A.A. could use lights and sound to enhance the spectator experience in the years to come.
“Boylston Street is like the home stretch of the Kentucky Derby or when the team comes out of the tunnel in Gillette Stadium,” he said. “We want our participants to feel that same way.”
In 2021, during the 125th Boston Marathon, don’t be surprised if the roar of the crowd over the final 500 meters is set to a background beat. But Fleming said the aesthetic changes will be made in keeping with the tradition of the event. Of course, no matter what sounds are added, the loudest noise in the runners’ heads will always be the ticking of the clock.
To that end, the organizers swapped the old clock — suspended by cable and beam above the street — for two consoles with double-sided clocks facing the oncoming runners on one side and the world’s media on the other. The race tape will be suspended in between the two consoles, and after the elite runners break the tape it will be wheeled out of the way.
Dave McGillivray, the race director, said that runners will notice some changes this year and a few more next year, building towards 2021 when the B.A.A. plans to showcase the finish line as part of the quasquicentennial celebrations. For that race, the organizers are also considering a request for an increased field size or more ancillary events around the Marathon.
The Boston Marathon finish line: a painted strip across a city street that’s taken on a meaning far beyond that.
“Everything to do with 2013 showed us just how loved Boylston Street is by our participants, by our fans, by the neighborhood, by the community,” Fleming said. “So that was sort of the inspiration for taking some actions on it.”
March 23, 2018
Sports Video Group
Although augmented reality is nothing new to sports production — the 1st & Ten line celebrates its 20th anniversary this year — AR has taken a giant leap in the past three years and is dramatically changing the way stories are told, both on the field and in the studio.
From left: Turner Studios’ Zach Bell, Fox Sports’ Zac Fields, Vizrt’s Isaac Hersly, SMT’s John Howell, and ChyronHego’s Bradley Wasilition
At SVG’s Sports Graphics Forum this month, a panel featuring executives from Fox Sports, Turner Sports, The Future Group, ChyronHego, SMT, and Vizrt discussed best-use cases, platforms, and workflows for AR, as well as how its use within live sports coverage is evolving. The one principle the entire panel agreed on was that AR cannot be used for technology’s sake alone: these elements must be used to further the story and provide valuable information to fans.
“Our philosophy has always been to use [AR] as a storytelling tool. We try not to use it for technology’s sake – whether that is in a live event or in the studio,” said Zac Fields, SVP, graphic technology and innovation, Fox Sports. “The interesting thing is that people can interact with [AR] on their phones and are familiar with what AR is now. That puts the onus on us to present those elements at an even higher quality now. [AR has] become the norm now, and it’s just going to continue to grow. The tools are there for people to come up with new ideas. The one thing that I would hope is that we can make it easier [to use] moving forward.”
Fields’s desire for more user-friendly AR creation and integration was echoed throughout the panel by both users and vendors. Although a bleeding-edge AR project may be exciting and create a new experience for the fan, the goal is to create a solution that can be set up and used simply for every game.
“We’re trying to make sure that customers have ease of usability and repeatability every day,” said Isaac Hersly, director, business development, Vizrt. “It is an issue, and we are always looking for tools that are going to make it easier to set up and not need a rocket scientist. You [need to be able to] have someone that can operate the system very simply. That is our challenge, and we are always looking to come up with solutions to solve that.”
Turner Sports Brings Videogame Characters to Life With AR
Last year, Turner Sports teamed with The Future Group to introduce augmented reality to its ELEAGUE coverage. The two companies worked with Ross Video to create life-like incarnations of videogame characters, allowing fans tuning in to watch games like Street Fighter V or Injustice 2 to see these characters brought to life in the studio.
“I think creating AR characters from the games and bringing them to the audience adds an enormous amount of value for the fans and the viewing experience,” said Zach Bell, senior CG artist, Turner Studios. “If you can take characters or aspects of the game and have them as dimensional elements within that environment, it creates a much richer experience and allows fans of the game to visualize these characters in a new way. That in itself adds an enormous amount of connection to the experience for the viewer.”
Although esports presents a different case from a live game taking place on a field, Bell said, he believes similar AR elements will soon be making their way into live sports content (for example, NBC’s 3D AR elements from player scans during Super Bowl LII).
More Than Just a Game: Bringing AR to the Masses
It was only a couple of years ago that high-end AR elements were reserved for the highest-profile sports events, such as NFL A games. However, with the technology’s rapid advance in recent years, AR has become ubiquitous for most national-level live sports productions and is making its way into even lower-tier properties. In addition, AR elements are becoming available on multiple cameras rather than just the main play-by-play camera (such as the SkyCam), and these systems can even be remotely controlled from offsite.
“The technology is allowing us to drive the next generation of this [content],” noted John Howell, creative strategist, SMT. “We have done the yellow [1st & Ten] line for 20 years, but, two years ago, SMT helped to create a technology that allowed us to do it on the SkyCam. Having that optical vision tracking to create the pan-tilt information off a $30,000 camera head for an image has enabled us not only to do this off the SkyCam but also to do it remotely.
“[That allows us to deploy AR] on more shows [more cheaply],” he continued, “and that technology will then trickle down to more shows. It won’t be just on Fox’s 4 p.m. Sunday NFL game or ESPN’s MNF or NBC’s SNF; now this [technology] gets to go on a lot more shows.”
What’s Next?: Getting More From Player-Tracking Chips, Customizing AR
The use of AR and the technology driving it has evolved rapidly over the past few years, raising the question, What’s next? The panel had plenty of predictions regarding the next great leap forward, but the primary point of excitement revolved around the continued advance of player-tracking RFID chips, especially the NFL’s Next-Gen Stats system.
“With the emergence of Zebra [Technologies] chips on players and [the NFL] looking at instrumenting the football [with a chip], you could see how that can tie to your first-down–line [graphic],” said Bradley Wasilition, director, sports analysis/lead sports analyst, ChyronHego. “The first-down line could actually dynamically change color, for example, when the first down is reached. Now, when that chip crosses that line, you can [definitively] say whether it is a first down or a player was out of bounds [on the sideline].
“Or think of a dynamic strike zone in baseball or a dynamic offside line in soccer,” he continued. “These are all different things that don’t necessarily reinvent the wheel, but they take baseline AR and move it into the 21st century.”
Fields predicted that, as multiplatform content and OTT outlets grow, fans will someday be able to customize their own AR elements within the sports coverage they are watching: “Eventually, it will get to a point where we can put this data in the hands of the viewer on an OTT offering. Once that happens, they can choose to turn off the strike zone over the plate. That is when we’ll really get some flexibility and customization to people so [viewers] can enhance [their experience].”
March 16, 2018
Avixa
Sports. The great common denominator of all conversation. Even if you don’t like sports, you know enough to be able to talk about it, at least for a minute. And sports, by convenient association, is actually one of my favorite ways to talk about what it is that AVIXA members do.
We tell sports stories. Through gigantic video boards (forever “Jumbotrons” to the layman, and hey, that’s alright), humongous speaker systems, tiny microphones, variably-sized digital signage displays and perceptually invisible but actually ridiculously huge lighting systems and projection mapping, AV experience designers make the live event into a highlight reel. Everything has impact, in real-time.
So it happens to be that I’m forever on the lookout for evolving ways to tell sports stories in venues. In reading Sports Video Group’s coverage of the Super Bowl, I found another great angle on stadium storytelling. Most sports fans know that we are in the age of abundant sports data analytics, but what I didn’t know is that we are also in the era where those next-gen stats are changing the in-house show on the big screens at stadiums.
In a first for the Super Bowl, the 2018 game brought some television broadcast features to the in-house displays at U.S. Bank Stadium. And on top of that, they challenged audiences with a whole new graphics package featuring next-gen stats (“NGS” if you’re savvy).
With production tools by SportsMEDIA Technology (SMT), the virtual yellow line and some cool new NGS factoids made it to the big-time on the live-game displays. The latter of these came from SMT’s tapping into the NFL Next Gen Stats API to go deeper with the data.
SMT’s goal to delight fans with even more details to obsess over during the game seems like a good one. Especially because, well, “NFL fans are insatiable — they want data,” said Ben Grafchik, Business Development Manager for SMT.
To meet that need, SMT is exploring ways to tie in traditional data points with NGS in a visual format that fans can easily consume during a game. The objectivity and analytical depth of these additions to video board storytelling are compelling to all diehard fans, but, in particular, the next-gen stats appeal to next-gen fans, Grafchik added.
These new graphics may have been a first for the Super Bowl, but actually, Vikings fans enjoyed them for the entire season at home at U.S. Bank Stadium. SMT worked with the in-house production team there to add all sorts of visual spice to the show, gradually going more complex with the offerings as the season went on and fans became accustomed to the new depths of data exploration.
But football isn’t the only sport that’s receiving the NGS upgrade. SMT happens to provide video enhancement and virtual insertion graphics for hundreds of major U.S. and international sporting events and broadcasters. So watch for a lot more variety to come both in house and wherever else you consume your sports content. It will certainly give us all a lot more to talk about when we talk about sports.
March 14, 2018
Sportstar Live
For more than 100 years, tennis, unlike team sports, used statistics sparingly. Basketball, baseball and football needed a plethora of stats, such as shooting percentages, batting averages and touchdowns scored, to measure the performances of their athletes and teams. But tennis players were measured chiefly by their wins, losses, titles and rankings. After all, few cared if the Wimbledon champion made 64% of his first serves or the No. 1 player averaged 77 miles per hour on her backhand.
All that changed in the Computer Age. With more information than they ever dreamed possible, tennis coaches, players, media and fans suddenly craved all sorts of revealing match data, not to mention astute analysis of it. No longer was it just whether you won or lost that mattered, but how and why you won or lost — points, games, sets and matches. Training methods, stroke production, tactics and equipment were also dissected and analysed in much greater depth and detail than ever before.
As the demand for data burgeoned, new technologies, such as sophisticated virtual graphics, tracking technology, statistical applications and telestration, have provided yet more valuable services and information to give athletes that “extra edge.”
Like any prescient, enterprising pioneer, Leo Levin seized the opportunity by developing the first computerised stats system for tennis in 1982. Levin’s seminal work was highlighted by creating the concept of and coining “unforced error,” a term now used in most sports and even by pundits to describe a politician’s self-inflicted blunder.
Since then, the genial 59-year-old, based in Jacksonville, Florida, has covered more than 120 Grand Slam events and countless other tournaments to provide the Association of Tennis Professionals (ATP) and other businesses with match statistics. Levin, dubbed “The Doctor” by broadcaster Mary Carillo for his incisive diagnoses of players’ games, is currently director of sports analytics at SportsMEDIA Technology (SMT), a company that provides custom technology solutions for sporting events.
In this wide-ranging interview, Levin explains his many roles in the exciting, fast-growing field of analytics and how it has changed tennis for the better.
What is sports data analytics?
Sports data analytics is a combination of gathering and analysing data that focuses on performance. The difference between analysis and analytics is that analysis is just gathering the basic data and looking at what happened. Analytics is trying to figure out why and how the basic performance analysis works with other factors to determine the overall performance of the athlete or the team.
When and how did this field start changing amateur and pro tennis? And who were the pioneers?
Honestly, I was. At the end of 1981, the first IBM personal computer hit the market for general consumer use. By the middle of 1982, I was working with a company in California to develop the very first computerised stats system for tennis. The key factor was the way we decided to describe the results of a tennis point in three basic areas. The point had to end with a winner, a forced error, or an unforced error. That created the foundation for how we look at tennis today.
How and when did you become interested in tennis analytics?
I was playing on the tennis team at Foothill College in Los Altos, California, about five miles from Stanford University. When I wasn’t playing matches, I was actually charting matches for my team-mates and then providing that information to the coach and the players to try to help them improve their games.
Brad Gilbert, a former world No. 4 and later the coach of Andre Agassi and Andy Murray, played on your Foothill team. Did you help him?
Brad was on that team, and it was interesting because in his first year, he played No. 2. The player who played No. 1 came to me before the state finals where he had to play Brad in the final, and asked me, ‘How do I beat Brad?’ I was able to give him specific information on strategy and tactics that helped him win the state title.
That was the year Brad took his runner-up trophy and smashed it against a tree and vowed never to lose a match the following year. And the following year, Brad didn’t lose a match.
SportsMEDIA Technology’s (SMT) products and services have evolved from a clock-and-score graphic in 1994 to innovative and sophisticated virtual graphics, tracking technology, statistical applications, and telestration. How do you and your team at SMT use these four methods to analyse statistical data at tennis’ four Grand Slams to provide valuable insight that helps players, coaches, broadcasters and the print media determine how and why a match was won or lost?
One of the challenges with tennis, more so than with any other major sport, is the lack of data. When we started doing this, there really wasn’t any consistent gathering of data from matches. So the first piece we developed was simply a system now known as Match Facts. It pulled factual statistical data directly from the chair umpire. That started with the ATP back in the early 1990s. We were then able to create a base for year-round information on the players. It allowed for the next level of analysis. It has expanded from there. We developed the very first serve speed system to start adding additional data and how players were winning or losing based on the serve speeds. As the technology improved, we’ve been able to harness the new generation — tracking video technology and then on the presentation side, using virtual graphics as a way to be able to place data directly into the field of play to help illuminate what is actually going on. Telestration is a tool that allows the broadcasters to get inside the points and help the fans understand the combinations of shots and strategies the players are using.
Your website (www.smt.com) has a section titled “Visual Data Intelligence” with the subtitle, “SMT delivers the world’s most innovative solutions for live sports and entertainment events across the globe.” What is Visual Data Intelligence? And what are its most important, innovative solutions for live sports and entertainment events?
Visual Data Intelligence goes to the heart of what we try to do as a company. In a lot of different sports, there is a lot of information available. But making it useful to the broadcasters, and specifically to the fans, to help them understand the game is a huge part of what we’re providing. That entails simple things like the first-and-10 line in football. That provides the visual set of information for the commentators and fans that really helps them understand where the teams are and how much yardage they need (to get a first down). It’s gotten to the point where fans in the football stadium are yelling, “Where’s the yellow line?” So we’re expanding that to provide the service to the large screens displayed inside the stadium so teams have their own system to be able to show that to the fans.
How does Visual Data Intelligence apply to tennis?
In tennis where you have a lot of data, the challenge is: how do you provide all that data to the fans and the commentators? We do that through a series of different systems. We have what we call our “open vision system,” which is an IPTV solution that has real-time scoring, stats and video as well as historical data. And it ties it all together and puts it in one place so it provides a true research tool for the commentators and the (print and online) media. Along with that, we have a product we call our “television interface,” which is really a system which drives graphics on air for the broadcasters. This tool allows them to look at the data and see where the trends are. Hit the button and have that information directly on the screen.
Please tell me about the new technology service partnership between Infosys and the ATP, and the analytics and metrics this partnership brings to the tennis world.
I’m not really that aware of what Infosys and the ATP are doing. But I do know that a lot of that hinges on the technology we created for Match Facts. One of the unique things about tennis is the scoring system. Unlike other sports, the player or team that wins the most points doesn’t necessarily win the match. That’s not how our scoring system works. I think they are trying to take a deeper look into the individual points, and how winning or losing specific points in key situations impacts a player’s ability to win or lose matches. The same is true for total games. That’s one of the challenges when you’re trying to do analysis of tennis. In a lot of other sports, you’re just looking at the raw numbers and saying how many points did he score or how many rebounds did she get or how many yards did they gain. But in tennis, it has to be compartmentalised into specific performances in specific situations.
How do insights from game and training data analytics improve coaching?
The key to coaching and player improvement is first to understand what is going on out on the court. It’s a matter of gathering data. One of the challenges tennis has faced because of its late start in the world of statistics and data analysis has been a reluctance by a lot of coaches and players to rely on anything other than what they see and feel. So the real challenge and the real key is to be able to relate the data to what coaches see and what players feel out on the court. When you can make that connection, you have a real chance for improvement.
What are one or two insights that have improved coaching?
The challenge is that every player is different. What the data analysis allows you to do is to customise those things and focus not on what a player does, but what your player does, and how you can get the most out of your player’s game. A simple example of this was when we first started doing detailed statistics and analysis, we worked with the Stanford University tennis programme. Their No. 1 woman player, Linda Gates, was struggling, and the coaches couldn’t figure out where or why. We did an analysis of her game, and we found out that she was dominating her service games on her service points in the deuce court, but she was struggling in the ad court. It wasn’t visually obvious. The coaches couldn’t put their finger on what the problem was. But once we started looking at the numbers and the data, it allowed them to focus in practices on her ad-court shot patterns. Linda went on to win the NCAA Championships that year, 1985, in singles and doubles (with Leigh Anne Eldredge).
An Infosys ATP “Beyond The Numbers” analysis of Rafael Nadal’s resurgence to No. 1 in the Emirates ATP Rankings showed that Nadal ranked No. 1 on tour in 2017 for winning return points against first serves, at 35.2 percent (971/2761). That metric shoots up to an astounding 43.4 percent (454/1045) for his clay-court matches. Which other stunning statistics help explain why other players have had outstanding years this decade?
This goes to the basics of looking at players’ strengths and weaknesses. One stat I always look at is serve and return performance because I still split the game up that way. It’s interesting that when you look at a player like Nadal, you see that he is not only dominant on return of serve. He’s also dominant on his own second serve.
Even with all the analytics we have, an old maxim still holds true: “You’re only as good as your second serve.” You’ll find the players at the top of the rankings for the last four or five years were also at the top of both second serve points won and return of second serve points. Despite all the focus on power and big serves, second serve performance is really a huge key to understanding a player’s overall strengths and weaknesses.
How much do the Women’s Tennis Association tour and its players take advantage of analytics?
Although the WTA was a little behind the ATP curve in terms of gathering and storing match data, the good news is that now they’ve caught up. Their association with SAP, and the fact that they’re also now using a Match Facts system to provide data for the players on a match-by-match basis, has moved them up the curve.
Which pro players have benefited most from tennis analytics so far? And in what specific ways?
That’s a tough question. Because I don’t work directly with the players and coaches as I used to, I don’t know who is utilising the data more so than others. You can tell just by looking at Roger Federer’s improvement over the last year that his team used analytics to determine that he needed to be more aggressive on his backhand. He’s now hitting a much higher percentage of topspin backhands than he did in previous years and that change has made his game more balanced and puts a lot more pressure on his opponents. Playing to Roger’s backhand used to be the safe play — it’s not any more.
Another area of Federer’s game that came to light using analytics was the difference between his winning and losing matches at Wimbledon. When you compare his final match wins to his matches lost since he won his first Wimbledon in 2003 — 8 titles, 7 matches lost — the numbers that jump out are all about his return of serve, and specifically, his performance on break points. Federer’s serving performance barely changed, but his return game fell dramatically in his losses. In his Wimbledon final wins, Federer converted 30 of 69 break points for 44%. In his losses, he converted only 9 of 53 for 17%. In both cases, he averaged around 8 break points per match. In his wins, he converted almost 4 per match, but in his losses he converted just over once per match. His team looked at that crucial data and added in that nearly all his opponents served and volleyed 2% or less of their service points and concluded that Roger needed to work on hitting his returns deep and not worry about his opponents coming in behind their serves.
Younger players are taking most advantage of the information because they’ve grown up in that world. They’re used to the electronics and the digital experience and having all that information available to them.
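The arithmetic behind the Federer break-point figures quoted above (final wins versus matches lost at Wimbledon since 2003) is easy to reproduce; the short Python sketch below simply recomputes the conversion rates and per-match averages from the raw counts.

# Recomputing the break-point figures quoted above from the raw counts.
records = {
    "final wins": {"matches": 8, "converted": 30, "chances": 69},
    "matches lost": {"matches": 7, "converted": 9, "chances": 53},
}

for label, d in records.items():
    rate = 100 * d["converted"] / d["chances"]
    per_match_chances = d["chances"] / d["matches"]
    per_match_converted = d["converted"] / d["matches"]
    print(f"{label}: {rate:.1f}% converted, "
          f"{per_match_chances:.1f} chances and {per_match_converted:.1f} conversions per match")

# final wins: 43.5% converted, 8.6 chances and 3.8 conversions per match
# matches lost: 17.0% converted, 7.6 chances and 1.3 conversions per match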
How do these insights enhance the fan experience?
I credit (renowned former NFL analyst) John Madden for being one of the very first TV commentators who would take fans inside the game to explain to them things they didn’t necessarily see. Madden would explain to women football fans what the centre or guard was doing on a particular play and why that back ran for 50 yards was all because of this really good block.
What we’re trying to do in tennis and what these insights have provided is to do the same kind of things for tennis fans. Help get them inside the game so they understand the nuances of what’s happening on the court, and they’re not just watching two guys running around hitting the ball.
What is radar-based tracking, which is now used by the United States Olympic Committee (USOC) for every throw an Olympic athlete makes? Is it being used in tennis?
Radar-based tracking is simply tracking the speed and location of the ball or object that is being thrown or hit. Radar-based tracking has been typically used for service speeds in tennis. That is something we pioneered in the late 1980s. The tracking used in tennis has been video-based, as opposed to radar. The advantage of that is that you can track movement of the players as well as the movement of the ball and from a variety of positions and angles.
Can analytics predict which junior players will someday become world-class players or even champions? And if so, can it guide their coaches and national federations to increase the odds that will happen?
Not yet. The challenge is that prediction is different from analysis. You’re trying to draw conclusions from the data, and we don’t have a complete set of data. If you wanted to predict which junior players will become world-class players, sure you can do that if we have genetics, biomechanics, all the physical characteristics measured as well as using analytics to measure the player’s overall performance on the court. We can see whether or not they have specific markers that indicate they will make that jump. But the bottom line is that there are so many factors involved. And a lot of it has to do with the physical side that you can’t necessarily determine from data.
What is bioanalytics? And why is measuring and analysing an elite athlete’s perspiration important?
We’re pioneering bioanalytics in football now. We’re taking biometric readings from players at the university level. The players are equipped with motion sensors and full biometric readers, which are reading things like heart rate, body temperature and respiration. And they’re combining that with the movement data from the tracking information. With that, we’re able to measure the physical output of the players. The sensors in the helmet measure impacts (from collisions).
We’ve been working on this project for a few years. It’s been used for the football programme at Duke University. We’re in the process of adding a couple more universities to this project. At this stage, it’s being used for medical purposes. So when a player is on the practice field, they know immediately if his heart rate starts racing or if his body temperature goes up too high, and they can immediately pull him out of practice and get him more electrolytes and hydration. They also weigh the players before and after every practice so they know how much fluid the player has lost during their practice times.
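A minimal sketch of the kind of threshold check and weigh-in arithmetic Levin describes might look like the Python below; the thresholds, field names, and sample readings are placeholders, not values from the actual program.

# Minimal sketch of a practice-field biometric check and fluid-loss estimate.
# Thresholds, field names, and sample readings are placeholders, not real system values.
from dataclasses import dataclass

@dataclass
class BiometricReading:
    player: str
    heart_rate_bpm: int
    body_temp_f: float

MAX_HEART_RATE_BPM = 200   # placeholder alert threshold
MAX_BODY_TEMP_F = 103.0    # placeholder alert threshold

def needs_intervention(r: BiometricReading) -> bool:
    """Flag a player to be pulled from practice for fluids, electrolytes, and cooling."""
    return r.heart_rate_bpm > MAX_HEART_RATE_BPM or r.body_temp_f > MAX_BODY_TEMP_F

def fluid_loss_lbs(weight_before_lbs: float, weight_after_lbs: float) -> float:
    """Before-and-after practice weigh-in, as described above, to estimate fluid lost."""
    return weight_before_lbs - weight_after_lbs

print(needs_intervention(BiometricReading("LB 42", 204, 101.8)))  # True
print(fluid_loss_lbs(238.0, 233.5))                               # 4.5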
How is bioanalytics used in tennis?
Unlike a team sport where a team can outfit all its players with this equipment, tennis players are all independent contractors. So it’s going to take more of a nationalistic approach — something like what the USTA is doing — to step in and say, “For our junior players, we’re going to outfit some courts and we’re going to provide this level of analysis on the physical side.”
Does analytics apply to tennis equipment and court surfaces? And if so, how?
Sure, it can. Analytics can identify how well players perform using different types of equipment and on different surfaces. For instance, if you’re using some tracking technology to determine what racquet and string combination allows a player to have the most amount of power, that’s a relatively simple exercise. You run a player through a set of drills, hitting particular shots, and measuring the speed of the ball coming off the racquet.
For surfaces, analytics can really help with identifying the type of shots that have an effect on particular surfaces or areas where players’ games break down. For example, you have players who have a long backswing, and that works really well on a slower surface where they have time to take a big backswing. But when you put them on a faster court, where the ball bounces lower and faster, it upsets their timing, and it makes it more difficult for them to adjust. Analytics measures the court’s bounce speed and bounce trajectory. So you can take a player and modify his game on a particular surface taking into account how the ball reacts to it.
You’ve analysed thousands of matches. Which factors influence the outcome of matches the most in men’s tennis and women’s tennis? And why?
The No. 1 factor typically is unforced errors. If you’re making mistakes, you’re basically giving the match to your opponent. Being able to measure and quantify that is a huge factor for player improvement. That entails understanding where you’re making your mistakes — which shots and what situations. The caveat to that is that there are certain players whose games are based on absolutely controlling the pace and tempo of the match. And they have the tools to do that. Two of the best players ever to do that are Steffi Graf and Serena Williams.
What are the disadvantages of and dangers involved with analytics? Will some number crunchers and coaches go overboard with analytics and be guilty of violating Occam’s razor?
The simple danger is to rely on data alone. The challenge is that you have to make the data relatable to what the player is doing physically and mentally on the court. Analytics doesn’t necessarily measure the mental side of the game, at least not yet. If you’re focusing so much on the analytics of certain shots and not looking at the big picture of their mental focus and how they’re preparing for matches, you can get into trouble.
Since tennis players vary greatly in temperament, talent, current form and other variables, do practitioners of analytics risk over-concluding from their numbers? And what mistakes have you and others made in this regard?
There is always a risk. Data can provide you with valuable information. Then you make that next leap that says, “This information says this, and therefore we have to do this, or therefore we have an issue.” I’ll give you a simple story from a few years ago. Jim Grabb, who was the No. 1 doubles player in the world then, came up to me at a tournament before the US Open and said, “I’m struggling with my first volley in singles. I can’t make a first volley.” And I told him, “You’re the No. 1 doubles player in the world. You have great volleys. And you’re saying you can’t make a first volley in singles.” He says, “Yeah.”
A lot of coaches would say, “How are you hitting it? Let’s analyse the stroke.” I asked, “When you step to the baseline to hit the serve, where is your first volley going?” Jim looked at me like I was speaking a foreign language. So I asked again, “Before you hit your first serve, where are you going to hit your first volley?” He said, “I just react to the ball. I don’t know what you’re talking about.”
So I suggested, “Do this. Every first volley goes to the open court. You serve wide in the deuce court and you volley wide into the ad court. You serve wide in the ad court and volley wide into the deuce court. Just for your first volleys.”
Jim goes out to play and comes back and says, “I didn’t miss a first volley.” The next week he got to the fourth round of the US Open, his best result at a Grand Slam (event) ever in singles. That had to do with the fact that all it really required was a little bit of focus by the player. It didn’t require a level of analysis and stroke production changes. It was simply eliminating decision-making.
What is the connection between analytics and the established field of biomechanics?
Analytics can tell you how a player is performing or how a stroke is performing in key situations. That can then identify that we need to examine the biomechanics of the stroke, particularly if it is breaking down under pressure. Or we can determine that the errors are occurring when the ball is bouncing four feet in the air versus three feet in the air, so their contact point is a foot higher. Now we can look at the biomechanics and see what the player is doing when the ball is a foot higher.
What are player rating systems? And what is the connection between analytics and player rating systems? How valid is the Universal Tennis Ratings system?
I don’t think there is any now. But that’s a direction we can take in the future.
Which match statistic or statistics do you foresee becoming increasingly important as a result of analytics?
I think you’ll see more focus on key point performance as we do more and more analysis of players’ games in key pressure situations. Because you’re serving half of the time and receiving serve half of the time, analytics will look increasingly at each half of the game. We talk a lot about unforced errors, but are they occurring on your serve game or return game? We talk about aggressive play and taking control of the points, but when is that happening? And the serve or return games? On the first serve or second serve?
Data analytics is undeniably changing tennis. Do you think it will revolutionise tennis?
Absolutely! Because the game is always changing. The technology around tennis and all sports keeps changing. Analytics is going to make the athletes better. It’s going to provide them with insights about how they can be at their peak for the key matches. It will help them train better, prepare better, execute shots better under pressure. All those pieces and parts will be available for athletes. And all of their nutritional, sleep, and training regimens will also help tennis players to perform better.
March 9, 2018
Sports Video Group
The 2018 NASCAR season is underway, and with it comes a new remote-production workflow for NASCAR whereby camera and audio signals are sent from racetracks to NASCAR’s production center in Charlotte, NC. The effort began with the Rolex 24 at Daytona race and will continue with the WeatherTech SportsCar Championship racing series next week and the ARCA Racing Series as the season progresses.
“We have done a lot of testing at smaller events the past couple of years, but this year we wanted to push the limits and see what we can do,” says Steve Stum, VP, operations and technical production, NASCAR Productions.
The Rolex 24 race used NEP’s NCP IV production unit to put out 12 hard cameras, two RF cameras for the pit announcers, and 14 in-car cameras around the track. RF was handled by 3G, and a tech manager and engineering team ensured that 28 video and 75 audio signals were sent to Charlotte via a single antenna from PSSI Global Services. PSSI Global Services leveraged its C27 mobile teleport, equipped with cutting-edge Newtec modulators and GaN SSPB amplifiers from Advantech Wireless.
Rick Ball, Director of Broadcast Sports at PSSI Global Services, adds: “We’re not afraid to go where no one has gone before, and we’re proud that our efforts continue to create new possibilities in live television.”
Once the signals are back in Charlotte, the director, producer, TD, replay operators, SMT virtual-graphics operators, and announcers create the show.
“Round trip, the latency is 1.09 seconds, so we have camera returns and feeds for the screens for the fans in the stands,” adds Stum.
With upwards of a third of production costs sunk into travel, Stum says, the goal is to put more money into the production itself, get more specialized equipment, and have a production-truck unit that is more aligned with the needs of a remote production.
The efforts are part of a season that Stum says has been going great so far. All of the testing prior to the Rolex race paid off: Stum says early nerves subsided as the workflow was proven out.
March 2, 2018
Sports Video Group
As the NFL Scouting Combine becomes an increasingly fan-focused event onsite, NFL Media is expanding its already sizeable coverage of the annual event in Indianapolis. Last year, the NFL added Combine events, including the bench press and press conferences, at the Indianapolis Convention Center next door to Lucas Oil Stadium and allowed a limited number of fans into the stadium’s upper bowl in an effort to boost the NFL Combine Experience. With that in mind, NFL Network and NFL Digital outlets are rolling out their biggest productions to date to cover the growing parade of events taking place at both locations.
“We attack this show with everything we have in order to cover it from every aspect,” says Dave Shaw, VP, production, NFL Media. “The league has continued to expand the fan-focused aspect of the Combine at the convention center. They started that last year and are putting even more events over there this year. So we’ve expanded our show to cover some of the more fan-friendly stuff.”
For its 14th Combine, NFL Media is delivering a whopping 52 hours of live coverage during the event (Feb. 28 – March 5), including 34 hours from Indianapolis: 26 hours of Combine coverage Friday-Monday and eight hours of press conferences Wednesday and Thursday.
“This event really didn’t become ‘an event’ until it was covered by NFL Network,” says Christine Mills, director, remote operations, NFL Media. “It’s grown and evolved, and now fans are becoming more involved [onsite]. It’s interesting how it’s grown from a very small intimate event essentially just for scouts to an event covered by NFL Network and NFL Digital and on social. It’s grown into a fan-facing event, but it has kept that intimate feel at its core.”
Onsite in Indy: Encore and Pride, Four Sets Drive Multiplatform Production
Despite the expansion, NFL Media has maintained the same footprint in the truck compound at Lucas Oil Stadium. Game Creek Video’s Encore is serving the NFL Network show, and Pride is handling the streaming coverage.
The trucks onsite are fully connected to NFL Media’s broadcast center in Culver City, CA, via diverse fiber circuits (with 12 muxed feeds going each way) to allow extensive file-transfer and backhaul of camera feeds.
“For our coverage, we treat this like we’re covering a high-end game,” notes Shaw. “It’s a very slick production that moves quickly. It is a bit of a marathon, but our production teams do an outstanding job of rolling in features and keeping the action moving. It’s an important show for the NFL Network and NFL Media group because it’s the baseline for what we are about, which is giving viewers the inside look and showing fans what they should look for in the upcoming players.”
NFL Media has deployed a total of four sets — three at Lucas Oil (one on the field, two on the concourse level) and one at the convention center — to serve its 23-deep talent roster. Two of the three sets at the stadium are dedicated to the digital operation; NFL Network is manning the convention-center set, which is primarily for press-conference coverage.
“The setup we have at the convention center for NFL Network is very similar to [Super Bowl] Opening Night, where they have eight podium positions set up and we’re right in the middle of that room,” says Mills. “It ends up being a really fun and busy couple of days, especially with the fans more involved now [onsite].”
In addition to the four sets, NFL Network has a position in the traditional announce booth at Lucas Oil Stadium, as well as an interview location in a suite, where head coaches often stop by. For example, last year, NFL Media landed a rare interview with Patriots coach Bill Belichick in this location.
“Most of the head coaches are here in a casual atmosphere trying to pull something away from some of these players they’re evaluating,” says Shaw. “And the coaches have [free rein over] where they want to be in the building, so sometimes they will stop by the announce booth. Having Belichick stop by and do some time with our guys took us all off guard a little, but it was great and got a lot of attention. What’s exciting is, you don’t know what you’re going to pull off here since you have all the coaches and GMs. It’s a lot of fun trying to get in their minds and hearing what they have to say in this kind of atmosphere.”
The Camera Complement: SkyCam, Robos, and TeamCams
Between NFL Network and NFL Digital, the operation is deploying a combined 37 cameras at the two venues, including a SkyCam at the stadium and a large complement of robos (provided by Indy-based Robovision) at both locations. In addition, five ENG cameras are roving the grounds capturing content, which is being sprinkled into both the linear and the streaming coverage.
NFL Media will continue to spotlight the 40-yard-dash drill, with a high-speed camera capturing the smallest details. In addition, SMT is providing virtual graphics and graphics overlays for visual comparison of prospects with one another or with current NFL players’ Combine performances (for example, projected top pick QB Sam Darnold vs. Pro Bowl QB Carson Wentz’s sprint).
In addition, NFL Media is leveraging its Azzurro TeamCam system to provide live shots throughout its press-conference coverage. The TeamCam system, which NFL Network has used for a variety of needs for several years, features a single camera and transports bidirectional HD signals via a public-internet connection — along with IFB, comms, and tally — between Indianapolis and Culver City. In addition to a show produced onsite during the first two days, all press conferences are fed to Culver City via the TeamCam system.
“It’s interesting what we do for our live shots with the TeamCam system,” says Shaw. “We can just do one-off cameras, or we can bring it back; we can do two-ways just with a single camera. It’s a great [tool] for our Wednesday and Thursday coverage.”
NFL Digital Bigger Than Ever at Combine
NFL Digital’s presence continues to grow at the Combine. NFL Now Live is streaming on NFL.com, the NFL app, and Yahoo.com Friday-Monday beginning at 9 a.m. ET. In addition, NFL Media is providing extensive social-media coverage across Twitter, Facebook, Instagram, and Snapchat. Twitter Amplify is being used to produce highlights, distribute on-the-ground original content of top achievements across social networks, and deliver original social content to all 32 NFL clubs. On top of that, for the first time, the NFL is coordinating with some of the top college football programs to share, create, and amplify social-media content from Indianapolis.
In addition to live coverage, each prospect goes through the “Car Wash” following his press conference at the convention center. Each player progresses through interviews with NFL Media’s features team, digital team, and social-media team.
“These [Car Wash] interviews help us build features and get footage for the Draft,” says Shaw. “It also helps us down the road, and we’ll use footage all the way through the season. This is an NFL Media-exclusive event, so we go out of our way to give the avid NFL fan that inside position they don’t usually get to see.”
February 28, 2018
Sports Video Group
NFL Network will produce and broadcast 11 live American Flag Football League (AFFL) games during its debut season, as well as distribute highlights from the AFFL’s upcoming 2018 U.S. Open of Football (USOF) Tournament. The agreement is the first-ever broadcast deal for professional flag football, and “provides a unique opportunity for the NFL to explore digital distribution of AFFL content,” according to the league’s announcement. The 11 game telecasts will be produced by NFL Network and feature NFL Network talent.
“Today marks great progress for football fans and players,” says AFFL CEO/founder Jeffrey Lewis. “As the first-ever broadcast and distribution deal focused on bringing the game of flag football to the broadest possible audience, we are thrilled to partner with NFL Network, the premier platform for football.”
The AFFL is set to launch this summer, and NFL Network is expected to build on the unique use of technology deployed for coverage of the AFFL’s first exhibition game on June 27, 2017, at Avaya Stadium in San Jose, CA. In an effort to create a wholly revamped football-viewing experience similar to the Madden NFL gaming look, the AFFL production team deployed SkyCam as the primary play-by-play angle (prior to NBC Sports’ decision to do so for several games during the 2017 NFL season), RF cameras inside the huddle, and SMT virtual graphics and augmented-reality elements all over the field.
The USOF is a 132-team, single-elimination tournament that will ultimately pit a team of elite former professionals against a team that has conquered a 128-team open national bracket. The tournament marks the AFFL’s first major competition, following an exhibition game in June 2017. NFL Network will televise 11 USOF games live June 29-July 19, concluding with the Ultimate Final, where America’s Champion and the Pros’ Champion will meet in a winner-take-all contest for $1 million.
The four Pro teams are expected to be led by Michael Vick, Chad “Ochocinco” Johnson, basketball duo Nate Robinson and Carlos Boozer, Justin Forsett, and Olympic champion Michael Johnson. Airtimes and broadcast talent for USOF games on NFL Network will be announced at a later date.
“Football fans are passionate about having continuous access to entertaining football content all year round,” said Mark Quenzel, SVP, programming and production, NFL. “AFFL games on NFL Network will give viewers a chance to experience a new kind of football competition in the summer months, and we’re excited for the opportunity to deliver more live programming that fans enjoy.”
The AFFL is extending the application deadline for the USOF from March 1 to March 8. Interested applicants can apply to play in the USOF online. Those selected will play in America’s Bracket, which comprises 128 teams.
February 19, 2018
Sports Video Group
One of the highlights of Turner’s NBA All-Star Saturday Night coverage was the debut of a shot-tracking technology developed by Israeli startup RSPCT. Deployed for the Three-point Contest, RSPCT’s system, which uses a sensor attached to the backboard to identify exactly where the ball hits the rim/basket, was integrated with SMT’s graphics system to offer fans a deeper look at each competitor’s shooting accuracy and patterns.
“There is a story behind shooting, and we believe it’s time to tell it. Shooting is more than just a make or a miss,” says RSPCT CEO Oren Moravtchik. “Turner and the NBA immediately understood that the first time they ever saw [our system] and said, Let’s do it.”
During Saturday night’s telecast, Turner featured an integrated scorebug-like graphic showing a circle representing the rim for each of the five racks of balls in the competition. As a player took a shot, a marker showing where the ball hit the rim or landed inside the basket was inserted in real time, building up a grouping for each rack.
“It’s a bridge between the deep analytics that teams are using and the average fan,” says RSPCT COO Leo Moravtchik. “Viewers can understand shooting accuracy faster and better without having to dive into analytics; they clearly see groupings of shots and why a shot is made or missed. Last night, if a player missed all five shots of a rack, you could see why: if they are all going right or all going left.”
The system, which can be set up in just 30 minutes, consists of a small Intel RealSense Depth Camera mounted behind the top of the backboard and connected wirelessly to a small computing unit.
“We have some very sophisticated proprietary algorithms on the sensor,” says Oren Moravtchik. “The ball arrives at a high speed from the three-point line at various angles. We can [capture] the entire trajectory of the ball: where it came from, how it flew in the air, where it hit the basket — everything. We know the height of the player, the release point, and where it hit the basket, and then we can extrapolate back from there.”
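To illustrate the kind of extrapolation Moravtchik describes, here is a minimal sketch, assuming a handful of fabricated 3D ball samples from a depth sensor: fit the early flight to a ballistic arc and estimate where the descending ball crosses the rim plane. It is illustrative only, not RSPCT's actual algorithm or data.

```python
# A minimal sketch, assuming fabricated 3D ball samples: fit the early
# flight to a ballistic arc and estimate where the descending ball
# crosses the rim plane. Not RSPCT's actual algorithm or data.
import numpy as np

RIM_HEIGHT_M = 3.05  # regulation rim height

t = np.array([0.00, 0.05, 0.10, 0.15, 0.20])        # sample times (s)
x = np.array([7.000, 6.675, 6.350, 6.025, 5.700])   # distance toward hoop (m)
y = np.array([0.000, 0.005, 0.010, 0.015, 0.020])   # lateral offset (m)
z = np.array([2.200, 2.486, 2.747, 2.983, 3.195])   # height (m)

fx = np.polyfit(t, x, 1)   # horizontal motion is roughly linear
fy = np.polyfit(t, y, 1)
fz = np.polyfit(t, z, 2)   # vertical motion is parabolic under gravity

# Take the later (descending) time at which the arc reaches rim height
roots = np.roots(np.polysub(fz, [RIM_HEIGHT_M]))
t_hit = max(r.real for r in roots if abs(r.imag) < 1e-9)

print(f"Rim-plane crossing at t={t_hit:.2f} s, "
      f"x={np.polyval(fx, t_hit):.2f} m, y={np.polyval(fy, t_hit):.2f} m")
```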
Although Saturday night marked the debut of the RSPCT system for the NBA, Leo Moravtchik sees far more potential once complete data sets on players can be captured — such as a full playoff series or even a full season.
“There may be an amazing player shooting 18 out of 20 from every [three-point] location, but there are differences between locations beyond just field-goal percentage,” he says. “Based on our data, we not only can show them [that] shooting [tendencies] can predict, [that] we can actually project their field goals for the next 100 shots. We can tell them, If you are about to take the last shot to win the game, don’t take it from the top of the key because your best location is actually the right corner.”
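As a toy illustration of that kind of per-location projection, the sketch below tallies a fabricated shot log by location and projects makes per 100 attempts; RSPCT's real model also folds in where each shot hits the rim, which is omitted here.

```python
# A toy sketch of the per-location projection described above: tally a
# fabricated shot log by location and project makes per 100 attempts.
from collections import defaultdict

shot_log = [  # (location, made?) -- fabricated
    ("top of key", True), ("top of key", False), ("top of key", True),
    ("right corner", True), ("right corner", True), ("right corner", True),
    ("left wing", False), ("left wing", True),
]

tally = defaultdict(lambda: [0, 0])  # location -> [makes, attempts]
for loc, made in shot_log:
    tally[loc][0] += int(made)
    tally[loc][1] += 1

for loc, (makes, attempts) in sorted(tally.items()):
    rate = (makes + 1) / (attempts + 2)   # lightly smoothed make rate
    print(f"{loc:12s} {makes}/{attempts} -> ~{rate * 100:.0f} makes per 100 shots")

best = max(tally, key=lambda l: (tally[l][0] + 1) / (tally[l][1] + 2))
print("Best projected location:", best)
```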
RSPCT is not only focusing on sports broadcast and media clients but marketing the system as a scouting and player-development tool.
“We’re [targeting] NBA teams, college teams, and even high school and amateur teams,” says Leo Moravtchik. “Wherever there is a basket — camps, gyms, schools — people want to see how they are shooting. We can bring it there because it’s a 30-minute installation and very cost-effective.”
February 16, 2018
Sports Video Group
The 60th running of the Daytona 500 takes place this Sunday, and Fox Sports, as it has done every year, again has found a way to push the technological envelope and expand on the resources dedicated to broadcasting the Great American Race. Coverage of this year’s race includes the introduction of Visor Cam, the return (and refinement) of the dedicated Car Channels on Fox Sports GO, and — in an industry first — a tethered drone that will provide live coverage from behind the backstretch at Daytona International Speedway.
“Every year, there’s something new,” says Mike Davies, SVP, field and technical operations, Fox Sports. “The Daytona 500 is always a great way to kick off the first part of the year in terms of technological testing: a lot of the things that we bring down to Daytona to look at, to test, and to try are things that manifest themselves later and in other sports. It’s a lot of fun to dream these things up.”
A Unique Point of View
This weekend’s race will feature all the camera angles that racing fans have come to expect, plus a few new views that promise to enhance the broadcast. Fans have grown accustomed to seeing their favorite drivers up close thanks to in-car cameras, but, on Sunday, they’ll be able to see what the driver sees.
Visor Cam, which first appeared at the Eldora NASCAR Camping World Truck Series race last year, makes its Daytona 500 debut this weekend. The small camera, developed by BSI, will be clipped to the helmets of Kurt Busch (last year’s Daytona 500 champion) and Daniel Suarez.
“You can try to put cameras everywhere you can, but seeing what the driver is seeing through a camera placed just above his eye line on his visor is pretty cool,” says Davies. “We’re looking forward to having that at our disposal.”
Fox Sports worked closely with NASCAR and ISC to provide aerial drone coverage of the Daytona 500. The drone, which will be tethered to allow longer periods of flight time, will move around behind the backstretch — outside of the racing area — to cover the race from a new angle.
Gopher Cam, provided by Inertia Unlimited, returns for its 10th year with enhanced lens quality for a wider, clearer field of view. Three cameras will be placed in the track, including one in Turn 4 and another on the backstretch.
Cameras, Cameras Everywhere
Fox Sports will deploy a record number of in-car cameras during the Daytona 500. In total, Sunday’s broadcast will feature 14 in-car cameras, including the pace car — more than in any NASCAR race in the past 15 years. Each car will be outfitted with three cameras for three viewing angles.
Last year, Fox Sports launched two dedicated Car Channels on the Fox Sports GO app, each focusing on a single driver. For this year’s race, Fox Sports has opted for a team approach, showing multiple drivers, cars, and telemetry data on the channel.
In all, Fox Sports will deploy 20 manned cameras, including three Sony HDC-4300’s operating in 6X super-slo-mo, one Sony HDC-4800 operating in 16X HD slo-mo, and an Inertia Unlimited X-Mo capturing 1,000 frames per second. Fox Sports will outfit its Sony cameras with a variety of Canon lenses, ranging from handheld ENG to the DIGISUPER 100. The network will also have four wireless roving pit/garage camera crews, 10 robotic cameras around the track (plus three robotic Hollywood Hotel cameras), and a jib camera with Stype augmented-reality enhancement. The Goodyear Blimp will provide aerial coverage.
Not to be forgotten, viewers will be treated to all the sounds of the race as well, thanks to more than 100 microphones surrounding the track. Fox Sports plans to make use of in-car radios throughout the broadcast, both in real time (having the drivers and crew chiefs narrate the race) and after the fact (using the audio to tell a story).
A Compound Fit for the Super Bowl of Racing
For the first time in 12 years, Game Creek Video’s FX mobile unit will not handle Fox Sports’ Daytona 500 production. Instead, Game Creek’s Cleatus (known by another network as PeacockOne) will be responsible for the main race broadcast and will be joined in the compound by 11 additional units for digital production, editing, RF cameras and audio (BSI), telemetry and graphics (SMT), and studio production. Two satellite uplink trucks will be onsite, as well as a set of mobile generators that will provide nearly 2 MW of power independent of the local power source.
Fox Sports is shaking up its transmission as well, relying on an AT&T gigabit circuit capable of transmitting eight video signals (and receiving four) via fiber by way of its Charlotte, NC, facility to Fox Sports’ Pico Blvd. Broadcast Center in Los Angeles.
“Based on some of the things that we’re doing for the World Cup in Moscow as well as home-run productions for MLS and college basketball, we’ve taken some of that knowledge and leveraged it to do full-on contribution for NASCAR,” Davies explains. “It’s exciting, it’s scalable, and we’re looking forward to doing it. AT&T has put in a circuit at every track or is in the process of doing so, so this is a first foray into IP transmission as it relates to NASCAR.”
The benefit of transitioning to IP transmission, according to Davies, is the volume of content that Fox Sports will be able to send from tracks that notoriously lack connectivity. “At the end of the day,” he says, “we’ll be able to leverage resources from Charlotte and Pico to do more things. Right now, we’re able to contribute more to our Charlotte shows via fiber, but, like everything in technology, the more we get used to it and the more we know how to use it, the more useful it’s going to be.”
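As a rough illustration of why a gigabit circuit can carry that kind of contribution load, the arithmetic below assumes a nominal 100 Mbps per compressed HD feed and roughly 10% transport overhead; the article does not state the actual codec or bitrates Fox Sports uses.

```python
# Back-of-the-envelope check on the gigabit contribution circuit described
# above. Per-feed bitrate and overhead are assumptions for illustration.
CIRCUIT_MBPS = 1000           # "AT&T gigabit circuit"
SEND_FEEDS, RETURN_FEEDS = 8, 4
FEED_MBPS = 100               # assumed compressed HD contribution feed
OVERHEAD = 1.10               # assumed ~10% transport/FEC overhead

outbound = SEND_FEEDS * FEED_MBPS * OVERHEAD
inbound = RETURN_FEEDS * FEED_MBPS * OVERHEAD
print(f"Outbound: {outbound:.0f} Mbps of {CIRCUIT_MBPS} Mbps "
      f"({outbound / CIRCUIT_MBPS:.0%} utilization)")
print(f"Return:   {inbound:.0f} Mbps")
```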
Daytona 500 Gets a Graphics Makeover
The on-air graphics package for the Daytona 500 will be new, featuring much of the look and feel of Fox Sports’ football, basketball, and baseball graphics with all the data that NASCAR fans expect.
Fox Sports will up the ante on virtual graphics and augmented reality, deploying Stype camera-tracking technology (with a Vizrt backend) on a jib between Turns 3 and 4 in order to place 3D graphics within the broadcast. For example, the system can be used to create virtual leaderboards, sponsor enhancements, and race summaries that are placed on Turn 3 as virtual billboards.
“Where that jib is between Turns 3 and 4, you can place graphics [on screen in] such a way that you don’t necessarily have to leave the track in order to get information across,” Davies explains. “In the past, we might have used full-screen graphics, but now, we can put the graphics in space, and it looks pretty cool. It’s the third year that we’ve been doing that, and we seem to get better at it each year.”
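The geometry behind graphics that "stick" to the track is straightforward camera projection: given the tracked position and orientation of the jib camera plus its lens parameters, a fixed world point can be re-projected into every frame. The sketch below is generic pinhole-camera math with made-up coordinates, not Stype's or Vizrt's actual pipeline.

```python
# Generic pinhole-camera sketch: re-project a fixed world point (say, a
# virtual-billboard corner) into pixel coordinates from a tracked camera
# pose. Coordinates and values are made up for illustration.
import numpy as np

def project(world_pt, cam_pos, R, focal_px, principal_pt):
    """Project a 3D world point into pixel coordinates."""
    p_cam = R @ (world_pt - cam_pos)   # world -> camera coordinates
    if p_cam[2] <= 0:
        return None                    # point is behind the camera
    u = focal_px * p_cam[0] / p_cam[2] + principal_pt[0]
    v = focal_px * p_cam[1] / p_cam[2] + principal_pt[1]
    return round(u), round(v)

billboard_corner = np.array([5.0, -2.0, 60.0])  # 60 m down the camera axis (made up)
cam_pos = np.array([0.0, 0.0, 0.0])             # tracked jib position (made up)
R = np.eye(3)                                   # tracked orientation (identity for brevity)

print(project(billboard_corner, cam_pos, R, focal_px=1800, principal_pt=(960, 540)))
```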
The network has also enhanced its 3D-cutaway car, putting these graphics in the hands of the broadcast team. And, in the booth, Fox Sports NASCAR analyst Larry McReynolds will have his own dedicated touchscreen, allowing him to enhance any technical story and give the viewer clear illustrative explanations during the race.
A Company-Wide Effort
Between the production personnel, camera operators, engineers, on-air talent, and many more, Fox Sports currently has 300 people onsite at the Daytona International Speedway. In addition, Fox Sports’ Pico and Charlotte facilities, as well as its network-operations center in The Woodlands, TX, are very much a part of the action. And, when the Daytona 500 starts on Sunday, all will be ready to deliver this year’s race to NASCAR fans everywhere.
“Between everything that you’re going to see on-screen and everything under the hood, these are all things that are going to help the company as a whole,” says Davies. “We’ve been able to bring together all of the resources across the company, and it’s particularly exciting to get everybody working as one on this event.”
February 8, 2018
Digital Journal
DURHAM, N.C.--(Business Wire)--NBC Olympics, a division of the NBC Sports Group, has selected SMT to provide real-time, final results and timing interfaces for its production of the XXIII Olympic Winter Games, which take place in PyeongChang, South Korea, from February 8 - February 25. The announcement was made today by Dan Robertson, Vice President, Information Technology, NBC Olympics, and Gerard J. Hall, Founder and CEO, SMT.
Since 2000, SMT has been a key contributor to NBC Olympics’ productions by providing results integration solutions that have enhanced NBC’s presentations of the Games via on-air graphics, scheduling, and searches for content in the media-asset–management (MAM) system.
For the 2018 Olympic Winter Games, SMT will deliver TV graphics interfaces for NBC Olympics’ Chyron Mosaic systems in its coverage of alpine skiing, freestyle skiing, snowboarding, figure skating, short track speed skating, speed skating, bobsled, luge, skeleton, ski jumping and the ski jumping portion of Nordic combined.
SMT’s Point-in-Time software system integrates live results to allow commentators to locate a specific time during a competition in both live and recorded coverage. The software graphically shows key events on a unified timeline so that NBC Olympics commentators can quickly see how a race began, when a lead changed, where an athlete’s performance improved, and the kinds of details that dramatically enhance the incredible stories of triumphs and defeats intrinsic to the 2018 Winter Games.
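Conceptually, that amounts to an event timeline indexed by timecode that an operator can query. The sketch below shows the idea with fabricated events; it is an illustration of the concept, not SMT's Point-in-Time software.

```python
# An illustration of the concept, not SMT's Point-in-Time software: key
# moments indexed by timecode so an operator can jump straight to, say,
# the last lead change. Event data is fabricated.
from bisect import bisect_right

timeline = [  # (seconds into coverage, event description)
    (12.0,  "start of run"),
    (95.4,  "lead change: bib 7 overtakes bib 3"),
    (140.2, "fastest split: bib 12"),
    (233.8, "lead change: bib 3 retakes the lead"),
]

def events_before(t_seconds):
    """All key events that have happened by a given point in the coverage."""
    times = [t for t, _ in timeline]
    return timeline[:bisect_right(times, t_seconds)]

def last_event(t_seconds, keyword):
    """Most recent matching event -- e.g. jump to the last lead change."""
    matches = [(t, e) for t, e in events_before(t_seconds) if keyword in e]
    return matches[-1] if matches else None

print(last_event(300.0, "lead change"))  # -> (233.8, 'lead change: bib 3 ...')
```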
“The complexity and sheer amount of scoring, tracking, and judging data that comes with an event of this size, both real-time and post production, is beyond compare,” said Robertson. “The ability to organize and deliver it aids NBC’s production in presenting the stories of these amazing athletes, and requires nothing short of the capabilities, innovation and track record of SMT.”
“It is our privilege to provide our expertise, experience, and results reporting technology for NBC Olympics’ production of the 2018 Olympic Winter Games, SMT’s 10th straight Olympics,” said Hall. “Our team of 10 on-site engineers have rigorously prepared for PyeongChang with a tremendous amount of testing and behind-the-scenes work, ensuring SMT delivers seamless services of a scope and scale unprecedented in a sports production.”
SMT’s partnership with NBC Olympics began with the 2000 Sydney Games and has included providing graphics interfaces as well as NBC’s digital asset management interface that helped the network receive Emmy Awards for “Outstanding Team Technical Remote,” following the 2008 and 2016 Games.
About NBC Olympics
A division of the NBC Sports Group, NBC Olympics is responsible for producing, programming and promoting NBCUniversal's Olympic coverage. It is renowned for its unsurpassed Olympic heritage, award-winning production, and ability to aggregate the largest audiences in U.S. television history.
For more information on NBC Olympics’ coverage of the PyeongChang Olympics, please visit: http://nbcsportsgrouppressbox.com/.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events, including the Super Bowl, major golf and tennis events, the Indianapolis 500 and the World Series. The 31-time Emmy Award-winning company is headquartered in Durham, N.C. For more information, visit smt.com.
February 5, 2018
Sports Video Group
To put it mildly, the 2017-18 NFL campaign has been a memorable one for SkyCam. In a matter of months, the dual-SkyCam model — an unheard-of proposition just a season ago — has become the norm on high-profile A-game productions. The company also unveiled SkyCommand for at-home production in conjunction with The Switch, with plans to continue to grow this central-control model. And, last year, SkyCam worked with SMT to debut the 1st & Ten line and other virtual graphics on the SkyCam system; today, it is standard practice on almost any show using a SkyCam.
At Super Bowl LII, SkyCam once again deployed dual SkyCams, with the high-angle system focusing on an all-22 look and the lower SkyCam focusing on play-by-play. SVG sat down with Chief Technology Officer Stephen Wharton at U.S. Bank Stadium during Super Bowl Week to discuss SkyCam’s role in NBC’s game production, the rapidly growing use of the dual SkyCams by broadcasters, NBC’s use of the system as the primary play-by-play game camera on a handful of Thursday Night Football games this season, and an update on the company’s SkyCommand at-home-production control system, which was announced at NAB 2017.
Tell us a bit about your presence at U.S. Bank Stadium and the role SkyCam will play in NBC’s Super Bowl LII production.
We were fortunate enough to be here with Fox for the Wild Card Game, and that allowed us to keep a majority of our infrastructure in place. Also, when the stadium was built, they built in a booth for SkyCam and cabled the building, so that obviously helped us quite a bit. But we’ve been here since Sunday working with the halftime show to make sure that our rigging isn’t in the way of them and they’re not in the way of us. And then, Monday, full crew in for Tuesday first-day rehearsal, and then all the way through the week.
In a matter of months, several major NFL broadcasters have adopted the dual-SkyCam model. What are the benefits of two SkyCams?
We used to say you knew you had a big show when you had SkyCam on it. Now you have a big show when you have two SkyCams on it. I think one of the key driving factors for [the increased use of] dual SkyCam was working with the NFL and the broadcasters to better highlight Next Gen Stats. And, working with SMT on their auto render system, one of the big values that we now bring is this ability to show you the routes and what’s going on with each player as the play develops from the overhead all-22 position.
It just so happened that, as the dual systems started to evolve, we got this amazing opportunity in Gillette Stadium when the fog came in and no other cameras could be used. Typically, you think of SkyCam as being used for the first replay camera; we’re not necessarily live. But, in that instance, we had to go live with SkyCam, and the first replay became the high SkyCam. That opportunity changed how we are seen and used. It demonstrated what you could do with SkyCam, and that obviously penetrated all the other networks. You get two totally different angles, one more tactical and one play-by-play, and there’s really no sacrifice. You’re not giving anything up on the lower system; you’re actually helping because you don’t have to chase down beauty shots and comebacks since the upper system can do that. The lower system can just focus on play-by-play.
Do you expect the use of dual SkyCams for NFL coverage to continue to grow next season?
I think that you’ll continue to see the dual SkyCams become more of the norm, not just for the playoff games but for most A-level shows, because it brings such a value for both Next Gen Stats and the broadcasters. We’re obviously super excited about that.
I think there’s a bifurcation between audiences in terms of [SkyCam] as a primary angle: some really love it, and some don’t like it. But what you’re seeing in broadcast today with the growth of technology and evolving media is that people end up with a buffet of options to choose from: OTT, streaming, mobile, television, or something else. And there is a market for all of it. I think, at the national level, you’ll see more play-by-play action live from SkyCam because broadcasters will be able to use it and distribute it however they like.
At NAB 2017, you introduced SkyCommand, an at-home–production tool that allows SkyCam operators to be located remotely. Do you have any update on this platform, and are broadcasters using it already?
We have seen tremendous interest. People are asking where and when they can do this, but there are obviously a couple of different challenges we have to address: one, since it’s a cost-saving model, you’re looking at lower-tier shows in venues that don’t have much infrastructure in most cases. That said, when you take lower-tier games that happen to take place in venues that [have the necessary infrastructure], it becomes very appealing. Most of our network partners have been very interested in finding ways of utilizing SkyCommand for [at-home] production. [Our partners] Sneaky Big Studios and SMT are on board, and we’re looking at doing a lot more of it in 2018. We’ve actually got some pilot programs already.
Just a couple weeks ago, we relocated SkyCam into an 80,000-sq.-ft. facility a few miles down the road from our old facility. It’s a brand-new facility, built from the ground up, that’s tailored to our needs. We’ve got two entire broadcast booths with SkyCommand in mind. One is a network-operation center with full streaming capabilities and data connectivity to the games that we’re doing. Beyond SkyCommand, when our operators are onsite, we will have a guy in Fort Worth who is basically at the NOC watching the game. This person will be looking at the responses coming out of the computer systems and will be on PLs with the [on-site operators]. And then we can send that video back to the NOC and address any type of issues that we have; it gives us a great ability to manage that. The second booth is where we can actually put an operator and a pilot.
We’re continuing to work with the network vendors — The Switch, CenturyLink, and others — but we’ve already got full 10-gig fiber to the facility. So we’re working now to put all that in place for SkyCommand. I think you’ll see that more in 2018.
In what other sectors is SkyCam looking to grow in the near future?
We’re also trying to expand [permanent SkyCam installations] throughout the NFL. I expect that we will have some other announcements coming out shortly about additional teams building on what we did with the Baltimore Ravens last year. Those team SkyCams will continue to grow in 2018, and we’re looking at leveraging SkyCommand specifically for those cases.
February 5, 2018
Sports Video Group
SMT (SportsMEDIA Technology) is bringing a number of Super Bowl firsts to Minneapolis on both the broadcast and the in-venue production side. On NBC’s Super Bowl LII broadcast, SMT will deploy a telestrator on the high SkyCam for the first time and also will have the 1st & Ten line available on additional cameras. The in-venue production will offer the 1st & Ten line on the videoboards for the first time in a Super Bowl and will also feature enhanced NFL Next Gen Stats integration.
“It’s always exciting to do something brand new for the first time,” says SMT Coordinating Producer Tommy Gianakos, who leads the NBC SNF/TNF team. “And it’s even better when you’re doing it on the biggest show of the year with a lot of extra pieces added on top.”
In addition, during the Super Bowl LII telecast, NBC Sports’ production team will have access to a new telestration system on the high SkyCam for first replays.
“We’re now adding some telestration elements on SkyCam,” Gianakos explains. “In the past, we’ve been able to have a tackle-box [graphic] on one of the hard cameras if there’s an intentional-grounding play, but we haven’t been able to do it from high and low SkyCam on first or second replay. That intentional-grounding [virtual graphic] right above the tackles on SkyCam is something we haven’t been able to do before, but now we are able to do pretty instantaneously.”
SMT demonstrated it for NBC Sports producer Fred Gaudelli on Friday when a high school football team was on the field, and NBC opted to move forward with the system for the game.
“We’re able to do backwards-pass line virtually in real space; we’re able to measure cushions, able to paint routes on the field, all very rapidly,” says Ben Hayes, senior account manager, SMT. “It’s pretty unique to this show and the first time we’re going to be doing it on-air.”
In addition to having the live 1st & Ten line on both SkyCams and the same six hard cameras available for NBC’s Thursday Night Football and Sunday Night Football telecasts, SMT has added it to the two goal-line cameras, the all-22 camera, and two more iso cameras.
SMT also added next-gen DMX switchboard connectivity to NBC’s scorebug, so on-field graphics will update in real time and list personnel and formations of both teams.
“From a crew standpoint, it was really nice for us to have both Thursday Night Football and Sunday Night Football this season because it gave us a second group of people that understood the expectations of this show and what Fred and [director] Drew [Esocoff] really want from the show,” says Hayes. “We were basically able to merge those two crews for this game and not miss a beat.”
On the Videoboards: 1st & Ten Line, Enhanced Next Gen Stats
Fans at the stadium will be able to see the 1st & Ten line system on the videoboards. For the first time at a Super Bowl, the yellow virtual line will be deployed on three cameras — on the 50- and both 25-yard lines — for the in-venue videoboard production.
Also, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering real-time content not available on broadcasts.
“It’s amazing to be doing this here at Super Bowl,” says Ben Grafchik, business development manager, SMT. “Obviously, we can build upon the technology in the future, but this is our first step into it. And then I’m looking to try to continue that going forward.”
Fans inside U.S. Bank Stadium will have access to real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score–graphics publishing system that requires just a single operator.
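A passing grid of that sort reduces to bucketing attempts by target area and depth. The sketch below uses fabricated attempts and an assumed 15-yard deep/short cutoff; the real production is driven by the league's data feeds rather than this toy structure.

```python
# A minimal sketch of a quarterback passing grid: bucket each attempt by
# target area and depth, then show completions per zone. Attempt data and
# the 15-yard deep/short cutoff are fabricated assumptions.
from collections import defaultdict

attempts = [  # (air_yards, field_third, completed?)
    (4, "left", True), (12, "middle", True), (22, "right", False),
    (7, "middle", True), (31, "left", False), (9, "right", True),
]

grid = defaultdict(lambda: [0, 0])  # (depth, third) -> [completions, attempts]
for air_yards, third, completed in attempts:
    depth = "deep" if air_yards >= 15 else "short"
    grid[(depth, third)][0] += int(completed)
    grid[(depth, third)][1] += 1

for (depth, third), (comp, att) in sorted(grid.items()):
    print(f"{depth:5s} {third:6s}: {comp}/{att}")
```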
“We knew the Minnesota Vikings were already doing virtual and NFL Next Gen Stats, so we started thinking about what we could do to spice it up for the Super Bowl,” says Grafchik. “We’re throwing a lot of things at this production in hopes of seeing what sticks and what makes sense going forward for other venues.”
In the lead-up to the game, SMT worked with the league to merge the NFL Game Statistics & Information System (GSIS) feed with NFL Next Gen Stats API to come up with a simple lower-thirds graphics interface. This will allow the graphics operator to easily create and deploy a host of new deep analytics graphics on the videoboard during the game.
“These additional NGS elements get viewers used to seeing traditional stats along with nontraditional stats when they are following the story of the game,” says Grafchik. “If Alshon Jeffery has a massive play, the operator can instantly go with the lower third for his average receptions per target. The whole plan was to speed up this process so that this individual isn’t [creating] true specialty graphics; they’re just creating traditional graphics with extra spice on top of it. By getting quick graphics in like that, it helps to tell a story to the viewer in-venue without much narration on top of it.”
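In spirit, that merge is a join of two feeds on a shared player ID, with the result templated into a one-line graphic. The sketch below is a hedged illustration: the field names, IDs, and stat values are hypothetical, and the real GSIS and Next Gen Stats APIs are not shown.

```python
# A hedged sketch of the feed merge described above: traditional stats
# joined with a tracking-derived metric on a shared player ID to fill a
# one-line lower third. Field names, IDs, and values are hypothetical.
gsis_stats = {   # traditional box-score-style feed (hypothetical)
    "00-0031234": {"name": "A. Jeffery", "receptions": 5, "targets": 7},
}
ngs_metrics = {  # tracking-derived metrics (hypothetical)
    "00-0031234": {"avg_separation_yds": 3.1, "top_speed_mph": 20.4},
}

def lower_third(player_id):
    base = gsis_stats[player_id]
    extra = ngs_metrics.get(player_id, {})
    line = f"{base['name']}: {base['receptions']} rec on {base['targets']} targets"
    if extra:
        line += f" | {extra['avg_separation_yds']} yds avg separation"
    return line

print(lower_third("00-0031234"))
```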
February 4, 2018
Sports Video Group
Since the first beam went up on this massive structure in Downtown Minneapolis, U.S. Bank Stadium has been building to this moment. Super Bowl LII is here, and an all-star team from Van Wagner Sports & Entertainment Productions, stadium manager SMG, and the Minnesota Vikings is ready to put on a Super Bowl videoboard production for the ages.
When 66,000-plus pack into the sparkling bowl, they’ll be treated to quite a few in-venue firsts on those boards, including the Super Bowl debut of SMT’s Yellow 1st & 10 line, a completely new Super Bowl LII graphics package, and an expanded arsenal of camera angles.
“Every Super Bowl, we’re tasked with moving the needle,” says Bob Becker, EVP, Van Wagner Sports & Entertainment (VWSE) Productions, which has designed the videoboard. “What can we do differently this Super Bowl that we haven’t done in the past? That’s our constant challenge. This is my 23rd [Super Bowl], and, every year, it gets bigger and bigger and bigger. When it’s over, you say, ‘Wow, what a great job,’ and then you start stressing about next year and wonder, ‘Well, how do we top that?’ That’s how I feel about that: you’ve got to always up your game.”
The stadium’s crown jewels are a pair of Daktronics video displays behind the end zones that measure 68 x 120 ft. and 50 x 88 ft., respectively. This year, for the first time at a Super Bowl, those boards will feature a full complement of the Yellow 1st & 10 line. SMG and the Vikings had a standing relationship with North Carolina-based SMT throughout the season, offering the yellow line encoded on their 50-yard-line camera. For the Super Bowl, they chose to expand it to include the other main cameras at each of the 20-yard lines. SMT’s Ben Grafchik will be sitting at the front of the control room, preparing specialty data-driven graphics, tickers, and data feeds for the control-room crew to call up as they desire.
Those advanced graphics are part of a completely fresh graphics package that Van Wagner has developed for this game. It’s the classic hard work done by the company: build a season’s worth of graphics to be used on a single night. Also, not only does Van Wagner come in and take over the U.S. Bank Stadium control room, but its team has basically torn it apart, pulling out gear and replacing it with specialty systems in order to take the videoboard show to that next level.
“It’s not because it’s not good,” says Becker, “but that’s how we make it bigger and better. Sometimes, you’ve got to bring technology in to make it bigger and better. And, to these guys’ credit, they have not only been there from Day One for us but have been open to allowing us to tear apart their room and integrate these new things. And it happens a lot that they go, Hey, you know something, I’d love to use that for a Vikings season next year. So there’s benefit on both sides.”
One of the vendors that has gone above and beyond for the control room is Evertz. The company has provided a crosspoint card for redundancy and the EQX router while also supplementing with some spare input cards, output cards, and frame syncs.
It’s a challenging effort to make temporary alterations to the control room, but SMG and the Vikings have welcomed the opportunity to expand with open arms.
“There’s a reason I took this job,” says Justin Lange, broadcast operations coordinator for U.S. Bank Stadium, SMG. “This is a prestigious event, and this is big for this city, the Vikings, and for us as a company. It’s been a great experience. It’s a great opportunity for us to showcase what we can do with this room, what we can do with these boards. The sightlines are great in this facility. The boards are great, the IPTV system is expansive, and we’re just excited to showcase what we have to offer as a facility.”
Normally, the control room features both Evertz IPX and baseband routing, an 8M/E Ross Acuity switcher with 4M/E and 2M/E control panels to cut secondary shows, and Ross XPression graphics systems. The all-EVS room houses a wide range of EVS products, including three 12-channel 1080p replay servers, one 4K replay server, IPDirector, Epsio Zoom, and MultiReview.
For the Super Bowl, the control room will have more cameras to choose from than it has ever had before: a total of 18 in-house cameras are deployed throughout the bowl (more than the normal eight for a Vikings game), including four RF handhelds, an RF Steadicam, and two robotics.
The crew is also an impressive sight to behold. Nearly 100 people are working on the videoboard show in the combined efforts between Van Wagner, SMG, and the Vikings. There’s also a handful of editors across the street in the 1010 Building (where many broadcasters have set up auxiliary offices) cutting highlight packages and team-specific content.
“This is the biggest event in the world,” says Becker, “and we and the NFL mean to acknowledge that. We’re willing to do what needs to be done to put on the biggest event in the world.”
February 2, 2018
NBC Sports
NASCAR will provide its teams with more data in real time this season, giving them access to publicly available steering, brake, throttle and RPM information as well as live Loop Data for the first time.
The information will be provided for every driver on every lap of every session on track.
The steering, brake, throttle and RPM information has been available through NASCAR.com’s RaceView application, which uses the information provided by the electronic control units used in the electronic fuel injection systems. Some teams have created labor-intensive programs that scraped the data from RaceView, so NASCAR decided to save time and effort for teams by directly providing the information.
No other engine data will be released. The ECU can record 200 channels of information (of a possible 1,000 parameters). NASCAR assigns about 60 channels (including the steering, brake, throttle, and RPM), and teams can select another 140 channels to log through practices and races. Those channels will remain at the teams’ discretion and won’t be distributed by NASCAR.
NASCAR’s real-time data pipeline to teams this season also will include Loop Data, which was created in 2005 and has spawned numerous advanced statistical categories that have been available to the news media. The information was born out of a safety initiative that installed scoring loops around tracks after NASCAR ended the practice of racing to the caution flag in ‘03.
Previously, teams had been provided only lap speeds/times; now they will have speeds in sectors around the track marked by the scoring loops.
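Sector speed itself is simple arithmetic once the loop crossings are timestamped: the known distance between consecutive loops divided by the time between one car's crossings. The loop spacing and crossing times below are fabricated for illustration.

```python
# Sector speed from scoring loops: distance between consecutive loops
# divided by the time between one car's crossings. Loop spacing and
# crossing times are fabricated for illustration.
loop_positions_ft = [0, 1320, 2640, 3960]   # loop locations along the lap
crossings_s = [0.00, 4.92, 9.71, 14.55]     # one car's crossing times

for i in range(1, len(loop_positions_ft)):
    dist_ft = loop_positions_ft[i] - loop_positions_ft[i - 1]
    dt = crossings_s[i] - crossings_s[i - 1]
    mph = (dist_ft / dt) * 3600 / 5280
    print(f"Sector {i}: {mph:.1f} mph")
```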
Teams still won’t be given Loop Data for the pits, where the scoring loops are installed to maintain a speed limit for safety. If a scoring loop in the pits were to fail during a race, teams theoretically could take advantage of that by speeding through that loop (particularly those whose pit stall is in that sector). NASCAR does provide teams with pit speeds after races.
February 2, 2018
Stadium Business
The NFL’s popular Next Gen Stats data feed is getting a boost from real-time data delivery and graphics solutions firm SportsMEDIA Technology (SMT) for Super Bowl LII at U.S. Bank Stadium.
For the championship game this Sunday in Minneapolis, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts.
SMT’s in-stadium production delivers in-game stats to the stadium’s two massive video boards, as well as to 2,000 in-concourse HD video displays.
U.S. Bank Stadium, home of the Minnesota Vikings, boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The 65,000 fans at Super Bowl LII will be presented with real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?).
“As an organisation, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, senior manager of production for the Vikings.
“For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
Ben Grafchik, SMT’s business development manager, said: “In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing.
“We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
This year’s Super Bowl pits the New England Patriots against the Philadelphia Eagles.
January 31, 2018
Business Wire
DURHAM, N.C.--(BUSINESS WIRE)--SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for the sports and entertainment industries, today announced it is providing in-stadium solutions, including its Emmy-winning virtual 1st & Ten line system and the NFL’s new Next Gen Stats, for Super Bowl LII, to be held Feb. 4 at U.S. Bank Stadium.
For Super Bowl LII, SMT is providing a new version of the NFL’s Next Gen Stats data feed unique to the game, offering fans real-time content not available on broadcasts. SMT’s in-stadium production combines in-game stats integrated into SMT-designed graphics packages that are displayed on the stadium’s two massive video boards, as well as 2,000 in-concourse HD video displays, offering fans a chance to watch highlights and stay informed no matter where they are in the stadium. U.S. Bank Stadium boasts 31,000 square feet of video boards, including the west end zone display at 120 by 68 feet and the east end zone display at 88 by 51 feet.
The more than 65,000 football fans attending the Super Bowl will be treated to a variety of valuable real-time team and player data, ranging from positional information (Who’s on the field?) to game leaders (Who’s the fastest on the field today? Who’s had the longest plays today?) and quarterback passing grids (How has this QB fared in these zones today?). The production is made possible by SMT’s Dual-Channel SportsCG, a turnkey clock-and-score graphics publishing system that requires just a single operator.
“As an organization, the Minnesota Vikings constantly look for innovative strategies that provide the best fan experience possible, and SMT’s in-stadium solution is the perfect complement to our new video boards,” said Allen Wertheimer, Senior Manager of Production for the Minnesota Vikings. “For years, we’ve heard from fans that they want the same innovative technology in-stadium that they get at home. Now, with SMT’s presentation of the virtual 1st and Ten system and the NFL’s Next Gen Stats on the video boards, we can offer them in-game stats they wouldn’t get watching from home.”
“In anticipation of creating the ultimate Game Day experience for Super Bowl fans at U.S. Bank Stadium, we have worked diligently all season with the Vikings and the NFL to provide in-stadium 1st & Ten graphics and NFL’s Next Gen Stats, giving fans the real-time data they’re hungry for, such as positional information, game leaders, and quarterback passing,” said Ben Grafchik, SMT Business Development Manager. “We are confident that our execution will provide quantifiable and unique data points that truly highlight the skills inherent in elite NFL athletes.”
In addition to in-stadium solutions, SMT will provide broadcast solutions for Super Bowl LII, including the virtual 1st and Ten system, data-driven graphics and tickers, and in-game data feeds to commentator touchscreens, among other services. SMT has supported Sunday Night Football on NBC since 2006.
About SMT
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious events. The 31-time Emmy Award-winning company is headquartered in Durham, N.C.
January 30, 2018
Sports Video Group
With the Madden NFL 18 Club Championship Finals in full swing this week and the recent announcement of a new TV and streaming deal with Disney/ESPN, EA’s Madden NFL Championship Series is squarely in the esports spotlight. The series has been moving toward this moment for months, with 11 NFL teams hosting events in which fans competed to advance to the Finals in Minneapolis this week. In its first foray into competitive gaming, SMT’s Video Production Services (VPS) group produced events for the Arizona Cardinals, Buffalo Bills, and Jacksonville Jaguars throughout the end of 2017.
“SMT’s experience with supporting top football shows like the Super Bowl and Sunday Night Football makes us uniquely positioned to attract Madden gamers to the NFL through the medium they are most attracted to: esports,” says C.J. Bottitta, executive director, VPS, SMT. “With a worldwide fan audience now estimated at 280 million, approaching that of the NFL, SMT is excited to enter the growing market of competitive gaming.”
Although the level of services SMT provided varied from show to show, the base complement for all three productions comprised a full technical team of broadcast specialists operating six cameras, multiple replay machines, and a telestration system. SMT kept pace with Madden’s lightning-quick style of play for the three-hour shows streamed on the EA Sports YouTube channel, Twitch.TV/Madden, and the EA Sports Facebook page. In addition, SMT’s Creative Studio customized EA’s promotional trailer with team-specific elements for each of the three events.
“We started doing [Madden events] with teams last year, and there has been an evolution from wanting a [small-scale] podcast-level environment to almost a broadcast-level show,” says Bottitta. “What I loved about the three teams this year was how passionate and excited they were to be doing this. Teams were handling events very differently, but all of them had great people to work with and did a wonderful job.”
Inside the Production: University of Phoenix Stadium, Glendale, AZ
The Cardinals’ Madden NFL 18 Club Championship took place on Saturday, Nov. 11, soon after the team’s Thursday Night Football home game against the Seahawks, creating a quick turnaround for SMT and the team’s production staff. SMT provided the producer (Bottitta), director, tech manager, and lead camera operator and advised on what should be added for the production.
“We primarily provided leadership for the Cardinals,” says Bottitta. “They have a fantastic facility, so we reviewed with their tech group what they had and what they needed to add for [a competitive-gaming production] like this. They have a fantastic control room, and they used the crew that they normally use except for the producer, director, tech manager, and lead cameraman, which we provided.”
Inside the Production: New Era Field, Buffalo, NY
In Buffalo, SMT provided a similar level of services for the Bills’ event on Saturday, Dec. 2, the day before the team faced off against the New England Patriots. SMT worked with the Bills to manage around other shows using the team’s studio at New Era Field: a simulcast radio show, a pre/postgame show for the Buffalo Sabres, and Bills GameDay on Sunday.
SMT once again used the team’s crew primarily but provided its own producer, director, tech manager, and camera ops and added a stage manager.
“Buffalo was on a real-time crunch,” says Bottitta, “so they told us the studio they wanted to use, the schedule of the studio, and asked us what was reasonable to expect. We guided them through what would make the most sense, so we could get in there, have a rehearsal and set day and then do the show while also allowing them to still do their normal duties.”
Inside the Production: Daily’s Place Amphitheater, Jacksonville, FL
SMT ramped up its role at the Jaguars’ event, which took place the morning of a home game against the Seahawks on Dec. 10. Since it was a game day, the Jaguars crew was occupied handling the in-venue production, so SMT essentially handled the entire Madden production at Daily’s Place Amphitheater, which is connected to EverBank Field. Since the two events were happening concurrently, the Jaguars provided SMT access to their router, allowing live camera views of warmups to be integrated into the Madden show throughout.
“The Jaguars [production] was the most unique of the three because it was on game day,” Bottitta explains. “They wanted to host it on the morning of what ended up being a very meaningful December football game for the Jaguars for the first time in a long time. Since the game-day crew was obviously busy, we did the whole show. We were taking Seattle and Jacksonville warming up on the field as bump-ins and bump-outs for our show, which was great and really captured the energy of the game.”
The Broadcast Mentality: Madden NFL Coverage Continues To Evolve
As the Madden NFL Club Championship grows (all 32 NFL franchises were involved for the first time this year, with prize money totaling $400,000 at this week’s Championship), the property has made an effort to boost its production value for live streams. Bottitta believes that SMT’s experience on A-level NFL productions, including Sunday Night Football and this weekend’s Super Bowl LII, was integral in the league’s selecting SMT: “I think that made a big difference: knowing that we weren’t just a group that’s doing one more esports tournament; this is a group that does professional sports production.”
He adds that VPS aims to leverage this broadcast-level expertise by bringing in such tools as replay systems and telestrators, which would be standard on an NFL telecast.
“We tried to bring a [broadcast] philosophy to these shows and want to make it more consumable for the viewers,” he says. “We brought telestrators and replay to all of the [productions], and that was not the norm when EA launched [the Club Championship] last year. I did that not only because SMT has a very portable, very easy-to-implement telestrator system but because it really adds to the show. If you went to a game and didn’t see replays or the key camera angles, you’d be in shock. So that became a big part of our production plan.”
January 19, 2018
Sports Video Group
As the Jacksonville Jaguars look to stymie the New England Patriots’ quest for a sixth Super Bowl victory, CBS Sports will cover this Sunday’s AFC Championship from every angle — including overhead.
CBS Sports will deploy 39 cameras in Foxborough, MA: seven super-slow-motion cameras, eight handhelds, and a Steadicam; pylon cams; and a collection of 4K, robotic, and Marshall cameras. The network will also have access to Intel 360 cameras for 360-degree replays. To give viewers an aerial view, CBS will rely on a dual SkyCam WildCat aerial camera system and fly a fixed-wing aircraft over Gillette Stadium.
The CBS Sports crew will work out of NEP SSCBS and have access to 152 channels of replay from 14 EVS servers — four eight-channel XT3’s and 10 12-channel XT3’s — plus a six-channel SpotBox and one 4K server.
CBS Sports’ lead announce team Jim Nantz, Tony Romo, and Tracy Wolfson will have plenty of storytelling tools at their fingertips, including SMT’s Next Gen Tele and play-marking systems with auto-render technology on both SkyCams. The lower SkyCam will focus on the actual game play at the line of scrimmage, including the quarterback’s point of view, while the upper SkyCam will provide a more tactical, “all-22” look at the field. During the AFC Championship, Romo will be able to use these tools to break down what he sees on the field for first and second replays.
Coverage begins at 2:00 p.m. ET with The NFL Today, featuring host James Brown and analysts Boomer Esiason, Phil Simms, Nate Burleson, and Bill Cowher at the CBS Broadcast Center in New York City; kickoff follows at 3:05 p.m. ET. Fans wanting to start their day even earlier can tune in to The Other Pregame Show (TOPS) on CBS Sports Network, which runs from 10:00 a.m. to noon.
January 12, 2018
Sports Video Group
The Tennessee Titans travel to New England this weekend to take on the reigning Super Bowl champions in the AFC Divisional Round. To capture the action on the gridiron from every angle, CBS Sports will rely on dual SkyCam WildCat aerial camera systems with SMT’s Next Gen Tele and play-marking systems, as well as its virtual 1st & Ten line.
The Next Gen Tele System, which debuted during last year’s AFC Divisional Round, channels the NFL’s Next Gen Stats (NGS) data into an enhanced player-tracking telestrator. Combined with SMT’s proprietary play-marking system, which enables rendering of four virtual-player routes on the SkyCam video and its virtual 1st & Ten line, Next Gen Tele System provides a multitude of options for on-screen graphics that CBS Sports talent can leverage to better tell the story of the game.
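At its core, route rendering of this kind means taking each tracked player's trail of field positions for the play and mapping it into screen space over the SkyCam image. The sketch below uses fabricated tracking samples and a placeholder field-to-screen transform; the production system calibrates against the actual SkyCam camera model.

```python
# A rough sketch of the route-rendering idea: map each tracked player's
# trail of field positions into screen space for drawing over the SkyCam
# image. Tracking samples and the field-to-screen transform are stand-ins.
tracking = {  # player -> (x_yd, y_yd) field positions during the play (made up)
    "WR 11": [(35, 10), (40, 10), (45, 12), (50, 18)],
    "WR 17": [(35, 43), (41, 43), (47, 40), (52, 35)],
    "TE 85": [(36, 25), (40, 26), (44, 30)],
    "RB 26": [(33, 27), (36, 27), (38, 24)],
}

def field_to_screen(x_yd, y_yd, scale=12.0, origin=(80, 60)):
    """Placeholder mapping; in practice this is a calibrated camera projection."""
    return origin[0] + x_yd * scale, origin[1] + y_yd * scale

routes = {
    player: [field_to_screen(x, y) for x, y in trail]
    for player, trail in tracking.items()
}
for player, points in routes.items():
    print(player, "->", [(round(u), round(v)) for u, v in points])
```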
“From a production standpoint, everything is about storytelling and conveying the story behind the game,” says Robbie Louthan, VP, client services and systems, SMT. “It’s handled in many different ways, but one way is obviously graphics. The advantage there is, you’re able to tell relevant, compelling information in a quick and succinct way without having to have the talent verbalize it to [viewers]. When you can get it reduced down to a graphic that is relevant to the viewer, you’re guaranteeing that the information you want to convey is being handled in a very quick, succinct manner, because there’s a very short time frame between plays.”
During Saturday’s game, SkyCam will focus the lower camera system on the actual game play at the line of scrimmage, showing the quarterback’s point of view. The upper system will provide more of a tactical, “all 22” look at the field. Both systems will feature SMT graphics that enhance their respective camera angles and roles.
“Our camera angle creates a view that helps tell the story better than other camera angles,” explains Stephen Wharton, CTO, SkyCam. “Our view just establishes the storytelling for those graphics better than any other camera can, and then, when you add the motion that our camera brings with it, those graphics, whether NGS routes, lines, or first-down markers, get placed very well within the angle of the shot, so that the story is being told.”
SMT will deploy four staffers to Gillette Stadium to support the graphics on the dual SkyCam system: one operator to support the Next Gen Tele System, a dedicated operator for each of the camera systems, and one to oversee the operation and help produce the content. SkyCam will have a team of nine on the ground in New England, including five operators on the lower camera system (an engineer in charge, an assistant, a rigger, a pilot, and an operator responsible for the camera’s pan/tilt/zoom) and four on the upper camera system (an EIC, rigger, pilot, and PTZ operator).
The same system will return the following week during the AFC Championship Game, and similar systems will appear in other games throughout the NFL playoffs. And, while the action on the gridiron is sure to excite throughout the playoffs, the graphics overlaid on the dual SkyCam system will only increase the level of storytelling that the talent can deliver and fans can expect.
“We’re excited about showing off a new way of using Next Gen Stats and really focusing on where the players are running, where the routes are, and creating that sort of Madden look, if you will,” says Wharton. “If you [look at the broadcasters, they’re] usually telestrating: they’re saying, Here’s this guy, and they draw the little yellow line of where he ran. Now we’re leveraging the NFL’s Next Gen Stats system to get that data to create the graphics with SMT and then overlay that from our angle. It creates a very compelling shot.”
Echoes Louthan, “It’s another tool in the toolkit for the announcers — in this case, for [analyst] Tony Romo to use graphics to help tell the story of what he sees. It has been exciting for us to work with Tony on fine-tuning these graphics to [enable] him to use his incredible insight into the game to tell the story.”
SMT (SportsMEDIA Technology), the leading innovator in real-time data delivery and graphics solutions for sports broadcasts, and SkyCam, which specializes in cable-suspended aerial camera systems, are continuing to deliver technological innovations to CBS Sports’ broadcasts of the AFC playoff games, including Saturday’s Tennessee Titans vs. New England Patriots contest, Sunday’s Jacksonville Jaguars vs. Pittsburgh Steelers game, and the AFC Championship on Jan. 21.
SMT will provide its Next Gen Tele system, an enhanced player-tracking telestrator that harnesses the NFL’s Next Gen Stats data and SMT’s proprietary play-marking system to instantly render four virtual player routes on SkyCam video, available to the producer and talent at the end of every play. This “first-replay series, every replay” availability makes SMT’s system a true breakthrough, with the NFL’s Next Gen Stats data driving meaningful content as an integral component of live NFL game production. The system debuted last year in the AFC Divisional Playoffs.
Using dual SkyCam WildCat aerial camera systems to enhance its broadcast, CBS Sports has made the “Madden-like” experience standard, giving football fans a more active and dynamic viewing experience behind the offense, revealing blocking schemes, defensive fronts, and throwing windows and providing a deeper understanding of plays. Combined with SMT’s virtual 1st & Ten line, placed from SkyCam images, the result is a new, modernized look for NFL games. SMT, through its offices in Durham and Fremont, has supported CBS NFL broadcasts since 1996.
“Used in conjunction with SMT’s virtual technology, fans have embraced the enhanced coverage made possible with dual SkyCam systems, a look that younger viewers have come to expect in their games,” said Stephen Wharton, CTO, SkyCam. “With SkyCam, fans get the benefit of a more complete view of the action and play development – we place them right into the action in real-time. Sideline cameras force fans to wait for replays to get a sense of what receivers and quarterback were seeing. With SkyCam, no other camera angle is as immersive or engaging.”
“SMT’s ability to place virtual graphics from SkyCam opens up a plethora of possibilities for broadcasts in terms of augmented reality applications with advertising content, player introductions on the field, or a whole host of possibilities,” said Gerard J. Hall, CEO, SMT. “The potential with our technology is limitless.”
SMT, a leading innovator and supplier of real-time data delivery and graphics solutions for the sports and entertainment industries, provides clients with cutting-edge storytelling tools to enhance their live sports and entertainment productions. SMT’s technology for scoring, statistics, virtual insertion and messaging for broadcasts and live events has been used to enhance the world’s most prestigious live events, including the Super Bowl, NBC Sunday Night Football, major golf and tennis events, the Indianapolis 500, the NCAA Tournament, the World Series, ESPN X Games, NBA on TNT, NASCAR events, and NHL games. SMT’s clients include major US and international broadcasters as well as regional and specialty networks, organizing bodies, event operators, sponsors and teams. The 31-time Emmy Award-winning company is headquartered in Durham, N.C., with divisions in Jacksonville, Fla., Fremont, Calif., and London, England.
Headquartered in Fort Worth, Texas, SkyCam is a leading designer, manufacturer and operator of mobile aerial camera systems. SkyCam plays a significant role in changing the way sporting events are broadcast around the world, appearing at marquee broadcast events such as the NFL Super Bowl, NCAA Final Four, NBA Finals, Thursday Night Football, Sunday Night Football, NCAA college football, the 2015 CONCACAF Gold Cup, and the 2014 FIFA World Cup. SkyCam is a division of KSE Media Ventures, LLC.
January 08, 2018
TV Technology
NEW ORLEANS—New Orleans Saints and Carolina Panthers receivers and quarterbacks weren’t the only ones concerned about what was in and out of bounds Sunday (Jan. 7) in New Orleans during the NFC Wild Card game.
Fox Sports, which telecast the game, walked a different sort of line with its playoff coverage: the one between deploying technology that delivers the great shots needed to present game action and piling on new tech that actually gets in the way of coverage.
“We don’t want to make things all that different for the production team and give them a whole bunch of stuff that they haven’t had before for the big games,” says Mike Davies, SVP of Field and Technical Operations at Fox Sports. Rather, the strategy is to start with a “base layer” of production technology used throughout the 17 weeks of the regular season and then deploy choice pieces of technology that will have the biggest impact on game production and allow Fox Sports to tell the best story, he says.
“A lot of this stuff we’ve used before and some just this year,” says Davies. “We just pick the best of the best to represent us.”
For example, for the three NFL playoff games Fox Sports is covering, the broadcaster will add a second, higher SkyCam to deliver a drone’s-eye view of plays that captures all 22 players on the field. “Although you think of how over the top two SkyCams might sound, it turns out to be very useful,” says Davies. Fox Sports first used the dual SkyCam setup during the preseason and then again in Week 5 for the Packers vs. Cowboys game. “I think that camera angle is new enough that we are still learning what it can do,” he says.
The broadcaster recognized the upper SkyCam “was something special” in Week 5 during a play involving Cowboys running back Ezekiel Elliott. “He jumped over that pile and no camera, including the lower SkyCam, saw that he had reached out over the first-down line [except for the new upper SkyCam],” he says. “At least for that moment, we were sold that this is something special and something we wanted to offer.”
However, camera enhancements—both in terms of numbers and applications—aren’t limited to the second SkyCam. For its NFL playoff coverage, Fox Sports will deploy seven 8x Super Mo cameras, rather than the typical five. Fox also will use 6x Super Mo for its SkyCams, which it first did for its Super Bowl LI coverage in February 2017.
“There are so many replay opportunities in football, and the Super Mo gives this crisp—almost cinematic—look at the action,” says Davies.
The sports broadcaster also will take advantage of work it has done this year with SportsMEDIA Technology (SMT), SkyCam, and Vizrt “to cobble together a recipe” for doing augmented reality with the SkyCam, he says. The setup allows Fox Sports not only to put a live yellow line on the field of play in its SkyCam shots but also to place graphic billboards and other 2-D graphics on the field and fly around them with the SkyCam as if they were real objects.
“It’s a bit of an orchestration because the pilot of the SkyCam needs to be flying around the object as if it were an object on the field. If you break through it, it’s not going to look real,” says Davies.
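The article doesn’t spell out how that orchestration works under the hood, so the following is only a simplified, hypothetical sketch of the principle of a world-anchored graphic, not the Fox/SMT/Vizrt recipe. It assumes the camera system reports calibrated intrinsics and a pose for every frame; all names, poses, and dimensions are invented for illustration.

import numpy as np

# Hypothetical intrinsics for an instrumented camera (focal lengths, principal point).
K = np.array([[1800.0,    0.0, 960.0],
              [   0.0, 1800.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Billboard corners fixed in world space (meters): a 6 m x 3 m panel at midfield.
corners_world = np.array([[-3.0, 0.0, 0.0], [3.0, 0.0, 0.0],
                          [ 3.0, 0.0, 3.0], [-3.0, 0.0, 3.0]])

def project(points_world, R, t):
    """Project world points to pixels for a camera with rotation R and translation t."""
    cam = (R @ points_world.T).T + t       # world coordinates -> camera coordinates
    px = (K @ cam.T).T
    return px[:, :2] / px[:, 2:3]          # perspective divide

# The graphic "stays put" because only the tracked camera pose changes from frame
# to frame; re-projecting the same fixed corners each frame sells it as a real object.
R = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])           # camera looking along world +Y, Z up
for t in (np.array([0.0, 5.0, 30.0]), np.array([4.0, 5.0, 28.0])):   # two poses on the flight path
    print(project(corners_world, R, t).round(1))

This also illustrates Davies’s point about the pilot: the math only places the billboard in the image, so if the camera’s flight path passes through the space the graphic is supposed to occupy, nothing occludes or collides, and the illusion of a physical object breaks.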
Another enhancement is how Fox Sports will use its pylon cameras, says Davies. Rather than pointing the pylon cams positioned at the front of the end zone straight down the field, Fox will rotate them so they look down the field at a 45-degree angle.
“That gives you a way to cover a play where the camera is actually looking. Yes, you have the goal line, but you also have the out-of-bounds line as well,” he says. As a result, there are more game situations in which the pylon cameras can contribute to coverage. “The pylon cameras are a lot like catching lightning in a bottle. They are great, but you don’t want to use them unless you’ve got something that is really compelling,” says Davies.
While it is too soon to tell if the drop in viewership plaguing the league this season will carry over to the playoffs, Davies is confident that the right technology and production techniques have the potential to help fans reconnect with the game.
“I feel that what we are able to do using all of this incredible technology—the dual SkyCams, the Super Mo’s and the pylons—is that we are able to deliver that kind of experience in replay right after the play that also shows the emotions of players, not just what happens between the whistles,” he says.
Harkening back to his stint at HBO, Davies recalls the connection the cinematic style used for “Inside the NFL” created as “you watched a game that happened three or four days prior.” Today’s production tools give broadcasters that same opportunity to create that connection, he says. “I can’t help but think that these kind of storytelling tools, honestly, can only help,” says Davies.