Back in May, IBM doubled down on its AI efforts, announcing at its annual Think conference the new Watsonx product platform, which provides a foundation model library that can be used to fine-tune pretrained models for enterprise application development.
Now, the company is serving up what it hopes is a generative AI ace: For the first time, it is offering AI-generated audio tennis highlights for all matches during the two-week-long U.S. Open Tennis Championships, as well as AI-powered analysis to determine the projected difficulty of player draws and potential opponents.
More than 700,000 people head to Flushing Meadows, New York, each year to watch the best tennis players in the world compete, while more than 10 million tennis fans around the world follow the tournament through the U.S. Open app and website. And for three decades, IBM has been working with the United States Tennis Association on creating digital experiences for tennis fans.
IBM’s data operations bunker
The effort begins in the basement-level IBM data operations center at Arthur Ashe Stadium, where millions of data points are captured and analyzed. There are typically 56 data points collected for every single point of a tennis match.
IBM is using generative AI models built, trained and deployed with Watsonx, and operating across a hybrid cloud infrastructure from Red Hat OpenShift, to generate detailed audio narration and captions to accompany U.S. Open highlight videos at unprecedented scale — for every match in the singles draw, across all 17 courts.
In addition, IBM debuted its Watsonx-powered AI Draw Analysis, which uses both structured and unstructured data to project the level of advantage or disadvantage of every player in the singles draw. Each player receives an IBM AI Draw Analysis at the start of the tournament, updated daily as the tournament progresses and players are eliminated. Every player's draw is ranked, and fans can click into individual matches to see its projected difficulty and potential opponents.
Previously, the USTA couldn’t cover highlights of all matches
Kirsten Corio, chief commercial officer at the USTA, told VentureBeat that with 128 men and 128 women playing singles in the U.S. Open — as well as doubles, juniors and wheelchair tennis matches — the organization couldn't cover the highlights of most of the matches throughout the tournament.
“Depending on how many writers you have, you can only do a few matches at a time,” she said. “The other matches would just have stats and scores, but no commentary, so those stories are untold.”
So the USTA and IBM began to think about how to scale tournament coverage by combining stats and stories with generative AI. “How could we use the data and technology to actually write highlights that would be reliable and accurate enough?” said Corio.
Corio added that the USTA dreams of including AI-generated highlights in different languages in the future. “We would love to do that in Spanish, to scale more engagement,” she said. “That’s the natural next step.”
Questions for IBM and USTA about AI hallucinations, data control
While the USTA has partnered with IBM on its technology efforts for decades, Corio pointed out that with today's advanced AI applications, being able to control the data and the ecosystem is key.
The USTA uses its own curated, official data, “but there are plenty out there who peddle in unofficial data,” she explained. “We’re not yet sure what the downstream effects of that could be, so we’re actually putting together a few different task forces across the company post-U.S. Open, to dig into how can it benefit us? How can we protect against any potential conflict?”
A more immediate concern is AI hallucinations — but in a presentation in the IBM data operations center beneath Arthur Ashe Stadium, an IBM spokesperson told VentureBeat that the company is doing human-in-the-loop quality checks on its AI commentary. "We're hoping over time we can reduce the need for human QA, but we do check each highlight clip, to make sure that the commentary is solid."