Press "Enter" to skip to content

US Open won’t have spectators, but it will have IBM’s AI and hybrid cloud

Fans will be able to use IBM’s new tech features as they watch the Grand Slam event from home this year.

Fans can become instant “experts” about the players and the tournament match-ups with new AI-powered insights.

Image: IBM

This year, IBM has developed three new tennis-based digital experiences for fans of the US Open. Spectators won’t be allowed at the USTA Billie Jean King National Tennis Center in Flushing, NY, when the Grand Slam event begins on Aug. 31, but fans will be able to participate remotely through new experiences that use artificial intelligence (AI) underpinned by hybrid cloud technologies.


Two of the new solutions are based on Natural Language Processing (NLP) capabilities from IBM Watson. They pull from a variety of data sets and workloads running on multiple clouds. The new solutions were developed by IBM iX, a digital design agency. They’re available on official US Open platforms, including USOpen.org and the US Open app.

Fans can participate in Open Questions with Watson Discovery. This will give fans a way to engage remotely in iconic sports debates, such as who the most influential tennis player in history is. IBM will use the NLP capabilities in Watson Discovery to analyze millions of news and sports sources for insights. The unstructured data will be analyzed, summarized, and delivered to fans on their mobile devices or laptops.
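
To give a rough sense of what such a query looks like in code, here is a minimal sketch using IBM's Watson Python SDK (ibm-watson). The API key, service URL, and project ID are placeholders, and the configuration IBM uses for the US Open has not been published.

```python
# Minimal sketch: asking Watson Discovery a natural-language question.
# Requires the ibm-watson SDK (pip install ibm-watson); the credentials,
# service URL, and project ID below are placeholders, not the US Open's setup.
from ibm_watson import DiscoveryV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")
discovery = DiscoveryV2(version="2020-08-30", authenticator=authenticator)
discovery.set_service_url("https://api.us-south.discovery.watson.cloud.ibm.com")

# Ask an open-ended question against a project of ingested news and sports sources.
response = discovery.query(
    project_id="YOUR_PROJECT_ID",
    natural_language_query="Who is the most influential tennis player in history?",
    count=5,
).get_result()

# Each result can carry relevant passages that a downstream step would summarize
# into the fan-facing answer.
for result in response.get("results", []):
    for passage in result.get("document_passages", []):
        print(passage.get("passage_text"))
```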

SEE: Natural language processing: A cheat sheet (TechRepublic)

There will also be Match Insights with Watson Discovery, so that fans can get AI-powered insights ahead of each match. This uses natural language generation technology from IBM Research to translate structured data, such as statistics from prior matches, into narrative form.
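
IBM Research's generation pipeline itself isn't public, but the basic idea of turning structured statistics into narrative can be illustrated with a simple template-based sketch. The statistics, thresholds, and player names below are invented for illustration.

```python
# Illustrative sketch only: converting structured match statistics into a short
# narrative, in the spirit of Match Insights. IBM's actual natural language
# generation system is far more sophisticated; the data here is made up.
def match_insight(player: str, opponent: str, stats: dict) -> str:
    lines = [f"{player} faces {opponent} in the next round."]
    if stats["first_serve_pct"] >= 65:
        lines.append(
            f"{player} has landed {stats['first_serve_pct']}% of first serves "
            "this tournament, a strong foundation for holding serve."
        )
    if stats["break_points_saved"] >= 70:
        lines.append(
            f"Under pressure, {player} has saved {stats['break_points_saved']}% "
            "of break points faced."
        )
    return " ".join(lines)

print(match_insight(
    "Player A", "Player B",
    {"first_serve_pct": 68, "break_points_saved": 74},
))
```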

IBM will also use AI Sounds to recreate the sound of fans inside the stadium. IBM used its AI Highlights technology to capture and classify crowd sounds from previous years’ tournaments. In the past, AI Highlights used Watson to rank the excitement level of each video clip and compile a highlight reel in near-real time, and to classify specific crowd reactions, including the crowd roar, giving each clip a crowd reaction score. This year, that insight will be used to deliver those sounds based on similar play from last year. The AI Sounds tools will be available to production teams in the stadium and at ESPN.
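
As a simplified illustration of how archived clips scored for crowd reaction could be matched to similar live play, consider the sketch below. The clip archive, feature fields, and matching rule are hypothetical; IBM's actual AI Sounds pipeline has not been published.

```python
# Hypothetical sketch: picking an archived crowd reaction whose original play
# context best matches the current point. The clip IDs, point types, and scores
# are illustrative placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArchivedClip:
    clip_id: str
    point_type: str              # e.g. "ace", "break_point_won", "long_rally"
    crowd_reaction_score: float  # excitement score assigned by the highlights model

ARCHIVE = [
    ArchivedClip("2019_r3_017", "ace", 0.81),
    ArchivedClip("2019_qf_042", "break_point_won", 0.93),
    ArchivedClip("2019_f_008", "long_rally", 0.88),
]

def pick_crowd_sound(current_point_type: str) -> Optional[ArchivedClip]:
    """Return the highest-scoring archived reaction for a similar kind of play."""
    candidates = [c for c in ARCHIVE if c.point_type == current_point_type]
    return max(candidates, key=lambda c: c.crowd_reaction_score, default=None)

print(pick_crowd_sound("break_point_won"))
```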

All of this is underpinned by an open hybrid cloud. The new fan experience solutions pull from a variety of data sets and APIs running on the IBM public cloud and on private clouds. To handle these varied workloads, the USTA is using Red Hat OpenShift to run them across multiple public and private clouds.
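
For a flavor of what managing workloads across multiple clusters looks like in practice, here is a minimal sketch using the standard Kubernetes Python client, which also works against OpenShift's Kubernetes API. The kubeconfig context names and namespace are placeholders, not the USTA's real configuration.

```python
# Minimal sketch: inspecting deployments across multiple OpenShift/Kubernetes
# clusters from one place. Requires the kubernetes Python client
# (pip install kubernetes); context names and the namespace are placeholders.
from kubernetes import client, config

# One kubeconfig context per cluster, e.g. a public-cloud and a private cluster.
CONTEXTS = ["public-cloud-cluster", "private-cloud-cluster"]
NAMESPACE = "fan-experience"

for ctx in CONTEXTS:
    config.load_kube_config(context=ctx)  # switch to this cluster's credentials
    apps = client.AppsV1Api()
    deployments = apps.list_namespaced_deployment(NAMESPACE)
    names = [d.metadata.name for d in deployments.items]
    print(f"{ctx}: {names}")
```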


Source: TechRepublic