America's Cup Makes Wind Visible for TV Viewers With AI
Capgemini program uses lidar to show virtual wind conditions at sailing competition
September 9, 2024
Fans watching the 37th America’s Cup on TV will for the first time be able to observe a critical but invisible part of the sport that previously was only discernible by the sailors themselves: the wind.
Multinational consultancy Capgemini and America’s Cup Media teamed to create WindSightIQ, a program that uses light detection and ranging (lidar), sensor fusion and scientific computation to render the wind as augmented- and virtual-reality graphics displayed on screen during broadcasts of the races, which began last month and run through October.
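To give a rough sense of what fusing lidar returns into a course-wide wind picture involves, here is a minimal, illustrative sketch in Python. It assumes scattered wind-vector samples and blends them onto a regular grid with inverse-distance weighting; the function names, data layout and fusion method are assumptions for illustration only, not the WindSightIQ implementation.

```python
import numpy as np

def fuse_wind_field(sample_xy, sample_uv, grid_x, grid_y, power=2.0):
    """Blend scattered wind samples into a gridded wind field.

    sample_xy: (N, 2) sample positions in metres.
    sample_uv: (N, 2) wind vectors (east, north) in m/s at those positions.
    grid_x, grid_y: 1-D arrays defining the output grid.
    Returns a (len(grid_y), len(grid_x), 2) array of interpolated wind vectors.
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    grid_pts = np.stack([gx.ravel(), gy.ravel()], axis=1)               # (M, 2)
    d = np.linalg.norm(grid_pts[:, None, :] - sample_xy[None], axis=2)  # (M, N) distances
    w = 1.0 / np.maximum(d, 1e-6) ** power                              # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)                                   # normalise per grid point
    fused = w @ sample_uv                                               # weighted average of samples
    return fused.reshape(len(grid_y), len(grid_x), 2)

# Example: five hypothetical lidar-derived samples fused onto a 100 m grid.
samples_xy = np.array([[100, 200], [400, 800], [650, 300], [900, 900], [250, 600]], float)
samples_uv = np.array([[4.2, 1.0], [5.0, 0.5], [4.6, 1.4], [5.3, 0.2], [4.8, 0.9]])
field = fuse_wind_field(samples_xy, samples_uv, np.arange(0, 1000, 100), np.arange(0, 1000, 100))
print(field.shape)  # (10, 10, 2)
```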
The wind field data collected by WindSightIQ is also fed into a yacht simulator to create a “ghost boat” that can be projected onto the racecourse to show the optimal paths the crews should take based on wind variables like direction, speed, shear and pressure. The data is not available to the boat crews, who must rely on their instincts and training to plot the fastest routes, much as their predecessors have done since the event began in 1851.
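A toy version of the “ghost boat” idea can be sketched as picking the heading that maximises velocity made good (VMG) toward the next mark, given the local wind and a simplified boat-speed polar. The polar curve, numbers and interface below are illustrative assumptions, not America’s Cup performance data or the actual simulator.

```python
import numpy as np

def boat_speed(true_wind_angle_deg, wind_speed_ms):
    """Toy polar: fastest on a reach, slower sailing dead upwind or downwind."""
    twa = np.radians(true_wind_angle_deg)
    return wind_speed_ms * (1.6 * np.abs(np.sin(twa)) + 0.3)

def best_heading(wind_from_deg, wind_speed_ms, bearing_to_mark_deg):
    """Return the heading (deg) and VMG (m/s) that best close on the next mark."""
    headings = np.arange(0, 360)
    # True wind angle: absolute angle between heading and the wind direction.
    twa = np.abs((headings - wind_from_deg + 180) % 360 - 180)
    speeds = boat_speed(twa, wind_speed_ms)
    vmg = speeds * np.cos(np.radians(headings - bearing_to_mark_deg))
    i = int(np.argmax(vmg))
    return int(headings[i]), float(vmg[i])

# Example: wind from 10 degrees at 6 m/s, next mark bearing 10 degrees (upwind leg).
heading, vmg = best_heading(wind_from_deg=10, wind_speed_ms=6.0, bearing_to_mark_deg=10)
print(f"ghost-boat heading {heading} deg, VMG {vmg:.1f} m/s toward the mark")
```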
“Before and during the race, commentators will now be able to see the real-time wind patterns and explain to the viewers the options for the competing yachts,” said Grant Dalton, CEO of the America’s Cup Event. “Being able to see the unseen wind and compare teams’ actual performances and tactical decisions to the optimum routes will mean audiences can follow and engage in the racing on a whole new level.”
This article first appeared in AI Business' sister publication IoT World Today.