Formula E: Google Cloud set to create descriptive audio for visually impaired fans

Formula E and Google Cloud have announced a new initiative that should help make motorsport more accessible for blind and visually impaired fans through an innovative, AI-powered audio race report.
The project uses Google Cloud’s generative AI technology to create multilingual descriptive audio summaries of every E-Prix race.
The all-electric single-seater series has noted that “the reports will provide fans with a dynamic recap that captures the excitement and key moments of the race, available on-demand shortly after the chequered flag.”
The project is being developed in close partnership with the Royal National Institute of Blind People (RNIB) to ensure the final product meets the needs of visually impaired users.
Formula E and Google Cloud have already tested the new initiative at last weekend's Berlin E-Prix, and will continue to refine it at the London E-Prix ahead of the full rollout planned for Season 12.
Speaking of the initiative, Jeff Dodds, CEO, Formula E, said: “At Formula E, we believe the thrill of electric racing should be accessible to everyone. This innovative collaboration with Google Cloud is a fantastic example of how technology can be used for good, creating a brand-new way for blind and visually impaired fans to experience the drama and emotion of our sport.
"By working closely with the RNIB, we are ensuring this innovation is truly inclusive and fit for purpose, so that no fan is left behind.”
John Abel, Managing Director, Specialised Software, Google Cloud, said: "For too long, the visual nature of racing has been a barrier for fans who are blind or visually impaired. Google Cloud's AI technology will act as a digital storyteller, creating a vivid audio narrative that brings the speed, strategy, and excitement of Formula E to life.
"We are proud to work alongside a partner like Formula E that shares our passion for using innovation to break down barriers and connect people through shared experiences."
Sonali Rai, RNIB’s Media Culture and Immersive Technology Lead said: “Audio description transforms how blind and partially sighted motor sport fans can fully engage in enjoying the full racing spectacle - taking in the visceral sounds of cars on the track while feeling the passion of the crowd.
“RNIB has been working with Formula E and Google Cloud on this AI-powered podcast which promises to give a full picture of the race in an accessible and engaging way for blind and partially sighted racing fans.
"Formula E’s commitment to working directly with the blind and partially sighted community to develop this technology is exactly the right approach and sets a fantastic standard in inclusivity for other sports to follow and stay on track with new advances in innovation.”
As for the technology, the audio report is created through a multi-stage process powered by Google Cloud's AI platform, Vertex AI. Google's Chirp model transcribes the live race commentary, and Google's Gemini models then analyse the transcript alongside live timing data and other official race information.
The models identify key events – such as overtakes, incidents, and strategic pit stops – and generate a fact-based, engaging race summary.
Finally, the text is converted into natural, expressive speech using advanced text-to-speech technology, creating a polished audio report ready for distribution.
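The pipeline described above – transcribe, identify key events, summarise – can be pictured with a minimal sketch. This is an illustrative assumption only: the function names, keyword rules, sample commentary, and data shapes below are hypothetical stand-ins, not the actual Vertex AI, Chirp, or Gemini calls, which are not detailed in the announcement.

```python
# Hypothetical sketch of the event-identification and summary stages.
# Real event detection is done by Gemini models on transcribed commentary
# plus live timing data; here simple keyword cues stand in for that step.

KEY_EVENT_CUES = {
    "overtake": ["overtakes", "passes", "moves ahead of"],
    "pit stop": ["pits", "pit stop", "box"],
    "incident": ["crash", "contact", "safety car"],
}

def identify_key_events(transcript_lines):
    """Scan transcribed commentary for overtakes, pit stops and incidents."""
    events = []
    for line in transcript_lines:
        lowered = line.lower()
        for label, cues in KEY_EVENT_CUES.items():
            if any(cue in lowered for cue in cues):
                events.append((label, line))
                break  # one label per commentary line
    return events

def summarise(events):
    """Turn the event list into a short, fact-based race summary text,
    which would then be handed to a text-to-speech stage."""
    if not events:
        return "A quiet race with no major incidents."
    return " ".join(f"[{label}] {text}" for label, text in events)

# Invented sample commentary for illustration only.
commentary = [
    "Lights out and away we go in Berlin!",
    "The leader overtakes into Turn 6 for first place.",
    "The championship contender pits from second place.",
]
print(summarise(identify_key_events(commentary)))
```

In the real system the summary text would feed the text-to-speech stage described next; the sketch stops at the text output.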
The entire process is completed within minutes of the race’s conclusion. The reports will be available globally on Spotify and other popular audio platforms in more than 15 languages, including English, Spanish, French, German, Mandarin, and Arabic.