Sony AI has unveiled Gran Turismo Sophy 2.0, a major upgrade to its racing AI system first introduced in 2022. The new version shows significant improvements in driving skill, decision-making, and real-time adaptability on the track. Developed in partnership with Polyphony Digital, Sophy 2.0 builds on the original’s foundation but now handles complex race scenarios with greater precision and control.
(Sony AI Develops Gran Turismo Sophy 2.0 with Enhanced Capabilities)
The AI was trained with deep reinforcement learning, refining its handling of vehicle dynamics through massive trial-and-error practice in simulation. It can now manage tire wear, fuel strategy, and weather changes more effectively than before. These enhancements allow Sophy 2.0 to compete at a level that challenges even the world’s top human drivers in Gran Turismo 7.
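To make the idea concrete: reinforcement learning means the agent improves a policy by acting, observing rewards, and updating value estimates. Sony AI has not published Sophy 2.0's training code, so the toy example below is purely illustrative — a tabular Q-learning loop over an invented tire-wear strategy problem (push for lap time vs. conserve the tires), with made-up rewards and dynamics. Real systems like Sophy use deep neural networks and a full racing simulator, but the learning loop follows the same shape.

```python
import random

# Hypothetical toy environment: states are discretized tire-wear levels,
# actions are "push" (fast but wears the tires) or "conserve" (steady pace).
# All numbers here are invented for illustration.
WEAR_LEVELS = 5          # 0 = fresh tires, 4 = fully worn
ACTIONS = ["push", "conserve"]

def step(wear, action):
    """Return (reward, next_wear) for the toy tire model."""
    if action == "push":
        reward = 1.0 - 0.4 * wear        # pushing pays off less on worn tires
        next_wear = min(wear + 1, WEAR_LEVELS - 1)
    else:
        reward = 0.5                      # steady pace regardless of wear
        next_wear = wear
    return reward, next_wear

def train(episodes=3000, laps=10, alpha=0.1, gamma=0.5, eps=0.2, seed=0):
    """Tabular Q-learning: act epsilon-greedily, update value estimates."""
    rng = random.Random(seed)
    q = {(w, a): 0.0 for w in range(WEAR_LEVELS) for a in ACTIONS}
    for _ in range(episodes):
        wear = 0                          # start each race on fresh tires
        for _ in range(laps):
            # Explore with probability eps, otherwise act greedily.
            if rng.random() < eps:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(wear, a)])
            reward, next_wear = step(wear, action)
            # Standard Q-learning update toward reward + discounted best next value.
            best_next = max(q[(next_wear, a)] for a in ACTIONS)
            q[(wear, action)] += alpha * (reward + gamma * best_next - q[(wear, action)])
            wear = next_wear
    return q

q = train()
```

After training, the learned values encode a sensible strategy: push while the tires are fresh, conserve once they are worn — the same kind of trade-off the article describes Sophy 2.0 managing, learned from reward signals rather than hand-written rules.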
During recent tests, Sophy 2.0 demonstrated its ability to overtake cleanly, defend positions without aggressive contact, and adjust lines based on traffic and track conditions. Its behavior mimics that of experienced racers, showing awareness of fair play and sportsmanship. This makes it not just faster, but also smarter and more respectful during multiplayer races.
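One common way to train this kind of fair-play behavior is reward shaping: the agent earns reward for race progress but is penalized for at-fault contact, corner cutting, or unfair blocking, so etiquette emerges from the incentives rather than from scripted rules. Sony AI's published research on the original Sophy described penalties of this kind; the specific terms and weights below are assumptions for illustration only.

```python
def shaped_reward(progress, caused_contact, off_track, blocked_unfairly):
    """Combine race progress with etiquette penalties (illustrative weights).

    progress: distance gained along the track this step.
    The penalty weights below are hypothetical, not Sony AI's actual values.
    """
    reward = progress
    if caused_contact:
        reward -= 5.0    # at-fault collision penalty
    if off_track:
        reward -= 2.0    # track-limits / corner-cutting penalty
    if blocked_unfairly:
        reward -= 3.0    # aggressive-blocking penalty
    return reward

# A clean overtake keeps its full progress reward; a contact-laden
# one is worth much less, so the agent learns to pass cleanly.
clean = shaped_reward(10.0, False, False, False)
dirty = shaped_reward(10.0, True, False, False)
```

Because the penalty for causing contact outweighs the small time gained by forcing a move, a policy trained against this signal tends toward exactly the behavior described above: overtaking cleanly and defending without aggression.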
Sony AI says the update reflects years of research into how machines can learn nuanced human-like behaviors in high-stakes environments. The team focused on making the AI responsive to subtle cues, such as slight shifts in opponent speed or positioning. This results in smoother, more natural interactions during races.
Gran Turismo Sophy 2.0 will be integrated into Gran Turismo 7 through a future update. Players will be able to race against it in new modes designed to test their skills against an AI that learns and reacts like a pro. Sony AI believes this technology could also support driver training and simulation development beyond gaming.


