Apple is reportedly developing real-time language translation for its AirPods lineup, in a move that would expand the smarts of its ever-improving earbuds.
That’s according to industry insider Mark Gurman, who reports via Bloomberg that the feature is slated for inclusion in the iOS 19 update, which is expected to be teased at WWDC in June and launch in September.
The development would build upon Apple’s recent efforts to enhance the AirPods’ health and utility features, including hearing aid capabilities for the AirPods Pro 2 and heart-rate monitoring in the just-launched Beats Powerbeats Pro 2.
The proposed translation feature would deliver translated speech directly through the AirPods, while using the iPhone’s speakers to play the user’s translated responses aloud for the other person.
The approach would potentially mirror functionality already present in earbuds like Google’s Pixel Buds Pro, and it would echo much of what we saw in the early days of hearables from countless startups and crowdfunded projects. Way back in 2017, this writer even took the much-hyped Waverly Labs translation buds on a real-life date, without much success.
After years of relative quiet, though, real-time translation seems to be making a comeback. Plenty of smart glasses now tout it, and Apple adding it to the AirPods would likely only accelerate that momentum.
The potential addition also underscores Apple’s continued commitment to turning the AirPods into a more advanced, ‘proper’ wearable. We’ve covered plenty of rumors about added AirPods functionality since the first generation arrived in 2016, yet meaningful upgrades have only begun to arrive in the last year or two.
But we’ll keep an eye on this over the coming months. We’re still relatively far from WWDC 2025 and Apple’s annual September event, and, of course, it’s possible that the roadmap or priorities will change between now and then.