Apple’s Best New iOS 26 Feature Has Been on Pixel Phones for Years

Ever since I was a child, I’ve despised answering the phone when an unknown number calls. Who could be on the other end? Literally anyone: an acquaintance, a telemarketer, a serial killer who’s menacingly breathing into the mouthpiece.
While Apple’s upcoming Liquid Glass refresh in iOS 26 is likely to be the most immediately noticeable aspect of the software update as it starts rolling out to the public on September 15, I believe a smaller addition in iOS 26 might have an even bigger impact on how iPhone owners use their devices.
The iPhone is finally getting call screening. Hallelujah. At launch, the feature will support incoming calls in nine languages, including English, Spanish, and Japanese.
Once your iPhone updates to iOS 26, you can opt in and have the software automatically screen calls that come from unknown numbers. In this case, an unknown number is any phone number you haven’t interacted with before.
When your phone automatically picks up the call, a robotic voice asks the caller for their name as well as why they want to get in contact with you. Only after that information is collected does the iPhone ring and show you these details in a notification bubble so you can decide whether to answer.
I was ecstatic to see this new option as I experimented with a beta version of iOS 26. I’m constantly getting calls from so many unknown numbers that I’ve completely given up answering the phone for anyone not saved in my contacts list.
With the imminent release of iOS 26, I can make informed decisions to ignore or answer these calls. And while most of the calls will still be ignored, I no longer have to wait until the caller starts leaving a voicemail and the live transcription appears on the screen to make a decision.
Call screening will be new for iPhone owners this fall, but users of some Android smartphones, like Google’s Pixel, have had a version of this tool, named Call Screen, available to them for years. Lyubov Farafonova, a product manager at Google, says in a statement emailed to WIRED that millions of Pixel users are using the feature in the US alone. “It is one of our fan favorite features,” she says.
Since releasing Call Screen in 2018, Google has worked to make the synthetic voice sound more natural to incoming callers. It’s also started showing relevant replies as tappable options while the screening is in progress, so users can easily communicate with unknown callers without actually answering the phone. Further leaning into this feature, Google plans to roll out call screening to additional markets this fall.
“Pixel 10 owners in India can start experimenting with the beta version of manual Call Screen. This feature will be initially working in English and Hindi, with more languages and dialects on the way,” Farafonova says. “It will have a functionality to not only transcribe but also translate what the caller says to the Call Screen bot, to make life easier for those who don’t speak the same language as the caller.” Options for call screening, manual or automatic, are coming soon to Pixel owners in Australia, Canada, Ireland, and the UK as well.
This isn’t the first time a software feature has arrived on Google’s flagship Android device before eventually being adopted by Apple. The Pixel added a Magic Eraser for cleaning up photos in 2021, and Apple’s version of the photo retoucher dropped three years later. Another example is home screen widgets: Google added glanceable widgets, sharing info like the weather, to the Pixel home screen back in 2017, and a similar widget feature arrived on iPhone home screens three years later.
Of course, Android has borrowed a handful of things from iOS along the way. For example, Android’s Quick Share system for passing files between phones came long after Apple’s AirDrop. And Android’s Night Mode, which dims the screen and reduces its output of blue light at nightfall, copies Apple’s earlier Night Shift. (Apple declined to comment on the record for this story.)
Whereas Apple’s Liquid Glass offers an aesthetic that is distinct from the operating systems on other smartphones, the addition of call screening is more indicative of the current copycat software era. Any breakout feature is quickly mimicked by other companies, especially tools that use machine intelligence, an area where every player is eager to show off.
As a whole, these latest features Apple is building into iOS 26—not just call screening, but the new live translation abilities and the feature that will sit on hold for you when making a call—are notable in their reliance on machine intelligence. And this is where Google has an incumbent advantage over Apple. Google’s Gemini voice assistant and the LLM that powers it on the backend, while imperfect, are already widely used, and the last couple of generations of Pixel phones have been stacked with AI-enabled software. iPhones have had some AI features, such as notification summaries, search tools, and productivity helpers, but they don’t match the breadth of Google’s. Apple Intelligence, meanwhile, continues its slow roll into existence.
At WIRED, we’ll be testing all of the new aspects of iOS 26, like call screening and Liquid Glass, as the software update starts to roll out to millions of iPhone owners. My initial impression, while using the beta version, is that this really is the most dramatic update to the iPhone’s software I’ve seen in years. I’m still trying to determine whether that’s good or bad. Either way, just know now that if I’m ignoring your calls, it’s probably personal.