
- Apple has toned down the transparent Liquid Glass effect in the iOS 26 Developer Beta 3.
- The latest beta replaces the transparency effect with a blurry, frosted look, which is easier to read.
- This change appears to be a response to the public backlash and concerns raised after Liquid Glass was introduced in the first beta.
It seems like Apple has finally addressed the criticism following the launch of the Liquid Glass design language. The latest iOS 26 beta backtracks on the Liquid Glass design, as menu items, the Control Center, and the search bar in several apps ditch the overly transparent effect for a blurry, frosted look.
Apple just rolled out iOS 26 Developer Beta 3 for supported iPhones, and those who updated to the latest version were met with a surprising change. Several “glassy” elements, especially the transparency effects, have been toned down in the latest beta.

The once-transparent interface, which let users see background page elements through a glass-like effect, now appears more opaque. While this makes text and labels in those menus easier to read, and should be less resource-intensive, it is a departure from the initial Liquid Glass design Apple showcased at its developer conference.
Canadian tech influencer @RjeyTech mentioned, “Looks like Apple just gave up on Liquid Glass in iOS 26 beta 3. Probably because of public lash-out and it being too resource intensive.” Sam Kohl from AppleTrack criticized the move and said, “It looks so much cheaper now and feels like Apple is backtracking on their original vision.”
Whether it’s a change for better or worse depends on who you ask. Apple received a lot of backlash on X over the Liquid Glass design language when the first beta came out, so this could be a response to that. While I personally loved the Liquid Glass UI, I can understand why Apple decided to reverse course on its vision. But again, this is a developer beta, and nothing is set in stone until the stable release rolls out.

With over 4 years of experience under my belt, I cover all facets of consumer tech, from smartphones to other consumer electronics, our favorite social media apps, as well as the growing realm of AI and LLMs. As an Apps and AI writer at Beebom, I provide my expertise in all these areas, weaving stories that help you get familiar with the tech around you. But you will find me playing NYT daily puzzles in my free time.

- Apple is reportedly in talks with OpenAI and Anthropic to help power its upcoming AI-powered Siri voice assistant.
- The company is struggling to develop its LLM-based Siri, which will be more conversational and better at processing information.
- This will help Apple catch up in the AI race, where it has been falling behind the competition.
Apple has so far struggled to develop its new and improved Siri voice assistant, which is why a new report suggests that the company might be seeking help from AI giants like OpenAI and Anthropic to realize its AI-enabled Siri ambitions.
According to Bloomberg, the Cupertino giant is in talks with both OpenAI and Anthropic to build the next generation of AI-powered Siri, which could also run on Apple’s cloud infrastructure for testing.

We reported in May that Apple is working on a new LLM-based Siri, which will be more conversational and better at processing information. The company is likely having trouble building its own large language model, prompting it to reach out to other leaders in the AI industry.
Apple faced backlash after it failed to deliver the promised new Siri, delaying its plans from 2025 to 2026 or even further. Apple is also reportedly looking to buy Perplexity AI to help improve its current set of Apple Intelligence features.
To be honest, Apple is struggling to catch up in the AI race, while its competitors are racing ahead with rapid advancements. The company has to take some bold steps if it wants to catch up with Google’s Gemini, or it will end up far behind. And this could be why, instead of investing in and building in-house frontier AI models, Apple is seeking help from other AI players.


- Apple and Google have officially confirmed their team-up for the next-gen Siri AI.
- The Cupertino giant will be using Google’s Gemini for a more personalized Siri model as well as Apple Intelligence features.
- We can expect the next-gen Siri to come out with iOS 26.4, sometime in March or April.
Apple has officially confirmed joining forces with Google to use its Gemini AI model to power the next-generation Siri. It will offer a more personalized experience and will arrive with the iOS 26.4 update. Apple also plans to leverage Gemini’s capabilities for other Apple Intelligence features later down the line.
The Next-Gen Siri will be powered by Google’s Gemini AI
Apple officially confirmed that it will be partnering with Google in a statement to CNBC. Here’s what it stated, “After careful evaluation, we determined that Google’s technology provides the most capable foundation for Apple Foundation Models, and we’re excited about the innovative new experiences it will unlock for our users.”
Later, Google also shared a post on X confirming the tie-up, “Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google’s Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.”

Image Credit: X/@NewsFromGoogle
Both statements clearly mention that the Cupertino giant will be using Gemini to power its assistant Siri. This was already rumored, as Apple’s attempts to acquire Perplexity went nowhere. With Gemini, Siri will get a major AI update: it will be able to handle more nuanced conversations and provide better results, something long-time Apple users have been requesting for years.
The next-gen Siri will arrive with the iOS 26.4 update, which will launch sometime in March or April. And it is only going to be available on Apple Intelligence-supported devices.
Something else worth noting is that Google’s statement mentions Gemini will power Apple Intelligence features. This leads us to believe that Apple could use Gemini’s multi-modal capabilities for its Writing Tools, Image Playground, and message summaries, too.
Elon Musk Not Happy With Apple and Google Tie Up
xAI CEO Elon Musk also responded to Google’s announcement post on X, sharing his thoughts on the matter by saying, “This seems like an unreasonable concentration of power for Google, given that they also have Android and Chrome.” Though we don’t expect either Apple or Google to respond to Elon, we will provide updates as the situation progresses.
It is worth noting that xAI is the company behind Grok, which is currently in hot water due to its inappropriate image-generation fiasco, and has drawn backlash from multiple news outlets, X users, and even government authorities.
