I did read Josh's great article and had a go at implementing the tutorials.
Unfortunately, I ran into some issues with the border-radius via SVG mask bit.
It worked great on a hardcoded element, but I still need to figure out how to make it work across a whole component library where the border-radius changes based on the user's branding and container-queries.
Josh’s solution also intuitively appears wrong to me because it seems to assume that nearby elements are emissive, and I can’t agree with that as a standard physical property of “materials” on the web.
I instead assume that materials are by default more similar to paper.
The backdrop blur is unrelated to the surface properties of the elements underneath: it’s about the frosted glass refracting light, and some of that light it refracts comes from beyond its bounding box, which a naive backdrop-blur won’t observe.
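One workaround for that bounding-box limitation (a sketch only; the class names and the 32px overscan are illustrative, and this is essentially the trick from Josh's article) is to let the blur layer extend past the visible card so the filter can sample light from beyond its edges, then clip it back to size:

```css
/* The blur layer overscans the card, so backdrop-filter "sees" pixels
   beyond the card's bounds; the parent clips the result back to size. */
.glass-card {
  position: relative;
  overflow: hidden;
  border-radius: 16px;
}
.glass-card__backdrop {
  position: absolute;
  inset: -32px; /* extend the sampling region past the visible edges */
  backdrop-filter: blur(16px);
  pointer-events: none;
}
```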
But missing out on the refractive aspect of glass takes away the strong visual separation of layers that IMO is Liquid Glass's biggest contribution.
Material has these wonderful designer resources showing how the app ought to be built of consistent moving layers, shown in 3D from the side. It's clear that there are these layers. But once you go 2D and put it all together, it's incredibly hard for me to find all the features. The number of times there's an action button in some corner that folks don't see is massive. Motion sort of helps highlight the chrome vs content, but there's just so little that's visually defining.
Liquid Glass's biggest strength IMO is the edge distortion. When the content moves, there's this non-linear motion that the human visual system immediately picks up on. Static screenshots don't always look great, but in motion there's so much more distinction of elements than anything else I've seen.
And that key refractive element, the one that takes such huge advantage of human motion sensing, is missing here.
I'd seen one other great web demo, but am struggling to find it right now. I did find this one, which I don't think looks as good, and the performance is bad for me here on mobile. But it nicely breaks down steps and shows how complex a task this can be. Getting these edge distortions in is hard, and relying on SVG filters to do it is a brutal path to hack together. https://atlaspuplabs.com/blog/liquid-glass-but-in-css#breaki...
I think this is a clever moat Apple created with Liquid Glass: they picked an effect that is easy to make a worse version of, but very hard to do the real/right way (and humans have an intuition for real/right because they see real glass every day). So any copycats will look worse in a way that's pretty obvious and Apple gets to keep the "premium-looking" product.
I don't think it's that hard for someone with experience writing shaders to emulate. The moat is that it's almost impossible to replicate with browser technology, which hurts web-based UI systems and is still a big challenge with something like Flutter or Jetpack Compose Multiplatform.
Stuff like React Native gets it basically for free because its UI is still technically "native".
But apps that rely on web views are screwed and I'm sure Apple will be happy to push devs away from those solutions as they're inferior for users.
Now they just need to figure out a way to push RN apps towards true native.
I fully agree that Liquid Glass is almost impossible to replicate with browser technology without using 3D shaders. But I’m curious to know your opinion about why apps that rely on web views are inferior for users (apart from the above reason). I certainly think Apple thinks they are inferior, but I'm not sure how devs and users feel about it.
Webview apps are limited to 60fps so they feel sluggish on iOS. Running them in safari with the 120fps dev option enabled they can feel like native, but then feel laggy when bundled into their own app.
I think the same was true when they started using blurs everywhere. They knew Android OEMs would copy it in a worse way and shoot themselves in the foot. That's also why Material Design doesn't rely on blur effects; it needs to run well on much worse hardware.
That is a very impressive attempt at Liquid Glass! The whole cross-browser and cross-platform bit is still a challenge sadly → "Only works on Impeller, so Web, Windows, and Linux are entirely unsupported for now"
That is one way things could turn out. On the other hand, when Apple started to introduce the iOS-7-style frosted glass effects, we got backdrop-blur in CSS, which handles the hard parts of achieving a similar effect on the web.
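For reference, the baseline that `backdrop-filter` gave us is tiny (the class name and values below are arbitrary):

```css
/* iOS-7-style frosted glass in two declarations */
.frosted {
  background: rgba(255, 255, 255, 0.4);
  backdrop-filter: blur(12px) saturate(1.8);
}
```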
If this whole liquid glass effect catches on big time (which isn't out of the question right now), we might see something in the web platform that gives developers access to the platform primitives required to render it.
Why shouldn't there be an effort to make more OS UI elements available via HTML? There's no technical reason against it. So I'm not saying that is what's going to happen, but I don't think this is a kind of moat they created with the explicit intention to make web apps worse.
I recently saw a large gold (plated) surface, larger than a hand. The way the light played off that metal was amazing; I've never seen the effect in small gold jewelry. Photographs and video just don't capture it. If Apple ever turns their attention to gold and manages to nail that effect, I would consider one of their devices just for that aspect.
Gold is really good at reflecting infrared, which is why it was chosen as the coating for JWST's mirrors. I wonder if the colors just past visible on the way to IR would be just perceptible enough to give the look you're noticing?
The moat is real! I haven't tried recreating Liquid Glass in the browser yet. From what I've seen, it's possible, but not in a practical, cross-browser, "can be applied to an arbitrary component" kind of way.
Of course, as soon as we figure out how to get it done, Apple will move on to the next thing. I'm okay with that though. It was a bold move, and I can't imagine how much time and money Apple spent making Liquid Glass look that good.
Okay, I was with you about users caring about differences in the look of a glass refraction effect. But I'm flabbergasted that there are people who cannot tell the difference between Coke and Pepsi by taste.
I think the other component of the moat is that their OS/GPU stack is power-optimized for this effect in hardware, which generic solutions for generic hardware will have trouble matching - even a lower fidelity replication’s power drain could be an order of magnitude higher as a result.
I agree that Liquid Glass's edge distortion looks great, and I will try my hand at recreating it eventually.
For the current project, I aimed to create a material that looked polished, worked consistently across browsers, and didn't use real 3D. And you're right about the effect being more visible when moving over a fixed background. The demo site I'm working on for the library does this; it's just not ready yet.
It's a nice effect, but to me this doesn't really feel like glass.
I think the most immediate difference is how light has no interaction with the bevels. I also expect some light to shine back into the glass and affect the lighting and coloring. It's not enough to just throw a blur in there.
Also, glass can have its own shadow with some interesting caustics (not sure even Apple does this). I see the shadow here, and it feels like a simple drop shadow. It makes the box feel like a flat card more than a 3D physical object, which I think is part of the new trend.
Either way, this will not be easy to emulate with just CSS; it's probably better suited to a shader running on the GPU.
If it can't be done with CSS, then how can it be done? How can you apply GPU shader effects to a common div? If we can't apply GPU effects to basic HTML and need to do so on custom things like an arbitrary Canvas concoction, then we may as well rebuild a brand-new rendering engine that can apply GPU effects.
HTML is dead. I see no reason to care about it because we only need <p> tags to get some text across, as just about everything else is used to make a webpage an ad-bomb. So let's just start again with the <p> tag and better gpu integration, and leave everything else out.
- https://ui.glass/generator/ Get started with this free CSS generator based on the glassmorphism design specifications to quickly design and customize the style properties.
Your project looks awesome! I'm glad not to be the only one going 5+ layers deep into the shadows. The addition of Light Rays is particularly impressive!
That content scrolls quite slowly on my phone. Is there another scrolling effect that makes it slower on purpose, or is it a side effect of the glass look?
Interesting, no there are no scroll effects anywhere on the page. Would you mind sharing what device, browser, and level of internet connection you're using?
Never mind, I tried recording it now but could not reproduce it anymore. It was probably something on my phone at the time (Firefox on Android). I tried to remove or edit my previous comment but it seems it's not an option. Sorry for the noise.
Really nicely done! It's always surprising to me how often computer graphics isn't "one weird trick" and more like "5 layered tricks." Doing it with cross-browser compat is an additional challenge.
Do you have a sense of which aspects are the most resource-intensive? Naively I would guess it's the backdrop-filter.
Yes, same! I didn't expect it to need so many tricks to implement.
Your intuition is correct, the most resource-intensive part is the blur bit of the backdrop-filter. The higher the blur value, the more neighbouring pixels need to be "consulted" before rendering.
Another resource-intensive aspect is continuous repaint as you scroll or as a video background changes the look of the glass.
Small issue: the shadow between the top and the scrollable area with the toggles and code fades out when you scroll down (presumably it should be initially hidden and fade in).
Good catch! It's not a fading-in issue; the shadow is just scrolling out of view. I designed those "overhang" box-shadows to be applied to the inside of an element rather than cast down from one. I see now that I need to apply the latter kind in this case. Thanks for letting me know!
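A sketch of the two shadow kinds being contrasted here (class names and values are illustrative):

```css
/* "Overhang" shadow applied to the inside of the scroll area:
   it lives on the scrollable element, so it scrolls out of view. */
.scroll-area {
  box-shadow: inset 0 12px 12px -12px rgba(0, 0, 0, 0.35);
}

/* Shadow cast down from the bar above: it stays attached to the bar
   and keeps overlapping the content scrolling underneath it. */
.top-bar {
  box-shadow: 0 12px 12px -12px rgba(0, 0, 0, 0.35);
}
```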
So you need a video HTML tag with all of these attributes:
<video
autoplay
muted
loop
webkit-playsinline
playsinline
sizes="100vw">
and the CSS:
.bg-video {
position: fixed;
z-index: -1;
inset: 0;
width: 100%; /* don't set this to 100vw or you will have scrollbar issues */
height: 100%;
min-height: 100vh;
object-fit: cover;
object-position: center;
pointer-events: none;
}
The part I still need to do myself is provide multiple video sizes and show the one that's most suitable for the viewport.
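A minimal sketch of that source-picking logic (the data shape, names, and file names here are made up for illustration):

```javascript
// Pick the smallest rendition that still covers the viewport width,
// falling back to the largest one available.
function pickVideoSource(sources, viewportWidth) {
  const byWidth = [...sources].sort((a, b) => a.width - b.width);
  const fit = byWidth.find((s) => s.width >= viewportWidth);
  return (fit || byWidth[byWidth.length - 1]).src;
}

// In the browser this would be wired up roughly like:
//   video.src = pickVideoSource(renditions, window.innerWidth);
```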
Yes, thank you for answering. The textures are free to use, but the URL should only be used for prototyping. Here is the website: https://www.transparenttextures.com/
Is this considered new design? What is special about it? I don't see much of a difference compared to the Aero glass effect in Windows Vista from 2006. I've also seen it on many websites, using `backdrop-filter: blur`. Or am I missing something?
You're right, glassmorphism dates back to the early years of this millennium. For instance, Project Looking Glass (2003) attempted an entire desktop environment made of slabs of glass that could be turned over in 3D space!
Even so, I haven't found another glass generator that has a texture layer and box-shadow bevel, etc.
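For the record, the combination being described looks roughly like this (the texture URL and all values are placeholders):

```css
.glass-card {
  /* texture layer stacked on a translucent fill */
  background: url("texture.png"), rgba(255, 255, 255, 0.12);
  backdrop-filter: blur(12px);
  border-radius: 16px;
  /* stacked box-shadows fake a bevel: inner highlight, inner lip, drop */
  box-shadow:
    inset 1px 1px 0 rgba(255, 255, 255, 0.55),
    inset -1px -1px 0 rgba(255, 255, 255, 0.18),
    0 8px 24px rgba(0, 0, 0, 0.25);
}
```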
You’re not looking far back enough :) I see it more as a new Aqua from OS X Cheetah (2000 CE)
This is a slight tangent, but 20 years ago I had a nightmare in which I got hit by a tram, died and travelled in time into a world where every computer was running Windows Vista. In other words, fair enough, you might be onto something here after all.
I've never used OSX, but I would guess from screenshots that it didn't do dynamic layered gaussian blur of large screen regions in realtime, which would've been too much for the hardware of the year 2000.
I'd be concerned about the maintenance overhead of these ~44 lines of code vs. just 1 line that has a similar (if slightly less realistic) net effect.
I’m curious why. I’d have concerns about the performance of this effect, but 44 lines of CSS to achieve a higher level of polish and, dare I say, craft, doesn’t seem like something to be dismissive of at review stage.
I’m confused, but maybe two wrongs here do make a right. Please allow me to explain.
Putting aside for the moment that I personally find Apple’s iOS 26 design objectionable[1], I don’t understand why `backdrop-filter: blur` is the focus of recent implementations jumping on the Liquid Glass hype train.
Using background blur to create UI layer separation (often in combination with darkening/saturation or other contrast enhancements) has been around for over a decade on iOS and almost as long on the web. So what’s new here?
Adding to my confusion, I’m a bit surprised folks here think this is so challenging in CSS. A few commenters have pointed to great implementations elsewhere, but I think they’re underselling them.
Plainly: the solution that involves `backdrop-filter: url(#filter)`, where `#filter` references an inline SVG filter built from `<feImage/>` and `<feDisplacementMap/>`, works very convincingly.
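A skeleton of that setup (the filter id, map file, and scale are placeholders, and browser support for SVG filters inside `backdrop-filter` still varies):

```html
<svg width="0" height="0" aria-hidden="true">
  <filter id="glass-refract">
    <!-- the map image encodes per-pixel offsets in its R and G channels -->
    <feImage href="displacement-map.png" result="map"/>
    <feDisplacementMap in="SourceGraphic" in2="map"
                       scale="40" xChannelSelector="R" yChannelSelector="G"/>
  </filter>
</svg>

<style>
  .glass { backdrop-filter: url(#glass-refract) blur(2px); }
</style>
```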
This implementation chooses to hard-code `<feImage href="data:image/png;base64,..."/>`. But if glass3d.dev chose to implement 3D highlights in a similar fashion (and wanted to support e.g. a dynamic `border-radius`), this image could easily be rendered at runtime in a canvas and dumped into the CSS using `toDataURL()`. Similarly, a component library with a CSS pre-processor could generate these for any static shape at build time.
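A sketch of that build step (every name here is made up; in practice the data URL would come from `canvas.toDataURL()` after drawing a displacement map matched to the element's `border-radius`):

```javascript
// Assemble an inline SVG filter for a given displacement-map data URL.
// The returned markup can be injected into the document once, then
// referenced from CSS as: backdrop-filter: url(#<id>).
function glassFilterMarkup(id, mapDataURL, scale) {
  return [
    '<svg width="0" height="0" aria-hidden="true">',
    `<filter id="${id}">`,
    `<feImage href="${mapDataURL}" result="map"/>`,
    '<feDisplacementMap in="SourceGraphic" in2="map"',
    ` scale="${scale}" xChannelSelector="R" yChannelSelector="G"/>`,
    "</filter></svg>",
  ].join("");
}
```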
Coming back to the why/should for a second though.
[1] In its current realization, Liquid Glass cannot effectively replace what blur accomplishes. Because Liquid Glass layers are more transparent, the contrast issues are real, and the specular highlights distract the eye as much as (or more than) they help make up for that lack of contrast. They draw the eye where blur would relax it. It’s a UI meant for an XR system (where it arguably solves a real problem, much like a HUD does in a video game) hacked into devices where it makes no sense, all in the name of a “unified OS” design language.
If any aspect of Liquid Glass is successful it will be when it’s used sparingly to delight in places that are low stakes and where other affordances are present (like a small circular button floating in the corner of the screen with hardware concentricity). A circle shape’s refractions would be smaller, softer, more symmetrical, and therefore arguably less noisy/distracting—in a way resembling a softer blurry background.
Which brings me full circle back to two wrongs.
This website doesn’t do anything new, but that’s why it’s good. Because the truth is, Apple failed to deliver a Siri-based LLM on a schedule it announced and is now trying to distract us with some shiny new thing. Damn, it worked.
Pro tip: Apple adds fancy GPU effects because Android can't rely on good GPUs, so Apple can continually define premium.
Thus, the odds they're doing glass-morphism via a Gaussian blur and drop shadow in CSS are exactly zero. They are assuredly working at abstraction levels far below that.
(disclaimer: worked on Pixel, did the color extraction + color system stuff for Material You)
There is a collection of attempts (a CodePen Spark) at Liquid Glass that I just found. The second link is one I thought did a pretty nice job, via a very specifically pre-defined / hard-coded SVG. https://codepen.io/spark/453 https://codepen.io/lucasromerodb/pen/vEOWpYM
Does it matter at that point whether an RN app is truly native? Seriously asking.
Isn’t that impossible? If I call native code via a binding or their official language, the same thing will happen.
It's fun seeing the attempts to mimic Liquid Glass though, the most impressive so far is this Flutter package: https://pub.dev/packages/liquid_glass_renderer
[0]: https://codepen.io/rebane2001/details/OPVQXMv
Somewhat like the blind tasting tests of Coca Cola versus Pepsi versus supermarket brands.
Thanks for sharing the resources!
They've been trying, with various degrees of success, over the past 10 years with Houdini.
https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_propert...
Others:
- frosted glass sticky header https://www.joshwcomeau.com/css/backdrop-filter/
- glassy glassmorphism [codepen](https://codepen.io/a-trost/pen/dypQzwq), [in context](https://codepen.io/TurkAysenur/pen/ZEpxeYm)
https://patorjk.com/software/taag/#p=testall&f=Graffiti&t=He...
Especially with respect to the abuse of box shadow.
/* This is mostly intended for prototyping; please download the pattern and re-host for production environments. Thank you! */
It shows, this is awesome, especially that rice paper effect!
Also, Aqua's scrollbars were quite impressive for their time.
Not a lot of web sites have made my Macbook M3 show signs of stress ;)
So it'd actually be the OP's CSS that would perform worse than this single line.
Check out this demo of the `backdrop-filter` + SVG displacement-map approach, for example:
https://codepen.io/Mikhail-Bespalov/pen/MYwrMNy
The closest thing I’ve seen to this is:
https://ruri.design/glass
These implementations are out there.