iOS 7 did feature translucency and blurred backgrounds. While the Liquid Glass shaders appear more visually striking, the shader cost won't be that different (I'd imagine the new shaders mostly use more memory bandwidth).
I was debating bringing that point up too.
I'll just add that iOS 8 brought what are now UIKit's standard effects tools (UIVibrancyEffect and UIBlurEffect). So clearly Apple was taking a cycle to iterate on this stuff back then too.
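For anyone who hasn't touched that API since then, the iOS 8-era pattern looks roughly like this (a minimal sketch; `containerView` and `label` are hypothetical placeholders, not anything from this thread):

```swift
import UIKit

// Sketch of the iOS 8 effects stack. A blur view sits over the content...
let blur = UIBlurEffect(style: .light)
let blurView = UIVisualEffectView(effect: blur)
blurView.frame = containerView.bounds
containerView.addSubview(blurView)

// ...and vibrancy layers on top of that blur. Vibrant content must be
// added to the vibrancy view's contentView, which in turn lives inside
// the blur view's contentView (not the blur view directly).
let vibrancy = UIVibrancyEffect(blurEffect: blur)
let vibrancyView = UIVisualEffectView(effect: vibrancy)
vibrancyView.frame = blurView.contentView.bounds
blurView.contentView.addSubview(vibrancyView)
vibrancyView.contentView.addSubview(label)
```

Point being, the layering model has been stable for over a decade; 26 is building on this machinery rather than inventing it.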
Toolbars with a constant, stable, opaque background reduce visual noise and distraction and make it easy to achieve this goal. The new floating toolbar buttons work against this goal in a number of ways. Not only does content flow around your buttons (visual noise around the buttons distracts you from your task), but the constantly changing colour underneath makes it hard or impossible to use coloured icons on your buttons, and the new buttons make conveying a button's current state more difficult.
Yup, toolbar buttons definitely have to effectively be redone in certain cases to align with the new UX, and many of the things we used to take as standard aren't anymore. And Apple doesn't tell us what they should be for these cases (just the more common ones). So my job is now to look at what was done elsewhere so I can try to adopt what Apple is doing in those places. (EDIT: Turns out Apple hasn't had time to adapt either; see the attached screenshot.)
And you can clearly see some of what Apple has done in 26 to address the visual readability issues, and the fixes themselves aren't terribly pretty. Take Leman's screenshot from earlier: there's a fade-out effect where the toolbar sits, to protect it from the content rendering it unreadable. When it could, you know, just be a well-defined space for controls with clear visual separation?
The transforming tab bars (as seen in Music) are another violation of good UI principles. When the UI transforms, it moves buttons that should have a constant location so that muscle memory can kick in. Furthermore, it hides useful controls behind an extra step; I don't want my interaction slowed down just because I had the temerity to actually try to use the app (scroll a list). I already hate this behaviour in Safari: if I want a distraction-free reading experience that hides my toolbar and search bar, I'll use Reader mode (which I use all the time).
This is a reason I've fought adopting the tab bar in my own app, despite it mostly feeling right on iPad/Mac (but not at all on iPhone). I've actually gotten to the point where I'm considering abandoning SwiftUI's top-level navigation paradigms in favor of doing it with UIKit on iOS, but since AppKit doesn't have a comparable navigation controller, I've kept pushing it off (iPad and Mac currently share all the navigation code). SwiftUI has a working navigation controller on macOS, but AppKit doesn't, which is rather annoying.
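For context, the shared cross-platform shape I'm talking about is roughly this (an illustrative sketch only; `Item` and the detail view are placeholders, not my actual app):

```swift
import SwiftUI

// Placeholder model type for the sketch.
struct Item: Identifiable, Hashable {
    let id = UUID()
    let name: String
    static let all = [Item(name: "First"), Item(name: "Second")]
}

struct ContentView: View {
    @State private var selection: Item?

    var body: some View {
        // NavigationSplitView compiles on both iOS and macOS, which is
        // why the iPad and Mac targets can share this navigation code.
        // There's no direct AppKit equivalent to drop down to on the Mac
        // side, which is what makes abandoning it for UIKit-on-iOS-only
        // so awkward.
        NavigationSplitView {
            List(Item.all, selection: $selection) { item in
                Text(item.name)
            }
        } detail: {
            if let selection {
                Text(selection.name)
            } else {
                Text("Select an item")
            }
        }
    }
}
```

Dropping to UIKit means forking this into per-platform navigation stacks, which is exactly the duplication I've been avoiding.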
The end goal is the problem with Apple's current design leadership. We don't use most apps just to look at things (and in those we do, like Photos and video players, the UI sensibly disappears when appropriate). We use apps to do things, and when you start from the premise that apps are about looking at things rather than doing things, you end up continually making the UI harder to use in the pursuit of getting rid of all controls.
I think you can see this in how the WWDC sessions keep talking about "focusing on your app's content". They are trying to push the chrome further into the margins to make more space available for what the app wants to show. But this really assumes a consumption model, where the app is basically just giving you photos, movies, games, etc. (stuff you consume), rather than giving you tools to do things.