Swift / Apple Development Chat

Andropov

Sounds like a project that could benefit from either Perforce or Git LFS?
I don't think so. The problem seems to be related to branch switches causing all package dependencies to be invalidated, refetched, and (I think) rebuilt.

Especially if the back end isn’t yours and doesn’t offer a schema or GraphQL.

That said, I’d probably seriously consider something like Apollo for a project where I could. Maintaining a hierarchy of types that can be generated from a schema is overhead that I’d rather spend elsewhere. Make the mobile team co-owners of the schema file so PRs against it require both sides to be aware and sign off; that at least catches some of the brittleness. Even better would be proper API versioning.

Doesn’t mean you can’t autogenerate Codables from a schema as a build step, though. I’m just not sure I’d want to write one myself.
Making the mobile team co-owners is actually a great idea. I don't know why we aren't doing that. I think API versioning is not necessary at the moment because we're launching startup products, so we can freely break the previous API. Thankfully that also means I can make changes to CoreData models without designing a migration.

I was favoring Apollo at first when we discussed whether to use that or a custom client, but I'm unsure if it's going to limit our flexibility. For example, if a query returns a URL, I'd prefer having our Swift model store it as Foundation's URL rather than String. Similarly, if a model has a stable unique identifier shared with the backend, maybe I'd prefer to use the UUID type rather than a regular String. I'm not sure we can do that with Apollo, unless intermediate types are created. Wait: apparently it's relatively easy to do, according to this documentation post.

Well, since it's doable, that's another point towards using Apollo. The process is a bit verbose, but we'd only need to do it once for each custom type. I think I'm sold now.
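For reference, the mapping we're after boils down to something like this, even independent of Apollo's codegen (a minimal sketch using plain Codable and Foundation only; the type and field names are hypothetical):

```swift
import Foundation

// Hypothetical backend payload: {"id": "<uuid string>", "avatarURL": "https://..."}
struct User: Codable, Identifiable {
    let id: UUID        // JSONDecoder parses the UUID string for us
    let avatarURL: URL  // URL decodes from a plain JSON string
}

let json = Data(#"{"id":"E621E1F8-C36C-495A-93FC-0C247A3E6E5F","avatarURL":"https://example.com/a.png"}"#.utf8)
let user = try JSONDecoder().decode(User.self, from: json)
print(user.avatarURL.host ?? "")  // "example.com"
```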
 

Andropov

That said, I’d probably seriously consider something like Apollo for a project where I could. Maintaining a hierarchy of types that can be generated from a schema is overhead that I’d rather spend elsewhere.
Looks like the code generation comes with its own overheads :p Xcode's latest update (14.3) breaks the installation process outlined in the Apollo docs. It's not a huge deal (it's still easily doable from the command line), but until a workaround was found, no one could compile the project.
 

Nycturne

Looks like the code generation comes with its own overheads :p Xcode's latest update (14.3) breaks the installation process outlined in the Apollo docs. It's not a huge deal (it's still easily doable from the command line), but until a workaround was found, no one could compile the project.

Yeah, that’s one reason I try to keep third-party tooling to a minimum. And one reason I tend to prefer Carthage to CocoaPods. CocoaPods is nice, but invasive.

Some projects I’ve worked on have folks who dedicate a little time to the Xcode betas to find issues like this early, and who can tell the rest of the team when it’s time to move to the new Xcode update once those issues are solved. Fun.
 

Andropov

Yeah, that’s one reason I try to keep third-party tooling to a minimum. And one reason I tend to prefer Carthage to CocoaPods. CocoaPods is nice, but invasive.
We're using Swift Package Manager only. I personally haven't used anything else in ages either. I used CocoaPods once, for a project in 2017, and never tried Carthage.

Btw, I watched the Swift concurrency: Update a sample app talk yesterday and it's great. I had somehow missed it. It does a great job of outlining how to structure concurrency in an app using the new concurrency features, and the result is very neat, much more intuitive than having queues all around the app. And it looks like my intuition that ViewModels should be @MainActor was correct.
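For anyone who hasn't watched it, the structure amounts to something like this (a minimal sketch in the spirit of the talk, not its actual sample code; all names are hypothetical):

```swift
import SwiftUI

// The whole ViewModel is isolated to the main actor, so every mutation of
// @Published state is guaranteed to happen on the main thread.
@MainActor
final class ItemListViewModel: ObservableObject {
    @Published private(set) var items: [String] = []

    func reload() async {
        // fetchItems() is nonisolated, so it runs off the main actor; when
        // it returns we're back on the main actor, making this assignment
        // safe without any explicit queue hopping.
        items = await Self.fetchItems()
    }

    // nonisolated: the fetch never touches main-actor state.
    private nonisolated static func fetchItems() async -> [String] {
        ["One", "Two", "Three"]  // placeholder for a real network call
    }
}
```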
 

Nycturne

We're using Swift Package Manager only. I personally haven't used anything else in ages either. I used CocoaPods once, for a project in 2017, and never tried Carthage.

Btw, I watched the Swift concurrency: Update a sample app talk yesterday and it's great. I had somehow missed it. It does a great job of outlining how to structure concurrency in an app using the new concurrency features, and the result is very neat, much more intuitive than having queues all around the app. And it looks like my intuition that ViewModels should be @MainActor was correct.

SPM is my first choice these days. The main difference with Carthage is that it fetches/builds the dependencies separately and spits out xcframeworks (or static libs back in the day) that you reference in your xcodeproj. For projects where you can then cache the build product to cut down your build times, this tends to be a bit nicer than CocoaPods, except when you need to make changes to a dependency.
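For contrast with a Cartfile plus prebuilt xcframeworks, the whole SPM story lives in the manifest (a minimal sketch; the package name is hypothetical, though the Apollo coordinates are the public ones):

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // SPM fetches and builds this as part of the normal build graph,
        // rather than producing a separate artifact you check in or cache.
        .package(url: "https://github.com/apollographql/apollo-ios.git", from: "1.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "Apollo", package: "apollo-ios")]
        ),
    ]
)
```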

Agreed that ViewModels should be @MainActor. I think my point was more that things get a bit wonky where synchronous code tries to access a @MainActor class, or where such a class implements/uses a synchronous protocol like Identifiable. And a lot of main thread code is still synchronous rather than @MainActor.

EDIT: Can probably get away with some of this in hindsight through good use of nonisolated. A VM that has an “immutable” reference to a CoreData object can probably still conform to Identifiable using a nonisolated accessor.
 

Andropov

EDIT: Can probably get away with some of this in hindsight through good use of nonisolated. A VM that has an “immutable” reference to a CoreData object can probably still conform to Identifiable using a nonisolated accessor.
That is the approach Apple shows in Protect mutable state with Swift actors (14:52) to conform to Hashable when the property Hashable needs to access is actor-isolated, which is exactly the problem you describe with Identifiable + a CoreData identifier: a protocol with a synchronous requirement needs to access an actor-isolated property. The CoreData object would indeed work with nonisolated if declared as let.
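Something like this is what we're both describing, I think (a rough sketch; names are hypothetical):

```swift
import CoreData

// The ViewModel is @MainActor, but Identifiable's synchronous `id`
// requirement is satisfied by a nonisolated accessor backed by immutable
// `let` state, which Swift allows to be read from outside the actor.
@MainActor
final class NoteViewModel: Identifiable {
    private let note: NSManagedObject  // set once at init, never mutated
    private let noteID: UUID           // Sendable snapshot of the identity

    init(note: NSManagedObject, noteID: UUID) {
        self.note = note
        self.noteID = noteID
    }

    // Synchronous and nonisolated: callable from any context, no await.
    nonisolated var id: UUID { noteID }
}
```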
 

Andropov

Found a cool use case for Swift actors yesterday. I have a project with a custom Metal rendering engine that accesses some state (buffers, textures...) during a draw call, state that must not change during the draw call itself or the renderer might crash. I was using a terrible synchronization system based on NSLocks, and I wondered if I could rewrite it all so the entire state that had to be protected lived inside an actor. Then, as long as the draw call is synchronous (no await / suspension points where the state could change), the actor would ensure that nothing changes during a draw call.
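The shape of it is roughly this (a heavily simplified sketch, not the actual engine; all names are hypothetical):

```swift
import Metal

// Everything the draw call reads lives inside one actor. Because
// encode(into:) is synchronous actor-isolated code with no suspension
// points, no other task can mutate the protected state mid-draw.
actor RendererState {
    private var vertexBuffer: MTLBuffer?
    private var texture: MTLTexture?

    func update(vertexBuffer: MTLBuffer, texture: MTLTexture) {
        self.vertexBuffer = vertexBuffer
        self.texture = texture
    }

    // No await in the body: the state cannot change for the duration
    // of the encoding.
    func encode(into encoder: MTLRenderCommandEncoder) {
        guard let vertexBuffer, let texture else { return }
        encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
        encoder.setFragmentTexture(texture, index: 0)
        encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    }
}
```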

And... it worked. The API is so much nicer now. I don't have to remember what can and cannot change during a draw call, and the state is guaranteed to be consistent by the compiler. Plus no possibility of deadlocks. I had to drop MTKView and use a custom UIView + CAMetalLayer because MTKView is @MainActor annotated, so I couldn't get the drawable without introducing a suspension point in the middle of the draw call, but the Swift Concurrency warnings and errors were super useful all throughout the process. And since the entire thing now runs outside the main thread (CAMetalLayer does not need to be updated on the main thread), heavy UI updates don't affect the rendering engine's performance.

I wasn't sure if I was going to run into a wall since the synchronization primitives used in render-related code are very low level and Swift Concurrency is very high level, but it worked beautifully. There was some weirdness around connecting a RunLoop to an async environment, but even that worked out in the end. And it's apparently fully Swift Concurrency compliant (no warnings about it, not even with strict concurrency checking). Cool.
 

Nycturne

Nice to see that effort is making progress. I knew they were working on it, but it looks like there's now agreement/approval for what the road looks like.
 

Andropov

I laughed when I saw that they spent the first couple of paragraphs basically outlining how bad C++ can be. They're right though, interoperability shouldn't come at the cost of making Swift code unsafe. I read through the proposal and it seems like a reasonable and well-thought-out compromise. In general, I think the people in charge of Swift evolution are doing an outstanding job.

Other interesting proposals have been accepted/discussed recently: SE-0377 and SE-0390. I've read through a few of the ownership-related proposals/pitches/reviews and I think they came up with a very reasonable approach there too: it helps Swift's performance predictability where people need it (like in hot paths) without raising the entry-level complexity of the language (the borrowing/consuming annotations are opt-in where used).
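In practice the annotations look like this (a hedged sketch against a Swift 5.9-era toolchain; the types and functions are hypothetical):

```swift
// SE-0390: a noncopyable type; the compiler enforces a single unique owner.
struct UniqueBuffer: ~Copyable {
    var storage: [UInt8]
}

// SE-0377 `borrowing`: read-only access, no copy, no ownership transfer.
func checksum(_ buffer: borrowing UniqueBuffer) -> Int {
    buffer.storage.reduce(0) { $0 &+ Int($1) }
}

// SE-0377 `consuming`: takes ownership; the caller can't use the value
// afterwards, and the compiler verifies that at compile time.
func release(_ buffer: consuming UniqueBuffer) {
    _ = buffer.storage.count  // last use; the buffer's lifetime ends here
}

func demo() {
    let buffer = UniqueBuffer(storage: [1, 2, 3])
    print(checksum(buffer))  // borrowed: `buffer` is still usable afterwards
    release(buffer)          // consumed: referencing `buffer` below is a compile error
}
```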
 

ArgoDuck

^ Yes, as someone who adopted C++ around 1989 and continued to use it until I transitioned to Swift in 2018, I am very happy if I never write another line of C++ in my life!

Of all the languages I used or was familiar with back in the 1980s - Pascal was probably my favorite - C++ appealed because OOP appealed, and that had to do with the ‘naturalness’ of thinking in terms of classes and objects, from the perspective of philosophy papers I’d done on the side in my first (physics, math) degree. However, apart from a Windows-like project I coded in the 1990s, I found C++/OOP generally quite clumsy. It didn’t live up to the promise.

And then I changed career path in the mid-90s, and so didn’t spare the time to analyze why that was!
 

dada_dave

^ Yes, as someone who adopted C++ around 1989 and continued to use it until I transitioned to Swift in 2018, I am very happy if I never write another line of C++ in my life!

Of all the languages I used or was familiar with back in the 1980s - Pascal was probably my favorite - C++ appealed because OOP appealed, and that had to do with the ‘naturalness’ of thinking in terms of classes and objects, from the perspective of philosophy papers I’d done on the side in my first (physics, math) degree. However, apart from a Windows-like project I coded in the 1990s, I found C++/OOP generally quite clumsy. It didn’t live up to the promise.

And then I changed career path in the mid-90s, and so didn’t spare the time to analyze why that was!
I have to admit I like the direction of C++ since 2011 - even if I recognize that its improvements have largely come from borrowing the advancements made by other languages, and that the unsafe, verbose parts of the language are still available to be used.
 

Cmaier

I have to admit I like the direction of C++ since 2011 - even if I recognize that its improvements have largely come from borrowing the advancements made by other languages, and that the unsafe, verbose parts of the language are still available to be used.

What’s changed? I wrote, literally, a million lines of C++ between 1997 and 2006, and haven’t paid any attention to the language since then. We used it for all our fancy internal EDA tools, often by swigging stuff so it could be called from Perl, and sometimes embedding the Perl interpreter into a big C++ program (e.g. I wrote a chip layout viewer and you could load Perl scripts into it that, in turn, called swigged C++ routines to do things like analyze path timing if you dragged a wire to a different location). Ah, how much I loved those days. Is SWIG still a thing? I really wish I hadn’t stopped paying attention to all that.
 

dada_dave

What’s changed? I wrote, literally, a million lines of C++ between 1997 and 2006, and haven’t paid any attention to the language since then. We used it for all our fancy internal EDA tools, often by swigging stuff so it could be called from Perl, and sometimes embedding the Perl interpreter into a big C++ program (e.g. I wrote a chip layout viewer and you could load Perl scripts into it that, in turn, called swigged C++ routines to do things like analyze path timing if you dragged a wire to a different location). Ah, how much I loved those days. Is SWIG still a thing? I really wish I hadn’t stopped paying attention to all that.
SWIG is still a thing. I don’t use it myself, but it’s definitely still around. There are also lots of other “glue” APIs now. I’m a big fan of some of the modern Python-C++ ones like pybind11, and there are even newer ones too. Obviously SWIG can do more than just Python, but here’s a decent overview with respect to Python and C++: https://stackoverflow.com/questions/57862035/how-to-implement-python-interfaces-for-c-libraries

As for the C++ improvements, in short: they’ve done a good job of making the language safer to write and more readable without sacrificing performance. For instance, there’s basically no reason to use raw pointers anymore, but you can interoperate with older APIs that do, and encapsulate them to make them safer - all without the performance hit of garbage collection. Things like templates are way more powerful than they used to be while being simultaneously easier to debug, especially in the latest versions. In general there are way more powerful compile-time tools like constexpr that not only help catch errors at compile time but also speed up runtime and, of course, aid in advanced metaprogramming. Even beyond the proliferation of compile-time tools like templates and constexpr (and tuples!), writing APIs and code in general requires less boilerplate and is easier to write, read, and maintain, thanks to things like lambda expressions, ranges, the spaceship operator <=>, auto, etc. Some of these are trade-offs - you can argue that auto makes things easier to write and especially maintain but harder to read at a glance - but a lot of them are pure wins.

Here’s a good overview from Microsoft:


This has come at a cost. Namely, the language itself is much larger, and no doubt compiler writers pull their hair out implementing some of this stuff (though some of the additions have made that easier). Further, all the old unsafe ways of doing things are still there - people have talked about maybe adding compiler flags to give warnings or even errors for doing things the old way. I don’t think anyone has done that, but people have been discussing different ways for (especially large) projects to formally regulate/enforce what subset of the language the team should use. So that’s different from newer languages like Swift, Rust, etc. that were built from the beginning with at least some if not most of these ideas/concepts/best practices in place.
 

Cmaier

SWIG is still a thing. I don’t use it myself, but it’s definitely still around. There are also lots of other “glue” APIs now. I’m a big fan of some of the modern Python-C++ ones like pybind11, and there are even newer ones too. Obviously SWIG can do more than just Python, but here’s a decent overview with respect to Python and C++: https://stackoverflow.com/questions/57862035/how-to-implement-python-interfaces-for-c-libraries

As for the C++ improvements, in short: they’ve done a good job of making the language safer to write and more readable without sacrificing performance. For instance, there’s basically no reason to use raw pointers anymore, but you can interoperate with older APIs that do, and encapsulate them to make them safer - all without the performance hit of garbage collection. Things like templates are way more powerful than they used to be while being simultaneously easier to debug, especially in the latest versions. In general there are way more powerful compile-time tools like constexpr that not only help catch errors at compile time but also speed up runtime and, of course, aid in advanced metaprogramming. Even beyond the proliferation of compile-time tools like templates and constexpr (and tuples!), writing APIs and code in general requires less boilerplate and is easier to write, read, and maintain, thanks to things like lambda expressions, ranges, the spaceship operator <=>, auto, etc. Some of these are trade-offs - you can argue that auto makes things easier to write and especially maintain but harder to read at a glance - but a lot of them are pure wins.

Here’s a good overview from Microsoft:


This has come at a cost. Namely, the language itself is much larger, and no doubt compiler writers pull their hair out implementing some of this stuff (though some of the additions have made that easier). Further, all the old unsafe ways of doing things are still there - people have talked about maybe adding compiler flags to give warnings or even errors for doing things the old way. I don’t think anyone has done that, but people have been discussing different ways for (especially large) projects to formally regulate/enforce what subset of the language the team should use. So that’s different from newer languages like Swift, Rust, etc. that were built from the beginning with at least some if not most of these ideas/concepts/best practices in place.

Interesting. Honestly, if I wanted some of this stuff I’d just code in Objective-C instead. Though I admit learning Objective-C can be tough, even for a C++ coder. That said, if I were in a place where I still had to code in C++, I certainly would be very glad to have access to smart and shared pointers now.
 

dada_dave

Interesting. Honestly, if I wanted some of this stuff I’d just code in Objective-C instead. Though I admit learning Objective-C can be tough, even for a C++ coder. That said, if I were in a place where I still had to code in C++, I certainly would be very glad to have access to smart and shared pointers now.
I think the advantage is that you get that kind of functionality and safety without the performance sacrifice of moving to a language like Objective-C, where the fact that a lot of it is done at runtime makes it much slower. There's more to learn about C++ now, but you can argue that it's easier to learn because students don't have to worry about many of the common pitfalls that plagued pre-C++11 C++.
 

Cmaier

I think the advantage is that you get that kind of functionality and safety without the performance sacrifice of moving to a language like Objective-C, where the fact that a lot of it is done at runtime makes it much slower. There's more to learn about C++ now, but you can argue that it's easier to learn because students don't have to worry about many of the common pitfalls that plagued pre-C++11 C++.
I found that Objective-C wasn’t too much slower than C++ when I replaced one with the other, though I’d be interested in some more extensive testing than my anecdotes. When I wanted speed back in the day, I’d write critical routines (like graph comparison algorithms) in C, without any of the ++ gloss. I’d see maybe a 20% improvement by getting rid of classes, manually allocating memory, etc. Typically when I ported something from C++ to Objective-C, I’d see maybe a 5% difference. Though that was done on Macs with gcc, and maybe there are better C++ compilers available; the code was originally targeted at Unix (Sun, RS/6000, etc.) so I didn’t spend a ton of time monkeying with C++ compiler flags or anything on the Mac.
 

dada_dave

I found that Objective-C wasn’t too much slower than C++ when I replaced one with the other, though I’d be interested in some more extensive testing than my anecdotes. When I wanted speed back in the day, I’d write critical routines (like graph comparison algorithms) in C, without any of the ++ gloss. I’d see maybe a 20% improvement by getting rid of classes, manually allocating memory, etc. Typically when I ported something from C++ to Objective-C, I’d see maybe a 5% difference. Though that was done on Macs with gcc, and maybe there are better C++ compilers available; the code was originally targeted at Unix (Sun, RS/6000, etc.) so I didn’t spend a ton of time monkeying with C++ compiler flags or anything on the Mac.
So I’ll admit that my own knowledge of Objective-C performance comes from a sarcastic comment in the OneDivZero blog post detailing a brief, mostly wrong history of programming languages:

1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic.

Which I’m sure is an extremely unfair characterization 🙃

Doing a quick survey of people’s responses online, I’d say it … depends. There will always be a hit from message passing but how much of one will depend on the specifics of the code, the compiler, the optimizations, etc …
 

dada_dave

So I’ll admit that my own knowledge of Objective-C performance comes from a sarcastic comment in the OneDivZero blog post detailing a brief, mostly wrong history of programming languages:



Which I’m sure is an extremely unfair characterization 🙃

Doing a quick survey of people’s responses online, I’d say it … depends. There will always be a hit from message passing but how much of one will depend on the specifics of the code, the compiler, the optimizations, etc …
However the characterization of C++ is on point:

1983 - Bjarne Stroustrup bolts everything he's ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Build times suffer. Skynet's motives for performing the service remain unclear but spokespeople from the future say "there is nothing to be concerned about, baby," in an Austrian accented monotones. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

This has only gotten more true with modern C++, as much as I like the changes.
 

Nycturne

So I’m taking a look at some of the upcoming sessions and see some interesting stuff:

- CoreData is dead, long live SwiftData (with the ability to migrate!)
- MapKit and StoreKit updates for SwiftUI
- SwiftCharts Improvements
- Changes to scrolling behaviors in SwiftUI to enable some new stuff (paging)
- Simpler @Observable attribute for SwiftUI observable classes (see the sketch after this list)
- Swift is getting macros
- Swift parameter packs are here, something that will make SwiftUI’s APIs a bit more readable in places (ViewBuilder).
- TipKit for teaching users about your UI.
- Animated symbols.
- Something about Inspectors in SwiftUI.

And of course a bunch of stuff related to the spatial UI for Vision Pro.
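The @Observable one in particular looks like a nice cleanup. A minimal sketch of what I mean, based on what's been shown so far, so details may shift (names are hypothetical):

```swift
import SwiftUI
import Observation

// No more ObservableObject/@Published boilerplate: the macro makes every
// stored property observable.
@Observable
final class Counter {
    var count = 0
}

struct CounterView: View {
    let counter: Counter  // no @ObservedObject wrapper needed just to read it
    var body: some View {
        // The view re-renders when `count` changes because body read it.
        Button("Count: \(counter.count)") { counter.count += 1 }
    }
}
```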
 