
Swift code helps get going quickly (AudioKit for iOS does this well) #10

Open · loopbum opened this issue Feb 19, 2016 · 26 comments

@loopbum (Collaborator) commented Feb 19, 2016

Hello People, Any chance of someone from Ableton getting a Best Practices, working, Swift 2.x example that wraps the Objective-C code?

Github.com/audiokit/AudioKit has created some nice Swift examples and they help get things rolling real quick. Thanks. ~ Robert

@brs-ableton (Contributor)

Hi @loopbum - do you have a specific suggestion of what you would want to see with Swift bindings? There's only a single Objective-C class with a single factory method in LinkKit; the rest is C. I could imagine that a Swift playground letting you explore the C API in some way would be cool to have, but I'm afraid experimenting with this is not terribly high priority for us right now.

@loopbum (Collaborator, Author) commented Feb 24, 2016

I'm more interested in getting it to work with AudioKit, which is C and Objective-C based. C isn't the issue so much as C++, which will be addressed in Swift 3.x.

So the immediate use case for LinkKit: I haven't been able to pull AudioKit and LinkKit together, even though AudioKit itself is very easy to use from Swift 2.x.

@dansimco (Collaborator)

+1

@jasonjsnell (Collaborator)

I've gotten ABLLink to work with Swift 3.0. I recoded the app classes in Swift 3 and used a Bridging-Header.h, which lets the Swift code call the C API easily. I also needed to add libABLLink.a and libc++.tbd to my Build Phases for it to link.
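For anyone setting this up, the bridging header in question is just a C header that imports the LinkKit headers into Swift. Something like the following (the exact header file names are assumptions based on the LinkKit distribution):

```c
/* MyApp-Bridging-Header.h
 * Everything included here becomes visible to Swift.
 * Header names assumed from the LinkKit SDK layout. */
#include "ABLLink.h"                        /* the C Link API        */
#include "ABLLinkSettingsViewController.h"  /* the one Obj-C class   */
```

Point Xcode at it via Build Settings > Swift Compiler - General > Objective-C Bridging Header.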

@kowongh (Collaborator) commented Apr 17, 2017

@jasonjsnell is there a repo for this Swift class?

@jasonjsnell (Collaborator)

I have it as a private repo because I assume Ableton doesn't want the code public (yet? Correct me if I'm wrong). I'll send you a private invite to it.

@jbloit (Collaborator) commented Apr 18, 2017 via email

@jasonjsnell (Collaborator)

I was able to build an audio engine in Swift. I happen to be using a file player unit to play my sounds, but it's also possible with a render loop in Swift. It takes a few conversions in and out of raw pointers, but it can be done.

https://github.com/jasonjsnell/XvAudioSystem
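For context, those raw-pointer conversions correspond to the opaque user-data pointer that C render callbacks pass around. A minimal, hypothetical sketch of the pattern (the names and the simplified signature here are invented; a real Core Audio AURenderCallback takes more parameters):

```c
#include <stddef.h>

/* Hypothetical engine state handed to the callback as an opaque
 * pointer -- the C-side analogue of the Unmanaged/RawPointer round
 * trip Swift code has to perform. */
typedef struct {
    float phase;
    float phaseIncrement;
} RampState;

/* A render-callback-shaped function: the host hands back the same
 * void* it was given at setup time, and we cast it to our state. */
static void renderCallback(void *inRefCon, float *out, size_t frames) {
    RampState *state = (RampState *)inRefCon;
    for (size_t i = 0; i < frames; ++i) {
        out[i] = state->phase;            /* placeholder "waveform" */
        state->phase += state->phaseIncrement;
    }
}
```

In Swift the equivalent casts go through Unmanaged and UnsafeMutableRawPointer, which is where the "conversions in and out" come from.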

@rikardvilhelm (Collaborator) commented Apr 19, 2017 via email

@thomjordan (Collaborator) commented Apr 20, 2017 via email

@jasonjsnell (Collaborator)

Sure, I'll do that now. I also reached out to Ableton to see if I can make that repo public...

@dewib (Collaborator) commented May 5, 2017

"Me too please", he says jumping up and down like an excited (if slightly lazy) 5 year old...
"I'd love an invite" big smiley face...

@boblemarin (Collaborator)

I would be very interested too...

@jbloit (Collaborator) commented May 6, 2017 via email

@jasonjsnell (Collaborator)

Talking with Ableton now about making it public. I'll have news soon...

@jasonjsnell (Collaborator)

Got the OK from Ableton, here is the Swift wrapper:

https://github.com/jasonjsnell/ABLLinkSwiftWrapper/

@dewib (Collaborator) commented May 10, 2017

Perfect timing, tomorrow I start converting my MIDI generator app into fully fledged sequencing. Thanks.

@jasonjsnell (Collaborator)

Great - if you have any questions or need better comments in the code, let me know.

@thomjordan (Collaborator) commented Jun 22, 2017 via email

@bangerang commented Jun 22, 2017

Sorry for being a party pooper, but using Swift on your audio thread is not something you want to do. Swift's runtime and memory allocation use locks, which can cause audio glitches.
You're better off writing your Link wrapper in Objective-C, where you can combine it with C or C++ for the real-time parts.

@aure (Collaborator) commented Jun 22, 2017

Hi, as the creator of AudioKit I just thought I'd chime in. First of all, people like @bangerang are absolutely correct: you should not do audio processing in Swift directly. AudioKit doesn't do this either; it's all at the C/C++ level. AudioKit just makes it easier to chain these nodes together on the Swift side to get a lot done quickly. And as @jbloit said, Link is on our road map, and thanks to @JoshuaBThompson we're moving it up the priority list! So, sorry to pounce on this issue discussion so late in the game, but happy to be here and working with Link!

@jasonjsnell (Collaborator)

I agree. C and C++ are always going to be more efficient than Swift. For my uses, the Swift code hasn't caused any audio glitches: I have 32 channels of sound, but they are short 1-5 second samples (a drum machine, in essence). I haven't experimented with longer audio samples or streams.

@lijon (Collaborator) commented Jun 25, 2017

Chiming in here too... as the developer of AUM I'm a bit concerned about music apps' ability to work well together :) C/C++ vs Swift in the audio thread is not about efficiency, it's about real-time safety. Some calls and code paths do things that are not safe to do on the audio thread, and Swift and Objective-C do this all the time, since their messaging systems are themselves the culprit. You must use C or C++ in the audio thread, and even then there's stuff in C/C++ you need to avoid: anything that can block or take locks. So, no memory allocation/freeing, no disk or network I/O, etc.

I often hear the argument "I know, I know... but it works for me, I haven't noticed any problem". This might be true when running your app standalone, but not when it's combined in a mixing environment where lots of apps need to run during the same audio render slice and share the resources. If any app blocks the audio thread, it ruins the party for all apps in the configuration. For people playing around on the couch, some glitches and audio drop-outs might be OK. For musicians playing live on stage, not so much...
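One common way to satisfy those constraints is to hand data to the audio thread through a pre-allocated, lock-free single-producer/single-consumer ring buffer: no locks, no allocation, a full buffer simply rejects the push. A minimal sketch in C11 (names and the fixed capacity are illustrative):

```c
#include <stdatomic.h>
#include <stdbool.h>
#include <stddef.h>

#define RB_CAPACITY 8  /* fixed at compile time: no malloc/free */

typedef struct {
    float buffer[RB_CAPACITY];
    _Atomic size_t head;  /* advanced by producer (UI thread)    */
    _Atomic size_t tail;  /* advanced by consumer (audio thread) */
} RingBuffer;

/* Producer side: returns false instead of blocking when full. */
static bool rb_push(RingBuffer *rb, float value) {
    size_t head = atomic_load_explicit(&rb->head, memory_order_relaxed);
    size_t next = (head + 1) % RB_CAPACITY;
    if (next == atomic_load_explicit(&rb->tail, memory_order_acquire))
        return false;  /* full: drop or retry later, never wait */
    rb->buffer[head] = value;
    atomic_store_explicit(&rb->head, next, memory_order_release);
    return true;
}

/* Consumer side (audio thread): returns false when empty. */
static bool rb_pop(RingBuffer *rb, float *out) {
    size_t tail = atomic_load_explicit(&rb->tail, memory_order_relaxed);
    if (tail == atomic_load_explicit(&rb->head, memory_order_acquire))
        return false;  /* empty */
    *out = rb->buffer[tail];
    atomic_store_explicit(&rb->tail, (tail + 1) % RB_CAPACITY,
                          memory_order_release);
    return true;
}
```

Both operations complete in a bounded number of steps, which is the property the audio thread needs.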

@jasonjsnell (Collaborator)

Ah thank you, that's good to learn. It's unfortunate that Swift can't operate safely in the audio thread :(

@OverToasty (Collaborator) commented Jun 25, 2017 via email

@hashmal (Collaborator) commented Jun 25, 2017

They are not a problem because of how they are used: OSSpinLockTry and OSSpinLockUnlock do not block the thread. This is key to understanding what is safe to do on the audio thread: when something blocks, you don't know when it will resume. It could be very fast (most of the time) or very slow (rarely, but still unacceptable for audio).
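The try-and-move-on pattern described above can be sketched with C11 atomics (OSSpinLock itself has since been deprecated by Apple; the names and the tempo scenario here are illustrative, not part of any real API):

```c
#include <stdatomic.h>
#include <stdbool.h>

static atomic_flag paramLock = ATOMIC_FLAG_INIT;
static double pendingTempo = 120.0;  /* written by UI thread, guarded */
static double audioTempo   = 120.0;  /* the audio thread's own copy   */

/* Audio thread: attempt the lock exactly once. If it's contended,
 * keep using the last known value -- never spin, never block. */
static bool tryConsumeTempo(void) {
    if (atomic_flag_test_and_set_explicit(&paramLock,
                                          memory_order_acquire))
        return false;               /* contended: skip this cycle */
    audioTempo = pendingTempo;
    atomic_flag_clear_explicit(&paramLock, memory_order_release);
    return true;
}
```

The UI thread, which is allowed to wait, can spin on the same flag before writing pendingTempo.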

Swift and Objective-C actually block in a lot of places. Basically, even a method call cannot be proven to execute within a bounded amount of time. In most applications this is OK, but not for real-time programming.

Note that not using mutexes, method calls, malloc and the like is not enough. If you're serious about audio programming, you will adopt a different programming style. Recursion is out of the question in many cases.

Do you send messages from the main thread? How do you process them on the audio thread? for (int i = 0; i < msgCount; i++) { ... } is potentially problematic, because you don't know how large msgCount can be. Yes, in practice it is very unlikely your app will suffer because of this, yet you have to consider this case.
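One way to handle that case is to cap the number of messages processed per render callback and carry the remainder over to the next cycle, so the worst case stays bounded. An illustrative sketch (the names and the cap value are assumptions, not from any particular app):

```c
#include <stddef.h>

#define MAX_MSGS_PER_CALLBACK 16  /* worst case is now known */

typedef struct { int kind; float value; } Msg;

/* Example handler: just counts what it sees. */
static size_t handledCount = 0;
static void countHandler(const Msg *m) { (void)m; ++handledCount; }

/* Process at most MAX_MSGS_PER_CALLBACK messages this render cycle;
 * returns how many were handled so the caller can re-queue the rest. */
static size_t processMessages(const Msg *msgs, size_t msgCount,
                              void (*handle)(const Msg *)) {
    size_t n = msgCount < MAX_MSGS_PER_CALLBACK
             ? msgCount : MAX_MSGS_PER_CALLBACK;
    for (size_t i = 0; i < n; ++i)
        handle(&msgs[i]);
    return n;
}
```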

fgo-ableton pushed a commit that referenced this issue on Nov 28, 2022: Fix ABLLinkForceBeatAtTime linker error