Conversation

@plietar commented Nov 21, 2016

No description provided.

@mitchmindtree (Member)

Hey @plietar thanks for the PR :)

Can you explain a little more about the reasoning behind changing the Scope of the audio unit property request from Output to Input? Is this a bug fix? How exactly does the behaviour change?

@plietar (Author) commented Nov 21, 2016

Sure, disclaimer though: I've got very little experience with CoreAudio.
My understanding is that each unit has an input and an output format. For a sink, only the input format matters.

Before this patch, setting the format worked (set_stream_format changes the input format), but setting a render callback would not, since the callback setup checked the format on the output scope (https://github.com/RustAudio/coreaudio-rs/blob/master/src/audio_unit/render_callback.rs#L387-L394).

This PR makes stream_format and set_stream_format consistent. If this library ever supports audio sources, this should be revisited (i.e. make these two methods take a scope, or add two more methods).
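
For concreteness, here's roughly what that looks like in terms of the underlying C API. This is only a sketch against the raw coreaudio-sys bindings (symbol names assumed from the AudioUnit framework headers, not this crate's wrappers), and the unit handle and stream format are assumed to come from elsewhere:

```rust
use std::mem;
use std::os::raw::c_void;

use coreaudio_sys::{
    kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, AudioStreamBasicDescription,
    AudioUnit, AudioUnitSetProperty, OSStatus,
};

/// Sketch: set the stream format that a sink (output) unit expects to be fed
/// with, i.e. the format on its *input* scope, element 0. This is the format
/// a render callback attached to the unit has to produce.
unsafe fn set_sink_input_format(
    unit: AudioUnit,
    asbd: &AudioStreamBasicDescription,
) -> OSStatus {
    AudioUnitSetProperty(
        unit,
        kAudioUnitProperty_StreamFormat,
        kAudioUnitScope_Input, // input scope: the data the unit consumes
        0,                     // element 0: the output element of an output unit
        asbd as *const AudioStreamBasicDescription as *const c_void,
        mem::size_of::<AudioStreamBasicDescription>() as u32,
    )
}
```

Since the render callback feeds the unit's input scope, that's also the scope the callback setup should be checking the format against.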

dmski added a commit to dmski/coreaudio-rs that referenced this pull request May 9, 2019
As I understand it, an audio unit can have several inputs and several outputs,
and an 'element' is just an index of one of those (https://developer.apple.com/library/archive/documentation/MusicAudio/Conceptual/AudioUnitProgrammingGuide/TheAudioUnit/TheAudioUnit.html).

Therefore, it should be possible, for example, to have several render callbacks for a single audio unit.
An example would be a crossfade unit with 2 inputs: it'll have 2 elements in its input scope
and 1 in its output scope, and it'll require either two render callbacks (one for each input)
or two upstream audio units.

This changes Element to be just a number and adds an explicit element parameter to all the places
where it wasn't present before (i.e. setting callbacks and input/output stream formats).

This relates to issue RustAudio#60 and PR RustAudio#47.
dmski added a commit to dmski/coreaudio-rs that referenced this pull request May 9, 2019
As I understand it, an audio unit can have several inputs and several
outputs, and an 'element' is just an index of one of those
(https://developer.apple.com/library/archive/documentation/MusicAudio/Conceptual/AudioUnitProgrammingGuide/TheAudioUnit/TheAudioUnit.html).

Therefore, it should be possible, for example, to have several render
callbacks for a single audio unit. An example would be a crossfade unit
with 2 inputs: it'll have 2 elements in its input scope and 1 in its output
scope, and it'll require either two render callbacks (one for each input)
or two upstream audio units.

This changes Element to be just a number and adds an explicit element
parameter to all the places where it wasn't present before
(i.e. setting callbacks and input/output stream formats).

I also had to change the handling of render callbacks a bit, since there can
now be more than one of them for a single audio unit.

This relates to issue RustAudio#60 and PR RustAudio#47.
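
To illustrate the multi-element point above: with the C API you would attach one render callback per input element. A rough sketch via the raw coreaudio-sys bindings follows; the two-input unit is hypothetical, the symbol names are assumed from the AudioUnit framework headers, and this is not this crate's wrapper API:

```rust
use std::mem;
use std::os::raw::c_void;
use std::ptr;

use coreaudio_sys::{
    kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, AudioBufferList,
    AudioTimeStamp, AudioUnit, AudioUnitRenderActionFlags, AudioUnitSetProperty,
    AURenderCallbackStruct, OSStatus,
};

// A do-nothing render callback; `_in_bus_number` tells us which input
// element CoreAudio is asking us to fill.
unsafe extern "C" fn silence_callback(
    _ref_con: *mut c_void,
    _flags: *mut AudioUnitRenderActionFlags,
    _time_stamp: *const AudioTimeStamp,
    _in_bus_number: u32,
    _num_frames: u32,
    _io_data: *mut AudioBufferList,
) -> OSStatus {
    0 // noErr; a real callback would write samples into the buffer list
}

/// Sketch: attach one render callback to each input element of a
/// hypothetical two-input (crossfade-style) unit.
unsafe fn attach_callbacks(unit: AudioUnit) -> OSStatus {
    for element in 0..2u32 {
        let cb = AURenderCallbackStruct {
            inputProc: Some(silence_callback),
            inputProcRefCon: ptr::null_mut(),
        };
        let status = AudioUnitSetProperty(
            unit,
            kAudioUnitProperty_SetRenderCallback,
            kAudioUnitScope_Input, // callbacks feed the unit's inputs
            element,               // one callback per input element
            &cb as *const AURenderCallbackStruct as *const c_void,
            mem::size_of::<AURenderCallbackStruct>() as u32,
        );
        if status != 0 {
            return status;
        }
    }
    0
}
```

Each callback receives the element it is being asked to fill as the bus number argument, so a single function could also serve several elements.
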
@akhudek (Contributor) commented Aug 29, 2023

I've recently been working on using this crate for audio input and encountered the scope problem, which is actually a bit bigger than just this change. It took a lot of searching, but Figure 1-3 in https://developer.apple.com/library/archive/documentation/MusicAudio/Conceptual/AudioUnitHostingGuide_iOS/AudioUnitHostingFundamentals/AudioUnitHostingFundamentals.html makes it clear.

For rendering audio, your application should use the input scope and element 0 (aka element::Output). For taking audio from the microphone, you need to use the output scope and element 1 (aka element::Input).
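
In code, that pairing looks roughly like this. It's a sketch against the raw coreaudio-sys bindings (the helper and its name are made up for illustration; the symbol names are assumed from the C headers and differ from this crate's wrappers):

```rust
use std::mem;
use std::os::raw::c_void;

use coreaudio_sys::{
    kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kAudioUnitScope_Output,
    AudioStreamBasicDescription, AudioUnit, AudioUnitSetProperty, OSStatus,
};

/// Sketch of the scope/element pairing described above.
unsafe fn set_client_format(
    unit: AudioUnit,
    asbd: &AudioStreamBasicDescription,
    capture: bool,
) -> OSStatus {
    // Playback (render): the format you feed in lives on the input scope of
    // element 0. Capture: the format you read out lives on the output scope
    // of element 1.
    let (scope, element) = if capture {
        (kAudioUnitScope_Output, 1)
    } else {
        (kAudioUnitScope_Input, 0)
    };
    AudioUnitSetProperty(
        unit,
        kAudioUnitProperty_StreamFormat,
        scope,
        element,
        asbd as *const AudioStreamBasicDescription as *const c_void,
        mem::size_of::<AudioStreamBasicDescription>() as u32,
    )
}
```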

@plietar closed this Dec 11, 2023