
Microtonality #1387

Closed
softrabbit opened this issue Dec 3, 2014 · 47 comments

Comments

@softrabbit
Member

Like @diizy said over at #1381:

I think we could go further though: allow custom pitch maps, kind of like what Zyn does internally:
allow setting the actual pitch for each note. We already have internal support for arbitrary-pitched
notes, so all we'd need is the GUI and some trivial lookup-table code. This would allow using eg.
non-western scales, 24-note scales, weird experimental scales...

Let's make this a separate issue, mmkay? A few points to get things rolling:

  • Scala files (.scl) are pretty much the de facto standard for tunings, and the format is pretty simple. At a minimum, importing .scl should be supported (a rough sketch of what that involves follows this list). Maybe snatch the code for that from Zyn.
  • There's a MIDI tuning standard for transmitting tunings (to VSTs and maybe Zyn?). Probably something that needs to be implemented at some point. Standards: http://www.midi.org/techspecs/midituning.php
  • Fluidsynth has some API support for retuning, IIRC, and shouldn't need that MIDI part in between: http://fluidsynth.sourcearchive.com/documentation/1.1.5-1/synth_8h.html (search for "tuning").
  • IMO, the GUI for editing tunings should be at top-level (Tools menu, maybe) and tunings easily shareable project-wide. After all, a tuning used for one instrument seems likely to be used for another one in the same song as well?
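To make the .scl point a bit more concrete, here is a rough sketch of what a minimal importer could look like. The names (`SclScale`, `loadScl`) are made up for illustration and this is not taken from Zyn or LMMS; a real importer would need more validation, but the format really is just '!' comment lines, a description line, a note count, and then one pitch per line given either in cents or as a ratio.

```cpp
// Minimal sketch of a .scl reader (hypothetical names, not actual LMMS/Zyn code).
#include <cmath>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct SclScale
{
	std::string description;
	std::vector<double> cents; // intervals above the tonic, in cents
};

static bool loadScl( const std::string & fileName, SclScale & scale )
{
	std::ifstream in( fileName );
	if( !in ) { return false; }

	std::string line;
	auto nextLine = [&]() -> bool
	{
		while( std::getline( in, line ) )
		{
			if( line.empty() || line[0] != '!' ) { return true; } // skip '!' comments
		}
		return false;
	};

	if( !nextLine() ) { return false; }
	scale.description = line;                // may be empty

	if( !nextLine() ) { return false; }
	const int noteCount = std::stoi( line ); // number of pitch lines that follow

	for( int i = 0; i < noteCount; ++i )
	{
		if( !nextLine() ) { return false; }
		std::istringstream ss( line );
		std::string token;
		ss >> token;                         // the pitch is the first token on the line
		if( token.find( '.' ) != std::string::npos )
		{
			scale.cents.push_back( std::stod( token ) );              // value given in cents
		}
		else
		{
			const auto slash = token.find( '/' );                     // ratio, e.g. "3/2" or "2"
			const double num = std::stod( token.substr( 0, slash ) );
			const double den = slash == std::string::npos ? 1.0
				: std::stod( token.substr( slash + 1 ) );
			scale.cents.push_back( 1200.0 * std::log2( num / den ) ); // convert ratio to cents
		}
	}
	return true;
}
```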
@tresf tresf mentioned this issue Dec 20, 2014
@softrabbit
Member Author

@fundamental, does Zyn support some kind of MIDI tuning messages?

@fundamental
Contributor

@softrabbit All of those look like SysEx messages, which Zyn does not support.

@musikBear

does Zyn support some kind of MIDI tuning messages?

OMG... and if... how... and then some.
(screenshot attached)

You can set your own scales, as crazy as you can imagine... it's absurdly versatile.
In fact, there's no reason to ask "Can zasfx...?" If it's relevant, it can.

@breebee

breebee commented Aug 16, 2015

:( It says no one is assigned... I will do it then! What programming language must I know? Is it in Python? And is this something hard or simple for a beginner?

@musikBear

is it in python?

No, C++.

hard or simple for a beginner.

Hard.

@breebee

breebee commented Aug 31, 2015

OK, I will develop this, I am determined :) Just a little guidance: do I need to learn C, or will C++ simply do? And do I use Eclipse or something like Code::Blocks? Sorry to be such a noob, I don't even know how to build from source code at the moment.

@Spekular
Member

Spekular commented Aug 31, 2015 via email

@breebee

breebee commented Sep 6, 2015

OK, thank you. I have been taking many tutorials on C++ this past week and have already been making a lot of small programs, so I am on my way to helping you guys :)

@breebee

breebee commented Sep 19, 2015

OK guys, I think I'm ready to start. I looked into the code already and I think I need to make the switch statements in the MIDI controller .cpp for different selected temperaments and assign the mathematical pitch bend for every individual note on a 12-note scale. Can someone help me identify how to reference an individual note, and whether it's possible to build LMMS on Windows yet? I'm using Code::Blocks and the default compiler.

@Spekular
Member

Spekular commented Sep 19, 2015 via email

@tresf
Member

tresf commented Sep 19, 2015

OK guys, I think I'm ready to start.

👍

I looked into the code already and I think I need to make the switch statements in the MIDI controller .cpp for different selected temperaments and assign the mathematical pitch bend for every individual note on a 12-note scale. Can someone help me identify how to reference an individual note

I'll tag @softrabbit on this question. 👍

and whether it's possible to build LMMS on Windows yet? I'm using Code::Blocks and the default compiler.

Not easily yet, as it requires MSYS2 (a terminal emulator) to build properly due to the mixing and matching of Windows- and POSIX-style paths, i.e. /usr/local/bin vs C:\msys2\usr\local\bin. It works, but you may have difficulty building directly from the IDE. Also, assuming Code::Blocks supports CMake, you'll have to try your luck (getting the IDE to build LMMS properly on Windows could take longer than the actual coding).

I've never set up a proper IDE with LMMS personally, but @curlymorphic has put some time into a Qt Creator tutorial here: https://www.youtube.com/watch?t=3&v=XTWnQPGL9xs

And some tips here: https://www.youtube.com/watch?v=3OzGXfm6fqE

@curlymorphic I hope you're OK with me posting these (I say this because the videos were intentionally marked as private). Also, I'm not sure how much they will apply to Windows and/or Code::Blocks, but if you do get a proper IDE working on Windows, please (please!) write a GitHub wiki article and we'll publish it to our repository. We need to engage more developers, and if getting more Windows tutorials published helps, we'll do it.

@curlymorphic
Contributor

@tresf
The video links are fine :) I only made them private because they are not of a professional standard and I didn't want them found by users searching for LMMS. I am happy for the links to be put in the wiki. Would these videos be better hosted on an LMMS YouTube channel?

IDEs on Windows: I have only tried to use Qt Creator, but have had mixed success. There are two main issues I have been having.

  1. The building has to be done in the MSYS2 environment. The Qt Creator in the MSYS2 repo requires Qt 5, while LMMS requires Qt 4 (is this still correct?). Unfortunately, under MSYS2, Qt 4 and Qt 5 are mutually exclusive. This can be worked around by running the IDE under Windows and building & debugging from the MSYS2 command prompt. I have done some reading on using remote GDB but have not yet tried it.
  2. I have yet to find a Windows version of CMake that is compatible with both our codebase and Qt Creator. The CMake file is used by the IDE as a project file, allowing autocomplete, refactoring, and the other niceties an IDE provides; it is not used for the building itself, as mentioned above. It is still possible to open and edit the source files. This is the method I am currently using, but it has room for improvement.

@michaelgregorius
Contributor

@breebee Are you referring to the file MidiController found under src/core/midi? Only asking because I have just checked where it is used and was not able to find a code path where an instance of that class is created. I have also set a breakpoint in its constructor which was not triggered, e.g. when I set the MIDI device for the default Triple OSC.

There is a controller factory method Controller::create( ControllerTypes _ct, Model * _parent ) which creates an instance of a MidiController if the type of the enum _ct is Controller::MidiController. However, after renaming Controller::MidiController to something else I only had to adjust code where the enum was read but no places where it was set to a value. So it seems that the code path in the factory method is dead.

There is also a class AutoDetectMidiController which inherits from MidiController. This class is instantiated in the class ControllerConnectionDialog (in ControllerConnectionDialog::midiToggled()) which in turn is only used in the method AutomatableModelViewSlots::execConnectionDialog() which is not called from anywhere. So this also seems to be a dead end.

Does anyone know whether these classes are still used? Otherwise we should remove them so that they don't create any confusion.

@musikBear

they are not of a professional standard

They are fine and extremely informative.
I know you own an 8-core machine, and yet it took hours to build, right?
I can't imagine how long my older, meager 2-core box would take...
Code::Blocks was mentioned; is that capable of compiling LMMS?

@tresf
Member

tresf commented Sep 20, 2015

I know you own an 8-core machine, and yet it took hours to build, right?

Off topic, but the build process should take 5-20 minutes on most relatively modern hardware. The major factor is the -j4 switch which allows utilization of more than one thread at a time, but is incompatible with Apple currently. @musikBear your machine can't build because we don't have a win32 build tutorial written yet. :)

@tresf
Member

tresf commented Sep 20, 2015

The CMake file is used by the IDE as a project file, allowing autocomplete, refactoring, and the other niceties an IDE provides; it is not used for the building itself, as mentioned above. It is still possible to open and edit the source files. This is the method I am currently using, but it has room for improvement.

Agreed. A good IDE can help coding even if it can't help building.

@curlymorphic
Contributor

I know you own an 8-core machine, and yet it took hours to build, right?

It has never taken hours to build, but the whole process of setting up the build environment takes some time with a slow net connection, though this only needs to be done once. When I am building using MSYS2, I am running Windows in a virtual machine. I am guessing each build takes ~10 mins. It is not fair to compare that time against a natively running Linux box.

@michaelgregorius
Contributor

For fast builds I can recommend the following:

  • Use the -j4 switch (or -j8, whatever fits the machine).
  • Build natively.
  • Have the sources and the build directory on an SSD.

Using this, a full Linux debug build only takes 1:36 minutes on my machine. 😃

@softrabbit
Member Author

I looked into the code already and I think I need to make the switch statements in the MIDI controller .cpp for different selected temperaments and assign the mathematical pitch bend for every individual note on a 12-note scale. Can someone help me identify how to reference an individual note

With MIDI, you can have only one pitch bend value in effect per channel, so you can't e.g. have an unbent C and a bent E playing at the same time without using multiple channels. I have no idea what the implications are when you throw VSTs into the mix: do they in general listen to one channel or all of them ("omni" mode)?

For the instruments not using MIDI, these 2 lines calculate the pitch:

m_frequency = BaseFreq * powf( 2.0f, pitch + m_instrumentTrack->pitchModel()->value() / ( 100 * 12.0f ) );
m_unpitchedFrequency = BaseFreq * powf( 2.0f, pitch );

Turning that into a table lookup should be a good starting step IMO.
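As a rough illustration of that table-lookup idea (hypothetical names, not a patch against the actual NotePlayHandle code): the per-key base frequency would come from a 128-entry table filled from the active tuning, and the pitch knob / detune would still be applied on top, in cents.

```cpp
// Hypothetical sketch: replace the fixed 2^x formula with a per-key frequency table.
#include <array>
#include <cmath>

// Filled once from the active tuning (12-TET by default, later from a Scala scale).
static std::array<float, 128> s_keyFrequency;

static void initDefaultTuning()
{
	for( int key = 0; key < 128; ++key )
	{
		// Standard 12-TET: A4 (key 69) = 440 Hz.
		s_keyFrequency[key] = 440.0f * std::pow( 2.0f, ( key - 69 ) / 12.0f );
	}
}

// Frequency for a key, with the instrument's pitch offset applied in cents.
static float playFrequency( int key, float pitchOffsetCents )
{
	const float base = s_keyFrequency[key];                     // table lookup
	return base * std::pow( 2.0f, pitchOffsetCents / 1200.0f ); // detune on top
}
```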

@musikBear

@curlymorphic

I am guessing each build takes ~10 mins.

Thanks a lot for clearing that mistake up, Dave!
Then I guess I should be able to do it, after (thanks @tresf) an update to Win7... if my box can run it.

@tresf
Member

tresf commented Sep 21, 2015

Then I guess I should be able to do it, after (thanks @tresf) an update to Win7... if my box can run it.

You'll need to be running the 64-bit build of Windows 7 to follow our build tutorial, but I still wouldn't recommend it for a first-timer. Our Linux builds are much (much!) quicker and easier to follow. Install VirtualBox and Ubuntu 14.04 in a virtual machine and you can keep XP for a bit longer. 👍

@michaelgregorius
Contributor

@softrabbit Where is the NotePlayHandle generated, by the way? I think it's important that the microtonality stuff can be configured for individual instruments and that it is not something that's set globally. I can imagine that users do not want to have all instruments in a track use the same microtonality. Or that they don't want alternate tunings for the drums.

Just for a clearer picture, here is how it works with VSTs: A VST instrument receives the stream of MIDI data that is directed to the instrument. It is then up to the instrument to decide what to do with the note on and note off events. A simple instrument might just calculate the frequency from the note number using a rather simple formula (that I currently don't have in my head 😄). Another VST might be able to read Scala files and will then map the note numbers to the tuning that's stored in the Scala file. Using this approach gives maximum flexibility to the user.
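(For reference, the "rather simple formula" is presumably the standard 12-TET mapping, f(n) = 440 Hz * 2^((n - 69) / 12), i.e. MIDI note 69 = A4 = 440 Hz.)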

@breebee

breebee commented Sep 21, 2015

Thank you @michaelgregorius for investigating that for me, and @softrabbit, thank you as well for directing me to the file. I am attaching an image of how I was able to manually create a microtonal scale in Ableton by making an instrument rack 12 times, restricting a single note per instance, and manually adjusting the pitch for every note. Would something like this be possible directly on the MIDI side, and would it consume a lot of memory doing it like this?
(screenshot attached)

PS: if I successfully build on Windows, I will make a video tutorial.

@musikBear

Our Linux builds are much (much!) quicker and easier to follow. Install VirtualBox and Ubuntu 14.04 in a virtual machine and you can keep XP for a bit longer.
@tresf Thanks 👍, probably what I should do then: get a new HD and degrade my current, almost 'crowded' HD to slave.

@michaelgregorius
Contributor

@breebee There is no need to manipulate MIDI data or MIDI pitch modulations. The MIDI data will only provide you with the information about which key/note was played (in the form of a number between 0 and 127). This note number is then used in a simple lookup table to find the corresponding frequency that the instrument should play.

This means that the simplest implementation would be a lookup table (array) of length 128 which directly maps the note numbers to the corresponding frequencies of the scale. The array would be initialized according to the Scala file used and a potential keyboard mapping described in a .kbm file. If there is no keyboard mapping file, LMMS should use a default mapping, e.g. the one described in the documentation of the keyboard mapping format (see below).

The scala documentation can be found here:

I guess it makes sense to break down the task into several steps so that it does not seem too overwhelming. Example:

  1. Find the correct place where the mapping needs to be performed (as described above this should be associated with an instrument).
  2. Once you know the place, implement the standard mapping using the table lookup described above, i.e. map[69] = 440.0, etc. The formula for the initialization is map[n] = 440.0 * pow(2, (n - 69) / 12.);.
  3. Once the standard map is working try playing around with it, e.g. by initializing everything an octave higher.
  4. Wrap your head around the Scala specification and implement the initialization of scales from Scala files, using the default keyboard mapping.
  5. Wrap your head around the key mapping specification and extend the initialization of the mapping (array) to also use keyboard mappings.

As was already mentioned by @softrabbit, you may also have a look at the ZynAddSubFX implementation.
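To make steps 2, 4 and 5 a bit more tangible, here is a rough sketch that fills such a 128-entry array from a parsed Scala scale (reusing the hypothetical SclScale from the .scl sketch earlier in this thread). Note that it uses a deliberately simplified default mapping -- key 60 is the tonic at a fixed base frequency and the scale repeats at its last interval -- rather than the full .kbm default, which also defines a reference key and frequency.

```cpp
// Hypothetical sketch: build a note→frequency map from a loaded Scala scale.
#include <array>
#include <cmath>

static std::array<double, 128> buildNoteMap( const SclScale & scale,
						int tonicKey = 60,         // simplified: middle C is the tonic
						double baseFreq = 261.626 )
{
	std::array<double, 128> map{};
	const int degrees = static_cast<int>( scale.cents.size() );
	if( degrees == 0 ) { return map; }
	const double octaveCents = scale.cents.back(); // "formal octave", e.g. 1200.0 (2/1)

	for( int key = 0; key < 128; ++key )
	{
		const int steps = key - tonicKey;
		// Split the distance from the tonic into whole scale repeats plus a degree.
		int repeat = steps / degrees;
		int degree = steps % degrees;
		if( degree < 0 ) { degree += degrees; --repeat; }

		// Degree 0 is the (implicit) tonic; degree d > 0 uses the d-th pitch line.
		const double cents = repeat * octaveCents
			+ ( degree == 0 ? 0.0 : scale.cents[degree - 1] );
		map[key] = baseFreq * std::pow( 2.0, cents / 1200.0 );
	}
	return map;
}
```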

@softrabbit
Member Author

I think it's important that the microtonality stuff can be configured for individual instruments and that it is not something that's set globally. I can imagine that users do not want to have all instruments in a track use the same microtonality. Or that they don't want alternate tunings for the drums.

Multiple tunings, stored globally, each can be shared by multiple instruments?

@breebee

breebee commented Sep 23, 2015

Maybe a toggle button on the actual instrument's GUI would resolve the issue of drums being affected by the tuning. Using a virtual box might be the solution for building on Windows, but I am going to try multiple solutions I have found for building Blender on Windows with CMake first this week, and will have an update for building LMMS on Windows if successful. BTW, I think Apple's "Logic Pro" DAW is the only one I've seen with global microtuning. It might be easier to implement something like http://www.vst4free.com/free_vst.php?id=1430

@breebee

breebee commented Sep 23, 2015

I found the ZynAddSubFX Microtonal .cpp and header: https://github.com/LMMS/zynaddsubfx/tree/master/src/Misc

@softrabbit
Member Author

Maybe a toggle button on the actual instrument's GUI would resolve the issue of drums being affected by the tuning.

A drop down box on the "Misc" tab?

It might be easier to implement something like http://www.vst4free.com/free_vst.php?id=1430

Well, that's the part that should be built into some suitable spot when dealing with MIDI-based instruments (VSTs, Zyn...). Probably with the options to send a bulk tuning dump, single-note tuning messages, or pitch bends on up to 16 channels. Any one of those would be preferable to manually loading a scale file in the plugin, but as a last resort that's usable, too.
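For the pitch-bend route specifically, the arithmetic is simple, but it depends on the receiver's bend range, which is exactly why it gets awkward: only one bend value can be active per channel, so every simultaneously sounding note with a different offset needs its own channel. A minimal sketch (made-up helper name; assumes the common default of a ±2-semitone bend range, which the receiving plugin may or may not actually use):

```cpp
// Hypothetical sketch: turn a detune in cents into a 14-bit MIDI pitch bend value.
// 8192 means "no bend"; the result is only correct if the receiver really uses
// the assumed bend range (±2 semitones = ±200 cents is a common default).
#include <algorithm>
#include <cmath>
#include <cstdint>

static uint16_t centsToPitchBend( double cents, double bendRangeCents = 200.0 )
{
	const double value = 8192.0 + ( cents / bendRangeCents ) * 8192.0;
	return static_cast<uint16_t>( std::clamp( value, 0.0, 16383.0 ) );
}

// The 14-bit value is then sent as two 7-bit data bytes of a 0xEn pitch bend message:
//   lsb = bend & 0x7F;  msb = (bend >> 7) & 0x7F;
```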

  1. Once the standard map is working try playing around with it, e.g. by initializing everything an octave higher.

Or implement an easy toggle to set A at 432 Hz to satisfy some trolls.

@michaelgregorius
Contributor

Multiple tunings, stored globally, each can be shared by multiple instruments?

Sounds like a good option to me. This way there would also be one central place where the data is stored when the file is saved.

@breebee I think the problem with implementing this on the MIDI side is that, if you need to use the pitch modulation to describe the final pitch, you never know how a VST is going to interpret it. If the plugin is set to interpret full pitch modulation as two semitones (or even an octave), this approach will likely give wrong results. I assume that this is also the cause of the complaint found on the site you linked:

Use in Ableton live. It works but the tuning is wrong. I have .tun file with pure temperation settings. In zebra & alchemy this tuning works good. Not in this plugin. Frequencies are not correct. Instead of the desired values, it gives others. Where set 13 cents, this plugin set 42 & more bugs.

@breebee

breebee commented Sep 26, 2015

So LMMS is built with Qt? Sorry, I just noticed the Q's in the UI. Does this mean I have to have Qt Creator to build LMMS?

@michaelgregorius
Contributor

@breebee No, you don't need Qt Creator to build LMMS. Qt Creator is just an IDE that uses Qt and that is developed by the Qt developers. You can configure CMake to create several types of builds, e.g. for make or for Eclipse, etc.

LMMS needs Qt because the GUI is implemented with Qt (and some of the core classes are also used throughout the code). So you will need to install Qt and the development packages but you won't need to install Qt Creator.

I suggest that you install Ubuntu in a virtual machine and follow the build instructions in the GUI. This might give you a good overview of what is needed and how things play together. Once you can build everything you can get your feet wet by playing with the code.

@algorev

algorev commented Dec 17, 2017

So, how is the implementation going? I just stumbled on this issue and I am very interested in microtonal music. It seems that there has been no activity for a long time, though.

@breebee

breebee commented Jan 27, 2018

@algorev
Looks like the only solution for all VSTs in your DAW to really do microtonal music is to use Bitwig Studio and manually modify each note in the MIDI editor to pitch-shift to the correct frequency. Also, it is only going to be a two-decimal modification, so not as precise as you might like. Otherwise, if you're using a Mac, Logic Pro has a generic global tuning, but last time I checked it was very limiting, and I'm unsure if it will even work for all your instruments; maybe only the built-in ones.

@he29-net
Contributor

he29-net commented Dec 15, 2019

Hi,
is anyone actually still working on this, or is it buried deep in the stack and forgotten? In case this issue is "free", I would like to try and push it at least a few steps forward.

It seems like a relatively complex issue, so I would probably cut it up into more digestible pieces like this:

  • address MIDI-based instruments play an octave too low by default #1857, because it does not make much sense to add microtonal support when even standard 12-TET tuning does not work consistently across all instruments,
  • implement a microtuner class and dialog, which would be used to load and manage scales and keyboard mappings, and to expose required information to instruments or other plugins;
  • allow LMMS native instruments to switch between available scales and use the tuning information (probably accessed via a combo box added to the Miscellaneous tab);
  • figure out how to implement the support for VSTs (i.e. MIDI re-tune messages);
  • figure out how to do the same for ZynAddSubFX.

I'm not sure if I'll be able to address all of this, but I'll have some free time in the second half of this month so I could at least try to get this moving again after 4 years. :)

@LostRobotMusic
Contributor

LostRobotMusic commented Dec 15, 2019

Any xenharmonic scale editor in LMMS should be modeled after ZynAddSubFX's, in my opinion; there aren't any scales that aren't possible in that scale editor, and it's very easy to use.

That does introduce a good question though, how will LMMS and ZynAddSubFX microtonality mix? ZynAddSubFX has those abilities built in, so which takes priority? Or will the LMMS scale overwrite the ZynAddSubFX one?

@he29-net
Contributor

he29-net commented Dec 15, 2019

Well, that's basically why I put Zyn on the list as the last item: with native plugins and VSTs the way to approach it seems clear, but with Zyn it will likely require some custom solution which I'm not yet sure about.

I agree that the Zyn scale editor seems fine (although it seems to suffer from a severe case of amnesia), so the LMMS microtuner dialog would probably look just the same. So compatibility should not be a problem, as it would be a 1:1 mapping.

As for how to get it to re-tune: on the LMMS side it could behave like the other instruments -- if the combo box in Misc tab changes, a re-tune message would be emitted. That would allow the scale to be set and saved / stored in the same way for all instruments, and to be even easily automated.

The re-tune message would just need to be "custom made" for Zyn, or Zyn would have to get the MIDI re-tune message support. This is the part which I'm not sure about. It could also be a simple file name to load + execute the required load function in Zyn, in case LMMS wants to only support .scl and .kbm.

Unless the Scales dialog is disabled in Zyn, there is probably no way to prevent the user from overwriting it with something else, so the priority would be on the LMMS settings -- any changes in Zyn would not be saved (as is the case now) and would be overwritten by any newly generated retune event.

@he29-net
Contributor

Oh, dear.. I think the MIDI specification is going to give me a stroke, and I haven't even started yet.

First you have to register before you can even open any specs (because it's free! why wouldn't you?), then you get one massive document + about two pages worth of links to various individual amendments and a notice that anything in the main document can be outdated.

The main document is a mixture of half-decently typeset original document, various additions and modifications (made clearly using a different font and a different editor which places letters in headings with randomized x axis offset), and -- wait for it -- actual pasted images of scanned sheets of paper. And the amendments are, based on the one I downloaded, written in another toy word processor that typesets MIDI as "MI DI" etc..

Whew. I imagined the MIDI Association would have at least some standards, given that producing the specification is what they do. But it looks like as if after 1996 they fired the writers and all subsequent revisions were typed by first-year kindergarten students. Pretty embarrassing.


Well, anyway: the good news is that there is an official standard called MIDI Tuning Messages, and it seems pretty flexible -- basically allowing you to specify the frequency for all keys separately. The microtuner class could simply generate a mapping array and send it all in one batch to re-tune the instrument, or "inject" the mapping for each note separately in the "single-note" mode. The second mode has an advantage in that it does not require the target to keep any state, allowing us to side-step the problem of potentially inconsistent / overwritten settings of a plugin.

The bad news is that the support is not exactly universal, and some VSTs either don't support alternative tunings at all (probably most cases) or only offer tuning file support, like Zyn. So while adding support for MIDI Tuning Messages is probably a good idea overall, it is questionable whether it will actually be useful for anything in the near future, and whether we should go with the batch mode or the single-note mode. I.e. working on Zyn first could be more beneficial, and depending on the implementation perhaps even easier, as communication with proprietary VST plugins could be hard to debug.

But that's well down on the list in any case; the initial implementation for native LMMS instruments should not require any complicated messaging, they could just read the active mapping array directly at any time.
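To make the single-note mode a bit more concrete, here is a rough sketch of how such a message could be assembled. The helper name is made up, and the byte layout is written from my reading of the MIDI Tuning part of the spec (real-time single note tuning change), so it should be double-checked against the actual document before being relied on.

```cpp
// Hypothetical sketch of a real-time "single note tuning change" SysEx message.
// Assumed layout: F0 7F <device> 08 02 <tuning program> <count> [kk xx yy zz]... F7
//   kk     = MIDI key to retune
//   xx     = nearest 12-TET key at or below the target frequency
//   yy, zz = fraction above xx in units of 100/2^14 cents (MSB, LSB)
#include <cmath>
#include <cstdint>
#include <vector>

static std::vector<uint8_t> singleNoteTuning( uint8_t key, double freqHz,
						uint8_t device = 0x7F,   // 0x7F = "all devices"
						uint8_t program = 0 )
{
	// Target pitch expressed as a fractional MIDI key number (A4 = key 69 = 440 Hz).
	const double keyNumber = 69.0 + 12.0 * std::log2( freqHz / 440.0 );
	const int xx = static_cast<int>( std::floor( keyNumber ) );
	const int fraction = static_cast<int>( ( keyNumber - xx ) * 16384.0 );

	return {
		0xF0, 0x7F, device, 0x08, 0x02, program,
		0x01,                                          // one key is being changed
		key,
		static_cast<uint8_t>( xx & 0x7F ),
		static_cast<uint8_t>( ( fraction >> 7 ) & 0x7F ),
		static_cast<uint8_t>( fraction & 0x7F ),
		0xF7
	};
}
```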

@musikBear

@he29-net I tried to have MIDI in an AS2 game, but I failed.
Somewhere I have a MIDI file in pure ASCII, though.
Would it help you if you had a copy of that, e.g. a complete, readable MIDI file?
If you would find it useful, I would dig my Win98 box up from the basement and search for it.
Mind you, it is at least 20 years old, so it is MIDI 0!
That alone is a serious limitation! In fact, it may not even support microtonality at all!
But if you think it could be useful, I will attempt to find it, unless our resident rat has eaten the hard drive :P

@he29-net
Contributor

@musikBear Thanks for the offer, but there is no need to search for anything -- what I was talking about is MIDI messages, not MIDI files. I'm not really interested in the storage format; what I need to figure out is the sequence of MIDI commands required to retune the target MIDI synthesizer, be it a plugin or even a physical device (since LMMS supports MIDI out).

The specification is a bit painful to read, but it should have all the required information, so no worries. :)

@he29-net
Contributor

he29-net commented Dec 22, 2019

After digging through the code a bit, I realized that not everything would work as I originally intended, so here is a little update and a modified ToDo list.

Most of all, I thought it would be a good idea to have a scale manager tab in the Settings dialog, which could be used to load or make new scales. These would then become available to individual instruments in a ComboBox in the Misc tab.

Saving and loading of this scale database would be quite problematic, though. A project opened on another computer could sound completely broken if a given scale wasn't available or had a different ID. Ensuring the consistency of various scales and their associated ComboBox IDs would be pretty messy even within one project, not to mention trying to ensure consistency when opening the project on another computer, where another set of scales (or the same scales in a different order) may be loaded.

So now I think it would be best to avoid any global configuration and simply allow the user to load (or manually enter) a scale for a given instrument directly in the appropriate tab. The scale is then a part of the instrument and can be saved along with all its other data, making it much simpler and more reliable.

To allow automation of scale-changes, the instrument itself could hold multiple scales, but that is something that could be implemented later. To keep things simple, the first implementation will only have one scale.

EDIT: After trying to cram all the required controls into the tiny, tiny instrument window, I changed my mind once again. Having project-wide scale and keymap definitions seems to lead to a much better workflow and cleaner UI. It will be more difficult to implement (save / load of multiple scales per project, handling ID updates in all instruments when a scale in the middle gets deleted, ...), but it is at least better than my original "install-wide" Settings dialog idea. Automation is easy this way (just switch between available scales and mappings) and the user could set a default scale that is applied to all new instruments instead of loading it manually for each one, or they could edit an existing scale and have it update on all instruments that use it etc...

Scale management will still reside in the Microtuner class, but instead of being a global "scale manager" that is queried for note→frequency conversions for a given scale ID etc., it will be a part of each instrument and will operate on its currently active / loaded scale. Apart from translating notes to frequencies, it will also contain the methods for .scl loading and methods for generating MIDI tuning messages (and whatever is going to be used to retune Zyn). Each instrument will then use the Microtuner as needed (e.g. VeSTige will use it to generate retune messages, TripleOsc will only use it to translate notes to frequency).

  • address MIDI-based instruments play an octave too low by default #1857, because it does not make much sense to add microtonal support when even standard 12-TET tuning does not work consistently across all instruments [done, needs to be tested and merged];
  • hunt down all BaseFreq constant users, figure out why they use it and how to supply a configurable value instead [done; comments added to places that still use a constant];
  • figure out what to do about pitch bend / detuning on non-12-TET scales, as it is usually given as a number of 12-TET semitones or cents, and most JI scales also have non-uniform intervals [done; 1) keep Master Pitch and instrument pitch in 12-TET semitones, as there is no other clear reference interval for non-uniform scales; apply them to the computed frequency after scale and key mapping. 2) Note detune, as visualized in the Piano Roll, implies interpolation between neighboring keys / notes. Having it behave any other way would probably be quite confusing, so the goal is to transition smoothly between the frequencies assigned to individual keys. EDIT: Nothing can be assumed about the key map and scale: frequency changes may not always be monotonic, making the "gradual transitions" quite unpredictable. So in the end, it seems the only safe alternative is to just go with cents for everything.];
  • implement methods for translation of key + detune inputs into frequency, given a keymap and note-frequency map; use them instead of a fixed 12-TET formula when the microtuner is enabled [done; a lookup table alone may not be sufficient because of note detuning -- see the small sketch after this list];
  • implement project-wide scale and keyboard mapping loading, translation and storage class + its setup dialog;
  • implement MIDI single-note and batch tuning messages (for VSTs and MIDI-based instruments);
  • implement MIDI re-tune message support for OpulenZ;
  • figure out how to re-tune ZynAddSubFX; Since LV2 got merged, I think it would be a wasted effort. I assume LV2 can handle retune messages (can it?), so adding support for them to ZynFusion would be cleaner than adding some sort of hack to our "local copy" of zasfx.
  • figure out what is needed for full keyboard mapping support.
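As a small illustration of the "apply pitch offsets in cents after the scale / key mapping" decision from the list above (hypothetical names, not the actual Microtuner API; 12-TET semitones are simply 100 cents each here):

```cpp
// Hypothetical sketch: per-key frequency from the tuning map, with all pitch
// offsets (note detune, instrument pitch, master pitch) applied afterwards in cents.
#include <array>
#include <cmath>

static double noteFrequency( const std::array<double, 128> & noteMap,
				int key,
				double noteDetuneCents,       // per-note detune (Piano Roll)
				double instrumentPitchCents,  // instrument pitch knob
				double masterPitchCents )     // project master pitch
{
	const double offsetCents = noteDetuneCents + instrumentPitchCents + masterPitchCents;
	return noteMap[key] * std::pow( 2.0, offsetCents / 1200.0 );
}
```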

Also, it would be nice to have something like LcdSpinBox for floats. Changing the base note frequency using a knob seems really weird, as it needs both high range and high resolution. And in general, having a nice big number like 440.00 seems much better than guessing if a knob is or is not turned half a pixel to the right, which would likely be the difference between 440 and 442 Hz.

@he29-net
Contributor

he29-net commented May 29, 2020

Coming soon(-ish):
(screenshot)

The first PR will be only for native, non-MIDI-based instruments (to keep the PR size reasonable and to gather some feedback first, before building more stuff on top of it).

What's left to do is saving / loading (both .scl / .kbm and in project) and some GUI-related functionality and polish.

@DomClark
Member

DomClark commented Mar 3, 2022

Implemented in #5522.

@DomClark DomClark closed this as completed Mar 3, 2022
@breebee

breebee commented Mar 4, 2022

Thank you

@breebee

breebee commented Mar 9, 2022

I can't seem to find the microtuner anywhere :(
Can someone help me find the panel?

@CLandel89
Contributor

CLandel89 commented Mar 9, 2022

I can't seem to find the microtuner anywhere :( Can someone help me find the panel?

There's a new "tuning fork" button between the tempo and the "project notes" button:
image

And theres a new section "..." in most instrument plugins (only supported ones):
image

@breebee

breebee commented Mar 11, 2022

Thank you! The Windows build must be missing it on 1.3.0-alpha.1.102g89fc6c960, lol. I looked for hours before asking :)
