Davis, Wallace, Dale: I saw the latest discussions on the list about
using a software synthesizer on MIDI files, and about taking into
account the richness of the barrel organ (registrations, drums).
On our side (fair organs), we worked a bit in reverse, going from the
book to MIDI rather than from MIDI to the organ. Our need is to make
music books for Limonaires (several scales) and other fair organs. We
chose a book file format (XML) describing the holes in the book. These
holes are positioned on a scale that describes the instrument's
capabilities.
Our objective was an unambiguous interpretation: we keep the digital
version of the book to work on, together with an instrument definition
that explains how to translate it into the synthesizer language (MIDI
in our case). Here is the history of it.
In the software we use (since 2009), we started with the Gervill MIDI
synthesizer, using soundfonts for instrument rendering; Gervill is now
the official Java synthesizer in the JVM/JDK [Java Virtual Machine /
Java Development Kit]. It is not related to GRBL, which is a software
component for driving CNC machines (we use GRBL both on the mechanical
punch and on the laser cutter for books). More information about
Gervill here:
https://github.com/bluenote10/gervill
These soundfonts are created on the fly by the software from a
*.instrumentbundle file, which is essentially a zip file containing
the WAV sound samples of the pipes, together with the register set
(Bourdon, Flutes, etc.). All this content is freely available on the
web site. If tests on *.sf2 audio samples are needed, you can launch
the software and the *.sf2 files are created in a cache folder.
We faced the following issues in modeling original handcrafted books
and in trying to reproduce the full set of organ features in MIDI
files:
1. Register encoding may have several possible representations --
notes can be duplicated in several channels, each with a different
"MIDI instrument" loaded. The duplicated notes then need to be
synchronized to simulate the coupling of a note with more than one
register. In a MIDI file every instrument has its own notes, so we
have to maintain the note-synchronization constraints whenever
multiple registers are activated. Extending the sequencer and
synthesizer is also possible on this topic, as is adding a new
extended MIDI command to handle this "multiple note synchronization"
problem.
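As an illustration of this duplication, here is a small Java sketch
using the standard javax.sound.midi classes (not our actual code; the
channel-to-register assignment is hypothetical) that writes the same
hole to several channels with identical timing:

```java
import javax.sound.midi.*;

// Illustration only: when registers are coupled, one hole must produce
// synchronized note-on/note-off pairs on several channels, each channel
// carrying a different "MIDI instrument".
public class RegisterCoupling {

    // Add the same note to every channel of a coupled register set, with
    // identical start tick and duration so the copies stay synchronized.
    static void addCoupledNote(Track track, int[] channels,
                               int pitch, long startTick, long durationTicks)
            throws InvalidMidiDataException {
        for (int ch : channels) {
            track.add(new MidiEvent(
                new ShortMessage(ShortMessage.NOTE_ON, ch, pitch, 100), startTick));
            track.add(new MidiEvent(
                new ShortMessage(ShortMessage.NOTE_OFF, ch, pitch, 0),
                startTick + durationTicks));
        }
    }

    // Build a tiny demo track and return its event count.
    static int demoTrackSize() throws InvalidMidiDataException {
        Sequence seq = new Sequence(Sequence.PPQ, 480);
        Track track = seq.createTrack();
        // hypothetical coupling: channel 0 = Bourdon, channel 1 = Flute
        addCoupledNote(track, new int[] {0, 1}, 60, 0, 480);
        return track.size(); // 4 note events + the automatic end-of-track event
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demoTrackSize()); // prints 5
    }
}
```

If one copy of the note drifts from the others, the coupling illusion
breaks, which is why we treat the synchronization as a constraint to
maintain.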
2. Some sections in fair organs may also have the same notes in
several register "sets" [divisions] (melody, countermelody). For
example, you can have a C#5 in the melody and a C#5 in the
countermelody, so you can't put them on the same track and channel.
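To make this concrete, here is a minimal sketch (the fixed channel
mapping is an assumption for the example, not our real implementation)
where each division gets its own channel, so that two simultaneous C#5
notes stay independent:

```java
import java.util.Map;

// Illustration only: on a single channel, two overlapping C#5
// note-on/note-off pairs are ambiguous (a note-off cannot say which
// C#5 it ends), so each division is given its own channel.
public class DivisionChannels {

    // hypothetical assignment: one MIDI channel per division
    static final Map<String, Integer> DIVISION_CHANNEL = Map.of(
        "melody", 0,
        "countermelody", 1);

    static int channelFor(String division) {
        return DIVISION_CHANNEL.get(division);
    }

    public static void main(String[] args) {
        // C#5 (MIDI note 73) can now sound in both divisions at once:
        System.out.println("melody C#5 on channel " + channelFor("melody"));
        System.out.println("countermelody C#5 on channel " + channelFor("countermelody"));
    }
}
```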
3. There are _a lot_ of different pipe sounds, and exploding the notes
into several channels (register "sets" and registrations) explodes the
number of channels to use. Ref. http://www.organstops.org/
In our software we had to raise the default channel count from 16 to a
much larger number to be able to listen to a book.
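The arithmetic behind this explosion can be sketched as follows (the
division and register names below are made up for the example, not a
real Limonaire scale):

```java
import java.util.*;

// Illustration of the channel-explosion arithmetic: if every
// (division, register) pair gets its own channel so registrations can
// be rendered independently, the count multiplies out quickly.
public class ChannelBudget {

    static int channelsNeeded(Map<String, List<String>> registersByDivision) {
        int total = 0;
        for (List<String> registers : registersByDivision.values())
            total += registers.size();
        return total;
    }

    public static void main(String[] args) {
        // invented example organ, not a real scale definition
        Map<String, List<String>> organ = new LinkedHashMap<>();
        organ.put("melody", List.of("Bourdon", "Flute", "Violin", "Trumpet"));
        organ.put("countermelody", List.of("Bourdon", "Cello", "Baritone"));
        organ.put("accompaniment", List.of("Bourdon", "Flute"));
        organ.put("bass", List.of("Bourdon", "Trombone"));
        organ.put("percussion", List.of("Snare", "BassDrum", "WoodBlock",
                                        "Cymbal", "Triangle", "Castanets"));
        System.out.println(channelsNeeded(organ)); // prints 17, already > 16
    }
}
```

Even this modest invented organ needs more than the 16 channels of
standard MIDI, which is why we raised the channel count in the
software.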
I'm aware that a couple of arrangers use a "book notation" inside the
MIDI file, keeping the "raw" book track arrangement and using the MIDI
mapper for listening. I'm not aware of how they can take the
registration topic into account this way. Their concern is more to
work on the book digitally and then listen to it (registration can
certainly be added later in the composition process).
Using MIDI in that way also leads to an issue with drums: these behave
differently from notes and can be triggered at the end of the hole as
well as at the beginning, or be defined by a longer hole for a drum
roll.
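One possible hole-to-MIDI mapping for drums could look like the
following sketch (the trigger modes, roll threshold, and roll step are
illustrative parameters, not our actual rules):

```java
import java.util.*;

// Illustration only: a drum hole can fire at its start or its end, and
// a hole longer than some threshold is expanded into a roll of
// repeated hits.
public class DrumHoles {

    enum Trigger { HOLE_START, HOLE_END }

    // Returns the ticks at which the drum should be struck for one hole.
    static List<Long> drumHits(long holeStart, long holeLength,
                               Trigger trigger, long rollThreshold, long rollStep) {
        List<Long> hits = new ArrayList<>();
        if (holeLength >= rollThreshold) {
            // long hole: drum roll, one hit every rollStep ticks
            for (long t = holeStart; t < holeStart + holeLength; t += rollStep)
                hits.add(t);
        } else {
            // short hole: single hit, at the start or the end of the hole
            hits.add(trigger == Trigger.HOLE_START ? holeStart
                                                   : holeStart + holeLength);
        }
        return hits;
    }

    public static void main(String[] args) {
        System.out.println(drumHits(0, 100, Trigger.HOLE_END, 400, 60));    // [100]
        System.out.println(drumHits(0, 480, Trigger.HOLE_START, 400, 120)); // [0, 120, 240, 360]
    }
}
```

A plain note-on/note-off pair cannot express "trigger at hole end" or
a roll directly, which is the mismatch described above.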
An implementation of the MIDI "translation logic" we use now can also
be found in the software, in the compilation process from the organ
instrument definition into MIDI:
https://github.com/barrelorgandiscovery/aprintproject/blob/master/aprint-core/src/main/java/org/barrelorgandiscovery/playsubsystem/GervillPlaySubSystemWithRegisterInstruments.java
During the past 2-3 years we implemented a software synthesizer to get
rid of these limitations, also adding more capabilities such as large
polyphony. These developments are aimed at small hardware platforms as
well as at integration into the software. This component works, and
further improvements will come in the future. It is available on
GitHub and freely downloadable: https://github.com/frett27/Ada-Synthetizer
We worked on reading the instrument bundle for the sounds, and on
decoding a MIDI subset well fitted to our needs. My current project is
to make a small music box using a resonator as the speaker.
The decision to move to a simple synthesizer was a mature one for our
needs, made after exploring FluidSynth and other alternatives;
portability to other systems, including embedded ones, was also well
weighed in this decision. I was also curious to work on a correct,
simple software synthesizer design.
There are lots of elements you can learn from and borrow from us:
The GitHub project: https://github.com/barrelorgandiscovery/
The web site: http://www.barrel-organ-discovery.org/site/index.html
Instrument hub: http://www.barrel-organ-discovery.org/instruments/
Some video tutorials on how we describe instruments and do the
rendering:
https://www.youtube.com/watch?v=kpGxkNdyLzo
Hope this will help and perhaps leverage your projects, or start
collaboration about these topics.
Patrice Freydiere
France
[ Jazzband Limonaire 2018 instrument bundle
[ https://www.mmdigest.com/Attachments/21/01/28/210128_221103_50_jazzband_limonaire_2018_instrument_bundle.zip