The acronym MIDI stands for “Musical Instrument Digital Interface.” In the simplest explanation, MIDI is a language (technically speaking, it’s a protocol) that allows digitally-controlled musical instrument devices to communicate with each other. MIDI has been around since the early 1980s. It was primarily used with synthesizers and sequencers, but quickly got implemented on all kinds of music-related gear (drum machines, mixers, drum pads, synchronization boxes, etc.)
Like QWERTY for Music
The most common misconception about MIDI is that MIDI passes audio from one instrument to the next. It’s easy to confuse what’s really going on. When you play a note on a MIDI controller and you hear sounds from a MIDI synth (or a Mac or an iPad/iPhone), you might think that the “sound” is in the controller, but that’s not what’s happening. The easiest analogy for understanding the relationship between MIDI controllers and MIDI synths (or software instruments controlled by MIDI on a Mac or iPad/iPhone) is to think about how a Bluetooth mouse or keyboard interacts with a Mac or iPad/iPhone. The Bluetooth keyboard just sends the keystroke info over to the Mac or iPad/iPhone; it doesn’t magically move the Mac or iPad/iPhone’s processing power over to the keyboard. The Bluetooth keyboard just “controls” the software on the platform. With MIDI, the controller is just sending the “keystrokes” over to the iPad/iPhone or Mac. (Okay, there’s a little more going on than keystrokes, but not too much more…) Audio (sound) isn’t passed over MIDI in the traditional sense.
In the context of using the jamstik (a MIDI guitar controller), your fretting and picking activity generates MIDI events that the Mac/iPad/iPhone recognizes and connects to the appropriate MIDI software responsible for making sounds. The jamstik is like a QWERTY keyboard typing letters into a sonic version of Text Edit.
MIDI is a little more complex than that illustration, but if you can wrap your head around that concept you can follow the rest of the details.
Breakdown of a MIDI event from a jamstik
At the most basic level, it works like this:
Note “events” (note numbers with corresponding velocities) are transmitted on MIDI Channels. Back in the 1980s, when MIDI was transmitted on cables (how primitive!), you needed to be able to distinguish between the MIDI events you wanted to send to Synth A and those meant for Synth B. So, using assignable channels on your MIDI devices was a way to keep your messages coming and going to the right synths. Each MIDI port (or cable) could address 16 MIDI channels. (This still holds true. If you see a physical 5-pin DIN MIDI connector, the maximum number of MIDI Channels that port can address is 16.) If I assigned Synth A to MIDI Channel 1 and Synth B to MIDI Channel 2, the messages on a shared MIDI cable wouldn’t get confused between Synths A & B.
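To make that concrete, here’s a minimal Python sketch of the raw bytes behind a Note On event. This isn’t from any jamstik software; the function name is my own, and it just shows the standard MIDI byte layout: a status byte that encodes the message type and the channel, then the note number and velocity.

```python
def note_on(channel, note, velocity):
    """Build the three raw bytes of a MIDI Note On message.

    `channel` is 1-16 as musicians count it; the wire format encodes it as 0-15.
    """
    if not 1 <= channel <= 16:
        raise ValueError("MIDI channels run from 1 to 16")
    # High nibble 0x9 = Note On; low nibble = channel. Note and velocity
    # are each 7-bit values (0-127).
    status = 0x90 | (channel - 1)
    return bytes([status, note & 0x7F, velocity & 0x7F])

# Middle C (note 60) at a moderate velocity on Channel 1:
msg = note_on(1, 60, 96)
print(msg.hex())  # → "903c60"
```

The channel lives in the low four bits of the status byte, which is exactly why one port tops out at 16 channels: four bits can only count that high.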
Back in the early days of MIDI guitar controllers, the convention was established that each string would transmit on its own MIDI Channel. There are a number of good reasons for this which I’ll spare you the details of, but the jamstik follows this convention in its default state. The Low E string is on Channel 6, the A is on channel 5, the D is on channel 4, the G is on channel 3, the B is on channel 2, and the high E string is on Channel 1.
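As a quick sketch of that mapping, here’s the default string-to-channel table expressed in Python. The open-string note numbers assume standard tuning (low E = MIDI note 40); the function is purely illustrative, not part of the jamstik’s actual firmware:

```python
# The jamstik's default string-to-channel convention described above.
STRING_TO_CHANNEL = {
    "low E": 6, "A": 5, "D": 4, "G": 3, "B": 2, "high E": 1,
}

# Standard-tuning open-string MIDI note numbers (an assumption for this sketch).
OPEN_STRING_NOTE = {
    "low E": 40, "A": 45, "D": 50, "G": 55, "B": 59, "high E": 64,
}

def fretted_note(string, fret, velocity=100):
    """Return (channel, note, velocity) for a fretted note on a given string."""
    channel = STRING_TO_CHANNEL[string]
    note = OPEN_STRING_NOTE[string] + fret  # each fret raises the pitch a semitone
    return channel, note, velocity

# The 3rd fret of the A string (a C note) transmits on channel 5:
print(fretted_note("A", 3))  # → (5, 48, 100)
```

Per-string channels are what let a synth bend or retrigger one string without affecting the notes still ringing on the other five.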
Since MIDI operations have largely moved into the realm of virtual connections in software, the 16-channel limitation isn’t bound to the cable anymore. However, MIDI software generally still follows the same “16 channels per port” convention. As this applies to the jamstik it’s not super relevant, but as you begin to delve deeper into different MIDI apps and software it will be important to remember that the jamstik creates its own virtual port via ZFi. Some software requires that you identify and select a virtual port, and as this happens over ZFi it will always be your “MyJamstikXXXX” ID from the WiFi browser.
MIDI does have a number of other commands commonly in use that aren’t tied to note events. They are called “Continuous Controllers.” Pitch Bend messages, which get sent when you bend a jamstik string, don’t need a note event to travel with them and are usually lumped in with the Continuous Controller family (strictly speaking, Pitch Bend is its own message type). Modulation messages are also Continuous Controllers. Volume (not to be confused with velocity) is a Continuous Controller, as well.
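Those non-note messages have byte layouts of their own. Here’s a hedged Python sketch (again, my own helper names, not any official API) showing a standard Control Change message and a Pitch Bend message, which carries a finer-grained 14-bit value split across two 7-bit bytes:

```python
# Standard controller numbers from the MIDI spec.
MOD_WHEEL = 1
VOLUME = 7

def control_change(channel, controller, value):
    """Three bytes: 0xB0 | channel, controller number, value (each 0-127)."""
    return bytes([0xB0 | (channel - 1), controller & 0x7F, value & 0x7F])

def pitch_bend(channel, bend):
    """`bend` runs -8192..+8191; 0 means no bend (raw center value is 8192)."""
    raw = bend + 8192
    lsb = raw & 0x7F          # low 7 bits
    msb = (raw >> 7) & 0x7F   # high 7 bits
    return bytes([0xE0 | (channel - 1), lsb, msb])

# Bending the high E string (channel 1) slightly sharp:
print(pitch_bend(1, 1024).hex())  # → "e00048"

# Riding the volume controller on channel 1:
print(control_change(1, VOLUME, 100).hex())  # → "b00764"
```

The 14-bit range is why pitch bends sound smooth rather than stair-stepped: there are 16,384 positions between full-down and full-up instead of the 128 a normal controller value allows.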
Christopher Heille is the Music Product Specialist with Zivix LLC - the company behind the jamstik and PUC. Chris is a champion for musicians everywhere and encourages anyone with any type of musical ability to get connected and start making music. His earliest "bandmates" were Tascam PortaStudios and Ensoniq SQ80's, supplementing the missing bass and keyboard players for his original live shows as a teenager. As a guitar player who wasn't intimidated by keyboards or MIDI, he was an early adopter of recording on DAWs and has been involved in various capacities (musician, vocalist, programmer, songwriter, engineer, producer, studio owner) on more projects than he can remember. Chris believes the future of the music business is not in New York City, Los Angeles or Nashville, but in the network of shared ideas and creativity. This belief continues to drive his efforts at Minneapolis-based Zivix, LLC. Find him on Twitter @drewchowen.
For all of GarageBand's amazing features and improvements, there's one place where some users believe GarageBand falls flat quickly: drum sounds. This blog is going to walk through a specific AudioUnit Extension-enabled app, UVI's BeatHawk, to play new drum sounds in GarageBand like a software plugin.
If you've wanted to use the jamstik+ for composition or notation, Guitar Pro 7 allows you to do exactly that for guitar, bass, drum, or keyboard using tablature and/or standard notation!