Designing a custom Push2 instrument in Max for Live/JavaScript: Part 2

Posted on December 27, 2020

This is part 2 of the two-part series on designing custom Push instruments in Max for Live (using JavaScript). In Part 1 we explored the LiveAPI object, and used it to construct an abstraction that allows us to monitor when the track that our instrument lives on is selected or deselected. In this second part we get to the fun stuff, and actually develop our Push instrument.

As a reminder, we want to develop an instrument that has a fully custom Push layout, but integrates nicely with the Ableton workflow: when the track of our instrument is selected, it will show the custom Push layout, but when another track is selected, we get the regular layout:

If you want to follow along, download Trichord.amxd, place it on a MIDI track, and open the patch in Max.


What the instrument does, exactly, is not really the point of this blog post, but I’ll briefly describe it just to make it easier to see why we’re doing certain things. The goal is to develop an instrument that will allow us to easily explore Japanese pentatonic scales using trichords, a term coined by Tommaso Zillio in his YouTube video The Simple Theory Of Japanese Music Scales. The basic idea is very simple: scales will consist of two trichords (sequences of three notes) like this:

C  C♯  F
G  G♯  C

In other words, the C, F and G are always fixed, but then the note in between the C and the F and the note in between the G and the C can be varied. Some standard choices are

Scale          First choice   Second choice
Miyako-Bushi   C♯             G♯
Ritsu          D              A
Min Yo         D♯             A♯
Ryu Kyu        E              B

but non-standard choices are possible as well.
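The real implementation lives in trichordstate.js; purely as an illustration (the function name and the representation here are made up, not taken from that module), constructing a scale from the two choices might look like this, with notes given as semitone offsets from the root:

```javascript
// Build a pentatonic scale from two trichord choices.
// The root, fourth and fifth (0, 5 and 7 semitones) are fixed;
// 'first' is the note between the root and the fourth (1..4),
// 'second' is the note between the fifth and the octave (8..11).
function buildScale(first, second) {
  return [0, first, 5, 7, second];
}

var miyakoBushi = buildScale(1, 8);  // C C♯ F G G♯
var ritsu       = buildScale(2, 9);  // C D  F G A
var minYo       = buildScale(3, 10); // C D♯ F G A♯
var ryuKyu      = buildScale(4, 11); // C E  F G B
```

Any other values of `first` and `second` give one of the non-standard scales.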

Our instrument makes this available to the player in a very direct manner: on the left are four buttons that can be used to pick the first note; on the right are four buttons to pick the second note; and in the middle are six buttons that can be used to actually play the scale. We also provide two dials to select a standard scale and a root note for the scale. See the short video on YouTube for a demonstration of how the instrument is used.

In the code, the theory of trichords is implemented in trichordstate.js; this is just straightforward JavaScript code with nothing that is specific to the Push or even to Max for Live or Ableton, so I’ll not discuss it further in this blog post. You might want to take a quick look to familiarize yourself with this module.

Patcher top-level

If you open the patcher in Max (and exit presentation mode), you will see

As before, the logic of the patcher is implemented by the [js] object in the center of it all. Let’s briefly discuss the infrastructure around it. The top two boxes are not specific to our device, and would be useful in any custom Push instrument:

The bottom two boxes are specific to our instrument:

Finally, let’s discuss the MIDI routing at the bottom. When our device takes control of the Push’s button matrix, we must generate our own MIDI events in response to button presses. We do this by generating MIDI notes and sending them to midiout.

In addition, we also route midiin straight to midiout. The first reason is that if our device is disabled (and the Push is showing the normal instrument layout), MIDI will work as normal. The second is that even if our device is enabled, we do not take control over the Push’s pitch bend strip, and so pitch bend information will still come in on midiin. By routing this straight to midiout, pitch bending will work as normal even when our custom instrument is used.

The JavaScript code

The bulk of the work happens in the JavaScript code. We will describe this in a bottom-up manner, starting with how we interface with the Push, and working our way up to how our specific instrument uses that code.

Interfacing with the button matrix

Both the Push controller and its button matrix are represented by LiveAPI objects. We will start by looking at the code for interfacing with the button matrix, see buttonmatrix.js. Supposing we already have the LiveAPI object corresponding to the Push, here’s how we find the button matrix:

function findButtonMatrix(push, callback) {
  var buttonMatrixId ="get_control", "Button_Matrix");
  var buttonMatrix   = new LiveAPI(callback, buttonMatrixId); = "value"; // Monitor the buttons
  return buttonMatrix;
}

The Push exports a whole bunch of controls; here we are interested only in the Button_Matrix, but feel free to explore: get_control_names can be used to get a list of all of them. The property that we are monitoring is the value of the button matrix: this is how we can respond to button presses. Every time the user presses a button, the callback of the LiveAPI object is called, and is passed the column, row and velocity.

The only thing we do in the ButtonMatrix constructor is call findButtonMatrix with a callback that makes these arguments available in a slightly more convenient format:

exports.ButtonMatrix = function(push, object, callback) {
  this.buttonMatrix = findButtonMatrix(push, function(args) {
    if(args[0] == "value" && args.length == 5) {, args[2], args[3], args[1]);
    }
  });
}

(we are omitting args[4] here, which I think might be a MIDI channel? not entirely sure about that.)

The other important function in ButtonMatrix makes it possible to change the colours of the buttons. It is only a single line, but it makes this more convenient and moreover documents how this must be done in the first place:

setColor: function(col, row, color) {"send_value", col, row, color);
}

Finally, ButtonMatrix exports a deleteObservers function in the same way that OurTrack does.

Interfacing with the Push

The Push class in push.js contains some low-level code for interfacing with the Push, as well as some code to make working with the Push more convenient. Here we will primarily focus on the low-level code, and just summarize the rest.

First, we need to find the Push. This is done by the following code:

function findPush() {
  var liveApp            = new LiveAPI(null, "live_app");
  var numControlSurfaces = liveApp.getcount("control_surfaces");

  for (var i = 0; i < numControlSurfaces; i++) {
    var controlSurface = new LiveAPI(null, "control_surfaces " + i);
    if (controlSurface.type == "Push2") {
      return controlSurface;
    }
  }

  return null;
}

We enumerate all control surfaces, and stop at the first one whose type is Push2. Obviously, if you need the code to work with multiple Push devices, you’ll have to adjust this. Note that we also make no attempt to deal with the Push being plugged in or unplugged while our device is loaded: we run this code once when the device is first initialized, and then do not run it again. There is definitely room for improvement here, but that is outside the scope of this tutorial.

The other important low-level function that our Push class provides makes it possible to grab or release control over the button matrix. Until we grab control, we cannot set the button colors or listen to button presses:

controlButtonMatrix: function(control) {
  if(!this.checkFound()) return;

  // this.controlSurface is the LiveAPI object returned by findPush()
  if(control) {"grab_control", "Button_Matrix");

    // Setting colors immediately after grabbing control does not work;
    // schedule initColors to run after a short delay instead
    var initColorsTask = new Task(initColors, this);
    initColorsTask.schedule(100);
  } else {"release_control", "Button_Matrix");
  }
}
The most important part here is the use of grab_control and release_control. However, every time we release and then grab control again, any colors that we previously set will have been erased and the Push will appear completely blank. Our Push class therefore remembers which colors have been set, and provides a function initColors that we can use to set all previously specified colors after we grab control. There is however a subtlety here: in my testing I found that it does not work to set the colors immediately after grabbing control; we must introduce a tiny delay. Not entirely sure why this is, but setting up a Task to run initColors seems to resolve the problem.

This is basically all the low-level code in the Push class. The main other feature the Push class offers is that it stores actions for each button; these actions can be set using

setAction: function(col, row, callback) {..}

and the code then automatically makes sure to invoke the right action when a button is pressed; the callback is passed the column, row, color and velocity of the button. There is nothing Live or Push specific about that code, and so we will not discuss it any further in this blog post; we will see it used in the next section, however.
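Since that code is plain JavaScript, the idea can be sketched in isolation. The following is a hypothetical illustration of the pattern, not the actual code from push.js: actions are stored per button, keyed by column and row, and looked up whenever the button matrix reports a press.

```javascript
// Hypothetical sketch of an action table for an 8x8 button matrix.
function ActionTable() {
  this.actions = {};
}

// Register a callback for the button at the given column and row
ActionTable.prototype.setAction = function(col, row, callback) {
  this.actions[col + "," + row] = callback;
};

// Invoked from the button matrix callback; buttons without a
// registered action are silently ignored
ActionTable.prototype.buttonPressed = function(col, row, color, velocity) {
  var action = this.actions[col + "," + row];
  if (action) {
    action(col, row, color, velocity);
  }
};
```

Note that the button matrix callback only reports column, row and velocity; the color is whatever we previously stored for that button when setting up the layout.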

The trichord instrument itself

The Trichord instrument itself is implemented in trichord.js. We will not discuss this code in detail here; most of it is straightforward, just implementing the functionality of our instrument. We will only give a summary, and comment on a few subtle aspects here and there.


In init we construct the Push object as well as the OurTrack object (which we developed in part 1); to the latter we pass a callback that grabs and releases control over the button matrix as our track is selected and deselected, respectively.

We then set up the colors on the Push as needed, and construct some actions. The way we create the necessary actions is probably entirely natural to a functional programmer, but might require a word of explanation for programmers less used to functional languages. Here’s how we are setting up the colors and the actions for the buttons in the middle that allow us to actually play the scale:

for(var i = 0; i < 6; i++) {
  push.setColor(1 + i, 4, 64);
  push.setAction(1 + i, 4, sendNote(i));
}

The key thing here is that sendNote is a function that returns a new function:

function sendNote(note) {
  return function(col, row, color, velocity) {
    outlet(0, [48 + activeScale[note], velocity]);
  };
}

It must return a function with four arguments (button column, row, color and velocity) because this is what our Push class expects. However, the function then ignores all of those arguments except for the velocity, and then outputs a MIDI note on the first (and only) outlet of our [js] object. Note that when the button is released, this function will be called with a velocity of 0, so we don’t have to worry about generating note-off MIDI commands. Finally, activeScale here is a patcher global variable that contains a list of notes of the current scale; we update it as the user is modifying or selecting the scale (or the root note).
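To see this closure pattern in isolation, here is a stand-alone version of sendNote, with outlet replaced by a stub and an example activeScale (both are stand-ins for illustration; in the real device, outlet is provided by the [js] object and activeScale is maintained by updatePush):

```javascript
// Stand-alone version of the sendNote pattern. 'outlet' is replaced
// by a stub that records the messages our [js] object would emit.
var emitted = [];
function outlet(n, msg) { emitted.push(msg); }

// Example only: Miyako-Bushi plus the root an octave up (six notes,
// one per button in the middle of the layout)
var activeScale = [0, 1, 5, 7, 8, 12];

function sendNote(note) {
  return function(col, row, color, velocity) {
    outlet(0, [48 + activeScale[note], velocity]);
  };
}

var playSecond = sendNote(1); // closure captures the note *index*
playSecond(2, 4, 64, 100);    // button pressed:  note-on,  pitch 49
playSecond(2, 4, 64, 0);      // button released: note-off, velocity 0
```

Because the closure captures the index rather than the pitch, changing activeScale later changes what the button plays, without re-registering any actions.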


The processing of the routing messages (see the description of the “Parameters” box above) is all very similar, and so rather than defining one function per message, we instead define an anything function that can handle any of them:

function anything() {
  switch(messagename) {
    // Messages that update the state
    case 'scale':
    case 'root':
    case 'custom1':
    case 'custom2':
      state[messagename] = arguments[0];
      updatePush();
      break;

    // Messages we just forward directly to the push object
    case 'showColors':
    case 'controlButtonMatrix':
      if(push != null) {
        push[messagename].apply(push, arguments);
      }
      break;

    default:
      error("Message '" + messagename + "' not understood\n");
  }
}

Function updatePush looks at the current state and updates the push accordingly, as well as updating activeScale.

This function also deals with forwarding some messages straight to the Push object; we just call the selected function using apply, passing along all arguments.
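The apply pattern itself is plain JavaScript and can be tried outside Max. The following is an illustrative sketch only (the object and its methods are stand-ins, not the real Push class):

```javascript
// A plain object standing in for the Push class, recording calls
var push = {
  log: [],
  showColors: function(enabled) {
    this.log.push(["showColors", enabled]);
  },
  controlButtonMatrix: function(control) {
    this.log.push(["controlButtonMatrix", control]);
  }
};

// Forward a message to the method of the same name, passing along
// all arguments, just like the anything() function does
function forward(messagename, args) {
  if (push != null && typeof push[messagename] === "function") {
    push[messagename].apply(push, args);
  }
}

forward("controlButtonMatrix", [1]); // calls push.controlButtonMatrix(1)
```

The advantage of apply is that forward does not need to know how many arguments each method takes; the arguments list is passed through unchanged.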

Limitation: releasing control when device is removed

Unfortunately, our code has one limitation: if the user removes our device entirely from the track, then we do not release control over the button matrix, and so the button matrix of the Push will not be usable until a new Live set is opened (or the Push is rebooted). Users have a workaround: if they first disable the device before deleting it, all is fine. However, it would of course be better if that would not be necessary.

Unfortunately, it seems that this is difficult to do, and I’ve not yet found a way that works. There are at least three ways to be notified when a device gets deleted:

However, I have experimented with all of these, and did not manage to get any of them to work. It seems I am not the only one; there are some threads on the Max for Live forums about this topic (for example, [1] [2]), and I’ve noticed that some other Max for Live devices that customize the Push layout suffer from the same problem (for example, Push Colour Notes). If anyone manages to resolve this problem, please let me know!


Conclusions

Interfacing with the Push and designing custom Push instruments is not that difficult, as we have seen. There is however not that much information available on the topic, which makes getting started difficult. My hope is that this tutorial helps to address that problem, and that the code we have developed in this example instrument can be a useful starting point for other custom Push instruments. If you develop anything based on what I’ve described here, I’d love to know!

Further reading