Apple’s Yosemite has received a lot of attention for adding MIDI over Bluetooth LE. What has gotten less attention are the updates to Network MIDI. Here are a few things I have learned.
Network MIDI has been included in Mac OS X since about 2004. It allows computers to find one another over the LAN via Bonjour and to join each other’s MIDI sessions. The protocol spoken by Network MIDI is defined in an IETF document, RFC 6295, which was published in 2011. An earlier version of this document (RFC 4695) was published in 2006. This protocol standard has had a long and very slow evolution.
Apple’s implementation of Network MIDI has sometimes been criticized as “buggy.” I myself had isolated a bug in its implementation of the protocol. I was delighted to find that Yosemite not only fixes that one specific bug, but appears to ship a Network MIDI stack with entirely different characteristics from previous versions.
Receive latency is much lower overall.
An erroneous delta timestamp format has been fixed (comex algorithm).
On November 21 I was pleased to participate in a meetup entitled Real-Time Streaming Data. The organizer of this meetup assembles a wide variety of presenters and topics under the umbrella of “Large-Scale Production Engineering.” Chris (the organizer) does a remarkable job of keeping a pipeline of interesting talks coming. I’m particularly interested in the January talk humorously entitled “Whatever happened to IPv6?”
It has been a few years since there has been a new project embedding the Python interpreter in a Verilog simulation. There is a new one on the block, called “Cocotb,” built with a definite focus on writing testbenches and running regressions. It is written in a modern dialect of Python, hosted in a Git repository, and well-documented. I’m impressed with what I’ve seen.
This article gives a little bit of a history of Python and Verilog and describes how Cocotb fits in.
AudioContext is very new, and many examples on the net illustrate the basics of starting and stopping a sound. The methods that start and stop sounds are called noteOn and noteOff. To start or stop a sound immediately you can call noteOn(0) or noteOff(0).
The argument of 0 means “immediately” but does not correspond to the audio engine’s notion of “now.” Indeed, I first made the mistake of thinking that the current audio time was relative to the execution time of the script. It is not: audio time is a property of the AudioContext.
To be completely explicit, I now schedule all sound triggers using the currentTime of a global audioContext:
var now = audioContext.currentTime;
sound.noteOn(now + 0);
var now = audioContext.currentTime;
sound.noteOff(now + 0.5);
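The pattern above can be wrapped in a small helper so that every trigger is expressed relative to the context’s own clock. This is a minimal sketch; scheduleNote is a hypothetical name, and it assumes a source node exposing the early noteOn/noteOff methods described above.

```javascript
// Hypothetical helper: schedule a note against the AudioContext's clock,
// never against script execution time.
function scheduleNote(audioContext, sound, delaySeconds, durationSeconds) {
  // "now" is the audio engine's notion of now, a property of the context.
  var now = audioContext.currentTime;
  sound.noteOn(now + delaySeconds);
  sound.noteOff(now + delaySeconds + durationSeconds);
  return now + delaySeconds; // the audio time at which the note will start
}
```

A delay of 0 then means “as soon as the engine can,” and the returned start time can be reused to schedule subsequent notes back to back.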