
mikmidi's Introduction

This README file is meant to give a broad overview of MIKMIDI. More complete documentation for MIKMIDI can be found here. Questions should be directed to Andrew Madsen.

MIKMIDI

MIKMIDI is an easy-to-use Mac and iOS MIDI library created by Andrew Madsen and developed by him and Chris Flesner of Mixed In Key. It's useful for programmers writing Objective-C or Swift macOS or iOS apps that use MIDI. It includes the ability to communicate with external MIDI devices, to read and write MIDI files, to record and play back MIDI, etc. MIKMIDI is used to provide MIDI functionality in the Mac versions of our DJ app, Flow, our flagship app Mixed In Key, and our composition software, Odesi.

MIKMIDI can be used in projects targeting Mac OS X 10.7 and later, and iOS 6 and later. The example code in this README is in Swift. However, MIKMIDI can also easily be used from Objective-C code.

MIKMIDI is released under an MIT license, meaning you're free to use it in both closed and open source projects. However, even in a closed source project, you must include a publicly-accessible copy of MIKMIDI's copyright notice, which you can find in the LICENSE file.

If you have any questions about or suggestions for MIKMIDI, please contact the maintainer. Contributions are always welcome; please see our contribution guidelines for more information. We'd also love to hear about any cool projects you're using it in.

How To Use MIKMIDI

MIKMIDI ships with a project to build frameworks for iOS and macOS. You can also install it using CocoaPods or Carthage. See this page on the MIKMIDI wiki for detailed instructions for adding MIKMIDI to your project.

A note about Swift: MIKMIDI is written in Objective-C, but fully supports Swift. The only caveat is that API changes affecting only Swift, not Objective-C (such as improved nullability annotations or refined Swift API names), are not limited to major versions and may be included in minor version releases. Bug fix/patch releases will not break Swift or Objective-C API, and Objective-C API will remain stable within a major version (e.g. 1.y.z).

MIKMIDI Overview

MIKMIDI has an Objective-C interface -- as opposed to CoreMIDI's pure C API -- in order to make adding MIDI support to a Cocoa/Cocoa Touch app easier. At its core, MIKMIDI consists of relatively thin Objective-C wrappers around underlying CoreMIDI APIs. Much of MIKMIDI's design is informed and driven by CoreMIDI's design. For this reason, familiarity with the high level pieces of CoreMIDI can be helpful in understanding and using MIKMIDI.

MIKMIDI is not limited to Objective-C wrappers for existing CoreMIDI functionality. It also provides a number of higher-level features, including message routing, sequencing, and recording. Also included is functionality intended to facilitate implementing a MIDI learning UI so that users may create custom MIDI mapping files. These MIDI mapping files associate physical controls on a particular piece of MIDI hardware with corresponding receivers (e.g. on-screen buttons, knobs, musical instruments, etc.) in your application.

To understand MIKMIDI, it's helpful to break it down into its major subsystems:

  • Device support -- includes support for device discovery, connection/disconnection, and sending/receiving MIDI messages.
  • Commands -- includes a number of classes that represent various MIDI message types as received from and sent to MIDI devices and endpoints.
  • Mapping -- support for generating, saving, loading, and using files that associate physical MIDI controls with corresponding application features.
  • Files -- support for reading and writing MIDI files.
  • Synthesis -- support for turning MIDI into audio, e.g. playback of MIDI files and incoming MIDI keyboard input.
  • Sequencing -- Recording and playback of MIDI.

Of course, these subsystems are used together to enable sophisticated features.

Device Support

MIKMIDI's device support architecture is based on the underlying CoreMIDI architecture. Several major classes are used to represent portions of a device, all of them subclasses of MIKMIDIObject. These classes are listed below, with the corresponding CoreMIDI type in parentheses.

  • MIKMIDIObject (MIDIObjectRef) -- Abstract base class for all the classes listed below. Includes properties common to all MIDI objects.
  • MIKMIDIDevice (MIDIDeviceRef) -- Represents a single physical device, e.g. a DJ controller, MIDI keyboard, MIDI drum set, etc.
  • MIKMIDIEntity (MIDIEntityRef) -- Groups related endpoints. Owned by MIKMIDIDevice, contains MIKMIDIEndpoints.
  • MIKMIDIEndpoint (MIDIEndpointRef) -- Abstract base class representing a MIDI endpoint. Not used directly, only via its subclasses MIKMIDISourceEndpoint and MIKMIDIDestinationEndpoint.
  • MIKMIDISourceEndpoint -- Represents a MIDI source. Your application receives incoming MIDI messages from source endpoints.
  • MIKMIDIDestinationEndpoint -- Represents a MIDI destination. Your application passes MIDI messages to a destination endpoint in order to send them to a device.

MIKMIDIDeviceManager is a singleton class used for device discovery, and to send and receive MIDI messages to and from endpoints. To get a list of MIDI devices available on the system, call -availableDevices on the shared device manager:

let availableDevices = MIKMIDIDeviceManager.shared.availableDevices

MIKMIDIDeviceManager also includes the ability to retrieve 'virtual' endpoints, to enable communicating with other MIDI apps, or with devices (e.g. Native Instruments controllers) which present as virtual endpoints rather than physical devices.

MIKMIDIDeviceManager's availableDevices, virtualSources, and virtualDestinations properties are Key Value Observing (KVO) compliant. This means that, for example, availableDevices can be bound to an NSPopUpButton in a macOS app to provide an automatically updated list of connected MIDI devices. These properties can also be observed directly using KVO to be notified when devices are connected or disconnected. Additionally, MIKMIDIDeviceManager posts these notifications: MIKMIDIDeviceWasAddedNotification, MIKMIDIDeviceWasRemovedNotification, MIKMIDIVirtualEndpointWasAddedNotification, MIKMIDIVirtualEndpointWasRemovedNotification.
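
For example, here is a minimal sketch of observing device changes, assuming availableDevices is visible to Swift's block-based KVO API (the posted notifications work equally well):

// Keep a strong reference to the observation for as long as you want updates.
let deviceObservation = MIKMIDIDeviceManager.shared.observe(\.availableDevices) { manager, _ in
    print("MIDI devices changed; \(manager.availableDevices.count) now available")
}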

MIKMIDIDeviceManager is used to sign up to receive messages from MIDI devices, as well as to send them. To receive messages from an MIKMIDIDevice, you must connect the device and supply an event handler block to be called any time messages are received. This is done using the connect(_:eventHandler:) method. When you no longer want to receive messages, you must call the disconnectConnection(forToken:) method. To send MIDI messages to an MIKMIDIDevice, get the appropriate MIKMIDIDestinationEndpoint from the device, then call MIKMIDIDeviceManager's send(_:to:) method, passing an array of MIKMIDICommand instances. For example:

let noteOn = MIKMIDINoteOnCommand(note: 60, velocity: 127, channel: 0, timestamp: Date())
let noteOff = MIKMIDINoteOffCommand(note: 60, velocity: 127, channel: 0, timestamp: Date().advanced(by: 0.5))
try MIKMIDIDeviceManager.shared.send([noteOn, noteOff], to: destinationEndpoint)
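
Receiving works similarly. Here is a minimal sketch, assuming midiDevice is an MIKMIDIDevice obtained from availableDevices; the event handler's parameters (a source endpoint and an array of received commands) come from MIKMIDIEventHandlerBlock:

let connectionToken = try MIKMIDIDeviceManager.shared.connect(midiDevice) { endpoint, commands in
    for command in commands {
        print("Received \(command) from \(endpoint)")
    }
}

// Later, when you no longer want to receive messages:
MIKMIDIDeviceManager.shared.disconnectConnection(forToken: connectionToken)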

If you've used CoreMIDI before, you may be familiar with MIDIClientRef and MIDIPortRef. These are used internally by MIKMIDI, but the "public" API for MIKMIDI does not expose them -- or their Objective-C counterparts -- directly. Rather, MIKMIDIDeviceManager itself allows sending and receiving messages to/from MIKMIDIEndpoints.

MIDI Messages

In MIKMIDI, MIDI messages are objects. These objects are instances of concrete subclasses of MIKMIDICommand. Each MIDI message type (e.g. Control Change, Note On, System Exclusive, etc.) has a corresponding class (e.g. MIKMIDIControlChangeCommand). Each command class has properties specific to that message type. By default, MIKMIDICommands are immutable. Mutable variants of each command type are also available.
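
As an illustrative (not exhaustive) sketch, an incoming command can be examined by downcasting to the relevant subclass; the property names below belong to the note and control change command classes:

func handle(_ command: MIKMIDICommand) {
    switch command {
    case let noteOn as MIKMIDINoteOnCommand:
        print("Note On: note \(noteOn.note), velocity \(noteOn.velocity)")
    case let controlChange as MIKMIDIControlChangeCommand:
        print("Control Change: controller \(controlChange.controllerNumber), value \(controlChange.controllerValue)")
    default:
        print("Other command: \(command)")
    }
}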

MIDI Mapping

MIKMIDI includes features to help with adding MIDI mapping support to an application. MIDI mapping refers to the ability to map physical controls on a particular hardware controller to specific functions in the application. MIKMIDI's mapping support includes the ability to generate, save, load, and use mapping files that associate physical controls with an application's specific functionality. It also includes help with implementing a system that allows end users to easily generate their own mappings using a "MIDI learn" style interface.

The major components of MIKMIDI's MIDI mapping functionality are:

  • MIKMIDIMapping - Model class containing information to map incoming messages to the appropriate application functionality.
  • MIKMIDIMappingManager - Singleton manager used to load, save, and retrieve both application-bundled, and user customized mapping files.
  • MIKMIDIMappingGenerator - Class that can listen to incoming MIDI messages, and associate them with application functionality, creating a custom MIDI mapping file.

MIDI Files

MIKMIDI includes features to make it easy to read and write MIDI files. This support is primarily provided by:

  • MIKMIDISequence - This class is used to represent a MIDI sequence, and can be read from or written to a MIDI file.
  • MIKMIDITrack - An MIKMIDISequence contains one or more MIKMIDITracks.
  • MIKMIDIEvent - MIKMIDIEvent and its specific subclasses are used to represent MIDI events contained by MIKMIDITracks.
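
Here is a minimal sketch of reading and writing a file, assuming -writeToURL:error: imports into Swift as write(to:), and that midiFileURL and outputURL are file URLs you supply:

let sequence = try MIKMIDISequence(fileAt: midiFileURL)
print("Loaded \(sequence.tracks.count) track(s), length \(sequence.length) beats")
for track in sequence.tracks {
    print("Track contains \(track.events.count) event(s)")
}
try sequence.write(to: outputURL)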

MIDI Synthesis

MIDI synthesis is the process by which MIDI events/messages are turned into audio that you can hear. This is accomplished using MIKMIDISynthesizer. Also included is MIKMIDIEndpointSynthesizer, a subclass of MIKMIDISynthesizer that can very easily be hooked up to a MIDI endpoint to synthesize incoming MIDI messages:

let endpoint = midiDevice.entities.first!.sources.first!
let synth = try MIKMIDIEndpointSynthesizer(midiSource: endpoint)
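
You can also hand a synthesizer commands you construct yourself rather than ones arriving from an endpoint. This is a hedged sketch: it assumes a recent MIKMIDI version where the designated initializer is throwing, and that -handleMIDIMessages: imports into Swift as handleMIDIMessages(_:):

let synthesizer = try MIKMIDISynthesizer()
let middleC = MIKMIDINoteOnCommand(note: 60, velocity: 100, channel: 0, timestamp: Date())
synthesizer.handleMIDIMessages([middleC])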

MIDI Sequencing

MIKMIDISequencer can be used to play and record to an MIKMIDISequence. It includes a number of high level features useful when implementing MIDI recording and playback. However, at the very simplest, MIKMIDISequencer can be used to load a MIDI file and play it like so:

let sequence = try! MIKMIDISequence(fileAt: midiFileURL)
let sequencer = MIKMIDISequencer(sequence: sequence)
sequencer.startPlayback()

mikmidi's People

Contributors

0dmitry, adamjansch, armadsen, bacc3, computersarehard, dhilowitz, fattjake, grype, jagdeep-manik, jranson, jupdike, kris2point0, leamerluck, mlostekk, nightbirdsevolve, patrickmik, psobot, pwightman, zevarito


mikmidi's Issues

MIKMIDITrack properties should generally be KVOable.

My main concern here is that -noteEvents, etc. "pull" from the underlying data, and therefore aren't KVO compatible at all. Rather, I think we should read tracks at initialization (for sequences/tracks read from a file), and store them in our own property, then only allow inserts/removal through a well defined interface that maintains KVC/KVO compliance.

This will e.g. enable KVO to be used to easily trigger drawing updates for a view that draws a MIKMIDISequence/MIKMIDITracks.

Add support for network session

I'm trying to write a MIDI app that would be visible via Network MIDI. From the CoreMIDI documentation, it appears that MIDINetworkSession has to be used for that. However, MIKMIDI does not seem to have this implemented.
Any plans for adding network session support? Or is there a way to use MIDINetworkSession directly with MIKMIDI? Any help will be appreciated.

MIKMIDISequencer should support all MIDI event types

Currently, MIKMIDISequencer can only play back and record note events. It should be able to handle all MIDI event types.

Once MIKMIDISequencer is able to play back any event type, MIKMIDIPlayer can be deprecated and eventually removed.

Remove the need to use virtual endpoints for _internal_ scheduling.

Right now, in order to take advantage of CoreMIDI's built-in support for scheduling events in advance, we often create virtual endpoints (MIKMIDIClientDestinationEndpoints). This is easy, but it has the major disadvantage of "leaking" these endpoints to the outside world, since they're visible to any app on the system.

In MIKMIDISequencer, we do have a need for advance scheduling, but hooking virtual endpoints up to MIKMIDIEndpointSynthesizer instances is clumsy.

I think we should create our own event scheduling code when events are just being used internally. Obviously, when we are actually sending events to an external endpoint (virtual or hardware), we want to continue using CoreMIDI's scheduling.

This thread has some useful information.

Refactor MIKMIDISynthesizer/Instrument API so that it makes more sense

Right now, the API for selecting instruments is somewhat confusing and not that well factored. For one thing, +availableInstruments probably shouldn't be a class method on MIKMIDISynthesizerInstrument; rather, it should be an instance method on MIKMIDISynthesizer. Better documentation would help too.

Active Sense recognised as System Reset

My MiniBrute is sending out Active Sense MIDI commands. They are recognised as such by MIDI Monitor (http://www.snoize.com/MIDIMonitor/), but when inspecting them in - (id)connectInput:(MIKMIDISourceEndpoint *)endpoint error:(NSError **)error eventHandler: (MIKMIDIEventHandlerBlock)eventHandler, they seem to have command type MIKMIDICommandTypeSystemMessage (0xff instead of 0xfe).

Other command types seem to be spot on (e.g. CC, Pitch Bend, Note On, etc.).

[MIKMIDIOutputPort sendCommands toDestination:error] Uses Default MIDIPacketList Size

First off, thanks for making this library available. It is so much nicer to use than the clunky CoreMIDI API!

The -sendCommands:toDestination:error: method in MIKMIDIOutputPort creates a MIDIPacketList using the default size (1 packet with 256 bytes of data). This causes an error if the commands passed in take up more memory than is available (either by having a lot of data in the packets or a lot of packets). The MIDIPacketList's size should instead be computed as follows:

ByteCount packetListSize = offsetof(MIDIPacketList, packet) + offsetof(MIDIPacket, data) * commands.count;

for (MIKMIDICommand* command in commands) {
    packetListSize += command.data.length;
}

packetListSize can then be used to allocate a block of memory to be used when copying data from the MIKMIDICommand objects to the MIDIPacketList.

The function that performs the copy (MIKMIDIPacketListFromCommands) isn't used anywhere else in the library, so this function itself could be changed to actually allocate the MIDIPacketList. However, it's possible that others are making use of that function, in which case a second function should be created that allocates a MIDIPacketList blob based on the contents of the array of MIKMIDICommands.

I can take a look into fixing this and submit a pull request. Let me know if you have any suggestions or preferences on the fix.

Remove (most) -get<Value>: pass by reference getters.

Even where MIKMIDI wraps a CoreMIDI function that returns a value passed by reference, unless there's a very pressing reason to do otherwise, I'd prefer to avoid pass-by-reference getters. This does, of course, require some thought about error handling. In cases where multiple specific errors may be generated, use a regular pass-by-reference NSError; but where possible, returning a special/default value (or nil) and logging an error is preferable.

Some examples of this are -[MIKMIDITrack getTrackNumber:] and -[MIKMIDISequence getTimeSignature:atTimeStamp:].

Make MIKMIDIObject creation methods return an existing instance for known MIDIObjectRefs

It might be advantageous for MIKMIDIObject to return an existing instance for a MIDIObjectRef that has already been used to create a MIKMIDIObject. That is, make it so there's a 1:1 correspondence between MIDIObjectRefs and MIKMIDIObject instances, rather than allowing multiple instances to represent the same underlying MIDI object.

The initial inspiration for this thought was an issue hit while implementing Issue #6. The problem is that creating a MIKMIDIEndpoint instance from the object reference passed into the object-removal notification fails when a virtual endpoint is removed. This is true even though there is an extant MIKMIDIEndpoint instance for that object ref.

I'm not sure yet if this makes sense, so it needs to be thought through more.

MIKMIDIMapping can't be saved/loaded to/from disk on iOS

MIKMIDIMapping relies on NSXMLDocument, etc. for saving to/loading from disk, meaning it doesn't work on iOS. Either provide an alternate serialization strategy on iOS, or rewrite to remove the dependence on NSXMLDocument.

Finish Meta Event classes

The following meta events still need classes:

  • Sequence Number
  • MIDI Channel Prefix
  • End Of Track
  • Set Tempo
  • SMPTE Offset
  • Sequencer Specific

Swift crash/Obj-C warning when setting numerator on MIKMutableMIDIMetaTimeSignatureEvent

The following Objective-C:

MIKMutableMIDIMetaTimeSignatureEvent *event = [[MIKMutableMIDIMetaTimeSignatureEvent alloc] init];
event.numerator = 4;

Yields the following warning in the console:

[NSConcreteMutableData replaceBytesInRange:withBytes:length:]: range {0, 1} exceeds data length 0

but does not crash. The following Swift:

let event = MIKMutableMIDIMetaTimeSignatureEvent()
event.numerator = UInt8(4)

Raises an NSRangeException:

*** Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[NSConcreteMutableData subdataWithRange:]: range {8, 4026531840} exceeds data length 0'
*** First throw call stack:
(
    0   CoreFoundation                      0x000000010ad36b95 __exceptionPreprocess + 165
    1   libobjc.A.dylib                     0x000000010c88bbb7 objc_exception_throw + 45
    2   CoreFoundation                      0x000000010ad36acd +[NSException raise:format:] + 205
    3   Foundation                          0x000000010b1812b7 -[NSData(NSData) subdataWithRange:] + 252
    4   MIKMIDI                             0x000000010cacb24d -[MIKMIDIMetaEvent metaData] + 237
    5   MIKMIDI                             0x000000010cacdb53 -[MIKMIDIMetaTimeSignatureEvent setNumerator:] + 243

The length of the range (4026531840) is different every time I run it.

MIKMutableMIDIMetaTimeSignatureEvent, MIKMIDIMetaTimeSignatureEvent, and MIKMIDIMetaEvent all lack their own initializers, so alloc/init falls all the way back to MIKMIDIEvent and initializes a NULL event type with empty internalData. So when -metaData is accessed, its dataLength is a garbage value. Or at least I think that's what's going on.

Add API to MIKMIDISequencer so it can connect directly to an endpoint

This is just a nice-to-have. Instead of requiring the user of MIKMIDISequencer to connect to a device/endpoint and "manually" route messages to MIKMIDISequencer, it would be cool to just give the sequencer the endpoint (or device) in question and have it connect to it. This should of course be optional, in addition to the existing -recordMIDICommand: method.

Unify MIKMIDICommand and MIKMIDIEvent

I'm not sure if this is actually practical, but I'd love to remove the mostly annoying distinction between MIDI messages (MIKMIDICommands) as used by devices, endpoints etc., and MIDI events (MIKMIDIEvents) as used by sequences, tracks, etc.

At the very least, it would be nice to make conversion between them easier. e.g. +[MIKMIDICommand commandsWithEvent:event] and vice versa.

Fix issues with multiple objects connecting to the same endpoint simultaneously

Right now, internally, MIKMIDIDeviceManager supports multiple event handler blocks for a single connected input. This means that more than one object can connect to the same endpoint. However, the -disconnectInput: method removes all event handlers, so there's no way to selectively disconnect a single event handler while leaving others connected.

This needs to be fixed. It probably makes sense to return a token from -connectInput:error:eventHandler: that can be passed into the disconnect method to only disconnect that event handler.

All MIKMIDIEvent subclasses' alloc/init initialize with NULL eventType

The overarching problem behind #57. None of the subclasses, as far as I can tell, initialize with the proper eventType (and possibly other properties) when created with alloc/init, because they all fall back to init in MIKMIDIEvent, which defaults to NULL. I'm not sure if this should be solved with a custom init in each subclass or a more unified approach like midiEventWithTimestamp:eventType:data:. Happy to help given some guidance on the preferred approach.

Make it easier to create a new MIKMIDICommand instance of a particular type

Right now, if you want to create e.g. an MIKMutableMIDIControlChangeCommand, you have to do this:

MIKMutableMIDIControlChangeCommand *newCommand = [MIKMutableMIDIControlChangeCommand commandForCommandType:MIKMIDICommandTypeControlChange];

It would be nicer to be able to do this:

MIKMutableMIDIControlChangeCommand *newCommand = [[MIKMutableMIDIControlChangeCommand alloc] init];

and have the resultant command simply be of the correct subclass and have its command type set automatically.

Add error handling to MIKMIDIClientSource/DestinationEndpoint creation method(s).

This issue was raised by the fact that MIDIDestinationCreate() fails on iOS (6+) unless the audio key is set for UIBackgroundModes. See here:

Beginning in iOS 6, apps need to have the audio key in their UIBackgroundModes in order to use CoreMIDI’s MIDISourceCreate and MIDIDestinationCreate functions. These functions return kMIDINotPermitted (-10844) if the key is not set.

This also causes MIKMIDISequencer to fail on iOS unless that key is set.

We should make sure the error message for that particular case is descriptive to guide people toward the solution.

MIKMIDISequencer is too hard to use for simple playback.

I'd like e.g. this code to just work:

NSError *error = nil;
MIKMIDISequence *sequence = [MIKMIDISequence sequenceWithFileAtURL:midiFileURL error:&error];
self.sequencer = [MIKMIDISequencer sequencerWithSequence:sequence];
[self.sequencer startPlayback];

If deployment target is 10.8 or higher, or iOS, use weak references for delegates instead of unsafe_unretained

On OS X 10.7, many of the classes likely to be the delegate of various MIKMIDI classes don't support weak references -- in particular, NSViewController, NSWindow, and NSWindowController. Therefore, MIKMIDI uses unsafe_unretained references to delegates. If the app being built is to be deployed to 10.8 or higher, it would be better to automatically use weak references instead.
