OM#
OM# (om-sharp) is a computer-assisted composition environment derived from OpenMusic: a visual programming language dedicated to musical structure generation and processing.
The visual language is based on Common Lisp and allows users to create programs that are interpreted in this language. Visual programs are made by assembling and connecting icons representing Lisp functions and data structures, built-in control structures (e.g. loops), and other program constructs. The visual language can therefore be used for general-purpose programming, and can reuse any existing Common Lisp code. A set of built-in tools and external libraries makes it a powerful environment for music composition: various classes implementing musical structures are provided, associated with graphical editors, covering common music notation, MIDI, OSC, 2D/3D curves, and audio buffers.
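As an illustration of this Lisp grounding, here is a minimal, hypothetical sketch of an ordinary Common Lisp function (not part of OM# itself) that such an environment could reference from a visual program:

```lisp
;; A plain Common Lisp function: transpose a list of pitches
;; (expressed in midicents) by a given interval.
;; Any such user-defined function can, in principle, be used
;; as a box inside a visual program.
(defun transpose-pitches (pitches interval)
  (mapcar (lambda (p) (+ p interval)) pitches))

;; Evaluated in the Lisp interpreter:
;; (transpose-pitches '(6000 6400 6700) 200) => (6200 6600 6900)
```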

Download
→ OM# is available on macOS, Windows, and Linux
What’s new?

OM# brings a new generation of tools and features to the computer-assisted composition environment:
- New patching interfaces and environment with easier box inspection / display control / automatic alignment / connections / etc.
- No workspace to set up: open your documents and simply organize them in your usual file system.
- Interactive visualization of Lisp code corresponding to visual programs.
- A native implementation of the reactive mode for visual program execution.
- New loops: embed iterative processes in standard patches, using a collection of new collectors and memory utilities.
- A new set of interface components: list-selection, switch, button, slider, … Lock your patch for faster interaction.
- A redesigned sequencer interface including dual program/tracks-based visualization, meta-programming tools, and reactive execution modes.
- Human-readable, easily editable text format for patches and other documents: patches can be read and edited as text.
- New score editors, BPF/BPC editors, etc., with nicer display and easier editing.
- Collection: a versatile container handling the storage, visualization, and editing of collections of objects.
- A time-based model for “executable” objects, including dynamic function execution and data sending/transfer.
- Dynamically allocated in-memory audio buffers (no need to store all your sounds in external files anymore).
- New generation of tools and editors for the representation and manipulation of musical objects (scores, sounds, MIDI tracks, temporal data streams, controllers, etc.).
- A framework for handling OSC data and bundles.
- …
→ See also this ICMC paper (2017) for a quick overview.

Sources
OM# is a free software distributed under the GPLv3 license.
→ Source repository: https://github.com/cac-t-u-s/om-sharp/
As a Common Lisp program, OM# can be considered an extension of Lisp that includes the specific built-in features of the application.
The application is developed with the latest LispWorks compiler (7.1.2), which provides multi-platform support and graphical/GUI toolkits in Common Lisp.
A limited “Personal” edition of LispWorks 7 is available: its restricted heap size requires compiling the sources in several successive runs, and it cannot produce new OM# executables; it does, however, allow loading, running, and editing the program from the sources.
Alternatively, the OM# executable also includes a Lisp interpreter which can load and evaluate modifications and extensions of the program sources.
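As a minimal, hypothetical illustration (the file and function below are not part of the OM# sources), an extension written in plain Common Lisp can be loaded and evaluated from the built-in interpreter using standard Lisp forms:

```lisp
;; my-extension.lisp -- a hypothetical user extension file.
;; It uses only standard Common Lisp, so it can be evaluated
;; by the Lisp interpreter embedded in the OM# executable.
(defpackage :my-extension (:use :common-lisp))
(in-package :my-extension)

(defun retrograde (chord-list)
  "Return the reverse of a list (e.g. a sequence of chords)."
  (reverse chord-list))

;; From the built-in Lisp listener, the file would be loaded with:
;; (load "/path/to/my-extension.lisp")
```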
Compatibility
OM# can load patches created in OpenMusic: see how to import OpenMusic patches.
Most OpenMusic external libraries are easily portable (or already ported). See how to create or adapt a library.
Report any problems in porting or converting libraries or patches on the discussion forum (see below).
See the new om-sharp-users repository.
→ Use the Issue Tracker to report problems, suggest features or enhancements, or just discuss the project!
A discussion group is also hosted on Ircam Forumnet.
→ Create an account in order to post questions and replies.
Subscribe to group notifications using Watching / Tracking and other options.
Externals / Libraries
External libraries are packages containing additional pieces of code that can be loaded dynamically in an OM# session. A few of them are readily available and listed below, as well as a number of compatible OpenMusic libraries.
OM# external libraries are structured as a simple folder, called either “libname” or “libname x.y” (where “libname” is the name of the library, and “x.y” is a version number), containing a loader file named libname.omlib.
→ Unzip the external libraries into a common container directory and specify this directory in Preferences/Libraries.
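As an illustration, such a container directory could be laid out as follows (library and folder names are placeholders):

```
my-om-libraries/            <-- container directory set in Preferences/Libraries
├── mylib/
│   └── mylib.omlib         <-- loader file for the "mylib" library
└── otherlib 2.1/
    └── otherlib.omlib      <-- loader file for "otherlib", version 2.1
```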
Compatible OpenMusic libraries:
- “Classic” libraries
- Connection with external/DSP tools
Publications
About the general design and implementation of OM#:
- Next-generation Computer-aided Composition Environment: A New Implementation of OpenMusic. Jean Bresson, Dimitri Bouche, Thibaut Carpentier, Diemo Schwarz, Jérémie Garcia. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Timed Sequences: A Framework for Computer-Aided Composition with Temporal Structures. Jérémie Garcia, Dimitri Bouche, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’17), A Coruña, Spain, 2017.
- Computer-aided Composition of Musical Processes. Dimitri Bouche, Jérôme Nika, Alex Chechile, Jean Bresson. Journal of New Music Research, 46(1), 2017.
OM# was also used as a support for research and production in a number of recent projects:
- OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic. Anders Vinjar, Jean Bresson. Sound and Music Computing conference (SMC’19), Málaga, Spain, 2019.
- Musical Gesture Recognition Using Machine Learning and Audio Descriptors. Paul Best, Jean Bresson, Diemo Schwarz. International Conference on Content-Based Multimedia Indexing (CBMI’18), La Rochelle, France, 2018.
- From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition. Jean Bresson, Paul Best, Diemo Schwarz, Alireza Farhang. Workshop on Musical Metacreation (MUME2018), International Conference on Computational Creativity (ICCC’18), Salamanca, Spain, 2018.
- Symbolist: An Open Authoring Environment for End-user Symbolic Notation. Rama Gottfried, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’18), Montreal, Canada, 2018.
- Landschaften – Visualization, Control and Processing of Sounds in 3D Spaces. Savannah Agger, Jean Bresson, Thibaut Carpentier. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Interactive-Compositional Authoring of Sound Spatialization. Jérémie Garcia, Thibaut Carpentier, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- o.OM: Structured-Functional Communication between Computer Music Systems using OSC and Odot. Jean Bresson, John MacCallum, Adrian Freed. ACM SIGPLAN Workshop on Functional Art, Music, Modeling & Design (FARM’16), Nara, Japan, 2016.
- Towards Interactive Authoring Tools for Composing Spatialization. Jérémie Garcia, Jean Bresson, Thibaut Carpentier. IEEE 10th Symposium on 3D User Interfaces (3DUI), Arles, France, 2015.
Design and development: J. Bresson, with contributions by D. Bouche, J. Garcia, A. Vinjar, and others (see Contributors).
This project uses code and features from the OpenMusic project (by C. Agon, G. Assayag, J. Bresson and others, IRCAM STMS lab).
Contact: https://j-bresson.github.io