OM#
OM# (om-sharp) is a computer-assisted composition environment derived from OpenMusic: a visual programming language dedicated to musical structure generation and processing.
The visual language is based on Common Lisp, and allows users to create programs that are interpreted in this language. Visual programs are made by assembling and connecting icons representing Lisp functions and data structures, built-in control structures (e.g. loops), and other program constructs. The visual language can therefore be used for general-purpose programming, and can reuse any existing Common Lisp code. A set of built-in tools and external libraries makes it a powerful environment for music composition: various classes implementing musical structures are provided, associated with graphical editors, including common music notation, MIDI, OSC, 2D/3D curves, and audio buffers.
→ OM# is available on macOS, Windows, and Linux
Source repository: https://github.com/cac-t-u-s/om-sharp/
What’s new?
OM# brings a new generation of tools and features to the computer-assisted composition environment:
- New patching interfaces and environment, with easier box inspection, display control, automatic alignment, connections, etc.
- No workspace to set up: open your documents and simply organize them in your usual file system.
- Interactive visualization of Lisp code corresponding to visual programs.
- A native implementation of the reactive mode for visual program execution.
- New loops: embed iterative processes in standard patches, using a collection of new collectors and memory utilities.
- A new set of interface components (list-selection, switch, button, slider, …); lock your patch for faster interaction.
- A redesigned sequencer interface including dual program/tracks-based visualization, meta-programming tools, and reactive execution modes.
- Human-readable, easily editable text format for patches and other documents: patches can be read and edited as plain text.
- New score editors, BPF/BPC editors, etc., with nicer display and easier editing.
- Collection: a versatile container handling the storage, visualization, and editing of collections of objects.
- A time-based model for “executable” objects, including dynamic function execution and data sending/transfer capabilities.
- Dynamically allocated in-memory audio buffers (no need to store all your sounds in external files anymore).
- A new generation of tools and editors for the representation and manipulation of musical objects (scores, sounds, MIDI tracks, temporal data streams, controllers, etc.).
- A framework for handling OSC data and bundles.
→ See also this ICMC paper (2017) for a quick overview.
OM# can load patches created in OpenMusic. See how to import OpenMusic patches.
Most OpenMusic external libraries are easily portable (or already ported). See how to create or adapt a library.
Report any problems in porting or converting libraries or patches on the discussion forum (see below).
A discussion group is hosted on Ircam Forumnet.
→ Create an account in order to post questions and replies.
Subscribe to group notifications using the Watching / Tracking options.
Externals / Libraries
External libraries are packages containing additional pieces of code that can be loaded dynamically into an OM# session. A few of them are readily available and listed below, along with a number of compatible OpenMusic libraries.
An OM# external library is structured as a simple folder, named either “libname” or “libname x.y” (where “libname” is the name of the library and “x.y” is a version number), containing a loader file named libname.omlib.
→ Unzip the external libraries into a common container directory, and specify this directory in Preferences/Libraries.
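As a sketch, the expected on-disk layout described above can be set up as follows (the library name “mylib” and version “1.2” are illustrative placeholders, not a real library):

```shell
# Stand-in for the common container directory you point OM# at
# in Preferences/Libraries (here a temporary directory for illustration).
LIBDIR=$(mktemp -d)

# Library folder named "libname x.y" ...
mkdir -p "$LIBDIR/mylib 1.2"

# ... containing the loader file libname.omlib.
touch "$LIBDIR/mylib 1.2/mylib.omlib"

# OM# scans the container directory for such loader files:
find "$LIBDIR" -name '*.omlib'
```

Note the space in “libname x.y”: folder names with version numbers must be quoted in the shell.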
OM# has supported research and production in a number of recent projects.
See the related papers below:
- OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic. Anders Vinjar, Jean Bresson. Sound and Music Computing conference (SMC’19), Málaga, Spain, 2019.
- Musical Gesture Recognition Using Machine Learning and Audio Descriptors. Paul Best, Jean Bresson, Diemo Schwarz. International Conference on Content-Based Multimedia Indexing (CBMI’18), La Rochelle, France, 2018.
- From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition. Jean Bresson, Paul Best, Diemo Schwarz, Alireza Farhang. Workshop on Musical Metacreation (MUME2018), International Conference on Computational Creativity (ICCC’18), Salamanca, Spain, 2018.
- Symbolist: An Open Authoring Environment for End-user Symbolic Notation. Rama Gottfried, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’18), Montreal, Canada, 2018.
- Next-generation Computer-aided Composition Environment: A New Implementation of OpenMusic. Jean Bresson, Dimitri Bouche, Thibaut Carpentier, Diemo Schwarz, Jérémie Garcia. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Landschaften – Visualization, Control and Processing of Sounds in 3D Spaces. Savannah Agger, Jean Bresson, Thibaut Carpentier. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Timed Sequences: A Framework for Computer-Aided Composition with Temporal Structures. Jérémie Garcia, Dimitri Bouche, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’17), A Coruña, Spain, 2017.
- Computer-aided Composition of Musical Processes. Dimitri Bouche, Jérôme Nika, Alex Chechile, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- Interactive-Compositional Authoring of Sound Spatialization. Jérémie Garcia, Thibaut Carpentier, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- o.OM: Structured-Functional Communication between Computer Music Systems using OSC and Odot. Jean Bresson, John MacCallum, Adrian Freed. ACM SIGPLAN Workshop on Functional Art, Music, Modeling & Design (FARM’16), Nara, Japan, 2016.
- Towards Interactive Authoring Tools for Composing Spatialization. Jérémie Garcia, Jean Bresson, Thibaut Carpentier. IEEE 10th Symposium on 3D User Interfaces (3DUI), Arles, France, 2015.