From 832b53a54f52dbc573cfadd9e5bce6017e957f39 Mon Sep 17 00:00:00 2001
From: Xavier Ducrohet <>
Date: Fri, 17 Apr 2009 11:10:20 -0700
Subject: AI 146747: Add the JetCreator tools and docs to the SDK (mac/windows
 only).

BUG=1793093

Automated import of CL 146747
---
 docs/JET_Authoring_Guidelines.htm  | 2442 -----------------------------
 docs/JET_Authoring_Guidelines.html | 2442 +++++++++++++++++++++++++++++
 docs/JET_Creator_User_Manual.htm   | 3032 ------------------------------------
 docs/JET_Creator_User_Manual.html  | 3032 ++++++++++++++++++++++++++++++++++++
 docs/JET_Programming_Manual.htm    | 1333 ----------------
 docs/JET_Programming_Manual.html   | 1333 ++++++++++++++++
 6 files changed, 6807 insertions(+), 6807 deletions(-)
 delete mode 100644 docs/JET_Authoring_Guidelines.htm
 create mode 100644 docs/JET_Authoring_Guidelines.html
 delete mode 100644 docs/JET_Creator_User_Manual.htm
 create mode 100644 docs/JET_Creator_User_Manual.html
 delete mode 100644 docs/JET_Programming_Manual.htm
 create mode 100644 docs/JET_Programming_Manual.html

diff --git a/docs/JET_Authoring_Guidelines.htm b/docs/JET_Authoring_Guidelines.htm
deleted file mode 100644
index 2ade2e3..0000000
--- a/docs/JET_Authoring_Guidelines.htm
+++ /dev/null
@@ -1,2442 +0,0 @@
Copyright (C) 2009 The Android Open Source Project

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
JET™ Content Authoring Guidelines
Vrs 1.0
Authored by SONiVOX
Copyright 2009 Sonic Network, Inc.
This document contains content creation
-guidelines for composers and sound designers authoring music and sound effects
-for the SONiVOX JET platform. JET is an
-interactive music player for small embedded devices, including the Google Android
-platform. It allows applications to include interactive music soundtracks, in
JET works in conjunction with SONiVOX’s
-Embedded Audio Synthesizer (EAS) which is the
The JET content author works in up to three
-different applications to create JET content; a standard
The final result is a .jet file that the -content author gives to the application programmer for use in the game or -application.
- -It is important to use a common set of
-terms to minimize confusion. Since JET uses
Channel: MIDI data associated with a specific
Controller: A
DAW: Digital Audio Workstation. A common term for
EAS: Embedded
JET: Jet Interactive Engine. The name of the SONiVOX JET interactive -music engine.
- -Segment: A musical section such as a chorus or verse that is a component of
-the overall composition. In JET, a segment can be an entire MIDI file or a
-derived from a portion of a
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single
-track, but may be made up of multiple channels of
SMF-1: Standard MIDI File Type 1, a MIDI file that contains one or more
-tracks, and each track may in turn be made up of one or more channels of
Track: A single track in a DAW containing a timed sequence of
Interactive music can be defined as music -that changes in real-time according to non-predictable events such as user -interaction or game play events. In this way, interactive music is much more -engaging as it has the ability to match the energy and mood of a game much more -closely than a pre-composed composition that never changes. In some applications -and games, interactive music is central to the game play. Guitar Hero is one -such popular game. When the end user successfully ‘captures’ the musical notes -coming down the fret board, the music adapts itself and simultaneously keeps -score of successes and failures. JET allows for these types of music driven -games as well.
- -There are several methods for making and -controlling interactive music and JET is one such method. This section -describes the features of JET and how they might be used in a game or software -application. It also describes how JET can be used to save memory in small -footprint devices such as Android enabled mobile handsets.
- -JET supports a flexible music format that -can be used to create extended musical sequences with a minimal amount of data. -A musical composition is broken up into segments that can be sequenced to -create a longer piece. The sequencing can be fixed at the time the music file -is authored, or it can be created dynamically under program control.
- -- -
Figure 1: Linear Music Piece
- -This diagram shows how musical segments are
-stored. Each segment is authored as a separate
The bottom part of the diagram shows how -the musical segments can be recombined to create a linear music piece. In this -example, the bridge might end with a half-step key modulation and the remaining -segments could be transposed up a half-step to match.
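The recombination described above is essentially data: an ordered list of segments, each with an optional transposition. The sketch below models that idea in Python; the segment names, the arrangement, and the `render_playlist` helper are all illustrative assumptions, not the JET file format or API.

```python
# Hypothetical sketch of re-sequencing authored segments into a linear piece.
# In JET the application queues segments at run time; here we just model the
# arrangement as (segment name, transposition in semitones) pairs.

ARRANGEMENT = [
    ("intro", 0),
    ("verse", 0),
    ("chorus", 0),
    ("bridge", 0),   # bridge ends with a half-step modulation...
    ("verse", 1),    # ...so the remaining segments are transposed up one semitone
    ("chorus", 1),
]

def render_playlist(arrangement):
    """Return human-readable queue entries for a linear arrangement."""
    return ["%s (transpose %+d semitones)" % (name, semis)
            for name, semis in arrangement]

for entry in render_playlist(ARRANGEMENT):
    print(entry)
```

Because the arrangement is data rather than audio, the same authored segments can be reordered or transposed without adding to the content's footprint.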
- -- -
Figure 2: Non-linear music piece
- -In this diagram, we see a non-linear music -piece. The scenario is a first-person-shooter (FPS) and JET is providing the -background music. The intro plays as the level is loading and then transitions -under program control to the Searching segment. This segment is repeated -indefinitely, perhaps with small variations (using the mute/un-mute feature) -until activity in the game dictates a change.
- -As the player nears a monster lair, the -program starts a synchronized transition to the Danger segment, increasing the -tension level in the audio. As the player draws closer to the lair, additional -tracks are un-muted to increase the tension.
- -As the player enters into combat with the -monster, the program starts a synchronized transition to the Combat segment. -The segment repeats indefinitely as the combat continues. A Bonus Hit -temporarily un-mutes a decorative track that notifies the player of a -successful attack, and similarly, another track is temporarily un-muted to -signify when the player receives Special Damage.
- -At the end of combat, the music transitions -to a victory or defeat segment based on the outcome of battle.
- -JET can also synchronize the muting and -un-muting of tracks to events in the music. For example, in the FPS game, it -would probably be desirable to place the musical events relating to bonuses and -damage as close to the actual game event as possible. However, simply un-muting -a track at the moment the game event occurs might result in a music clip -starting in the middle. Alternatively, a clip could be started from the -beginning, but then it wouldn’t be synchronized with the other music tracks.
- -However, with the JET sync engine, a clip
-can be started at the next opportune moment and maintain synchronization. This
-can be accomplished by placing a number of short music clips on a decorative
-track. A
- -
Figure 3: Synchronized Mute/Unmute
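The "next opportune moment" behavior can be modeled as a simple quantization: an un-mute request is deferred to the next authored clip boundary at or after the request time. This is a hypothetical sketch of that idea (tick values and the function name are assumptions, not the actual JET scheduler):

```python
import bisect

# Hypothetical sketch: un-mute requests are deferred to the next clip
# boundary so short clips on a decorative track always start in sync
# with the rest of the music.
def next_sync_point(clip_starts, request_time):
    """Return the first clip start >= request_time, or None if past the end.

    clip_starts must be sorted (e.g. tick positions of the authored clips).
    """
    i = bisect.bisect_left(clip_starts, request_time)
    return clip_starts[i] if i < len(clip_starts) else None

starts = [0, 480, 960, 1440]          # one clip per beat at 480 ticks/beat
print(next_sync_point(starts, 500))   # request lands mid-beat -> 960
```

The game event fires immediately; only the musical response is quantized, which is what keeps the decorative clip aligned with the other tracks.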
- -JET provides an audio synchronization API
-that allows game play to be synchronized to events in the audio. The mechanism
-relies on data embedded in the
- -
Figure 4: Music Game with Synchronization
The arrows represent events in the music -sequence where game events need to be synchronized. In this case, the blue -arrow represents a time where the player is supposed to press the left button, -and the red arrow is for the right button. The yellow arrow tells the game -engine that the sequence is complete. The player is allowed a certain time -window before and after the event to press the appropriate key.
- -If an event is received and the player has -not pressed a button, a timer is set to half the length of the window. If the -player presses the button before the timer expires, the game registers a -success, and if not, the game registers a failure.
- -If the player presses the button before the -event is received, a timer is set to half the length of the window. If an event -is received before the timer expires, the game registers a success, and if not, -the game registers a failure. Game play might also include bonuses for getting -close to the timing of the actual event.
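The two symmetric cases above reduce to a single rule: the button press and the music event must land within half a window of each other, whichever comes first. A minimal sketch of that scoring logic (function name and time units are assumptions for illustration):

```python
# Hypothetical sketch of the timing-window rule described above: whichever
# of (music event, button press) arrives first effectively starts a timer
# of window/2, and the other must arrive before it expires to score a hit.
def judge(event_time, press_time, window):
    """Return True (success) if press and event fall within window/2."""
    if press_time is None:          # button never pressed
        return False
    return abs(press_time - event_time) <= window / 2.0

print(judge(event_time=10.0, press_time=10.2, window=1.0))  # True
```

Bonuses for near-perfect timing could then be computed from the same `abs(press_time - event_time)` distance.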
- -To author JET files and hear them playback interactively,
-the content author will work in two or three applications which are designed to
-work together smoothly. The first application is any off-the-shelf
Once the composer has completed their
Optionally, the author may elect to create
-a custom DLS soundbank. This can be created in any off-the-shelf DLS authoring
-application, such as Awave from MJSoft, and loaded into JET Creator along with
-the
Below is an overview of this process. A -more detailed explanation of each step follows.
- -Launch DAW – Content authors will need to
-use a third party MIDI authoring application to compose their
Assign SONiVOX EAS Synth plugin as the
-playback synthesizer – The SONiVOX EAS Synth plugin is a VST and AU compatible
-virtual instrument that plugs into VST or AU compatible DAWs. This software
-plugin uses the same SONiVOX EAS MIDI synthesizer engine and default General
-MIDI wavetable soundset inherent in
-Android. Using this plugin allows content authors to hear the exact audio
-rendering of the instruments and
Optionally Load DLS2 Soundset – The SONiVOX -EAS Synth plugin allows for the loading of any DLS2 compatible soundset for -playback. These could include a new GM wavetable set, or a small collection of -just a few custom instruments for a given application. Note, the DLS file does -not replace the internal GM wavetable used by the EAS engine. DLS soundsets -play in conjunction with the internal GM wavetable.
- -Compose MIDI File – Compose
Launch JET Creator – Once all DLS2 and
Assign JET Segment Attributes – After
-creating segments, the content author assigns interactive elements. Interactive elements
-include mute and unmute settings of individual tracks in the MIDI file(s) as
-well as
Audition Interactive Playback – After -assigning the segment attributes and creating the JET file, the content author -can audition all interactive playback elements in the JET Audition window.
- -Save .jtc File – After the author is -satisfied with the result, it is recommended they save the JET Creator .jtc -file which will save their settings, references to source files, etc.
- -Export Files – Exporting the JET Creator -file will bundle all source files and their attributes into a single .zip file. -The zip file will also contain a .jet file for use by the Android application.
- -Included in the JET Creator package is the
-EAS software synthesizer in plug-in format. The EAS plugin synth allows the
-composer to hear the instruments used in Android as they are composing their
Follow the instructions for your individual -DAW to install and utilize the plugin. For Mac users this will typically -involve copying the “EAS Synth.component” file into your plugins folder, which -is usually located at /Library/Audio/Plug-ins/Components. PC users will want to -install the “EAS Synth.dll” into the plugin folder that their DAW requires.
- -The EAS Synth is an embedded synthesizer -for small mobile devices. This means it does not have the flexibility of high -end synthesizers typically utilized in a professional application such as -Logic, Digital Performer, etc. As such, only the following attributes are -supported.
- -Macintosh:
- -Mac OSX (Intel) Macs
- -ASIO Supported Soundcards
Sample Rate: 44100 Hz
- -Buffer Size: 256 kbytes
- -PC:
- -Windows 2000 or
ASIO supported soundcards
Sample Rate: 44100 Hz
- -Buffer Size: 256 kbytes
- -Each DAW has its own particular method of
-assigning
The SONiVOX EAS Synth virtual instrument is
-a multi-timbral synthesizer. (i.e. it plays back multiple instruments on unique
-
A Logic 8 template file has been included -in the Android Cupcake release to facilitate the above.
- -Playback in Logic 8 may require you to be -in record enable mode for each track you are auditioning. To record enable -multiple tracks hold down the Option key.
- -To write out a standard
In addition, the mix parameters for volume,
-pan and program changes may not appear in the event list and therefore may not
-write out with the
Select All and use the “Insert MIDI > -Insert MIDI Settings as Events” command.
- -Select All and use the “Apply Quantization -Settings Destructively” command.
- -Sonar 7 is a bit easier to set up, use and
-save than Logic 8. Simply open or start a new
SONAR 8 works similarly to SONAR 7.
- -We’ve seen some instances when creating -content with Digital Performer where notes with a release velocity of non-0 -will generate an extra note-on event in the EAS synth. If you are hearing a -doubling, editing the release velocity events to zero should fix this problem.
The SONiVOX EAS Synthesizer supports two -simultaneous soundsets or wavetables. One is the internal General MIDI wavetable -inherent to the SONiVOX EAS Synthesizer. The other is a Downloadable Sounds -Level 2 (DLS2) soundset. The internal wavetable is a GM Level 1 compliant -wavetable with 127 melodic instruments and 1 drumkit. It is in a proprietary -SONiVOX format. The DLS2 soundsets are an open format published by the MIDI -Manufacturers Association.
- -In the Android Cupcake release, the -internal wavetable is only 200 kbytes, very small, in order to be compliant -with all Android devices which may not have a lot of memory. DLS2 soundsets can -be any size that a particular device supports. Upgraded (larger) internal -wavetables as well as custom DLS2 instruments can be licensed from SONiVOX.
- -To load a custom soundset, click on the
-Load DLS button in the EAS Synth plugin interface. Browse to the DLS2 file you
-wish to load and say OK. Only DLS Level 2 formatted soundsets are
-supported.
Since both the internal EAS GM wavetable -and a custom DLS2 soundset are used simultaneously, you must be sure you have -your MIDI Program Changes set correctly. DLS2 instruments must be assigned to a -Bank other than the default GM bank -used by the internal synthesizer.
- -The internal EAS synthesizer is assigned to
-Banks 121 (melodic instruments) and 120 (drum instruments). This follows the
-General MIDI Level 1 specification. Note: Most
The EAS synth supports MSB (Controller 0),
-LSB (Controller 32) Bank change messages. There are two places you need to set
-this Bank and Program Change number. The first is in your DLS2 soundset. Using
-Bank 1, each Instrument would be assigned MSB 1, LSB 0, then the Instrument
-Program Change number. The second place to use the Bank and Program Change
-number is in your
In your
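The bank selection just described maps onto three raw MIDI messages: Bank Select MSB on controller 0, Bank Select LSB on controller 32, then the program change. The helper below is an illustrative sketch of assembling those bytes; the channel and program values are arbitrary examples.

```python
# Hypothetical sketch: the three MIDI messages that select a DLS2 instrument
# on Bank 1 (MSB=1 via CC0, LSB=0 via CC32), followed by the Program Change.
def bank_program_messages(channel, msb, lsb, program):
    """Return raw status/data byte tuples for a bank select + program change."""
    assert 0 <= channel <= 15
    return [
        (0xB0 | channel, 0, msb),      # CC0  = Bank Select MSB
        (0xB0 | channel, 32, lsb),     # CC32 = Bank Select LSB
        (0xC0 | channel, program),     # Program Change
    ]

for msg in bank_program_messages(channel=0, msb=1, lsb=0, program=5):
    print(" ".join("%02X" % b for b in msg))
```

Sending the two bank-select controllers immediately before the program change, as shown, is what keeps the DLS2 instruments distinct from the internal GM bank.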
- -
JET Creator is the desktop application -where you’ll edit and audition the JET interactive music elements. For details -on the JET Creator application please see the “JET Creator User Manual”. Below -are some additional guidelines to help you out.
- -As with all projects, its best to discuss and -design the interactive music scheme with the game designer and programmer -before beginning your composition. An outline and/or specification can go a -long way in saving you from having to redo things after the game is in place.
- -In general you’ll want to first write your
-music in your DAW of choice the way you’re used to composing, then break up the
-final
If you’re trying to conserve memory,
-compose as few MIDI files as possible, and create several segments from that
To make adding segments or events faster, -use the Replicate command. Replicate can add multiple segments or events at one -time and uses an offset parameter and prefix naming convention to keep things -easy to read. The MOVE command is also useful for moving multiple events by a -set number of measures, beats or ticks.
- -There are several interactive audio -concepts possible in JET. Below are a few examples although we hope developers -will come up with others we haven’t thought of! These are:
- -In this method the application is -triggering specific segments based on events in the game. For example a hallway -with lots of fighting might trigger segment 1 and a hallway with no fighting -might trigger segment 2. Using JET TriggerClips in conjunction with this method -creates even more diversity.
- -In this method the application is
-triggering mute and unmute events to specific tracks in a single
Music driven gaming is similar to what
-Guitar Hero and JETBOY have done in that the music content determines how
-graphic events are displayed. The application then queries the user response to
-the graphic events and interactively modifies the music in response. In this
-method the game is utilizing JET Application Events, MIDI controllers that are
-embedded in the
Version 1.0

Content Authoring Application for the JET Interactive Music Engine

Authored by SONiVOX
Copyright 2009 Sonic Network, Inc.
This document contains the user guidelines for the SONiVOX JET Creator, an authoring application for creating and auditioning JET files. JET is an interactive music player for small embedded devices, including the Google Android platform. It allows applications to include interactive music soundtracks, in MIDI format, that respond in real time to game play events and user interaction.
JET works in conjunction with SONiVOX’s Embedded Audio Synthesizer (EAS), which is the MIDI playback device for Android.
In addition to the graphical user interface, there are two main functionalities taking place in JET Creator. The first involves gathering all the source data (MIDI files and DLS files), assigning the segment attributes, and building the JET (.jet) output file. The second involves auditioning the interactive playback elements as they will take place in the application.
The JET Creator application is written in the Python programming language, so you need to have current versions of Python and wxPython installed. There is both a Mac and a Windows version.
It is important to use a common set of terms to minimize confusion. Since JET uses MIDI in a unique way, the following terms are used throughout this document:
Channel: MIDI data associated with a specific MIDI channel. Standard MIDI supports 16 channels of MIDI data, each typically associated with a specific instrument.
Controller: A MIDI event consisting of a controller number and value. Some controllers are assigned to standard functions, such as volume and pan, while others are undefined or reserved.
DAW: Digital Audio Workstation. A common term for MIDI and audio sequencing applications.
EAS: Embedded Audio Synthesizer. The name of the SONiVOX MIDI playback engine.
JET: JET Interactive Engine. The name of the SONiVOX JET interactive music engine.
M/B/T: Measures, Beats and Ticks.
Segment: A musical section, such as a chorus or verse, that is a component of the overall composition. In JET, a segment can be an entire MIDI file or derived from a portion of a MIDI file.
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single track, but may be made up of multiple channels of MIDI data.
SMF-1: Standard MIDI File Type 1, a MIDI file that contains one or more tracks, and each track may in turn be made up of one or more channels of MIDI data.
Track: A single track in a DAW containing a timed sequence of MIDI events.
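The M/B/T positions defined above can be related to absolute tick counts. The sketch below is illustrative only: the 480 ticks-per-beat resolution and the 4/4 meter are assumptions for the example, not values mandated by JET or JET Creator.

```python
# Illustrative M/B/T (Measures/Beats/Ticks) conversion. Assumes a fixed
# meter and a common DAW resolution of 480 ticks per beat.
TICKS_PER_BEAT = 480

def mbt_to_ticks(measure, beat, tick, beats_per_measure=4):
    """Measures and beats are 1-based, ticks are 0-based."""
    return ((measure - 1) * beats_per_measure + (beat - 1)) * TICKS_PER_BEAT + tick

def ticks_to_mbt(ticks, beats_per_measure=4):
    """Inverse of mbt_to_ticks for the same meter and resolution."""
    beats, tick = divmod(ticks, TICKS_PER_BEAT)
    measure, beat = divmod(beats, beats_per_measure)
    return (measure + 1, beat + 1, tick)
```

For example, measure 2, beat 1, tick 0 in 4/4 corresponds to 4 beats, or 1920 ticks at this resolution.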
Interactive music can be defined as music -that changes in real-time according to non-predictable events such as user -interaction or game play events. In this way, interactive music is much more -engaging as it has the ability to match the energy and mood of a game much -closer than a pre-composed composition that never changes. In some applications -and games, interactive music is central to the game play. Guitar Hero is one -such popular game. When the end user successfully ‘captures’ the musical notes -coming down the fret board, the music adapts itself and simultaneously keeps -score of successes and failures. JET allows for these types of music driven -games as well.
- -There are several methods for making and -controlling interactive music and JET is one such method. This section -describes the features of JET and how they might be used in a game or software -application. It also describes how JET can be used to save memory in small -footprint devices such as Android enabled mobile handsets.
- -JET supports a flexible music format that -can be used to create extended musical sequences with a minimal amount of data. -A musical composition is broken up into segments that can be sequenced to -create a longer piece. The sequencing can be fixed at the time the music file -is authored, or it can be created dynamically under program control.
Figure 1: Linear Music Piece
This diagram shows how musical segments are stored. Each segment is authored as a separate MIDI sequence.
The bottom part of the diagram shows how -the musical segments can be recombined to create a linear music piece. In this -example, the bridge might end with a half-step key modulation and the remaining -segments could be transposed up a half-step to match.
Figure 2: Non-linear music piece
- -In this diagram, we see a non-linear music -piece. The scenario is a first-person-shooter (FPS) and JET is providing the -background music. The intro plays as the level is loading and then transitions -under program control to the Searching segment. This segment is repeated indefinitely, -perhaps with small variations (using the mute/un-mute feature) until activity -in the game dictates a change.
- -As the player nears a monster lair, the -program starts a synchronized transition to the Danger segment, increasing the -tension level in the audio. As the player draws closer to the lair, additional -tracks are un-muted to increase the tension.
- -As the player enters into combat with the -monster, the program starts a synchronized transition to the Combat segment. -The segment repeats indefinitely as the combat continues. A Bonus Hit -temporarily un-mutes a decorative track that notifies the player of a -successful attack, and similarly, another track is temporarily un-muted to -signify when the player receives Special Damage.
- -At the end of combat, the music transitions -to a victory or defeat segment based on the outcome of battle.
- -JET can also synchronize the muting and -un-muting of tracks to events in the music. For example, in the FPS game, it would -probably be desirable to place the musical events relating to bonuses and -damage as close to the actual game event as possible. However, simply un-muting -a track at the moment the game event occurs might result in a music clip -starting in the middle. Alternatively, a clip could be started from the -beginning, but then it wouldn’t be synchronized with the other music tracks.
However, with the JET sync engine, a clip can be started at the next opportune moment and maintain synchronization. This can be accomplished by placing a number of short music clips on a decorative track. A call to JET_TriggerClip() then un-mutes the next clip on that track in synchronization with the music.
Figure 3: Synchronized Mute/Unmute
JET provides an audio synchronization API that allows game play to be synchronized to events in the audio. The mechanism relies on data embedded in the MIDI file at the time the content is authored.
Figure 4: Music Game with Synchronization
The arrows represent events in the music sequence -where game events need to be synchronized. In this case, the blue arrow -represents a time where the player is supposed to press the left button, and -the red arrow is for the right button. The yellow arrow tells the game engine -that the sequence is complete. The player is allowed a certain time window -before and after the event to press the appropriate key.
- -If an event is received and the player has -not pressed a button, a timer is set to half the length of the window. If the -player presses the button before the timer expires, the game registers a -success, and if not, the game registers a failure.
- -If the player presses the button before the -event is received, a timer is set to half the length of the window. If an event -is received before the timer expires, the game registers a success, and if not, -the game registers a failure. Game play might also include bonuses for getting -close to the timing of the actual event.
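The timing-window logic in the two paragraphs above reduces to a simple test: a press succeeds if it falls within half the window on either side of the music event. The function name and millisecond units below are illustrative, not part of the JET API.

```python
# Sketch of the success/failure window described above: the player is
# allowed half the window length before and after the music event.
def judge_press(event_time_ms, press_time_ms, window_ms):
    """Return True (success) if the press is within half the window
    of the event time, else False (failure)."""
    return abs(press_time_ms - event_time_ms) <= window_ms / 2
```

A game might layer bonuses on top of this, as the text notes, by rewarding presses especially close to the event time.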
To author JET files and hear them play back interactively, the content author will work in two applications which are designed to work together smoothly. The first application is any off-the-shelf MIDI sequencing application (DAW) that can save standard MIDI files. The second is the JET Creator application described in this document.
Please see the JET Content Authoring Guidelines documentation for additional -details on content authoring.
JET Creator is a Python application; therefore, you must have Python and wxPython installed on your machine.
- -JetCreator was created and tested with:
- -Python Version 2.5.4
- -wxPython Version 2.8.7.1
- -These can be downloaded here:
- -PC:
- -http://www.python.org/download/releases/2.5.4/
- -http://www.wxpython.org/download.php
- -MAC:
- -http://wiki.python.org/moin/MacPython/Leopard
- -http://www.wxpython.org/download.php
- -After installing Python and wxPython, -simply unzip or copy all the files in the JET Creator application directory to -a folder on your hard drive.
To launch JET Creator go to a command -prompt and set the directory to where you’ve installed Python. Next run python -with the command:
- -python
-jetcreator.py
There are a few different file types -associated with JET Creator.
- -.jtc JET -Creator project file. This file contains all the information associated with a -JET Creator project. When you Save or Save-as out of JET Creator, this file -type is saved.
- -.jet JET -File. This output file is automatically generated from JET Creator whenever you -save your JET Creator project. This is the file that bundles all JET assets -together into a single file that the Android application will use. Give this -file to the Android application developer.
.mid MIDI File. A standard MIDI file referenced by the JET Creator project.
.seg Segment File. This is a JET Segment file. It has the same name as the MIDI file it is derived from.
.zip Zip -Archive file. When you Export a JET Archive, a zip file is created that -contains all the assets (files) necessary for JET Creator. Use this to transfer -JET Creator projects to other people.
- -When -you first launch JET Creator you are presented with an open dialog like the -following.
Open will open an existing .jtc (JET Creator file) file. Use the browser -button to browse to the directory where you have saved your .jtc file.
- -New will create a new .jtc file.
- -Import will import a JET Archive (.zip) file.
- -Cancel will cancel the dialog and exit the application.
- -The main window of the JET Creator -application looks like the picture below. There are three main sections from -top to bottom: segment view, event view, and timeline.
The segment view section displays a list of the current segments, which make up the JET file.
Just below the Segment view is the event -view. The event view section displays all events associated with a given -segment. Events only display when the segment they are assigned to is -highlighted. Each event displays its type, start and end points, track and midi -channel assignment, and its event ID.
- -Just below the Event view is the timeline -display. The timeline shows how many measures a given segment is as well as any -events associated with that segment. The timeline changes to display the -currently selected or playing segment. You can trigger an event in this window -while the segment is play by simply clicking on the event in the timeline -display.
JET Creator Main Window
The buttons along the left side of main -window do the following:
- -Add: - -Displays the segment or event window for adding a new segment or event
- -Revise: - -Displays the segment or event window for updating an existing segment or event
- -Delete: - -Deletes the selected segment or event (will ask for confirmation)
- -Move: - -Displays the move window which allows you to move selected segments or events -in time
Queue All: - Queues (selects) all segments for playback

Dequeue All: - Dequeues (deselects) all segments

Play: - Starts playback of all queued segments. This button changes to Stop if any segments are playing
- -Audition: - -Displays the Audition window (see below)
- -The segment window is where a given -segment’s attributes are assigned and auditioned, as shown in the picture -below. The left side of the window displays the segments attributes that are -stored in the JET file. The right side of the window allows the author to set -mute flags, repeat and transpose settings and audition the segment as it will -play in the JET game.
- -Note: the audition attributes (mute flags, repeat and transpose) are not stored in the JET content file -(.jet) but rather are defined by the game or application itself. In programming -language, these settings correspond directly with the API calls to the JET -engine. By including them here, the JET content author can simulate how the -segment will respond to the applications API commands during game play.
The segment parameters do the following:
- -Segment Name - Sets -the name of the segment
MIDI File - The name and location of the MIDI file that the segment is derived from
DLS File - The name and location of the DLS2 file, if any, that the MIDI file uses for its instrument sounds
Starting M/B/T - -Starting measure, beat and tick of the segment
- -Ending M/B/T - -Ending measure, beat and tick of the segment
- -Quantize - -Quantize value for quantizing the current segment during playback
- -The audition fields are as follows:
Track Mutes - Shows the MIDI tracks (not channels) in the MIDI file, and allows each track to be muted or unmuted
Channel - Displays the MIDI channel assigned to each track
Name - -Displays the track name meta event (if present) for each track
- -Repeat - -Indicates the number of times a segment should repeat during playback
- -Transpose - -Indicates the transposition in semi-tones or half-steps a segment should -transpose during playback
- -To the right of the Audition window are a few additional buttons. -These do as follows:
- -OK - -Selecting OK confirms all segment settings and closes the segment window
- -Cancel - -Selecting Cancel cancels any changes and closes the segment window
- -Replicate - -Displays the Replicate Segment window for entering multiple segments at once. -See below.
- -Play/Stop Segment - Starts -or Stops playback of the segment using the segment attributes assigned.
Play/Stop MIDI File - Starts or Stops playback of the MIDI file that the segment is derived from
Pause/Resume - -Pauses or Resumes playback.
- -The event window is where a given segment’s -event attributes are assigned and auditioned, as shown in the picture below. To -add an event to a segment, the author must first select the segment which will -contain the event, then select the Add button. This will bring up the Event -window.
There are two main sections to the event -window. The segment section on the left side of the event window is for display -only. It shows what the segment attributes are for the given segment. The Event -section, on the right side, is where events can be assigned. The following -parameters are available:
- -Event Name - -Assigns a name to an event
- -Event Type - -Selects which type of event to assign.
- -Starting M/B/T - -Sets the starting measure, beat, and tick for the event
- -Ending M/B/T - -Sets the ending measure, beat, and tick for the event, if applicable
- -Track - -Sets which track in the given segment the event will apply to
Channel - Sets which MIDI channel the event will apply to
Event ID - -Sets the event ID for the event. Multiple events can be assigned to the same -segment and therefore the Event ID is used to identify them
- -To the right of the Audition window are a few additional buttons. -These do as follows:
- -OK - -Selecting OK confirms all event settings and closes the event window
- -Cancel - -Selecting Cancel cancels any changes and closes the event window
- -Replicate - -Displays the Replicate Event window for entering multiple events at once. See -below.
- -Play/Stop - -Starts or Stops playback of the segment using the segment attributes assigned. -While the segment is playing, events can be triggered and auditioned.
- -Trigger - -Triggers the event assigned. This replicates the API command that the JET game -will use to trigger the event, therefore giving the content author a method for -auditioning the behaviour of the event.
- -Mute/UnMute - -Mute/UnMute will mute or unmute the track that the event is assigned to
- -Pause/Resume - -Pauses or Resumes playback.
- -To audition the behaviour of an event, you -can select the Play button. This will initiate playback. The trigger button -will send the trigger event when pressed. This is equivalent to selecting the -green trigger event in the timeline.
- -Note: Trigger events are meant to unmute a -single track of a segment when triggered, then mute that track at the end of -the trigger segment. Therefore you should make sure the mute flag is set to -mute the track that a trigger event will be unmuting when receiving a trigger event. -
- -Please read Section 7 “Under The Hood” -below for details on how trigger events work and behave.
- -Often in creating JET files, you’ll need to -create tens or even hundreds of events. You may also need to move events. The -Replicate and Move windows allow for this. There are two Replicate windows for -creating multiple segments or events. They look like the following:
Replicate Segment Window
Replicate Event Window
- -Both Replicate windows function the same. -After creating an initial segment or event, you can select the Replicate -button. The parameters are as follows:
- -Name Prefix - -Sets the prefix for the name of each segment or event created
- -Starting M/B/T - -Sets the starting time for the first segment or event
- -Increment M/B/T - -Sets the time between segments or events created.
Number - Sets the number of segments or events you wish to create. If the number overflows the length of the MIDI file, only the objects that fit will be created
Preview - -Preview allows you to examine the objects created before saying OK to insert -them.
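The Replicate parameters above amount to generating a series of named, evenly spaced objects. This sketch is illustrative only: the function name is hypothetical, and positions are simplified to whole measures rather than the full M/B/T positions the real dialog uses.

```python
# Illustrative sketch of what Replicate computes: (name, start) pairs
# built from a name prefix, a starting position, an increment, and a count.
def replicate(prefix, start_measure, increment_measures, count):
    """Generate `count` objects named prefix01, prefix02, ... spaced by
    `increment_measures` starting at `start_measure`."""
    return [(f"{prefix}{i + 1:02d}", start_measure + i * increment_measures)
            for i in range(count)]
```

For example, replicating three segments named "seg" starting at measure 1 with a 4-measure increment yields seg01 at measure 1, seg02 at measure 5, and seg03 at measure 9.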
- -The Move function acts similarly to the -Replicate function in that it allows you to edit multiple segments or events at -one time, in this case move them in time. Like Replicate, there are two Move -windows, one for Segments and one for Events. The windows look like the -following:
Move Event Window
- -To use Move, first select the segments or -events you wish to move in time, then click the Move button. The parameters are -as follows:
- -Starting M/B/T - -Sets the starting time for the first segment or event
- -Increment M/B/T - -Sets the time in M/B/T you wish to move the objects by.
- -Preview - -Preview allows you to examine the objects created before saying OK to move -them.
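The Move operation above can be sketched as a uniform shift of every selected object. The function name is hypothetical, and the offset is simplified here to a single tick count rather than a full M/B/T increment.

```python
# Sketch of the Move operation: shift each selected (name, start) object
# by the same offset, preserving order and names.
def move_objects(objects, offset_ticks):
    """Return the objects with every start time shifted by offset_ticks."""
    return [(name, start + offset_ticks) for name, start in objects]
```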
- -Clicking the Audition button in the main -window of the JET Creator application will open the Audition window. This is -where the content author or application programmer can simulate the interactive -playback as it may occur in the mobile application or game itself.
JET Audition Window
- -There are four main sections to the -audition window. The left most section displays the available segments and -their length in seconds. The middle section displays a running list of what -segments are queued for playback and what their playback status is. The far -right section displays the mute flags for the currently playing segment. The -timeline section at the bottom is the same as in the main window. It displays -the currently playing segment as well as a visual display of any event triggers -associated with that segment.
The Audition window allows you to queue up any segment in any order for playback. To do this simply select the segment you wish to cue and hit Queue. That segment will appear in the queue window and start playing (if it is the first segment). Subsequently you can select any other segment or segments and cue them up for playback. As the segments complete playback, the next segment in the queue will begin playing. As in the other windows of JET Creator, you can mute, unmute, trigger event clips, etc. in real time as each segment is playing back.
- -Specifically the buttons behave as follows:
- -Queue - -loads the selected segment into the queue and starts playback
- -Cancel and Queue - -cancels the currently playing segment before queueing the selected segment for -playback
- -Cancel Current - -cancels the currently playing segment in the queue and begins playback of the -next segment
- -Stop - -stops playback of all queued segments
- -Mute All - -mutes all tracks in the current segment
- -Mute None - -unmutes all tracks in the current segment
- -Original Mutes - -sets the original mute flags for the current segment
- -The combination of these playback options -allows an author or application programmer to audition any behaviour an -interactive music application may encounter.
- -The JET Creator menus provide access to -many of the parameters in the main window plus a few additional parameters.
- -The File Menu contains the following -elements:
- -New - -Creates a new JET Creator file (.jtc)
- -Open - -Opens an existing JET Creator file
- -Save - -Saves the currently opened JET Creator file
- -Save As - -Saves the currently opened JET Creator file to a new file
- -Import Project - Imports a JET Creator archive (.zip)
- -Export Project - Exports a JET Creator archive (.zip)
- -Exit - -Exits the application
- -The Edit Menu contains the following -elements:
- -Undo - -Undo will undo the last edit made
- -Redo - -Redo will redo the last undo
- -Cut - -Copy selected parameter into clipboard and Delete selection
- -Copy - -Copy selected parameter into clipboard and keep selection
- -Paste - -Paste selected parameter
- -The Edit Menu contains the following -elements:
Properties - Brings up the JET Creator properties window. This window allows you to set the following conditions for a given JET file:
- -Copyright Info - Contains copyright info to be inserted into JET file
- -Chase Controllers - Option to chase controllers (on/off). This should usually -be ON.
Delete Empty Tracks - Deletes any empty MIDI tracks
The Segments Menu contains the following -elements:
- -Add Segment - -Brings up the Segment window
- -Update Segment - Updates segment attributes
- -Delete Segment - Deletes the current segment from the -Segment List
- -The Help Menu will contain at least the -following elements:
- -JET Creator Help - will launch PDF help document or go to on-line help
- -About - -JET Creator version number, SONiVOX info
- -Breaking a
Trigger events allow for the following:
Under the hood, JET uses standard MIDI CC events to accomplish these actions and to synchronize audio. The controllers used by JET are among those not defined for specific use by the MIDI specification:
Controllers 80-83: Reserved for use by application

Controller 102: JET event marker

Controller 103: JET clip marker

Controllers 104-119: Reserved for future use
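The controller assignments above can be collected as constants for quick reference. The values come from this document; the helper function is illustrative, not part of the JET API.

```python
# JET controller number assignments, as listed above.
JET_APP_CONTROLLER_MIN = 80        # 80-83: reserved for application use
JET_APP_CONTROLLER_MAX = 83
JET_EVENT_MARKER = 102             # JET event marker (value 0 = end of segment)
JET_CLIP_MARKER = 103              # JET clip marker
JET_RESERVED_RANGE = range(104, 120)   # reserved for future use

def is_jet_controller(cc):
    """True if a controller number is claimed by JET or the application."""
    return (JET_APP_CONTROLLER_MIN <= cc <= JET_APP_CONTROLLER_MAX
            or cc in (JET_EVENT_MARKER, JET_CLIP_MARKER)
            or cc in JET_RESERVED_RANGE)
```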
Controller 103 is reserved for marking clips in a MIDI track.
For example, to identify a clip with a clip ID of 1, the author inserts a controller 103 event at the point where the clip starts and another at the point where it ends.
Figure 5: Synchronized Clip
- -In the figure above, if the -JET_TriggerClip() function is called prior to the first controller event, Track -3 will be un-muted when the first controller event occurs, the first clip will -play, and the track will be muted when the second controller event occurs. If -the JET_TriggerClip() function is called after the first controller event has -occurred, Track 3 will be un-muted when the third controller event occurs, the -second clip will play, and the track will be muted again when the fourth -controller event occurs.
- -Note: Normally, the track containing the clip is muted by the application -when the segment is initially queued by the call to JET_QueueSegment(). If it -is not muted, the clip will always play until Jet_TriggerClip() has been called -with the clip ID.
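The clip-marker values on controller 103 can be sketched as follows. The bit layout here (low 6 bits carry the clip ID, bit 6 set on the start marker) follows the scheme described in the JET Programming Manual; treat it as an assumption to verify there rather than a definitive reference.

```python
# Assumed encoding for clip markers on controller 103: bit 6 flags the
# start of the clip, the low 6 bits carry the clip ID.
CLIP_MARKER_CC = 103

def clip_start_value(clip_id):
    """Controller value marking the start of a clip."""
    return 0x40 | (clip_id & 0x3F)

def clip_end_value(clip_id):
    """Controller value marking the end of a clip."""
    return clip_id & 0x3F
```

Under this assumption, clip ID 1 would be marked with a value of 65 at its start and 1 at its end.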
Controller 102 is reserved for marking events in the MIDI file. Currently, the only defined use is the end-of-segment marker (value 0), described below.
Normally, JET starts playback of the next segment (or repeats the current segment) when the end of the MIDI data in the current segment is reached. If the MIDI data extends beyond the intended loop point, this can delay the transition.
To avoid this problem, the author can place -a JET end-of-segment marker (controller=102, value=0) at the point where the -segment is to be looped. When the end-of-segment marker is encountered, the -next segment will be triggered, or if the current segment is looped, playback -will resume at the start of the segment.
- -The end-of-segment marker can also be used -to allow for completion of a musical figure beyond the end of measure that -marks the start of the next segment. For example, the content author might -create a 4-bar segment with a drum fill that ends on beat 1 of the 5th -bar – a bar beyond the natural end of the segment. By placing an end-of-segment -marker at the end of the 4th bar, the next segment will be -triggered, but the drum fill will continue in parallel with the next segment -providing musical continuity.
Figure 6: End-of-segment Marker
The application may use controllers in this range for its own purposes. When a controller in this range is encountered, the event is entered into an event queue that can be queried by the application. Some possible uses include synchronizing video events with audio and marking a point in a segment.
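The application-event pattern above can be sketched with a simple polled queue: controller events in the application range (80-83) are stored, and the game loop drains them when convenient. The structure and names are hypothetical, not the actual JET API.

```python
# Minimal sketch of an application-event queue: controllers 80-83 are
# captured during playback and polled later by the game loop.
from collections import deque

_event_queue = deque()

def on_midi_controller(cc, value, timestamp):
    """Called by the player engine for each controller event encountered."""
    if 80 <= cc <= 83:                     # application-reserved range
        _event_queue.append((cc, value, timestamp))

def poll_application_events():
    """Drain and return all pending application events."""
    events = list(_event_queue)
    _event_queue.clear()
    return events
```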
+ Copyright (C) 2009 The Android Open Source Project + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. ++ +
Vrs 1.0
Content +Authoring Application for the
+ +JET +Interactive Music Engine
+ +Authored by SONiVOX
Copyright 2009 Sonic Network, Inc.
This document contains the user guidelines
+for the SONiVOX JET Creator, an authoring application for creating and
+auditioning JET files. JET is an interactive music player for small embedded
+devices, including the Google Android platform. It allows applications to
+include interactive music soundtracks, in
JET works in conjunction with SONiVOX’s
+Embedded Audio Synthesizer (EAS) which is the
In addition to the graphical user
+interface, there are two main functionalities taking place in JET Creator. The
+first involves gathering all the source data (
The JET Creator application is written in +the Python programming language, therefore you need to have the current version +of Python and WXWidgets installed. There is both a Mac and Windows version.
+ +It is important to use a common set of
+terms to minimize confusion. Since JET uses
Channel: MIDI data associated with a specific
Controller: A
DAW: Digital Audio Workstation. A common term for
EAS: Embedded
JET: Jet Interactive Engine. The name of the SONiVOX JET interactive +music engine.
+ +M/B/T: Measures, Beats and Ticks
+ +Segment: A musical section such as a chorus or verse that is a component of
+the overall composition. In JET, a segment can be an entire MIDI file or a
+derived from a portion of a
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single
+track, but may be made up of multiple channels of
SMF-1: Standard MIDI File Type 1, a MIDI file that contains a one more
+tracks, and each track may in turn be made up of one or more channels of
Track: A single track in a DAW containing a timed sequence of
Interactive music can be defined as music +that changes in real-time according to non-predictable events such as user +interaction or game play events. In this way, interactive music is much more +engaging as it has the ability to match the energy and mood of a game much +closer than a pre-composed composition that never changes. In some applications +and games, interactive music is central to the game play. Guitar Hero is one +such popular game. When the end user successfully ‘captures’ the musical notes +coming down the fret board, the music adapts itself and simultaneously keeps +score of successes and failures. JET allows for these types of music driven +games as well.
+ +There are several methods for making and +controlling interactive music and JET is one such method. This section +describes the features of JET and how they might be used in a game or software +application. It also describes how JET can be used to save memory in small +footprint devices such as Android enabled mobile handsets.
+ +JET supports a flexible music format that +can be used to create extended musical sequences with a minimal amount of data. +A musical composition is broken up into segments that can be sequenced to +create a longer piece. The sequencing can be fixed at the time the music file +is authored, or it can be created dynamically under program control.
+ ++ +
Figure 1: Linear Music Piece
+ +This diagram shows how musical segments are
+stored. Each segment is authored as a separate
The bottom part of the diagram shows how +the musical segments can be recombined to create a linear music piece. In this +example, the bridge might end with a half-step key modulation and the remaining +segments could be transposed up a half-step to match.
+ ++ +
Figure 2: Non-linear music piece
+ +In this diagram, we see a non-linear music +piece. The scenario is a first-person-shooter (FPS) and JET is providing the +background music. The intro plays as the level is loading and then transitions +under program control to the Searching segment. This segment is repeated indefinitely, +perhaps with small variations (using the mute/un-mute feature) until activity +in the game dictates a change.
+ +As the player nears a monster lair, the +program starts a synchronized transition to the Danger segment, increasing the +tension level in the audio. As the player draws closer to the lair, additional +tracks are un-muted to increase the tension.
+ +As the player enters into combat with the +monster, the program starts a synchronized transition to the Combat segment. +The segment repeats indefinitely as the combat continues. A Bonus Hit +temporarily un-mutes a decorative track that notifies the player of a +successful attack, and similarly, another track is temporarily un-muted to +signify when the player receives Special Damage.
+ +At the end of combat, the music transitions +to a victory or defeat segment based on the outcome of battle.
+ +JET can also synchronize the muting and +un-muting of tracks to events in the music. For example, in the FPS game, it would +probably be desirable to place the musical events relating to bonuses and +damage as close to the actual game event as possible. However, simply un-muting +a track at the moment the game event occurs might result in a music clip +starting in the middle. Alternatively, a clip could be started from the +beginning, but then it wouldn’t be synchronized with the other music tracks.
+ +However, with the JET sync engine, a clip
+can be started at the next opportune moment and maintain synchronization. This
+can be accomplished by placing a number of short music clips on a decorative
+track. A
+ +
Figure 3: Synchronized Mute/Unmute
+ +JET provides an audio synchronization API
+that allows game play to be synchronized to events in the audio. The mechanism
+relies on data embedded in the
+ +
Figure 4: Music Game with Synchronization
The arrows represent events in the music sequence +where game events need to be synchronized. In this case, the blue arrow +represents a time where the player is supposed to press the left button, and +the red arrow is for the right button. The yellow arrow tells the game engine +that the sequence is complete. The player is allowed a certain time window +before and after the event to press the appropriate key.
+ +If an event is received and the player has +not pressed a button, a timer is set to half the length of the window. If the +player presses the button before the timer expires, the game registers a +success, and if not, the game registers a failure.
+ +If the player presses the button before the +event is received, a timer is set to half the length of the window. If an event +is received before the timer expires, the game registers a success, and if not, +the game registers a failure. Game play might also include bonuses for getting +close to the timing of the actual event.
+ +To author JET files and hear them playback
+interactively, the content author will work in two applications which are
+designed to work together smoothly. The first is application is any
+off-the-shelf
Please see the JET Content Authoring Guidelines documentation for additional +details on content authoring.
+ +JET Creator is a python language +application, therefore, you must have Python and wxPython installed on your +machine.
+ +JetCreator was created and tested with:
+ +Python Version 2.5.4
+ +wxPython Version 2.8.7.1
+ +These can be downloaded here:
+ +PC:
+ +http://www.python.org/download/releases/2.5.4/
+ +http://www.wxpython.org/download.php
+ +MAC:
+ +http://wiki.python.org/moin/MacPython/Leopard
+ +http://www.wxpython.org/download.php
+ +After installing Python and wxPython, +simply unzip or copy all the files in the JET Creator application directory to +a folder on your hard drive.
+ ++ +
To launch JET Creator, go to a command prompt and change to the directory where you’ve installed JET Creator. Then run Python with the command:
+ +python jetcreator.py
There are a few different file types +associated with JET Creator.
+ +.jtc JET +Creator project file. This file contains all the information associated with a +JET Creator project. When you Save or Save-as out of JET Creator, this file +type is saved.
+ +.jet JET +File. This output file is automatically generated from JET Creator whenever you +save your JET Creator project. This is the file that bundles all JET assets +together into a single file that the Android application will use. Give this +file to the Android application developer.
+ +.mid MIDI File. A standard MIDI file used by a segment.

.seg Segment File. This is a JET Segment file. It has the same name as the MIDI file it references.
.zip Zip +Archive file. When you Export a JET Archive, a zip file is created that +contains all the assets (files) necessary for JET Creator. Use this to transfer +JET Creator projects to other people.
+ +When +you first launch JET Creator you are presented with an open dialog like the +following.
+ ++ +
Open will open an existing .jtc (JET Creator file) file. Use the browser +button to browse to the directory where you have saved your .jtc file.
+ +New will create a new .jtc file.
+ +Import will import a JET Archive (.zip) file.
+ +Cancel will cancel the dialog and exit the application.
+ +The main window of the JET Creator +application looks like the picture below. There are three main sections from +top to bottom: segment view, event view, and timeline.
+ +The segment view section displays a list of the current segments.
Just below the Segment view is the event view. The event view section displays all events associated with a given segment. Events only display when the segment they are assigned to is highlighted. Each event displays its type, start and end points, track and MIDI channel assignment, and its event ID.
+ +Just below the Event view is the timeline display. The timeline shows how many measures a given segment is as well as any events associated with that segment. The timeline changes to display the currently selected or playing segment. You can trigger an event in this window while the segment is playing by simply clicking on the event in the timeline display.
+ ++ +
JET
+Creator Main Window
The buttons along the left side of main +window do the following:
+ +Add: - +Displays the segment or event window for adding a new segment or event
+ +Revise: - +Displays the segment or event window for updating an existing segment or event
+ +Delete: - +Deletes the selected segment or event (will ask for confirmation)
+ +Move: - +Displays the move window which allows you to move selected segments or events +in time
+ +Queue All: - Queues (selects) all segments for playback

+ +Dequeue All: - Dequeues (deselects) all segments

+ +Play: - Starts playback of all queued segments. This button changes to Stop if any segments are playing
+ +Audition: - +Displays the Audition window (see below)
+ +The segment window is where a given segment’s attributes are assigned and auditioned, as shown in the picture below. The left side of the window displays the segment’s attributes that are stored in the JET file. The right side of the window allows the author to set mute flags, repeat and transpose settings, and audition the segment as it will play in the JET game.
+ +Note: the audition attributes (mute flags, repeat and transpose) are not stored in the JET content file (.jet) but rather are defined by the game or application itself. In programming terms, these settings correspond directly to the API calls to the JET engine. By including them here, the JET content author can simulate how the segment will respond to the application’s API commands during game play.
+ ++ +
The segment parameters do the following:
+ +Segment Name - Sets +the name of the segment
+ +MIDI File - The name and location of the MIDI file that the segment references
+ +DLS File - The name and location of the DLS2 file, if any, that the segment references
Starting M/B/T - +Starting measure, beat and tick of the segment
+ +Ending M/B/T - +Ending measure, beat and tick of the segment
+ +Quantize - +Quantize value for quantizing the current segment during playback
+ +The audition fields are as follows:
+ +Track Mutes - Shows the MIDI tracks (not channels) in the MIDI file
+ +Channel - Displays the MIDI channel assignment for each track
Name - +Displays the track name meta event (if present) for each track
+ +Repeat - +Indicates the number of times a segment should repeat during playback
+ +Transpose - +Indicates the transposition in semi-tones or half-steps a segment should +transpose during playback
+ +To the right of the Audition window are a few additional buttons. +These do as follows:
+ +OK - +Selecting OK confirms all segment settings and closes the segment window
+ +Cancel - +Selecting Cancel cancels any changes and closes the segment window
+ +Replicate - +Displays the Replicate Segment window for entering multiple segments at once. +See below.
+ +Play/Stop Segment - Starts +or Stops playback of the segment using the segment attributes assigned.
+ +Play/Stop MIDI File - Starts or Stops playback of the MIDI file itself, without the segment attributes applied.
Pause/Resume - +Pauses or Resumes playback.
+ +The event window is where a given segment’s +event attributes are assigned and auditioned, as shown in the picture below. To +add an event to a segment, the author must first select the segment which will +contain the event, then select the Add button. This will bring up the Event +window.
+ ++ +
There are two main sections to the event +window. The segment section on the left side of the event window is for display +only. It shows what the segment attributes are for the given segment. The Event +section, on the right side, is where events can be assigned. The following +parameters are available:
+ +Event Name - +Assigns a name to an event
+ +Event Type - +Selects which type of event to assign.
+ +Starting M/B/T - +Sets the starting measure, beat, and tick for the event
+ +Ending M/B/T - +Sets the ending measure, beat, and tick for the event, if applicable
+ +Track - +Sets which track in the given segment the event will apply to
+ +Channel - Sets which MIDI channel the event will apply to
Event ID - +Sets the event ID for the event. Multiple events can be assigned to the same +segment and therefore the Event ID is used to identify them
+ +To the right of the Audition window are a few additional buttons. +These do as follows:
+ +OK - +Selecting OK confirms all event settings and closes the event window
+ +Cancel - +Selecting Cancel cancels any changes and closes the event window
+ +Replicate - +Displays the Replicate Event window for entering multiple events at once. See +below.
+ +Play/Stop - +Starts or Stops playback of the segment using the segment attributes assigned. +While the segment is playing, events can be triggered and auditioned.
+ +Trigger - +Triggers the event assigned. This replicates the API command that the JET game +will use to trigger the event, therefore giving the content author a method for +auditioning the behaviour of the event.
+ +Mute/UnMute - +Mute/UnMute will mute or unmute the track that the event is assigned to
+ +Pause/Resume - +Pauses or Resumes playback.
+ +To audition the behaviour of an event, you +can select the Play button. This will initiate playback. The trigger button +will send the trigger event when pressed. This is equivalent to selecting the +green trigger event in the timeline.
+ +Note: Trigger events are meant to unmute a single track of a segment when triggered, then mute that track at the end of the trigger segment. Therefore, you should make sure the mute flag is set to mute the track that the trigger event will be unmuting.
+ +Please read Section 7 “Under The Hood” +below for details on how trigger events work and behave.
+ +Often in creating JET files, you’ll need to +create tens or even hundreds of events. You may also need to move events. The +Replicate and Move windows allow for this. There are two Replicate windows for +creating multiple segments or events. They look like the following:
+ ++ +
Replicate Segment Window
+ ++ +
Replicate Event Window
+ +Both Replicate windows function the same. +After creating an initial segment or event, you can select the Replicate +button. The parameters are as follows:
+ +Name Prefix - +Sets the prefix for the name of each segment or event created
+ +Starting M/B/T - +Sets the starting time for the first segment or event
+ +Increment M/B/T - +Sets the time between segments or events created.
+ +Number - Sets the number of segments or events you wish to create. If the number overflows the length of the MIDI file, only those that fit are created.
Preview - Preview allows you to examine the objects to be created before clicking OK to insert them.
+ +The Move function acts similarly to the +Replicate function in that it allows you to edit multiple segments or events at +one time, in this case move them in time. Like Replicate, there are two Move +windows, one for Segments and one for Events. The windows look like the +following:
+ ++ +
Move Event Window
+ +To use Move, first select the segments or +events you wish to move in time, then click the Move button. The parameters are +as follows:
+ +Starting M/B/T - +Sets the starting time for the first segment or event
+ +Increment M/B/T - +Sets the time in M/B/T you wish to move the objects by.
+ +Preview - Preview allows you to examine the objects to be moved before clicking OK to move them.
+ +Clicking the Audition button in the main +window of the JET Creator application will open the Audition window. This is +where the content author or application programmer can simulate the interactive +playback as it may occur in the mobile application or game itself.
+ ++ +
JET Audition Window
+ +There are four main sections to the +audition window. The left most section displays the available segments and +their length in seconds. The middle section displays a running list of what +segments are queued for playback and what their playback status is. The far +right section displays the mute flags for the currently playing segment. The +timeline section at the bottom is the same as in the main window. It displays +the currently playing segment as well as a visual display of any event triggers +associated with that segment.
+ +The Audition window allows you to queue up any segment in any order for playback. To do this, simply select the segment you wish to queue and hit Queue. That segment will appear in the queue window and start playing (if it is the first segment). Subsequently you can select any other segment or segments and queue them up for playback. As the segments complete playback, the next segment in the queue will begin playing. As in the other windows of JET Creator, you can mute, unmute, trigger event clips, etc. in realtime as each segment is playing back.
+ +Specifically the buttons behave as follows:
+ +Queue - +loads the selected segment into the queue and starts playback
+ +Cancel and Queue - +cancels the currently playing segment before queueing the selected segment for +playback
+ +Cancel Current - +cancels the currently playing segment in the queue and begins playback of the +next segment
+ +Stop - +stops playback of all queued segments
+ +Mute All - +mutes all tracks in the current segment
+ +Mute None - +unmutes all tracks in the current segment
+ +Original Mutes - +sets the original mute flags for the current segment
+ +The combination of these playback options +allows an author or application programmer to audition any behaviour an +interactive music application may encounter.
+ +The JET Creator menus provide access to +many of the parameters in the main window plus a few additional parameters.
+ +The File Menu contains the following +elements:
+ +New - +Creates a new JET Creator file (.jtc)
+ +Open - +Opens an existing JET Creator file
+ +Save - +Saves the currently opened JET Creator file
+ +Save As - +Saves the currently opened JET Creator file to a new file
+ +Import Project - Imports a JET Creator archive (.zip)
+ +Export Project - Exports a JET Creator archive (.zip)
+ +Exit - +Exits the application
+ +The Edit Menu contains the following +elements:
+ +Undo - +Undo will undo the last edit made
+ +Redo - +Redo will redo the last undo
+ +Cut - +Copy selected parameter into clipboard and Delete selection
+ +Copy - +Copy selected parameter into clipboard and keep selection
+ +Paste - +Paste selected parameter
+ +The JET Menu contains the following elements:
+ +Properties - Brings up the JET Creator properties window. This window allows you to set the following conditions for a given JET file:
+ +Copyright Info - Contains copyright info to be inserted into JET file
+ +Chase Controllers - Option to chase controllers (on/off). This should usually +be ON.
+ +Delete Empty Tracks - Deletes any empty MIDI tracks
The Segments Menu contains the following +elements:
+ +Add Segment - +Brings up the Segment window
+ +Update Segment - Updates segment attributes
+ +Delete Segment - Deletes the current segment from the +Segment List
+ +The Help Menu will contain at least the +following elements:
+ +JET Creator Help - will launch PDF help document or go to on-line help
+ +About - +JET Creator version number, SONiVOX info
+ +Trigger events allow the application to unmute a track of a segment in synchronization with the music, then mute it again when the clip completes.
+ +Under the hood, JET uses standard MIDI CC events to accomplish these actions and to synchronize audio. The controllers used by JET are among those not defined for specific use by the MIDI specification. The specific controllers used are:
Controllers 80-83 Reserved for use by application

+ +Controller 102 JET event marker

+ +Controller 103 JET clip marker

+ +Controllers 104-119 Reserved for future use
+ +Controller 103 is reserved for marking clips in a MIDI track that can be triggered by the JET_TriggerClip() API call.
For example, to identify a clip with a clip ID of 1, the author inserts a controller event with controller=103 and value=65 at the start of the clip and another with value=1 at the end of the clip.
+ +
Figure 5: Synchronized Clip
+ +In the figure above, if the +JET_TriggerClip() function is called prior to the first controller event, Track +3 will be un-muted when the first controller event occurs, the first clip will +play, and the track will be muted when the second controller event occurs. If +the JET_TriggerClip() function is called after the first controller event has +occurred, Track 3 will be un-muted when the third controller event occurs, the +second clip will play, and the track will be muted again when the fourth +controller event occurs.
+ +Note: Normally, the track containing the clip is muted by the application when the segment is initially queued by the call to JET_QueueSegment(). If it is not muted, the clip will always play until JET_TriggerClip() has been called with the clip ID.
+ +Controller 102 is reserved for marking events in the MIDI streams that are specific to JET functionality. Currently, the only defined value is 0, which marks the end of a segment for timing purposes.
Normally, JET starts playback of the next segment (or repeats the current segment) when the MIDI end-of-track meta-event is encountered. Some MIDI authoring tools make it difficult to place the end-of-track marker accurately, resulting in synchronization problems when segments are joined together.
To avoid this problem, the author can place +a JET end-of-segment marker (controller=102, value=0) at the point where the +segment is to be looped. When the end-of-segment marker is encountered, the +next segment will be triggered, or if the current segment is looped, playback +will resume at the start of the segment.
+ +The end-of-segment marker can also be used to allow for completion of a musical figure beyond the end of the measure that marks the start of the next segment. For example, the content author might create a 4-bar segment with a drum fill that ends on beat 1 of the 5th bar – a bar beyond the natural end of the segment. By placing an end-of-segment marker at the end of the 4th bar, the next segment will be triggered, but the drum fill will continue in parallel with the next segment, providing musical continuity.
+ ++ +
Figure 6: End-of-segment Marker
+ +The application may use controllers in this range for its own purposes. When a controller in this range is encountered, the event is entered into an event queue that can be queried by the application. Some possible uses include synchronizing video events with audio and marking a point in a MIDI segment to queue up the next segment.
- Copyright (C) 2009 The Android Open Source Project - - Licensed under the Apache License, Version 2.0 (the "License"); - you may not use this file except in compliance with the License. - You may obtain a copy of the License at - - http://www.apache.org/licenses/LICENSE-2.0 - - Unless required by applicable law or agreed to in writing, software - distributed under the License is distributed on an "AS IS" BASIS, - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - See the License for the specific language governing permissions and - limitations under the License. -- -
JET -Interactive Music Engine
- -Vrs 2.0
Authored by SONiVOX
Copyright 2009 Sonic Network, Inc.
This document contains programmer guidelines for the SONiVOX JET Interactive Music System. JET is an interactive music player for small embedded devices, including the Google Android platform. It allows applications to include interactive music soundtracks, in MIDI format, that respond in real time to game play events and user interaction.
JET works in conjunction with SONiVOX’s Embedded Audio Synthesizer (EAS), which is the MIDI playback device for Android.
The programmer of a JET application will want to work -closely with the content author in designing how real-time application events -and music will interactively work together. Once decided, the content author -will create the content and ultimately save a .jet file for the programmer to -include in the application.
- -Please see “JET Creator User Documentation” for additional -information on authoring JET content.
- -It is important to use a common set of terms to minimize confusion. Since JET uses MIDI in a unique way, here are definitions of terms as they are used in this document and in the JET Creator application:
Channel: MIDI data associated with a specific MIDI channel. Standard MIDI allows for 16 channels of MIDI data, each of which is typically associated with a specific instrument.
Controller: A MIDI event consisting of a channel number, controller number, and controller value. The MIDI specification associates many controller numbers with specific functions, such as volume, expression, and sustain pedal. JET also uses controller events as a means of embedding special control information in a MIDI sequence to provide for audio synchronization.
DAW: Digital Audio Workstation. A common term for MIDI and audio sequencing applications.
EAS: Embedded Audio Synthesizer. The name of the SONiVOX MIDI synthesizer engine.
JET: JET Interactive Music Engine. The name of the SONiVOX JET interactive music engine.
- -Segment: A musical section such as a chorus or verse that is a component of the overall composition. In JET, a segment can be an entire MIDI file or derived from a portion of a MIDI file.
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single track, but may be made up of multiple channels of MIDI data.
SMF-1: Standard MIDI File Type 1, a MIDI file that contains one or more tracks, and each track may in turn be made up of one or more channels of MIDI data.
Track: A single track in a DAW containing a timed sequence of MIDI events.
Interactive music can be defined as music that changes in -real-time according to non-predictable events such as user interaction or game -play events. In this way, interactive music is much more engaging as it has the -ability to match the energy and mood of a game much closer than a pre-composed -composition that never changes. In some applications and games, interactive -music is central to the game play. Guitar Hero is one such popular game. When -the end user successfully ‘captures’ the musical notes coming down the fret -board, the music adapts itself and simultaneously keeps score of successes and -failures. JET allows for these types of music driven games as well.
- -There are several methods for making and controlling -interactive music and JET is one such method. This section describes the -features of JET and how they might be used in a game or software application. -It also describes how JET can be used to save memory in small footprint devices -such as Android enabled mobile handsets.
- -JET supports a flexible music format that can be used to -create extended musical sequences with a minimal amount of data. A musical -composition is broken up into segments that can be sequenced to create a longer -piece. The sequencing can be fixed at the time the music file is authored, or -it can be created dynamically under program control.
- -Figure 1: Linear Music Piece
- -This diagram shows how musical segments are stored. Each segment is authored as a separate MIDI file, and the segments are bundled into a single JET content file.
The bottom part of the diagram shows how the musical -segments can be recombined to create a linear music piece. In this example, the -bridge might end with a half-step key modulation and the remaining segments -could be transposed up a half-step to match.
- -Figure 2: Non-linear music piece
- -In this diagram, we see a non-linear music piece. The -scenario is a first-person-shooter (FPS) and JET is providing the background -music. The intro plays as the level is loading and then transitions under -program control to the Searching segment. This segment is repeated -indefinitely, perhaps with small variations (using the mute/un-mute feature) -until activity in the game dictates a change.
- -As the player nears a monster lair, the program starts a -synchronized transition to the Danger segment, increasing the tension level in -the audio. As the player draws closer to the lair, additional tracks are -un-muted to increase the tension.
- -As the player enters into combat with the monster, the -program starts a synchronized transition to the Combat segment. The segment -repeats indefinitely as the combat continues. A Bonus Hit temporarily un-mutes -a decorative track that notifies the player of a successful attack, and -similarly, another track is temporarily un-muted to signify when the player -receives Special Damage.
- -At the end of combat, the music transitions to a victory or -defeat segment based on the outcome of battle.
- -JET can also synchronize the muting and un-muting of tracks -to events in the music. For example, in the FPS game, it would probably be -desirable to place the musical events relating to bonuses and damage as close -to the actual game event as possible. However, simply un-muting a track at the -moment the game event occurs might result in a music clip starting in the -middle. Alternatively, a clip could be started from the beginning, but then it -wouldn’t be synchronized with the other music tracks.
- -However, with the JET sync engine, a clip can be started at the next opportune moment and maintain synchronization. This can be accomplished by placing a number of short music clips on a decorative track.
Figure 3: Synchronized Mute/Unmute
- -JET provides an audio synchronization API that allows game play to be synchronized to events in the audio. The mechanism relies on data embedded in the MIDI file.
Figure 4: Music Game with
-Synchronization
The arrows represent events in the music sequence where game -events need to be synchronized. In this case, the blue arrow represents a time -where the player is supposed to press the left button, and the red arrow is for -the right button. The yellow arrow tells the game engine that the sequence is -complete. The player is allowed a certain time window before and after the -event to press the appropriate key.
- -If an event is received and the player has not pressed a -button, a timer is set to half the length of the window. If the player presses -the button before the timer expires, the game registers a success, and if not, -the game registers a failure.
- -If the player presses the button before the event is -received, a timer is set to half the length of the window. If an event is -received before the timer expires, the game registers a success, and if not, -the game registers a failure. Game play might also include bonuses for getting -close to the timing of the actual event.
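The success/failure rule in the two paragraphs above reduces to a single window check. The sketch below is illustrative only; the function and parameter names are ours, not part of the JET API.

```c
#include <assert.h>

/* Illustrative timing-window check: the game registers a success if the
 * button press and the music event fall within half a window length of
 * each other, whichever arrives first. Names are hypothetical. */
static int hit_registered(long eventTimeMs, long pressTimeMs, long windowMs)
{
    long delta = pressTimeMs - eventTimeMs;
    if (delta < 0)
        delta = -delta;               /* press may come before or after the event */
    return delta <= windowMs / 2;     /* the timer is set to half the window */
}
```

With a 500 ms window, a press 200 ms from the event registers a success and a press 300 ms away registers a failure.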
- -JET uses the standard EAS library calls to manage multiple MIDI streams. JET uses standard MIDI controller (CC) events embedded in the MIDI file to accomplish audio synchronization. The controllers used by JET are among those not defined for specific use by the MIDI specification. The specific controllers used are:
Controllers 80-83 Reserved for use by application

- -Controller 102 JET event marker

- -Controller 103 JET clip marker

- -Controllers 104-119 Reserved for future use
- -The application may use controllers in this range for its own purposes. When a controller in this range is encountered, the event is entered into an event queue that can be queried by the application. Some possible uses include synchronizing video events with audio and marking a point in a MIDI segment to queue up the next segment.
Controller 102 is reserved for marking events in the MIDI streams that are specific to JET functionality. Currently, the only defined value is 0, which marks the end of a segment for timing purposes.
Normally, JET starts playback of the next segment (or repeats the current segment) when the MIDI end-of-track meta-event is encountered. Some MIDI authoring tools make it difficult to place the end-of-track marker accurately, resulting in synchronization problems when segments are joined together.
To avoid this problem, the author can place a JET -end-of-segment marker (controller=102, value=0) at the point where the segment is -to be looped. When the end-of-segment marker is encountered, the next segment -will be triggered, or if the current segment is looped, playback will resume at -the start of the segment.
- -The end-of-segment marker can also be used to allow for completion of a musical figure beyond the end of the measure that marks the start of the next segment. For example, the content author might create a 4-bar segment with a drum fill that ends on beat 1 of the 5th bar – a bar beyond the natural end of the segment. By placing an end-of-segment marker at the end of the 4th bar, the next segment will be triggered, but the drum fill will continue in parallel with the next segment, providing musical continuity.
- -Figure 5: End-of-segment Marker
- -Controller 103 is reserved for marking clips in a MIDI track that can be triggered by the JET_TriggerClip() API call.
For example, to identify a clip with a clip ID of 1, the author inserts a controller event with controller=103 and value=65 at the start of the clip and another with value=1 at the end of the clip.
Figure 6: Synchronized Clip
- -In the figure above, if the JET_TriggerClip() function is -called prior to the first controller event, Track 3 will be un-muted when the -first controller event occurs, the first clip will play, and the track will be -muted when the second controller event occurs. If the JET_TriggerClip() -function is called after the first controller event has occurred, Track 3 will -be un-muted when the third controller event occurs, the second clip will play, -and the track will be muted again when the fourth controller event occurs.
- -Note: Normally, the track containing the clip is muted by the application when the segment is initially queued by the call to JET_QueueSegment(). If it is not muted, the clip will always play until JET_TriggerClip() has been called with the clip ID.
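The controller 103 values in the example above (value 65 starts clip 1, value 1 ends it) suggest a simple encoding: the 0x40 bit of the controller value marks a clip start, and the low bits carry the clip ID. The helpers below are an inference from that example, not a documented API.

```c
#include <assert.h>

/* Inferred encoding of controller 103 values: 0x40 set = clip start,
 * 0x40 clear = clip end, low 6 bits = clip ID. Helper names are ours. */
static unsigned char clip_start_marker(unsigned char clipID)
{
    return (unsigned char)(0x40 | (clipID & 0x3F));   /* e.g. clip 1 -> 65 */
}

static unsigned char clip_end_marker(unsigned char clipID)
{
    return (unsigned char)(clipID & 0x3F);            /* e.g. clip 1 -> 1 */
}
```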
- -The JET library builds on functionality in the EAS library. -It is assumed that the reader is familiar with EAS and has implemented basic -EAS audio functionality in the application. Specifically, the application must -first initialize EAS by calling EAS_Init() and must call EAS_Render() at -appropriate times to render audio and stream it to the audio hardware. JET also -requires the use of the dynamic memory model which uses malloc() and free() or -functional equivalents.
- -Most JET function calls return an EAS_RESULT type which -should be checked against the EAS_SUCCESS return code. Most failures are not -fatal, i.e. they will not put the library in a state where it must be -re-initialized. However, some failures such as memory allocation or file -open/read errors will likely result in the specific open content failing to -render.
- -The JET library is initialized by the JET_Init() function. -The application must first call EAS_Init() and then pass the EAS data handle -returned by EAS_Init() to the JET_Init() function. Currently, only a single JET -application can be active at a time.
- -The JET_Init function takes 3 arguments: The first is the -EAS data handle. The second is a pointer to a configuration structure -S_JET_CONFIG and the third is the size of the configuration structure. For most -applications, it is sufficient to pass a NULL pointer and size 0 for the -configuration data.
- -However, if desired, the configuration can be modified to allow the application to monitor MIDI events outside the normal range of controllers allocated for JET application events.
When the JET application terminates, it should call -JET_Shutdown() to release the resources allocated by the JET engine. If the application has no other use for the -EAS library, it should also call EAS_Shutdown().
- -To start the JET engine, the content must first be opened -with the JET_OpenFile() function. Just as with EAS_OpenFile(), the file locator -is an opaque value that is passed to the EAS_HWOpenFile() function. It can -either be a pointer to a filename, or a pointer to an in-memory object, -depending on the user implementation of file I/O in the eas_host.c or -eas_hostmm.c module. Only a single JET content file can be opened at a time.
- -Once the JET file is opened, the application can begin -queuing up segments for playback by calling the JET_QueueSegment() function. -Generally, it is advisable to keep a minimum of two segments queued at all -times: the currently playing segment -plus an additional segment that is ready to start playing when the current -segment finishes. However, with proper programming, it is possible to queue up -segments using a “just-in-time” technique. This technique typically involves -careful placement of application controller events near the end of a segment so -that the application is informed when a segment is about to end.
- -After the segment(s) are queued up, playback can begin. By -default, the segments are initialized in a paused state. To start playback, -call the JET_Play() function. Playback can be paused again by calling the JET_Pause() -function. Once initiated, playback will continue as long as the application -continues to queue up new segments before all the segments in the queue are -exhausted.
- -The JET_Status() function can be used to monitor progress. It returns the number of segments queued, repeat count, current segment ID, and play status. By monitoring the number of segments queued, the application can determine when it needs to queue another segment and when playback has completed.
- -When playback has completed and the application is finished -with the contents of the currently open file, the application should call -JET_CloseFile() to close the file and release any resources associated with the -file.
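The open/queue/play/monitor/close flow described above can be sketched as follows. The JET_* functions here are simplified stubs (no EAS handle, no file I/O) so the sketch compiles on its own; in a real application they come from the JET library, have the signatures documented below, and every result should be checked against EAS_SUCCESS.

```c
#include <assert.h>

/* Standalone sketch of the JET playback flow. The stubs below stand in
 * for the real library calls so the call sequence can be shown end to end. */
typedef int EAS_RESULT;
#define EAS_SUCCESS 0

static int queuedSegments = 0;   /* stand-in for S_JET_STATUS.numQueuedSegments */
static int totalQueued = 0;      /* for illustration: how many segments were queued */

static EAS_RESULT JET_OpenFile(const char *locator) { (void)locator; return EAS_SUCCESS; }
static EAS_RESULT JET_QueueSegment(int segmentNum)
{ (void)segmentNum; queuedSegments++; totalQueued++; return EAS_SUCCESS; }
static EAS_RESULT JET_Play(void) { return EAS_SUCCESS; }
static EAS_RESULT JET_CloseFile(void) { return EAS_SUCCESS; }

/* In a real application this would be EAS_Render(); here each call
 * simply "finishes" the currently playing segment. */
static void render_audio(void) { if (queuedSegments > 0) queuedSegments--; }

/* Keep two segments queued: the playing one plus one ready to start. */
static EAS_RESULT play_sequence(int numSegments)
{
    int next = 0;
    EAS_RESULT rc = JET_OpenFile("music.jet");
    if (rc != EAS_SUCCESS)
        return rc;

    /* Prime the queue with two segments before starting playback. */
    while (next < numSegments && queuedSegments < 2) {
        JET_QueueSegment(next);
        next++;
    }
    JET_Play();

    while (queuedSegments > 0) {
        render_audio();
        /* Queue the next segment when only the playing one remains. */
        if (queuedSegments == 1 && next < numSegments) {
            JET_QueueSegment(next);
            next++;
        }
    }
    return JET_CloseFile();
}
```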
- -EAS_PUBLIC EAS_RESULT JET_Init -(EAS_DATA_HANDLE easHandle, S_JET_CONFIG *pConfig, EAS_INT configSize)
- -Initializes the JET library for use by the application. Most applications should simply pass NULL for pConfig and 0 for configSize, which means that only controller events in the application range (80-83) will end up in the application event queue. If desired, the application can instantiate an S_JET_CONFIG data structure and set the controller range to a different range. In this case, the configSize parameter should be set to sizeof(S_JET_CONFIG).
- -EAS_PUBLIC EAS_RESULT JET_Shutdown -(EAS_DATA_HANDLE easHandle)
- -Releases resources used by the JET library. The application -should call this function when it is no longer using the JET library.
- -EAS_PUBLIC EAS_RESULT JET_OpenFile -(EAS_DATA_HANDLE easHandle, EAS_FILE_LOCATOR locator)
- -Opens a JET content file for playback. Content must be formatted for use by the JET library, which is typically accomplished with the jetfile.py script (see “Creating JET Content”). Only a single JET content file can be opened at a time. However, since a JET content file can contain many MIDI segments, this limitation should not present a problem.
EAS_PUBLIC EAS_RESULT JET_CloseFile -(EAS_DATA_HANDLE easHandle)
- -Closes a JET file and releases the resources associated with it.
- -EAS_PUBLIC EAS_RESULT JET_Status -(EAS_DATA_HANDLE easHandle, S_JET_STATUS *pStatus)
- -Returns the current JET status. The elements of the status -data structure are as follows:
- -typedef struct s_jet_status_tag
- -{
- -EAS_INT currentUserID;
- -EAS_INT segmentRepeatCount;
- -EAS_INT numQueuedSegments;
- -EAS_BOOL paused;
- -} S_JET_STATUS;
- -currentUserID: An 8-bit value assigned by the application.

- -segmentRepeatCount: Number of times left to repeat. Zero indicates no repeats, a negative number indicates an infinite number of repeats. Any positive value indicates that the segment will play n+1 times.

- -numQueuedSegments: Number of segments currently queued to play, including the currently playing segment. A value of zero indicates that nothing is playing. Normally, the application will queue a new segment each time the value is 1 so that playback is uninterrupted.
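The status fields above drive two common decisions, sketched here with illustrative helper names (they are not part of the JET API):

```c
#include <assert.h>

/* segmentRepeatCount semantics: 0 = play once, n > 0 = play n+1 times,
 * negative = repeat indefinitely (represented here by -1). */
static int total_plays(int segmentRepeatCount)
{
    if (segmentRepeatCount < 0)
        return -1;                    /* infinite repeat */
    return segmentRepeatCount + 1;
}

/* Queue another segment when only the playing segment remains. */
static int should_queue_next(int numQueuedSegments)
{
    return numQueuedSegments == 1;
}
```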
- -EAS_PUBLIC EAS_RESULT JET_QueueSegment -(EAS_DATA_HANDLE easHandle, EAS_INT segmentNum, EAS_INT libNum, EAS_INT -repeatCount, EAS_INT transpose, EAS_U32 muteFlags, EAS_U8 userID)
- -Queues up a JET MIDI segment for playback. The parameters -are as follows:
- -segmentNum: -Segment number as identified in the JET content configuration file.
- -libNum: The library -number as specified in the JET content configuration file. Use -1 to select the -standard General MIDI library.
- -repeatCount: The -number of times this segment should repeat. Zero indicates no repeat, i.e. play -only once. Any positive number indicates to play n+1 times. Set to -1 to repeat -indefinitely.
- -transpose: The -amount of pitch transposition. Set to 0 for normal playback. Range is -12 to -+12.
- -muteFlags: Specifies which MIDI tracks (not MIDI channels) are muted when the segment begins playing. The flags can be modified during playback with JET_SetMuteFlags() or JET_SetMuteFlag().
userID: 8-bit value specified by the application that uniquely identifies the segment. This value is returned by the JET_Status() function as well as by the application event when an event is detected in a segment. Normally, the application keeps an 8-bit value that is incremented each time a new segment is queued up. This can be used to look up any special characteristics of that segment, including trigger clips and mute flags.
- -EAS_PUBLIC EAS_RESULT JET_Play -(EAS_DATA_HANDLE easHandle)
- -Starts playback of the current segment. This function must -be called once after the initial segments are queued up to start playback. It -is also called after JET_Pause() to resume playback.
- -EAS_PUBLIC EAS_RESULT JET_Pause -(EAS_DATA_HANDLE easHandle)
- -Pauses playback of the current segment. Call JET_Play() to resume playback.
- -EAS_PUBLIC EAS_RESULT JET_SetMuteFlags -(EAS_DATA_HANDLE easHandle, EAS_U32 muteFlags, EAS_BOOL sync)
- -Modifies the mute flags during playback. If the sync parameter is false, the mute flags are updated at the beginning of the next render. This means that any new notes or controller events will be processed during the next audio frame. If the sync parameter is true, the mute flags will be updated at the start of the next segment. If the segment is repeated, the flags will take effect the next time the segment repeats.
- -EAS_PUBLIC EAS_RESULT JET_SetMuteFlag -(EAS_DATA_HANDLE easHandle, EAS_INT trackNum, EAS_BOOL muteFlag, EAS_BOOL sync)
- -Modifies a mute flag for a single track during playback. If the sync parameter is false, the mute flag is updated at the beginning of the next render. This means that any new notes or controller events will be processed during the next audio frame. If the sync parameter is true, the mute flag will be updated at the start of the next segment. If the segment is repeated, the flag will take effect the next time the segment repeats.
- -EAS_PUBLIC EAS_RESULT JET_TriggerClip -(EAS_DATA_HANDLE easHandle, EAS_INT clipID)
- -Automatically updates mute flags in sync with the JET Clip -Marker (controller 103). The parameter clipID -must be in the range of 0-63. After the call to JET_TriggerClip, when JET next -encounters a controller event 103 with bits 0-5 of the value equal to clipID and bit 6 set to 1, it will automatically un-mute the track containing -the controller event. When JET encounters the complementary controller event -103 with bits 0-5 of the value equal to clipID -and bit 6 set to 0, it will mute -the track again.
- -EAS_BOOL JET_GetEvent (EAS_DATA_HANDLE -easHandle, EAS_U32 *pEventRaw, S_JET_EVENT *pEvent)
- -Attempts to read an event from the application event queue, returning EAS_TRUE if an event is found and EAS_FALSE if not. If the application passes a valid pointer for pEventRaw, a 32-bit compressed event code is returned. If the application passes a valid pointer for pEvent, the event is parsed into the S_JET_EVENT fields. The application can pass NULL for either parameter and that variable will be ignored. Normally, the application will call JET_GetEvent() repeatedly to retrieve events until it returns EAS_FALSE.
- -EAS_PUBLIC void JET_ParseEvent (EAS_U32 -event, S_JET_EVENT *pEvent)
- -Parses a 32-bit compressed event code into a data structure. -The application passes the event code received from JET_GetEvent(). The parsed -event data is returned in the memory pointed to by pEvent.
- -EAS_RESULT JET_GetAppData -(EAS_DATA_HANDLE easHandle, EAS_I32 *pAppDataOffset, EAS_I32 *pAppDataSize)
- -Returns the offset and size of the JAPP chunk in the JET -file. The application can use the file I/O functions in the eas_host module to -retrieve application specific data from the file.
- -JET uses standard MIDI files and DLS files that can be created with commercially available content tools such as Logic, Cubase, Digital Performer, or SONAR.
To create a JET file, use the “JET Creator” desktop application. The JET Creator application is written in Python and includes a full graphical interface. It is available for Mac and Windows platforms. See “JET Creator User Manual” for more information.
- -+ Copyright (C) 2009 The Android Open Source Project + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. ++ +
JET +Interactive Music Engine
+ +Vrs 2.0
Authored by SONiVOX
Copyright 2009 Sonic Network, Inc.
+ +This document contains programmer guidelines for the SONiVOX JET Interactive Music System. JET is an interactive music player for small embedded devices, including the Google Android platform. It allows applications to include interactive music soundtracks, in MIDI format, that respond in real time to game play events and user interaction.
JET works in conjunction with SONiVOX’s Embedded Audio Synthesizer (EAS), the MIDI playback engine on the Android platform.
The programmer of a JET application will want to work +closely with the content author in designing how real-time application events +and music will interactively work together. Once decided, the content author +will create the content and ultimately save a .jet file for the programmer to +include in the application.
+ +Please see “JET Creator User Documentation” for additional +information on authoring JET content.
+ +It is important to use a common set of terms to minimize confusion. Since JET uses MIDI in a unique way, the following terms are defined as they are used in this document:
Channel: MIDI data associated with a specific MIDI channel. Standard MIDI supports 16 channels, each typically assigned to a single instrument.
Controller: A MIDI event consisting of a channel number, controller number, and controller value. JET uses specific controller numbers to embed synchronization and muting information in a MIDI sequence.
DAW: Digital Audio Workstation. A common term for MIDI and audio sequencing applications such as Logic, Cubase, Digital Performer, and SONAR.
EAS: Embedded Audio Synthesizer. The name of the SONiVOX MIDI playback engine.
JET: JET Interactive Music Engine. The name of the SONiVOX interactive music engine described in this document.
+ +Segment: A musical section such as a chorus or verse that is a component of the overall composition. In JET, a segment can be an entire MIDI file or derived from a portion of a MIDI file.
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single track, but may be made up of multiple channels of MIDI data.
SMF-1: Standard MIDI File Type 1, a MIDI file that contains one or more tracks, and each track may in turn be made up of one or more channels of MIDI data.
Track: A single track in a DAW containing a timed sequence of MIDI events.
Interactive music can be defined as music that changes in real time according to non-predictable events such as user interaction or game play events. In this way, interactive music is much more engaging, as it can match the energy and mood of a game much more closely than a pre-composed composition that never changes. In some applications and games, interactive music is central to the game play. Guitar Hero is one such popular game. When the end user successfully ‘captures’ the musical notes coming down the fret board, the music adapts itself and simultaneously keeps score of successes and failures. JET allows for these types of music-driven games as well.
+ +There are several methods for making and controlling +interactive music and JET is one such method. This section describes the +features of JET and how they might be used in a game or software application. +It also describes how JET can be used to save memory in small footprint devices +such as Android enabled mobile handsets.
+ +JET supports a flexible music format that can be used to +create extended musical sequences with a minimal amount of data. A musical +composition is broken up into segments that can be sequenced to create a longer +piece. The sequencing can be fixed at the time the music file is authored, or +it can be created dynamically under program control.
+ +Figure 1: Linear Music Piece
+ +This diagram shows how musical segments are stored. Each segment is authored as a separate MIDI file.
The bottom part of the diagram shows how the musical +segments can be recombined to create a linear music piece. In this example, the +bridge might end with a half-step key modulation and the remaining segments +could be transposed up a half-step to match.
+ +Figure 2: Non-linear music piece
+ +In this diagram, we see a non-linear music piece. The +scenario is a first-person-shooter (FPS) and JET is providing the background +music. The intro plays as the level is loading and then transitions under +program control to the Searching segment. This segment is repeated +indefinitely, perhaps with small variations (using the mute/un-mute feature) +until activity in the game dictates a change.
+ +As the player nears a monster lair, the program starts a +synchronized transition to the Danger segment, increasing the tension level in +the audio. As the player draws closer to the lair, additional tracks are +un-muted to increase the tension.
+ +As the player enters into combat with the monster, the +program starts a synchronized transition to the Combat segment. The segment +repeats indefinitely as the combat continues. A Bonus Hit temporarily un-mutes +a decorative track that notifies the player of a successful attack, and +similarly, another track is temporarily un-muted to signify when the player +receives Special Damage.
+ +At the end of combat, the music transitions to a victory or +defeat segment based on the outcome of battle.
+ +JET can also synchronize the muting and un-muting of tracks +to events in the music. For example, in the FPS game, it would probably be +desirable to place the musical events relating to bonuses and damage as close +to the actual game event as possible. However, simply un-muting a track at the +moment the game event occurs might result in a music clip starting in the +middle. Alternatively, a clip could be started from the beginning, but then it +wouldn’t be synchronized with the other music tracks.
However, with the JET sync engine, a clip can be started at the next opportune moment and maintain synchronization. This can be accomplished by placing a number of short music clips on a decorative track. A controller event marks the start and end of each clip; when the application triggers a clip, it plays from the next start marker, synchronized with the rest of the music.
Figure 3: Synchronized Mute/Unmute
+ +JET provides an audio synchronization API that allows game play to be synchronized to events in the audio. The mechanism relies on data embedded in the MIDI content.
Figure 4: Music Game with Synchronization
The arrows represent events in the music sequence where game +events need to be synchronized. In this case, the blue arrow represents a time +where the player is supposed to press the left button, and the red arrow is for +the right button. The yellow arrow tells the game engine that the sequence is +complete. The player is allowed a certain time window before and after the +event to press the appropriate key.
+ +If an event is received and the player has not pressed a +button, a timer is set to half the length of the window. If the player presses +the button before the timer expires, the game registers a success, and if not, +the game registers a failure.
+ +If the player presses the button before the event is +received, a timer is set to half the length of the window. If an event is +received before the timer expires, the game registers a success, and if not, +the game registers a failure. Game play might also include bonuses for getting +close to the timing of the actual event.
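Both orderings described above reduce to a single distance test: the press succeeds exactly when it falls within half a window of the musical event. The sketch below is illustrative only; the function name and millisecond units are assumptions, not part of the JET API.

```c
/* Hypothetical sketch of the timing-window check described above.
   Whichever arrives first (musical event or button press), a timer
   of half the window length decides success, which is equivalent to
   testing whether the press is within windowMs/2 of the event. */
int is_hit(long eventTimeMs, long pressTimeMs, long windowMs)
{
    long delta = pressTimeMs - eventTimeMs;
    if (delta < 0)
        delta = -delta;            /* absolute distance from the event */
    return delta <= windowMs / 2;  /* 1 = success, 0 = failure */
}
```

A closeness bonus, as mentioned above, could then be computed from the same `delta` value.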
+ +JET uses the standard EAS library calls to render audio and manage MIDI playback. JET uses standard MIDI controller events to accomplish track muting and synchronization. The controller assignments are as follows:
Controllers 80-83: Reserved for use by the application
+ +Controller 102: JET event marker
+ +Controller 103: JET clip marker
+ +Controllers 104-119: Reserved for future use
+ +The application may use controllers in this range for its own purposes. When a controller in this range is encountered, the event is entered into an event queue that can be queried by the application. Some possible uses include synchronizing video events with audio and marking a point in a segment.
+ +Controller 102 is reserved for marking events in the MIDI file that are specific to JET functionality. Currently, the only defined value is 0, the end-of-segment marker.
+ +Normally, JET starts playback of the next segment (or repeats the current segment) when the MIDI end-of-track event is reached. If events in the file extend past the intended loop point, the transition to the next segment will be delayed.
To avoid this problem, the author can place a JET +end-of-segment marker (controller=102, value=0) at the point where the segment is +to be looped. When the end-of-segment marker is encountered, the next segment +will be triggered, or if the current segment is looped, playback will resume at +the start of the segment.
+ +The end-of-segment marker can also be used to allow for +completion of a musical figure beyond the end of measure that marks the start +of the next segment. For example, the content author might create a 4-bar +segment with a drum fill that ends on beat 1 of the 5th bar – a bar +beyond the natural end of the segment. By placing an end-of-segment marker at +the end of the 4th bar, the next segment will be triggered, but the +drum fill will continue in parallel with the next segment providing musical +continuity.
+ +Figure 5: End-of-segment Marker
+ +Controller 103 is reserved for marking clips in a MIDI track that can be triggered by the JET_TriggerClip() function. The clip ID is stored in bits 0-5 of the controller value, and bit 6 marks the start (1) or end (0) of the clip.
For example, to identify a clip with a clip ID of 1, the author inserts a controller event 103 with value 65 (0x41) at the start of the clip and a controller event 103 with value 1 (0x01) at the end of the clip.
Figure 6: Synchronized Clip
+ +In the figure above, if the JET_TriggerClip() function is +called prior to the first controller event, Track 3 will be un-muted when the +first controller event occurs, the first clip will play, and the track will be +muted when the second controller event occurs. If the JET_TriggerClip() +function is called after the first controller event has occurred, Track 3 will +be un-muted when the third controller event occurs, the second clip will play, +and the track will be muted again when the fourth controller event occurs.
+ +Note: Normally, the track containing the clip is muted by the application when the segment is initially queued by the call to JET_QueueSegment(). If it is not muted, the clip will always play until JET_TriggerClip() has been called with the clip ID.
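The controller-103 value layout described above (clip ID in bits 0-5, start/stop flag in bit 6) can be sketched with a few helpers. This is an illustrative sketch: the names CLIP_ID_MASK, clip_marker_value, and so on are invented here, not part of the JET API.

```c
/* Encode/decode helpers for the JET clip marker (controller 103)
   value layout described above: bits 0-5 hold the clip ID (0-63)
   and bit 6 is 1 at the start of the clip, 0 at the end. */
enum { CLIP_ID_MASK = 0x3F, CLIP_START_FLAG = 0x40 };

unsigned char clip_marker_value(int clipID, int isStart)
{
    return (unsigned char)((clipID & CLIP_ID_MASK) |
                           (isStart ? CLIP_START_FLAG : 0));
}

int clip_marker_id(unsigned char value)
{
    return value & CLIP_ID_MASK;
}

int clip_marker_starts(unsigned char value)
{
    return (value & CLIP_START_FLAG) != 0;
}
```

For a clip ID of 1, the start marker value works out to 0x41 (65) and the end marker value to 0x01 (1), matching the example above.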
+ +The JET library builds on functionality in the EAS library. +It is assumed that the reader is familiar with EAS and has implemented basic +EAS audio functionality in the application. Specifically, the application must +first initialize EAS by calling EAS_Init() and must call EAS_Render() at +appropriate times to render audio and stream it to the audio hardware. JET also +requires the use of the dynamic memory model which uses malloc() and free() or +functional equivalents.
+ +Most JET function calls return an EAS_RESULT type which should be checked against the EAS_SUCCESS return code. Most failures are not fatal, i.e. they will not put the library in a state where it must be re-initialized. However, some failures, such as memory allocation or file open/read errors, will likely result in the open content failing to render.
+ +The JET library is initialized by the JET_Init() function. +The application must first call EAS_Init() and then pass the EAS data handle +returned by EAS_Init() to the JET_Init() function. Currently, only a single JET +application can be active at a time.
+ +The JET_Init function takes 3 arguments: The first is the +EAS data handle. The second is a pointer to a configuration structure +S_JET_CONFIG and the third is the size of the configuration structure. For most +applications, it is sufficient to pass a NULL pointer and size 0 for the +configuration data.
+ +However, if desired, the configuration can be modified to allow the application to monitor a different range of MIDI controller events.
When the JET application terminates, it should call +JET_Shutdown() to release the resources allocated by the JET engine. If the application has no other use for the +EAS library, it should also call EAS_Shutdown().
+ +To start the JET engine, the content must first be opened +with the JET_OpenFile() function. Just as with EAS_OpenFile(), the file locator +is an opaque value that is passed to the EAS_HWOpenFile() function. It can +either be a pointer to a filename, or a pointer to an in-memory object, +depending on the user implementation of file I/O in the eas_host.c or +eas_hostmm.c module. Only a single JET content file can be opened at a time.
+ +Once the JET file is opened, the application can begin +queuing up segments for playback by calling the JET_QueueSegment() function. +Generally, it is advisable to keep a minimum of two segments queued at all +times: the currently playing segment +plus an additional segment that is ready to start playing when the current +segment finishes. However, with proper programming, it is possible to queue up +segments using a “just-in-time” technique. This technique typically involves +careful placement of application controller events near the end of a segment so +that the application is informed when a segment is about to end.
+ +After the segment(s) are queued up, playback can begin. By +default, the segments are initialized in a paused state. To start playback, +call the JET_Play() function. Playback can be paused again by calling the JET_Pause() +function. Once initiated, playback will continue as long as the application +continues to queue up new segments before all the segments in the queue are +exhausted.
+ +The JET_Status() function can be used to monitor progress. It returns the number of segments queued, repeat count, current segment ID, and play status. By monitoring the number of segments queued, the application can determine when it needs to queue another segment and when playback has completed.
+ +When playback has completed and the application is finished +with the contents of the currently open file, the application should call +JET_CloseFile() to close the file and release any resources associated with the +file.
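The queue-management flow above can be sketched as a polling loop. Everything below is an illustrative mock: MockStatus, needs_refill, and simulate_playback are invented names, and a plain counter stands in for the real JET_Status()/JET_QueueSegment() calls so the control flow is self-contained.

```c
/* Mock stand-in for the numQueuedSegments field of S_JET_STATUS. */
typedef struct {
    int numQueuedSegments;
} MockStatus;

/* Queue another segment whenever only the currently playing one
   remains, as the manual recommends. */
int needs_refill(const MockStatus *status)
{
    return status->numQueuedSegments <= 1;
}

/* Simulate render cycles, each finishing one segment. Returns 1 if
   all totalSegments play without the queue ever starving. */
int simulate_playback(int totalSegments, int renders)
{
    MockStatus status = { 0 };
    int queued = 0;
    int i;

    /* Initial queue-up: keep a minimum of two segments queued. */
    while (queued < totalSegments && status.numQueuedSegments < 2) {
        status.numQueuedSegments++;   /* stands in for JET_QueueSegment() */
        queued++;
    }
    for (i = 0; i < renders; i++) {
        if (status.numQueuedSegments == 0)
            return queued == totalSegments;   /* finished, or starved */
        status.numQueuedSegments--;           /* a segment finished */
        if (queued < totalSegments && needs_refill(&status)) {
            status.numQueuedSegments++;       /* just-in-time refill */
            queued++;
        }
    }
    return 1;
}
```

In a real application the poll would run once per EAS_Render() cycle, checking S_JET_STATUS.numQueuedSegments instead of the mock counter.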
+ +EAS_PUBLIC EAS_RESULT JET_Init +(EAS_DATA_HANDLE easHandle, S_JET_CONFIG *pConfig, EAS_INT configSize)
+ +Initializes the JET library for use by the application. Most applications should simply pass NULL for pConfig and 0 for configSize, which means that only controller events in the application range (80-83) will end up in the application event queue. If desired, the application can instantiate an S_JET_CONFIG data structure and set the controller range to a different range. In this case, the configSize parameter should be set to sizeof(S_JET_CONFIG).
+ +EAS_PUBLIC EAS_RESULT JET_Shutdown +(EAS_DATA_HANDLE easHandle)
+ +Releases resources used by the JET library. The application +should call this function when it is no longer using the JET library.
+ +EAS_PUBLIC EAS_RESULT JET_OpenFile +(EAS_DATA_HANDLE easHandle, EAS_FILE_LOCATOR locator)
+ +Opens a JET content file for playback. Content must be formatted for use by the JET library, which is typically accomplished with the jetfile.py script (see “Creating JET Content”). Only a single JET content file can be opened at a time. However, since a JET content file can contain many segments, this is not normally a limitation.
EAS_PUBLIC EAS_RESULT JET_CloseFile +(EAS_DATA_HANDLE easHandle)
+ +Closes a JET file and releases the resources associated with it.
+ +EAS_PUBLIC EAS_RESULT JET_Status +(EAS_DATA_HANDLE easHandle, S_JET_STATUS *pStatus)
+ +Returns the current JET status. The elements of the status +data structure are as follows:
+ +typedef struct s_jet_status_tag
+ +{
+ +EAS_INT currentUserID;
+ +EAS_INT segmentRepeatCount;
+ +EAS_INT numQueuedSegments;
+ +EAS_BOOL paused;
+ +} S_JET_STATUS;
+ +currentUserID: An +8-bit value assigned by the application.
+ +segmentRepeatCount: +Number of times left to repeat. Zero indicates no repeats, a negative number +indicates an infinite number of repeats. Any positive value indicates that the +segment will play n+1 times.
+ +numQueuedSegments: +Number of segments currently queued to play including the currently playing +segment. A value of zero indicates that nothing is playing. Normally, the +application will queue a new segment each time the value is 1 so that playback +is uninterrupted.
+ +EAS_PUBLIC EAS_RESULT JET_QueueSegment +(EAS_DATA_HANDLE easHandle, EAS_INT segmentNum, EAS_INT libNum, EAS_INT +repeatCount, EAS_INT transpose, EAS_U32 muteFlags, EAS_U8 userID)
+ +Queues up a JET MIDI segment for playback. The parameters +are as follows:
+ +segmentNum: +Segment number as identified in the JET content configuration file.
+ +libNum: The library +number as specified in the JET content configuration file. Use -1 to select the +standard General MIDI library.
+ +repeatCount: The +number of times this segment should repeat. Zero indicates no repeat, i.e. play +only once. Any positive number indicates to play n+1 times. Set to -1 to repeat +indefinitely.
+ +transpose: The +amount of pitch transposition. Set to 0 for normal playback. Range is -12 to ++12.
+ +muteFlags: Specifies which MIDI tracks (not MIDI channels) are muted when the segment begins playing. The flags can be modified during playback with JET_SetMuteFlags() or JET_SetMuteFlag().
userID: 8-bit value specified by the application that uniquely identifies the segment. This value is returned by the JET_Status() function as well as by the application event when an event is detected in a segment. Normally, the application keeps an 8-bit value that is incremented each time a new segment is queued up. This can be used to look up any special characteristics of that segment, including trigger clips and mute flags.
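Since muteFlags packs one track per bit of a 32-bit word, the application typically builds the value with simple bit operations. The helpers below are an illustrative sketch; the bit 0 = track 0 mapping is an assumption consistent with the trackNum parameter of JET_SetMuteFlag(), and the function names are invented here.

```c
/* Sketch of muteFlags construction: one bit per MIDI track, assuming
   bit 0 corresponds to track 0, bit 1 to track 1, and so on. A set
   bit means the track starts muted. */
unsigned int mute_track(unsigned int flags, int trackNum)
{
    return flags | (1u << trackNum);
}

unsigned int unmute_track(unsigned int flags, int trackNum)
{
    return flags & ~(1u << trackNum);
}

int track_is_muted(unsigned int flags, int trackNum)
{
    return (flags >> trackNum) & 1u;
}
```

For example, a segment whose decorative clip lives on track 3 could be queued with mute_track(0, 3) (i.e. 0x08) as muteFlags, so the clip stays silent until triggered.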
+ +EAS_PUBLIC EAS_RESULT JET_Play +(EAS_DATA_HANDLE easHandle)
+ +Starts playback of the current segment. This function must +be called once after the initial segments are queued up to start playback. It +is also called after JET_Pause() to resume playback.
+ +EAS_PUBLIC EAS_RESULT JET_Pause +(EAS_DATA_HANDLE easHandle)
+ +Pauses playback of the current segment. Call JET_Play() to resume playback.
+ +EAS_PUBLIC EAS_RESULT JET_SetMuteFlags +(EAS_DATA_HANDLE easHandle, EAS_U32 muteFlags, EAS_BOOL sync)
+ +Modifies the mute flags during playback. If the sync parameter is false, the mute flags are updated at the beginning of the next render. This means that any new notes or controller events will be processed during the next audio frame. If the sync parameter is true, the mute flags will be updated at the start of the next segment. If the segment is repeated, the flags will take effect the next time the segment repeats.
+ +EAS_PUBLIC EAS_RESULT JET_SetMuteFlag +(EAS_DATA_HANDLE easHandle, EAS_INT trackNum, EAS_BOOL muteFlag, EAS_BOOL sync)
+ +Modifies a mute flag for a single track during playback. If the sync parameter is false, the mute flag is updated at the beginning of the next render. This means that any new notes or controller events will be processed during the next audio frame. If the sync parameter is true, the mute flag will be updated at the start of the next segment. If the segment is repeated, the flag will take effect the next time the segment repeats.
+ +EAS_PUBLIC EAS_RESULT JET_TriggerClip +(EAS_DATA_HANDLE easHandle, EAS_INT clipID)
+ +Automatically updates mute flags in sync with the JET Clip +Marker (controller 103). The parameter clipID +must be in the range of 0-63. After the call to JET_TriggerClip, when JET next +encounters a controller event 103 with bits 0-5 of the value equal to clipID and bit 6 set to 1, it will automatically un-mute the track containing +the controller event. When JET encounters the complementary controller event +103 with bits 0-5 of the value equal to clipID +and bit 6 set to 0, it will mute +the track again.
+ +EAS_BOOL JET_GetEvent (EAS_DATA_HANDLE +easHandle, EAS_U32 *pEventRaw, S_JET_EVENT *pEvent)
+ +Attempts to read an event from the application event queue, returning EAS_TRUE if an event is found and EAS_FALSE if not. If the application passes a valid pointer for pEventRaw, a 32-bit compressed event code is returned. If the application passes a valid pointer for pEvent, the event is parsed into the S_JET_EVENT fields. The application can pass NULL for either parameter and that variable will be ignored. Normally, the application will call JET_GetEvent() repeatedly to retrieve events until it returns EAS_FALSE.
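The drain-until-false pattern described above can be sketched with a mock in place of the real library. MockQueue, mock_get_event, and drain_events are invented names; the real call would be JET_GetEvent(easHandle, &raw, &event), but a small in-memory queue stands in here so the loop is self-contained.

```c
/* Mock stand-in for the JET application event queue. */
typedef struct {
    unsigned int events[8];
    int count;
    int head;
} MockQueue;

/* Mirrors JET_GetEvent() behavior: returns 1 (EAS_TRUE) and pops an
   event while any remain, else 0 (EAS_FALSE). A NULL pEventRaw is
   allowed and simply ignored, as with the real call. */
int mock_get_event(MockQueue *q, unsigned int *pEventRaw)
{
    if (q->head >= q->count)
        return 0;
    if (pEventRaw)
        *pEventRaw = q->events[q->head];
    q->head++;
    return 1;
}

/* Drain the queue the way the manual recommends: poll until the call
   returns false. Returns the number of events handled. */
int drain_events(MockQueue *q)
{
    unsigned int raw;
    int handled = 0;
    while (mock_get_event(q, &raw)) {
        /* application-specific handling of 'raw' would go here,
           e.g. passing it to JET_ParseEvent() */
        handled++;
    }
    return handled;
}
```

In the real application this loop would typically run once per render cycle, after EAS_Render() returns.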
+ +EAS_PUBLIC void JET_ParseEvent (EAS_U32 +event, S_JET_EVENT *pEvent)
+ +Parses a 32-bit compressed event code into a data structure. +The application passes the event code received from JET_GetEvent(). The parsed +event data is returned in the memory pointed to by pEvent.
+ +EAS_RESULT JET_GetAppData +(EAS_DATA_HANDLE easHandle, EAS_I32 *pAppDataOffset, EAS_I32 *pAppDataSize)
+ +Returns the offset and size of the JAPP chunk in the JET +file. The application can use the file I/O functions in the eas_host module to +retrieve application specific data from the file.
+ +JET uses standard MIDI files and DLS files that can be created with commercially available content tools such as Logic, Cubase, Digital Performer, or SONAR.
To create a JET file, use the “JET Creator” desktop application. The JET Creator application is written in Python and includes a full graphical interface. It is available for Mac and Windows platforms. See “JET Creator User Manual” for more information.
+ +