WO2006133207A2 - Method of and system for controlling audio effects - Google Patents

Method of and system for controlling audio effects

Info

Publication number
WO2006133207A2
WO2006133207A2 (PCT/US2006/021952)
Authority
WO
WIPO (PCT)
Prior art keywords
signal
sensor
audio
detecting
control
Prior art date
Application number
PCT/US2006/021952
Other languages
French (fr)
Other versions
WO2006133207A3 (en)
Inventor
Jesse M. Remignanti
Original Assignee
Source Audio Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Source Audio Llc filed Critical Source Audio Llc
Publication of WO2006133207A2 publication Critical patent/WO2006133207A2/en
Priority to US11/709,953 priority Critical patent/US7667129B2/en
Publication of WO2006133207A3 publication Critical patent/WO2006133207A3/en

Links

Classifications

    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 3/00 - Instruments in which the tones are generated by electromechanical means
    • G10H 3/12 - Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
    • G10H 3/14 - Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument, using mechanically actuated vibrators with pick-up means
    • G10H 3/18 - Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument, using mechanically actuated vibrators with pick-up means, using a string, e.g. electric guitar
    • G10H 3/186 - Means for processing the signal picked up from the strings
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 - Details of electrophonic musical instruments
    • G10H 1/0091 - Means for obtaining special acoustic effects
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 - Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/155 - User input interfaces for electrophonic musical instruments
    • G10H 2220/321 - Garment sensors, i.e. musical control means with trigger surfaces or joint angle sensors, worn as a garment by the player, e.g. bracelet, intelligent clothing
    • G10H 2220/326 - Control glove or other hand or palm-attached control device
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 - Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/155 - User input interfaces for electrophonic musical instruments
    • G10H 2220/395 - Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00 - Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/171 - Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
    • G10H 2240/281 - Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
    • G10H 2240/311 - MIDI transmission

Definitions

  • This disclosure relates to applying special audio effects to sounds produced, for example, by musical instruments and, more particularly, to controlling the application of such audio effects.
  • a song may call for, or it may otherwise be desirable to apply, one or more special audio effects to musical notes produced by the instrument.
  • audio signals from the instrument are sensed (e.g., with a microphone, pickup, etc.) and sent to a signal processor that may be dedicated to applying such effects to the audio signals.
  • the processed audio signals are usually conditioned (e.g., amplified, filtered, etc.) and provided to speakers or other type of output device.
  • to initiate the application of the audio effects, the person playing the instrument typically steps on a foot-pedal that is located on stage near the person.
  • the musician must first locate the foot-pedal and then step on the pedal so as not to look awkward or out of step with the song being played.
  • an audio effects control is configured to include a sensor that senses movement, for example, a change in position, orientation, acceleration or velocity of the sensor.
  • the movement may be the sensed movement associated with playing a musical instrument.
  • the sensor will sense movement of part of the person to which the sensor is secured.
  • the sensor produces an electrical signal in response to detecting the movement, or change in position or orientation, and the electrical signal is sent to an audio effects unit to control application of one or more audio effects on audio signals produced by the musical instrument.
  • the sensor can be secured to any other item for which movement or position or orientation of the sensor can be initiated and/or controlled.
  • the sensor may be configured to sense any one or several phenomena.
  • the sensor may be configured to sense acceleration of the musical instrument (with the aid, for example, of an accelerometer), velocity, or alternatively a position change of the musical instrument (with the aid, for example, of a gyroscope).
  • the position change sensed by the sensor may include any movement, or a prescribed movement such as the musical instrument or a portion of the instrument rotating about an axis or translating along an axis.
  • the electrical signal may be an analog signal and may be modulated for transmission from the sensor.
  • An electrical circuit may also be provided for conditioning the electrical signal.
  • the audio effects control also includes an audio effects unit which is responsive to the signal generated by the sensor.
  • the electrical circuit may convert the electrical signal into a digital signal prior to transmission to the audio effects unit.
  • the electrical circuit may also convert the electrical signal into a musical instrument digital interface (MIDI) signal.
  • sensing movement may include sensing acceleration of a portion of the musical instrument, sensing acceleration of a portion of a person playing the musical instrument, sensing a rotation of a portion of the musical instrument and/or sensing a rotation of a portion of a person playing the musical instrument, or sensing a translation of a portion of the musical instrument and/or sensing a translation of a portion of a person playing the musical instrument.
  • FIG. 1 is a diagrammatic view of one embodiment of an audio signal processing system that includes an instrument-mounted sensor that controls the application of audio effects to audio signals produced by a musical instrument.
  • FIG. 2 is a diagrammatic view of the sensor shown in FIG. 1.
  • FIG. 3 illustrates possible detectable movements of the instrument shown in FIG. 1.
  • FIG. 4 is a diagrammatic view of one embodiment of a sensor designed and configured to be hand-mounted so as to control the application of audio effects to audio signals produced by a musical instrument with movement of the hand.
  • one embodiment of the disclosed system includes a sensor 10 mounted to a guitar 12 so that the sensor is capable of sensing movements, or alternatively the position, change in position, orientation, and/or change in orientation of the guitar. Based on the sensed movement or position or orientation of the guitar, and specifically sensor 10, a signal is produced by sensor 10 and provided over a cable or wires 14 to an audio effects unit 16. Along with the signals from sensor 10, audio effects unit 16 also receives audio signals that are produced by guitar 12, and provided, for example, over a cable or wires 18 to audio effects unit 16. Various types and combinations of audio effects may be applied by audio effects unit 16 to the audio signals produced by guitar 12.
  • the audio signals may be amplified, attenuated, distorted, reverberated, time-delayed, mixed up or down into other frequency bands, or processed with other similar effects known to one skilled in the art of conditioning audio signals so as to produce audio effects.
  • sensor 10 may be mounted to one or a combination of other types of musical instruments.
  • string instruments (e.g., bass guitar, cello, violin, viola, etc.), brass instruments (e.g., trumpets, saxophones, etc.), woodwind instruments (e.g., clarinets, etc.), percussion instruments, keyboard instruments, or other types of instruments or collections of instruments may be used to produce audible signals.
  • the term musical instrument also includes devices that sense vocal signals.
  • sensor 10 may be mounted onto a microphone so as to sense the movement, orientation or position of the microphone. By detecting the movement, position or orientation of the microphone, a signal produced by sensor 10 may be used to control the application of audio effects to the audio signals (e.g., vocal signals) received by the microphone.
  • a musician may intentionally move guitar 12 in a particular manner such that sensor 10 senses the movement and sends a control signal over cable 14 to audio effects unit 16.
  • Upon receiving the control signal one or more predefined special audio effects are applied in a controlled manner to the audio signals that are provided over cable 18 from guitar 12.
  • the control signal from sensor 10 may provide various types of control to the application of the audio effects.
  • the control signal may initiate the application of one or more audio effects.
  • the musician is free to apply an effect from any location rather than, e.g., having to seek out and step on a foot-pedal.
  • Other types of audio effect control may be provided by the control signal.
  • a variable control signal (analog or digital) may be produced by sensor 10.
  • the variable signal may be used to dynamically control various aspects of the audio effects.
  • the variable control signal may be used to adjust the resonant frequency of an audio effect or other similar parameter.
  • the audio signals are sent over a cable 20 to an amplifier/speaker 22 that broadcasts the signals.
  • the musician may intentionally move guitar 12 in another manner such that the movement is detected by the sensor 10. Based on the detected movement, another trigger signal is sent over cable 14 to audio effects unit 16.
  • application of the audio effects may be halted or different audio effects may be applied.
  • the audio effects may last a predetermined time period before ending.
  • the audio effects may continue until a cue is provided from the music, e.g., there is a pause or halt in the music, or a particular note is played.
  • one or more of the audio effects applied to the music can be applied in a fade in and/or fade out fashion.
  • sensor 10 includes a sensing device 24 that senses the movement of the sensor (and correspondingly the movement of guitar 12).
  • sensing device 24 may include an accelerometer that senses acceleration (i.e., rate of change of velocity with respect to time) in one or more directions, and produces an electrical signal as a function of the sensed acceleration.
  • alternatively or in addition, one or more gyroscopes may also be included in sensing device 24. By including an inertial device such as a gyroscope, a change in attitude (e.g., pitch rate, roll rate, and yaw rate) of sensor 10 may be detected and an electrical signal produced as a function of the sensed attitude change.
  • Other types of sensors that detect change in position, change in velocity, or change in acceleration may be included in sensing device 24.
  • a pressure sensor (e.g., piezoelectric sensor, ceramic sensor, etc.) mounted on guitar 12 or incorporated into a pick used to play guitar 12 may be used as a sensing device.
  • Sensor 10 may also include multiple sensing devices. For example, one sensing device may be dedicated for detecting motion along one axis and another sensing device may be dedicated for detecting motion along a second axis of rotation.
  • sensing device 24 is preferably connected (via a conductor 26) to an interface circuit 28 that prepares the electrical signal produced by the sensing device for transmission.
  • interface circuit 28 may include circuitry for filtering, amplifying, or performing other similar functions on the electrical signal provided over conductor 26.
  • a conductor 30 provides the conditioned signal to cable 14 for delivery to audio effects unit 16.
  • interface circuit 28 may include wireless technology such as a wireless transmitter or transceiver for transmitting the signals produced by sensing device 24 over a wireless link.
  • Interface circuit 28 may also include circuitry configured and arranged so as to transfer the signals into another domain.
  • an analog signal produced by sensing device 24 may be converted into a digital signal by an analog-to-digital converter included in interface circuit 28.
  • Modulation techniques may also be provided by interface circuit 28.
  • the signals produced by the sensing device 24 may be amplitude, phase, frequency, and/or polarization modulated in the analog or digital domain.
  • the signals produced by interface circuit 28 are pulse-width modulated.
  • Interface circuit 28 may encode the signals that are transmitted to audio effects unit 16.
  • the signals may be encoded to comply with particular formats such as the musical instrument digital interface (MIDI) format.
  • movement sensed by sensing device 24 may be translated into MIDI control signals for bending pitch or modulating the audio signal from the instrument.
  • By producing these control signals from the sensing device, effects are controlled through the movement of sensing device 24 rather than by using the common pitch bend and modulation knobs on a synthesizer.
  • Referring to FIG. 3, one set of potential movements of guitar 12 that might be sensed by sensor 10 and initiate signal generation by the sensor is illustrated as an example of how the system operates.
  • three axes 32, 34, and 36 are shown in a right-handed rectangular coordinate system.
  • sensor 10 is capable of sensing rotation of guitar 12 about any one of axes 32, 34, or 36.
  • a signal is produced by sensor 10 and is transmitted to audio effects unit 16.
  • Guitar 12 may also be "rolled" about axis 36 (as represented by angle D) or "yawed" about axis 34 (as represented by angle D), and a signal is produced by sensor 10.
  • sensor 10 may include a gyroscope or other device for sensing the orientation of the sensor, or the sensor 10 may be capable of sensing translation of the guitar.
  • a global positioning system (GPS) receiver may also be incorporated into sensor 10 so that a signal may be produced as the position of the guitar changes as the musician moves.
  • a laser system may also be incorporated into sensor 10 to sense position changes of the guitar relative to one or more reflective surfaces (e.g., a polished floor, wall, ceiling, etc.).
  • the signals produced by sensor 10 may be used by audio effects unit 16 to control the application of one or more audio effects to the musical tones produced by guitar 12.
  • the performer may intentionally move the guitar to apply an audio effect known as a "wah-wah" effect. This type of effect is generated by sweeping the resonant frequency of a filter (which may be included in audio effects unit 16). As guitar 12 changes position, the corresponding signals produced by sensor 10 control the application of the audio effect.
  • guitar 12 may initially be oriented downward (in the "-y" direction) along axis 34, and the signal produced by sensor 10 controls the application of the audio effect at the low resonant frequency (e.g., 200 Hz) of the filter.
  • the signals produced by sensor 10 control the application of the audio effect across the frequency spectrum of the filter to an upper resonant frequency (e.g., 4000 Hz).
  • This "wah-wah" effect (or another effect) may also be applied as guitar 12 is rotated about any of the axes (e.g., axis 32, 34, or 36) shown in the figure.
  • sensor 10 may control the application of this effect as guitar 12 is translated (e.g., carried by the performer across a stage), or the orientation of the guitar is changed, or otherwise moved so that the sensor responds.
  • sensor 10 may also be attached to the performer playing the instrument.
  • An example is shown in FIG. 4.
  • sensor 10 is attached to the back of the performer's hand 38.
  • a wrist strap 40 and a finger loop 42 provide tie points to the musician's hand 38.
  • Sensor 10 is attached to a strap 44 that is connected between wrist strap 40 and finger loop 42.
  • Various types of material may be used to produce wrist strap 40, finger loop 42, and strap 44.
  • flexible material such as neoprene or nylon may be used to hold sensor 10.
  • Other types of attachment mechanisms known to one skilled in the art of clothing design or clothing accessories may be implemented to secure sensor 10 to the musician.
  • While sensor 10 is attached to the performer in the illustrated FIG. 4, and not to the instrument, the sensor functions in a similar manner.
  • changes in position, velocity, acceleration, and/or orientation of the musician's hand may be detected and used to produce a control signal.
  • the signal may be used to control the application of audio effects by audio effects unit 16.
  • Similar to detecting movements of an instrument with sensor 10 attached to the musician's hand, various hand movements may be detected.
  • a control signal may be produced if the performer rotates his or her hand about axis 32 (as represented by angle D), or about axis 34 (as represented by angle D), or about axis 36 (as represented by angle D).
  • the performer may trigger a "wah-wah" audio effect by pointing his or her hand toward the ground (along the "-y" direction of axis 34) to apply the audio effect at the low resonant frequency (e.g., 200 Hz) of a filter. Then, the performer may rotate his or her arm about axis 32 and point the hand toward the ceiling (along the "+y" direction of axis 34). While making this motion, signals produced by sensor 10 may control the application of the audio effect across the frequency spectrum of the filter to the upper resonant frequency (e.g., 4000 Hz). Other types of audio effects may also be controlled based on the motion of the musician's hand.
  • the signals generated by sensor 10 are provided to audio effects unit 16 over cable 14.
  • wireless circuitry (e.g., RF, IR, etc.) may be implemented into sensor 10 to remove the need for cable 14 and increase the mobility of the performer as he or she plays guitar 12 (or another instrument).
  • the sensor may be attached elsewhere to the musician.
  • sensor 10 may be incorporated into an arm-band or attached to a piece of the musician's clothing or costume.
  • multiple sensors may be attached to the musician for producing multiple signals that may be used to control the application of one or more audio effects by audio effects unit 16. By incorporating one or more of these sensors onto the performer or onto the instrument played by the performer, musical performances are improved since the performer is free to move anywhere on stage and trigger the application of audio effects.
  • While audio effects unit 16 is shown as a standalone unit, it may be connected to a computerized system, or alternatively be embodied as a software program run entirely on a computerized system. As such, the signals generated by the sensor or sensors would be received and processed by the computerized system before output signals are generated to drive one or more loudspeakers, such as speaker 22 in the illustrated embodiment shown in FIG. 1. Accordingly, other implementations are within the scope of the following claims.
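The "wah-wah" sweep described in the bullets above can be sketched in code. The following Python fragment is an illustration of ours, not part of the patent: it maps a sensed orientation angle to the resonant frequency of a wah-style filter, sweeping from the low resonance (about 200 Hz, instrument or hand pointed down) to the upper resonance (about 4000 Hz, pointed up). The function name and the logarithmic interpolation are assumptions; any monotonic mapping would fit the description.

```python
# Sketch (not from the patent): map a sensed orientation angle to the
# resonant frequency of a wah-style filter sweep. The log-linear
# interpolation is an assumption, chosen because pitch perception is
# roughly logarithmic.
LOW_HZ, HIGH_HZ = 200.0, 4000.0

def wah_resonance(angle_deg, low=LOW_HZ, high=HIGH_HZ):
    """Map an angle in [-90, +90] degrees (down .. up) to a resonant frequency in Hz."""
    clamped = max(-90.0, min(90.0, angle_deg))
    t = (clamped + 90.0) / 180.0          # normalize to 0..1
    return low * (high / low) ** t        # log-linear sweep from low to high
```

With these endpoints, pointing down yields the 200 Hz resonance, pointing up yields 4000 Hz, and intermediate angles sweep smoothly between them.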

Abstract

An audio effects control for, and method of, controlling the application of special audio effects applied to an audio signal comprises a sensor (10) configured to sense movement associated with the generation of the audio signal, wherein the sensor (10) produces a control signal in response to detecting the movement, and the control signal is transmitted to an audio effects unit (16) to control application of an audio effect on an audio signal.

Description

METHOD OF AND SYSTEM FOR CONTROLLING AUDIO EFFECTS
TECHNICAL FIELD
[0001] This disclosure relates to applying special audio effects to sounds produced, for example, by musical instruments and, more particularly, to controlling the application of such audio effects.
BACKGROUND
[0002] As a musician or performer plays an instrument during a concert or other type of performance, a song may call for, or it may otherwise be desirable to apply, one or more special audio effects to musical notes produced by the instrument. To apply the effect, audio signals from the instrument are sensed (e.g., with a microphone, pickup, etc.) and sent to a signal processor that may be dedicated to applying such effects to the audio signals. After the one or more audio effects are applied by the signal processor, the processed audio signals are usually conditioned (e.g., amplified, filtered, etc.) and provided to speakers or another type of output device. To initiate the application of the audio effects, the person playing the instrument typically steps on a foot-pedal that is located on stage near the person. However, to trigger the application of the audio effects on stage, the musician must first locate the foot-pedal and then step on the pedal so as not to look awkward or out of step with the song being played.
SUMMARY OF THE DISCLOSURE
[0003] In accordance with an aspect of the disclosure, an audio effects control is configured to include a sensor that senses movement, for example, a change in position, orientation, acceleration or velocity of the sensor. For example, by mounting the sensor to a musical instrument, the movement may be the sensed movement associated with playing a musical instrument. Alternatively, by securing the sensor to the person playing the instrument the sensor will sense movement of part of the person to which the sensor is secured. The sensor produces an electrical signal in response to detecting the movement, or change in position or orientation, and the electrical signal is sent to an audio effects unit to control application of one or more audio effects on audio signals produced by the musical instrument. The sensor can be secured to any other item for which movement or position or orientation of the sensor can be initiated and/or controlled.
[0004] The sensor may be configured to sense any one or several phenomena. For example, the sensor may be configured to sense acceleration of the musical instrument (with the aid, for example, of an accelerometer), velocity, or alternatively a position change of the musical instrument (with the aid, for example, of a gyroscope). The position change sensed by the sensor may include any movement, or a prescribed movement such as the musical instrument or a portion of the instrument rotating about an axis or translating along an axis.
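As one concrete illustration of the accelerometer option in paragraph [0004] (a sketch of ours, not the patent's circuit): at rest a 3-axis accelerometer measures the gravity vector, so the pitch of the instrument can be estimated from the relative magnitudes of the axes. The function name and axis convention below are assumptions.

```python
import math

# Illustrative sketch: estimate pitch from a 3-axis accelerometer reading
# given in units of g. With the instrument level, gravity falls on the z
# axis; tilting the neck up or down moves the reading onto the x axis.
def pitch_degrees(ax, ay, az):
    """Pitch of the x axis above horizontal, in degrees (-90 to +90)."""
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))
```

A reading of (0, 0, 1) g (instrument level) gives 0 degrees, while (1, 0, 0) g (neck straight up) gives +90 degrees; such an angle could then drive an effect parameter or a trigger threshold.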
[0005] Various types of electrical signals may be produced by the sensor. For example, the electrical signal may be an analog signal and may be modulated for transmission from the sensor. An electrical circuit may also be provided for conditioning the electrical signal. The audio effects control also includes an audio effects unit which is responsive to the signal generated by the sensor. The electrical circuit may convert the electrical signal into a digital signal prior to transmission to the audio effects unit. The electrical circuit may also convert the electrical signal into a musical instrument digital interface (MIDI) signal.
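The conditioning-and-conversion chain in paragraph [0005] can be sketched as follows. This is an illustrative fragment, not the patent's circuit: the 0-5 V input range, the controller number (CC 1, modulation), and the function name are our assumptions. A MIDI Control Change message is three bytes: a status byte (0xB0 plus the channel), a controller number, and a 7-bit value.

```python
# Hypothetical sketch of converting a conditioned analog sensor voltage
# into a 3-byte MIDI Control Change message.
def to_midi_cc(voltage, v_max=5.0, channel=0, controller=1):
    value = max(0, min(127, round(voltage / v_max * 127)))  # quantize to 7 bits
    status = 0xB0 | (channel & 0x0F)                        # Control Change status byte
    return bytes([status, controller, value])
```

An audio effects unit that understands MIDI could then map the incoming controller value to an effect parameter.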
[0006] In various embodiments, sensing movement may include sensing acceleration of a portion of the musical instrument, sensing acceleration of a portion of a person playing the musical instrument, sensing a rotation of a portion of the musical instrument and/or sensing a rotation of a portion of a person playing the musical instrument, or sensing a translation of a portion of the musical instrument and/or sensing a translation of a portion of a person playing the musical instrument.
[0007] Additional advantages and aspects of the present disclosure will become readily apparent to those skilled in the art from the following detailed description, wherein embodiments of the present invention are shown and described, simply by way of illustration of the best mode contemplated for practicing the present invention. As will be described, the present disclosure is capable of other and different embodiments, and its several details are susceptible of modification in various obvious respects, all without departing from the spirit of the present disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as limitative.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIG. 1 is a diagrammatic view of one embodiment of an audio signal processing system that includes an instrument-mounted sensor that controls the application of audio effects to audio signals produced by a musical instrument.
[0009] FIG. 2 is a diagrammatic view of the sensor shown in FIG. 1.
[00010] FIG. 3 illustrates possible detectable movements of the instrument shown in FIG. 1.
[00011] FIG. 4 is a diagrammatic view of one embodiment of a sensor designed and configured to be hand-mounted so as to control the application of audio effects to audio signals produced by a musical instrument with movement of the hand.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0012] Referring to FIG. 1, one embodiment of the disclosed system includes a sensor 10 mounted to a guitar 12 so that the sensor is capable of sensing movements, or alternatively the position, change in position, orientation, and/or change in orientation of the guitar. Based on the sensed movement, position, or orientation of the guitar, and specifically sensor 10, a signal is produced by sensor 10 and provided over a cable or wires 14 to an audio effects unit 16. Along with the signals from sensor 10, audio effects unit 16 also receives audio signals that are produced by guitar 12 and provided, for example, over a cable or wires 18 to audio effects unit 16. Various types and combinations of audio effects may be applied by audio effects unit 16 to the audio signals produced by guitar 12. For example, the audio signals may be amplified, attenuated, distorted, reverberated, time-delayed, mixed up or down into other frequency bands, or processed with other similar effects known to one skilled in the art of conditioning audio signals so as to produce audio effects. Also, while guitar 12 is shown for producing audio signals, sensor 10 may be mounted to one or a combination of other types of musical instruments. For example, other types of string instruments (e.g., bass guitar, cello, violin, viola, etc.), brass instruments (e.g., trumpets, saxophones, etc.), woodwind instruments (e.g., clarinets, etc.), percussion instruments, keyboard instruments, or other types of instruments or collections of instruments may be used to produce audible signals. Further, the term musical instrument also includes devices that sense vocal signals. For example, sensor 10 may be mounted onto a microphone so as to sense the movement, orientation or position of the microphone. By detecting the movement, position or orientation of the microphone, a signal produced by sensor 10 may be used to control the application of audio effects to the audio signals (e.g., vocal signals) received by the microphone.
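Among the effects listed in paragraph [0012], a time delay is simple to sketch. The fragment below is a minimal illustration of ours, not the patent's processor: it mixes each sample with an attenuated copy of the signal from a fixed number of samples earlier, i.e. y[n] = x[n] + g * x[n - D].

```python
# Minimal sketch of a time-delay effect: mix the dry signal with a
# delayed, attenuated copy of itself (a single-tap delay line).
def apply_delay(samples, delay, gain=0.5):
    """Return samples with a delayed copy (delay in samples) mixed in at the given gain."""
    out = list(samples)
    for n in range(delay, len(samples)):
        out[n] += gain * samples[n - delay]
    return out
```

A real effects unit would run this per-block in real time and expose the delay and gain as the kind of parameters a sensor-derived control signal could adjust.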
[0013] When playing the instrument, a musician may intentionally move guitar 12 in a particular manner such that sensor 10 senses the movement and sends a control signal over cable 14 to audio effects unit 16. Upon receiving the control signal, one or more predefined special audio effects are applied in a controlled manner to the audio signals that are provided over cable 18 from guitar 12. The control signal from sensor 10 may provide various types of control to the application of the audio effects. For example, the control signal may initiate the application of one or more audio effects. By providing this trigger from the control signal, the musician is free to apply an effect from any location rather than, e.g., having to seek out and step on a foot-pedal. Other types of audio effect control may be provided by the control signal. For example, rather than providing a discrete trigger signal to initiate (or halt) application of one or more effects, a variable control signal (analog or digital) may be produced by sensor 10. The variable signal may be used to dynamically control various aspects of the audio effects. For example, the variable control signal may be used to adjust the resonant frequency of an audio effect or other similar parameter.
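One workable scheme for the discrete trigger described in paragraph [0013] (an assumption of ours, not the patent's design) is to toggle the effect whenever the sensed acceleration magnitude crosses a threshold, re-arming only after the motion settles so that a single deliberate jolt produces a single toggle:

```python
# Sketch of a discrete trigger: toggle an effect on or off when the
# sensed acceleration magnitude exceeds a threshold. The _armed flag
# prevents one sustained jolt from toggling repeatedly.
class EffectTrigger:
    def __init__(self, threshold=2.0):
        self.threshold = threshold
        self.active = False     # current on/off state of the effect
        self._armed = True      # re-arm only after the motion settles

    def update(self, accel_magnitude):
        """Feed one acceleration sample; return the resulting effect state."""
        if self._armed and accel_magnitude > self.threshold:
            self.active = not self.active
            self._armed = False
        elif accel_magnitude < self.threshold:
            self._armed = True
        return self.active
```

The variable-control alternative in the same paragraph would instead forward the raw (or scaled) sensor value continuously, rather than collapsing it to a toggle.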
[0014] In this illustrative example, after the audio effects are applied, the audio signals are sent over a cable 20 to an amplifier/speaker 22 that broadcasts the signals. As suggested, to halt the application of the audio effects, in some arrangements the musician may intentionally move guitar 12 in another manner such that the movement is detected by the sensor 10. Based on the detected movement, another trigger signal is sent over cable 14 to audio effects unit 16. Upon receiving this second trigger signal, application of the audio effects may be halted or different audio effects may be applied. Alternatively, the audio effects may last a predetermined time period before ending. In another arrangement the audio effects may continue until a cue is provided from the music, e.g., there is a pause or halt in the music, or a particular note is played. In addition, one or more of the audio effects applied to the music can be applied in a fade in and/or fade out fashion.
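The fade-in/fade-out behaviour mentioned in paragraph [0014] can be sketched as a gain envelope over a predetermined fade time. The linear shape and the function name below are our assumptions; a real unit might use an exponential or equal-power curve instead.

```python
# Sketch of a fade envelope: the effect's gain ramps linearly from 0 to 1
# (fade in) or 1 to 0 (fade out) over fade_time seconds, then holds.
def fade_gain(t, fade_time, fading_in=True):
    """Gain in [0, 1] at time t seconds into a fade lasting fade_time seconds."""
    frac = max(0.0, min(1.0, t / fade_time))
    return frac if fading_in else 1.0 - frac
```

Multiplying the effect's wet signal by this gain each block yields the gradual onset or release described above.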
[0015] Referring to FIG. 2, sensor 10 includes a sensing device 24 that senses the movement of the sensor (and correspondingly the movement of guitar 12). Various sensing techniques known to one skilled in the art of transducers may be implemented in sensing device 24. In one example, sensing device 24 may include an accelerometer that senses acceleration (i.e., rate of change of velocity with respect to time) in one or more directions, and produces an electrical signal as a function of the sensed acceleration. Alternatively or in addition, one or more gyroscopes may also be included in sensing device 24. By including an inertial device such as a gyroscope, a change in attitude (e.g., pitch rate, roll rate, and yaw rate) of sensor 10 may be detected and an electrical signal produced as a function of the sensed attitude change. Other types of sensors that detect change in position, change in velocity, or change in acceleration may be included in sensing device 24. For example, a pressure sensor (e.g., piezoelectric sensor, ceramic sensor, etc.) mounted on guitar 12 or incorporated into a pick used to play guitar 12 may be used as a sensing device. Sensor 10 may also include multiple sensing devices. For example, one sensing device may be dedicated to detecting motion along one axis and another sensing device may be dedicated to detecting motion about a second axis of rotation.
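A minimal sketch of how an accelerometer-based sensing device 24 might distinguish an intentional movement from the instrument at rest: readings are assumed to be in units of g, and the 1.5 g threshold is an arbitrary illustrative value; real signal conditioning and debouncing would be handled by interface circuit 28.

```python
import math

def detect_movement(samples, threshold_g=1.5):
    """Return True if an accelerometer sample stream contains an
    intentional movement, judged by thresholding the acceleration
    magnitude against threshold_g.

    samples: iterable of (ax, ay, az) readings in units of g; a
    stationary sensor reads a magnitude near 1.0 g (gravity)."""
    for ax, ay, az in samples:
        if math.sqrt(ax * ax + ay * ay + az * az) > threshold_g:
            return True
    return False
```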
[0016] As illustrated in Fig. 2, sensing device 24 is preferably connected (via a conductor 26) to an interface circuit 28 that prepares the electrical signal produced by the sensing device for transmission. For example, interface circuit 28 may include circuitry for filtering, amplifying, or performing other similar functions on the electrical signal provided over conductor 26. In this example, once the electrical signal is conditioned for transmission, a conductor 30 provides the conditioned signal to cable 14 for delivery to audio effects unit 16. Besides using hard-wire connections to provide the signal to audio effects unit 16, other signal transmission techniques known to one of skill in the art of electronics and telecommunications may be implemented. For example, interface circuit 28 may include wireless technology such as a wireless transmitter or transceiver for transmitting the signals produced by sensing device 24 over a wireless link. Various types of wireless technology, such as radio frequency (RF), infrared (IR), etc., may be implemented in interface circuit 28 and the audio effects unit 16. Furthermore, in some arrangements a combination of hard-wire and wireless technology may be implemented in interface circuit 28 and audio effects unit 16. Interface circuit 28 may also include circuitry configured and arranged so as to transfer the signals into another domain. For example, an analog signal produced by sensing device 24 may be converted into a digital signal by an analog-to-digital converter included in interface circuit 28. Modulation techniques may also be provided by interface circuit 28. For example, the signals produced by the sensing device 24 may be amplitude, phase, frequency, and/or polarization modulated in the analog or digital domain. In one particular example, the signals produced by interface circuit 28 are pulse-width modulated. Interface circuit 28 may encode the signals that are transmitted to audio effects unit 16.
For example, the signals may be encoded to comply with particular formats such as the musical instrument digital interface (MIDI) format. In one implementation, movement sensed by sensing device 24 may be translated into MIDI control signals for bending pitch or modulating the audio signal from the instrument. By producing these control signals from the sensing device, effects are controlled through the movement of sensing device 24 rather than with the common pitch bend and modulation knobs on a synthesizer.
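As a sketch of the MIDI encoding mentioned here, a normalized sensor reading can be packed into a standard 3-byte MIDI pitch-bend message: a status byte (0xE0 plus the channel number) followed by the 14-bit bend value split into two 7-bit data bytes. The -1.0 to +1.0 normalization of the sensor reading is an assumption; the message layout follows the MIDI 1.0 specification.

```python
def sensor_to_pitch_bend(sensor_value, channel=0):
    """Encode a normalized sensor reading (-1.0 .. +1.0) as a 3-byte
    MIDI pitch-bend message, one way interface circuit 28 could
    translate movement into MIDI control signals."""
    sensor_value = min(max(sensor_value, -1.0), 1.0)
    bend = int(round((sensor_value + 1.0) / 2.0 * 16383))  # 14-bit value
    lsb = bend & 0x7F             # low 7 bits (first data byte)
    msb = (bend >> 7) & 0x7F      # high 7 bits (second data byte)
    return bytes([0xE0 | (channel & 0x0F), lsb, msb])
```

A centered sensor (reading 0.0) encodes to the no-bend value 8192, i.e., data bytes 0x00 0x40.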
[0017] Referring to FIG. 3, one set of potential movements of guitar 12 that might be sensed by sensor 10 and initiate signal generation by the sensor are illustrated as an example of how the system operates. To assist the illustration, three axes 32, 34, and 36 are shown in a right-handed rectangular coordinate system. In this example, sensor 10 is capable of sensing rotation of guitar 12 about any one of axes 32, 34, or 36. For example, if guitar 12 is "pitched" about axis 32 (through the pitch angle shown in the figure), a signal is produced by sensor 10 and is transmitted to audio effects unit 16. Guitar 12 may also be "rolled" about axis 36 or "yawed" about axis 34 (through the corresponding roll and yaw angles shown in the figure), and a signal is produced by sensor 10.
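Pitch and roll of the instrument can be estimated from a static 3-axis accelerometer reading of gravity, one common way a sensing device could recover rotations like those shown in FIG. 3. The axis-to-x/y/z assignment below is an assumption for illustration and need not correspond to axes 32, 34, and 36.

```python
import math

def pitch_and_roll(ax, ay, az):
    """Estimate pitch and roll angles (in degrees) from a static 3-axis
    accelerometer reading of gravity, in units of g.

    Assumes x points along the instrument neck, y to the side, and z
    upward when the instrument is level (an illustrative convention)."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A reading of (0, 0, 1) g, gravity straight along z, gives zero pitch and roll; tipping the neck upward drives the pitch estimate toward 90 degrees.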
[0018] Along with detecting the rotation of guitar 12, other movements may be sensed and initiate generation of an electrical signal by sensor 10. For example, sensor 10 may include a gyroscope or other device for sensing the orientation of the sensor, or the sensor 10 may be capable of sensing translation of the guitar. By incorporating a global positioning system (GPS) receiver in sensor 10, for example, a signal may be produced as the position of the guitar changes as the musician moves. A laser system may also be incorporated into sensor 10 to sense position changes of the guitar relative to one or more reflective surfaces (e.g., a polished floor, wall, ceiling, etc.).
[0019] By sensing these rotational, orientation, and/or translational changes, the signals produced by sensor 10 may be used by audio effects unit 16 to control the application of one or more audio effects to the musical tones produced by guitar 12. For example, the performer may intentionally move the guitar to apply an audio effect known as a "wah-wah" effect. This type of effect is generated by sweeping the resonant frequency of a filter (which may be included in audio effects unit 16). As guitar 12 changes position, the corresponding signals produced by sensor 10 control the application of the audio effect. For example, guitar 12 may initially be oriented downward (in the "-y" direction) along axis 34, and the signal produced by sensor 10 controls the application of the audio effect at the low resonant frequency (e.g., 200 Hz) of the filter. As guitar 12 is rotated toward an upward vertical position (oriented in the "+y" direction) along axis 34, the signals produced by sensor 10 control the application of the audio effect across the frequency spectrum of the filter to an upper resonant frequency (e.g., 4000 Hz). This "wah-wah" effect (or another effect) may also be applied as guitar 12 is rotated about any of the axes (e.g., axis 32, 34, or 36) shown in the figure. Also, sensor 10 may control the application of this effect as guitar 12 is translated (e.g., carried by the performer across a stage), or the orientation of the guitar is changed, or the guitar is otherwise moved so that the sensor responds.
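The wah-wah filter whose resonant frequency is swept by the sensor signal could be realized as a resonant band-pass biquad. The sketch below computes its coefficients using the widely used RBJ cookbook formulation; the 44.1 kHz sample rate and the Q value are illustrative assumptions, and the disclosure does not specify a particular filter topology.

```python
import math

def bandpass_coeffs(fc, fs=44100.0, q=4.0):
    """Coefficients of a resonant band-pass biquad (RBJ cookbook form)
    whose center frequency fc, in Hz, is steered by the sensor signal.

    Returns (b, a): feedforward and feedback coefficient lists, with
    the leading feedback coefficient a[0] normalized to 1.0."""
    w0 = 2.0 * math.pi * fc / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha
    b = [alpha / a0, 0.0, -alpha / a0]
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0]
    return b, a
```

Recomputing the coefficients each control tick while fc is swept from 200 Hz to 4000 Hz by the sensor signal produces the resonant sweep described above.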
[0020] Along with or in lieu of attaching sensor 10 to the instrument (e.g., guitar 12), one or more sensors may also be attached to the performer playing the instrument. An example is shown in FIG. 4. In this arrangement, sensor 10 is attached to the back of the performer's hand 38. To hold sensor 10 in place and not interfere with the musician's playing of guitar 12, a wrist strap 40 and a finger loop 42 provide tie points to the musician's hand 38. Sensor 10 is attached to a strap 44 that is connected between wrist strap 40 and finger loop 42. Various types of material may be used to produce wrist strap 40, finger loop 42, and strap 44. For example, flexible material such as neoprene or nylon may be used to hold sensor 10. Other types of attachment mechanisms known to one skilled in the art of clothing design or clothing accessories may be implemented to secure sensor 10 to the musician.
[0021] While sensor 10 is attached to the performer in FIG. 4, and not the instrument, the sensor functions in a similar manner. In the example shown in FIG. 4, changes in position, velocity, acceleration, and/or orientation of the musician's hand may be detected and used to produce a control signal. The signal may be used to control the application of audio effects by audio effects unit 16. Similar to detecting movements of an instrument, with sensor 10 attached to the musician's hand, various hand movements may be detected. For example, a control signal may be produced if the performer rotates his or her hand about axis 32, axis 34, or axis 36 (through the corresponding angles shown in the figure).
[0022] By attaching sensor 10 to the performer, movement may be better controlled. For example, the performer may trigger a "wah-wah" audio effect by pointing his or her hand toward the ground (along the "-y" direction of axis 34) to apply the audio effect at the low resonant frequency (e.g., 200 Hz) of a filter. Then, the performer may rotate his or her arm about axis 32 and point the hand toward the ceiling (along the "+y" direction of axis 34). While making this motion, signals produced by sensor 10 may control the application of the audio effect across the frequency spectrum of the filter to the upper resonant frequency (e.g., 4000 Hz). Other types of audio effects may also be controlled based on the motion of the musician's hand.
[0023] In the illustrated example of FIG. 4, the signals generated by sensor 10 are provided to audio effects unit 16 over cable 14. However, wireless circuitry (e.g., RF, IR, etc.) may be implemented in sensor 10 to remove the need for cable 14 and increase the mobility of the performer as he or she plays guitar 12 (or another instrument).

[0024] While this example describes attaching sensor 10 to the musician's hand, in other arrangements the sensor may be attached elsewhere to the musician. For example, sensor 10 may be incorporated into an arm-band or attached to a piece of the musician's clothing or costume. Additionally, multiple sensors may be attached to the musician for producing multiple signals that may be used to control the application of one or more audio effects by audio effects unit 16. By incorporating one or more of these sensors onto the performer or onto the instrument played by the performer, musical performances are improved since the performer is free to move anywhere on stage and trigger the application of audio effects.
[0025] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, prescribed movements of the sensor are described as producing the control signal for producing the audio effect. It is also possible to have multiple sensors for producing different audio effects. A system can also be provided wherein different prescribed movements of a sensor produce different audio effects. Further, while audio effects unit 16 is shown as a standalone unit, it may be connected to a computerized system, or alternatively be embodied as a software program run entirely on a computerized system. As such, the signals generated by the sensor or sensors would be received and processed by the computerized system before being used to drive one or more loudspeakers, such as speaker 22 in the illustrated embodiment shown in FIG. 1. Accordingly, other implementations are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. An audio effects control for controlling the application of special audio effects applied to an audio signal, comprising: a sensor configured to sense movement associated with the generation of the audio signal, wherein the sensor produces a control signal in response to detecting the movement, and the control signal is transmitted to an audio effects unit to control application of an audio effect on an audio signal.
2. The audio effects control of claim 1, wherein the sensor includes an accelerometer.
3. The audio effects control of claim 1, wherein the sensor includes a gyroscope.
4. The audio effects control of claim 1, wherein the sensor is mounted to a musical instrument that generates the audio signal.
5. The audio effects control of claim 1, wherein the sensor is attached to a portion of a musician playing a musical instrument that generates the audio signal.
6. The audio effects control of claim 5, wherein the sensor is attached to the hand of a musician playing the musical instrument.
7. The audio effects control of claim 6, wherein the sensor is configured as a ring worn on the hand of the musician playing a guitar.
8. The audio effects control of claim 6, wherein the sensor is a pressure sensor incorporated into a pick held by the hand of the musician and used to play a guitar.
9. The audio effects control of claim 1, wherein the sensor is configured and positioned to sense acceleration as the musical instrument is moved.
10. The audio effects control of claim 1, wherein the sensor is configured and positioned to sense a position change of the musical instrument.
11. The audio effects control of claim 10, wherein the position change includes rotating the musical instrument about an axis.
12. The audio effects control of claim 10, wherein the position change includes translating the musical instrument along an axis.
13. The audio effects control of claim 1, wherein the sensor is configured and positioned to sense acceleration of the sensor configured to be secured to a musician playing the musical instrument.
14. The audio effects control of claim 1, wherein the sensor is configured and positioned to sense a position change of the sensor configured to be secured to a musician playing the musical instrument.
15. The audio effects control of claim 14, wherein the position change includes a portion of the musician rotating about an axis.
16. The audio effects control of claim 14, wherein the position change includes a portion of the musician translating along an axis.
17. The audio effects control of claim 1, wherein the control signal is an analog signal.
18. The audio effects control of claim 1, wherein the control signal is a modulated signal.
19. The audio effects control of claim 1, further comprising: an electrical circuit configured to condition the electrical signal sent to the audio effects unit.
20. The audio effects control of claim 19, wherein the electrical circuit is configured to convert the electrical signal into a digital signal prior to transmission to the audio effects unit.
21. The audio effects control of claim 19, wherein the electrical circuit is configured to convert the electrical signal into a musical instrument digital interface (MIDI) signal.
22. A method of controlling the application of an audio effect on an audio signal, comprising: sensing movement of a sensor while the audio signal is being generated; producing a signal representative of the sensed movement; and transmitting the signal to an audio effects unit so as to control application of an audio effect on an audio signal.
23. The method of claim 22, wherein sensing movement of the sensor includes sensing acceleration of a portion of a musical instrument producing the audio signals and to which the sensor is secured.
24. The method of claim 23, wherein sensing movement includes sensing acceleration of the sensor secured to a portion of a person playing a musical instrument producing the audio signals.
25. The method of claim 24, wherein sensing movement of the sensor includes sensing a rotation of a portion of a musical instrument producing the audio signals and to which the sensor is secured.
26. The method of claim 24, wherein sensing movement of the sensor includes sensing a rotation of the sensor secured to a portion of a person playing a musical instrument producing the audio signals.
27. The method of claim 22, wherein sensing movement of the sensor includes sensing a translation of a portion of a musical instrument producing the audio signals.
28. The method of claim 22, wherein sensing movement of the sensor includes sensing a translation of the sensor secured to a portion of a person playing a musical instrument producing the audio signals.
29. A method for producing an audio signal, the method comprising: receiving a first signal; detecting a modification signal produced from a motion activated sensor that is in contact with a moving portion of an individual; modifying the first signal with the modification signal to produce a second signal; and producing an audio signal from the second signal.
30. The method of claim 29 further comprising: in response to detecting that movement of the portion of the individual is in a first direction, producing a first modification signal causing modification of the first signal in accordance with a first pattern; and in response to detecting that movement of the portion of the individual is in a second direction, producing a second modification signal causing modification of the first signal in accordance with a second pattern.
31. The method of claim 30 wherein the first direction and the second direction are substantially opposite from each other and wherein the first pattern and second pattern modify the first signal in a substantially opposite manner.
32. A method comprising: receiving a first signal; detecting a second signal produced by a sensor that monitors motion associated with a source originating the first signal; and applying an audio effect to the first signal in accordance with the second signal to modify the first signal.
33. A method as in claim 32, wherein detecting the second signal includes detecting an orientation of the sensor, which is attached to a musician originating the first signal.
34. A method as in claim 32, wherein detecting the second signal includes detecting an acceleration associated with the sensor, which is attached to a musician originating the first signal.
35. A method as in claim 32, wherein detecting the second signal includes detecting motion associated with the sensor, which is attached to a musical instrument originating the first signal.
36. A method as in claim 32, wherein detecting the second signal includes monitoring the sensor to detect motion associated with a musical instrument originating the first signal.
37. A method as in claim 32, wherein detecting the second signal includes monitoring at least one of the following attributes associated with the sensor: a) an acceleration parameter associated with the sensor, b) a position of the sensor in a three-dimensional space, c) an orientation of the sensor, d) rotation of the sensor about an axis, e) translation of the sensor along an axis, and f) velocity of the sensor.
38. A method as in claim 32, wherein detecting the second signal includes monitoring an output of the sensor to detect a respective motion associated with a microphone device originating the first signal.
39. A method as in claim 32, wherein detecting the second signal includes monitoring the second signal to detect occurrence of a first discrete trigger signal produced by the sensor, the occurrence of the discrete trigger signal indicating to apply a first audio effect to the first signal for purposes of producing an audible output.
40. A method as in claim 39, upon detecting an occurrence of a second trigger signal produced by the sensor, modifying application of the first audio effect to the first signal for purposes of producing the audible output.
41. A method as in claim 40, wherein modifying application of the first respective audio effect to the first signal includes terminating application of the first audio effect to the first signal and applying a second audio effect to the first signal for purposes of producing the audible output.
42. A method as in claim 39, upon detecting an occurrence of the first trigger signal, initiating application of the first audio effect to the first signal for purposes of modifying the first signal for a predetermined duration of time.
43. A method as in claim 32, wherein applying the audio effect to the first signal includes at least one of: amplification, attenuation, distortion, reverberation, time delaying, up mixing, down mixing of the first signal into other frequency bands for purposes of producing an audible output.
44. A method as in claim 32, wherein detecting the second signal includes monitoring motion associated with the sensor that is affixed to a musician originating the audio signal; and wherein applying the audio effect includes producing a wah-wah effect on the first signal in accordance with detected motion associated with the sensor.
45. A method as in claim 32, wherein detecting the second signal includes detecting a range of motion associated with the sensor while the sensor is affixed to the source generating the audio-based signal; and wherein applying the audio effect includes applying a spectrum of different types of audio effects as the sensor is swept through the range of motion.
46. A method comprising: receiving an audio-based signal originating from a source; detecting a first control signal produced by a first sensor that monitors a first type of movement associated with the source originating the audio-based signal; detecting a second control signal produced by a second sensor that monitors a second type of movement associated with the source originating the audio-based signal; and applying audio effects to the audio-based signal in accordance with the first signal and the second signal to modify the audio-based signal.
47. A method as in claim 46, wherein applying the audio effects to the audio-based signal includes: applying a first audio effect to the audio-based signal in accordance with the first control signal; and applying a second audio effect to the audio-based signal in accordance with the second control signal.
48. A method as in claim 46, wherein different prescribed movements associated with the first sensor and the second sensor result in application of different audio effects to the audio-based signal.
49. A method as in claim 46, wherein the first sensor and the second sensor are mounted in a wireless device attached to a person playing an instrument, and wherein detecting a first control signal and detecting a second control signal comprise: receiving at least one wireless signal transmitted from the wireless device, the at least one wireless signal including an encoding of the first control signal and the second control signal.
50. A method as in claim 32, wherein receiving the first signal comprises: receiving a signal from an instrument being played by a person; wherein detecting a second signal produced by a sensor that monitors motion associated with a source originating the first signal comprises: receiving the second signal as a wireless signal transmitted from the sensor, the sensor being a motion activated accelerometer mounted on an extremity of a person playing the instrument, the wireless signal including information indicating an amount of acceleration of the sensor while mounted on the extremity of the person playing the instrument.
51. The method of claim 29 wherein detecting a modification signal produced from a motion activated sensor that is in contact with a moving portion of an individual comprises: detecting the modification signal produced from a pressure sensor incorporated into a pick used to play a guitar.
52. The method of claim 51 wherein detecting the modification signal produced from a pressure sensor incorporated into a pick used to play a guitar comprises: detecting modification signals from multiple pressure sensing devices incorporated into the pick used to play the guitar.
PCT/US2006/021952 2005-06-06 2006-06-06 Method of and system for controlling audio effects WO2006133207A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/709,953 US7667129B2 (en) 2005-06-06 2007-02-23 Controlling audio effects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/145,872 US7339107B2 (en) 2005-06-06 2005-06-06 Method of and system for controlling audio effects
US11/145,872 2005-06-06

Publications (2)

Publication Number Publication Date
WO2006133207A2 true WO2006133207A2 (en) 2006-12-14
WO2006133207A3 WO2006133207A3 (en) 2007-07-26

Family

ID=37492837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/021952 WO2006133207A2 (en) 2005-06-06 2006-06-06 Method of and system for controlling audio effects

Country Status (2)

Country Link
US (1) US7339107B2 (en)
WO (1) WO2006133207A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012170344A2 (en) * 2011-06-07 2012-12-13 University Of Florida Research Foundation, Inc. Modular wireless sensor network for musical instruments and user interfaces for use therewith
WO2023070142A1 (en) 2021-10-28 2023-05-04 Birdkids Gmbh Portable digital audio device for capturing user interactions

Families Citing this family (21)

Publication number Priority date Publication date Assignee Title
US7667129B2 (en) * 2005-06-06 2010-02-23 Source Audio Llc Controlling audio effects
US20070175322A1 (en) * 2006-02-02 2007-08-02 Xpresense Llc RF-based dynamic remote control device based on generating and sensing of electrical field in vicinity of the operator
DE102008020340B4 (en) * 2008-04-18 2010-03-18 Hochschule Magdeburg-Stendal (Fh) Gesture-controlled MIDI instrument
US7982124B1 (en) * 2009-02-03 2011-07-19 Landis John A Wireless guitar synthesizer
US8461468B2 (en) 2009-10-30 2013-06-11 Mattel, Inc. Multidirectional switch and toy including a multidirectional switch
FR2962248A1 (en) * 2010-07-05 2012-01-06 Nozier Delly Jeune Wireless electric guitar device for e.g. musician, has guitar transmitting and receiving infrared signal, external receiver transmitting signal and receiving infrared signal emitted by electronic frame, and insert receiving battery
JP6007476B2 (en) * 2011-02-28 2016-10-12 カシオ計算機株式会社 Performance device and electronic musical instrument
US8426719B2 (en) * 2011-05-25 2013-04-23 Inmusic Brands, Inc. Keytar controller with percussion pads and accelerometer
US20130058507A1 (en) * 2011-08-31 2013-03-07 The Tc Group A/S Method for transferring data to a musical signal processor
US8609973B2 (en) 2011-11-16 2013-12-17 CleanStage LLC Audio effects controller for musicians
US8907201B2 (en) 2012-10-12 2014-12-09 Lars Otto Jensen Device for producing percussive sounds
US9349360B2 (en) * 2012-11-08 2016-05-24 Markus Oliver HUMMEL Accelerometer and gyroscope controlled tone effects for use with electric instruments
US9520116B2 (en) * 2012-11-08 2016-12-13 Markus Oliver HUMMEL Universal effects carrier
US9147382B2 (en) * 2012-11-27 2015-09-29 Capacitron, Llc Electronic guitar pick and method
FR3008217A1 (en) * 2013-07-04 2015-01-09 Lucas Daniel Sharp MOTION DETECTION MODULE FOR MUSICAL INSTRUMENTS
US9761211B2 (en) * 2013-08-09 2017-09-12 Viditar, Inc. Detachable controller device for musical instruments
US9640151B2 (en) * 2015-06-12 2017-05-02 Pickatto LLC Instrument plectrum and system for providing technique feedback and statistical information to a user
KR20170019651A (en) * 2015-08-12 2017-02-22 삼성전자주식회사 Method and electronic device for providing sound
WO2017029663A1 (en) * 2015-08-19 2017-02-23 Kaiden Instruments Ltd Percussion device and system for stringed instrument
US11030985B2 (en) * 2018-11-27 2021-06-08 Algorhythm Technologies Inc. Musical instrument special effects device
CN111739494A (en) * 2020-05-26 2020-10-02 孙华 Electronic musical instrument with intelligent algorithm capable of blowing transversely and vertically

Citations (1)

Publication number Priority date Publication date Assignee Title
US20030196542A1 (en) * 2002-04-16 2003-10-23 Harrison Shelton E. Guitar effects control system, method and devices

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US6150947A (en) * 1999-09-08 2000-11-21 Shima; James Michael Programmable motion-sensitive sound effects device
US6995310B1 (en) * 2001-07-18 2006-02-07 Emusicsystem Method and apparatus for sensing and displaying tablature associated with a stringed musical instrument
US7521623B2 (en) * 2004-11-24 2009-04-21 Apple Inc. Music synchronization arrangement
US7060887B2 (en) * 2003-04-12 2006-06-13 Brian Pangrle Virtual instrument
US7135638B2 (en) * 2003-11-25 2006-11-14 Lloyd R. Baggs Dynamic magnetic pickup for stringed instruments
US20060060068A1 (en) * 2004-08-27 2006-03-23 Samsung Electronics Co., Ltd. Apparatus and method for controlling music play in mobile communication terminal


Cited By (5)

Publication number Priority date Publication date Assignee Title
WO2012170344A2 (en) * 2011-06-07 2012-12-13 University Of Florida Research Foundation, Inc. Modular wireless sensor network for musical instruments and user interfaces for use therewith
WO2012170344A3 (en) * 2011-06-07 2013-01-31 University Of Florida Research Foundation, Inc. Modular wireless sensor network for musical instruments and user interfaces for use therewith
US9269340B2 (en) 2011-06-07 2016-02-23 University Of Florida Research Foundation, Incorporated Modular wireless sensor network for musical instruments and user interfaces for use therewith
US9542920B2 (en) 2011-06-07 2017-01-10 University Of Florida Research Foundation, Incorporated Modular wireless sensor network for musical instruments and user interfaces for use therewith
WO2023070142A1 (en) 2021-10-28 2023-05-04 Birdkids Gmbh Portable digital audio device for capturing user interactions

Also Published As

Publication number Publication date
US7339107B2 (en) 2008-03-04
WO2006133207A3 (en) 2007-07-26
US20060272489A1 (en) 2006-12-07

Similar Documents

Publication Publication Date Title
US7339107B2 (en) Method of and system for controlling audio effects
US7667129B2 (en) Controlling audio effects
JP3915257B2 (en) Karaoke equipment
EP3564947B1 (en) Apparatus for a reed instrument
JP2808617B2 (en) Electronic musical instrument
US6441293B1 (en) System for generating percussion sounds from stringed instruments
US6846980B2 (en) Electronic-acoustic guitar with enhanced sound, chord and melody creation system
JP6552413B2 (en) Synthesizer using bi-directional transmission
JP5664581B2 (en) Musical sound generating apparatus, musical sound generating method and program
US9761211B2 (en) Detachable controller device for musical instruments
JP2007256736A (en) Electric musical instrument
US9875729B2 (en) Electronic mute for musical instrument
JP6024077B2 (en) Signal transmitting apparatus and signal processing apparatus
JP6176372B2 (en) Musical instruments and signal processing devices
JP4147840B2 (en) Mobile phone equipment
JPH0675571A (en) Electronic musical instrument
CN110875029A (en) Pickup and pickup method
JP2012013725A (en) Musical performance system and electronic musical instrument
Farwell Adapting the trombone: a suite of electro-acoustic interventions for the piece Rouse
JP2010066640A (en) Musical instrument
CN211980190U (en) Pickup
KR101063941B1 (en) Musical equipment system for synchronizing setting of musical instrument play, and digital musical instrument maintaining the synchronized setting of musical instrument play
JP2921483B2 (en) Performance parameter control device for electronic musical instruments
KR200408301Y1 (en) Cellular phone with musical instrument
KR200407380Y1 (en) Personal handheld terminal with musical instrument

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06772317

Country of ref document: EP

Kind code of ref document: A2