Create a one-person, integrated, DJ/Projection mapping/live performance environment

Material Information

Create a one-person, integrated, DJ/Projection mapping/live performance environment
Liccardi, Dominick Louis ( author )
Place of Publication:
Denver, CO
University of Colorado Denver
Publication Date:
Physical Description:
1 electronic file (59 pages)


Subjects / Keywords:
Entertainment computing ( lcsh )
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )


For the live music observer, there is often a disconnect between visualizations and the performer. Music venues and nightclubs often use pre-programmed lights and videos, or an additional VJ performer, for visual stimulation during the music performance. Using this automated strategy, or multiple DJ/VJ performers, creates a detachment between the audience and the performer. This technical detachment can be distracting for patrons that are intently following the musical performance. In an effort to give artists another option for controlling visualizations, I designed and built an integrated and efficient live music/visualization performance environment, using affordable technology and open source software, that can be performed from one modest computer. This integrated design intricately coordinates the visualizations with the music and is controlled by the performer in real-time. Functionality of the environment includes real time control using external controllers, reactive control using a microphone or audio effects, and specific control using automation from specific sounds within the music performance. By incorporating direct and automated control into the system, I established a technique that is functional and fun to play. Using technology and innovative design, I've introduced a viable option for artists to express themselves musically and visually, immersing the audience in an interactive and dynamic musical experience, controlled exclusively by the artist, using one computer.
Thesis (M.S.)--University of Colorado Denver, 2018.
Includes bibliographical references.
System Details:
System requirements: Adobe Reader.
Statement of Responsibility:
by Dominick Louis Liccardi.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
Copyright Dominick Louis Liccardi. Permission granted to University of Colorado Denver to digitize and display this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Resource Identifier:
on10894 ( NOTIS )
1089449758 ( OCLC )



Full Text


CREATE A ONE-PERSON, INTEGRATED, DJ/PROJECTION MAPPING/LIVE PERFORMANCE ENVIRONMENT

by
DOMINICK LOUIS LICCIARDI
B.A., Louisiana State University, 2016

A thesis submitted to the Faculty of the Graduate School of the University of Colorado in partial fulfillment of the requirements for the degree of Master of Science, Recording Arts Program, 2018




! """ This thesis for the Master of Science degree by Dominick Louis Licciardi has been approved for the Recording Arts Program by Leslie Gaston Bird Chair Leslie Gaston Bird Advisor Lorne Bregitzer, Advisor Jeff Merkel, Advisor Date: May 12, 2018


! "# Licciardi, Dominick Louis ( MS Recording Arts) Create a One Person, Integrated, DJ/Projection M apping/ Live Performance E nvironment Thesis directed by Associate Professor Leslie Gaston Bird ABSTRACT For the live music observer there is often a disconnect between visualizations and th e performer. M usic v enues and nightclubs often use pre progr ammed lights and videos or an additional VJ performer for visual stimulation during the music performance Using t his automated strategy or multiple DJ/VJ performers creates a detachment between th e audience and the performer This technical detachment can be distracting for patrons that are intently following the musical performance In an effort to give artists another option for controlling visualizations I design ed and built a n integrated and efficient live music /visualization performance environment using affordable technology and open source software that can be performed from one modest computer This integrated design intric ately coordinates the visualizations with the music and is controlled by t he performer in real time Functionality of the environment in clude s real time control using external controller s re active control using a microphone or audio effects and specific control using automation from specific sounds within the music performance. By incorporating direct and automated control into the system, I established a technique that is functional and fun to play. Using t echnology and innovative design I've introduced a viable option for artists to express themselves music ally and visually, immersing the audience in an interactive and dynamic musical experience control led exclusively by the artist using one computer The form and content of this abstract are approved. I recommend its publication. Approved: Leslie Gaston Bird


TABLE OF CONTENTS

CHAPTER
I. THE NEED FOR AFFORDABLE INTEGRATED MUSIC ENVIRONMENTS 1
II. DESIGN AND BUILD OF A PROJECTION MEGASTRUCTURE 6
Stage Design and Build 6
Nudge, Jitter, and Interactivity 9
Testing and Pre-Production 14
Steps to Prepare Software for Performance 16
III. THE USEFULNESS OF DYNAMIC VISUALIZATIONS WITH MUSIC 17
Overall Impression of Integrated Environment 17
Effectiveness of Environment for the Observer 18
Effectiveness of Environment for the Performer 18
IV. REAL WORLD APPLICATIONS 20
Multiple Applications vs. Jitter Only 20
Future Research: Sensors and Arduino 20
Significance of this Work 22
REFERENCES 23


CHAPTER I
THE NEED FOR AFFORDABLE INTEGRATED MUSIC ENVIRONMENTS

Introduction

The goal of this thesis project is to design and build an integrated performance system so that individual performers can express themselves affordably, using "off the shelf" technology in an innovative way. There are a few artists, including Amon Tobin and Richie Hawtin, that incorporate immersive 3D visuals synced to their music (VSquaredLabs, 2016; "Plastikman," 2010). These productions are large scale, staffed by teams of designers, 3D animators, engineers, and road crews, and in many cases the teams design their own software for the heavily produced shows. These large-scale immersive productions are very effective, but they are expensive to produce and the artist doesn't have complete control during the show. Throughout my research I have investigated many nightclubs, bars, YouTube channels, and theaters and found that many venues and artists that incorporate visuals into their productions use prepared graphics that are not synced to the music. In the examples I reviewed, the music is often completely out of step with the visuals. I notice this detachment mostly in nightclubs. The nightclub will go to great expense to fly in musical talent to perform but will use the same ineffective visuals they use for every show.


These automated graphics techniques can be effective for a short time, but eventually the detached nature of the visuals can be distracting and take away from the show. The integrated environment I created to battle these distractions can be set up by one person and ready for show time in one hour. This cuts down costs and makes the production easy to set up between other acts. During the musical performance, the artist produces, arranges, and controls every aspect of the graphics and music with real-time processing. The live performance visuals environment, hosted by Ableton and processed with Jitter, allows the performer to precisely sync the music to the visuals using audio reactivity, automation, MIDI, and onboard effects like EQs and side-chain compressors. This is an interactive environment a music patron can experience, not just watch and listen to. Unlike expensive festival technology, my setup can be replicated for about $3,000 or less using one computer. This makes the technology especially attractive to solo artists and producers who want to give their fans an extra experience without taking out a loan or buying additional computer systems. This project, although designed for the stage, can be scaled down and easily used in small settings like coffee houses, bars, theaters, or school art installations. This gives the public the freedom to experience this innovative technology outside the crowded and expensive festival environment.
Image 1: Test Day 1, February 27, 2018, King Center.


There is ample literature on projection mapping, interactive environments, and live performance control, but few publications incorporate all these topics into one project. Projection mapping is a relatively new and popular technique to engage audiences at events and to create a spectacle that customers will remember. For example, one article mentions an early use of the technology in an innovative way: "the famed 'Tupac Shakur hologram' at Coachella a few years ago was really a more advanced application of projection mapping technology that used hidden projectors and invisible surfaces to create the illusion of a hologram" (Pixel and Projection Mapping Can Transform Any Surface into a Screen for Dynamic Audience Engagement, 2018). Early researchers also implemented a projection-mapped David Bowie at an event in Washington, D.C. that reportedly produced an impressive image (Pixel and Projection Mapping Can Transform Any Surface into a Screen for Dynamic Audience Engagement, 2018). These projection mapping techniques by early researchers illustrate how the technology can bring a static space to life, but they neglect real-time control and interactivity. Interactivity is also an important aspect of this portfolio project. With the growing popularity of DIY computers, Arduino, and Raspberry Pi, modern publications are documenting the rise of interactive spaces at music events. One of the new technologies allows venues to incorporate interactivity into their wristbands to track the actions of concertgoers and to enhance the musical experience. "But interactivity will not solely be the purview of wristbands in the concert of the future. Even the light show will allow for audience participation. Event technologies company Cantora is developing an as-yet-unnamed software that takes the 'ebbs and flows' of an audience to alter the physical venue" (Rosenfeld, 2014). Rosenfeld documents


the interest in sensors and interactive light control at music events back in 2014, but stops short of incorporating these technologies into one dynamic performance system. Incorporating audio/video live performance control into one dynamic system that functions like an instrument itself, run by one computer, was the biggest challenge of this project. Operating the environment like an instrument, in real time, expands creativity and allows the performer to express emotion in a new way. "It can be stated here that it is possible to create a musical instrument materialized in an immersive virtual environment that can open up new ways of creative thinking and for musical performance, which allows for the use of traditional, actual and new musical elements and responding to different kinds of users with different skills and different artistic visions" (Valbom & Marcos, 2005). Similar to my portfolio project is a project called WAVE, which incorporates live performance control into a virtual reality (VR) environment. Unlike my portfolio project, which uses a MIDI controller, automation, and audio reactivity on one computer, WAVE uses an innovative gesture control system with 3D sensors and multiple computers. "The system is based on gesture interaction with real time sound synthesis and sample playing, while preserving sound quality and creative freedom, together with facilitated access for wider audiences. WAVE integrates a structure for playing, creating and modifying musical and sound structures depending on the overall context of the intentions and intrinsic purposes of the users, as well as their musical skills. It allows cognitive real time musical decisions using a low cost system" (Valbom & Marcos, 2005). The developers of WAVE designed an interactive immersive experience for one person and one person only, using VR and multiple computers.
While this is an amazing feat, I chose to create a more interactive environment for a crowd of people, using only one computer.


My project documents how to construct an efficient and integrated music and visualization system that operates like an instrument and is fun to play. By designing and building original components for functionality and visualizations, and by using current applications in innovative ways, I introduce a technique that is scalable and runs efficiently from one modest computer: an early-2011 17" MacBook Pro with an SSD and 16 GB of RAM.


CHAPTER II
DESIGN AND BUILD OF A PROJECTION MEGASTRUCTURE

Materials and Methods

Stage Design and Build

I designed and built an integrated performance environment consisting of a 20 ft. wide x 9 ft. tall stage structure using 27 white shipping boxes. My initial idea was to have some type of reflective projection surface that could span the width of a stage and rise high above the performer.
Image 2: Initial idea.
After testing several materials for effective projection mapping, including white stretch mesh, plain white sheets, and white boxes, I concluded that the white boxes are the most effective and easiest to transport. I first tested white stretch mesh and white sheets and learned that I might also need a tinker-toy-type skeleton structure erected on stage to stretch or mount the fabrics. These types of structures can be expensive, very time-consuming to build, and heavy during setup and breakdown. In addition to this extra expense for the structure, I also discovered that these


materials weren't reflective enough for my purposes. I needed HD capability, and these soft surfaces did not provide the resolution I needed to create an immersive experience. I next tested plain white boxes. Fortunately, I found a local supplier, Centennial Container, that sold white boxes, so I bought 10 26x26x20 custom boxes and started testing in my kitchen. Boxes can be heavy for shipping, so the local supplier was key to the design and to keeping expenses affordable. After extensive testing in my small testing area, I hoped that the boxes would scale up effectively for the performance at the King Center Recital Hall. The boxes seemed very reflective in my small space, and I hoped that they would perform the same way in my large-scale experiments. Another benefit of boxes is that they fold up flat for easy transportation, and they already have a structure for easy stage manipulation and projection. Once I decided on the white boxes, I was prepared to scale up the number of boxes for the full stage show. I initially bought the custom boxes because they were overstocks and cheaper for testing purposes, but when I went back to get more of these specific boxes, they were out of stock. I was therefore forced to get other sized boxes. In the long run I think it worked out better, because the different sized boxes add depth and shape to the stage structure. Since my original boxes were out of stock, I bought an additional 20 24x24x24 single-wall white boxes to be used on either side of the performer and 5 24x24x48 single-wall white boxes for the top of the structure and the front of the stage. The original 10 26x26x20 boxes make up the core of the structure, with five boxes stacked on either side of the performer, 48" apart. This core structure is topped with one of the large 24x24x48 boxes, enclosing the artist on the left, right, and top. The other 20 boxes were deployed on either side of the artist (4 boxes, then 3, then 2, etc.) for a stepped effect, situated in a semicircle stretched across the stage. Once the materials


were chosen and the megastructure was purchased, testing at the Recital Hall and construction of the full structure could begin. The beauty of boxes, and especially of different-sized boxes, is that you can experiment with any configuration you can imagine. I chose a stepped effect, but you can experiment with any shapes you can imagine. I designed the stage setup for the King Center using only one box stage configuration, but depending on your needs you can have several different configurations pre-planned for different venues. The portability and endless mapping configurations of the stage give the artist options and make the structure fun to use and easy to transport.
Image 3: 20x10 stage structure folds into 3x6 for transport.


Nudge, Jitter, and Interactivity

The three major software challenges of this project were beat matching in Ableton, coordinating the graphics from Jitter to the projection mapping technology using one computer, and controlling the system in a dynamic and fun way for the artist. As a DJ, my main technique for performing is beat matching. This is accomplished with DJ software with at least two listening decks. By listening to digital turntable deck A on the main sound system and matching the beats per minute (BPM) on digital turntable deck B in my headphones, I can match the songs' tempos for easy mixing. I then gradually raise the volume of deck B into deck A as the songs begin to mix, and slowly bring the volume of deck A lower for a seamless transition. By default, Ableton Live does not support beat matching as a technique for live performance. Fortunately, Max 4 Live, hosted by Ableton, allows the user to accomplish tasks that are not normally accessible in the Ableton software. For this dilemma I designed a Max 4 Live patch called "Nudge" that gives the user access to the two values that allow for effective beat matching in Ableton: the global quantization value and the clip nudge value. Normally, the clip nudge value is dictated by the global quantization value, which is typically set to 1 bar for easy cueing of clips. This Ableton default setting is not usable for beat matching because each press of the nudge button moves the cursor in the audio clip by 1 bar instead of the 2-3 ms required for beat matching. By using the Ableton API object in Max 4 Live, I re-classified the 1-bar global quantization value as a toggle that can switch between 1 bar and 2-3 ms, and I assigned the toggle to a button on the Traktor F1 controller for real-time BPM nudge control. This adjustment allows for precise beat matching by slowing or speeding the clip by 2-3 ms.
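The quantization-toggle logic described above can be sketched in a few lines. This is an illustrative Python model of the behavior, not the actual Max 4 Live patch: the class name, the 128 BPM tempo, and the 2.5 ms fine step are assumptions chosen for the example.

```python
# Hypothetical sketch of the "Nudge" idea (not the real Max 4 Live patch):
# one toggle switches between coarse 1-bar cueing and a fine ~2.5 ms nudge.

BPM = 128                          # assumed tempo for the example
BAR_MS = 4 * 60_000 / BPM          # one 4/4 bar in milliseconds (1875 ms)

class Nudge:
    def __init__(self):
        self.fine_mode = False     # toggled from a controller button
        self.position_ms = 0.0     # playback cursor within the clip

    def toggle(self):
        """Switch between 1-bar quantization and fine beat-match nudge."""
        self.fine_mode = not self.fine_mode

    def step(self):
        """Amount one nudge press moves the cursor, in ms."""
        return 2.5 if self.fine_mode else BAR_MS

    def nudge(self, direction=+1):
        self.position_ms += direction * self.step()
        return self.position_ms

deck = Nudge()
deck.nudge(+1)   # coarse mode: jumps a full bar (1875 ms at 128 BPM)
deck.toggle()
deck.nudge(-1)   # fine mode: shifts back 2.5 ms for beat matching
```

The key point mirrored from the patch is that the same button press means something different depending on the toggle state, which is what makes one controller button usable for both cueing and beat matching.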


Image 4: Nudge: apply vinyl-like nudge to audio clips.
The next challenge was to efficiently manage graphics creation and projection using one computer. For efficient management of interactive graphics, the visuals environment was designed using Jitter. Jitter is the graphics software included with Max 4 Live and is hosted inside Ableton Live for efficient graphics processing. Ableton Live is the centerpiece of this project, managing all audio and video and acting as a host for Max 4 Live, Jitter, and projection. As a novice to Jitter and graphics, I focused on three areas of interest: learning Jitter, taking apart existing Jitter patches to see how they work, and adding usable user interfaces to control patches in unique ways. (Patches are programs that you design in, or that come with, Max 4 Live.) I started my Jitter learning adventure by working through and taking notes on all the tutorials that come with the Jitter documentation inside Max 4 Live (Cycling '74, 2014). The onboard tutorials are a digital textbook of information, with interactive tutorials and example patches that help you learn how the software operates. I also completed several tutorials online that helped direct my vision for the graphics and helped me understand the inner workings of several complex patches. For this project, I designed two original graphics patches and made considerable improvements to one other Max 4 Live device called Ganz Graf.


Not every programmer is a graphic designer, so it's important for programmers to study and explore existing graphics patches to see how other experts accomplish tasks. Sometimes these patches contain little nuggets of ideas that can blossom into something you make your own in your own patches. For example, the original Ganz Graf patch has only six parameters: a patch On button, a color On button, and X and Y axis positions (which are for info only and useless for control). The patch is static, with some audio reactivity. You can hover your mouse over the image to move it, which looks really cool, but nobody wants to use a mouse while performing. For this patch, I added several parameters for more control and interest. I added an LFO to the color ON/OFF as well as a way to automate the color to colors other than red. I also added LFOs to the X/Y position of the graphics by locating the functional object that controls X/Y inside the patch. The LFOs are preset at different speeds to move the graphics quickly or slowly in any direction at the touch of a button, in sync with the BPM. This new X/Y control eliminates the need to hover the mouse over the image to move it and creates multiple automatable parameters that didn't exist in the original patch. The three patches I used for this project can be used on any track inside Ableton Live, and most of their parameters can be automated or assigned to external controllers. The graphic user interfaces for the patches can be controlled with an external MIDI controller for full control of the performance graphics in real time. I used the Traktor Kontrol F1 controller, giving each patch multiple assignable buttons, knobs, and sliders for full dynamic control throughout the performance.
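The tempo-synced X/Y LFOs described above can be modeled in a short sketch. This is not the Jitter implementation: the sine waveform, the particular rates, and the function names are assumptions chosen to illustrate the idea of beat-locked position modulation.

```python
# Illustrative sketch of a tempo-synced LFO driving an X/Y position,
# in the spirit of the modified Ganz Graf patch (names/rates assumed).
import math

def lfo(beat, rate=1.0, depth=1.0, phase=0.0):
    """Sine LFO locked to the beat count; rate is in cycles per beat."""
    return depth * math.sin(2 * math.pi * (rate * beat + phase))

def xy_position(beat):
    """X/Y offsets for the graphics, both synced to the BPM grid."""
    x = lfo(beat, rate=0.25)             # one full sweep every 4 beats
    y = lfo(beat, rate=0.5, phase=0.25)  # faster motion, offset phase
    return x, y
```

Because the LFO phase is derived from the beat count rather than wall-clock time, the motion stays in sync with the music no matter when a button engages it, which is the property the thesis attributes to the added X/Y control.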


Image 5: Modified Ganz Graf patch with added functionality.
Projecting the graphics in an efficient way using one computer was the next big problem to solve. I initially experimented with building the entire graphics and projection system using just Jitter, but after preliminary testing this technique slowed down my computer considerably, and once I added music it made the project unworkable. I spent several weeks stuck trying to figure out how to make the visuals and music run more efficiently using just Jitter, but I couldn't figure it out. I am convinced that the entire system can be run exclusively through Jitter, but it will take further testing to design a working system. To work around this processing issue, I needed to do two fundamental things to make the system work: share images between applications in an efficient way, and project images to surfaces in an efficient way. Efficiency is key because I am also performing the music from the same computer, so I need all the applications to run optimally for the system to perform properly. Finally, after weeks of deep internet searching, I found an efficient way to share images between applications called the Syphon Server. The Syphon Server is an open source application for sharing images between applications, similar to how Soundflower shares audio between applications. This was the first fundamental application I needed to start the full project and to make simultaneous music performance possible. For my purposes, the Syphon Server came with a Jitter plug-in (


syphonserver) that allowed Jitter to communicate video output to the Syphon Server. For performance purposes, I tested this plug-in extensively and I couldn't break it; it just works. I tested the Syphon Server running 3 patches on 3 different channel strips in Ableton. I ran it for sixty hours straight with no issues. I also plugged and unplugged the Syphon/Jitter object in unusual ways, and the Syphon Server application receiving the signal never failed. This testing showed that the Syphon Server was stable for my purposes, that it usefully organized videos that used the Syphon/Jitter object, and that it efficiently shared the 3 patches I used for the project with the projection software, Madmapper. The next piece of the puzzle was to find projection software that could communicate with the Syphon Server. After investigating options for projection mapping, the first product I tested, Madmapper, was the one that ended up working the best. Not only was Madmapper advertised with ultra-fast graphics, it was a "Syphon enabled" application, meaning it was already coded to receive images from the Syphon/Jitter plug-in. According to their website, Syphon has partnered with many companies, including software development, creative coding, live coding, game engine, and VJ software companies, for their unique and efficient image sharing capabilities. This capability worked perfectly for my purposes. Through extensive experimentation, the sharing software and the mapping software communicated optimally, allowing me to perform music and visuals simultaneously in a live setting. Finally, I needed to implement interactivity for the artist. I wanted to make the system fun to use but not overwhelming for the artist. The artist could choose between direct control using an external controller or reactive automated control using several different tools. Direct control was intended for use between songs, so the artist could concentrate more on the visual performance than the music. Using a Traktor F1 controller with pads, knobs, and sliders, the


artist has precision control over multiple parameters. Reactive automated control of the visuals was intended for use when the artist was concentrating more on the music. The visuals were automated on individual tracks using audio, and a later update added an envelope follower, MIDI control, and an EQ for more precise control of individual sounds. By using audio to automate visual events, the artist emphasized particular melodies, harmonies, and basses. An envelope follower and EQ were added and used to fine-tune particular sounds for visual emphasis. MIDI control was also added, and the artist was able to play the visuals like an instrument on the Push 1 controller. This added functionality gave the artist more freedom to perform music while also having some control of the visuals at the same time.

Pre-Production and Testing

After assembling the appropriate patches and buying pre-production materials, I received four testing and performance dates at the King Center Recital Hall. The testing dates were from 2-6 PM on 2/27/18, 3/15/18, and 3/29/18, and the final performance date was 4/13/18. I used each test date to set a goal for the project. The first test goal, on 2/27/18, was to assess the overall materials and usability of the full structure in that space. After I built the structure for the first time, I was surprised at how good the graphics looked and at how big the structure feels in the space. I projected the images at 1080p HD, and the images were crisp and detailed. The structure spanned almost 20 feet across the stage and took up most of the stage. At 9-10 feet tall, it was an imposing structure. As soon as I turned on the visuals, it pulled me in and made me want to watch the graphics unfold. The first configuration of the boxes was a square with boxes at different depths, similar in appearance to music studio sound mitigation. The system worked, but I wasn't getting the visual effects I wanted. The visuals looked stunning and the Jitter patches were performing optimally, but the boxes just weren't arranged properly. It looked too much like


a movie screen. After testing and arranging in Madmapper for two weeks at home, I came up with the current stepped configuration, which I started testing on 3/15. For the 3/15 session, in addition to fine-tuning the stage setup, I also took the opportunity to test the three Jitter patches I was preparing. I tested the visuals patches by running the visuals and the music at the same time and trying to overload the system by adding tracks, clips, or additional graphics patches. This way I was able to test the efficiency of the system as a whole and also test its limits. After careful testing, it appeared that the system could efficiently run the 3 patches and my music files for the final performance. I also learned that the computer's limits were 7 visuals patches and 10 tracks running music at the same time. Since I never approached these limits during my testing, I didn't have to worry about approaching them during the performance. For the last test date, 3/29/18, I worked on finalizing the functionality of the Jitter patches and started more precisely shaping the music content for the final performance. I also took this opportunity to use the entire stage setup on stage instead of testing the visuals from the audience. This was a good chance to get a feel for the environment and learn how to interact with the controllers and the visuals from the stage. For the final testing performance on 4/13/18, the system behaved as designed. I learned that this complex system requires much more practice than one full day to utilize its intended functionality and to generate meaningful creativity. After the performance, Professor Lorne Bregitzer demonstrated how to implement Ableton Live's Envelope Follower plug-in on the visuals Max patch. The plug-in allowed me to have more precise automated control of the patch and significantly improved the synchronization of audio and visuals. A notch EQ filter was also added for precision selection of specific sounds, and MIDI control was implemented to enhance the effectiveness of the visuals patches.
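The envelope-follower idea can be illustrated with a short sketch. This is not Ableton's Envelope Follower implementation; it is a generic one-pole amplitude tracker written in Python, with coefficient values chosen as assumptions for the example.

```python
# Minimal envelope-follower sketch: tracks the amplitude contour of a
# signal so loud events (e.g. a kick drum) can drive a visual parameter.
# Coefficients here are illustrative, not Ableton's.

def envelope_follower(samples, attack=0.9, release=0.999):
    """Return the smoothed amplitude envelope of `samples`.

    A lower attack coefficient lets the envelope jump up quickly;
    a higher release coefficient makes it decay slowly afterwards.
    """
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        # Use the fast coefficient when the signal rises, slow when it falls.
        coef = attack if level > env else release
        env = coef * env + (1.0 - coef) * level
        out.append(env)
    return out

# A silent stretch, a loud burst, then silence: the envelope rises on the
# burst and decays gradually, which is what lets a transient trigger a
# visual flash that fades naturally instead of cutting off.
signal = [0.0] * 5 + [1.0] * 5 + [0.0] * 20
env = envelope_follower(signal)
```

Pairing a tracker like this with a band-pass or notch EQ in front of it is what allows the automation to respond to one specific sound (a kick, a snare) rather than the whole mix, matching the refinement described above.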


Steps to Prepare Software for Performance

Software required: Ableton, Max 4 Live, Madmapper, and Syphon Server. I assume the reader has experience installing and using these applications.
1) Open an Ableton project.
2) Load a visualization Max 4 Live Jitter patch with the Jitter plug-in [syphonserver] object attached to the video output.
3) Open Madmapper or any Syphon-enabled mapping software; the input source choices for Madmapper show the live Jitter feed from the Syphon Server.
4) In Madmapper, map the live Jitter feed to any projection surface in the interface.
5) You can now perform graphics and music through Ableton and projection map through Madmapper.


CHAPTER III
THE USEFULNESS OF DYNAMIC VISUALIZATIONS WITH MUSIC

Results

Overall Impression of Environment

After my first successful test of the immersive environment on February 27, 2018 in the King Center, I was surprised at the usability and effectiveness of the environment. The white surfaces of the boxes reflected the images brightly, even with the house lights on. I initially thought I would need the room to be nearly blacked out during the show, but the surfaces were so bright that the house lights could be safely dimmed during the show. I was most impressed by the size and scale of the projection surfaces. The boxes take up nearly the entire width of the stage, exposing the audience to stunning visuals. During testing with the lights off, at times I felt a sense of vertigo, as if I was inside the graphics, similar to a roller coaster. My overall impression is that the more precise control an artist has of their immediate space, the better. Stunning graphics combined with dynamic artist control creates an interactive music environment that's fun to use while performing and exciting to watch from the audience.
Image 6: Test Day 1, King Center, Feb. 27, 2018.


Effectiveness of Environment for the Observer

To gauge the effectiveness of the environment, I created a short 5-question anonymous survey and got 4 students to participate. I showed the students two minutes of automated music with visuals and another two minutes of fully controlled music and visuals through the interactive environment. The questions were presented on a scale of 1 to 10, 1 being least. The survey questions were as follows:
1. Was this video engaging?
2. Did this video keep your attention?
3. Was this video interesting to watch?
4. Would you pay to see this?
5. Did you like the performance?
The results of the anonymous survey overwhelmingly favored the dynamically controlled system in every category. This feedback leads me to conclude that there is interest in more interconnected and dynamically controlled entertainment systems. Based on these four students' observations, the effectiveness of this environment from the observer's perspective is very encouraging, and I believe immersive interactive environments are worth pursuing further.

Effectiveness of Environment for the Performer

There are five questions I ask whenever I try a new piece of gear or instrument, and these questions translate to this project.
1) Is the instrument intuitive? Is the design concise, are the buttons where they should be, and do the controls do what is expected?


! $A 2) Is the controller sensitive ? Do I have to make excessive motions or movements or small movements? 3. ) Is it hard to understand or use or too dense to keep me interested? 4) Is it expressive? Can I express any emotion? 5) Did you have fun? After answering these questions about the finished system I've conclude d that this integrated enviro nment was very effective in all these categories. The intuitiveness of the controls made the system fun an d exciting to use, the graphics were compelling, and th e expression of emotion was clearly conveyed.


CHAPTER IV
REAL WORLD APPLICATIONS

Discussion

Multiple Applications vs. Jitter Only

The technique I use to share images between applications makes use of several applications, but it may be technically possible to run everything through Jitter. Jitter can technically perform the same functions that Madmapper executes, but my preliminary testing with Jitter led me to use the more efficient video handling and image processing in Madmapper. My mission was to evaluate the market and design an interactive music and visuals environment that anybody can build. It would be satisfying to figure out how to make Jitter projection map as efficiently as Madmapper, but time constraints would not allow me to pursue the question. Fortunately, Madmapper focused on my direct needs, including projection management that Jitter couldn't provide. By choosing Madmapper for projection management, I've demonstrated that it is sometimes more efficient to spread the chores of a complex interactive system across multiple applications. Although it may be possible to do all the tasks in Jitter, in my opinion there isn't enough memory in my modest laptop for Ableton to handle music, visuals, video, and projection. Although I did not prove this, my preliminary tests with Jitter tell me that it's highly unlikely that all the media would run without error on one computer.

Future Research

Affordable 360 degree projection mapping immersion is on the horizon. I wanted to include a working 360 model in this project, but the expense was too great. A 360 degree project would be possible with additional funding to purchase projection surfaces and additional projectors. Not only do you need the boxes, you need a way to transport the boxes, and you also need the space to build a 360 environment for experimentation. This was my biggest deterrent to scaling up to the 360 degree presentation. My portfolio project shows the effectiveness of an interactive audio/visual environment; expanding it to a 360 degree environment would enhance the experience even more and would be highly desirable for future gamers, ravers, and marketing projects.

Art installations, burns, and school projects are a perfect use for 360 mapping technology. Madmapper offers a product called miniMAD that coordinates and mixes multiple projectors efficiently. The really unique thing about this type of presentation is that you can use any size boxes or screens you want, depending on the space you have available and your imagination. Art installations, burns, and school projects often have limited space, and the advantage of projection mapping is that it can be as big or small as you need, including 360 projection. The added benefit of Ableton integration and MIDI control makes this technology adaptable anywhere, to anyone, in small or large environments. I never get tired of looking at interesting graphics presented in a unique way with integrated music. I hope students can use this portfolio project as inspiration to create interactive environments in the future and to expand the discussion about art and technology in society.

Another great direction for future research is the integration of sensors and embedded controllers. This remains an under-explored topic. Ableton is not only Jitter friendly, it is also Arduino friendly. My current project is automatable and controlled by the Traktor Kontrol F1 controller, but the next logical question for the digital explorer is: what else can I use to control this environment?
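One option is network control. Madmapper can be addressed over the network with OSC (Open Sound Control), the same protocol family that Max and many music tools speak, so any program that can emit an OSC datagram can become a controller. As a minimal sketch of what such a message looks like on the wire (the address path /surfaces/1/opacity is hypothetical and depends on how surfaces are named in a given mapping project, and the port is an assumption), a single-float OSC 1.0 message can be built with only the Python standard library:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC 1.0 message: padded address string,
    padded ",f" type-tag string, then one big-endian 32-bit float."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Hypothetical example: fade a mapped surface to half opacity.
msg = osc_message("/surfaces/1/opacity", 0.5)

# The datagram could then be sent with the standard library, e.g.:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 8010))
```

Because OSC is just UDP datagrams with a simple layout, this kind of control could run alongside the MIDI mapping without adding another application to the chain.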
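Since the environment is already MIDI-driven, almost anything that can emit MIDI is a candidate. As a hedged illustration (the controller number and channel here are arbitrary choices for the sketch, not values from my setup), the core of a sensor-to-MIDI mapping is just scaling a 10-bit analog reading down to MIDI's 7-bit range and wrapping it in a three-byte Control Change message:

```python
def sensor_to_cc(reading: int, controller: int = 1, channel: int = 0) -> bytes:
    """Scale a 10-bit analog sensor reading (0-1023) to a 7-bit MIDI value
    and wrap it in a 3-byte Control Change message (status byte 0xB0 | channel)."""
    if not 0 <= reading <= 1023:
        raise ValueError("expected a 10-bit reading in 0..1023")
    value = reading >> 3  # 10-bit (0..1023) -> 7-bit (0..127)
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value])

# A full-scale reading becomes CC value 127 on controller 1, channel 1:
assert sensor_to_cc(1023) == b"\xb0\x01\x7f"
```

An Arduino would perform the same shift on its analog input and push the three bytes out over USB-MIDI or serial; Ableton can then map the incoming CC to any musical or visual parameter, exactly as it maps the Kontrol F1's controls.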
Arduino gives us the option to use many different types of sensors that can be embedded in objects or instruments for the audience to use. The audience could control the music, or the visuals, or both (Rosenfeld, 2014). It's an interesting and deep concept that could be investigated endlessly. When successfully implemented, these additional sensors could serve as another point of control, really getting the audience involved in the show on a personal level. You could decide to give the audience control of the show; wouldn't that be cool?

Significance of this Work

The significance of this work is that I've designed, assembled, and documented an affordable technique for an interactive projection mapping environment for a solo artist using one computer. This technique can be used in clubs, stadiums, theaters, and libraries by musicians and visual artists. The entire structure is easily transported in a car and assembled by the artist in one hour. This type of technology opens up a whole new avenue of expression for artists. What once was too expensive and cumbersome is now affordable and transportable.


References

Cycling '74 (Producer). (2014). Jitter tutorials: Table of contents [Online documentation]. Retrieved from tutorials/jitindex

Pixel and projection mapping can transform any surface into a screen for dynamic audience engagement. (2018). Retrieved from factor/pixel-projection-mapping-can-transform-surface-screen-dynamic-audience-engagement/

Plastikman: RA's top live act of 2010. (2010, December 8).

Rosenfeld, Everett. (2014). Future of concerts: Social wearables and interactive light shows. Retrieved from of-concerts-social-wearables-and-interactive-light-shows.html

Valbom & Marcos. (2005). WAVE: Sound and music in an immersive environment. Computers and Graphics, 29(6), 871-881. Retrieved from https://www.sciencedirect.com

VSquaredLabs. (2016). Amon Tobin ISAM. Retrieved from tobin-isam/