This project develops a novel quality-adaptive, real-time media-flow architecture to support sensory/reactive environments. The Arts, Media, and Engineering (AME) Center at Arizona State University houses an advanced performance stage, known as the Intelligent Stage, that allows performers to interact with their environment in novel ways through various sensors and actuators. Within this framework, we develop an adaptive and programmable media-flow ARchitecture for Interactive Arts (ARIA) that enables the design, simulation, and execution of interactive performances. ARIA interfaces with real-time sensing components and streams various types of audio, video, and motion data. It extracts features from the streamed data, and it fuses and maps media streams onto output devices as constrained by Quality of Service (QoS) requirements. ARIA provides the choreographer with a visual design tool for describing the structure of the media-flow network, whose nodes correspond to sensors, quality-adaptive computing elements, and actuators. We develop novel media-flow management techniques, including real-time, QoS-driven media sensing, feature extraction, and data and media fusion algorithms.

The impact of the project includes a better understanding of QoS-based data- and media-stream management. ARIA offers a new medium that allows artists to integrate novel sensing, interaction, content, and response mechanisms into staged performances. ARIA also serves as a research, test, and development bed where engineering and arts students learn about and experiment with various aspects of multimedia stream management. The results obtained by this project will lead to a better understanding of quality-based data and stream management issues and will therefore influence the development of novel sensor data management technologies. The project web site is http://aria.asu.edu. ARIA is being disseminated to the public through the Arts, Media, and Engineering Center at ASU.
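The abstract does not specify ARIA's implementation, but the core idea of a media-flow network whose processing nodes trade quality for resources under a QoS budget can be illustrated with a minimal sketch. All class names, quality levels, and timing figures below are hypothetical and stand in for ARIA's actual interfaces.

```python
# Hypothetical sketch of a QoS-constrained media-flow graph; names and numbers
# are illustrative only, not ARIA's real API.
from dataclasses import dataclass, field


@dataclass
class Node:
    """A node in the media-flow network: sensor, processing element, or actuator."""
    name: str
    kind: str                                    # "sensor", "processor", or "actuator"
    cost_ms: dict = field(default_factory=dict)  # per-frame cost of each quality level


@dataclass
class MediaFlowGraph:
    nodes: list
    edges: list                                  # (upstream_name, downstream_name) pairs

    def pick_quality(self, latency_budget_ms: float) -> dict:
        """Greedily pick, per processing node, the highest quality level whose
        accumulated cost still fits the end-to-end latency budget (the QoS)."""
        choice, spent = {}, 0.0
        for node in self.nodes:
            if node.kind != "processor":
                continue
            # try quality levels from best (most expensive) to worst
            for level, cost in sorted(node.cost_ms.items(), key=lambda kv: -kv[1]):
                if spent + cost <= latency_budget_ms:
                    choice[node.name] = level
                    spent += cost
                    break
            else:
                choice[node.name] = "drop"       # cannot meet QoS: skip this stage
        return choice


if __name__ == "__main__":
    graph = MediaFlowGraph(
        nodes=[
            Node("camera", "sensor"),
            Node("motion_features", "processor", {"full": 18.0, "coarse": 6.0}),
            Node("audio_video_fusion", "processor", {"full": 25.0, "coarse": 10.0}),
            Node("projector", "actuator"),
        ],
        edges=[
            ("camera", "motion_features"),
            ("motion_features", "audio_video_fusion"),
            ("audio_video_fusion", "projector"),
        ],
    )
    # A 30 ms per-frame budget keeps feature extraction at full quality but
    # forces the fusion stage down to its coarse level.
    print(graph.pick_quality(latency_budget_ms=30.0))
```

In this sketch, the choreographer's visual design corresponds to the node and edge lists, while the quality-adaptation decision is made greedily against a single latency budget; a production system would likely balance several QoS dimensions (latency, resolution, synchronization) at once.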