Concept option 1:
Having just finished my Code of Music midterm project, a live audio-visual generator, I am thinking about continuing to refine it, not just sound-wise but also interface-wise. While playing with my own web-based instrument, I was quite happy with the overall sonic and visual reactivity and complexity, but I still struggled to make my live composition more expressive. Having the whole interface live in the browser imposes certain limitations: I could only control the audio and visuals with three on-screen sliders, a mouse-drag function, and a key-press function, and I had trouble controlling multiple parameters at the same time because there is only one mouse. So I thought: why not make it a physical interface? Not necessarily physical buttons and sliders like a MIDI controller, which has probably been made thousands of times in other ITP projects. I want to create something that makes people "dance": either a game similar to DDR, with visuals indicating what actions users should take to match the sound in order to score points, or something playing with the idea of "body as cursor", mapping different user actions to different parameters to generate sound and manipulate visuals accordingly, so that the human body acts as an instrument.
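One way the "body as cursor" idea could work is to treat each tracked body coordinate as its own virtual slider, so several parameters can move at once instead of being tied to a single mouse. A minimal sketch of that mapping, assuming normalized pose values in [0, 1] coming from some body-tracking source (the parameter names and ranges here are hypothetical placeholders, not the actual instrument's):

```javascript
// Linearly remap a value from one range to another (like p5.js map()).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + (value - inMin) * (outMax - outMin) / (inMax - inMin);
}

// Each body "axis" drives a separate parameter simultaneously,
// replacing the single mouse cursor. Pose fields are assumed
// normalized to [0, 1] by whatever tracking library is used.
function bodyToParams(pose) {
  return {
    filterFreq: mapRange(pose.rightHandY, 0, 1, 200, 2000), // hand height -> filter cutoff (Hz)
    reverbMix:  mapRange(pose.leftHandX, 0, 1, 0, 1),       // hand position -> reverb wet/dry
    tempo:      mapRange(pose.hipX, 0, 1, 60, 180),         // body sway -> tempo (BPM)
  };
}

console.log(bodyToParams({ rightHandY: 0.5, leftHandX: 0.25, hipX: 1 }));
// -> { filterFreq: 1100, reverbMix: 0.25, tempo: 180 }
```

Three independent body coordinates update three parameters in the same frame, which is exactly what one mouse cannot do.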
Concept option 2:
I have found NeoPixels really fun and intriguing to experiment with since I started attending the weekly light art club at the beginning of the semester. I have been thinking about making something with NeoPixels for my final, but I haven't had much experience working with them outside the club, so I am not yet familiar with their full potential or what I could achieve with them. It is an option, but right now it requires more research and exploration into how to make it interactive.