It’s the final week of my second-last and busiest trimester yet. Over the trimester I’ve learned and done a lot of interesting things, such as program optimisation and networking. I worked with ‘big’ data, and I wrote a couple of languages. I blogged about most of what I did at the beginning of the trimester, but I’ve fallen behind a little on the last few projects, so this will be a bit of a catch-up.
One of the projects I’ve been working on this trimester has been an eye detection library for use with Unity. The brief of the project was essentially to write something that interfaced with an ‘exotic’ input device (I’m paraphrasing here; the constraints were pretty much “no keyboards/mice/gamepads”). Because the designers had to make an ‘affective’ game, and we had a learning outcome regarding interdisciplinary work, we all decided to write APIs for their games. The project I joined required only information about whether or not the player was looking at the screen, which I initially thought didn’t sound too hard. Since webcams are pretty much everywhere (and since I didn’t think it’d be that hard), we (the other programmer on the team and I) decided to use OpenCV with webcam input to detect what we wanted. Unfortunately this took us quite a while to get working, but eventually we managed to get someone else’s project running and modify it to suit our needs.
Once we got the project to a point where it was usable, we set up a small server program that managed OpenCV and dispatched the appropriate data on request. We then set up a Unity project with a client script that started the server as a child process and received data from it, plus a sample script to show that the API worked. This went by pretty smoothly, the only hiccup being how strings are represented differently in C++ and C#, but we didn’t really need them, so that wasn’t much of a problem. Here is a test project using the API (in the video is Luke Savage, the programmer I worked with).
The next project, which I’m still working on, is a multi-user networked synthesizer, aptly named NetSynth. The goal was to create a tool that would let possibly non-musically-minded people jam together. What we ended up with instead was an awesome multi-user synthesizer that, while capable of sounding cool, also had a fantastic capacity to sound terrible. The NetSynth client has a simple two-oscillator, two-envelope setup which allows for a wide variety of sounds. To make it easier to play ‘nice’ sounding things, we decided early on that each key on the keyboard should be mapped to a degree in a scale, rather than having each key set to a specific frequency or note. This meant the scale could change at any point without the user having to change what they were doing, as they might have to if they were playing a piano. Since there are three rows of ‘letter’ keys, we set each row to a different octave of the same set of degrees. This for the most part worked well.
Each oscillator has a waveform setting (sine, square, triangle, saw or none), an octave-shift setting, a detune setting, and a pulse-width setting which is only active for square waves. This is where most of the problem was. The octave shift had a range of [-2, 2] octaves. Combine that with the top row already having an octave shift of +1, and the fact that in a pentatonic scale you pretty much get an extra octave per row, and the maximum possible octave shift was +5. That’s annoyingly high. Add saw waves with random detune settings on top and you get some terrible noises. So some work still has to be done to make sure it’s not so easy to do this.
I think it’s fair to say that it’s still not quite easy enough for non-musically minded people to use, but we still made something cool that can sound good with the right settings.