The sounds of my action — An interaction design project

Allwin Williams
8 min read · May 4, 2020


Did you know that every one of your actions, even the tiniest, has a consequence?

Photo by Ahmad Odeh on Unsplash

Are you listening to the events that result from your actions? What if you could hear sounds generated by the movement of your hand through space? That was the motivation when I started working on this project.

The first step

I have a computer science background and some knowledge of music, but still, this kind of project was entirely new to me. The first step was to figure out the requirements. On the hardware side, I thought of getting some sensor, but I didn’t know which sensors to use. After some quick online searching (asking dumb questions to Google), I decided to use an accelerometer and a gyroscope.

For those of you who are wondering, an accelerometer is a tool that measures acceleration. Acceleration is the rate of change of velocity of a body. A gyroscope is a device used for measuring or maintaining orientation and angular velocity. (Courtesy: Wikipedia)

Don’t understand🤨 any of the sciency words above? Don’t worry. Well, I don’t fully understand them either😅. Anyway, it’s just a copy-paste from the godfather Wikipedia for people who want textbook definitions (not me). Here, let me break it down for you based on my understanding. The accelerometer gives me the change in velocity (the change in how fast you move), and the gyroscope gives me the orientation (like the angle of an object, but in 3D). Both work in three dimensions (x, y, and z coordinates).


I was searching online for sensors and got an accelerometer. It wasn’t working well, and apparently it was a low-quality one (poor me😔). The good ones were too costly and exceeded my budget. But then I remembered that these sensors are available on my phone too. I installed an app to check whether they give the values the way I wanted them. They did; the values were exactly what I needed, for both the accelerometer and the gyroscope. Wow!!

I had the sensor I wanted, and yeah, it was just my phone📱. But good enough for me to move forward. Not bad, right?

Generating sounds from numbers😳

I kept my phone/sensor aside for some time (both figuratively and literally) and sat at my laptop to figure out a way to generate music from custom numerical inputs. This part was tricky too, considering it was a two-week course and I had to exhibit my work by the end of it. So I asked Google (once again), and it told me to use PureData (or just Pd), an open-source visual programming language for multimedia. It was my first time using a visual programming language; I didn’t even know such a thing existed before that day.

I started generating sounds from basic oscillators, following some examples online. Oscillators: straight out of the physics laboratory, these things basically generate sine waves. Those curvy lines in graphs?? Still don’t remember? That’s fine. An oscillator is a thing that produces a wave (which we hear as sound😅) whose frequency I could change! It sounded like “Whee Wheee Whhee Whaaaww” (depending on the frequency). It seems that’s how a pure sine wave sounds: like a siren. Anyway, it was my first win in this project. A tiny one, but still a win I would take. Excited!!!
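If you’re curious what an oscillator actually computes, here’s a tiny Python sketch (my own illustration, not anything from the actual PureData patch; the function name and numbers are made up) that generates the raw samples of a pure sine tone:

```python
import math

def sine_samples(freq_hz, duration_s, sample_rate=44100, amplitude=0.5):
    """Generate raw samples of a pure sine tone, like a basic oscillator."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 440 Hz tone (the note A4) for a tenth of a second.
tone = sine_samples(440, 0.1)
```

Feed those samples to any audio output at the same sample rate, and you get that siren-like “Wheee”; change `freq_hz` and the pitch changes.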

Little did I know about the path that awaited me.

The next part was to take custom number inputs and generate sounds based on them. I figured it out somehow, but it was too bad to listen to. Nothing made sense.

Better sounds with better software

Then, I figured out how to send the notes from PureData as MIDI (Musical Instrument Digital Interface) notes. MIDI is the digital format for musical notes (just like JPG or PNG are formats for digital images) and can be played by digital/virtual instruments. This way, I could use any of the sound-editing software packages out there. They are called DAWs (Digital Audio Workstations); just a fancy name, DAW (duh). But this needs an IAC (Inter-Application Communication) channel, which is basically like having a wire connection between different pieces of software inside the same computer; in this case, between PureData and Logic Pro X, the DAW I used for this project. As soon as I set up the IAC channel and mapped it to instruments in Logic Pro, the notes computed in PureData were received by Logic Pro and played on a virtual instrument. That sounded far better. Like, way better!! I started crying as soon as I heard it. It was soooo goooooood!!! You should have seen me at that time. I was jumping around the room.
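One nice detail about MIDI: a note is just a number from 0 to 127 (middle C is 60; A4 at 440 Hz is 69). Here’s a small Python sketch (mine, for illustration) of the standard conversion between a frequency and its MIDI note number:

```python
import math

def freq_to_midi(freq_hz):
    """Convert a frequency in Hz to the nearest MIDI note number (A4 = 440 Hz = 69)."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_freq(note):
    """Convert a MIDI note number back to its frequency in Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)
```

So any software that speaks MIDI only has to agree on these numbers; the virtual instrument decides what the note actually sounds like.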

Still, the project was far from complete, but by then I was determined to finish it no matter what.

Continuing with phone

So now I had an algorithm to convert numbers into sounds. But getting those numbers from the sensors in a phone was a task I hadn’t worked on yet. So the first thing I did was write a Python server on my laptop to receive the sensor values and send them to PureData. Why Python? Because I’m good at it and had used the same simple server setup for other projects.
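For the curious, a minimal version of such a server can be sketched with just the Python standard library (the endpoint and JSON field names below are my own illustration, not the actual project code):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

latest = {}  # most recent sensor reading received from the phone

class SensorHandler(BaseHTTPRequestHandler):
    """Accepts JSON sensor readings POSTed by the phone app."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        latest.update(json.loads(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet while readings stream in

def run(port=8000):
    """Serve forever; the phone posts readings to http://<laptop-ip>:<port>/."""
    HTTPServer(("", port), SensorHandler).serve_forever()
```

A real-time setup might prefer UDP or WebSockets over plain HTTP, but the idea is the same: the laptop sits on a known IP and port, and the phone keeps pushing readings to it.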

Next was building an Android app to send the sensor values to the Python server in real time, which had to be done in Java. I had a course on Java during my bachelor’s, but I wouldn’t consider myself good at it. For the most part, what I did was copy code from various places online and put it together as an app. It took a solid three days to get working. But it worked🥺. Finally.

OMG, it worked!! It started sending values, and I could see them in my Python server. Where is my trophy?!!

I’ll tell you how it works. Both devices (my phone and laptop) need to be on the same network (or the laptop connected to the phone’s hotspot, which is what I did for most of the development). Then my laptop has an IP address, and the server running on it has an endpoint. I gave those as inputs in my mobile app code. This way, the app constantly sends accelerometer and gyroscope readings to the endpoint, where the Python code on my laptop constantly reads them. Great, right? Yes, it was.
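The real app was in Java, but what it sends can be sketched in Python (the URL and the JSON field names here are hypothetical, just to show the shape of one reading):

```python
import json
import urllib.request

def build_payload(accel, gyro):
    """Pack one accelerometer (m/s^2) and gyroscope (rad/s) reading as JSON bytes."""
    return json.dumps({"accel": list(accel), "gyro": list(gyro)}).encode()

def send_reading(server_url, accel, gyro):
    """POST one reading to the laptop's server.

    server_url is the laptop's IP plus the endpoint, e.g.
    "http://192.168.43.5:8000/sensors" (a made-up address)."""
    req = urllib.request.Request(
        server_url,
        data=build_payload(accel, gyro),  # a Request with data defaults to POST
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The app calls the equivalent of `send_reading` every time the sensor listener fires, so the laptop sees a steady stream of readings.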

Finishing off

Finally, everything was working. Sensor values went from the phone to Python, and from Python to PureData; PureData sent the computed MIDI notes to Logic Pro X, and Logic Pro X played them on a virtual instrument. But PureData wasn’t doing the work I intended it to. So I simply removed it and wrote the same logic in Python itself. One less thing to worry about. Sigh.
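I won’t reproduce the actual mapping here, but a hypothetical version of that “sensor numbers to MIDI pitch” logic could look like this (the range and scaling constants are my own assumptions):

```python
import math

def reading_to_pitch(ax, ay, az, low=48, high=84):
    """Map the magnitude of an accelerometer reading to a MIDI pitch.

    A hypothetical mapping: faster movement gives a higher note,
    clamped to a playable range (C3..C6 by default)."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    # Assume readings rarely exceed ~20 m/s^2; scale into [0, 1].
    scaled = min(magnitude / 20.0, 1.0)
    return low + round(scaled * (high - low))
```

Whatever pitch this produces is then sent out as a MIDI note on the IAC channel for Logic Pro to play.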

The final setup consisted of three things: an Android application (to send sensor values), a Python server (to receive them and compute music notes based on the values), and a Logic Pro setting (I just mapped the IAC channel and picked an instrument to play).

A basic diagram representing how the whole thing works

Oh, and a name. I wanted a name for the project. The working name was AppSense. But the project had evolved so much that I wanted a name to represent the concept, not just the app. So I decided on one quickly, the day before the exhibit: “ACTunes”. A funny one, thinking about it now. But it did make sense, like act (or action) to make tunes. To me, the name gave a generic vibe that could still be used if the project evolved beyond any particular app (say, with standalone sensors and such).

The exhibit😬😎

Was the exhibit exciting? You bet. It was sooooo exciting!!!! There were all kinds of displays by many people across a range of electives. And mine was there too😁😎. I had set up my phone in front of my laptop and speakers. People could pick it up and move it around to produce sound. Believe it or not, people just loved it. Some even took my phone and started dancing around the place. Yeah, you could say it was worth the two weeks of effort. Felt so nice.

A demonstration from the exhibit

Yes, and that’s me in the video!!

It was a wholesome experience for me, as well as for everyone else who attended the elective.

Going forward…

What’s next after this? I’ve given it some thought lately. My first thought was a band with sensors on both hands. That way, the hands would be free, and the two hands could create two different sounds at the same time. But later, I got to know about Kinect sensors and similar devices. Kinect is a line of motion-sensing input devices produced by Microsoft. It makes mapping the space and tracking the human skeleton easier. So that’s the next thing: no phone, no band, just a sensor at a distance tracking hand movement and generating sounds from it. Let’s see. I’ll post a continuation to this story once I complete it.


A mobile app gets sensor values and sends them to a Python server running on a laptop on the same network as the phone. The Python server processes the data using mathematical functions and generates a MIDI pitch (a musical note), which is sent to an IAC channel. Any music software can access this channel to play sound; in this case, Logic Pro X, a DAW, plays the generated notes.

Feedback is very much appreciated.😁

The code for the project can be found at:

Android application:

Python server:

This project was done as part of the 2-week open elective module ‘R U Listening?’ offered during January 2020 at the National Institute of Design, Ahmedabad, India, with Mr. Hitesh Chaurasia as the course faculty and Mr. Dishari Chakraborty as the co-faculty.

Mr. Hitesh Chaurasia, the course faculty for ‘R U Listening?’, helped us understand what sound is, how it works, and how to connect with sounds emotionally. He was extremely helpful in idea generation and in setting up the exhibits to display our work.

Mr. Dishari Chakraborty, our co-faculty, with his deep knowledge of music, helped us take a peek into Indian classical as well as Western music. He also helped in creating soundtracks for the display and in selecting instruments for the exhibit.

Along with them, all the course mates were very helpful and supportive, sharing their knowledge and having discussions as we learned and explored more about sound together.