AlgoMantra, b. 2005

1/f)))$VediCrystalpunk | CryptoTantrika > ./Swaha!!
Friday, November 30, 2007
千の風になって: Become 1,000 Winds..
First, some Japanese Pizza by Pizza Hut:

Now with a full belly, to less serious matters.

The distinctly Oriental and exotic title of this post (since I'm not yet ready with anything new at the lab) comes from The Top 60 Japanese Buzzwords of 2007:

Sen no kaze ni natte ("become 1,000 winds") is the title of a song performed by opera singer Masafumi Akikawa. Based on writer Man Arai's translation of "Do Not Stand at My Grave and Weep," an English-language poem penned by an unknown author, the song has sold more than a million CDs, making Akikawa the most successful opera singer in Japanese history.


The Nipponese continue to entrance me with their super-sized human games based on video and computer gaming culture. Please watch the insanely hilarious video below of Human Tetris, and go to the original post if you really want to see the one with a sexy Russian girl instead.



Nor do you want to miss a YouTube video of Melody Roads in Japan, described thusly in The Guardian:

The concept works by using grooves, which are cut at very specific intervals in the road surface. Just as travelling over small speed bumps or road markings can emit a rumbling tone throughout a vehicle, the melody road uses the spaces between to create different notes.

Depending on how far apart the grooves are, a car moving over them will produce a series of high or low notes, enabling cunning designers to create a distinct tune.

Patent documents for the design describe it as notches "formed in a road surface so as to play a desired melody without producing simple sound or rhythm and reproduce melody-like tones".
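The arithmetic is simple enough to try yourself: each groove contributes one pressure pulse, so the pitch you hear is just vehicle speed divided by groove spacing. A quick Python sketch (the notes and the 50 km/h speed are my own illustrative choices, not the actual road's specs):

```python
# Pitch from a grooved road: frequency = speed / spacing, so to play a
# desired note, cut grooves one wavelength-of-travel apart.

NOTE_FREQ_HZ = {"A4": 440.0, "C5": 523.25, "E5": 659.25}  # equal temperament

def groove_spacing_mm(speed_kmh, freq_hz):
    """Groove spacing (mm) so a car at speed_kmh hums at freq_hz."""
    speed_ms = speed_kmh / 3.6           # km/h -> m/s
    return speed_ms / freq_hz * 1000.0   # metres per cycle -> mm

for note, f in NOTE_FREQ_HZ.items():
    print("%s: %.1f mm apart at 50 km/h" % (note, groove_spacing_mm(50, f)))
# A4: 31.6 mm, C5: 26.5 mm, E5: 21.1 mm. Drive faster and the tune goes sharp.
```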
Monday, November 12, 2007
"parab0xx": a light-controlled musical interface


On 3 November 2007, AlgoMantra Labs hosted its first public demo of parab0xx (the first since Cellphabet) for an exclusive audience. Please watch the accompanying YouTube video to see the installation as it was shown to about 25 people. It's a little dark, but you'll be able to see what's going on, and the sound is clear. We couldn't get better quality because the installation needed dim lighting to work.

News!
After we made it to Wired News, I released the source code (zip) under the GPL! Have fun.

What parab0xx is
The parab0xx is a software prototype that shows how you can interact with virtual images projected onto a simple living-room table, using a webcam. You can play games, make music, or even edit videos. In short, it is a kind of primitive computer that obeys the instructions of a simple LED (light-emitting diode). Alternatively, you can use a luminous phone screen, a candle flame, or even a piece of white paper as a cursor.
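To give a feel for how little machinery this needs, here is a minimal sketch of the cursor-detection idea in modern Python/numpy (my illustration, not the released source; the threshold is a guess you would tune to the room):

```python
import numpy as np

BRIGHTNESS_THRESHOLD = 220  # tune for your room; remember, dim lighting only

def find_cursor(frame):
    """Return (x, y) of the brightest spot in a greyscale frame (2-D numpy
    array, 0-255), or None if nothing is bright enough to be the LED."""
    if frame.max() < BRIGHTNESS_THRESHOLD:
        return None
    row, col = np.unravel_index(frame.argmax(), frame.shape)
    return int(col), int(row)
```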

Hardware Equipment & Setup
We used a Compaq laptop running Windows XP, an old Intel webcam (probably Chinese too), and a regular data projector. The projector and camera were tied to a wooden slab that supported the roof, and the image was projected vertically onto a waist-high black table on the ground. The webcam was aligned to read the projected image as it fell on the table's surface. The setup works only after nightfall, in a dim, bar-like environment.

Software
The whole thing was written in Python 2.5, running on Windows XP. (The source code has since been released under the GPL; see the News above.)

Conceptual Details
The ten orange boxes are programmed to drift slowly and perpetually in straight lines, bouncing off the edges. Apart from this, they do nothing unless somebody messes with them. The blue box closely follows any single source of light; if there are multiple sources, it sits at their centroid, the place where their geometric average falls.
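As a toy reconstruction of that behaviour (again my sketch, not the released code), the whole thing fits in a few lines:

```python
import random

WIDTH, HEIGHT, SIZE = 640, 480, 32  # play-field and box size (my guesses)

class Box:
    """An orange box: drifts slowly in a straight line, bouncing off edges."""
    def __init__(self):
        self.x = random.uniform(0, WIDTH - SIZE)
        self.y = random.uniform(0, HEIGHT - SIZE)
        self.vx = random.choice([-1, 1])
        self.vy = random.choice([-1, 1])

    def step(self):
        self.x += self.vx
        self.y += self.vy
        if not 0 <= self.x <= WIDTH - SIZE:
            self.vx = -self.vx  # bounce off a vertical edge
        if not 0 <= self.y <= HEIGHT - SIZE:
            self.vy = -self.vy  # bounce off a horizontal edge

def blue_box_target(light_points):
    """The blue box heads for the centroid of all detected light sources."""
    xs = [x for x, y in light_points]
    ys = [y for x, y in light_points]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```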

Every time the blue box touches an orange box, a 4-5 second tabla sample is triggered and the box that was hit changes direction (bounces off). If the blue box is also moving, you will normally end up triggering the sample multiple times (and the direction changes just as many times in a second). The way to trigger the sample only once is to park the blue box somewhere in the path of an orange box, take away or switch off the light, and wait for one of the orange boxes to arrive. This system allows us to create an endless number of rhythms and beats from a single, very small sample.
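Continuing the sketch above, my guess at the logic is that the sample fires once per contact episode: a blue box drifting through an orange one creates many short episodes, while a parked one gets exactly one per pass.

```python
def overlaps(a, b):
    """Axis-aligned overlap test for two boxes (top-left corners, SIZE wide)."""
    return abs(a.x - b.x) < SIZE and abs(a.y - b.y) < SIZE

touching = set()  # orange boxes currently in contact with the blue box

def check_collisions(blue, orange_boxes, play_tabla):
    """Fire the tabla sample on the first frame of each contact,
    not on every frame of overlap."""
    for box in orange_boxes:
        if overlaps(blue, box):
            if box not in touching:                 # a fresh touch
                play_tabla()                        # the 4-5 second sample
                box.vx, box.vy = -box.vx, -box.vy   # the hit box bounces away
                touching.add(box)
        else:
            touching.discard(box)
```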

You can even program this arrangement to behave in predictable ways by placing pieces of white paper on the black table. When the orange boxes periodically pass over any paper, they start behaving as light sources themselves! You can create some funky feedback patterns this way, and play a game of prediction and strategy, besides creating complex tabla rhythms.

Applications
What we have made is a type of human-computer interface, so the applications lie in any industry where humans need intuitive ways of interacting with computers: aerospace, surgery, gaming, and music are the obvious ones.

Why we made it
If you've seen the movie Minority Report, you'll remember that Tom Cruise uses light-emitting gloves to control a screen while searching through data. We thought that wasn't really science fiction and could be achieved today itself. Moreover, we wanted to explore the costs involved in simulating the technologies used by interactive surfaces like Microsoft Surface and the reacTable (used by Björk on her Volta tour).

Of course, we did not have a touch screen and millions of dollars in research funding, but decades of slavery to the QWERTY keyboard (thump! thump!) and Engelbart's ridiculous mouse were enough to motivate us. If Brian Eno wants more Africa in computers, so that they can interface with the whole human body (not just the fingertips), then we tried to put a bit of India into computing: the country of light and sound.

On that note, Happy Diwali & Season's Greetings!

For more information or business/media queries, write to: algomantra (((AT)) gmail.com


Wednesday, November 07, 2007
CamPong 1.0: Adventures in haptic control
A few months ago I was toying with motion capture using the camera on my Nokia N70, and made a very primitive prototype of the classic arcade game Pong, but with only one paddle. I thought I'd share the code now that I have a fast connection for a few days.

The camera simply tracks any black object in the periphery of the visual field against a whitish background; the blue trace of that object drives the paddle. When the paddle hits the ball, the phone says "Pong!".
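The real code talks to the PyS60 camera API, but the tracking idea itself is tiny; here it is restated in plain numpy as an illustration (the thresholds are guesses):

```python
import numpy as np

DARK_THRESHOLD = 60  # pixels darker than this count as "the black object"
EDGE_WIDTH = 40      # how many columns of the periphery to watch

def paddle_y(frame, screen_height):
    """Map the black object's height in the frame's left edge to a paddle
    position; frame is a 2-D greyscale numpy array (0-255)."""
    edge = frame[:, :EDGE_WIDTH]               # the periphery of the view
    dark = np.argwhere(edge < DARK_THRESHOLD)  # (row, col) of dark pixels
    if dark.size == 0:
        return None                            # nothing to track this frame
    mean_row = dark[:, 0].mean()
    return int(mean_row / frame.shape[0] * screen_height)
```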

So it may not work everywhere, but it served as a stepping stone for our later, far more ambitious work: the parab0xx (video releasing shortly), which we demonstrated in Bombay on 3 November 2007.

You can download the Python source code here. [LINK] Let me know if you have any problems with it by leaving a pheromone trail, err... a comment here. You need to have Python installed on your S60, okay?


Some theoretical notes:

This essay by William Bogard places my recent work into a theoretical space I'm quite satisfied with:

Presumably, if man could see what touches him, it would ease his fear of it. Canetti, like Foucault, sees visibility, optical space, as a trap; what is observed can be known and thus controlled. But he notes another way that man loses his fear of being touched, and that is simply through being touched itself.


There are two easy strategies for making the cellphone motion-aware: 1) moving the phone itself and using its video feed as the control signal, or 2) moving an object to be tracked in front of the camera (this is the one I chose for CamPong).

The cool thing about 1) is that it's a great way for the person holding the phone to communicate with it. For example, every time you shoot down a spaceship, you feel a vibration. Some of this trickery is already in the market (Wii?).

In 2) there is the elegance of dance, as Tom Cruise demonstrates in Minority Report.

AlgoMantra Labs is currently very interested in motion-detection techniques on mobile phones, specifically those that do not involve an accelerometer (the Nokia 5500 makes me dr00l, though). I do not expect gyroscopes to become ubiquitous among mobile handsets anytime soon.

While reading a paper by Drab & Artner, I came across a mention of Mozzies, a game released with the Siemens SX1 in 2003, also on the S60 platform.

The camera is used to detect the motion. The mosquitoes can be seen as they are placed on the live video feed from the camera. Aiming is done by moving the phone around so that the mosquitoes are at the crosshair.
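Under the hood, this kind of camera-only motion sensing can start from something as crude as frame differencing. A minimal sketch (mine, not from the paper; estimating the direction of motion needs the block matching that Drab & Artner actually describe):

```python
import numpy as np

def motion_amount(prev, curr):
    """Mean absolute pixel difference between two greyscale frames (0-255),
    given as 2-D numpy arrays; bigger means more movement of the phone."""
    return float(np.abs(curr.astype(np.int16) - prev.astype(np.int16)).mean())

def is_moving(prev, curr, threshold=12.0):
    """Crude motion flag; the threshold is a guess to be tuned per camera."""
    return motion_amount(prev, curr) > threshold
```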