SixthSense is quite an intriguing name, but the concept behind it is very simple
to understand. Simple, however, doesn't mean it was easy for Pranav Mistry
and his team to design and build. Pranav Mistry is a Research Assistant and
PhD candidate at the MIT Media Lab. Before joining MIT he worked as a UX
Researcher with Microsoft, and he holds a Master's in Media Arts and Sciences
from MIT and a Master of Design from IIT Bombay. His qualifications may go
over one's head, and so may his other unique inventions, like the TeleTouch
and thirdEye.
Moving on: SixthSense is a wearable gestural interface that augments the
physical world around us with digital information and lets us use natural
hand gestures to interact with that information. The prototype comprises a
pocket projector, a mirror and a camera.
The hardware components are coupled in a pendant-like wearable device. Both
the projector and the camera are connected to a mobile computing device in
the user's pocket. The projector projects visual information, enabling
surfaces, walls and physical objects around us to be used as interfaces,
while the camera recognizes and tracks the user's hand gestures and physical
objects using computer-vision techniques. The software processes the video stream
captured by the camera and tracks the locations of the colored markers
(visual tracking fiducials) on the tips of the user's fingers using
simple computer-vision techniques. The movements and arrangements of
these fiducials are interpreted into gestures that act as interaction
instructions for the projected application interfaces. The maximum
number of tracked fingers is constrained only by the number of unique
fiducials, so SixthSense also supports multi-touch and multi-user
interaction.
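To make the tracking idea concrete, here is a minimal Python sketch of how a colored fiducial might be located in a single frame. The frame format (a grid of RGB tuples), the function names and the color threshold are my own illustrative assumptions, not SixthSense's actual code, which thresholds a live camera stream:

```python
# Minimal sketch of fiducial (colored-marker) tracking: find the centroid
# of all pixels matching a marker color. Purely illustrative; a real system
# would threshold a camera stream (e.g. in HSV space) every frame.

def find_marker(frame, is_marker_color):
    """Return the centroid (x, y) of pixels matching a marker color,
    or None if the marker is not visible in the frame."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if is_marker_color(pixel):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def is_red(pixel):
    # Assumed threshold for a red marker cap worn on a fingertip.
    r, g, b = pixel
    return r > 200 and g < 80 and b < 80

# A toy 5x5 "frame" of (r, g, b) tuples with one red marker pixel.
frame = [[(0, 0, 0)] * 5 for _ in range(5)]
frame[2][3] = (255, 0, 0)   # marker at column 3, row 2
print(find_marker(frame, is_red))  # → (3.0, 2.0)
```

Tracking this centroid from frame to frame gives the fingertip trajectory that the gesture layer interprets.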
I would love to use this amazingly feature-packed and much-talked-about
gadget, as it is going to have a great impact on the 21st century. The
SixthSense prototype implements several applications that demonstrate
the usefulness, viability and flexibility of the system.
The map
application lets the user navigate a map displayed on a nearby surface
using hand gestures similar to those supported by multi-touch
systems, letting the user zoom in, zoom out or pan using intuitive hand
movements. The drawing application lets the user draw on any surface by
tracking the movements of the user's index fingertip.
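As an illustration of how two tracked fingertip positions could be turned into a zoom or pan command for the map application, here is a small sketch. The gesture names and the pixel threshold are assumptions of mine, not SixthSense's actual logic:

```python
import math

def interpret(prev, curr, threshold=5):
    """Classify the change between two frames of two marker positions.
    prev/curr: ((x1, y1), (x2, y2)) fingertip positions. Returns
    'zoom_in', 'zoom_out' or 'pan'."""
    def dist(pair):
        (x1, y1), (x2, y2) = pair
        return math.hypot(x2 - x1, y2 - y1)
    delta = dist(curr) - dist(prev)
    if delta > threshold:       # fingers moved apart
        return "zoom_in"
    if delta < -threshold:      # fingers moved together
        return "zoom_out"
    return "pan"                # fingers moved as a unit

print(interpret(((0, 0), (10, 0)), ((0, 0), (30, 0))))  # → zoom_in
```

The same distance-between-fiducials idea generalizes: with more unique marker colors, more fingers and more users can be distinguished.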
SixthSense
also recognizes the user's freehand gestures (postures). For example, the
SixthSense system implements a gestural camera that takes photos of the
scene the user is looking at by detecting the ‘framing’ gesture. The
user can stop at any surface or wall and flick through the photos they
have taken. SixthSense also lets the user draw icons or symbols in the
air using the movement of the index finger and recognizes those symbols
as interaction instructions. For example, drawing a magnifying glass
symbol takes the user to the map application or drawing an ‘@’ symbol
lets the user check their mail. The SixthSense system also augments
physical objects the user is interacting with by projecting more
information about them onto their surfaces. For example, a
newspaper can show live video news or dynamic information can be
provided on a regular piece of paper. The gesture of drawing a circle on
the user’s wrist projects an analog watch.
The current prototype system costs approximately $350 to build.