Arduino maker project - GET


GET is a project that connects your senses, habits and behaviours with your everyday tools and mobile apps. GET is a bracelet that gives you access to and control over those tools through an invisible, intuitive interface smart enough to recognize your gestures and interpret your intent. GET gives you the easiest and fastest access to any notification, with both text and voice content, and it preserves your privacy while relying on very inexpensive technology.

A crowdfunding campaign has been successfully launched to let the team finance and manage the project and turn the prototype into a market-ready device. For more detail about the project and the campaign, head to Eppela's platform.


What have you made?

I developed software that lets the user execute the device's different functions, giving a clear reading of incoming and outgoing values and reliable monitoring of how they are interpreted and executed. Everything is connected to the electronics installed inside GET. Equally important was its design, and building the prototype proved an essential step. GET aims to remove every superfluous sensor, screen or gadget and achieve the right balance between function and necessity: a design pleasant, balanced and wearable in any circumstance.

What gave you the initial inspiration?

I have always found human–computer interaction fascinating. My goal was to simplify and simplify again, to minimise the impact of hi-tech devices on our daily routines. This project gave me the chance to test new kinds of interaction that require no physical contact with the object, and to imagine a hypothetical future where the perfect interface is the body itself…

What is the original idea behind this project?

Originally it was an interactive installation in which the user could access audio-visual content through the senses, but I soon realized I was only looking for a pretext to use bone conduction. That need, together with a consultancy I was doing in parallel on another project, provided the missing ingredients: the need had arisen for an intuitive, non-invasive way of access that would improve the experience and give the user more possibilities for interaction.

The new goal then became to create an invisible interface allowing access to and control of N devices and N apps with the simplest gestures.

How does it work?

As with every mobile device, you can deliberately open specific applications and functions, or you can simply receive notifications. Nothing changes with GET, which has an "active mode", where the user requests a function, and a "passive mode", where GET delivers its notifications to the user.

Active Mode
1. Activation: the user performs the activation gesture (such as a quick double semi-rotation of the wrist to the right) and receives a vibration as feedback, indicating successful activation. At this point the device waits for the next gesture-command, which activates the function the user wants to access.

2. Execution: when the preset gesture-command is performed and recognized, the NEWS function is activated.

3. Reading: the user lifts the arm, places a fingertip in the ear, and can listen to the notification in complete privacy.
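The three active-mode steps above amount to a small state machine: idle until the activation gesture, armed until a gesture-command, then reading. This is a minimal sketch of that flow in plain C++; the gesture names, states and transitions are my assumptions, not GET's actual firmware.

```cpp
#include <cassert>

// Hypothetical gesture identifiers (the real firmware's set is not public).
enum Gesture { WRIST_DOUBLE_ROTATION, NEWS_GESTURE, ARM_LIFT, OTHER };

enum State { IDLE, ARMED, READING };

struct GetDevice {
    State state = IDLE;
    bool vibrated = false;  // activation feedback pulse was fired

    // Feed one recognized gesture; returns the resulting state.
    State onGesture(Gesture g) {
        switch (state) {
            case IDLE:
                if (g == WRIST_DOUBLE_ROTATION) {  // activation gesture
                    vibrated = true;               // confirm with a vibration
                    state = ARMED;
                }
                break;
            case ARMED:
                if (g == NEWS_GESTURE)             // preset gesture-command
                    state = READING;               // NEWS plays over bone conduction
                break;
            case READING:
                state = IDLE;                      // session over, back to idle
                break;
        }
        return state;
    }
};
```

Keeping the recognizer (which gestures occurred) separate from this state machine (what they mean right now) is what lets the same wrist rotation be an activation in one state and be ignored in another.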

Passive mode
1. Receiving a notification: a customized vibration signals the arrival of a new Tweet.
2. In this case the device activates automatically, letting the user access the notification simply by lifting the arm or, in the case of a call, reject it with another gesture.
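Passive mode hinges on two lookups: which vibration pattern announces which notification, and how a follow-up gesture is interpreted. A minimal sketch, assuming made-up pattern timings and gesture meanings (the actual ones are not documented):

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// Hypothetical mapping: each notification source gets its own vibration
// pattern (pulse durations in milliseconds) so the wearer can tell a
// Tweet from a call without looking at a screen.
const std::map<std::string, std::vector<uint16_t>> kVibrationPatterns = {
    {"tweet", {80, 40, 80}},          // short double buzz
    {"call",  {300, 100, 300, 100}},  // long, insistent buzz
    {"sms",   {120}},                 // single pulse
};

// Returns the pattern for a notification kind, or empty if unknown.
std::vector<uint16_t> patternFor(const std::string& kind) {
    auto it = kVibrationPatterns.find(kind);
    return it != kVibrationPatterns.end() ? it->second
                                          : std::vector<uint16_t>{};
}

// Follow-up in passive mode: lifting the arm plays the notification,
// a reject gesture (e.g. for an incoming call) dismisses it.
enum class Reaction { PLAY, DISMISS, WAIT };

Reaction react(bool armLifted, bool rejectGesture) {
    if (rejectGesture) return Reaction::DISMISS;
    if (armLifted)     return Reaction::PLAY;
    return Reaction::WAIT;
}
```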

The device currently has fixed functions, where a specific gesture activates a preset application. In the future we plan to let users customize the gestures, select the apps and choose the exact function to perform.

How long did it take to make it real?

I spent one month on the concept and functional design, aiming for a simple, consistent and easy UX. It then took me another month of in-depth research into materials, assembly, testing and final construction of the prototype.

How did you build it?

The hardest task was fitting in the electronic parts in the least invasive way. Since it was a prototype, I did not have to create custom batteries or boards. The following components were used: Arduino, prototyping board, bone conductor transducer, vibrating mini motor disc, amplifier, IMU (Inertial Measurement Unit), electromyograph, diode, transistor, electrolytic capacitor, resistors.
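On a parts list like this, the IMU is what makes the activation gesture detectable: a quick double semi-rotation of the wrist shows up as two fast peaks on the roll-axis gyroscope. This is a sketch of that idea in plain C++ over a buffer of angular-rate samples; the threshold values and peak-counting approach are my assumptions, not GET's actual algorithm.

```cpp
#include <cassert>
#include <vector>

// Hypothetical detector: count fast roll-rate peaks (deg/s) in a window
// of gyroscope samples. Hysteresis (re-arm only when the rate falls well
// below the threshold) prevents one rotation from counting twice.
bool isDoubleRotation(const std::vector<float>& gyroRollDps,
                      float threshold = 150.0f) {
    int peaks = 0;
    bool above = false;
    for (float w : gyroRollDps) {
        if (!above && w > threshold) {          // rising edge: new peak
            above = true;
            ++peaks;
        } else if (above && w < threshold * 0.5f) {
            above = false;                      // re-arm for the next peak
        }
    }
    return peaks == 2;  // exactly two fast rotations = activation gesture
}
```

On the real bracelet this window would be filled by the IMU driver at a fixed sample rate, and the electromyograph could serve as a second opinion to reject accidental wrist movements.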