Switching to Maya to continue my Schlomo thesis animation
Modeled by Karen Tapia and William Meshchery
Lighting and Rendering by William Meshchery
The iQue was a success. After many long months of work, we successfully managed to manipulate an environmental lighting system using our brainwaves.
The final demonstration of the Interactive Project took place at the Subspectra Senior Show at the Fashion Institute of Technology. On opening day, attendees were given the opportunity to use the device and enjoy the changing, dynamic environment. The show proved to be a success, but a series of changes had to be made first in order to accommodate it.
The first major change came with funding from the Animation Department, which allowed us to increase the lights in our system from one to five. Four Hue bulbs were aligned down the center of the show, while a large floodlight illuminated the space behind the display. The added lights required changing the code so the headset could communicate with the bridge while also connecting to each of the five lights.
After updating the code, we kept the lights in a purple pulsing idle state. When a user raised their attention level above 85%, the lights would dynamically light up with new colors and luminosity. Each user could try the device and be rewarded with a newly lit space.
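The idle/active behavior described above comes down to a threshold check on the streamed attention value. Here is a minimal sketch of that logic in plain JavaScript; the 85% cutoff comes from the description, while the function name and the specific idle/active color values are illustrative:

```javascript
// Decide what the lights should do for one eSense attention reading.
// The headset streams attention as an integer from 0 to 100.
const ATTENTION_THRESHOLD = 85;

function lightStateFor(attention) {
  if (attention > ATTENTION_THRESHOLD) {
    // Reward the user: a bright, saturated color picked at random.
    // Hue values in the Philips Hue API range from 0 to 65535.
    return { mode: 'active', bri: 254, sat: 254, hue: Math.floor(Math.random() * 65536) };
  }
  // Otherwise stay in the purple pulsing idle state (~purple hue, dimmer).
  return { mode: 'idle', bri: 120, sat: 254, hue: 50000 };
}
```

In the real installation this state would be pushed to all five lights through the bridge on every update.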
However, the programming wasn't the only drastic change made to the device. An attached Raspberry Pi with an external battery made the headset completely mobile. This removed the need for long, dangling wires and gave the user a sense of freedom and comfort. The new battery allowed the device to function for over 8 hours before losing its charge.
In the end, the goals of the project were met and appreciated. Over a dozen attendees bravely sampled the device, including professors, students, investors, and even the Dean of the school. The satisfaction when the lights changed could clearly be seen on each user's face, which made the long months of work worthwhile.
It was a pleasure to work on such a visually stimulating project, and the future of the iQue is still unknown. After some time to rest, we may return to the creation and improve it for personal use.
Thank you to those who have supported us and shown interest in our work.
Rendered in Mental Ray for Thesis Displays
Volumic Pass from Mental Ray
The iQue is a wearable device that allows you to use your mind to control electronic devices, such as Philips Hue bulbs. The device works by measuring values of concentration and meditation and converting those values into data that can be transmitted to remote devices.
We have successfully managed to communicate with the Philips Hue light bulb using the MindWave headset. Using nothing but our brain waves, we can turn our lights on or off. Once the user's attention reaches a certain value, the light turns on and stays on. To turn the bulb off, the user does the same with meditation.
Although this may seem simple, the ability to use our very own brainwaves to manipulate the light is a major breakthrough toward our final goals for the project. As of today we are only demonstrating the proprietary eSense meters: we stream our attention and meditation values to directly control the Hue API using Cylon.js. In the future, we will expand our research to control the hue and luminosity of the bulb, visually representing our mental state in vibrant color.
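The on/off behavior described here is essentially a pair of threshold checks on the two eSense meters: attention turns the light on, meditation turns it off. A minimal sketch of that state machine (the 80 threshold and the function names are illustrative; in the real setup Cylon.js streams these readings and forwards the resulting state to the Hue API):

```javascript
// Track the bulb's on/off state from streamed eSense readings (0-100).
// High attention switches it on; high meditation switches it back off.
function makeToggle(threshold = 80) {
  let on = false;
  return function update({ attention = 0, meditation = 0 }) {
    if (!on && attention >= threshold) on = true;
    else if (on && meditation >= threshold) on = false;
    return on;
  };
}

// Usage: call update() with each packet from the headset.
const update = makeToggle();
update({ attention: 92 });  // light turns on and stays on
update({ attention: 40 });  // still on: only meditation can turn it off
update({ meditation: 88 }); // light turns off
```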
Future plans include changing the color of the light based on brain waves and making it all run on a Raspberry Pi.
Brain waves are monitored by an EEG sensor connected to an amplifier, and the amplified signal is then processed as data on a computer. An EEG sensor works by placing electrodes on the surface of the scalp. These electrodes pick up small electrical waves, which the amplifier boosts enough to be read as data. The data picked up by the sensors is classified as Alpha, Beta, Theta, and Delta waves, each attributed to a different state of mind.
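The four bands are distinguished by frequency. A small sketch using the commonly cited textbook boundaries (the exact cutoffs vary slightly between sources, and the function name is illustrative):

```javascript
// Classify an EEG frequency (in Hz) into its conventional band.
// Boundaries are the commonly cited textbook values.
function bandFor(hz) {
  if (hz < 4) return 'Delta';  // deep sleep
  if (hz < 8) return 'Theta';  // drowsiness, deep meditation
  if (hz < 13) return 'Alpha'; // relaxed, calm wakefulness
  return 'Beta';               // active concentration and focus
}
```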
We would approach this task by using surface electrodes connected to an amplifier, which is in turn connected to a microcontroller designed to translate these signals into raw data. By hooking all this hardware up to an Arduino, we would be able to control the amplifier and the microcontroller using the Arduino software. The Arduino would also give us the ability to connect to a Bluetooth adapter, which is what allows us to relay the data to the lighting system.
The Philips Hue lighting system offers several ways to interact with and control the device. We would manipulate the system through the Bridge, which is designed so that the system can be accessed over the internet via a URL, with values sent through that URL. The reason for choosing the Hue is how the entire project comes together: the Hue bulb connects to the bridge, which connects to your router, and this is what allows user input to affect the light. The Arduino can run scripts that control the lights directly, and by using pre-existing Arduino Hue libraries available on GitHub, we can change the device's input by interpreting the data our EEG sensor outputs. All we would have to do is reach 80% concentration (mainly Alpha brain waves converted into numerical values) to toggle the light on or off.
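Concretely, the bridge exposes each light at a REST URL, and changing a light is a PUT of a small JSON body to that light's `state` endpoint. A sketch of building such a request from the 80% concentration check (the bridge IP and API username are placeholders you receive when pairing with your own bridge; only the URL shape and the `on` field come from the Hue API):

```javascript
// Build the Hue REST request that toggles a light once concentration
// reaches 80%, or null if the threshold has not been crossed.
function hueRequestFor(bridgeIp, username, lightId, concentration, currentlyOn) {
  if (concentration < 80) return null; // below threshold: do nothing
  return {
    method: 'PUT',
    url: `http://${bridgeIp}/api/${username}/lights/${lightId}/state`,
    body: JSON.stringify({ on: !currentlyOn }), // flip the light's state
  };
}
```

The actual HTTP call can then be made by any client (an Arduino Hue library, Cylon.js, or plain `fetch`).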
Purchasing the MindWave Mobile Starter Kit, which is essentially an EEG sensor that uses Bluetooth to transfer wave data directly to a Bluetooth-enabled PC or mobile device without an Arduino, lets us achieve our goal of controlling the Hue light system much more economically. This not only cuts the amount of work we have to do but also makes it practical for a user to walk around rather than sit strapped down next to a PC, since the MindWave sends the brain waves wirelessly using a built-in Bluetooth adapter and gets 8 hours of battery life from an AAA battery. It is also a safer alternative than Plan 1, because testing and exposing your head and brain to even small voltages can be hazardous.
Another benefit of the MindWave is that it comes with built-in APIs and support for mobile devices, again helping us achieve our goal more practically while cutting costs. The API we would most likely take advantage of is the ThinkGear Connector (TGC), a daemon program that runs in the background; we may use Flash or any other programming language that communicates through sockets. Easier said than done, but this is the simplest way to accomplish our goal.
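The TGC daemon streams newline-delimited JSON packets over a local TCP socket, with the eSense meters nested under an `eSense` key. A sketch of pulling the two meter values out of one such packet (the function name is ours; the packet shape follows the ThinkGear socket protocol):

```javascript
// Parse one newline-delimited JSON packet from the ThinkGear Connector
// and extract the eSense attention/meditation meters if present.
function parseTgcPacket(line) {
  const packet = JSON.parse(line);
  if (packet.eSense) {
    return {
      attention: packet.eSense.attention,
      meditation: packet.eSense.meditation,
    };
  }
  return null; // other packet types (raw EEG, blink, signal quality) ignored here
}
```

A real client would connect to the daemon's socket, split the stream on newlines, and feed each line through this function.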
The last benefit of using the MindWave over another EEG sensor connected to an Arduino board is that the MindWave can detect eye blinks, which we may use to turn lights on and off. For example, 3 sequential eye blinks would trigger the lights. Elaborating on the sensors further, the MindWave can detect 2 mental states, so we could have the Philips Hue system change color based on how the user feels. Another tidbit: the MindWave uses dry electrodes that can measure brainwaves millimeters from the scalp, and thus can easily be worn over hair. These sensors are a significant technological breakthrough in that they are the only non-contact EEG sensors ever developed.
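The three-blink trigger mentioned above amounts to counting blink events inside a short time window. A sketch of that counter, assuming the headset delivers one blink event at a time; the 2-second window and the function names are our own illustrative choices:

```javascript
// Count blink events; three blinks within the window toggle the lights.
function makeBlinkTrigger(windowMs = 2000) {
  let times = [];
  return function onBlink(nowMs) {
    times.push(nowMs);
    // Keep only blinks that happened within the window.
    times = times.filter((t) => nowMs - t <= windowMs);
    if (times.length >= 3) {
      times = []; // reset so the next trigger needs three fresh blinks
      return true;
    }
    return false;
  };
}
```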
A remote, portable device that controls electronics through a basic library of thought commands using EEG and Bluetooth technology.
EX1: The user sets up Bluetooth-compatible smart LED bulbs in his room. As he enters the room, he would only need to think about the lights turning on, and they do.
The user would put on the device and first connect it to their computer. By launching the application, they can set up the device using their own neurological brain patterns. Once the device is synced, the user can then begin connecting their [IQ] to other devices. The simple library would consist of On/Off commands that allow them to control basic functions of most devices.
Turning on Electronics would be just a thought.
As technology advances and sensors become more accurate, the user would be able to expand the basic library into a large list of complex commands and macros, allowing more control over the devices they are connected to. For example, they would be able to control the intensity of the lights and dim them on command. They would also be able to launch applications on their computer. Eventually iQs would be able to connect to one another and send telepathic thoughts, changing the way we communicate between people.
Max Rodriguez & William Meshchery