03Oct

Maya Switch

In Schlomo, Thesis by William Meshchery / October 3, 2015

Switching to Maya to continue my Schlomo thesis animation

24Aug

Toy Car

In Misc by William Meshchery / August 24, 2015

 

Modeled by Karen Tapia and William Meshchery

Lighting and Rendering by William Meshchery

18May

iQue

The iQue was a success. After many long months, we successfully managed to manipulate an environmental lighting system using our brainwaves.

The final demonstration of the Interactive Project took place at the Subspectra Senior Show at the Fashion Institute of Technology. On opening day, attendees were given the opportunity to use the device and enjoy the dynamically changing environment. Although the show proved to be a success, a series of changes had to be made to the project to accommodate it.

The first major change came with funding from the Animation Department, which allowed us to increase the number of lights in our system from one to five. Four Hue bulbs were aligned down the center of the show, while a large floodlight illuminated the space behind the display. The added lights required changing the code so the headset could communicate with the bridge while connecting to each of the five lights.

After updating the code, we kept the lights in a pulsing purple idle state. When a user raised their attention level above 85%, the lights would dynamically light up with new colors and luminosity. Each user would try the device and be rewarded with a newly lit space.
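
For the curious, the show logic boiled down to something like the sketch below. This is a minimal reconstruction, not our exact script: the bridge address, whitelisted username, serial port, light IDs, threshold, and color values are all placeholders, and for brevity it talks to the bridge's REST API directly rather than through a Cylon.js Hue driver.

var Cylon = require("cylon");
var http = require("http");

var BRIDGE_IP = "192.168.1.10";  // placeholder: the bridge's address on our router
var HUE_USER  = "newdeveloper";  // placeholder: a username whitelisted on the bridge
var LIGHT_IDS = [1, 2, 3, 4, 5]; // four Hue bulbs plus the floodlight

// Send a state object to one bulb via the bridge's REST API:
// PUT /api/<username>/lights/<id>/state
function setLight(id, state) {
  var req = http.request({
    host: BRIDGE_IP,
    path: "/api/" + HUE_USER + "/lights/" + id + "/state",
    method: "PUT"
  });
  req.end(JSON.stringify(state));
}

function setAll(state) {
  LIGHT_IDS.forEach(function (id) { setLight(id, state); });
}

Cylon.robot({
  connections: { neurosky: { adaptor: "neurosky", port: "/dev/rfcomm0" } },
  devices:     { headset:  { driver: "neurosky" } },

  work: function (my) {
    // eSense attention arrives as a 0-100 value.
    my.headset.on("attention", function (level) {
      if (level > 85) {
        // Reward state: a fresh random color at full brightness.
        setAll({ on: true, bri: 254, sat: 254,
                 hue: Math.floor(Math.random() * 65535) });
      } else {
        // Idle state: dim purple (the pulsing was handled separately).
        setAll({ on: true, bri: 120, sat: 254, hue: 50000 });
      }
    });
  }
}).start();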

However, the programming wasn't the only drastic change made to the device. An attached Raspberry Pi with an external battery made the headset completely mobile, removing the need for long dangling wires and giving the user a sense of freedom and comfort. The new battery allowed the device to function for over 8 hours before losing its charge.

In the end, the goals of the project were satisfied and appreciated. Over a dozen attendees bravely sampled the device, including professors, students, investors, and even the Dean of the school. The satisfaction when the lights changed could clearly be seen on each user's face, which made the long months of work worthwhile.

It was a pleasure to work on such a visually stimulating project, and the future of the iQue is still unknown. Perhaps after some time to rest, we may return to the creation and improve it for personal use.

Thank you to those who have supported us and shown interest in our work.


William Meshchery
Max Rodriguez

16Apr

WIP Schlomo Screenshot

In Schlomo, Thesis by William Meshchery / April 16, 2015

Rendered in Mental Ray for Thesis Displays

16Apr

Volumetrics Test

In Schlomo, Thesis by William Meshchery / April 16, 2015

Volumetric pass from Mental Ray

13Apr

iQ Project

The iQue is a wearable device that allows you to use your mind to control electronic devices, such as Philips Hue bulbs. The device works by measuring values of concentration and meditation and converting those values into data that can be transmitted to remote devices.

People can put on the iQue and control a set of Philips Hue bulbs. As the user concentrates on the bulbs, raising the concentration value, they can turn the bulbs on; if the user reaches a state of meditation, the bulbs turn off. The device is completely wireless: a Raspberry Pi is attached to the headset, allowing it to be used without a laptop, and the added battery pack provides enough power to run the device for multiple hours.

03Dec

iQ Prototype

We have successfully managed to communicate with the Philips Hue light bulb using the MindWave headset. Using nothing but our brain waves, we can turn our lights on or off. Once the user's attention reaches a certain value, the light turns on and stays on. To turn the bulb off, the user simply does the same with meditation.

Although this may seem simple, the ability to use our very own brainwaves to manipulate the light is a major breakthrough toward our final goals for the project. For now we are only demonstrating the proprietary eSense meters, streaming our attention and meditation values to control the Hue API through Cylon.js. In the future, we will expand our research to control the hue and luminosity of the bulb, visually representing our mental state in vibrant color.

But How?

When we first started, we ran into a problem with the operating system: Windows was incompatible with the software, so we switched over to Ubuntu Linux. Cylon is a JavaScript framework for robotics and physical computing. It runs inside Node.js, a platform built on Chrome's JavaScript runtime for easily building fast apps. The headset transfers values over Bluetooth to our Linux laptop running Cylon.js, which sends commands through the WiFi router to the Philips bridge and on to the Hue bulb.
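
The prototype logic looks roughly like this minimal sketch (not our exact script; the serial port, bridge address, and trigger threshold are assumptions, and for brevity it sends the on/off state to the bridge's REST API directly rather than through a Cylon Hue driver):

var Cylon = require("cylon");
var http = require("http");

// Placeholder bridge address and whitelisted username.
var BRIDGE = { host: "192.168.1.10", user: "newdeveloper" };

// PUT an on/off state to light 1 through the bridge's REST API.
function setLight(on) {
  var req = http.request({
    host: BRIDGE.host,
    path: "/api/" + BRIDGE.user + "/lights/1/state",
    method: "PUT"
  });
  req.end(JSON.stringify({ on: on }));
}

Cylon.robot({
  connections: { neurosky: { adaptor: "neurosky", port: "/dev/rfcomm0" } },
  devices:     { headset:  { driver: "neurosky" } },

  work: function (my) {
    // eSense values arrive as 0-100; 80 is an assumed trigger threshold.
    my.headset.on("attention",  function (v) { if (v > 80) setLight(true);  });
    my.headset.on("meditation", function (v) { if (v > 80) setLight(false); });
  }
}).start();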

Future plans include changing the color of the light based on brain waves and making it all work on a Raspberry Pi.

29Oct

iQue Plans

Goal:

Activate and deactivate a lighting system using brain waves.

Background:

Brain waves are monitored by an EEG sensor, boosted by an amplifier, and then processed as data on a computer. An EEG sensor works by attaching electrodes to the scalp, where they pick up the brain's small electrical waves; the amplifier then increases them enough to be read as data. The data picked up by the sensors are classified as Alpha, Beta, Theta, and Delta waves, and each brain wave is attributed to a different state of mind.
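
For reference, the standard frequency bands are roughly:

  • Delta (under 4 Hz) – deep sleep
  • Theta (4–8 Hz) – drowsiness and meditation
  • Alpha (8–13 Hz) – relaxed alertness
  • Beta (13–30 Hz) – active concentration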

Plan I:

We would approach this task by using surface electrodes connected to an amplifier, which in turn connects to a microcontroller that translates these signals into raw data. By hooking this hardware up to an Arduino, we would be able to control the amplifier and the microcontroller through Arduino software. The Arduino would also let us connect a Bluetooth adapter, which is what allows us to relay the data to the lighting system.

The Philips Hue lighting system offers several ways to interact with and control the device. The way we would manipulate the system is through the Bridge, which is designed so that the system can be accessed over the network via a URL, with state values sent through that URL.

Philips starter pack

The reason behind choosing the Hue is how the entire project comes together. The Hue bulb connects to the bridge, which connects to your router; this is what allows user input to affect the light. The Arduino can run a script that controls the lights directly, and by using pre-existing Arduino Hue libraries available on GitHub, we can drive the bulbs by interpreting the data our EEG sensor outputs. All we would have to do is reach 80% concentration (mainly Alpha brain waves converted into numerical values) to switch the light on or off.
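
Concretely, switching a bulb amounts to an HTTP PUT against the bridge, roughly like this (the bridge IP and username are placeholders created during bridge setup):

PUT http://<bridge-ip>/api/<username>/lights/1/state
{"on": true, "bri": 254}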

  • PC
  • Router
  • Philips 452714 9W (60-Watt) A19 Hue Lux Connected Home LED Starter Kit $99.97
  • Electrodes: 2″ x 2″ Premium White Cloth (4/pack) – $1.95
  • Arduino UNO R3 board with DIP ATmega328P – $24.95
  • OpenBCI 8-bit Board Kit (Arduino™-compatible) – $399.99

 

Plan II:


By purchasing the MindWave Mobile Starter Kit, which is essentially an EEG sensor that uses Bluetooth to transfer wave data directly to a Bluetooth-enabled PC or mobile device without an Arduino, we can achieve our goal of controlling the Hue light system much more economically. This not only cuts the amount of work we have to do but also makes it practical for a user to walk around rather than be strapped down next to a PC, since the MindWave sends the brain waves wirelessly using a built-in Bluetooth adapter and runs for 8 hours on a single AAA battery. This is also a safer alternative than Plan I, because exposing your head and brain to even small voltages can be hazardous.

MindWave Headset

Another benefit of the MindWave is that it comes with built-in APIs and support for mobile devices, again helping us achieve our goal more practically while cutting costs. The API we would most likely take advantage of is called the ThinkGear Connector (TGC). It is a daemon program that runs in the background, and we could use Flash or any other programming language that communicates through sockets. Easier said than done, but this is the simplest way to accomplish our goal.
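
As a rough sketch of that socket approach, in Node.js rather than Flash (the local port and config message follow NeuroSky's ThinkGear Socket Protocol documentation; this is an illustration, not code we have run against the headset yet):

var net = require("net");

// TGC listens locally on 127.0.0.1:13854.
var client = net.connect(13854, "127.0.0.1", function () {
  // Ask TGC for parsed JSON output instead of raw binary packets.
  client.write(JSON.stringify({ enableRawOutput: false, format: "Json" }));
});

client.on("data", function (buffer) {
  // TGC streams newline-delimited JSON packets.
  buffer.toString().split("\n").forEach(function (line) {
    if (!line.trim()) return;
    try {
      var packet = JSON.parse(line);
      if (packet.eSense) {
        console.log("attention:", packet.eSense.attention,
                    "meditation:", packet.eSense.meditation);
      }
    } catch (e) { /* ignore partial JSON chunks split across reads */ }
  });
});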


Philips Hue

The last benefit of using the MindWave over another EEG sensor connected to an Arduino board is that the MindWave can detect eye blinks, which we may utilize to turn the lights on and off; for example, three sequential eye blinks would trigger the lights. Elaborating on the sensors even further, the MindWave can detect two mental states, so we could have the Philips Hue system change color based on how the user feels. Another tidbit: the MindWave uses dry electrodes that can measure brainwaves millimeters from the scalp and thus can easily be worn over hair. These sensors are a significant technological breakthrough in that they are the only non-contact EEG sensors ever developed.
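
A sketch of that triple-blink trigger, assuming the headset driver surfaces a per-blink event (the ThinkGear protocol reports a blinkStrength value); toggleLights is a hypothetical stand-in for whatever Hue call we end up using:

// Inside the robot's work function, with "headset" as the MindWave device:
var blinkTimes = [];

headset.on("blink", function (strength) {
  var now = Date.now();
  blinkTimes.push(now);
  // Keep only blinks seen within the last two seconds.
  blinkTimes = blinkTimes.filter(function (t) { return now - t < 2000; });
  if (blinkTimes.length >= 3) {
    blinkTimes = [];   // reset so a burst of blinks fires only once
    toggleLights();    // hypothetical helper: flip the Hue bulbs
  }
});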

Required Equipment:

  • PC or Mobile Device (iOS/Android)
  • Router
  • MindWave Mobile: Brainwave Starter Kit – $99.99
  • Philips 452714 9W (60-Watt) A19 Hue Lux Connected Home LED Starter Kit – $99.97

08Oct

iQ Alpha

iQ

A remote portable device that controls electronics through a basic library of thought commands, using EEG and Bluetooth technology.

EX1: The user sets up Bluetooth-compatible smart LED bulbs in his room. As he enters the room, he only needs to think about the lights turning on, and they do.

How It Works

The user would put on the device and first connect it to their computer. By launching the application, they can set up the device using their own neurological brain patterns. Once the device is synced, the user can then begin connecting their iQ to other devices. The simple library would consist of On/Off commands that let them control basic functions of most devices.

Turning on electronics would be just a thought.

Components

  • Micro Arduino Controller
  • Head EEG sensor
  • Bluetooth 4.0 antenna

Future Direction

As technology advances and sensors become more accurate, the user would be able to expand the basic library into a large list of complex commands and macros, allowing more control over the devices they are connected to. For example, they would be able to control the intensity of the lights and dim them on command, or launch applications on their computer. Eventually, iQs would be able to connect to one another and send telepathic thoughts, changing the way people communicate.

Max Rodriguez & William Meshchery

24Sep

Virtual Reality

The next big thing in tech and gaming has Oculus written all over it. We are at the forefront of virtual reality, and it will take over human interaction just as quickly as smartphones have. VR will change the way we communicate every day:

  • New social meetings
  • Seeing an old friend
  • Attending a show
  • Seeing a game
  • Performing a surgery
  • Human to computer interfaces

These and many more will all be dramatically enhanced and revolutionized thanks to VR.

We see smartphones as a way that we brought computers out into the real world, but virtual reality’s really the way we’re now gonna go into the virtual world.

– USC professor Mark Bolas

Virtual reality will stretch far beyond the boundaries of Facebook avatars and will pave the way to the most realistic and precise human interactions. A surgeon would be able to perform a complex surgery from the comfort of his home, away from any harmful bacteria or infection. Walking on the surface of a faraway planet or running combat training are just a few more examples.

Matthew Rader's dream of being able to move through a movie while it is playing will be superseded by live theater where you may stand anywhere, live boxing matches up close and personal, and even live sports, in conjunction with smaller and more available motion-capture technologies attached to athletes, actors, and even astronauts. Movement data would be fed to your fully immersive, internet-enabled VR headset, and you could experience it live in front of your eyes.

Who knows, maybe in the future there will exist a video camera that records the 3D position of any moving object in reference to itself, rendering all visual data from each angle of recording and applying the gathered images as textures. Fully 3D, photorealistic live reality. Ridiculous.

Moving even further into the future, we could see brain-to-headset communication recreating physical movement in our eyes, removing the dangers of using VR since you would most likely be stationary. We have already achieved brain-to-brain communication, where a person's brain transmits a thought through the internet into another person's brain. It is only a matter of when, and how much. With everything discussed here, we are at this moment only on the tip of the iceberg that is Virtual Reality.

 

The newest Prototype, Crescent Bay.


10Sep

Rough Intro

In Thesis by William Meshchery / September 10, 2014

Very rough edit
