
2014

List of projects undertaken in the year 2014. This includes volunteer, capstone, and class projects.


EOG

Enabling Engineering

The Need

There are currently about 50,000 people with locked-in syndrome (LIS) in the United States. Locked-in syndrome is a condition that results in complete paralysis with the exception of the eyes. The condition severely limits the patient's ability to communicate with the outside world using conventional methods, and the communication methods currently available to these patients are limited, time-consuming, and dependent on other individuals.

The Project

Our goal was to build a system that allows the user to communicate at approximately 1.25 words per minute, twice as fast as the current best approach, partner-assisted scanning. The system runs an on-screen keyboard application on a computer and relies on a clinical measurement technique called electrooculography (EOG). EOG exploits the physiological property that the front of a person's eyes is more positively charged than the back. We use this standing voltage difference, called the corneoretinal potential, to determine which way the user moves his or her eyes. By looking at the desired character on the screen and then blinking, the user selects letters. Our GUI keyboard application contains the letters A-Z, the numbers 0-9, and common punctuation, and provides text-to-speech functionality and saving to a text file.
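The idea of turning the corneoretinal potential into keyboard commands can be illustrated with a minimal sketch. The channel names, thresholds, and keyboard-stepping logic below are hypothetical assumptions for illustration, not the values or code used in the project; they only show the general pattern of classifying gaze direction from two EOG channels and treating a large vertical spike as a blink (selection).

```python
# Hypothetical sketch of EOG-based gaze/blink classification.
# Thresholds and channel conventions are illustrative assumptions only.

GAZE_THRESHOLD_UV = 150.0   # deflection treated as a deliberate gaze shift (microvolts)
BLINK_THRESHOLD_UV = 400.0  # blinks appear as large, brief spikes on the vertical channel

def classify_sample(horizontal_uv: float, vertical_uv: float) -> str:
    """Map one pair of EOG readings to a coarse gaze event."""
    if vertical_uv > BLINK_THRESHOLD_UV:
        return "blink"                      # interpreted as "select the highlighted character"
    if horizontal_uv > GAZE_THRESHOLD_UV:
        return "look_right"
    if horizontal_uv < -GAZE_THRESHOLD_UV:
        return "look_left"
    if vertical_uv > GAZE_THRESHOLD_UV:
        return "look_up"
    if vertical_uv < -GAZE_THRESHOLD_UV:
        return "look_down"
    return "center"

def step_keyboard(cursor: int, event: str, row_width: int = 10) -> tuple[int, bool]:
    """Move the on-screen keyboard cursor with gaze; select on blink."""
    moves = {"look_left": -1, "look_right": 1,
             "look_up": -row_width, "look_down": row_width}
    if event == "blink":
        return cursor, True                 # True = select the character at the cursor
    return cursor + moves.get(event, 0), False

# Example: a rightward gaze followed by a blink moves the cursor, then selects.
cursor, selected = step_keyboard(0, classify_sample(200.0, 0.0))
cursor, selected = step_keyboard(cursor, classify_sample(0.0, 500.0))
print(cursor, selected)
```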

Current Status

This project was tested by John, an individual with locked-in syndrome who is supported by Lifestream. He was able to type a paragraph of text much more quickly than before, and without the help of a support staff member. The project also won second prize in NU's Capstone Design Competition.

Memory-Assistive Glasses

Enabling Engineering

The Need

Memory loss is a serious issue for both the affected individual and those around them. Today, an estimated 5.2 million Americans have some stage of Alzheimer's, a disease that attacks the brain and causes memory loss and dementia. People with Alzheimer's or other forms of memory loss often struggle to recognize faces and to remember how to perform basic tasks. These individuals often require the aid of family members or nurses, and are sometimes forced to live in assisted living residences. According to the Alzheimer's Association, "in 2013, 15.5 million caregivers provided more than 17.5 billion hours of unpaid care valued at over $220 billion."

The Project

Our goal was to help memory-impaired individuals, such as early-stage Alzheimer's patients, by identifying the people they come in contact with, identifying objects, and displaying step-by-step instructions for simple tasks. Our design uses Google Glass, which includes a small display, a 5 MP camera, a bone-conduction speaker, and a touchpad, to interface with the user. Using Google Glass and an Android application, the user takes a picture of each person they would like to recognize and adds it to a database. When the user wants to identify someone, they take a picture of that person, and Glass shows the matched person's name and relationship on its display. If the person is not found in the database, the user must add them to the database via our Android application. Memory-impaired individuals also have difficulty remembering the steps of simple tasks, such as where to put the dishes after unloading the dishwasher. Using our application, the user scans a QR code and the corresponding steps are shown on Glass's display.
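The QR-code task feature can be pictured as a simple lookup from a decoded code to a list of instructions. The sketch below is a minimal illustration under assumed names: the payload strings, task database format, and helper function are hypothetical, and the Glass/Android display details are omitted.

```python
# Hypothetical sketch of the QR-code task lookup described above.
# The decoded payloads ("task:dishes") and database layout are illustrative assumptions.

TASK_DATABASE = {
    "task:dishes": [
        "Open the cupboard above the sink.",
        "Place plates on the middle shelf.",
        "Place glasses on the top shelf.",
    ],
    "task:coffee": [
        "Fill the reservoir with water.",
        "Insert a filter and two scoops of grounds.",
        "Press the brew button.",
    ],
}

def instructions_for_qr(decoded_payload: str) -> list[str]:
    """Return the step-by-step instructions for a scanned QR code."""
    steps = TASK_DATABASE.get(decoded_payload)
    if steps is None:
        return ["Task not found. Please add it from the companion app."]
    return steps

# Example: display the steps one at a time, as they would appear on the wearable display.
for i, step in enumerate(instructions_for_qr("task:dishes"), start=1):
    print(f"Step {i}: {step}")
```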

Current Status

This project is complete and won third prize in the NU Capstone Design Competition.

Help Me Get There

Enabling Engineering

The Need

In the U.S., 37 million people are blind or have a visual impairment. For these individuals, travel within a busy city can be dangerous, and the urban infrastructure often fails to convey the information necessary for the visually impaired to travel safely. Our goal is to make crossing the street safer.

The Project

The Help Me Get There application will identify the user's location using RFID tags located at both ends of crosswalks, which communicate with an RFID reader and Bluetooth transmitter worn by the user. The RFID reader will forward this information to the smartphone, allowing the application to pull up information about the intersection. The application will then fulfill a variety of roles to provide the user with the information necessary to cross the street easily, including the features below (a brief sketch of the tag lookup follows the list):

• Intersection Identification: The application will inform users which intersection they are at by announcing the names of the two intersecting streets identified from the RFID tags.

• Crosswalk Alignment: The application will use the smartphone's compass to align the user so that he or she may safely cross the intersection.

• Crosswalk Signal Alignment: When arriving at an intersection, the user may want to locate the button for crossing the street without needing to ask. The direction of the button will be conveyed in the same way as crosswalk alignment, by giving the user a direction from the smartphone's compass.

• Intersection Description: The application will provide information about each crosswalk, including pedestrian islands, the number of car lanes, and whether the road is one-way or bidirectional.

• Button Identification: Our application will be engineered to announce the name of the button when pressed and only activate the button’s function on release.
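As referenced above, the flow from a scanned RFID tag to spoken guidance can be sketched as a lookup of the tag in an intersection database combined with the smartphone's compass heading. The tag IDs, record fields, street names, and announcement wording below are hypothetical assumptions used only to illustrate the idea.

```python
# Hypothetical sketch of mapping a scanned crosswalk RFID tag to guidance.
# Tag IDs, field names, headings, and streets are illustrative assumptions.

INTERSECTIONS = {
    "tag-0417": {
        "streets": ("Huntington Ave", "Forsyth St"),
        "crosswalk_heading_deg": 90,   # direction to face to cross
        "lanes": 4,
        "one_way": False,
        "has_island": True,
    },
}

def announce(tag_id: str, compass_heading_deg: float) -> str:
    """Build the spoken message for a scanned crosswalk tag."""
    info = INTERSECTIONS.get(tag_id)
    if info is None:
        return "Unknown intersection."
    street_a, street_b = info["streets"]
    # Shortest turn from the user's current heading toward the crosswalk heading.
    turn = (info["crosswalk_heading_deg"] - compass_heading_deg) % 360
    direction = "right" if turn <= 180 else "left"
    road = "one-way" if info["one_way"] else "two-way"
    island = "with a pedestrian island" if info["has_island"] else "with no island"
    return (f"You are at {street_a} and {street_b}. Turn {direction} to face the crosswalk. "
            f"{info['lanes']} lanes, {road}, {island}.")

# Example: the user faces roughly north (10 degrees) when the tag is read.
print(announce("tag-0417", compass_heading_deg=10.0))
```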

Current Status

This project is in progress.

iCraft 2.0

Enabling Engineering

The Need

Tim was in a car accident that left him unable to use his arms. He needs to be fed by caregivers who must be present throughout the feeding process. He would like to have the independence and freedom to feed himself.

The Project

Our goal was to design and build a robotic system that allows Tim to feed himself. The system consists of a robotic feeding arm, food bowls and a water bottle, eye tracking, and a custom GUI. The simple GUI lets the user select one of the bowls to eat from by looking at the appropriate section of the screen. Based on this command, the robotic arm moves to the appropriate bowl, scoops food, and delivers it to the user's mouth. The user can choose to return food on the spoon if not all of it is desired. The software remembers which bowl was selected last and how many times each bowl has been selected, allowing the system to estimate the amount of food remaining in each bowl.

If the user would like a drink, they can select that option from the GUI, and the robotic arm swivels to offer a drinking straw. The straw has a valve attached at the end that prevents water from spilling until the valve is squeezed open with the user's mouth. The GUI is simple and intuitive to use: the design focuses on bright colors with large selection regions, and we implemented a simple, multi-level menu system.
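The bowl-tracking behavior described above can be pictured as a small piece of session state that counts selections per bowl and converts the count into an estimate of remaining food. This is a minimal sketch under assumed values: the bowl names, the scoops-per-bowl capacity, and the class interface are hypothetical, not the project's actual software.

```python
# Hypothetical sketch of the iCraft selection tracking described above.
# Bowl names and scoops-per-bowl capacity are illustrative assumptions.

SCOOPS_PER_BOWL = 12  # assumed number of scoops in a full bowl

class FeedingSession:
    def __init__(self, bowls):
        self.scoops_taken = {bowl: 0 for bowl in bowls}
        self.last_selected = None

    def select(self, bowl: str) -> None:
        """Record a GUI selection: the arm scoops once from this bowl."""
        self.scoops_taken[bowl] += 1
        self.last_selected = bowl

    def remaining_fraction(self, bowl: str) -> float:
        """Estimate how much food is left, based on selections so far."""
        return max(0.0, 1.0 - self.scoops_taken[bowl] / SCOOPS_PER_BOWL)

# Example: three scoops from the first bowl, then check the estimate.
session = FeedingSession(["bowl 1", "bowl 2", "bowl 3"])
for _ in range(3):
    session.select("bowl 1")
print(session.last_selected, f"{session.remaining_fraction('bowl 1'):.0%} remaining")
```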

Current Status

The project is complete and was successfully tested by end users at Lifestream, including Tim. The project received widespread media attention, including a CNN feature and articles by Engadget, Forbes, PCWorld, and Herald News. It also won first prize in NU's ECE Capstone Design Competition.