A SEMINAR REPORT ON BRAIN COMPUTER INTERFACE

Submitted to: Prachi Parashar (Assistant Professor)
Submitted by: Rahul Sharma
Department of Computer Science and Engineering


BRAIN COMPUTER INTERFACE

Abstract: Brain-computer interfaces (BCIs) enable users to control devices with electroencephalographic (EEG) activity.

Monkeys have navigated computer cursors on screen and commanded robotic arms to perform simple tasks simply by thinking about the task, without any motor output. Other research on cats has decoded visual signals. Studies that developed algorithms to reconstruct movements from motor cortex neurons date back to the 1970s.

Work by groups led by Schmidt, Fetz, and Baker in the 1970s established that monkeys could quickly achieve voluntary control over the firing rate of individual neurons in primary motor cortex under closed-loop operant conditioning. There has been explosive development in BCIs since the mid-1990s. Phillip Kennedy and colleagues built the first wireless, intracortical brain-computer interface by implanting neurotrophic cone electrodes, first into monkeys and then into the brains of paralyzed patients.

Several groups have explored real-time reconstruction of more complex motor parameters using recordings from neural ensembles.

Human BCIs: Non-invasive BCIs

As well as the invasive experiments described below, there have also been experiments in humans using non-invasive neuroimaging technologies as interfaces. Electroencephalography (EEG) is the most studied potential human interface, mainly due to its fine temporal resolution, ease of use, portability and low set-up cost.

Birbaumer had earlier trained epileptics to prevent impending fits by controlling this low-voltage wave. In the experiment, ten patients were trained to move a computer cursor by controlling their brainwaves. The process was slow, requiring more than an hour for patients to write characters with the cursor, while training often took many months.

Another research parameter is the type of waves measured. Birbaumer's later research with Jonathan Wolpaw at New York State University has focused on developing technology that lets users choose the brain signals they find easiest to use to operate a BCI, including mu and beta rhythms. A further parameter is the method of feedback used, and this is shown in studies of P300 signals.


Patterns of P300 waves are generated involuntarily (stimulus feedback) when people see something they recognize, and may allow BCIs to decode categories of thoughts without training patients first. By contrast, the biofeedback methods described above require learning to control brainwaves so the resulting brain activity can be detected. There has been great success in using cochlear implants in humans as a treatment for non-congenital deafness, but it is not clear that these can be considered brain-computer interfaces.

There is also promising research in vision science where direct brain implants have been used to treat non-congenital blindness. One of the first scientists to come up with a working brain interface to restore sight was the private researcher William Dobelle.

Dobelle's first prototype was implanted in 1978 into Jerry, a man blinded in adulthood. The system included TV cameras mounted on glasses that sent signals to the implant. Initially the implant allowed Jerry to see shades of grey in a limited field of vision and at a low frame rate, and it also required him to be hooked up to a two-ton mainframe computer. Shrinking electronics and faster computers later made his artificial eye more portable and allowed him to perform simple tasks unassisted.

The second-generation device used a more sophisticated implant, enabling better mapping of phosphenes into coherent vision. Phosphenes are spread out across the visual field in what researchers call the starry-night effect. Immediately after his implant, Jens was able to use his imperfectly restored vision to drive slowly around the parking area of the research institute.

BCIs focusing on motor neuroprosthetics aim either to restore independent control of the body or to provide assistive devices to individuals paralyzed by a variety of causes, including spinal cord injury, amyotrophic lateral sclerosis, muscular dystrophy, multiple sclerosis, spinal muscular atrophy, cerebellar disorders and certain types of stroke. Matt Nagle was one of the first people to use a direct brain-computer interface to restore functionality lost due to paralysis.

He was the first paralyzed person to operate a prosthetic arm using just his mind. On July 4, 2001, Nagle became paralyzed from the neck down after being assaulted by a person wielding a knife. He was confined to his wheelchair and was unable to breathe without a respirator. Fortunately, a scientist and a new device were there to help him overcome his disabilities.

The implanted sensor picked up the electric signals that command the limbs of the body to move; in a healthy person, these signals would be forwarded to the spinal cord. The implanted device enabled Nagle to do things like check his e-mail, turn the TV on or off, draw a crude circle on the screen, play the game Pong, and control a prosthetic arm, all with just his thoughts.

Of course, he needed months of training to perform these tasks, but his achievement underlines the staggering potential of BCI technology. Experiments have shown that after a training period, the user can obtain control over specific components of oscillatory activity in the EEG. The mode of operation determines when the user performs a mental task and thereby intends to transmit a message.


In principle, there are two distinct modes of operation: the first is the externally paced (cue-based, computer-driven, synchronous) BCI, and the second is the internally paced (uncued, user-driven, asynchronous) BCI. Feedback is a very important component both in the training phase and during application.

Feedback can be discrete or continuous. Together with the feedback, the BCI forms a closed-loop system composed of two adaptive controllers: the brain and the computer.

The goal of the feature extraction component is to find a suitable representation of the bioelectric brain signal that simplifies the subsequent classification or detection of specific thought-related patterns of brain activity. The signal features should encode the commands sent by the user but not contain noise or other signal components that can impede the classification process. The task of the classifier is to use the signal features to assign each recorded sample of the signal to a given class of mental patterns.
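This two-stage pipeline can be sketched in Python with NumPy. The band choices, threshold, and two-class rule below are illustrative assumptions for a single-channel signal, not the parameters of any particular BCI system:

```python
import numpy as np

def band_power(signal, fs, band):
    """Estimate the power of `signal` within a frequency band via the FFT.

    signal: 1-D array of EEG samples; fs: sampling rate in Hz;
    band: (low, high) frequency range in Hz.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum() / len(signal)

def extract_features(signal, fs):
    """Feature vector: power in the mu (8-12 Hz) and beta (18-25 Hz) bands."""
    return np.array([band_power(signal, fs, (8, 12)),
                     band_power(signal, fs, (18, 25))])

def classify(features):
    """Toy two-class rule: compare mu-band power to beta-band power."""
    return "rest" if features[0] > features[1] else "movement"

# Synthetic example: a strong 10 Hz (mu) rhythm should classify as "rest".
fs = 250
t = np.arange(fs) / fs          # one second of data
rest_like = np.sin(2 * np.pi * 10 * t)
print(classify(extract_features(rest_like, fs)))  # -> rest
```

A real system would use multiple channels, spatial filtering, and a trained classifier, but the division of labor (feature extraction, then classification) is the same.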



Several groups have captured complex motor signals from recordings of neural ensembles (groups of neurons) and used these signals to control external devices.

In 1999, researchers decoded neuronal firings to reproduce images seen by cats.

The team used an array of electrodes embedded in the thalamus of sharp-eyed cats, targeting brain cells in the lateral geniculate nucleus, the area of the thalamus that decodes signals from the retina. Neural ensembles are said to reduce the output variability seen with single electrodes, which could otherwise make it difficult to operate a Brain Computer Interface. After conducting initial studies in rats during the 1990s, researchers developed Brain Computer Interfaces that decoded brain activity in owl monkeys and used the devices to reproduce monkey movements in robotic arms.

Researchers reported training rhesus monkeys to use a Brain Computer Interface to track visual targets on a computer screen, with or without the assistance of a joystick (a closed-loop Brain Computer Interface). A Brain Computer Interface for three-dimensional tracking in virtual reality was also developed, and Brain Computer Interface control was reproduced in a robotic arm.

Researchers used recordings of pre-movement activity from the posterior parietal cortex in their Brain Computer Interface, including signals created when experimental animals anticipated receiving a reward. In addition to predicting kinematic and kinetic parameters of limb movements, Brain Computer Interfaces that predict electromyographic or electrical activity of muscles are being developed.

Such Brain Computer Interfaces could be used to restore mobility in paralyzed limbs by electrically stimulating muscles. A new 'wireless' approach uses light-gated ion channels such as Channelrhodopsin to control the activity of genetically defined subsets of neurons in vivo.

Invasive BCI research has targeted repairing damaged sight and providing new functionality to paralyzed people. Invasive BCIs are implanted directly into the grey matter of the brain during neurosurgery. As they rest in the grey matter, invasive devices produce the highest quality signals of BCI devices but are prone to scar-tissue build-up, causing the signal to become weaker or even lost as the body reacts to a foreign object in the brain.

Direct brain implants have been used to treat non-congenital (acquired) blindness. BCIs focusing on motor neuroprosthetics aim either to restore movement in paralyzed individuals or to provide devices to assist them, such as interfaces with computers or robot arms.

Partially invasive BCI devices are implanted inside the skull but rest outside the brain rather than amidst the grey matter. They produce better-resolution signals than non-invasive BCIs, where the bone tissue of the cranium deflects and deforms signals, and they have a lower risk of forming scar tissue in the brain than fully invasive BCIs. Light-reactive imaging BCI devices are still theoretical; these would involve implanting a laser inside the skull. ECoG is a very promising intermediate BCI modality because it has higher spatial resolution, better signal-to-noise ratio, wider frequency range, and lower training requirements than scalp-recorded EEG, and at the same time has lower technical difficulty, lower clinical risk, and probably superior long-term stability compared with intra-cortical single-neuron recording.

This feature profile, and recent evidence of a high level of control with minimal training requirements, shows potential for real-world application for people with motor disabilities. There have also been experiments in humans using non-invasive neuroimaging technologies as interfaces. Signals recorded in this way have been used to power muscle implants and restore partial movement in an experimental volunteer. Although they are easy to wear, non-invasive devices produce poor signal resolution because the skull dampens signals, dispersing and blurring the electromagnetic waves created by the neurons.

Electroencephalography (EEG) is the most studied potential non-invasive interface, mainly due to its fine temporal resolution, ease of use, portability and low set-up cost. But as well as the technology's susceptibility to noise, another substantial barrier to using EEG as a brain-computer interface is the extensive training required before users can work the technology.

As well as furthering research on animal implantable devices, experiments on cultured neural tissue have focused on building problem-solving networks, constructing basic computers and manipulating robotic devices.

Research into techniques for stimulating and recording from individual neurons grown on semiconductor chips is sometimes referred to as neuroelectronics or neurochips. Development of the first working neurochip was claimed by a Caltech team led by Jerome Pine and Michael Maher in 1997. The Caltech chip had room for 16 neurons.

In the eighties, Wolpaw started EEG-based cursor control in normal adults using band power centered at 9 Hz. At this time the Wolpaw system in Albany is cue-based and uses autoregressive (AR) parameters. A linear equation defines the cursor movement necessary for character selection. The slow cortical potentials (SCPs) are measured in a 2-second window, referred to a 2-second baseline (cue-based), and are used to move a ball-like light toward a target.
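The idea of a linear equation driving the cursor from band power can be sketched in a few lines. The gain and offset values below are hypothetical illustrations, not the actual parameters of the Albany system:

```python
def cursor_step(amplitude, gain=2.0, offset=5.0):
    """Vertical cursor displacement as a linear function of the user's
    mu-band amplitude (gain and offset are illustrative placeholders
    for the user-specific constants a real system would calibrate)."""
    return gain * (amplitude - offset)

# Amplitude above the user's baseline moves the cursor up; below, down.
print(cursor_step(7.0))  # -> 4.0
print(cursor_step(3.0))  # -> -4.0
```

In practice, the gain and offset are adapted to each user so that the cursor is equally easy to move in both directions.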

Patients using this system are able to write text after many training sessions. The Graz BCI system is a cue-based system with motor imagery as the mental strategy; it classifies oscillatory activity in the alpha and beta frequency bands. Parameters are band power or adaptive AR parameters. The Donchin BCI is based on the presentation of a 6x6 letter matrix in which, at short intervals, one of the rows or one of the columns of the matrix is flashed.
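The Donchin-style selection step can be sketched as follows, assuming that averaged P300-like response scores for each row flash and column flash have already been extracted from the EEG. The matrix layout and the example scores are illustrative assumptions:

```python
# Hypothetical 6x6 Donchin matrix of letters and digits.
MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ0123"), list("456789")]

def select_character(row_scores, col_scores):
    """Pick the cell at the intersection of the row and the column whose
    flashes evoked the largest P300-like response score."""
    r = max(range(6), key=lambda i: row_scores[i])
    c = max(range(6), key=lambda j: col_scores[j])
    return MATRIX[r][c]

# Simulated averaged response amplitudes per row/column flash: the user
# is attending to the letter at row 2, column 3 ("P").
row_scores = [0.1, 0.2, 0.9, 0.1, 0.3, 0.2]
col_scores = [0.2, 0.1, 0.2, 0.8, 0.1, 0.3]
print(select_character(row_scores, col_scores))  # -> P
```

Because the P300 appears only for the attended cell's row and column, intersecting the two strongest responses identifies the intended character.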

A BCI can also be realized based on evaluating the amplitude of steady-state visual evoked potentials (SSVEPs) induced by flickering lights. When the user focuses attention on one of several flickering lights, the corresponding amplitude becomes enhanced.
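This amplitude comparison can be sketched with a simple FFT-based detector (Python with NumPy; the candidate flicker frequencies and the synthetic signal are illustrative assumptions):

```python
import numpy as np

def ssvep_target(signal, fs, flicker_freqs):
    """Return the flicker frequency with the largest spectral amplitude,
    taken as the light the user is attending to."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    amps = np.abs(np.fft.rfft(signal))
    # amplitude at the FFT bin nearest each candidate flicker frequency
    score = {f: amps[np.argmin(np.abs(freqs - f))] for f in flicker_freqs}
    return max(score, key=score.get)

# Synthetic two-second trial: the 15 Hz component dominates, so the
# detector should report that the user is attending to the 15 Hz light.
fs, n = 256, 512
t = np.arange(n) / fs
eeg = 0.3 * np.sin(2 * np.pi * 12 * t) + 1.0 * np.sin(2 * np.pi * 15 * t)
print(ssvep_target(eeg, fs, [12, 15, 20]))  # -> 15
```

A real SSVEP system would also consider harmonics of each flicker frequency and average over several electrodes, but the principle is the same: attention enhances the amplitude at the attended frequency.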

The most immediate and practical goal of Brain Computer Interface research is to create a mechanical output from neuronal activity. The challenge of Brain Computer Interface research is to create a system that will allow patients who have damage between their motor cortex and muscular system to bypass the damaged route and activate outside mechanisms by using neuronal signals.

This would potentially allow an otherwise paralyzed person to control a motorized wheelchair, computer pointer, or robotic arm by thought alone.

Fig 4. A brain-actuated wheelchair: the subject guides the wheelchair using a neuroprosthetic Brain Computer Interface.

Most Brain Computer Interfaces translate neural activity into a continuous movement command, which guides a computer cursor to a desired visual target.

If the cursor is used to select targets representing discrete actions, the Brain Computer Interface serves as a communication prosthesis. Examples include typing keys on a keyboard, turning on room lights, and moving a wheelchair in specific directions. Visual attention, however, might be needed for application control (to drive a wheelchair, to observe the environment, and so on).

Feedback plays an important role when learning to use a Brain Computer Interface. How can brainwaves directly control external devices? Implanted electrodes provide the highest-quality signals; for humans, however, noninvasive approaches avoid health risks and associated ethical concerns.

Fig.: Left: active brain areas. Upper right: extracted brain activity patterns. Lower right: pattern classification processing.

EEG signals suffer from reduced spatial resolution and increased noise when measurements are taken on the scalp. Consequently, current EEG-based brain-actuated devices are limited by low channel capacity and are considered too slow for controlling rapid and complex sequences of robot movements. Recently, researchers have shown for the first time that online EEG signal analysis, if used in combination with advanced robotics and machine learning techniques, is sufficient for humans to continuously control a mobile robot and a wheelchair.

An evoked BCI exploits a strong characteristic of the EEG, the evoked potential, which reflects the immediate automatic responses of the brain to some external stimuli. In principle, evoked potentials are easy to detect with scalp electrodes. However, evoking them requires external stimulation, so they apply to only a limited task range.

Spontaneous BCIs are based on the analysis of EEG phenomena associated with various aspects of brain function related to mental tasks that the subject carries out at will. In such asynchronous protocols, the subject can deliver a mental command at any moment without waiting for external cues. The user and the BCI are coupled together and adapt to each other. In other words, we use machine learning approaches to discover the individual EEG patterns characterizing the mental tasks users execute while learning to modulate their brainwaves in a way that will improve system recognition of their intentions.

We use statistical machine learning techniques at several levels of the system. Incorporating rejection criteria to avoid making risky decisions is an important BCI concern: how is it possible to control a robot that must make accurate turns at precise moments using signals that arrive at a rate of about one bit per second?
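A rejection criterion of this kind can be sketched as a threshold on the classifier's posterior probabilities; the mental-task labels and the threshold value below are illustrative assumptions:

```python
def decide(posteriors, labels, reject_below=0.8):
    """Return the most probable mental-task label, or None when the
    classifier is not confident enough (the rejection criterion)."""
    best = max(range(len(labels)), key=lambda i: posteriors[i])
    if posteriors[best] < reject_below:
        return None          # withhold the command rather than risk an error
    return labels[best]

labels = ["left", "right", "relax"]
print(decide([0.92, 0.05, 0.03], labels))  # -> left
print(decide([0.45, 0.40, 0.15], labels))  # -> None (rejected)
```

Withholding a decision costs a little speed but prevents the robot from executing a wrong command at a critical moment, which matters far more in device control than in text entry.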

The subject delivers a few high-level mental commands and the robot executes these commands autonomously using the readings of its onboard sensors.

This approach makes it possible to continuously control a mobile robot— emulating a motorized wheelchair—along nontrivial trajectories requiring fast and frequent switches between mental tasks. For brain-actuated robots, in contrast to augmented communication through BCI, fast decision making is critical.
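The shared-control idea, in which a high-level mental command is executed or vetoed by the robot's own sensor readings, might be sketched as follows (the command names and the safety rule are hypothetical):

```python
def execute(command, obstacle_ahead):
    """Map a high-level mental command to a low-level motion, letting the
    robot's onboard sensor reading override an unsafe instruction."""
    if command == "forward" and obstacle_ahead:
        return "stop"        # the autonomy layer vetoes the command
    return {"forward": "drive", "left": "turn_left",
            "right": "turn_right"}.get(command, "stop")

print(execute("forward", obstacle_ahead=False))  # -> drive
print(execute("forward", obstacle_ahead=True))   # -> stop
```

Because the robot handles low-level obstacle avoidance itself, the user only needs to deliver occasional high-level commands, which fits the roughly one-bit-per-second channel the EEG provides.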

Real-time control of brain-actuated devices, especially robots and neuroprostheses, is the most challenging BCI application. A first line of research is online adaptation of the interface to the user, to keep the BCI constantly tuned to its owner. In addition, brain signals change naturally over time; in particular, they can change from one session, which supplies the data to train the classifier, to the next session, which applies the classifier.

The most widely used neuroprosthetic device is the cochlear implant, which has been implanted in hundreds of thousands of people worldwide. There are also several neuroprosthetic devices that aim to restore vision, including retinal implants. The differences between Brain Computer Interfaces and neuroprosthetics are mostly in the ways the terms are used: neuroprosthetics typically connect the nervous system to a device, whereas Brain Computer Interfaces usually connect the brain or nervous system with a computer system.

The terms are sometimes used interchangeably and for good reason.


Neuroprosthetics and Brain Computer Interfaces seek to achieve the same aims, such as restoring sight, hearing, movement, the ability to communicate, and even cognitive function. Both use similar experimental methods and surgical techniques.

Cyberkinetics Neurotechnology Inc. markets its electrode arrays under the BrainGate product name and has set the development of practical Brain Computer Interfaces for humans as its major goal. Neural Signals was founded to develop Brain Computer Interfaces that would allow paralyzed patients to communicate with the outside world and control external devices. As well as an invasive Brain Computer Interface, the company also sells an implant to restore speech.

Neural Signals' Brain Communicator Brain Computer Interface device uses glass cones containing microelectrodes coated with proteins to encourage the electrodes to bind to neurons.

Avery Biomedical Devices and Stony Brook University are continuing development of the implant, which has not yet received FDA approval for human implantation. The Audeo is being developed to create a human-computer interface for communication without the need for physical motor control or speech production.


Using signal processing, unpronounced speech representing thoughts can be translated from intercepted neurological signals. Mindball is a product developed and commercialized by Interactive Productline in which players compete to control a ball's movement across a table by becoming more relaxed and focused. Interactive Productline is a Swedish company whose objective is to develop and sell easily understandable EEG products that train the ability to relax and focus.


A person who does not have a hand or leg can perform tasks with a robotic hand or leg, and BCIs have also been used in lie-detection tests.

Current BCIs typically require a wired connection to the equipment.


Some preliminary work is being done on synapsing neurons onto silicon transistors and on growing neurons into neural networks on top of computer chips. The computers translate brain activity and create the communication output using custom decoding software.

The brain, on the other hand, is on a whole other level of complexity compared with the workings of the inner ear. A BCI is a new communication link between a functioning human brain and the outside world.
