Post by 17robots on Jul 19, 2015 19:21:01 GMT -5
This is an article about the software OpenViBE and its possibility for use in a virtual reality brain computer interface.
OpenViBE Software
First, we need to understand what OpenViBE is. According to their website, "OpenViBE is a software dedicated to designing, testing and using brain-computer interfaces." They continue, "OpenViBE is a software for real-time neurosciences (that is, for real-time processing of brain signals). It can be used to acquire, filter, process, classify, and visualize brain signals in real-time." OpenViBE is also free and supports both the Windows and Linux operating systems. It is open source, and its acquisition server supports over 30 acquisition devices, allowing you to switch between devices without making any modification to the processing chain. The list can be found here.
It also features a graphical programming language for those who aren't fans of textual programming, and it comes equipped with a Lua interpreter inside the OpenViBE software. If you prefer not to use Lua, it also features a Python interpreter with the SciPy and NumPy libraries available.
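To give a feel for what that Python scripting side enables, here is a rough, self-contained sketch (this is not OpenViBE's actual box API; the function and the synthetic signal are purely illustrative) of the kind of filter-and-measure step such an interpreter lets you write, using only NumPy:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate power in the [low, high] Hz band via the FFT (rectangular window)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

# Synthetic one-second "EEG" trace: a 10 Hz (mu-band) oscillation plus noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)

mu = band_power(eeg, fs, 8, 12)       # mu band, associated with motor activity
noise = band_power(eeg, fs, 20, 24)   # a control band with no signal component
print(mu > noise)
```

Real motor-imagery pipelines do considerably more (spatial filtering, proper windowing, trained classifiers), but this is the scale of script the embedded interpreter is there for.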
The OpenViBE introductory video
As you saw in the video, the user can issue various commands by thinking about moving different body parts. It does require quite a bit of concentration, and can leave the user tired after a while.
The user calibrates the system by moving the body parts in real life, so that the software can pick up the signals from the motor cortex. Afterwards, the user only has to think about moving that part of the body, and the command will execute. This can prove to be a slow process (as seen in the museum tour scene before the checkpoints), but these commands allow for a range of possibilities, from moving a ball to walking through a museum (again, seen in the museum demo).
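That calibrate-then-command loop can be sketched in miniature. The numbers and names below are purely illustrative (not OpenViBE code): record a feature such as band power during "rest" and "move" calibration trials, then threshold new samples halfway between the two class means:

```python
import numpy as np

rng = np.random.default_rng(1)

# Calibration: feature values (e.g. mu-band power) recorded while the user
# actually moves, and while they rest. Real systems use many trials per class.
rest_trials = rng.normal(loc=1.0, scale=0.2, size=50)
move_trials = rng.normal(loc=3.0, scale=0.4, size=50)

# A minimal classifier: a threshold halfway between the two class means.
threshold = (rest_trials.mean() + move_trials.mean()) / 2

def decode(feature):
    """Map one new feature value to a command."""
    return "move" if feature > threshold else "idle"

print(decode(2.9), decode(1.1))
```

This also shows why recalibration is needed per session: the threshold is fit to that day's recordings, and real EEG statistics drift between sessions.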
This introduction shows a lot of the capabilities and potential this piece of software has when used by the right people. But what do we really want? What do we want our system to actually be able to do, and how does OpenViBE compare? Let's begin by looking at our expectations.
Our Expectations
There are a number of things that we expect a virtual reality brain-computer interface to do. For example, we want it to be able to read sensory output from the virtual environment (such as audio and video) and write that to our brains. This would cause our brain to think we were in the virtual environment and not in real life. We also want the device to read from our motor cortex and use those readings as input for virtual reality, all while our actual body stays immobile. What I want to focus on is the input aspect of the device, as it pertains to the software.
We want to read, with some sort of brain imaging, the motor cortex signals that are supposed to be sent to our muscles, convert those signals into code, and turn them into some sort of action in a virtual environment. That is the over-simplified version: in the virtual environment, we want the animations of an avatar to match how we move our bodies normally. Take lifting the arm, for example. We expect the BCI system to read the signals from the brain that are telling the body to raise the arm, and we want them to be turned into an animation of the arm being raised. Another example is walking: we want the BCI system to do what it did in the arm example, but also move us forward in the virtual environment. This would apply to every action, including running, jumping, moving an individual finger, etc. The point is that we want to feel like we are actually moving in the virtual environment as if it were real life. That also means that the system needs to be fast and precise, with little to no latency.
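To make the last hop of that pipeline concrete, here is a toy sketch (every name here is hypothetical; no real engine or BCI API is being quoted) of decoded motor intents being dispatched to avatar animations, with a latency-budget check, since the whole point above is that the system must be fast and precise:

```python
import time

# Hypothetical mapping from a decoded motor intent to avatar animation steps.
ANIMATIONS = {
    "raise_arm": ["arm_raised"],
    "walk":      ["step", "move_forward"],
}

def dispatch(intent, avatar_state, budget_ms=50.0):
    """Apply the animation steps for a decoded intent.

    Returns True if the update fit inside the latency budget, which is the
    kind of constraint a believable VR avatar would impose on the BCI loop.
    """
    start = time.perf_counter()
    avatar_state.extend(ANIMATIONS[intent])
    elapsed_ms = (time.perf_counter() - start) * 1000
    return elapsed_ms <= budget_ms

avatar = []
ok = dispatch("raise_arm", avatar)
dispatch("walk", avatar)
print(avatar, ok)
```

The hard part, of course, is everything before `dispatch`: decoding the intent reliably and quickly enough that a 50 ms-class budget is even meaningful.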
Though the expectations seem high, let's see if OpenViBE meets them.
What the OpenViBE Software has to offer
We saw most of the features in the introduction. I also took the initiative to download the program and take a tour around, just to see what we were dealing with.
The Pros
1. It is free. There is no financial cost for any group who wants to use this software.
2. It is open source. Anyone who wants to can develop their own programs to do specific tasks.
3. It has a self-explanatory graphical programming language for those who prefer not to program.
4. It supports over 30 BCI devices, which allows for strong compatibility.
5. It executes commands based on signals from the motor cortex, which is what we want to use as input.
6. The software can be used for virtual environments and video games.
7. It is written in C++.
The Cons
Although the software offers quite a few features that meet our needs, there are some drawbacks, mainly in the performance area. For starters, the introductory video showed that the demos were slow and occasionally didn't pick up the right signals. In addition, whenever you start a new project or task, you always have to recalibrate the signals. Though this is understandable, it is not what we are aiming for. Another specification that doesn't meet our expectations is scripting through a Lua interpreter rather than a C++ one that would match the underlying language. Though some find Lua a great language, it is sometimes unreliable and not as powerful as the software's C++ core.
Conclusions
On balance, the pros of this software beat out the cons. Although it is a great system, it has a couple of performance issues that make it hard to use right now. However, for what you are getting without spending a penny, you really are getting a lot. Hopefully the developers can improve the software and optimize it enough for use in FDVR (Full Dive Virtual Reality). I suggest downloading and examining it, since it costs nothing and has a wide range of applications.
Sources
1. Y. Renard, F. Lotte, G. Gibert, M. Congedo, E. Maby, V. Delannoy, O. Bertrand, A. Lécuyer, "OpenViBE: An Open-Source Software Platform to Design, Test and Use Brain-Computer Interfaces in Real and Virtual Environments", Presence: Teleoperators and Virtual Environments, vol. 19, no. 1, 2010.
Personal Comments
This is the first article I have ever written, and there wasn't much I could find. I am sorry that it was so short, and I hope to improve over time.