Neuroinformatics is an emergent field that combines informatics and neuroscience. It is concerned with neuroscience data and with information processing by artificial neural networks.[1] There are three main directions in which neuroinformatics is applied.[2]
Neuroinformatics encompasses philosophy (the computational theory of mind), psychology (information processing theory), and computer science (natural computing, bio-inspired computing), among other disciplines. Neuroinformatics does not deal with matter or energy,[3] so it can be seen as a branch of neurobiology that studies various aspects of nervous systems. The term neuroinformatics appears to be used synonymously with cognitive informatics, described by the Journal of Biomedical Informatics as an interdisciplinary domain that focuses on human information processing mechanisms and processes within the context of computing and computing applications.[4] According to the German National Library, neuroinformatics is synonymous with neurocomputing.[5] The Proceedings of the 10th IEEE International Conference on Cognitive Informatics and Cognitive Computing introduced the following description: cognitive informatics (CI) is a transdisciplinary enquiry spanning computer science, information science, cognitive science, and intelligence science; CI investigates the internal information processing mechanisms and processes of the brain and natural intelligence, as well as their engineering applications in cognitive computing.[6] According to the INCF, neuroinformatics is a research field devoted to the development of neuroscience data and knowledge bases together with computational models.[7]
Models of neural computation are attempts to elucidate, in an abstract and mathematical fashion, the core principles that underlie information processing in biological nervous systems, or functional components thereof. Due to the complexity of nervous system behavior, the associated experimental error bounds are ill-defined, but the relative merit of the different models of a particular subsystem can be compared according to how closely they reproduce real-world behaviors or respond to specific input signals. In the closely related field of computational neuroethology, the practice is to include the environment in the model in such a way that the loop is closed. In the cases where competing models are unavailable, or where only gross responses have been measured or quantified, a clearly formulated model can guide the scientist in designing experiments to probe biochemical mechanisms or network connectivity.
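One of the simplest such models is the leaky integrate-and-fire neuron, which reduces a cell's membrane dynamics to a single differential equation plus a threshold, and whose predicted spike times can be compared against a real neuron's response to the same input. A minimal sketch (parameter values here are illustrative, not taken from any particular study):

```python
def lif_spike_times(input_current, dt=1e-3, tau=0.02, r=1.0,
                    v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, a classic abstract model of
    neural computation.  Returns the times (in seconds) at which the
    membrane potential crosses threshold for a given current trace."""
    v, spikes = v_reset, []
    for step, i_in in enumerate(input_current):
        # Forward-Euler step of dv/dt = (-v + R*I) / tau
        v += dt * (-v + r * i_in) / tau
        if v >= v_thresh:          # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset            # membrane resets after each spike
    return spikes

# A constant suprathreshold current (R*I > v_thresh) drives the model
# to fire regularly; a weaker current would never reach threshold.
spikes = lif_spike_times([1.5] * 200)
```

Comparing such predicted spike trains against recorded ones is one concrete way the "relative merit" of competing models can be scored.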
Artificial neural networks (ANNs), usually simply called neural networks (NNs), are computing systems vaguely inspired by the biological neural networks that constitute animal brains.[8] An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons. An artificial neuron that receives a signal then processes it and can signal neurons connected to it. The "signal" at a connection is a real number, and the output of each neuron is computed by some non-linear function of the sum of its inputs. The connections are called edges. Neurons and edges typically have a weight that adjusts as learning proceeds. The weight increases or decreases the strength of the signal at a connection. Neurons may have a threshold such that a signal is sent only if the aggregate signal crosses that threshold. Typically, neurons are aggregated into layers. Different layers may perform different transformations on their inputs. Signals travel from the first layer (the input layer), to the last layer (the output layer), possibly after traversing the layers multiple times.
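The weighted-sum-and-nonlinearity scheme described above can be sketched in a few lines. The network shape and weight values below are arbitrary illustrations, not a trained model:

```python
import math

def sigmoid(x):
    # Non-linear activation: squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # Each output neuron computes a non-linear function of the
    # weighted sum of its inputs, as described above.
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# A tiny 2-input -> 2-hidden -> 1-output network with fixed weights.
# Learning would adjust these weights; here they are just constants.
hidden = layer_forward([1.0, 0.5],
                       [[0.4, -0.6], [0.3, 0.8]],
                       [0.0, -0.1])
output = layer_forward(hidden, [[1.2, -0.7]], [0.05])
```

Signals enter at the input layer, are transformed by each successive layer, and emerge at the output layer, exactly as in the description above.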
Brain emulation is the concept of creating a functioning computational model and emulation of a brain or part of a brain. In December 2006,[9] the Blue Brain project completed a simulation of a rat's neocortical column. The neocortical column is considered the smallest functional unit of the neocortex. The neocortex is the part of the brain thought to be responsible for higher-order functions like conscious thought, and contains 10,000 neurons in the rat brain (and 10^8 synapses). In November 2007,[10] the project reported the end of its first phase, delivering a data-driven process for creating, validating, and researching the neocortical column. An artificial neural network described as being "as big and as complex as half of a mouse brain"[11] was run on an IBM Blue Gene supercomputer by the University of Nevada's research team in 2007. Each second of simulated time took ten seconds of computer time. The researchers claimed to observe "biologically consistent" nerve impulses that flowed through the virtual cortex. However, the simulation lacked the structures seen in real mouse brains, and the researchers intend to improve the accuracy of the neuron and synapse models.[12] Mind uploading is the process of scanning a physical structure of the brain accurately enough to create an emulation of the mental state (including long-term memory and "self") and copying it to a computer in a digital form.
The computer would then run a simulation of the brain's information processing, such that it would respond in essentially the same way as the original brain and experience having a sentient conscious mind.[13][14][15] Substantial mainstream research in related areas is being conducted in animal brain mapping and simulation, development of faster supercomputers, virtual reality, brain–computer interfaces, connectomics, and information extraction from dynamically functioning brains.[16] According to supporters, many of the tools and ideas needed to achieve mind uploading already exist or are under active development; they concede that others remain highly speculative, but maintain that they are still within the realm of engineering possibility.
Research on brain–computer interfaces began in the 1970s at the University of California, Los Angeles under a grant from the National Science Foundation, followed by a contract from DARPA.[17][18] The papers published after this research also mark the first appearance of the expression brain–computer interface in scientific literature. Recently, human–computer interaction studies applying machine learning to statistical temporal features extracted from frontal-lobe EEG brainwave data have shown high levels of success in classifying mental states (relaxed, neutral, concentrating), mental emotional states (negative, neutral, positive)[19] and thalamocortical dysrhythmia.[20]
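A rough sketch of the "statistical temporal features" step such studies describe is given below; the specific features, window length, and synthetic signal are illustrative assumptions, not the feature set used in the cited papers:

```python
import statistics

def temporal_features(window):
    """Simple statistical temporal features of one EEG window, of the
    kind fed to a classifier in mental-state studies (the feature
    choice here is illustrative, not taken from the papers)."""
    diffs = [b - a for a, b in zip(window, window[1:])]
    return {
        "mean": statistics.fmean(window),
        "std": statistics.stdev(window),
        "min": min(window),
        "max": max(window),
        # Mean absolute successive difference: a crude "activity" measure.
        "mean_abs_diff": statistics.fmean(abs(d) for d in diffs),
    }

fs = 128  # assumed sampling rate (samples per second)
# Stand-in for a single EEG channel; real data would replace this.
signal = [((i * 7) % 13) / 13.0 for i in range(512)]
# Non-overlapping one-second windows, each reduced to a feature vector.
features = [temporal_features(signal[i:i + fs])
            for i in range(0, len(signal) - fs + 1, fs)]
```

The resulting per-window feature vectors would then be passed to a conventional classifier trained on labelled mental-state recordings.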
Neuroinformatics (in the context of library science) is also devoted to the development of neurobiology knowledge with computational models and analytical tools for the sharing, integration, and analysis of experimental data, and for the advancement of theories about nervous system function. In the INCF context, this field refers to scientific information about primary experimental data, ontology, metadata, analytical tools, and computational models of the nervous system. The primary data include experiments and experimental conditions concerning the genomic, molecular, structural, cellular, network, systems and behavioural levels, in all species and preparations, in both normal and disordered states.[28] In the past decade, as vast amounts of diverse data about the brain were gathered by many research groups, the problem arose of how to integrate data from thousands of publications to enable efficient tools for further research. Biological and neuroscience data are highly interconnected and complex, and integration itself represents a great challenge for scientists.
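One common integration tactic is to key heterogeneous records to a shared ontology term so that measurements from different labs can be merged. In the toy sketch below, the lab records and measures are hypothetical, and the UBERON-style identifiers merely mimic the format of a real anatomy ontology:

```python
from collections import defaultdict

# Hypothetical records from two labs describing the same brain regions,
# each tagged with a shared ontology identifier.
lab_a = [
    {"ontology_id": "UBERON:0001950", "measure": "cell_density", "value": 91000},
    {"ontology_id": "UBERON:0002421", "measure": "cell_density", "value": 110000},
]
lab_b = [
    {"ontology_id": "UBERON:0001950", "measure": "volume_mm3", "value": 489.0},
]

def integrate(*sources):
    # Group every measurement under its shared ontology term, so that
    # data about the same structure from different labs line up.
    merged = defaultdict(dict)
    for source in sources:
        for rec in source:
            merged[rec["ontology_id"]][rec["measure"]] = rec["value"]
    return dict(merged)

integrated = integrate(lab_a, lab_b)
```

Real neuroinformatics integration must additionally reconcile units, experimental conditions, and conflicting measurements, which is why the problem is far harder than this keyed merge suggests.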
The United States National Institute of Mental Health (NIMH), the National Institute of Drug Abuse (NIDA) and the National Science Foundation (NSF) provided the National Academy of Sciences Institute of Medicine with funds to undertake a careful analysis and study of the need to introduce computational techniques to brain research. The positive recommendations were reported in 1991.[29] This positive report enabled NIMH, then directed by Allan Leshner, to create the "Human Brain Project" (HBP), with the first grants awarded in 1993. Next, Stephen Koslow of the NIH pursued the globalization of the HBP and neuroinformatics through the European Union and the Organisation for Economic Co-operation and Development (OECD), Paris, France. Two particular opportunities occurred in 1996.
The two related initiatives were combined to form the United States proposal on "Biological Informatics". This initiative was supported by the White House Office of Science and Technology Policy and presented at the OECD Megascience Forum (MSF) by Edwards and Koslow. An MSF committee on Biological Informatics was established with two subcommittees: 1. Biodiversity (chair, James Edwards, NSF), and 2. Neuroinformatics (chair, Stephen Koslow, NIH). At the end of two years, the Neuroinformatics subcommittee of the Biological Working Group issued a report supporting a global neuroinformatics effort. Koslow, working with the NIH and the White House Office of Science and Technology Policy, then established a new neuroinformatics working group to develop specific recommendations in support of the more general recommendations of the first report. The Global Science Forum (GSF; renamed from MSF) of the OECD supported this recommendation.
This scheme was intended to eliminate national and disciplinary barriers and provide the most efficient approach to global collaborative research and data sharing; under it, each country is expected to fund its own participating researchers. The GSF neuroinformatics committee then developed a business plan for the operation, support and establishment of the INCF, which the GSF Science Ministers supported and approved at their 2004 meeting. In 2006 the INCF was created and its central office established at the Karolinska Institute, Stockholm, Sweden, under the leadership of Sten Grillner. Sixteen countries (Australia, Canada, China, the Czech Republic, Denmark, Finland, France, Germany, India, Italy, Japan, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom and the United States) and the EU Commission established the legal basis for the INCF and its Programme in International Neuroinformatics (PIN). To date, eighteen countries (Australia, Belgium, the Czech Republic, Finland, France, Germany, India, Italy, Japan, Malaysia, the Netherlands, Norway, Poland, the Republic of Korea, Sweden, Switzerland, the United Kingdom and the United States) are members of the INCF, and membership is pending for several other countries.

The goal of the INCF is to coordinate and promote international activities in neuroinformatics. The INCF contributes to the development and maintenance of database and computational infrastructure, and to support mechanisms for neuroscience applications. The system is expected to give the international research community access to all freely accessible human brain data and resources. More generally, the INCF aims to provide convenient and flexible applications for neuroscience laboratories in order to improve our knowledge about the human brain and its disorders.