The hot jobs of this decade, almost without exception, have become “cerebral” in some way or another. Programmers build complex algorithms, quantitative financial analysts build equally complex models, and data analysts (under their myriad titles) are swimming in complex methods. Even in the health industry you can see the trend toward an increased emphasis on problem-solving ability…physician assistants capable of accurately diagnosing various conditions (long the exclusive domain of board-certified medical doctors) are more in demand than ever. How appropriate, then, that brain technology would further this trend toward the “cerebral”-ization of work.
In collaboration with computer scientists, brain researchers have poked holes in the veil of the future–technologies previously possible only in the pages of Isaac Asimov and other sci-fi writers, such as Deep Brain Stimulation, Neuromorphic Computing, and Machine Learning, have opened a new frontier for game-changing products and applications.
Deep Brain Stimulation (DBS)
DBS is essentially a pacemaker, repurposed for the brain. While nearly all current applications of DBS involve correcting disruptive electrical signals in the brain, it demonstrates that it is possible to externally and locally trigger specific brain circuits responsible for motion, sensation, memory, emotion, and even abstract thought. Why might this lead to the creation of so-called hot jobs? Imagine being the engineer who implements a DBS system that helps reduce boredom-driven food cravings, or a DBS system that helps you recognize individuals by stimulating the circuits containing relevant information about them.
Neuromorphic Computing
[Image: a neuromorphic processor with a quarter-million synapses on a 16×16 node array]
You might have already pieced together what this means, but it is just what it sounds like: computers that are like brains in form. They don’t actually look like brains, but they use an underlying architecture of nodes (neurons) connected (à la synapses) in a network with variable strengths. These variable connection strengths are what allow learning to happen (if you forgot how this works, look here). As you can imagine, such a chip is fundamentally different from the Intel processor your desktop or laptop probably has under the hood: like your brain, a neuromorphic chip must be trained in order to perform a task. Another interesting feature of these chips is that the task itself also needs to be designed. I can’t imagine a sexier job than thinking up tasks and training regimes for neuromorphic chips! If you aren’t convinced this is possible, or coming in the near future, you might be surprised to hear that Intel and Qualcomm already have working prototypes and are planning to put them into cell phones very soon (read about it here).
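The node-and-synapse idea above can be sketched in a few lines of code. This is a toy illustration only, not any vendor's actual chip or API: a single node sums weighted inputs, "fires" past a threshold, and a perceptron-style update adjusts the synaptic strengths during training.

```python
# Toy sketch of nodes ("neurons") joined by weighted links ("synapses").
# All names here are illustrative, not a real neuromorphic API.
import random

class ToyNode:
    """One node whose incoming synaptic strengths are adjusted by training."""

    def __init__(self, n_inputs):
        # Synaptic strengths start small and random; learning tunes them.
        self.weights = [random.uniform(-0.1, 0.1) for _ in range(n_inputs)]

    def fire(self, inputs):
        # The node "fires" (outputs 1) when weighted input crosses a threshold.
        total = sum(w * x for w, x in zip(self.weights, inputs))
        return 1 if total > 0 else 0

    def train(self, inputs, target, rate=0.1):
        # Perceptron-style rule: strengthen or weaken each synapse in
        # proportion to the error. This "training" step replaces
        # conventional programming.
        error = target - self.fire(inputs)
        self.weights = [w + rate * error * x
                        for w, x in zip(self.weights, inputs)]

# Train the node to behave like a logical OR gate.
random.seed(0)
node = ToyNode(2)
examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
for _ in range(20):
    for inputs, target in examples:
        node.train(inputs, target)

print([node.fire(x) for x, _ in examples])  # -> [0, 1, 1, 1]
```

The point of the sketch is that no explicit OR logic is ever written; the behavior emerges entirely from the training regime, which is exactly the design work described above.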
Machine Learning
If the concept of a machine learning doesn’t sound totally anthropomorphic to you…it probably should. Once again, our understanding of how networks of neurons work has opened a huge can of worms for those who know how to hook them up and go fishing. Machine learning forms much of the theoretical framework underlying neuromorphic computing; the major difference is that, not being fixed in hardware, it gives the user enormous flexibility to build creative and novel solutions. The types of problems being solved with machine learning are remarkable: there are many things that you and I are good at but that would make your computer crash every time–face recognition, reading, writing, speaking, listening, and identifying objects are all within the domain of machine learning. As you can imagine, we have only begun to tap the well of interesting applications for machine learning, and there may be an inexhaustible need for engineers to come up with them.
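To make the software-flexibility point concrete, here is a minimal sketch of one of the simplest learners there is, a 1-nearest-neighbor classifier, in plain Python. The "fruit" data and the object-identification framing are invented for illustration; a real system would use far richer features and models.

```python
# Minimal 1-nearest-neighbor classifier: label a query with the label of
# its closest training example. Pure software, no special hardware needed.

def nearest_neighbor(train, query):
    """Return the label of the training example closest to `query`."""
    def dist(a, b):
        # Squared Euclidean distance between two feature tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    label, _ = min(((lbl, dist(feats, query)) for feats, lbl in train),
                   key=lambda pair: pair[1])
    return label

# Toy "object identification": classify fruit by (weight in g, diameter in cm).
# These numbers are made up for the example.
train = [((150, 7), "apple"), ((160, 8), "apple"),
         ((15, 2), "grape"), ((12, 2), "grape")]

print(nearest_neighbor(train, (140, 7)))  # a heavy, wide fruit -> apple
print(nearest_neighbor(train, (14, 2)))   # a small one -> grape
```

Swapping in a different model here is a matter of editing a function, which is the flexibility that software-based machine learning offers over a fixed hardware implementation.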
Add some meat to your social media feed…follow The Public Brain Journal on Twitter
Clayton S. Bingham is a Biomedical Engineer working at the Center for Neural Engineering at the University of Southern California. Under the direction of Drs. Theodore Berger and Dong Song, Clayton builds large-scale computational models of neurological systems. Currently the emphasis is on modeling the response of hippocampal tissue to electrical stimulation, with the goal of optimizing the placement of stimulating electrodes in dysfunctional regions of the brain. These therapies can be used for a broad range of pathologies, including Alzheimer’s, various motor disorders, depression, and epilepsy.
If you have any interest in writing here or would like to hear more about the work done by Clayton in the USC Center for Neural Engineering he can be reached at: clayton dot bingham at gmail dot com.