The complexity and potential of the human brain are astonishing. The brain can perform millions of complex computations in milliseconds. And yet, despite this power, humans are barely tapping the brain's potential: many researchers claim that most human beings use only a fraction of their brain's capacity.
Computers have been compared to human brains in many ways. Computers are extremely fast processing devices, and we generally assume they can do things faster than we can, since they perform very complex computations in microseconds. The fastest supercomputer today is Tianhe-2, developed by China's National University of Defense Technology, with a performance of 33.86 petaflops (quadrillions of calculations per second).
Although this is almost incomprehensibly fast, current computer technology does not come close to imitating the operations of the human brain. In fact, for a computer to approach the processing speed and skill of the brain, it would have to be very bulky, expensive, and power-hungry. For example, to reach its incredible processing speed, Tianhe-2 uses 3,120,000 processor cores at once, occupies 720 square meters (the equivalent of 10 midsize apartments), and draws 17.8 megawatts of power (enough to supply 6,000 homes at the same time). The system has 1.4 million gigabytes of memory and 12.4 million gigabytes of storage, and was developed by a team of 1,300 scientists and engineers at a total cost of 2.4 billion yuan (about 390 million US dollars).
The human brain packs phenomenal computing power into a tiny space (about 0.01 square meter) and uses only 20 watts of energy (barely enough to run a dim light bulb). You could pack 72,000 human brains into Tianhe-2's footprint, and the supercomputer consumes as much power as 890,000 brains. So one would expect it to be much more powerful than our tiny brains. But is it?
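The space and power comparisons above reduce to simple division. A quick sketch, using only the figures quoted in the text:

```python
# Back-of-the-envelope comparison of Tianhe-2 and the human brain.
tianhe2_area_m2 = 720.0      # floor space occupied by Tianhe-2
tianhe2_power_w = 17.8e6     # 17.8 megawatts
brain_area_m2 = 0.01         # rough footprint of one human brain
brain_power_w = 20.0         # ~20 watts

brains_by_space = tianhe2_area_m2 / brain_area_m2
brains_by_power = tianhe2_power_w / brain_power_w

print(f"Brains fitting in Tianhe-2's footprint: {brains_by_space:,.0f}")   # 72,000
print(f"Brains matched by Tianhe-2's power draw: {brains_by_power:,.0f}")  # 890,000
```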
Just imagine, for one second, a supercomputer mimicking the human brain. In Japan, researchers used another supercomputer, named “K,” to simulate human brain activity for exactly one second. K is currently the fourth most powerful computer in the world, with 705,024 processor cores and 1.4 million gigabytes of memory.
The researchers used the open-source Neural Simulation Technology (NEST) tool to replicate a network of 1.73 billion nerve cells connected by 10.4 trillion synapses. While significant in size, the simulated network represented only 1% of the neuronal network in the human brain.
It took 40 minutes of computing time for this supercomputer to simulate just one second of activity in that 1% of the brain. No computer in existence today is powerful enough to simulate the whole brain at the level of individual nerve cells and their synapses.
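The slowdown implied by those numbers can be worked out directly. The extrapolation to the whole brain below assumes the cost scales linearly with network size, which is almost certainly optimistic:

```python
compute_seconds = 40 * 60      # 40 minutes of computing time on K
simulated_seconds = 1          # one second of biological activity
brain_fraction = 0.01          # the model covered ~1% of the brain's network

slowdown = compute_seconds / simulated_seconds     # 2,400x slower than real time
# Naive linear extrapolation to 100% of the brain:
whole_brain_seconds = slowdown / brain_fraction    # 240,000 s of computing
print(f"{slowdown:,.0f}x slower than real time for 1% of the brain")
print(f"~{whole_brain_seconds / 86400:.1f} days per simulated second, whole brain")
```

Even under that generous assumption, one second of whole-brain activity would take nearly three days to compute.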
This is because the average human brain consists of about 100 billion nerve cells (neurons) linked together by trillions of connections called synapses. As tiny electrical impulses travel along each neuron, they must cross these synapses, each of which contains about 1,000 molecular switches that route the impulse. In total, one human brain may contain hundreds of trillions of these neural pathways, and each neuron can reach any other neuron through no more than six inter-neuronal connections: six degrees of separation.
The human brain is also the cheapest and most compact storage unit you can get. Considering there are around 100 billion cells in the human brain, and that each cell's DNA can store about 1.5 gigabytes of data, the brain's DNA can hold roughly 150 billion gigabytes of data (the equivalent of about 30 billion DVDs' worth of space).
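That storage estimate is a straightforward product of the two figures given. The 4.7 GB single-layer DVD capacity used below is an assumption, since the text states only the DVD count:

```python
neurons = 100e9        # ~100 billion cells in the brain
gb_per_cell = 1.5      # ~1.5 GB of data per cell's DNA
dvd_gb = 4.7           # assumed capacity of a single-layer DVD

total_gb = neurons * gb_per_cell
print(f"Total DNA storage: {total_gb:,.0f} GB")   # 150,000,000,000 GB
print(f"Equivalent DVDs:   {total_gb / dvd_gb:,.0f}")
```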
These unique capabilities of the human brain in terms of computation and data storage continue to inspire the next generation of supercomputer designers. Drawing on the fascinating architecture of the brain, scientists have recently developed a new kind of computer chip that uses no more power than a hearing aid and may eventually excel at calculations that perplex today's supercomputers. The chip mimics the way brains recognize patterns, relying on densely interconnected webs of transistors similar to the brain's neural networks.
The new brain-inspired chip's electronic neurons signal one another when the amount of sensed light passes a certain threshold. Working in parallel, the neurons organize the data into patterns suggesting the light is growing brighter, or changing color or shape. The chip contains 5.4 billion transistors, yet draws just 70 milliwatts of power. By contrast, modern Intel processors in today's personal computers and data centers may have 1.4 billion transistors and consume 35 to 140 watts, roughly 500 to 2,000 times more power than the new chip. With one million electronic neurons, the chip is about as complex as the brain of a bee.
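The power gap quoted above spans a range rather than a single number. Dividing the conventional processor's 35–140 W draw by the chip's 70 mW:

```python
chip_power_w = 0.070              # 70 milliwatts for the brain-inspired chip
cpu_low_w, cpu_high_w = 35, 140   # power range quoted for Intel processors

ratio_low = cpu_low_w / chip_power_w
ratio_high = cpu_high_w / chip_power_w
print(f"{ratio_low:.0f}x to {ratio_high:.0f}x more power")   # 500x to 2000x
```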
In another recent development, IBM has unveiled a prototype of a brain-inspired computer powered by what it calls "electronic blood." Inspired by the space and energy efficiency of the brain, IBM developed a "redox flow" system that pumps an electrolyte "blood" through the computer, carrying power in and taking heat out.
Their vision is that a petaflop supercomputer, which today would fill half a football field, will fit on your desktop by the year 2060. With traditional approaches, this is impossible: building a petaflop supercomputer by simply adding more CPUs would require an area of 13,000 square meters, nearly two football fields, to host the machine, and 320 megawatts of electricity to power it, equivalent to the consumption of a medium-sized city.
If we want to fit a supercomputer inside a sugar cube in the near future, we will need a major paradigm shift in today's electronics technology. To achieve this shift, we can take our cue from the astonishing brain. As mentioned above, the human brain is tens of thousands of times denser and more efficient than any computer produced with the latest technology. That is possible because it uses a single, extremely efficient network of capillaries and blood vessels to deliver energy and carry away heat at the same time.
According to the scientists working in this area, the next generation of supercomputers could be far more efficient in terms of storage space, energy consumption, and performance if they are inspired by our brains. Although engineering computers that reach and exceed the complexity of the brain still remains a dream, our miraculous brain continues to inspire scientists as they pursue new technological discoveries.
What would you trust to remember critical information – your brain, or your computer? If you said your computer, you would be wrong. Even the strongest computers in the world cannot match the human brain’s complexity, memory, and processing speed. The human brain is the world’s greatest computer.
Caine, Renate Nummela, and Geoffrey Caine. "Making connections: Teaching and the human brain." Educational Resource Information Center (1991).
Gholkar, Neha, Frank Mueller, and Barry Rountree. "A Power-aware Cost Model for HPC Procurement." In Proceedings of the Twelfth Workshop on High-Performance, Power-Aware Computing (HPPAC'16), Chicago, IL, May 2016.
Zhendong, Pu. "China bytes back with fastest computer." China Daily (2013).
Versace, Massimiliano, and Ben Chandler. "The brain of a new machine." IEEE Spectrum 47, no. 12 (2010): 30-37.
Helias, Moritz, Susanne Kunkel, Gen Masumoto, Jun Igarashi, Jochen Martin Eppler, Shin Ishii, Tomoki Fukai, Abigail Morrison, and Markus Diesmann. "Supercomputers ready for use as discovery machines for neuroscience." Frontiers in Neuroinformatics 6, no. 26 (2012): 2.
Diesmann, Markus, and Marc-Oliver Gewaltig. "NEST: An environment for neural systems simulations." Forschung und wissenschaftliches Rechnen, Beiträge zum Heinz-Billing-Preis 58 (2001): 43-70.
Moravec, Hans. "When will computer hardware match the human brain." Journal of Evolution and Technology 1, no. 1 (1998): 10.
Drachman, David A. "Do we have brain to spare?" Neurology 64, no. 12 (2005): 2004-2005.
Goldman, Nick, Paul Bertone, Siyuan Chen, Christophe Dessimoz, Emily M. LeProust, Botond Sipos, and Ewan Birney. "Towards practical, high-capacity, low-maintenance information storage in synthesized DNA." Nature 494, no. 7435 (2013): 77-80.
Hsu, John. "IBM's new brain [News]." IEEE Spectrum 51, no. 10 (2014): 17-19.
Damaraju, Satish, Varghese George, Sanjeev Jahagirdar, Tanveer Khondker, Robert Milstrey, Sanjib Sarkar, Scott Siers, Israel Stolero, and Arun Subbiah. "A 22nm IA multi-CPU and GPU system-on-chip." In Solid-State Circuits Conference Digest of Technical Papers (ISSCC), 2012 IEEE International, pp. 56-57. IEEE, 2012.
Ruch, Patrick, Thomas Brunschwiler, Stephan Paredes, Ingmar Meijer, and Bruno Michel. "Roadmap towards ultimately-efficient zeta-scale datacenters." In Proceedings of the Conference on Design, Automation and Test in Europe, pp. 1339-1344. EDA Consortium, 2013.
Garimella, Suresh V., Tim Persoons, Justin Weibel, and Lian-Tuu Yeh. "Technological drivers in data centers and telecom systems: Multiscale thermal, electrical, and energy management." Applied Energy 107 (2013): 66-80.