When I was a teenager in the early-to-mid 80s, I was very excited about the potential for computers to simulate intelligence. An early reading of Margaret Boden's excellent introduction to Artificial Intelligence (AI), Artificial Intelligence and Natural Man (stolen from my mother, who was studying behavioural psychology at the time), and my father's Christmas gift one year of Douglas Hofstadter's Gödel, Escher, Bach: An Eternal Golden Braid, persuaded me that AI was not only possible, it was inevitable, and that not only could machines be made to think like humans but humans could be uploaded to machines. I was, I suppose, an early proponent of transhumanism.
But over the last fifteen years, my view has changed. Studying chemistry, molecular biology, and evolution has convinced me that life and intelligence are intimately bound up with proteins, neurons, axons, and blood, and the chemistry that binds them all together. Transhumanists talk about creating a new substrate for human intelligence, on silicon or something else. They believe that the essence of what makes us intelligent, what makes us us, can be distilled from the biology and uploaded into a new technology. Singularitarianism, a more extreme form of transhumanism, holds that, given enough processing power, the machines themselves will become so powerful as to spontaneously achieve intelligence: the so-called technological singularity.
I'll deal with these two beliefs separately.
The first, the ability to distill a human mind and reinfuse a new substrate with it, is quite appealing. Imagine being able to forgo your clapped-out old body and transplant yourself into a new, and essentially immortal, one. No more aches and pains; no more disease or failings due to old age. And if your new body did start to wear out or got damaged, you could fix it, get a new one, or keep spare copies of yourself hanging around for just such an eventuality.
The only slight problem is that the conditions under which life, and likewise intelligence, emerges from biology and chemistry are very far from being understood. The structures of the brain appear to be important in supporting our intellects, but why and where that happens, and whether that structural information can be extracted and recreated, is absolutely beyond current knowledge. We have around one hundred billion neurons and perhaps one hundred trillion synapses in our brains. By contrast, one of the best-understood animals, the nematode worm, has 302 neurons, and we still don't understand how that tiny brain works, let alone how to upload a worm-intellect to a microchip. The fruit fly has 10,000 neurons, and its behaviour is already orders of magnitude more complex than the worm's. No doubt advances will be made in the simulation of individual neurons and, later, mass neuron assemblages, but simulation is not emulation.
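To put those figures side by side, here is a back-of-the-envelope sketch using only the numbers quoted above (orders of magnitude, nothing more):

```python
# Orders-of-magnitude comparison of the nervous systems mentioned above.
human_neurons = 100e9     # ~one hundred billion
human_synapses = 100e12   # ~one hundred trillion
worm_neurons = 302        # C. elegans, every neuron mapped
fly_neurons = 10_000      # fruit fly figure used above

neuron_ratio = human_neurons / worm_neurons
synapses_per_neuron = human_synapses / human_neurons

print(f"human/worm neuron ratio: {neuron_ratio:.0e}")
print(f"average synapses per human neuron: {synapses_per_neuron:.0f}")
```

Even before considering connectivity, the human brain outscales the one nervous system we have fully mapped by a factor of hundreds of millions.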
Which brings me to the second, and related, issue: the increasing power of computers. Moore's Law is an oft-quoted and oft-abused name for a trend in computer processor technology. In its pure form, it states that "the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years". Plotting the number of transistors on a vertical log scale versus time shows a more-or-less straight line.
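That straight-line behaviour falls directly out of the doubling rule. A minimal sketch in Python (the parameters are illustrative, not real chip data; the 1971 baseline of 2,300 transistors is my assumption, corresponding to the Intel 4004):

```python
import math

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Idealised Moore's-law count: doubles every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# On a semi-log plot, log2(count) rises by exactly 1 per doubling period,
# so the exponential curve becomes a straight line.
for year in range(1971, 2012, 10):
    print(year, round(math.log2(transistors(year)), 2))
```

The constant slope of log2(count) against time is precisely the "straight line on a semi-log plot" described above.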
Transhumanists/singularitarians such as Ray Kurzweil, however, believe that in order for the singularity to occur (where available computing power outstrips all human brain power, and then some, and -- somehow -- a super-intelligence emerges from this), there will need to be so-called exascale computing technologies. And this is where the other interpretation of Moore's Law, which (incorrectly) states that computer processor performance doubles approximately every two years, will begin to fail.
The current list of the most powerful supercomputers is maintained at www.top500.org, and the K computer, the number one supercomputer in the world as of today, has hit 10 petaFLOPS of performance. It would seem that one need only scale up the K computer 100-fold to achieve an exascale machine.
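The raw arithmetic of that scale-up is simple. In the sketch below, the ~12.7 MW power draw is my approximate figure for the K computer, not from the text, and scaling power linearly with performance is a deliberately naive assumption:

```python
# How far is a 10-petaFLOPS machine from exascale, and what would
# naive linear scaling cost in power?
PETA, EXA = 10**15, 10**18

k_flops = 10 * PETA     # K computer: ~10 petaFLOPS
k_power_mw = 12.7       # approximate power draw in megawatts (my assumption)

scale = EXA / k_flops             # factor needed to reach 1 exaFLOPS
naive_power_mw = scale * k_power_mw

print(f"scale-up factor: {scale:.0f}x")
print(f"naive exascale power draw: {naive_power_mw:.0f} MW")
```

A naive 100-fold scale-up lands in gigawatt territory, roughly the output of a large power station, which is a hint at why simply building a hundred K computers is not the answer.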
However, an article recently published in Science points out why this isn't feasible.
The Science article draws on a paper commissioned in 2008 by DARPA, the US government research agency whose work ultimately led to the internet, to investigate what would be required to advance computer processor technologies a thousandfold, from the so-called petascale range into the exascale range. The report's conclusions surprised even the authors, many of whom had worked on developing petascale supercomputers from the 1,000-times less powerful terascale machines of the mid-to-late 1990s. The brief was to identify the technological advances required to build exascale machines by 2015; the authors categorically stated that it couldn't be done by then, and might take more than a decade to achieve, if it could be done at all. The 2008 paper is still representative of the state of the art in exascale challenges, which should itself be a clue to their immensity.
This has obvious implications for the singularity: if, for the first time, the growth of processor performance slows from its current exponential rate, which seems likely, then 2045 begins to seem extremely unlikely as the date by which global aggregate computing power matches global aggregate human brain power, and the singularity is not going to happen. Should we be surprised by this limit on exponential growth? Not really; it happens in nature all the time: populations grow exponentially only while resources are plentiful, then growth flattens into the familiar S-shaped logistic curve.
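The standard model for such self-limiting growth is the logistic curve. A sketch with illustrative parameters (the carrying capacity `k`, growth rate `r`, and starting value `p0` are all made up for the example):

```python
import math

def logistic(t, k=1.0, r=1.0, p0=0.01):
    """Closed-form logistic curve: exponential early, flat near capacity k."""
    return k / (1 + ((k - p0) / p0) * math.exp(-r * t))

# While the population is far below k, each unit of time multiplies it by
# nearly e**r; once it approaches k, growth all but stops.
early_ratio = logistic(2) / logistic(1)    # near-exponential phase
late_ratio = logistic(20) / logistic(19)   # saturated phase
print(round(early_ratio, 2), round(late_ratio, 6))
```

An observer measuring only the early phase would see a clean exponential and might be tempted to extrapolate it forever; the curve itself says otherwise.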
So what are we to conclude?
In other words, you're less likely to see contrasts like this one in the future:
(An Osborne Executive portable computer from 1982 and an iPhone released in 2007. The Executive weighs 100 times as much, is nearly 500 times as large by volume, cost approximately 10 times as much after adjusting for inflation, and has 1/100th the clock frequency of the iPhone.)
What It'll Take to Go Exascale, Robert F. Service, Science, 27 January 2012 (http://www.sciencemag.org/content/335/6067/394)
ExaScale Computing Study: Technology Challenges in Achieving Exascale Systems, Peter Kogge et al., DARPA, 2008 (http://www.cse.nd.edu/Reports/2008/TR-2008-13.pdf)