As humans, we can evolve into something else, or go extinct without spawning new species, but there is one thing we cannot do: create a technological singularity.
Human progress is not headed in the right direction
We probably do not need more powerful computers to build artificial intelligence that surpasses every AI researcher, and we probably do not need to simulate the brain, just as planes do not need to simulate how birds fly. What we need is to understand the fundamentals of intelligence, just as we once figured out the fundamentals of aerodynamics. But the incentives are not there: not much has changed at that fundamental level since Kasparov was defeated, or even earlier.
The human economy does not allow it
With Bitcoin consuming as much power as Denmark, it is clear that we can build powerful computers, but the incentives (again) are not in simulating neurons; they are in calculating hashes, a useless proof of work in the service of artificial scarcity. No amount of hashes will produce intelligence, and there are no incentives for the fundamental work that is required. In the short term, paperclips are more profitable than gods. Humans are a competitive species, not a collaborative one.
Furthermore, a private singularity (impossible anyway, per the previous paragraph) would be a disaster: no company can compete with a god. We can hope Google would “not be evil”, but indifference is not evil; would that help? The knowledge economy is currently held back by artificial scarcity to preserve the status quo. Society at large would never meet that god: only a few would be able to afford it, while the rest would be expected to die.
Progress is a series of logistic functions
Logistic functions look like exponentials until the channel saturates and diminishing returns set in. Jumping from one logistic curve to the next can look like a single exponential; when there is no next curve, Dark Ages happen. There may be logistic curves in the future that take us to the singularity, but they are not reachable through the human channel (see the previous points). We may soon face the greatest era of diminishing returns as the human channel saturates.
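The logistic-versus-exponential point can be sketched numerically. This is a minimal illustration with an arbitrary standard logistic curve (the parameters are my own choice, not from the argument above): early in the curve each step multiplies the value by roughly a constant factor, exactly like an exponential, while near saturation the same step barely moves it.

```python
import math

def logistic(t, L=1.0, k=1.0, t0=0.0):
    """Standard logistic curve: looks exponential at first, saturates at L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

# Early phase: each unit step multiplies the value by ~e^k,
# indistinguishable from exponential growth.
early_ratio = logistic(-9) / logistic(-10)

# Late phase: the channel is saturated, the same step is diminishing returns.
late_ratio = logistic(9) / logistic(8)

print(early_ratio)  # close to e (~2.718)
print(late_ratio)   # barely above 1
```

The crossover between these two regimes is the saturation of the channel; mistaking the early regime for a true exponential is the error the paragraph above describes.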
Challenges are always underestimated
From our present ignorance we can never properly estimate the proportions of future challenges. Our societies are becoming too complex, and, as with backpropagation, the learning signal fails when there are too many layers. Inefficiencies accumulate into a mess.
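The backpropagation analogy can be made concrete. A minimal sketch, assuming sigmoid-style layers (the function name and the 0.25 bound are illustrative choices, not from the original): the sigmoid's derivative is at most 0.25, so the gradient that reaches the early layers shrinks geometrically with depth and the feedback signal effectively vanishes.

```python
def max_gradient_after(layers, per_layer=0.25):
    """Upper bound on the gradient after backpropagating through `layers`
    sigmoid layers, each multiplying the signal by at most 0.25."""
    grad = 1.0
    for _ in range(layers):
        grad *= per_layer
    return grad

print(max_gradient_after(2))   # 0.0625: shallow stacks still learn
print(max_gradient_after(30))  # ~8.7e-19: the signal has vanished
```

In the essay's analogy, society is the deep network: corrective feedback from outcomes back to root causes decays through too many institutional layers.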
Reshaping culture, economy, and politics at such a fundamental level will require centuries; only then will the fundamental research leading to the singularity become possible.