Nikola Tesla & His Prediction.
- Rashad
- May 17
- 6 min read
“You will live to see man-made horrors beyond your comprehension” is a quote attributed to Nikola Tesla at the first Electrical Exhibition in September 1898, given in response to a question about militarizing his radio-controlled model rowboat invention.
There is little historical information detailing the motive and reasoning behind this statement, but a century later, we understand almost exactly what he meant.
But do we really?
I honestly think that the worst is still to come; let me explain why. In 1965, Gordon Moore, an American engineer and co-founder of Intel, made a remarkably accurate prediction: “The number of transistors per silicon chip will double every year.” Here’s what that statement actually means.
The law is 60 years old and still stands strong. More transistors and components mean more computing power, higher efficiency, and more complex functions, all at a diminishing cost. In other words, as time goes by, you can do more for less. For example, the first computers of the 1960s and 70s filled entire rooms and cost millions (equivalent to tens of millions today). Sixty years later, the iPhone in your pocket has more computing power than those early machines.
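To make that doubling concrete, here is a rough back-of-the-envelope sketch in Python. It assumes the commonly cited two-year doubling cadence and the Intel 4004’s roughly 2,300 transistors (1971) as a starting point; the numbers are illustrative, not exact figures from Moore or Intel.

```python
# Back-of-the-envelope illustration of Moore's law.
# Assumptions (illustrative only): the Intel 4004 (1971) had ~2,300 transistors,
# and transistor counts double roughly every two years.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count forward by compounding one doubling per period."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(2_300, 1971, year):,.0f} transistors")
# 1971: ~2,300
# 1991: ~2,355,200
# 2011: ~2,411,724,800
# 2021: ~77,175,193,600  (the same order of magnitude as the biggest chips of that era)
```

Doubling every two years over five decades multiplies the count by a factor of roughly 33 million, which is exactly why a pocket device can outclass a room-sized mainframe.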
If we return to the original trigger of this article and unravel the quote, we might assume the following: Tesla emphasized humanity’s moral obligation in creating or enhancing new technologies, predicting that society would struggle with the ethical implications of its advancements.
Do not get me wrong, the danger is not confined to the boundaries of warfare. He also hinted at societal issues, pollution, climate change, and other disruptions caused by industrialization.
Ultimately, Tesla's quote serves as a warning about the dual nature of technological progress, urging us to consider the potential consequences of our innovations.
In the following paragraphs, I will share my views on AI, warfare & other technological trends which, I believe, will reinforce Tesla’s & Moore’s predictions.
In a recent episode of the Modern Wisdom podcast, Chris Williamson asked Naval Ravikant the following question: “What do you think is currently ignored by the media but will be studied by historians?”
Naval made several interesting points throughout his 10-minute answer, but the statement I agree with most is this: “The future of all warfare is drones, there will be nothing out there on the battlefield. The end state of drones is autonomous self-directing bullets.” As someone whose career revolves around tech & who has an above-average understanding of technological trends, I simply cannot ignore how big a threat autonomous drones are to humanity.
In the last 5 years, humanity has seen 2 major wars. One of them was fought in my country: the 44-day war between Azerbaijan & Armenia. The second one is still ongoing - the war between Russia & Ukraine. Thousands upon thousands of drone videos have emerged from these two conflicts. The whole landscape of warfare has changed due to the existence of 1st-generation drones.

“As general drone use in conflict has risen, the number of drone strikes and fatalities has similarly increased. In 2023, there were over 3,000 deaths from drone strikes, or just under two per cent of all battle deaths. This represented a rise of 168 per cent since 2018. The number of drone strikes showed a far more dramatic increase, with 4,957 strikes recorded in 2023, up from just 421 in 2018.”
The data does not cover 2024 & 2025, but you get the idea: drone usage is growing exponentially.
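To put a number on that growth, here is a quick sketch of the compound annual growth rate implied by the strike counts quoted above (421 strikes in 2018, 4,957 in 2023). The two data points come from the quote; the calculation is mine.

```python
# Compound annual growth rate (CAGR) implied by the quoted strike counts:
# 421 drone strikes in 2018 vs. 4,957 in 2023.
strikes_2018, strikes_2023 = 421, 4_957
years = 2023 - 2018

cagr = (strikes_2023 / strikes_2018) ** (1 / years) - 1
print(f"Implied growth rate: ~{cagr:.0%} per year")  # roughly +64% per year, compounding
```

Anything compounding at that pace more than doubles every eighteen months, which is why “exponential” is not an exaggeration here.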
Have you seen the recent drone shows in China or Dubai? That is, quite literally, a preview of the future of drone warfare: thousands of little kamikaze drones flying around like a swarm & dispatching autonomous bullets. This projection is not lightyears or generations away. We will see disturbing scenes of military drone usage in our lifetime, in fact within the next 5 years. Moreover, the usage stats of the last 3 years will be nothing in comparison to future statistics.
The hottest topic in tech at the moment is AI. The insertion of AI into drone warfare is going to make those deadly gadgets even more lethal. The AI race between the USA & China has reached an all-time high.
Everyone who knows the term AI fears that AI will take all our jobs. That fear makes a lot of sense. However, it does not mean that we will all be unemployed in the years to come. AI will take a serious toll on certain jobs which can be automated easily; we can already see it happening within a few professions. Simple tasks will be completed by autonomous systems & AIs. Governments will have to find a somewhat logical way to compensate for the unemployment that AI creates.
As you know, the photo camera did not make painters and artists obsolete. In the same way, AI will not make us obsolete either. However, what troubles me is the abundance of time that will be created by autonomous machines doing our chores & tasks for us. What are we going to spend that time on? Entertainment? Further brain rot? Social media? Meaningless debates online? Further conflicts over energy sources? Excessive progressivism? Never-ending liberalist battles? Greta Thunberg?
“Even now, remotely piloted UAVs are using AI for autonomous takeoff, landing, and routine flight. All that’s left for human operators to do is concentrate on tactical decisions, such as selecting attack targets and executing attacks.
AI also allows these systems to operate rapidly, determining actions at speeds that are seldom possible if humans are part of the decision-making process. Until now, decision-making speed has been the most important aspect of warfare.”
What happens when the decision to strike is the sole responsibility of an AI? On the battlefield, would the machine be held responsible if the target was mistaken or if civilians were killed as a result?
The United States and Chinese militaries are testing the use of swarming drones: dozens of unmanned aircraft that can be sent in to overwhelm enemy targets, with the potential for mass killings.
Alvin Wilby, vice president of research at Thales, a French defense giant that supplies reconnaissance drones to the British Army, told the House of Lords Artificial Intelligence Committee that rogue states and terrorists “will get their hands on lethal artificial intelligence in the very near future.” Echoing the same sentiment is Noel Sharkey, emeritus professor of artificial intelligence and robotics at the University of Sheffield, who fears that “very bad copies” of such weapons would end up in the hands of terrorist groups.
There is only one small benefit to the introduction of AI into warfare - humans will be pushed out of life-and-death decisions. The consequence, however, is that the lethality of the weapons used will grow a hundredfold.
The educated projections I have outlined above are expected to materialize within the next 5 years. In a recent podcast, Ray Dalio said the following: “Next 5 years - It’s like going through a time warp. We’re going to be in a different world. And the disruptors will be disrupted.”
Nonetheless, even bigger changes will occur towards the year 2045. For that year, Ray Kurzweil makes one of his boldest (and possibly most harrowing) predictions, known as the Singularity. It asserts that nonbiological intelligence will surge past human intelligence at a rate so fast that unenhanced human intelligence will be unable to follow it. That point will mark the moment when artificial general intelligence becomes capable of recursive self-improvement, meaning it can progressively redesign itself to become autonomously more powerful and intelligent.
A couple of years back, I wrote a separate article on the technological singularity; you can find it here. I would therefore like to summarize my thoughts using the same paragraphs from that article.
A world full of technological wizards and gurus will give rise to a new generation with new moral codes. Technological advancement is a point of no return, meaning that the individuals of that particular age will have to bring about changes in civilization. The technologies and rules of later generations will be absolute and, to us, incomprehensible.
To summarize, the notion of a technological singularity carries both positive and negative implications. This article has focused on its troubling side, which, according to various points of view, would make that era a chaotic one. If the singularity ever enters the realm of possibility, the negative consequences for the world will outweigh the positive effects.
They say Moore’s law will hold its validity all the way up to the technological singularity. That means if Kurzweil’s prediction is valid and Moore’s law holds until then, Tesla was right. In my honest opinion, however, all of them were right - we just have to wait and see. Some theories need the test of time.
Although these predictions are not pleasant, it is up to humanity and those in positions of power to effect positive change. Otherwise, we are facing a future which is not so bright.
As a quote from my favorite TV show, Game of Thrones, goes: “The night is dark and full of terrors."
Yet, despite the daunting challenges and uncertainties that lie ahead, I have hope in humanity and our ability to adapt, innovate, and find meaning in every era. History has shown that even in the face of rapid change and emerging threats, people have always found ways to rise above adversity, create beauty, and build better communities. If we approach the future with curiosity, virtue, responsibility, and a genuine desire to uplift one another, there is every reason to believe that we can effect change. The night may be dark, but dawn always follows.
Pura Vida!
Rashad