Computing: The Technological Journey So Far



Computing in modern times is generally believed to be an attribute of "modern computers", but in truth computing predates our modern computers and is not done only by computers. Before we proceed further, let's clarify what computing actually is.

Computing, simply speaking, is a goal-oriented activity: an activity undertaken for the purpose of getting a "desired" result, and nowadays it is mostly mathematical in nature. For example, suppose you wish to move certain goods from point A to point B in the shortest possible time. All you need do is take into account the factors involved, like the weight and size of the goods, the speed and capacity of the means of transportation, the possible routes, the distance to the destination, etc. Luckily for us, most of these factors can be quantified, so mathematical approaches can be employed: you could model the scenario mathematically and use the necessary mathematical operations to arrive at a result (which could be done by hand or by computer). That result could then help you make better decisions towards achieving the goal, which in our case is moving the goods to the destination in the shortest possible time. The mathematical processes involved in arriving at such a decision are what we term "computing".
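To make the goods-transport example concrete, here is a minimal sketch in Python; the route names, distances and speeds below are made up purely for illustration:

```python
def travel_time(distance_km, speed_kmh, loading_hours):
    """Total time = loading/unloading overhead + driving time."""
    return loading_hours + distance_km / speed_kmh

# Hypothetical candidate routes from point A to point B.
routes = {
    "highway": travel_time(distance_km=120, speed_kmh=80, loading_hours=1.0),
    "back roads": travel_time(distance_km=90, speed_kmh=45, loading_hours=1.0),
}

# The "computing" step: compare the modelled times and pick the fastest.
best = min(routes, key=routes.get)
print(best, routes[best])  # the shortest-time route and its total hours
```

Once the relevant factors are quantified, the decision reduces to a simple comparison of numbers, which is exactly the kind of work a computer (or a person with pen and paper) can carry out.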

To get the best results from most activities performed daily by man, be it in commerce, the humanities, engineering or the sciences, computation is inevitable. Computing can be done by hand or with the use of a computer, and in human beings it can be classified into two kinds: conscious computing and subconscious computing. Conscious computing is mostly done intentionally with our brain and can therefore be controlled. Subconscious computing, on the other hand, is mostly unintentional; in fact, it is part of what keeps us alive. The brain, believed to be the seat of consciousness and control, is made up of cells called "neurons", and it has recently been discovered that a single neuron performs computations so complex that science has not yet fully unravelled them and engineering cannot completely mimic them. This is why the brain is still by far better than our current most advanced computers. The kind of computing done by these neurons is the subconscious kind, and it is partly responsible for our biological behaviours and natural activities.

The earliest form of computing was done by hand, thousands of years ago, when man started becoming civilized. Around c. 2700 - 2300 BC, the first known computing device, the Sumerian abacus, appeared in Mesopotamia; a much more sophisticated form of the abacus was developed by the Chinese around the 2nd century BC.


The Chinese were also among the first to invent computing devices that used gear mechanisms - a mechanism that would prove handy in the invention of future mechanical computers - but the first known mechanical analog computing device, the Antikythera mechanism, was built in Greece and dates back to c. 100 BC.

The first programmable mechanical analog computing devices appeared in the medieval Islamic world, invented by Muslim astronomers and engineers; Muslim scholars also made important advances in cryptography.

The computing devices of the medieval Islamic scientists and engineers would inspire Europeans in the Middle Ages to develop better ones, and this time around the devices were designed to carry out complex arithmetic computations - the devices invented previously had mostly been used for simple arithmetic, astronomical and navigational computations.

The Pascaline, a mechanical calculator, was invented by Blaise Pascal in the mid-17th century (when he was just 18 years of age) in an attempt to help his father, who was a tax collector; this computing device was designed basically to do addition and subtraction. The German polymath Gottfried Wilhelm Leibniz later set out to improve on the Pascaline with a machine that could perform direct multiplication in addition to the other operations (addition and subtraction), but his plan didn't work out as he expected. Instead, he invented a new kind of calculator called the "stepped reckoner", the first mechanical calculator that could perform all four basic arithmetic operations.


The 18th century and the beginning of the 19th century saw the invention and development of calculators that could do direct multiplication, inspired by the designs of Pascal and Leibniz.

In 1822, the first automatic calculator was conceived by Charles Babbage; it was called the difference engine and was intended to compute and print polynomial tables. In 1834, Babbage also conceived the analytical engine, a precursor to modern computers in terms of architecture, and it is for this reason that he is often regarded as the father of the computer. The first program that would be a precursor to modern programs was proposed by his close friend Ada Lovelace and was intended to run on the analytical engine. Babbage never physically built the analytical engine - he only made sketches of how it would look and operate - and it was only after his death that machines following his ideas began to exist, the first being one designed by his son Henry Babbage.
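The trick behind the difference engine, the "method of differences", is worth a quick sketch: once the initial value and forward differences of a polynomial are known, every further entry in its table can be produced using additions alone, with no multiplication. A minimal illustration in Python (the function names are mine, and the quadratic used is just an example):

```python
def difference_table_row(poly, x0, degree):
    """Initial value and forward differences of poly at x0 (step size 1)."""
    values = [poly(x0 + i) for i in range(degree + 1)]
    row = []
    while values:
        row.append(values[0])  # leading diagonal: f, Δf, Δ²f, ...
        values = [b - a for a, b in zip(values, values[1:])]
    return row

def tabulate(row, count):
    """Extend the table `count` steps forward using additions only."""
    row = list(row)
    out = []
    for _ in range(count):
        out.append(row[0])
        for i in range(len(row) - 1):
            row[i] += row[i + 1]  # each column absorbs the next difference
    return out

# Tabulate f(x) = x² + 1 for x = 0..4 by additions alone.
start = difference_table_row(lambda x: x * x + 1, 0, 2)
table = tabulate(start, 5)  # → [1, 2, 5, 10, 17]
```

Mechanical addition was something 19th-century gear wheels could do reliably, which is why this scheme made an automatic table-printing machine conceivable at all.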


The 20th century saw mind-boggling and explosive breakthroughs in computing technology, with World War II acting as a catalyst. Modern computers, especially electronic digital ones, first appeared in the 20th century and were designed not only to perform arithmetic computations as previous machines did, but also to perform a wide variety of computations, including very complex mathematical ones that would take humans and the mechanical calculators many years to carry out. It was during this period that computer science first emerged as its own discipline. The discovery of semiconductors and their applications in the 20th century made it possible for mobile devices and microcomputers to exist.

Today, very powerful computers exist that perform very complex computations to attend to some of our daily needs; we can find them in the entertainment world, hospitals, manufacturing industries, homes, offices, schools, research institutions, etc. However, research in computing technology is still ongoing (especially in the development of non-electronic computers such as optical computers, quantum computers and DNA computers) and will continue, both to attend to needs that still cannot be met by the available computers and to aid a better understanding of our universe and our existence.


It is here we conclude this article. Have a thoughtful day, and see you next time.

For further reading


History of computing

Computer science

How Computationally Complex Is a Single Neuron?

Thank you all once again for stopping by to read my jargon, and thank you @juecoree, @discovery-it and the @Steemstem team for your valuable support.


Lastly, please don't forget to do the needful if you enjoyed my jargon.


