Algorithms, past, present and future …


From an early age, primary school pupils learn that they can divide 14 by 4. They share 14 sweets between 4 plates by mechanically drawing or placing one sweet at a time on each of the four plates. They repeat this process until they have fewer sweets left than there are plates available. As they progress through their education, they become accustomed to the concept of decimal values before revisiting the concept of remainder, which in turn leads to the concept of modulus classes. Eventually, some of them will be able to express, and fully understand, that 14 mod 4 equals 2, and so does 18 mod 4.
For these school children, mathematical problems are solved by rigorously following step-by-step procedures, throughout primary and secondary school. Once the techniques are mastered, a solution to a given type of problem is always obtained. As they become more efficient at applying these methods, their brains find solutions almost automatically.
These methods illustrate quite well that at the heart of mathematics, and especially algebra, self-contained, step-by-step sets of operations are performed to carry out calculations. These effective methods are expressed as a finite list of well-defined operations applied to an initial state and an initial input. The input is transformed successively, carrying the initial state through well-defined successive states. Once the last operation is executed, the process terminates and the final state becomes the output: the solution to the problem. For our school children, the initial state is four empty plates and the initial input is 14 sweets. They successively execute the actions “pick a sweet”, “place it on a plate” and “stop when there are fewer sweets than plates”. The output will be the number of sweets inside and outside the plates; more precisely, 3 sweets on each plate and 2 left over.
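To make these steps concrete, here is a minimal sketch in Python (the choice of language and the function name share_sweets are the author's illustrative inventions, not part of the original lesson):

    def share_sweets(sweets, plates):
        # Initial state: empty plates; initial input: the sweets.
        counts = [0] * plates
        # Repeat full rounds of "pick a sweet, place it on a plate"
        # until fewer sweets remain than there are plates.
        while sweets >= plates:
            for plate in range(plates):
                counts[plate] += 1
                sweets -= 1
        # Final state: the plate counts and the remainder.
        return counts, sweets

    counts, left_over = share_sweets(14, 4)
    print(counts, left_over)  # [3, 3, 3, 3] 2 -- so 14 mod 4 == 2 == 18 mod 4

Python's % operator computes the same remainder directly: 14 % 4 and 18 % 4 both evaluate to 2.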
These types of methods are formally called “algorithms”, in honour of the scholar Abu Abdallah Muhammad ibn Musa al-Khwarizmi [add ref]. This mathematician and astronomer is considered to have established the foundations of algebra and trigonometry. Through time, his legacy has been enthusing, or “torturing”, pupils; and under the reign of George IV in England, algorithms were integrated into computing machinery. Charles Babbage built a mechanical calculator, generally considered the first computer, capable of computing many of the tables of numbers used by engineers, scientists, and navigators. Based on an idea published in 1786 by J. H. Müller, the Difference Engine could tabulate polynomial functions to approximate logarithmic and trigonometric tables, for example. The machine was originally conceived to overcome the difficulty of producing error-free tables: when computed by a team of mathematicians, these tables were prone to errors.
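As an aside, the principle behind the Difference Engine fits in a few lines: once the leading differences of a polynomial are known, every further table entry is produced by additions alone, exactly the operation a mechanical machine performs well. The sketch below is the author's illustrative reconstruction of the method of finite differences in Python, not Babbage's actual design, and the function name tabulate is invented:

    def tabulate(coefficients, count):
        # coefficients [a0, a1, a2, ...] represent a0 + a1*x + a2*x**2 + ...
        degree = len(coefficients) - 1
        # Seed values p(0), p(1), ..., p(degree), computed directly once.
        seed = [sum(c * x ** k for k, c in enumerate(coefficients))
                for x in range(degree + 1)]
        # Leading forward differences: p(0), then the 1st, 2nd, ... differences.
        diffs, row = [], seed
        while row:
            diffs.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]
        table = []
        for _ in range(count):
            table.append(diffs[0])
            # One machine cycle: cascade the differences by pure addition.
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return table

    # Tabulate x**2 + x + 41, a classic demonstration polynomial.
    print(tabulate([41, 1, 1], 5))  # [41, 43, 47, 53, 61]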

In two hundred years, computers have developed from tabulating numerical tables to predicting the weather 45 days in advance. It would be far too ambitious to describe this fascinating development in one article, though. It is feasible, however, to have a glimpse of how mathematics has contributed to the development of computer science. Just as kings and queens of England made history, computer science has been developing continually. In the mind of a computer scientist, an algorithm uses precise instructions, preferably in a language understood by the computer. These algorithms embrace many mathematical concepts, such as algebra and logic. More importantly, the data they process can be structured not only as simple boolean values (0 or 1), but also as vectors, matrices, lists, trees, hash-tables, records, and object-oriented classes. As in mathematics, these lists of instructions always return a result, except that the concept of output has been extended with the idea of returning “nothing”, often referred to as “void”. With this newly added value, modularity can break complex computer programmes into routines, subroutines, procedures, and methods. Code can then be reused and development time reduced dramatically.
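A rough illustration of “void” and modularity, with routine names invented for the purpose; in Python, a routine that returns nothing implicitly returns None, that language's counterpart of void:

    def mean(values):
        # A reusable subroutine that computes and returns a value.
        return sum(values) / len(values)

    def print_report(label, values):
        # A "void" procedure: it performs an action and returns nothing.
        print(label, "mean =", mean(values))

    print_report("sweets per plate", [3, 3, 3, 3])
    print_report("exam marks", [12, 15, 9, 18])  # mean() is reused, not rewritten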
The author finds mathematics and programming languages to be beautiful languages that are always refining themselves to represent the world surrounding us; as a result, we can make sense of it by simulating it. Enthusiastic mathematicians and computer scientists may be fascinated to witness that algorithms are still helping humanity apply mathematics to solve problems. Have you considered that without mathematics and algorithms, http://www.accuweather.com would not be able to predict the weather as far as 45 days ahead? This application of chaos theory may not compute predictions as accurate as its users would like, but it is nonetheless a great achievement.
In the future, it is imaginable that programming languages could become more specialised to a problem domain. Consequently, programmers could develop algorithms that solve problems related to one particular area more efficiently. Some readers may argue that such tools have been available for a while now. It is undeniable that the C programming language is well suited to developing applications and operating systems. Let's not forget that Java, C# and Visual Basic are valuable programming languages for developing desktop applications, with their effective graphical user interface libraries. PHP and ASP.Net have been very successful for developing web applications. Python has been adopted by a large part of the research community and has also helped integrate systems in industry. All these applications of programming languages depend more on the platform than on the problem domain. The same sorting algorithm applies similar instructions to solve the same problem, whether written in C, Java or PHP, as the sketch below illustrates. Would it be more efficient to specialise a programming language just for the financial sector? Would it be interesting to develop a programming language that provides primitives and data structures to model biological problems? Would music technology benefit from a music-related programming language?
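Consider insertion sort, sketched here in Python as one example; the same loop structure would carry over almost line for line into C, Java or PHP, which is exactly the platform-independence described above:

    def insertion_sort(items):
        # The same instructions in any general-purpose language;
        # only the surface syntax changes.
        for i in range(1, len(items)):
            current = items[i]
            j = i - 1
            # Shift larger elements right to open a slot for `current`.
            while j >= 0 and items[j] > current:
                items[j + 1] = items[j]
                j -= 1
            items[j + 1] = current
        return items

    print(insertion_sort([14, 4, 18, 2]))  # [2, 4, 14, 18]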

These are some of the questions that could move computing and algorithms forward. Problem-specific primitives and data structures could lead humans and machines to learn unusual orders of operations that have not been thought of yet. Yet the author is aware that not everybody enjoys mathematics, algorithms, and programming languages. It may also be foolish to believe they can represent every aspect of the world surrounding us. She is aware, too, that children are independent beings with a lot of curiosity and enthusiasm. More importantly, they adopt a trial-and-error approach to life that, for now, innovates much better than any computer. Still, this type of method has also emerged in computing and will be discussed in the next article.

