Monday, February 11, 2013

A Brief Overview of Programming Languages and Symbolic Logic (Part I)


What kind of spirit is it, that can support you to keep your interest on C++ Standardization for 20+ years?
Nothing else I could spend my time on could do so much good to so many people.
from an interview with Bjarne Stroustrup (2012), author of The C++ Programming Language

Some would say that the best way to learn a programming language is simply to install your language of choice and start writing code from the manual, tutorials, or help system specific to that language. However, having studied more than a few programming languages in my lifetime, I am going to emphasize a different, deeper approach: one grounded in background knowledge.

The ability to use digital logic circuits in computation has been with us for only a brief period of history. The rise in their use has paralleled the rapid spread of urbanization and the growth of population, universities, international trade, and market economies. Today, more than sixty years after ENIAC was built for the United States Army's Ballistic Research Laboratory, computational logic is the substructure of all economic, engineering, defense, scientific, and mathematical effort. Each year our lives are organized more around, and improved more by, advances in computer hardware, computational theory, information theory, networking, and user interface design. In those scant sixty years, at least 2,500 computer languages have been developed. A history of computer languages reads like a roster of some of the brightest minds of the post-World War II era. A brief but important list of popular programming languages can be found here.


For some time now, the disciplines of digital intelligence have been growing more specialized and more segregated. An electrical engineering degree is now something quite different from a computer science degree. Most of us who write and learn computer software today concern ourselves little with how computer hardware processes logic. But there was a time when the two disciplines were indivisible. Some scholars still believe that learning machine architecture (sometimes called "machine organization and architecture," or MOA) is essential to understanding the binary (or Boolean) logic (1, 2) that is the foundation for all digital computation and programming languages. Despite all such discussion, there is still no definitive path to becoming a competent software engineer. The conventional wisdom generally involves obtaining a computer science degree.
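
Consider, as a minimal sketch (written here in Python, which is my own choice of illustration and not part of the original post), the same Boolean operations that logic gates compute in hardware:

    # The three primitive Boolean operations behind every digital circuit.
    def AND(a, b): return a and b
    def OR(a, b):  return a or b
    def NOT(a):    return not a

    # Print a truth table: every combination of two binary inputs.
    for a in (False, True):
        for b in (False, True):
            print(a, b, AND(a, b), OR(a, b), NOT(a))

Every instruction a computer executes ultimately reduces to combinations of operations like these, realized in silicon as transistor gates.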

That being said, the Wikipedia list of college dropout billionaires includes Bill Gates (Microsoft founder), Steve Jobs (Apple founder), Mark Zuckerberg (Facebook founder), and Larry Ellison (Oracle founder). Many of us who studied other disciplines in college simply fell into computer administration and software engineering because we found we enjoyed it or had a knack for it. Even so, pursuing an electrical engineering or computer science degree would still be the recommended first step on the way to being hired by one of the companies founded by the men listed above.

And perhaps the best step in understanding software engineering is to gain an understanding of math, logic, and more specifically mathematical (or symbolic) logic. Many of you in this class (5th-10th graders at SPA) learn a math curriculum that prepares you to understand many of the principles from which computational logic and computer science are derived. Chief in importance among these principles is the simple yet all-encompassing idea that the rules of quantitative logic can be represented in symbols just as they can in natural language. Our survival on this Earth as a species over the last few thousand years owes much to this idea. Let us briefly examine the history of mathematical (or symbolic) logic.
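
To make the idea concrete (a hypothetical sketch, again in Python, of my own devising rather than anything from the post): the natural-language claim "if it rains, then the ground is wet" can be written symbolically as R -> W, and once it is symbolic, a machine can evaluate it mechanically:

    # Material implication: "R implies W" is false only in the one case
    # where R is true and W is false (the classical textbook definition).
    def implies(p, q):
        return (not p) or q

    # Evaluate the symbolic statement over every assignment of truth values.
    for rains in (False, True):
        for wet in (False, True):
            print("R =", rains, "| W =", wet, "| R -> W =", implies(rains, wet))

Translating statements out of natural language and into symbols in exactly this way is what makes them computable, and it is the thread we will follow through the history of symbolic logic.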

(To be continued)
