A programming language

Programming language theory (commonly known as PLT) is the branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It is a multi-disciplinary field, both depending on and, in some cases, affecting mathematics, software engineering, linguistics, and even the cognitive sciences. As of 2009 it remains an active research area, with results published in numerous journals dedicated to PLT as well as in general computer science and engineering publications.
A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication.
Many programming languages have some form of written specification of their syntax and semantics, since computers require precisely defined instructions. Some are defined by a specification document (for example, an ISO Standard), while others have a dominant implementation (such as Perl).
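As a rough illustration (not drawn from any particular language standard), syntax can be thought of as the legal structure of programs and semantics as the meaning assigned to each structure. The minimal Python sketch below uses hypothetical names to represent arithmetic expressions as a small abstract syntax tree and gives them meaning through an evaluation function:

```python
# Illustrative sketch only: "syntax" as a data structure describing
# well-formed expressions, "semantics" as a function giving each one a value.
from dataclasses import dataclass
from typing import Union

@dataclass
class Num:
    value: int

@dataclass
class Add:
    left: "Expr"
    right: "Expr"

Expr = Union[Num, Add]

def evaluate(e: Expr) -> int:
    """The semantics: every well-formed expression has exactly one value."""
    if isinstance(e, Num):
        return e.value
    return evaluate(e.left) + evaluate(e.right)

# (2 + (3 + 4)) evaluates to 9
print(evaluate(Add(Num(2), Add(Num(3), Num(4)))))
```

A written specification pins down both halves: which trees (or strings) are legal programs, and what behavior each legal program denotes.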
The first programming languages predate the modern computer. The nineteenth century had “programmable” looms and player piano scrolls which implemented what are today recognized as examples of domain-specific programming languages. By the beginning of the twentieth century, punch cards encoded data and directed mechanical processing. In the 1930s and 1940s, the formalisms of Alonzo Church’s lambda calculus and Alan Turing’s Turing machines provided mathematical abstractions for expressing algorithms; the lambda calculus remains influential in language design.
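To give a flavor of how the lambda calculus expresses computation using nothing but functions, here is a small Python sketch of the standard Church-numeral encoding; the names zero, succ, add, and to_int are illustrative choices, not part of any source material:

```python
# Church numerals: natural numbers encoded purely as functions,
# in the spirit of Church's lambda calculus.
zero = lambda f: lambda x: x                          # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))        # one more application of f
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral to an ordinary Python int."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(add(three)(three)))  # prints 6
```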
A programming language provides a structured mechanism for defining pieces of data, and the operations or transformations that may be carried out automatically on that data. A programmer uses the abstractions presented in the language to represent the concepts involved in a computation. These concepts are represented as a collection of the simplest elements available (called primitives).
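As a concrete, purely illustrative sketch, the Python snippet below builds a piece of data out of primitive elements (numbers grouped into a record) and defines an operation that transforms it automatically; the names Point and translate are assumptions made for the example, not anything mandated by a particular language:

```python
# A piece of data assembled from primitives, plus an operation on it.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def translate(p: Point, dx: float, dy: float) -> Point:
    """An operation that transforms the data into new data."""
    return Point(p.x + dx, p.y + dy)

print(translate(Point(1.0, 2.0), dx=3.0, dy=-1.0))  # Point(x=4.0, y=1.0)
```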
Programming languages differ from most other forms of human expression in that they require a greater degree of precision and completeness. When using a natural language to communicate with other people, human authors and speakers can be ambiguous and make small errors, and still expect their intent to be understood. However, figuratively speaking, computers “do exactly what they are told to do”, and cannot “understand” what code the programmer intended to write. The combination of the language definition, a program, and the program’s inputs must fully specify the external behavior that occurs when the program is executed, within the domain of control of that program.
Programs for a computer might be executed in a batch process without human interaction, or a user might type commands in an interactive session of an interpreter. In this case the “commands” are simply programs whose execution is chained together. When a language is used to give commands to a software application (such as a shell), it is called a scripting language.
Many languages have been designed from scratch, altered to meet new needs, or combined with other languages, and many have eventually fallen into disuse. Although there have been attempts to design one “universal” computer language that serves all purposes, none has been generally accepted as filling this role.

