
Programming Language

Reasons to study concepts of Programming Languages
    Increased capacity to express programming concepts.
    Improved background for choosing an appropriate language.
    Increased ability to learn new languages.
    Understanding the significance of implementation.
    Increased ability to design new languages.
    Overall advancement of computing. 
    Programming Domains:  Scientific applications, Business applications, AI, Systems Programming, Scripting Languages, Special Purpose Languages. 
LISP:  Functional Programming
DIAGRAM 1.2, 1.3
C++ --> Translator --> Machine code
Assembly Language --> Assembler --> Binary 
Interpreted Languages:  LISP, Prolog, BASIC
Compiled Languages:  C, C++, Fortran, COBOL, Pascal
Hybrid Languages:  Java
The compiler makes sense of ASCII characters via Lexical Analysis. It throws away what later phases don't need (comments and the like), uses whitespace as a delimiter, and groups the significant characters into tokens (a token is a string which has significance).
Human language is context sensitive. Computer languages mostly aren't: if the machine reads "A+B=C", its meaning isn't dependent on what comes before or after.
FIGURE 1.5 PAGE 31
Lexical Analyzer:  Source Program --> Lexical Analyzer --> Tokens
Lexeme:  the string value of a token, e.g. "A", "Total", "Sum". The number 100 is stored as a literal.
Syntax Analyzer / Parser:  interacts with the Lexical Analyzer one token at a time.
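A minimal sketch of this lexer/parser hand-off in Java (the Token and Lexer names here are illustrative, not from the textbook): whitespace is skipped, // comments are thrown away, and the significant characters are grouped into tokens that a parser could pull one at a time.

class Token {
    enum Kind { IDENT, NUMBER, OP, EOF }
    final Kind kind;
    final String lexeme;          // the string value of the token, e.g. "Total" or "100"
    Token(Kind kind, String lexeme) { this.kind = kind; this.lexeme = lexeme; }
    public String toString() { return kind + "(" + lexeme + ")"; }
}

class Lexer {
    private final String src;
    private int pos = 0;
    Lexer(String src) { this.src = src; }

    // The parser calls this one token at a time.
    Token nextToken() {
        while (pos < src.length()) {
            char c = src.charAt(pos);
            if (Character.isWhitespace(c)) { pos++; continue; }              // whitespace is a delimiter
            if (c == '/' && pos + 1 < src.length() && src.charAt(pos + 1) == '/') {
                while (pos < src.length() && src.charAt(pos) != '\n') pos++; // throw away comments
                continue;
            }
            if (Character.isLetter(c)) return new Token(Token.Kind.IDENT, read(Character::isLetterOrDigit));
            if (Character.isDigit(c))  return new Token(Token.Kind.NUMBER, read(Character::isDigit));
            pos++;
            return new Token(Token.Kind.OP, String.valueOf(c));
        }
        return new Token(Token.Kind.EOF, "");
    }

    private String read(java.util.function.IntPredicate keep) {
        int start = pos;
        while (pos < src.length() && keep.test(src.charAt(pos))) pos++;
        return src.substring(start, pos);
    }
}

public class LexDemo {
    public static void main(String[] args) {
        Lexer lx = new Lexer("Total = Sum + 100  // add them");
        for (Token t = lx.nextToken(); t.kind != Token.Kind.EOF; t = lx.nextToken())
            System.out.println(t);   // IDENT(Total) OP(=) IDENT(Sum) OP(+) NUMBER(100)
    }
}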
Interpreters for hybrid languages (e.g. the JVM) are platform specific; the Java .class files are the same on every platform.
** Know the difference between pure interpretation (Lisp), compilation (C++), and hybrid implementation (*.java --> *.class).**
Binding:  Assigning a value to an attribute. Binding time is when an attribute gets a value. There is early binding and late binding.
Times:  Language design time, language implementation time, compile time, run time.


In compiled languages, of the variable attributes (name, type, address, range of legal values), the name and type are bound early, at compile time.
In pure interpretation, binding time is late - run time. Bindings can change while the program is running via assignment operators.
A pure interpreter has the memory of a goldfish: every time it sees a line, it sees it for the first time.
Von Neumann machine:  data and instructions are stored in the same memory. It is a load/store machine.
The Von Neumann bottleneck is the bus between the CPU and memory.
Imperative:  statements - do this, now do this, now do this... C, COBOL, Pascal, Fortran.
Functional:  AI, everything is a function. LISP, ML, Scheme.
Logic declarative:  AI, statements of fact, rules of application of rules. Prolog.
Object oriented:  an application is the interaction of objects.  C++, Smalltalk, Java
Object Oriented Programs
Abstraction:  reducing something to its essential attributes and hiding the rest of the detail.
Process Abstraction:  subroutines in a program.
Interface:  how we use an abstracted entity. If you know how to use the interface, you can use the entity.
Data Abstraction:  int sum;  // all you need to know is the attributes of an int to use sum.
Data Structures:  like an employee record with different parts.
Abstract Data Types:  a programmer defined data structure, with functions that manipulate the structure. Implementation is hidden from user. (Encapsulation) Why hide the information? Because it doesn't do the user any good to see it.
Functions are really methods, invoked by sending messages to objects.
An object is a virtual computer. A class is like a template. An object is an instance of a class.
OOP ADT
user (public) --> interface --> implementation (private)
JAVA PROGRAM ON PAGE 449
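A minimal Java sketch of the ADT idea (an illustrative class, not the textbook's page-449 program): the representation is private, and clients work only through the public interface.

// Minimal ADT sketch: the representation is hidden (private);
// clients see only the interface (public methods).
public class IntStack {
    private int[] items = new int[16];   // implementation detail, hidden from clients
    private int top = 0;                 // number of elements currently stored

    public void push(int value) {
        if (top == items.length)         // grow transparently; clients never know
            items = java.util.Arrays.copyOf(items, items.length * 2);
        items[top++] = value;
    }

    public int pop() {
        if (top == 0) throw new IllegalStateException("stack is empty");
        return items[--top];
    }

    public boolean isEmpty() { return top == 0; }
}

A client that only calls push, pop, and isEmpty would not change if the array were swapped for a linked list; that is the point of hiding the implementation.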
The system stack and the heap typically grow toward each other from opposite ends of the address space: commonly the stack grows downward from high addresses ("top to bottom") while the heap grows upward ("bottom to top").
OO Code is usually shorter because of inheritance.
Polymorphism:  "many forms/types." In C++, templates are one way to get (compile-time) polymorphism.
Inheritance:  one class definition is a parent (base) for a child (derived) class definition. Derived classes inherit all the data and functions of a base class. Promotes code re-usability.
In Java, the keyword "extends" is used to declare a derived (inherited) class.
JAVA EXTENSION ON PAGE 496
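A minimal sketch of extends and dynamic binding (hypothetical classes, not the page-496 example): the derived class inherits the base class's data and methods, overrides one of them, and the call is bound at run time.

// Hypothetical base/derived classes illustrating extends, inheritance, and dynamic binding.
class Employee {
    protected String name;                       // inherited by derived classes
    Employee(String name) { this.name = name; }
    double weeklyPay() { return 0.0; }           // overridden below
}

class SalariedEmployee extends Employee {        // "extends" declares the derived (child) class
    private double salary;
    SalariedEmployee(String name, double salary) { super(name); this.salary = salary; }
    @Override double weeklyPay() { return salary / 52.0; }
}

public class PayrollDemo {
    public static void main(String[] args) {
        Employee e = new SalariedEmployee("Ada", 52000.0);  // base-class reference, derived object
        System.out.println(e.weeklyPay());                  // 1000.0 -- binding resolved at run time
    }
}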
Static:  at compile time. In static memory allocations, the amount of memory used doesn't change.
Dynamic:  at run time. In dynamic memory, you could use a total of more memory during the run of the program than is available. Dynamic memory is used out of the stack and heap. Overhead for dynamic memory is insignificant compared to benefit.
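A small illustrative sketch, in Java terms, of the lifetimes just described; the names are made up for the example.

// Illustrative only: static, stack-dynamic, and heap-dynamic storage in Java terms.
public class StorageDemo {
    static int callCount = 0;                 // static storage: one copy for the entire run

    static int[] makeBuffer(int n) {
        callCount++;
        int local = n * 2;                    // stack-dynamic: allocated when the call begins, freed on return
        return new int[local];                // heap-dynamic: allocated by "new" at run time, reclaimed by GC
    }

    public static void main(String[] args) {
        int[] a = makeBuffer(4);
        int[] b = makeBuffer(8);
        System.out.println(a.length + " " + b.length + " calls=" + callCount);  // 8 16 calls=2
    }
}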
Syntax:  grammatical rules (like a CFG). You can tell whether syntax is correct without knowing what any of the statements mean. Parsing deals with syntax.
Semantics:  meaning.
Human language is context sensitive. Computer languages are context free.
PAGE 106, 107
Lexeme: The string value of a token.
Recognition:  If you know the rules of a grammar, you can recognize if a statement is valid. 
Generation:  You can also generate all valid statements.
G = {terminals, non-terminals, production rules, start symbol}
Terminals:  the leaves of the parse tree, like "stack_1"
Non-terminals:  categories of terminals, like "variable identifier".
Meta-language:  a language used to describe other languages.
Production Rules:  which terminals or non-terminals can replace which non-terminals.
Compilers recognize: they work from the written code back to the start symbol.


Unrestricted ( Context Sensitive ( CFG ( Regular Languages ) ) )
Type 0:  unrestricted, recognized by a Turing machine (TM)
Context Sensitive:  recognized by a linear bounded automaton
Context Free:  recognized by a pushdown automaton. Programming languages are usually described by CFGs
Regular:  recognized by a finite state machine (no memory)
PAGE 112, EXAMPLE 3.1
X --> aX | a   Right recursive (the nonterminal is at the right end). This can be used to indicate right associativity, such as for the exponentiation operator.
X --> Xa | a   Left recursive (avoid this in recursive-descent parsing, or the parser loops forever). This can be used to indicate left associativity, such as for addition and subtraction.
Context Free:  A --> α; the left-hand side is always a single nonterminal.
Context Sensitive:  α --> β, where α and β are strings of terminals and non-terminals. The restriction is that the number of symbols in α cannot exceed the number of symbols in β. (The "context" is that something which happened earlier can dictate what happens later, because terminals can appear on the left-hand side.)
BNF:  Backus-Naur Form - a meta-language
In C++, local variables are stack-dynamic: their storage is bound at run time, when their declarations are elaborated.
Undeclared variables are context sensitive (dependent on what happened before). Attribute grammars take care of this; it is an example of how grammars must be enhanced. Type checking is another example.
PAGE 113, 114, 115
Higher precedence operators are found lower in the parse tree.
Associativity (among operators with the same precedence):  grammars usually specify left-to-right associativity for addition and subtraction. Exponentiation is right-to-left.
PAGE 120 IF...ELSE
BNF vs EBNF
{ ... }   What is in braces can be repeated zero or more times (EBNF shorthand for a recursive BNF rule).
[ ... ]   What is in brackets is optional; you can have one occurrence or none.

Dynamic Error:  something that happens at run time.
    Example:  cin >> x; y = y/x;  // x is entered as 0 
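    The same kind of dynamic error sketched in Java (illustrative): the program compiles cleanly, but a zero entered at run time produces an error no compiler could have caught.

import java.util.Scanner;

// Illustrative dynamic (run-time) error: syntactically and statically valid,
// but fails if the user enters 0.
public class DivideDemo {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        int x = in.nextInt();
        int y = 100 / x;          // throws ArithmeticException at run time when x == 0
        System.out.println(y);
    }
}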
    Attribute Grammar:  a grammar that has enhancements; a way to do formally what was previously done informally. A BNF grammar is about syntax, not semantics. Want to catch type errors, such as incompatible types in an assignment? Those must be caught at compile time, and attribute grammars help catch them. 
    Static Semantics:  Static almost always means compile time. You have to have a symbol table, a table of attributes, to store symbols. When it encounters something like int x = 10 it says "Is there an "x"? No? Put an "x" in the table and assign 10 to it." 
    Symbol Tables can be local, looking in the local module for a symbol first, then looking in the enclosing module. Environment = Symbol Table. 
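    A minimal sketch of such a nested symbol table (the class and method names are hypothetical): look up a name in the local scope first, then fall back to the enclosing environment.

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of nested environments: each scope has its own table
// and a link to the enclosing scope.
class SymbolTable {
    private final Map<String, String> types = new HashMap<>();  // name -> declared type
    private final SymbolTable enclosing;                        // null for the global scope

    SymbolTable(SymbolTable enclosing) { this.enclosing = enclosing; }

    void declare(String name, String type) { types.put(name, type); }

    // Look locally first, then in the enclosing environment.
    String lookup(String name) {
        if (types.containsKey(name)) return types.get(name);
        if (enclosing != null) return enclosing.lookup(name);
        return null;                                            // undeclared variable
    }
}

public class ScopeDemo {
    public static void main(String[] args) {
        SymbolTable global = new SymbolTable(null);
        global.declare("x", "int");
        SymbolTable local = new SymbolTable(global);
        local.declare("y", "double");
        System.out.println(local.lookup("y"));  // double  (found locally)
        System.out.println(local.lookup("x"));  // int     (found in the enclosing scope)
        System.out.println(local.lookup("z"));  // null    (undeclared -> static-semantic error)
    }
}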
    ATTRIBUTE TYPES:  Actual Types and Expected Types. 
    Actual Type:  variables and expressions have types associated with them. A synthesized attribute that is passed up the tree from the leaves to the root. The actual type of a variable comes from the look-up (symbol) table. 
    Expected Type:  an inherited attribute. It moves up from the variable on the left-hand side of the assignment to the root, and then down from the root to the leaves on the right-hand side of the assignment. The expected type of an expression comes from actual types. 
    Expected Type comes from the LHS of an assignment.
    Actual Type comes from the RHS of an assignment. 
    PAGE 126 FIGURE 3.6
    PAGE 127 FIGURE 3.7 
    Expression Tree: Operands are the leaves. Operators are interior nodes. Traversing (evaluating) the expression tree evaluates the expression.
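    A minimal expression-tree sketch in Java (hypothetical classes): operands at the leaves, operators at the interior nodes, with values flowing from the leaves to the root the way a synthesized attribute does.

// Hypothetical expression-tree sketch: leaves are operands, interior nodes are operators.
interface Expr { int eval(); }

class Num implements Expr {                 // leaf (operand)
    private final int value;
    Num(int value) { this.value = value; }
    public int eval() { return value; }
}

class Add implements Expr {                 // interior node (operator)
    private final Expr left, right;
    Add(Expr left, Expr right) { this.left = left; this.right = right; }
    public int eval() { return left.eval() + right.eval(); }  // values flow upward, leaves to root
}

public class ExprDemo {
    public static void main(String[] args) {
        Expr tree = new Add(new Num(2), new Add(new Num(3), new Num(4)));  // 2 + (3 + 4)
        System.out.println(tree.eval());   // 9
    }
}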

    VOCABULARY 
    Orthogonality:  A relatively small set of primitive constructs can be combined in a relatively small number of ways to build the control and data structures of a programming language. Furthermore, every possible combination of primitives is legal and meaningful. 
    Imperative Languages:  Languages that give lists of commands. Because of the von Neumann architecture, the central features of imperative languages are variables, which model the memory cells; assignment statements, which are based on the piping operation; and the iterative form of repetition, which is the most efficient way to implement repetition on this architecture. Examples:  C, COBOL, Pascal, FORTRAN 
    Functional:  Languages used mostly for A.I. where everything is a function. Examples:  LISP, ML, Scheme. 
    Logic Declarative:  Also used for A.I. Statements of facts are made as well as rules of application of rules (statements). Example:  Prolog. 
    Object Oriented:  Makes use of data abstraction, which encapsulates processing with data objects, together with inheritance and dynamic binding. Examples:  C++, Smalltalk, Java 
    Parse Tree:  The parse tree represents the syntactic structure of the program. Usually there is no tree created, only the information that would be required to build a tree is generated and used. 
    Load Module or Executable Image:  The user and system code together. 
    Linking & Loading:  The process of collecting system programs and linking them to user programs, accomplished by a linker. 
    Fetch-execute cycle:  The execution cycle on a von Neumann architecture computer where instructions are moved from memory to the CPU and executed. 
    Von Neumann bottleneck:  The bus between the CPU and main memory. 
    Compiler Implementation:  Programs are translated to machine language which can be executed directly on the physical computer. This makes for very fast execution once the compilation is complete. Examples:  C, C++, FORTRAN, COBOL, Pascal, Ada 
    Pure Interpretation:  A process where there is no translation of the program whatsoever. An interpreter program acts as a software simulation of a machine whose fetch-execute cycle deals with high-level language program statements rather than machine instructions. This software simulation provides a virtual machine for the language. Examples:  Lisp, Prolog, BASIC, shell scripts, *.BAT files, JavaScript 
    Hybrid Implementation Systems:  A compromise between compilers and pure interpreters; they translate high-level language programs to an intermediate language designed to allow easy interpretation. This method is faster than pure interpretation because the source language statements are decoded only once, but it is slower than compiled code. The virtual machine that is provided is for the intermediate language. Examples:  Java, Perl

    Syntax:  The form of a programming language’s expressions, statements and program units. 
    Semantics:  The meaning of a programming language’s expressions, statements and program units. 
    Sentences:  The strings of a language are called sentences or statements. The syntax rules of a language specify which strings of characters from the language’s alphabet are in the language. 
    Lexeme:  The lowest-level syntactic units of a programming language. They include the (string values of the) language’s identifiers, literals, operators, and special words. The description of lexemes can be given by a lexical specification, which can be separate from the syntactic description of the language. 
    Token:  A category of a language’s lexemes, such as identifier or int_literal. 
    Backus-Naur Form (BNF):  A formal notation for specifying programming language syntax, closely associated with the specification of Chomsky’s generative device for context free languages. 
    Production Rule:  A BNF definition of an abstraction, such as <assign> --> <var> = <expression>. 
    Nonterminals:  Abstractions, like <assign>, in a BNF description. 
    Terminals:  The lexemes and tokens of the rules of a BNF grammar. 
    Grammar:  A collection of rules. 
    Start Symbol:  A special nonterminal symbol of a grammar from which all the sentences of the grammar are derived. 
    Derivation:  A sentence generation. 
    Sentential Form:  Each of the strings in a derivation of the language, including the start symbol. 
    Leftmost Derivation:  A derivation where the replaced nonterminal is always the leftmost nonterminal in the previous sentential form. The derivation continues until the sentential form contains no nonterminals. That sentential form, consisting of only terminals, or lexemes, is the generated sentence. 
    Graph:  A collection of nodes, some of which are connected by lines, called edges. A directed graph is one in which the edges are directional. A parse tree is a restricted form of a directed graph. 
    Syntax Graph:  A directed graph which represents the information in BNF and EBNF rules. 
    Attribute Grammar:  A device used to describe more of the structure of a programming language than is possible with a CFG. It is an extension to a CFG which allows certain language rules to be described, such as type compatibility.

    Static Semantics:  Only indirectly related to the meaning of programs during execution. Rather, it has to do with the legal forms of programs (i.e. syntax rather than semantics). In many cases the static semantic rules of a language state its type constraints. Static semantics is so named because the analysis required to check these specifications can be done at compile time. 
    Attributes:  Associated with grammar symbols, they are similar to variables in the sense that they can have values assigned to them. 
    Attribute Computation Functions:  Sometimes called semantic functions, are associated with grammar rules. They are used to specify how attribute values are computed. 
    Predicate Functions:  Associated with grammar rules, they state some of the syntax and static semantic rules of the language. 
    Fully Attributed:  If all the attribute values in a parse tree have been computed, the tree is said to be fully attributed. 
    Synthesized Attributes:  Used to pass semantic information up a parse tree. 
    Inherited Attributes:  Pass semantic information down a tree. 
    Intrinsic Attributes:  Synthesized attributes of leaf nodes whose values are determined outside the parse tree, such as from the symbol table, which is used to store variable names and their types. 
    Operational Semantics:  A way to describe the meaning of a program by executing its statements on a machine, either real or simulated. The changes that occur in the machine’s state when it executes a given statement define the meaning of that statement. Similar to restating the program in pseudocode or assembly. 
    Axiomatic Semantics:  Defined in conjunction with the development of a method to prove the correctness of programs. In a proof, each statement of a program is both preceded and followed by a logical expression that specifies constraints on program variables. These are used to specify the meaning of the statement using Predicate Calculus. 
    Assertion:  A logical expression, also known as a predicate. An assertion immediately preceding a program statement, a precondition, describes the constraints on the program variables at that point in the program. An assertion immediately following a statement, a postcondition, describes the new constraints on those variables after execution of the statement. 
    Weakest Precondition:  The least restrictive precondition that will guarantee the validity of the associated postcondition. 
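    A small worked example in the usual textbook style (the numbers are illustrative): the weakest precondition of an assignment is obtained by substituting the right-hand side into the postcondition.
        statement:      a = b + 2
        postcondition:  {a > 1}
        wp(a = b + 2, {a > 1}) = {b + 2 > 1} = {b > -1}
    So b > -1 is the least restrictive condition on the input state that guarantees a > 1 afterward.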
    Axiom:  A logical statement that is assumed to be true. 
    Inference Rule:  A method of inferring the truth of one assertion on the basis of the values of other assertions. 
    Rule of Consequence:  An inference rule stating that a precondition can be strengthened and a postcondition weakened: if {P} S {Q} holds, P' => P, and Q => Q', then {P'} S {Q'} holds. 

    Loop Invariant:  Some assertion p such that, if p and the loop condition c are true before an iteration, then p is true again after that iteration of the loop body; hence p still holds when the loop exits. 
    Total Correctness:  If loop termination can be shown, the axiomatic description of the loop is called total correctness. 
    Partial Correctness:  If the other conditions can be met but termination is not guaranteed, it is called partial correctness. 
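    A small illustrative loop (hypothetical code, not from the textbook) with its invariant written as comments; showing that it also terminates would give total correctness, while the invariant alone gives partial correctness.

// Illustrative loop invariant: before the loop and after each iteration,
// sum == 0 + 1 + ... + (i - 1).
public class InvariantDemo {
    static int sumUpTo(int n) {           // precondition: n >= 0
        int sum = 0;
        int i = 0;                        // invariant holds: sum of the first 0 integers is 0
        while (i < n) {
            sum += i;
            i++;                          // invariant restored: sum == 0 + 1 + ... + (i - 1)
        }
        return sum;                       // postcondition: sum == n * (n - 1) / 2
    }

    public static void main(String[] args) {
        System.out.println(sumUpTo(5));   // 10 = 0 + 1 + 2 + 3 + 4
    }
}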
    Denotational Semantics:  The fundamental concept of denotational semantics is to define for each language entity both a mathematical object and a function that maps instances of that entity onto instances of the mathematical object. e.g. for binary numerals:
    <bin_num> --> '0' | '1' | <bin_num> '0' | <bin_num> '1'
    Mbin('0') = 0    Mbin('1') = 1
    Mbin(<bin_num> '0') = 2 * Mbin(<bin_num>)
    Mbin(<bin_num> '1') = 2 * Mbin(<bin_num>) + 1 
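    A tiny illustrative Java rendering of the Mbin function above: the meaning of a binary numeral string is computed recursively, one production at a time.

// Illustrative translation of the Mbin denotational function: the meaning of a
// binary numeral is computed recursively from its rightmost digit.
public class MbinDemo {
    static int mbin(String binNum) {
        if (binNum.equals("0")) return 0;                       // Mbin('0') = 0
        if (binNum.equals("1")) return 1;                       // Mbin('1') = 1
        String rest = binNum.substring(0, binNum.length() - 1);
        char last = binNum.charAt(binNum.length() - 1);
        return last == '0'
                ? 2 * mbin(rest)                                // Mbin(<bin_num> '0') = 2 * Mbin(<bin_num>)
                : 2 * mbin(rest) + 1;                           // Mbin(<bin_num> '1') = 2 * Mbin(<bin_num>) + 1
    }

    public static void main(String[] args) {
        System.out.println(mbin("1101"));   // 13
    }
}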
    Encapsulation:  The grouping of subprograms and the data they manipulate. 
    Abstract Data Type:  The definitions of the type and the operations on objects of the type are contained in a single syntactic unit. Other program units may be allowed to create variables of the defined type. Also, the representation of objects of the type is hidden from the program units that use the type (clients). 
    Object:  An instance of an ADT. 
    Class:  An ADT in an OO language. 
    Object Oriented Programming:  Makes use of Abstract Data Types, Inheritance, and Dynamic Binding. 
    Interface Inheritance:  If only the interface of the parent class is visible to the subclass. 
    Implementation Inheritance:  If implementation details of the parent class are visible to the subclass. 
    Pure Virtual Function:  It has no body and it cannot be called. It must be redefined in derived classes. 
    Abstract Class:  Any class that includes a pure virtual function. 
    Design issues of object-oriented programming:  exclusivity of objects, subclasses and subtypes, implementation and interface inheritance, type checking and polymorphism, single and multiple inheritance, dynamic binding, and explicit or implicit deallocation of objects. 
    Recursive-Descent Parsing:    Consists of a collection of subprograms, many of which are recursive. 
    Top-Down Parsing:  Leftmost derivation. A top-down parser traces the parse tree in preorder beginning with the root. LL grammar. 
    Bottom-Up Parsing:  A bottom-up parser constructs a parse tree beginning at the leaves and progressing toward the root. A bottom-up parser produces the reverse of a rightmost derivation.
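    A minimal recursive-descent sketch in Java for a hypothetical expression grammar (not the textbook's parser): one method per nonterminal, tracing a leftmost derivation top-down. The additions and multiplications are handled with loops because a directly left-recursive rule would make the corresponding method call itself forever.

// Hypothetical recursive-descent parser/evaluator for:
//   <expr>   --> <term> { + <term> }          (left associative, handled with a loop)
//   <term>   --> <factor> { * <factor> }
//   <factor> --> NUMBER | ( <expr> )
// One (possibly recursive) method per nonterminal.
public class RecursiveDescentDemo {
    private final String src;
    private int pos = 0;

    RecursiveDescentDemo(String src) { this.src = src.replace(" ", ""); }

    int expr() {                                  // <expr>
        int value = term();
        while (peek() == '+') { pos++; value += term(); }
        return value;
    }

    int term() {                                  // <term>
        int value = factor();
        while (peek() == '*') { pos++; value *= factor(); }
        return value;
    }

    int factor() {                                // <factor>
        if (peek() == '(') {
            pos++;                                // consume '('
            int value = expr();
            pos++;                                // consume ')'
            return value;
        }
        int start = pos;
        while (Character.isDigit(peek())) pos++;
        return Integer.parseInt(src.substring(start, pos));
    }

    private char peek() { return pos < src.length() ? src.charAt(pos) : '\0'; }

    public static void main(String[] args) {
        System.out.println(new RecursiveDescentDemo("2 + 3 * (4 + 1)").expr());  // 17
    }
}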
