
Advanced Programming Course

Organization

Provisional course schedule


Lecture 4: Lexical and syntactic analysis


Monday, February 19

  • Review of formal language theory
  • Lexical analysis: finite automata, lex and flex
  • Syntactic analysis: pushdown automata, yacc and bison

The reference book: lex & yacc, Doug Brown, John Levine, Tony Mason, O'Reilly, October 2012

Technical documentation

lex & yacc, Doug Brown, John Levine, Tony Mason, O'Reilly Media, October 2012

Flex and bison are tools designed for writers of compilers and interpreters, although they are also useful for many applications that will interest noncompiler writers. Any application that looks for patterns in its input or has an input or command language is a good candidate for flex and bison. Furthermore, they allow for rapid application prototyping, easy modification, and simple maintenance of programs. To stimulate your imagination, here are a few things people have used flex and bison, or their predecessors lex and yacc, to develop:

  • The desktop calculator bc
  • The tools eqn and pic, typesetting preprocessors for mathematical equations and complex pictures
  • Many other “domain-specific languages” targeted for a particular application
  • PCC, the Portable C Compiler used with many Unix systems
  • Flex itself
  • A SQL database language translator
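
To make the lexer/parser split concrete, here is a small hand-written C++ sketch of the classic desktop-calculator example. It is illustrative only (hypothetical names, no error recovery): the Lexer struct plays the role of the scanner that flex would generate from a .l specification, and the Parser struct plays the role of the parser that bison would generate from a .y grammar.

<code cpp>
// calc.cpp -- a tiny hand-written expression evaluator illustrating the
// lexer/parser split that flex and bison automate.
// Grammar:  expr := term (('+'|'-') term)*
//           term := factor (('*'|'/') factor)*
//           factor := NUMBER | '(' expr ')'
#include <cctype>
#include <iostream>
#include <stdexcept>
#include <string>

struct Lexer {                       // the "flex" part: characters -> tokens
    std::string src;
    size_t pos = 0;
    char peek() {                    // next significant character
        while (pos < src.size() && std::isspace(static_cast<unsigned char>(src[pos]))) ++pos;
        return pos < src.size() ? src[pos] : '\0';
    }
    double number() {                // recognize a NUMBER token
        size_t len;
        double v = std::stod(src.substr(pos), &len);
        pos += len;
        return v;
    }
};

struct Parser {                      // the "bison" part: applies the grammar
    Lexer lex;
    double factor() {
        char c = lex.peek();
        if (c == '(') {
            ++lex.pos;               // consume '('
            double v = expr();
            if (lex.peek() == ')') ++lex.pos;   // consume ')'
            return v;
        }
        if (std::isdigit(static_cast<unsigned char>(c))) return lex.number();
        throw std::runtime_error("unexpected character");
    }
    double term() {
        double v = factor();
        for (char c = lex.peek(); c == '*' || c == '/'; c = lex.peek()) {
            ++lex.pos;
            v = (c == '*') ? v * factor() : v / factor();
        }
        return v;
    }
    double expr() {
        double v = term();
        for (char c = lex.peek(); c == '+' || c == '-'; c = lex.peek()) {
            ++lex.pos;
            v = (c == '+') ? v + term() : v - term();
        }
        return v;
    }
};

int main() {
    Parser p{Lexer{"1 + 2 * (3 - 1)"}};
    std::cout << p.expr() << "\n";   // prints 5
}
</code>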

Lecture 3: Memory management


Monday, February 12

  • General principles
  • Mark-and-sweep
  • Lab session (TD): implementation in C++ (a minimal sketch follows this list)
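
As a warm-up for the lab session, here is a minimal mark-and-sweep sketch in C++ (hypothetical names, not the actual TD code): the mark phase traverses everything reachable from the roots, and the sweep phase reclaims whatever was left unmarked.

<code cpp>
// gc.cpp -- a minimal mark-and-sweep sketch.
#include <iostream>
#include <vector>

struct Object {
    bool marked = false;
    std::vector<Object*> children;   // outgoing references
};

struct Heap {
    std::vector<Object*> objects;    // every object ever allocated
    std::vector<Object*> roots;      // stand-ins for registers, stack, globals

    Object* allocate() {
        Object* o = new Object;
        objects.push_back(o);
        return o;
    }

    void mark(Object* o) {           // mark phase: depth-first traversal from a root
        if (o == nullptr || o->marked) return;
        o->marked = true;
        for (Object* child : o->children) mark(child);
    }

    void sweep() {                   // sweep phase: reclaim everything left unmarked
        std::vector<Object*> live;
        for (Object* o : objects) {
            if (o->marked) { o->marked = false; live.push_back(o); }
            else delete o;
        }
        objects = std::move(live);
    }

    void collect() {
        for (Object* r : roots) mark(r);
        sweep();
    }
};

int main() {
    Heap heap;
    Object* a = heap.allocate();
    Object* b = heap.allocate();
    heap.allocate();                 // unreachable: will be reclaimed
    a->children.push_back(b);        // a -> b
    heap.roots.push_back(a);         // only a is a root
    heap.collect();
    std::cout << heap.objects.size() << " live objects\n";  // prints "2 live objects"
}
</code>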

The reference book: The Garbage Collection Handbook: The Art of Automatic Memory Management, Richard Jones, Antony Hosking, Eliot Moss, 2012.

This book incorporates the material of the first book on the subject: Garbage Collection: Algorithms for Automatic Dynamic Memory Management, Richard Jones, Rafael Lins.

Here is a pedagogical presentation of the topic, with a visualization of the different approaches.

The problem of “concurrent garbage collection” is one of the most beautiful algorithmic challenges of the 1970s and 1980s. The first solution was produced jointly by Dijkstra and Lamport, with a great deal of effort, as you will read below. A monument! Dozens of other solutions were proposed afterwards. It is this problem that showed that concurrent programs cannot be reasoned about in a purely operational way, and that popularized proof methods for parallel programs.

Here is the magnificent article by Dijkstra, Lamport et al. You can compare it with Dijkstra et al.'s original article.

On-the-fly Garbage Collection: an Exercise in Cooperation, Edsger W. Dijkstra, Leslie Lamport, A. J. Martin, C. S. Scholten, E. F. M. Steffens, November 5, 1978

Leslie Lamport writes:

This paper presents the first concurrent garbage collection algorithm–that is, an algorithm in which the collector operates concurrently with the process that creates the garbage. The paper is fairly well known; its history is not.

I received an early version of the paper from Dijkstra, and I made a couple of suggestions. Dijkstra incorporated them and, quite generously, added me to the list of authors. He submitted the paper to CACM. The next I heard about it was when I received a copy of a letter from Dijkstra to the editor withdrawing the paper. The letter said that someone had found an error in the algorithm, but gave no indication of what the error was. Since Dijkstra’s proof was so convincing, I figured that it must be a trivial error that could easily be corrected.

I had fairly recently written [23]. So, I decided to write a proof using that proof method, thinking that I would then find and correct the error. In about 15 minutes, trying to write the proof led me to the error. To my surprise, it was a serious error. […]

The lesson I learned from this is that behavioral proofs are unreliable and one should always use state-based reasoning for concurrent algorithms–that is, reasoning based on invariance.


Lecture 2: The mysteries of ''let rec'' finally revealed

Monday, January 29

  • The notion of closure
  • An interpreter with lexical binding
  • Handling recursive declarations with a placeholder (see the sketch after this list)
  • Managing the environment with frames in Scheme
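
Here is a rough C++ analogy of the two ideas above (illustrative only, not the Scheme interpreter built in class): a closure packages code together with its defining environment, and ''let rec'' can be handled by first binding the name to a placeholder and only then building the closure that refers to it.

<code cpp>
// letrec.cpp -- closures and the "placeholder" trick behind let rec, mimicked in C++.
#include <functional>
#include <iostream>

int main() {
    // A closure pairs code with its defining environment: the local k of
    // make_adder survives inside the returned lambda.
    auto make_adder = [](int k) { return [k](int x) { return x + k; }; };
    auto add3 = make_adder(3);
    std::cout << add3(4) << "\n";    // prints 7

    // let rec, step 1: bind the name to a placeholder (an empty std::function),
    // just as the interpreter first extends the environment with an unfilled slot.
    std::function<long(int)> fact;

    // Step 2: build the closure.  It captures the binding (by reference), not its
    // current value, so the recursive call will see the slot's final contents.
    fact = [&fact](int n) -> long {
        return n <= 1 ? 1 : n * fact(n - 1);
    };

    // Step 3: once the slot is filled, recursion just works.
    std::cout << fact(10) << "\n";   // prints 3628800
}
</code>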

The reference book: Principes d'implantation de Scheme et Lisp, Christian Queinnec, 2007. By the French specialist in Lisp implementation, a fascinating book for compilation enthusiasts. A (partial) online version is available.

NB: An older version is entitled Les langages LISP, Christian Queinnec, 1997

Closure, Wikipedia


The concept of closures was developed in the 1960s for the mechanical evaluation of expressions in the λ-calculus and was first fully implemented in 1970 as a language feature in the PAL programming language to support lexically scoped first-class functions.

Peter J. Landin defined the term closure in 1964 as having an environment part and a control part as used by his SECD machine for evaluating expressions. Joel Moses credits Landin with introducing the term closure to refer to a lambda expression whose open bindings (free variables) have been closed by (or bound in) the lexical environment, resulting in a closed expression, or closure. This usage was subsequently adopted by Sussman and Steele when they defined Scheme in 1975, a lexically scoped variant of LISP, and became widespread.


Lecture 1: Lisp, a language from another world

Monday, January 8

Theme: the Lisp language and its interpreter

  • Organization and objectives of this part of the course
  • History of the Lisp language
  • Basic elements: atoms, lists, setq, defun, recursion
  • The LISP interpreter: eval and apply (see the sketch after this list)
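
A hypothetical miniature of the eval/apply loop, written here in C++ rather than Lisp so that it runs standalone: eval reduces atoms and recursively evaluates the arguments of a combination, then hands them to apply. The real interpreter studied in class adds a reader, special forms, and closures on top of this skeleton.

<code cpp>
// mini_eval.cpp -- a toy version of the eval/apply loop at the heart of a Lisp interpreter.
#include <iostream>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// An expression is an atom (number or symbol) or a list (operator arg1 arg2 ...).
struct Expr {
    enum Kind { Number, Symbol, List } kind;
    double num = 0;
    std::string sym;
    std::vector<Expr> list;
};

using Env = std::map<std::string, double>;

// apply: invoke an operator on already-evaluated arguments.
double apply(const std::string& op, const std::vector<double>& args) {
    if (op == "+") { double s = 0; for (double a : args) s += a; return s; }
    if (op == "*") { double p = 1; for (double a : args) p *= a; return p; }
    throw std::runtime_error("unknown operator: " + op);
}

// eval: atoms are values or variable lookups; lists are applications whose
// arguments are evaluated recursively before being handed to apply.
double eval(const Expr& e, const Env& env) {
    switch (e.kind) {
    case Expr::Number: return e.num;
    case Expr::Symbol: return env.at(e.sym);
    case Expr::List: {
        std::vector<double> args;
        for (size_t i = 1; i < e.list.size(); ++i) args.push_back(eval(e.list[i], env));
        return apply(e.list[0].sym, args);
    }
    }
    throw std::runtime_error("unreachable");
}

int main() {
    // (+ x (* 2 3)) with x bound to 4, i.e. 10
    Expr x{Expr::Symbol, 0, "x", {}};
    Expr two{Expr::Number, 2, "", {}};
    Expr three{Expr::Number, 3, "", {}};
    Expr times{Expr::List, 0, "", {Expr{Expr::Symbol, 0, "*", {}}, two, three}};
    Expr plus{Expr::List, 0, "", {Expr{Expr::Symbol, 0, "+", {}}, x, times}};
    std::cout << eval(plus, Env{{"x", 4}}) << "\n";   // prints 10
}
</code>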

The book to buy: Structure and Interpretation of Computer Programs, Harold Abelson, Gerald Jay Sussman, Julie Sussman, MIT Press

History of Lisp, John McCarthy, Artificial Intelligence Laboratory, Stanford University, 1979

As a programming language, LISP is characterized by the following ideas: computing with symbolic expressions rather than numbers, representation of symbolic expressions and other information by list structure in the memory of a computer, representation of information in external media mostly by multi-level lists and sometimes by S-expressions, a small set of selector and constructor operations expressed as functions, composition of functions as a tool for forming more complex functions, the use of conditional expressions for getting branching into function definitions, the recursive use of conditional expressions as a sufficient tool for building computable functions, the use of lambda-expressions for naming functions, the representation of LISP programs as LISP data, the conditional expression interpretation of Boolean connectives, the LISP function eval that serves both as a formal definition of the language and as an interpreter, and garbage collection as a means of handling the erasure problem. LISP statements are also used as a command language when LISP is used in a time-sharing environment.


Bibliography

Programming languages

Measuring programming language popularity

It is difficult to determine which programming languages are most widely used, and what usage means varies by context. One language may occupy the greater number of programmer hours, a different one have more lines of code, a third may utilize the most CPU time, and so on. Some languages are very popular for particular kinds of applications. For example, COBOL is still strong in the corporate data center, often on large mainframes; FORTRAN in engineering applications; C in embedded applications and operating systems; and other languages are regularly used to write many different kinds of applications.

Lisp

  • The little LISPer, 3rd Edition, Daniel P. Friedman, Matthias Felleisen, Pearson, 1989. The book that introduced a whole generation to recursion! (There are also “The little Schemer”, “The little MLer”, etc.)
  • Principes d'implantation de Scheme et Lisp, Christian Queinnec, 2007. By the French specialist in Lisp implementation

Scheme

Semantics

Lambda calculus


Resources

Lisp

Scheme

Bigloo homepage

Bigloo is a Scheme implementation devoted to one goal: enabling Scheme based programming style where C(++) is usually required. Bigloo attempts to make Scheme practical by offering features usually presented by traditional programming languages but not offered by Scheme and functional programming. Bigloo compiles Scheme modules. It delivers small and fast stand alone binary executables. Bigloo enables full connections between Scheme and C programs, between Scheme and Java programs, and between Scheme and C# programs.
