Invited Speakers

The Program Committee is happy to announce that the following speakers have accepted our invitation:

Nicola Angius (University of Sassari, IT)

Lenore Blum (Carnegie Mellon University, USA)

David Alan Grier (IEEE & George Washington University, USA)

Furio Honsell (University of Udine, IT)

Pierre Mounier-Kuhn (CNRS & University of Paris-Sorbonne, F)

Franck Varenne (University of Rouen, F)


Functions, Abstractions and Idealizations in the Explanation of Computing Systems Behaviour (Nicola Angius)

The issue of explanation in Computer Science arises in connection with the problem of stating why a computational system displayed an execution that is not correct with respect to the system’s requirements. In the process of designing, specifying, programming, and implementing computational artefacts, computer scientists are admittedly involved in multiple, hierarchical descriptions of the systems to be realized. Providing an explanans for an observed miscomputation implies identifying the description(s) against which the artefact is not correct and tracing back the corresponding error state in order to fix the system accordingly. It is argued here that providing explanations in computer science is a pragmatic activity (van Fraassen 1980): depending on the context wherein the explanatory request is advanced, different contrast-classes showing allowed executions of the system under inquiry may be provided. For each of those contrast-classes, a relevance relation should be specified which expresses why the observed computation was executed instead of any other in the contrast-class.

This talk aims to show how, depending on the description pragmatically selected, the corresponding relevance relation provides explanantia that fulfil divergent models of scientific explanation. It is argued that semantic specifications, in the form of the state transition systems used in formal verification, allow for nomological explanations of observed executions. Functional analyses of high-level language, assembly language, and machine code programs are shown to explain an observed miscomputation in terms of functional error states that can be multiply realized by different lower-level implementations. The thesis that architecture descriptions afford mechanical explanations of the system’s failures is called into question: descriptions of hardware architectures computing some given execution are acknowledged as mechanism sketches supplying functional explanations and focusing on state or combinatory elements introduced as functional units. Only descriptions of the working transistors implementing logic gates are capable of providing mechanical-causal explanations. However, it is argued here that those explanations concern the physical processes of an electric circuit, not the computational processes of a digital circuit.


Alan Turing and the Other Theory of Computation (Lenore Blum)

The two major traditions of the Theory of Computation have for the most part run a parallel non-intersecting course. On one hand, we have the tradition arising from logic and computer science addressing problems with more recent origins, using tools of combinatorics and discrete mathematics. On the other hand, we have numerical analysis and scientific computation emanating from the classical tradition of equation solving and the continuous mathematics of calculus. Both traditions are motivated by a desire to understand the essence of computation, of algorithm; both aspire to discover useful, even profound, consequences.

While those in the logic and computer science communities are keenly aware of Alan Turing's seminal role in the former (discrete) tradition of the theory of computation, most still remain unaware of Alan Turing's role in the latter (continuous) tradition, this notwithstanding the many references to Turing in the modern numerical analysis/computational mathematics literature. These references are not to recursive/computable analysis (suggested in Turing’s seminal 1936 paper), usually cited by logicians and computer scientists, but rather to the fundamental role that the notion of “condition” (introduced in Turing’s seminal 1948 paper) plays in real computation and complexity.

This talk will recognize Alan Turing’s work in the foundations of numerical computation (in particular, his 1948 paper “Rounding-Off Errors in Matrix Processes”), its influence in complexity theory today, and how it provides a unifying concept for the two major traditions of the Theory of Computation.
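For readers less familiar with the term, the “condition” of a problem measures how sensitive its solution is to small perturbations of the data. A standard textbook formulation for a linear system $Ax = b$ (given here only as background, not quoted from the abstract or from Turing’s paper) is

\[
\kappa(A) = \|A\|\,\|A^{-1}\|, \qquad \frac{\|\delta x\|}{\|x\|} \;\le\; \kappa(A)\,\frac{\|\delta b\|}{\|b\|} \quad \text{when } A(x + \delta x) = b + \delta b,
\]

so a large $\kappa(A)$ (an ill-conditioned matrix) means that rounding errors in the data may be greatly amplified in the computed solution.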


Walter Shewhart and the Philosophical Foundations of Software Engineering (David Alan Grier)

To create engineering standards of practice for software, the early practitioners of this discipline had to accomplish two goals. First, they had to materialize an immaterial artifact, the program. They had to conceptualize software in a way that could be measured. Second, they had to introduce the concept of randomness into a decidedly deterministic framework. Practically, these two goals forced them to reject two dominant modes of engineering practice, those of mechanical and electrical engineering. Historically, this was a relatively straightforward task as mechanical and electrical engineers proved easy to ignore. They wanted to dictate the nature of the new engineering field but were unwilling to do the work to shape it.

The early software engineers moved quickly to base their practice on the works of Walter Shewhart (1891-1967), one of the founders of industrial engineering and quality control. In particular, his work provided the foundation for the 10 IEEE standards that formed the basis for ISO 12027, commonly called the “SWEBOK Standard”, which describes what we now call the classic form of software engineering.

In the process of adopting these ideas, they found that they had to accept the logical positivism that undergirded Shewhart’s work. Shewhart was a friend of C. I. Lewis and borrowed ideas from Lewis’ pragmatic writings and his critiques of rational positivism. These ideas forced the engineers to recognize that they had to deal with the problem of logical implication, the case in which a true consequent can be paired with a false antecedent and still produce a true statement. While the problems of logical implication could be found in other forms of engineering, they had a powerful impact on a field that was thought to be based on deterministic automata and logical proof.


Wherefore art thou... Semantics of Computation? (Furio Honsell)

The apparent remoteness and gratuitousness of most Computation Models is the Pythagorean dream made true, but also the original sin of Computing. The more computers are used in life-critical applications, the more pressing digital woes become. We have no choice but to take upon ourselves the heavy burden of explaining the semantics of programming languages and proving the correctness of programs with respect to specifications and the adequacy of encodings. The starting point of Denotational Semantics is that the meaning of an algorithm is a total function, but what is the meaning of a non-terminating program, i.e. a process, such as an operating system or the internet? A possible answer is to shift from input/output behaviours to equivalences, from algebras to co-algebras, from initial data to circular and infinite objects, from functions to possibly non-terminating strategies on games. I will try to outline a brief history of the quest for a Final Semantics in Computing, where the interpretation function is viewed as a final mapping rather than an initial mapping.
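To make the shift concrete, here is a minimal sketch (my own illustration in Haskell, not taken from the talk; the names Stream, ticks and observe are invented for this example): a non-terminating process is given meaning not as a total input/output function but through the observations it admits of its circular, infinite behaviour.

-- A minimal sketch, not from the talk: an infinite stream as a circular object
-- whose meaning lies in its observable behaviour rather than in a terminating
-- input/output function.
data Stream a = Cons a (Stream a)

-- A non-terminating "process": it never returns a final result, yet the
-- sequence of outputs it produces is perfectly well defined.
ticks :: Integer -> Stream Integer
ticks n = Cons n (ticks (n + 1))

-- Observation: any finite prefix of the infinite behaviour can be extracted.
observe :: Int -> Stream a -> [a]
observe 0 _           = []
observe k (Cons x xs) = x : observe (k - 1) xs

main :: IO ()
main = print (observe 5 (ticks 0))  -- prints [0,1,2,3,4]

Two such processes are then identified exactly when no finite observation distinguishes them (bisimulation), which is, roughly, the behaviour-based view of meaning that a Final Semantics takes as primary.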


Logic and Computing in France: A Progressive Convergence (Pierre Mounier-Kuhn)

What role did mathematical logic play in the emergence of computing? The recent discovery of a volume of technical reports, written in 1950-1952 by the designers of France's first computer at the Société d'Electronique et d'Automatisme, sheds new light on the local reception of the works published by von Neumann, Wilkes, and Turing. While the first two were highly influential, Turing's work was known and understood only by a young mathematician who reflected on programming methods for the machine; yet this junior thinker suggested launching research on alternative architectures, which became a veritable R&D program for the company in the following decade.

My paper will set this early awareness in contrast with the near-absence of mathematical logic on the broader French academic scene at that time, and its remoteness from the concerns of computer users or designers. It will then analyze the later “discovery” of the theory of recursive functions and the Turing machine, by computer scientists or by scholars interested in formal linguistics and machine translation – a typical convergence of intellectual agendas which proved decisive in building a new discipline.


The Turn of Object-Oriented Programming in Computerized Models and Simulations (Franck Varenne)

During the first decades after the 1950s, digital computer-aided modeling was mostly twofold. On the one hand, based on its ability to emulate any computing device, the computer was taken as a formal calculator. On the other hand, based on its ability to make numerous but advantageously controllable approximate computations, the computer was used as a numerical solver - a numerical simulator - of mathematical models. As a consequence, the computer was seen by modelers either as a model in itself (an analogue) or as an instrument (a tool) which permitted the resolution and, through that, the manipulation of some pre-given mathematical model.

Since the 1990s, the situation has become richer and more complex. The delayed but now vibrant development of a kind of programming - whose techniques date back at least to the 1960s - called object-oriented programming (OOP), and consequently of object-oriented modeling (OOM), in most empirical sciences and techniques has led more and more modelers to see computer simulations as something other than approximate computations of models: complex computational models are more and more used as some kind of “substitutes” and not only as analogues or tools.

This talk will try to contextualize and re-estimate the epistemological implications of this turn. In particular, it will show that a too restrictive definition of “computer simulation” – e.g. one not sufficiently aware of the span of von Neumann’s initial ideas - prevents us from measuring, characterizing and tempering this somewhat excessive and worrying conception of simulations as “substitutes”. The OOP turn is a fascinating historical and epistemological object in that it reveals to us what we had sometimes forgotten about the computer and its very idea. This turn has implications not only for the ways modelers use and see computers but also for the ways epistemologists have to precisely understand what a computer can be, can do and - perhaps - can’t do.
