Accepted Talks

A note on knowing machines (Alessandro Aldini, Pierluigi Graziani and Vincenzo Fano) The Gödelian Arguments represent the effort made by some scholars to interpret Gödel's Incompleteness Theorems with the purpose of showing that minds cannot be explained in purely mechanist terms. In this setting, Reinhardt and Carlson opened a line of research intended to study a formal theory of knowing machines, based on Epistemic Arithmetic, which encompasses some typically informal aspects of the Gödelian Arguments about the knowledge that can be acquired by machines. In such a framework, several variants of the Church-Turing Thesis are constructed and interpreted by employing an epistemic notion of knowledge formalized through a modal operator K for "knowledge" and adequate axioms. For instance, the axiom K f → f establishes the factivity of knowledge. In our contribution, we survey this theory of knowing machines and extend some results by Alexander, who recently proved that a machine can know its own code exactly but cannot know its own factivity (despite actually being factive). We show a machine that, for (at least) a specific choice of the function f and of the related input, knows its own code and its own factivity. This represents an additional element for the analysis of the Gödelian Arguments.
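For orientation, the display below sketches the modal principles usually assumed for the operator K in Epistemic Arithmetic; this is a minimal reconstruction in standard notation, not necessarily the exact axiomatization used in the talk.

```latex
% Modal principles commonly assumed for K in Epistemic Arithmetic
% (a sketch; exact axiomatizations vary across authors):
\begin{align*}
  &K(\varphi \rightarrow \psi) \rightarrow (K\varphi \rightarrow K\psi)
     && \text{closure of knowledge under known implication}\\
  &K\varphi \rightarrow \varphi
     && \text{factivity: whatever is known is true}\\
  &K\varphi \rightarrow K K\varphi
     && \text{positive introspection}\\
  &\text{from a proof of } \varphi \text{, infer } K\varphi
     && \text{necessitation rule}
\end{align*}
```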
This paper aims to study one of the oldest medieval handbooks on calculation with Indo-Arabic numerals in positional notation, the Carmen de algorismo by the Franciscan friar and scholar Alexander de Villa Dei, who was also a professor of Humanities at the University of Paris in the 12th century. This work spread remarkably widely during the Middle Ages in many European countries, alongside the Algorismus prosaicus by Johannes de Sacrobosco. In our study we will portray the overall picture of the spread, in cultured circles, of the new techniques of calculation with Indo-Arabic numerals and of the literature that followed, since it differs from the contemporary works known as abacus books, which were devoted to mercantile and practical calculations. Despite its importance, the work has not yet been thoroughly investigated, both because of its relative difficulty, being composed in verse by a refined author, and because of the existence of a contemporary literature with the same content, starting precisely with that of Sacrobosco.
In recent years a common trend has emerged in digital humanities, characterized by the adoption of computational text mining methods in the study of digital sources, often in opposition to traditional hermeneutic approaches. In our paper, we intend to show how text mining methods will always need strong support from the humanist. In our opinion, humanist research involving computational techniques should be thought of as a three-step process: from close reading (selection of a specific case study, initial feature description) to distant reading (text mining analysis) to close reading again (evaluation of the results, interpretation, use of the results). We believe that failing to understand the importance of all three steps is a major cause of the mistrust of text mining techniques that has developed around the humanities. On the other hand, we think that text mining techniques are a very promising tool for the humanities and that researchers should not renounce such approaches, but should instead experiment with advanced methods such as those belonging to the family of deep learning. In this sense, we believe that, especially in the field of digital humanities, exploiting the complementarity between computational methods and humans will be the most advantageous research direction.
Our paper will discuss the role of art and design in the history and philosophy of computing. It focuses on the work of John Lansdown (1929-1999) and Bruce Archer (1922-2005) in London during the 1960s and 70s. Much has been written on Babbage and Lovelace’s speculations on the relation of computational machines to creativity in the 19th century, represented most vividly by Babbage’s possession of an automaton dancer and of a portrait of Joseph Marie Jacquard woven on a Jacquard loom. A century later Lansdown, Archer and others addressed this area through the lens of art and design. Whereas much of art and design thinking since the 1970s has been dominated by the idea of the computer as merely a tool, other approaches to computing among artists and designers are once again finding adherents, whether in live-coding, ‘maker faires’, or in increased interest in overtly algorithmic art. This paper aims to demonstrate how the engagement with computing represented by Lansdown, Archer and their contemporaries, much earlier than the current resurgence, shows a specific intellectual approach that contributes to the history of thinking about the relationship between computing and art and design.
We reflect on the computational aspects that are embedded in life at the molecular and cellular level, where the machinery of life can be understood as a massively distributed system whose macroscopic behaviour is an emerging property of the interaction of its components. Such a relatively new perspective, clearly pursued by systems biology, is contributing to the view that biology is in several respects a quantitative science. The recent developments in biotechnology and synthetic biology, notably, are pushing the computational interpretation of biology even further, envisaging the possibility of a programmable biology. Several in silico, in vitro and in vivo results make such a possibility a very concrete one. The long-term implications of such an “extended” idea of programmable computation on a living hardware, as well as the applications that we intend to develop on those “computers”, pose fundamental questions.
Digital philosophy is, strictly speaking, a new speculative theory that places the bit at the foundation of reality and explains the evolution of reality as a computational process. This theory in fact reinterprets some earlier philosophical intuitions, starting from the Pythagorean theory of numbers as the beginning of all things and as a criterion for the comprehension of reality. Significant antecedents of this digital and computational philosophical approach, however, can also be traced in the tradition of late antiquity and the early Middle Ages. One of the less investigated chapters of this ‘pre-history’ of digital thinking is located in the so-called Ottonian Renaissance (10th century), when we can identify theorists of what has been called – in reference to modern authors such as Leibniz – the ‘computational paradigm’. The paper focuses on the works of authors such as Abbo of Fleury (945-1004) and Gerbert of Aurillac (946-1003). Their theoretical basis is the famous biblical verse Wis 11, 21 ("Omnia creata sunt in numero, mensura et pondere"): their passion for arithmetic arises from the awareness that "the numbers contain within them, or originate, the beginnings of all things". We can apply to these philosophers the modern concept of mathesis universalis, the term employed by Leibniz to describe his combinatorial art, or characteristica universalis.
Around 1922-1938, Fraenkel and Mostowski developed a permutation-based model of set theory in which the Axiom of Choice does not hold. Almost 100 years later, this simple model has fostered a huge quantity of research results. Among these, we find Lawvere-style initial-algebra syntax with binders, final-coalgebra semantics with resource allocation, and minimization algorithms for mobile systems. These results are also underpinned by describing FM-sets in terms of category theory, and by proving equivalences of models between several different categories. We aim at providing a survey of some of these developments, framed in the recent research line of "nominal computation theory", where the essential notion of name is developed in several different ways.
Functional or appealing? Traces of a long struggle (Giovanni A. Cignoni) Smart-phones, tablets and video-game consoles are computing devices sold as mass market products. They are technological accessories that people buy not necessarily for their functional purposes. The related marketing strategies are based on qualities such as the external design, the name of the brand, and the perception of the products as status symbols. The paper proposes a few examples from the history of computing, mainly focused on Olivetti machines, which help to trace the long process by which appealing but functionally useless (and at times disadvantageous) characteristics became a relevant part of the design of computing devices. Sometimes with grumblings on the part of the engineers.
Mechanism, computational structure and representation in cognitive science (Dimitri Coelho Mollo) I propose that the mechanistic view of concrete computation can be useful in solving some of the philosophical problems at the foundation of cognitive science, in particular that of representation. I argue that at least for one important kind of theory of representation in cognitive science, i.e. structural representation, the mechanistic view of computation may help solve, or dissolve, traditional metaphysical problems. Structural representation is based on the idea that representations represent what they do by virtue of instantiating the same relational structure, i.e. by structurally resembling what they represent. One natural way to cash out the relevant relational structure of representational vehicles is in terms of computational structure. A representation would thus represent all the entities in the world that share its (computational) structure. I argue that uniting the mechanistic view of concrete computation with a structural account of representation helps to give both notions --- computation and representation --- a respectable philosophical standing in cognitive science. In particular, I argue that this combination of views allows ‘deflating’ representational content in a way that nonetheless preserves the explanatory purchase that the notion of representation is supposed to have in cognitive science.
McCarthy's LISP and basis for theory of computation (Hong-Yi Dai) It is well known that LISP is an algebraic list-processing language designed for artificial intelligence, dedicated to symbolic computation, powered by function orientation, enhanced by program manipulation, and underpinned by garbage collection. Ever since it was born, LISP has been celebrated by practitioners as a genuine high-level, functional- and meta-programming language for potentially hard problems involving symbol manipulation. Largely overlooked is that LISP can also serve as a formalism for the theory of computation. Indeed, it embraces and integrates ideas from all three major characterizations of computation, namely the lambda calculus, general recursive functions, and Turing machines. Unfortunately, neither LISP itself nor its generalization was well received as a basis for the theory of computation. Fifty years later, given prominent and promising developments of the programming-language approach to computability and complexity, it is time to give McCarthy's pioneering work a reappraisal. We first review the development of LISP and then revisit McCarthy's formalism. Along the way, we highlight the historical role of the two in linking the traditional and the modern approaches to the theory of computation.
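As a rough illustration of what it means for LISP to serve as a formalism for computation, the sketch below (written in Python rather than LISP, and not taken from the paper) shows the core idea behind McCarthy's universal function eval: once programs are represented as data, a universal interpreter fits in a handful of lines, playing a role analogous to a universal Turing machine. All names and the choice of special forms are illustrative only.

```python
# A minimal sketch of a LISP-like evaluator: programs are data (nested lists),
# so a short interpreter can run any program expressed in the formalism.

def evaluate(expr, env):
    """Evaluate a LISP-like expression represented as nested Python lists."""
    if isinstance(expr, str):            # a variable: look it up
        return env[expr]
    if not isinstance(expr, list):       # a constant (number, boolean, ...)
        return expr
    op, *args = expr
    if op == "quote":                    # (quote x) -> the datum x itself
        return args[0]
    if op == "if":                       # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                   # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    # application: evaluate the operator and the operands, then apply
    fn = evaluate(op, env)
    return fn(*[evaluate(a, env) for a in args])

# Usage: factorial. The closure merges the (mutable) defining environment at
# call time, so the binding of "fact" added below is visible recursively.
env = {"*": lambda a, b: a * b, "-": lambda a, b: a - b,
       "=": lambda a, b: a == b}
env["fact"] = evaluate(
    ["lambda", ["n"],
        ["if", ["=", "n", 0], 1, ["*", "n", ["fact", ["-", "n", 1]]]]],
    env)
print(env["fact"](5))   # -> 120
```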
The contribution of Carl Adam Petri to our understanding of ‘computing’ (Giorgio De Michelis) Carl Adam Petri (1926-2010) is well known in the Computer Science community for the nets that bear his name. It is less known that, from its very first introduction, net theory was, for Petri, the kernel of a radical shift in scientific knowledge. In this paper, I want to take one small step towards overcoming this limit, trying to popularise in a larger community the radical novelty and, of course, the relevance of the approach Carl Adam Petri used for developing scientific knowledge of physical and social phenomena. This has, as we will see, much to do with the concept of computing and, indirectly, with the relations between science and philosophy. Without entering into further details, let me recall that Petri’s answer to the question “What is modelling?” is that, to the widespread view that considers it a partial function from reality to a model, he prefers the view that it is a translation from a shared informal model to a formal model. This talk will summarise three aspects of Petri’s thinking that deserve wider attention: the notion of model, the new algebraic foundations for a theory of modelling, and its application to Pragmatics.
It is commonly accepted that computability theory, as a theory of computable functions, emerged as a consequence of Gödel’s famous incompleteness results of 1931. Computing theory thus appears as naturally diverging from the logical perspective and the philosophy of the logicist and set-theoretical tradition of Frege, Russell or Carnap. However, if closer attention is paid to his entire work, Frege’s approach to logic and formal systems proves to be closer in some essential respects to the computational approach than to the logicist tradition. Building on a detailed study of Frege’s understudied Habilitationsschrift, we’ll show how an original functional approach to magnitudes, in which recursion appears as the schema informing the functional equations that determine quantitative domains, gives Frege the formal basis for the definition of a general concept of number independent of intuition. We’ll show that this particular approach to the general concept of number underlies Frege’s entire logical undertaking, informing in particular Frege’s introduction of the propositional function as well as the logical system built around it, which only incidentally can be associated with a set-theoretical conception. Such a perspective will allow us to consider Frege a strong and insightful pioneer of the logical and philosophical stakes associated with the tradition of the computing sciences.
The methodology of computing utilization in philosophy (Andrej Gogora) The main aim of the contribution is to analyse the utilization of computing (digital resources and tools) in philosophical research. Generally, it deals with the methodology of the utilization of computing processes in philosophical research, and it clarifies its fundamental methodological assumptions. In terms of scientific disciplines, it is located at the intersection of philosophy and digital humanities. Computing is presented here as a method as well as a topic. The contribution consists of four particular objectives: to reconstruct the process of philosophical research in general; to map the existing applications of computing (digital resources and tools) in philosophical research; to identify and categorize the applications of computing in the respective phases of philosophical research; and to define the basic methodological assumptions, benefits and threats of the utilization of computing in the practice of philosophical research. The primary purpose of the contribution is to draw attention to the neglected and insufficient utilization of computing processes in traditional philosophical research, and to emphasize the enhancement of traditional philosophical research by means of computing processes.
What is the epistemology of wayward web search? (Robin Hill) The epistemology of web search should reveal something about search as a method of knowledge acquisition, or something about knowledge itself, or even something about the World-Wide Web. Search, treated as testimony, is an attempt to fill a knowledge gap surrounded by rich context. In a library or conversational setting, that context is readily available, but not when search executes only the method of pattern-matching on a search string. Web search failures, in particular, are revealing, as the results returned often show the misdirection of some reference. When the result is true in spite of that, the search blunder resembles the Gettier problem, except that Gettier problems in human conversation are easily rectified. Informally, the transfer from web page to search engine severs the flow of semantics. Floridi's principles for knowledge provide a platform for explanation. The erotetic model invoked requires that an item of information be expressible as a question, carrying all of the context, with a binary answer. Failing that requirement means that the further criteria of correctness and relevance are not achieved; hence the aleatorization of information cannot be resolved, raising the question of whether knowledge can still be delivered.
Anatoly Kitov: Monologue with Soviet sachems (Vladimir Kitov, Valery Shilov and Sergey Silantiev) The life and work of the outstanding Soviet scientist Anatoly Kitov (1920-2005) are now attracting the attention not only of historians of science, but also of film producers and writers. His biography is full of grand ideas and plans, pioneering publications and dramatic episodes: fierce battles of World War II, the struggle for the recognition of cybernetics in the USSR, the very first monographs on computers and programming, the project of a national computer network, and more. It also covers the years of struggle for the introduction of mathematical methods and models, based on the extensive application of computers, into the socialist planned economy, which would have provided the Soviet leadership with objective and timely information. This history lasted thirty years, beginning in 1959, when Kitov appealed to the Soviet leader Nikita Khrushchev. In 1985 he appealed once more, to Mikhail Gorbachev, General Secretary of the Central Committee of the Communist Party of the Soviet Union. Unfortunately (first of all for the country…), all his attempts remained unsuccessful. Several subsequent appeals to the political authorities in 1987-89 also brought no results. The dialogue did not take place – only a monologue remained. Indeed, it is hard to regard as a partner in dialogue a party which, at best, dismissively mutters something unintelligible and, at worst, organizes your personal persecution. In the end, the scientist’s warnings about the imminent collapse of the USSR came true. Kitov’s ideas and achievements in the development of nationwide and local industrial management systems are described and analyzed in this report in a broad economic and political context.
Emerging computer technologies: from information to perception (Nicola Liberati and Shoji Nagataki) The aim of the presentation is to introduce the elements we need in order to analyse the new emerging digital technologies, focussing on their innovations. These technologies aim to merge the digital world with our everyday experience by making digital objects perceivable as if they were part of our world, as in augmented reality or some applications for the Oculus Rift. These digital objects are not merely “data” anymore, because they are perceptual objects in our world, and so we should leave behind an analysis framed in terms of “information” in order to tackle the problem from a perceptual point of view. Starting from a post-phenomenological analysis, we will show how the innovation of these technologies is strictly related to their “transparency”, and thanks to the introduction of two different kinds of transparency we will be able to fully understand how they work and why the production of perceptual objects is so innovative. New computer technologies are making “data” perceptual, and so the importance of the word “information” has, at the very least, to be reframed.
Several types of types in programming languages (Simone Martini) Types are an important part of any modern programming language, but we often forget that the concept of type we understand nowadays is not the same as it was perceived to be in the sixties. Moreover, we superpose the concept of "type" in programming languages onto the concept of the same name in mathematical logic, an identification which is only the result of the convergence of two different paths that started apart, with different aims. The paper will present several remarks (some historical, some of a more conceptual character) on the subject, as a basis for further investigation. The thesis we will argue is that there are three different characters at play in programming languages, all of them now called types: the technical concept used in language design to guide implementation; the general abstraction mechanism used as a modelling tool; and the classifying tool inherited from mathematical logic. We will suggest three possible dates ad quem for their presence in the programming language literature.
The importance of including game studies into the history of computing (Ignasi Medà Calvet) The beginning and later widespread use of the early micros and home computers were strongly related to the emergence of the first computer games. However, accounts of such an essential episode in the history of computing have traditionally focused on identifying novelty and significance, concentrating on recollections of the emergence of games and of the technological devices to play them, as well as on important firsts, notable designers, and successful corporate innovators. These accomplishments are usually mapped onto an imagined evolutionary timeline that identifies key moments in the past largely in terms of their relationship to contemporary developments. By limiting our study of the history of computing and videogames to only those events and agents most visible or apparently relevant to the present, we neglect the valuable contributions of other very different actors, such as politicians, programmers, designers, distributors, hobbyists, gamers and fan communities. In fact, by giving voice to these myriad other subjects – including their everyday practices – we reveal a diverse set of activities and roles that collectively contribute to the shaping of computing technology, gaming practices and even the gaming industry in their respective local contexts.
The teaching of mathematics has been questioned for more than 30 years by the development of informatics, owing to its relations with mathematics: the two share some common foundations and a strong link with proof, there is a significant development of shared fields and objects at their interface, and computers have changed the way some mathematicians work. In line with the role played by epistemology in the didactics of science, we argue that epistemological studies of the relations between mathematics and informatics must feed didactical research on these issues. We will exemplify our point and show how the epistemology of mathematics and informatics can help to tackle these questions, offering perspectives on the relations between proof and algorithm, the role of language in mathematics and informatics, the ways of thinking in mathematics and informatics, computer-assisted mathematics, and the new objects and new fields emerging between mathematics and informatics.
Computers and Programmed Arts in the Sixties in Italy (Elisabetta Mori) Italian Programmed and Kinetic Art has been the focus of renewed interest in recent years. The work of Bruno Munari, Gruppo T, Gruppo N and others has recently been presented in a series of books and exhibitions. The aim of this research is to explore the nature of the relationship between Programmed Art and the diffusion of computers in the Italy of the Sixties, from the perspective of the history of computer science. Were electronic computing machines involved in any way in producing the works? What notion did these artists have of “programming”? What was the market for computers in Italy at that time? It is not just a coincidence that the very first exhibition of "Arte programmata" was presented in the Olivetti showroom in Milan in May 1962, the first stage of a tour that passed through Venice, Rome and Düsseldorf, ending in the United States. The company entirely funded the exhibition as part of its impressive cultural program of support for art and literature, while its second generation of computers was about to be launched on the market.
Epistemic opacity vis-à-vis human agents has been presented as an essential, ineliminable characteristic of computer simulation models, resulting from the characteristics of the human cognitive agent. This paper argues, on the contrary, that such epistemic opacity as does occur in computer simulations is not a consequence of human limitations but of a failure on the part of model developers to adopt good software engineering practice for managing human error and ensuring that the software artefact is maintainable. One consequence of such failures is to create a “technical debt” which manifests itself in the so-called novel confirmation holism confronting complex systems models. The argument from the supposed essential epistemic opacity of computational science to a non-anthropocentric epistemology runs counter to best practice in software engineering and overlooks the empirical results of software engineering science.
Miscomputation in software development: Learning to live with errors (Tomas Petricek) Computer programs do not always work as expected. There is a complex taxonomy of possible miscomputations or errors: programs may fail as a result of syntax errors, incorrect implementations of algorithms or hardware failures. But what follows after a miscomputation? In this talk, I will discuss the different practical strategies that software developers employ when dealing with errors or miscomputations. I look at four different approaches. In the first, miscomputation must be avoided at all costs, e.g. by using formal proofs. In the second, miscomputation is an integral part of the development process, but is avoided in the final software. In the third, miscomputation even becomes a normal part of running the software and is mitigated by the runtime. Finally, in the fourth approach, miscomputation is part of human-computer interaction during software execution. In other words, the talk follows the history of cohabiting with miscomputation, from the early days when programming errors were not acknowledged as a significant issue to the modern development methodologies that are finding new ways of living with ubiquitous errors.
Plankalkül: not just a chess playing program (Carla Petrocelli) Starting in 1939, Konrad Zuse delved deep into the study of formal logic in order to work out his “computation plan”, i.e. a complete notation system for writing a programming language. Although the Plankalkül (Plan and Kalkül) did not have much impact, on account of Germany's post-World War II hardships, it displays all the traits currently recognized as standard features of modern programming languages: universal, algorithmic, high-level and perfectly suited to the solution of extremely complex problems. The aim of the present study is to highlight the general purpose and technical specifics of this language, its historical and scientific background, and the philosophical inspiration that led Konrad Zuse to employ Hilbert’s predicate logic in the formalization of the “computation projects” for his machines.
The brain in silicon: history and skepticism (Alessio Plebe and Giorgio Grasso) The suggestion of drawing inspiration from neural architectures in designing microprocessors, the “brain in silicon” idea, has lurked around the twists and turns of computer history almost since its beginning. More precisely, three main periods during which this idea turned into practical projects can be identified. The first neural hardware was designed by Marvin Minsky in 1951, building upon the logical interpretation of neural activity by McCulloch and Pitts, and was followed by just a few more attempts. After a long period of almost complete lack of progress, a renewed interest sparked at the end of the 80’s, with several funded projects in Europe, the US, and Japan. By the beginning of this century, almost none of the results of all that effort had reached maturity. In the last few years, a new wave of enthusiasm has spread, with forecasts of a revolution in microprocessor design that closely mimic those of the previous two periods. Despite obvious progress and changes in the technology and in the knowledge of neural mechanisms, the analysis of these three periods shows a shared view of the main reason why the brain in silicon should be successful. We argue that this principle is theoretically flawed, and that therefore the premises for the success of this approach are weak.
The informal side of computability: Church-Turing thesis, in practice (Luca San Mauro) This talk aims to provide a philosophical analysis of the notion of "proof by Church's Thesis", which is - in a nutshell - the conceptual device that permits one to rely on informal methods when working in Computability Theory. It thus allows one, in most cases, not to specify the background model of computation in which a given algorithm - or a construction - is framed. In pursuing such an analysis, we carefully reconstruct the development of this notion (from Post to Rogers, to the present day), and we focus on some classical constructions of the field, such as the construction of a simple set. Then, we make use of this focus in order to support the following encompassing claim (which opposes a somewhat commonly received view): the informal side of Computability, consisting of the large class of methods typically employed in the proofs of the field, is not fully reducible to its formal counterpart. That is to say, informal constructions typically refer to a kind of objects that are: 1) not extensionally fixed; 2) independent of any specific formal model.
In the debate on the disciplinary nature of computing, the traditional tools of the philosophy of science are often used, as in the case of the experimental method, which is still considered to play a fundamental role in the analysis of the methodological nature of the discipline. Instead of trying to adapt long-established concepts to accommodate computing into existing frameworks, we aim at the introduction of new concepts that reflect the peculiar status of computing in between science and engineering. In this endeavour, in order to stretch the traditional notion of experiment and its nature and role in computing, we plan to move along three different but interconnected directions: the notion of directly action-guiding experiment which, in opposition to the traditional notion of controlled experiment, characterizes a significant part of experimental practice in computing; the debate around engineering ontology and engineering epistemology, and whether adapting existing conceptual frameworks suffices to take new practices into account; and technoscience as an engineering way of being in science, where theoretical representation and technical intervention cannot be held apart even in theory.
Hungary's Early Years in the Ryad (Mate Szabo) In 1968, Alexei Kosygin, the Chairman of the Council of Ministers of the Soviet Union, initiated a cooperation among the countries of the Council for Mutual Economic Assistance (CMEA) to develop a Unified System of Electronic Computers (ES or Ryad). The Ryad program consisted of an upward-compatible series of computers intended to make up for the CMEA countries' deficits in computing technology. The goal was achieved by cloning IBM's 360 (and later 370) system, with different countries responsible for different members of the series and for some of the peripherals. Hungary was responsible for the smallest member of the series, the R10, a computer that did not have a corresponding machine in the IBM 360 series. It was based on the license for the 10010 and later the Mitra 15 computers of the French Compagnie Internationale pour l'Informatique (CII), which Hungary bought in 1968. The aim of my talk is to describe Hungary's first couple of years in the project. To achieve this, I briefly describe the state of computer manufacturing in Hungary prior to joining the Ryad, give a quick overview of the economic and historical context, and explain the role of the institutions that took part in the project.
The role of computers in Art (Mario Verdicchio) Most of the criticism against the pioneers of Computer Art in the 1960s came in the form of a specialized version of the Lovelace objection: since machines simply follow orders, one cannot expect any creativity from them; hence the works of computer programmers, if they are the result of a creative process, must come entirely from the programmers’ minds, and since programmers are not artists, their works are spawned from a process that is not artistic and thus cannot be considered artworks. Today such an objection no longer holds, because many artists use computers to create works shown in galleries and museums; that is, computers seem to have entered the Artworld in full effect. Still, this new practice has not been accompanied by an adequate expansion of Art theory to accommodate it, which leaves the door open to criticism in the Artworld about whether computers are relevant and constitute an actual step forward in the field or are simply a fad. Moreover, the only subfield of Computer Art where computers can be shown to be essential, namely Interactive Art, poses a particularly critical problem regarding what distinguishes its works from videogames.
In search of the roots of formal computation (Jan von Plato) It became clear by the 1930s that steps of formal computation are also steps of formal deduction, as defined by recursion equations and other principles of arithmetic. This development began with attempts at giving a foundation to the laws of commutativity and associativity of sum: followers of Kant's doctrine of the synthetic a priori in arithmetic, with a certain Johann Schultz foremost among them, missed by a hair's breadth the proper recursive definition of addition, which appeared instead for the first time in a book of Hermann Grassmann of 1861. Schultz, in his book Anfangsgründe der reinen Mathesis (Basics of pure mathesis, 1790), found it necessary to pose the commutativity and associativity of sum as axioms, as can be seen from the comparison:

Schultz 1790: 7+5 = 7+(4+1) = 7+(1+4) = (7+1)+4
Grassmann 1861: 7+5 = 7+(4+1) = (7+4)+1

Schultz thus missed the inductive proofs of commutativity and associativity, and could do no better than claim that any attempted proof of the commutativity of addition would be circular. He gives instead an inductive proof of the commutativity of product in which the right recursion equations appear as "corollaries", a reversal of the conceptual order recursive definition -- inductive proof. This order was found by Grassmann, and a line can be followed from it to Hankel, Schröder, Dedekind, Peano, and Skolem, the last mentioned marking the birth of recursive arithmetic.
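For readers who want to see what Schultz missed, the display below sketches, in modern notation (not von Plato's own), Grassmann's recursive definition of sum and the inductive proof of associativity it enables; commutativity follows analogously.

```latex
% Grassmann's recursive definition of sum (modern notation, a sketch):
\begin{align*}
  a + 0 &= a, & a + (b + 1) &= (a + b) + 1.
\end{align*}
% Associativity is then proved by induction on c:
\begin{align*}
  a + (b + 0)     &= a + b = (a + b) + 0,\\
  a + (b + (c+1)) &= a + ((b + c) + 1) = (a + (b + c)) + 1\\
                  &= ((a + b) + c) + 1 = (a + b) + (c + 1),
\end{align*}
% and commutativity follows by a similar induction, using the auxiliary
% lemmas 0 + a = a and (a + 1) + b = (a + b) + 1.
```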
All human activities take place over time, each with its own temporality, which itself evolves over time. This paper is part of a historical reflection on the way temporality is articulated in music and in computer technology, two domains whose temporal logics are initially very different. More specifically, the paper focuses on my specialism: the practices of composers of academic computer-assisted music. It concentrates on the temporal aspects of the act of composition. An initial phase, beginning around 1955, prolonged the reflections on composition of the period 1900-1950. The explosion of calculation speeds in the 1980s allowed “real time” to emerge, followed since the beginning of the 2000s by “live” composition time. This has brought computer time and musical time closer and closer together. The research currently being carried out in France, whether in computer science or in musical research, bears witness to the convergence and cross-fertilisation between the reflections on time proper to these two fields. The new types of music that are currently emerging subvert the “traditional” composition process. This movement is part of the general crisis affecting the futurist regime of historicity, and illustrates the growing influence of presentism.