MGS 21 Participants' Talks



Track I (Room M)

The untyped computational lambda-calculus and its intersection type discipline

Riccardo Treglia (Università degli Studi di Torino)

Abstract: We study a Curry-style type assignment system for untyped λ-calculi with effects, based on Moggi's monadic approach. Moving from the abstract definition of monads, we introduce a version of the call-by-value computational λ-calculus based on Wadler's variant, without let, and with unit and bind operators.

We define a notion of reduction for the calculus and prove it confluent. In addition, we relate our calculus to the original work by Moggi and to other calculi from the literature. We then introduce an intersection type system inspired by the system of Barendregt, Coppo and Dezani for the ordinary untyped λ-calculus, establishing type invariance under conversion. Finally, we introduce a notion of convergence, which relates precisely to reduction, and characterize convergent terms via their types.
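
As a point of reference, here is a minimal Haskell sketch (illustrative only, not the calculus of the talk) of the unit and bind operators that the computational lambda-calculus internalises, instantiated to the Maybe monad as a concrete effect:

    unit :: a -> Maybe a
    unit = Just

    bind :: Maybe a -> (a -> Maybe b) -> Maybe b
    bind Nothing  _ = Nothing
    bind (Just x) f = f x

    -- The three monad laws, which the calculus's reduction rules reflect:
    --   bind (unit x) f    ==  f x                              (left unit)
    --   bind m unit        ==  m                                (right unit)
    --   bind (bind m f) g  ==  bind m (\x -> bind (f x) g)      (associativity)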

Reference: U. de'Liguoro and R. Treglia - The untyped computational lambda-calculus and its intersection type discipline


Partial Type Annotations

Pedro Ângelo (Universidade do Porto)

Abstract: Type inference algorithms allow the partial annotation of code: a programmer can annotate some terms with types while leaving others to be inferred. This is usually accomplished by having two lambda abstraction rules, one with annotations and one without, which then also translates into two type inference rules. In this paper we introduce empty type annotations, which allow the type system and type inference rules for lambda abstractions to collapse into one, and which also allow a finer-grained treatment of type annotations, where some components of a type can be specified while others are left to be inferred. We define the type system and the corresponding type inference algorithm, and prove the usual correctness and completeness properties.
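
For intuition, GHC's PartialTypeSignatures extension offers a similar (though distinct) mechanism in practice, where a wildcard marks the components of a type left to inference; this is only an analogy, not the system of the paper:

    {-# LANGUAGE PartialTypeSignatures #-}

    -- The argument type is left to inference (here [Int]), while the
    -- result type is pinned down by the annotation.
    f :: _ -> Int
    f xs = sum xs + length xs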


A mechanical formalization of Hindley-Damas-Milner type inference (Ongoing work)

Roger Bosman (KU Leuven)

Abstract: ML-style polymorphism, as supported by the Hindley-Damas-Milner (HDM) system, allows for sound, complete, and decidable type inference. HDM is extended with elaboration to support implicit programming features like Haskell's type classes. While HDM itself has been thoroughly studied over the years, both on paper and in mechanized settings, as far as we know, there is no end-to-end mechanization of both HDM type inference and elaboration.

We discuss our ongoing work towards such a mechanization in the Coq proof assistant, which we base on a reformulation of algorithm W that explicitly maintains the scoping of unification variables to facilitate elaboration.
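
For readers unfamiliar with algorithm W, the following compact Haskell sketch shows its usual textbook shape for a tiny lambda calculus; it is not the talk's Coq development, it omits let-generalisation and the occurs check, and the explicit scoping of unification variables maintained by the reformulation is only hinted at by the threaded counter for fresh variables:

    import qualified Data.Map as M

    data Ty  = TVar Int | TArr Ty Ty          deriving (Eq, Show)
    data Tm  = Var String | Lam String Tm | App Tm Tm
    type Sub = M.Map Int Ty
    type Env = M.Map String Ty

    apply :: Sub -> Ty -> Ty
    apply s t@(TVar a) = M.findWithDefault t a s
    apply s (TArr l r) = TArr (apply s l) (apply s r)

    compose :: Sub -> Sub -> Sub              -- apply s1 first, then s2
    compose s2 s1 = M.map (apply s2) s1 `M.union` s2

    mgu :: Ty -> Ty -> Maybe Sub              -- most general unifier
    mgu (TVar a) t = Just (M.singleton a t)
    mgu t (TVar a) = Just (M.singleton a t)
    mgu (TArr l1 r1) (TArr l2 r2) = do
      s1 <- mgu l1 l2
      s2 <- mgu (apply s1 r1) (apply s1 r2)
      Just (compose s2 s1)

    -- infer returns a substitution and a type, threading a counter that
    -- supplies fresh unification variables.
    infer :: Env -> Tm -> Int -> Maybe (Sub, Ty, Int)
    infer env (Var x) n = do
      t <- M.lookup x env
      Just (M.empty, t, n)
    infer env (Lam x b) n = do
      let a = TVar n
      (s, t, n') <- infer (M.insert x a env) b (n + 1)
      Just (s, TArr (apply s a) t, n')
    infer env (App f x) n = do
      (s1, tf, n1) <- infer env f n
      (s2, tx, n2) <- infer (M.map (apply s1) env) x n1
      let r = TVar n2
      s3 <- mgu (apply s2 tf) (TArr tx r)
      Just (compose s3 (compose s2 s1), apply s3 r, n2 + 1)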


On Structuring Pure Functional Programs Using Monoidal Profunctors

Alexandre Garcia de Oliveira (University of São Paulo)

Abstract: Pure functional programming has gained popularity in the computer science community and the software industry. As systems become more complex, more techniques are required to decompose the whole into parts and reason about them. Several of these techniques are inspired by category theory. This work introduces one such technique, using Haskell: monoidal profunctors. The goal of this research is to find out whether monoidal profunctors and their free construction can be as genuinely useful as applicative functors are.
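
As a taste of the interface, here is a minimal Haskell sketch of monoidal profunctors, assuming the usual Profunctor class; the class and method names (MonoidalP, punit, pzip) are illustrative choices, not fixed terminology:

    class Profunctor p where
      dimap :: (a' -> a) -> (b -> b') -> p a b -> p a' b'

    -- A monoidal profunctor carries a unit and a way to combine two
    -- values in parallel, compatibly with dimap.
    class Profunctor p => MonoidalP p where
      punit :: p () ()
      pzip  :: p a b -> p c d -> p (a, c) (b, d)

    -- Plain functions are the prototypical instance:
    newtype Fn a b = Fn (a -> b)

    instance Profunctor Fn where
      dimap f g (Fn h) = Fn (g . h . f)

    instance MonoidalP Fn where
      punit              = Fn id
      pzip (Fn f) (Fn g) = Fn (\(a, c) -> (f a, g c))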


Boxes and Locks

Nachiappan Valliappan (Chalmers University of Technology)

Abstract: Fitch-style modal lambda calculi provide an appealing solution to programming with necessitation (or box) modalities by extending typed lambda calculi with a delimiting context operator (or lock). The addition of locks simplifies the formulation of typing rules for different modalities and enables an elegant semantic foundation for these calculi, but it complicates syntactic lemmas involving the context and makes it difficult to formalise and prove meta-theoretic properties about these calculi. In this talk, I'll present an overview of an intrinsically typed formalisation of the Fitch-style Intuitionistic K (IK) calculus and a normalisation-by-evaluation procedure for it. The goal of this work is to implement executable correct-by-construction normalisation procedures for Fitch-style modal calculi beyond IK and to illustrate the use of normalisation for proving domain-specific theorems in application areas such as modal logic and staged computation.
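
For orientation, the two characteristic typing rules of a Fitch-style IK calculus can be stated as follows (one standard presentation, written in LaTeX, with the lock as the context delimiter):

    \frac{\Gamma, \blacklozenge \vdash t : A}{\Gamma \vdash \mathsf{box}\; t : \Box A}
    \qquad
    \frac{\Gamma \vdash t : \Box A}{\Gamma, \blacklozenge, \Delta \vdash \mathsf{unbox}\; t : A}
    \;\; (\Delta \text{ contains no lock})

Introducing a box adds a lock to the context; eliminating one crosses exactly one lock, which is the choice specific to IK.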


Equating polymorphic programs with the same potential utility

Cristian F. Sottile (Universidad de Buenos Aires)

Abstract: The theory of type isomorphisms relates types according to their potential utility, i.e., the same results can be obtained from them after certain constructor eliminations, and establishes that any program can be transformed into another of an isomorphic type without any loss of information. Polymorphic System I is a System F extension that incorporates isomorphisms by equating isomorphic types and their corresponding programs, allowing the type system to base its restrictions on programs' potential utility instead of their syntax, and inducing programs to adapt to the context in which they are used. This work aims to contribute to the search for languages that more accurately express logic and computation, for instance by questioning whether a function and its curried analogue should really be different, and to explore the consequences of having type systems with such flexibility, like being able to project functions before applying them.
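
The canonical example behind the currying question is the isomorphism between (a, b) -> c and a -> b -> c, witnessed in Haskell by two standard functions (shown here only to fix intuition; Polymorphic System I instead equates the two types outright):

    curry'   :: ((a, b) -> c) -> (a -> b -> c)
    curry' f x y = f (x, y)

    uncurry' :: (a -> b -> c) -> ((a, b) -> c)
    uncurry' g (x, y) = g x y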


Track II (Room A)

Linear logic with fixpoints from a denotational semantic point of view

Farzad Jafarrahmani (IRIF, Paris)

Abstract: We provide two denotational semantics of full propositional classical linear logic extended with least and greatest fixpoints of formulae, in the relational model (REL) and in non-uniform totality spaces (NUTS). The two models treat the least and greatest fixed-point operators (μ and ν, respectively) in different ways: μ and ν are interpreted in the same way in REL, while their interpretations differ in NUTS, even though the proofs are interpreted by the same sets in both models. Finally, in the case of infinite proofs, we show that the interpretation of a valid proof is a total element.

This is based on joint work with Thomas Ehrhard and Alexis Saurin.


Paraconsistent probabilistic logics and their proof theory

Daniil Kozhemiachenko (INSA CVL, LIFO Université Orléans)

Abstract: We present paraconsistent bi-lattice extensions of the Łukasiewicz logic and their application to probabilistic reasoning. We consider their semantical properties and their connections to possible reasoning strategies. We also provide their proof theory in the form of constraint tableaux and discuss an attempt at constructing a display calculus.


From Lindenbaum-Tarski Algebras to Classifying Toposes

Joshua Liam Wrigley (University of Insubria)

Abstract: In this presentation we motivate the study of classifying toposes as the first-order analogue of the Lindenbaum-Tarski algebra of a propositional theory. A classifying topos classifies the models of a geometric theory in any cocomplete topos (such as the topos of sets), in the sense that each model corresponds naturally to a colimit- and finite-limit-preserving functor between the two toposes.

I will focus on the construction of the syntactic site for the classifying topos of a geometric theory and, more generally, on how a Grothendieck topology on any category can be considered as an inference relation on that category, mirroring the relationship between Lindenbaum-Tarski algebras, partially ordered sets, and frames. Reference will then be made to how Morita-equivalences of classifying toposes correspond to (often non-trivial) semantic equivalences of geometric theories.
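
In symbols, the classifying property sketched above can be stated as a natural equivalence (a standard formulation, in LaTeX, writing \mathcal{E}_{\mathbb{T}} for the classifying topos of a geometric theory \mathbb{T} and \mathcal{E} for any cocomplete topos):

    \mathbb{T}\text{-}\mathbf{Mod}(\mathcal{E}) \;\simeq\; \mathbf{Cocont}_{\mathrm{lex}}(\mathcal{E}_{\mathbb{T}}, \mathcal{E})

where the right-hand side consists of exactly the colimit- and finite-limit-preserving functors mentioned above.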


From computation to a reconstruction of (linear) logic

Boris Eng (LIPN Université Sorbonne Paris Nord, France)

Abstract: The study of logic usually begins with a presupposed definition of logic such as a notion of "rules of reasoning". As an alternative foundation for logic, Girard's Transcendental Syntax project sketches rough ideas suggesting a reconstruction of linear logic from computation, considered neutral and objective. One goal of my thesis is to give a technical meaning to this project. In this short presentation, I briefly describe how the computational behaviour of a model of computation I call "Stellar Resolution" can speak about the proofs of linear logic and the idea of logical correctness.


Some axioms of Set Theory through the Löwenheim-Skolem Theorem

José Javier González López (University of Salamanca)

Abstract: The different versions of the Löwenheim-Skolem Theorem are deeply related to several axioms of Set Theory, precisely some of those which are independent of ZF, such as the Axiom of Choice and the Axiom of Dependent Choices. We are going to see how the Generalized Löwenheim-Skolem Theorems are equivalent to the Axiom of Choice, and how some versions of the original theorem are necessary conditions of the Axiom of Dependent Choices. These connections show how rigidly these axioms delimit the set theory of the domains of the structures of first-order and second-order logic.
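
For reference, one standard formulation (in LaTeX) of the original downward theorem, for a countable first-order signature, reads:

    \text{every infinite structure } \mathfrak{A} \text{ has a countable elementary substructure } \mathfrak{B} \preceq \mathfrak{A}

and it is the strength of statements of this kind that is measured against the Axiom of Dependent Choices.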


Subunits and local states

Nuiok Dicaire (University of Edinburgh)

Abstract: A brief overview of how the categorical notion of a subunit of the tensor unit in a monoidal category can be used to provide a notion of localisation of monads.


Track III (Room B)

Measure inference and inductive lemma discovery for program termination

Rodrigo Raya (EPFL)

Abstract: Proving program termination is at the heart of axiomatic verification. The classical methodology uses a measure, or ranking function, to embed program data in a well-founded domain. However, termination is an undecidable property, and no complete procedure can exist that synthesizes all such measures. In this talk, I will show that several program analyses derived from the dependency pair framework can be leveraged to produce measure functions. I will also show that, in the higher-order case, these measures no longer suffice, and I will present alternative analyses which can be seen as a process of inductive lemma discovery. Based on these observations, we developed a standalone system that infers measures and inductive lemmas. The output of the system can be easily checked by an external automatic theorem prover.
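
To fix ideas, here is a classical-style example in Haskell (my own illustration, not the system's output): the measure \(xs, ys) -> length xs + length ys embeds the arguments in the well-founded domain of natural numbers and strictly decreases on every recursive call, which proves termination.

    merge :: [Int] -> [Int] -> [Int]
    merge [] ys = ys
    merge xs [] = xs
    merge (x:xs) (y:ys)
      | x <= y    = x : merge xs (y:ys)   -- measure decreases: |xs| shrinks
      | otherwise = y : merge (x:xs) ys   -- measure decreases: |ys| shrinks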

This is joint work with Prof. Viktor Kunčak and Dr. Nicolas Voirol.


Guarded Kleene Algebra with Tests: Coequations, Coinduction, and Completeness

Todd Schmid (University College London)

Abstract: Guarded Kleene Algebra with Tests (GKAT) is the fragment of Dexter Kozen's Kleene Algebra with Tests consisting of propositional WHILE programs. GKAT was first described in 2008, but its formal introduction had to wait until POPL 2020, where a sound and complete axiomatization of language equivalence and a nearly linear-time algorithm for deciding program equivalence were given. In our paper, we show that omitting a single axiom from the POPL axiomatization yields an axiomatization that is sound and complete with respect to behavioral equivalence, and along the way we characterize the behaviors exhibited by GKAT programs. This characterization allows us to settle an open problem. Finally, we show how our completeness theorem can be used to recover the completeness theorem of the POPL paper.
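
As a reminder of the object of study, GKAT's syntax is small enough to sketch as a Haskell data type (constructor names are illustrative): programs are built from primitive actions and tests using sequencing, guarded choice, and guarded iteration, i.e., exactly the connectives of propositional WHILE programs.

    data Test = TT | FF | TPrim String
              | TNot Test | TAnd Test Test | TOr Test Test

    data GKAT
      = Act String          -- primitive action p
      | Assert Test         -- test b, as a program
      | If Test GKAT GKAT   -- e +_b f   (if b then e else f)
      | Seq GKAT GKAT       -- e ; f
      | While Test GKAT     -- e^(b)     (while b do e)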

This is joint work with Tobias Kappé, Dexter Kozen and Alexandra Silva.


A Neural Compiler

Joey Velez-Ginorio (MIT, USA)

Abstract: Programming languages are our best descriptors of computation. They help us interpret, control, and construct machine behavior. Insofar as the brain computes, its best descriptor will also be a programming language: one which helps us interpret, control, and construct neural behavior. My goal is to find and build that language. Currently, I see promise in strange and interesting programming language semantics. For example, there exist categorical semantics of lambda calculi in vector spaces. We can systematically derive compilers from these semantics, and these compilers yield programming languages whose programs compile to and from neural networks. This work is ongoing, and in this talk I detail the general approach and strategy.


Normative Reasoning in Isabelle/HOL

Ali Farjami (University of Luxembourg)

Abstract: Deontic logic is expressive enough to represent ethical and legal reasoning tasks. Legal and ethical reasoning have special requirements due to their normative and conflict-resolution roles. On the one hand, we have the family of traditional deontic logics, which includes standard deontic logic (SDL), a modal logic of type KD, and dyadic deontic logic (DDL). On the other hand, we have the so-called norm-based deontic logics, whose deontic operators are evaluated not with reference to a set of possible worlds but with reference to a set of norms. A particular framework that falls within this category is input/output (I/O) logic. We have shown the faithfulness of logical embeddings from both families into higher-order logic, based on the shallow semantical embedding approach. We encoded these logical translations in Isabelle/HOL, which turns this system into a proof assistant for deontic logic reasoning. Experiments with this environment provide evidence that these logical implementations fruitfully enable interactive and automated reasoning at both the meta-level and the object-level; this line of work has developed into the LogiKEy framework. In this talk, I will present ongoing work on the study of these embeddings and their applications for legal and ethical computerized applications.
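
As a toy illustration of the shallow-embedding idea (in Haskell rather than Isabelle/HOL, and purely schematic: World, worlds, and acc are assumptions of this sketch, not part of the talk's development), the SDL obligation operator becomes quantification over accessible worlds:

    type World = Int
    type Prop  = World -> Bool

    worlds :: [World]             -- assumed finite domain of worlds
    worlds = [0 .. 3]

    acc :: World -> World -> Bool -- assumed accessibility (serial, for KD)
    acc _ _ = True

    -- Obligation: 'ob p' holds at w iff p holds at every world accessible
    -- from w; meta-level reasoning then happens in the host language.
    ob :: Prop -> Prop
    ob p w = all p [v | v <- worlds, acc w v]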


The interplay of data and channel imperfections in verifying termination of declarative distributed systems

Francesco Di Cosmo (Free University of Bozen-Bolzano)

Abstract: Motivated by the areas of Declarative Networking and Business Process Analysis, declarative distributed systems (DDSs) are a model of data-aware distributed computing grounded in logic programming. The behavior of DDS nodes is specified by means of D2C, a Datalog-like logic programming language that handles various data sources (input, state, and messages), modeled as relational databases or database snippets. We aim at a rigorous and comprehensive study of DDS verification, focusing on the interplay of data and network features, e.g., channel imperfections. While DDS model-checking is undecidable in general, decidability holds for DDSs that enforce bounded data sources and channel sizes: during each computation step, the source database (respectively, the channel) mentions at most b constants (respectively, contains at most b messages), where b is a fixed integer, while an unbounded amount of data can still flow through the system along runs. Nevertheless, we argue that by tweaking the expressive power of messages it is possible to retain decidability even when dropping channel-boundedness. Specifically, we are working to prove that, in some cases, termination of bounded DDSs with messages ranging over a propositional signature amounts to reachability in communicating finite state machines (CFSMs), which is decidable as soon as the channels are not perfect (lossy or disordered). We also plan to extend these results to the case of DDSs with messages over a unary signature and CFSMs with infinite data, whose decidability requires disordered channels. Finally, we conjecture that DDSs with messages over a binary signature are powerful enough to suffer from undecidability even over lossy and disordered channels.


Asynchronous games on Petri nets

Federica Adobbati (University of Milano-Bicocca)

Abstract: The interaction of autonomous agents on a concurrent system can be modelled as a game in which each agent is a player that pursues a specific goal on the system. We model the system as a Petri net, and we investigate when an agent has a winning strategy, i.e., when the agent can guarantee a desired behaviour of the system no matter how the other players behave. We consider the case in which the goal of the agent is specified in a temporal logic, and we discuss two different approaches for determining whether the agent has a winning strategy, focussing on the problems that arise when dealing with concurrent systems.