ER01: Randomized Algorithms (7-11 December)

Dates : 7-11 December

Teachers: Joel Ouaknine, Ben Worrell, and Stefan Kiefel (Oxford).
Local contact: Pascal Koiran

Title: Probabilistic Techniques and Models in Computer Science

The schedule:
Monday: 9:30-11:30 am and 1:30-3:30 pm
Tuesday: 9-11:30 am, 1:30-3:30 pm and 4-6 pm
Wednesday: 9-11:30 am and 1:30-3:30 pm
Thursday: 9 am-12:30 pm
Friday: 9 am-12:30 pm; afternoon exam: 2-4 pm

Synopsis (more information here):

— Decision Problems

* Space-bounded interactive protocols

* Reachability and threshold problems for Markov chains

* Connections with number theory

— Stochastic Processes

* Markov-chain Monte Carlo techniques, Coupling

* Martingales, Optional Stopping Theorem, Azuma’s inequality and
applications, Lyapunov functions

* Equivalence of Markov chains, Markov decision processes

* Distance between Markov chains

* Analysis of infinite-state Markov chains

— Data Structures and Algorithms

* Luby’s algorithm

* Count-min filters

* Random rounding, packet routing

— Learning Theory

* Rademacher complexity, VC dimension

* Johnson-Lindenstrauss Lemma

ER02: Data Mining : Statistical Modeling and Learning from Data (11-15 January)

Dates: 11-15 January 2016

Teachers: Ciro Cattuto, Laetitia Gauvin, and André Panisson (ISI Torino)

Local contact: Márton Karsai (marton.karsai@ens-lyon.fr)

Venue: ENS Lyon, site Monod, Amphi B (entrance from the 4th floor)

Time: 9:30 – 16:45

External participants who have no access to the building should contact Márton Karsai (marton.karsai@ens-lyon.fr) in advance.

The main page of the course can be found here.

The course aims to provide basic skills for the analysis and statistical modeling of data, with special attention to machine learning, both supervised and unsupervised. An important objective of the course is operational command of the techniques and algorithms covered: the lectures address both theoretical and practical aspects of machine learning, and the practical part requires good programming skills, preferably in Python. The expected outcomes are (1) an understanding of the theoretical foundations of machine learning and (2) the ability to use Python machine-learning libraries in simple applications.

Topics will include:

– The major paradigms of learning from data, the learning problem, the feasibility of learning
– The architecture of machine learning algorithms: model structure, scoring, and model selection
– The theory of generalization, model complexity, the approximation-generalization tradeoff, bias and variance, the learning curve
– Score functions and optimization techniques. Gradient descent and stochastic gradient descent.
– Validation and Cross-Validation: validation set, leave-one-out cross validation, K-fold cross-validation
– Linear Models: linear classification, linear regression, ordinary least squares, logistic regression, non-linear transformations
– Non-linear models for classification: support vector machines, tree models, nearest-neighbor methods, Naive Bayes
– Overfitting and Regularization: model complexity and overfitting, commonly used regularizers, Lasso.
– Unsupervised learning: cluster analysis, the K-means algorithm, hierarchical clustering
– Feature selection and dimensionality reduction: Singular Value Decomposition, Matrix Factorisation
– Information retrieval, text representation and classification, term weighting
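Several of the topics above (score functions, gradient descent, linear models) lend themselves to a short illustration in Python, the course's language of choice. The sketch below is not course material; the toy dataset and all names are invented for illustration. It fits ordinary least squares by batch gradient descent:

```python
import numpy as np

def fit_linear_gd(X, y, lr=0.1, n_iters=500):
    """Fit y ~ X @ w + b by minimizing mean squared error with gradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iters):
        resid = X @ w + b - y              # current prediction error
        w -= lr * (2 / n) * (X.T @ resid)  # gradient of the MSE w.r.t. w
        b -= lr * (2 / n) * resid.sum()    # gradient of the MSE w.r.t. b
    return w, b

# Toy data: y = 3*x + 1 plus a little noise (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + 1 + 0.01 * rng.normal(size=100)
w, b = fit_linear_gd(X, y)
print(w, b)   # w close to [3.], b close to 1.0
```

Stochastic gradient descent replaces the full-batch gradient with the gradient on a random mini-batch; Scikit-Learn's SGDRegressor implements the same idea at scale.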

An overview of the theoretical aspects of machine learning will be followed by the application of the algorithms to real problems such as image classification, text mining, and spam detection. The exercises will be implemented in an interactive Python environment, using standard tools for data analysis and visualization such as the Scientific Python stack, Scikit-Learn, Pandas, and NLTK.

Evaluation: personal projects with oral presentation

ER01: Data Structures for Big Data

Dates: January 12-16, 2015.

Teachers: Michael Bender (Stony Brook University), Martín Farach-Colton (Rutgers and Tokutek), Samuel McCauley (Stony Brook University)

Description of the school

Course Hours

Monday-Thursday:

  • morning lecture: 9:30am-11:30am

  • afternoon lecture: 1:30pm-3:30pm.

  • recitation/homework practice: 4pm-5pm.

Friday:

  • morning lecture: 9:30am-11:30am

  • afternoon exam:  2pm-5pm

Local contact: Frédéric Vivien.

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before … (to be decided later) by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 1”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address: 

wishes to attend the research school “ER01: Data Structures for Big Data”, taking place at ENS Lyon, from Jan. 12 to Jan. 16, 2015.

ER02: Algorithms and Heuristics for Large-scale Data Sets

Dates: January 19-23, 2015.

Dedicated webpage

Teachers: Nicolas Schabanel (Université Paris-Diderot), Alain Barrat, and Bruno Gonçalves (Université Aix-Marseille)

Local contact: Marton Karsai

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before … (to be decided later) by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 2”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address: 

wishes to attend the research school “ER02: Algorithms and Heuristics for Large-scale Data Sets”, taking place at ENS Lyon, from Jan. 19 to Jan. 23, 2015.

ER03: Static Analysis and Compilation

Dates: January 26-30, 2015.

Dedicated webpage

Teacher: Fernando Magno Pereira (Univ. Minas Gerais, Brazil)

Local contact: Laure Gonnord

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before … (to be decided later) by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 3”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address: 

wishes to attend the research school “ER03: Static Analysis and Compilation”, taking place at ENS Lyon, from Jan. 26 to Jan. 30, 2015.

ER01: Algorithmic Game Theory

Dates: December 9-13, 2013.

School outline and timetable.

Local contact: Natacha Portier.

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before November 29, 2013 by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 1”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address:

wishes to attend the research school “Algorithmic Game Theory”, taking place at ENS Lyon, from Dec. 9 to Dec. 13, 2013.

ER02: Synchronous Approaches for Embedded Systems

Dates: January 13-17, 2014.

Web site.

Local contact: Laure Gonnord.

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before January 3, 2014 by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 2”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address:

wishes to attend the research school “Synchronous Approaches for Embedded Systems”, taking place at ENS Lyon, from Jan. 13 to Jan. 17, 2014.

ER03: Logic of Dynamical Systems

Speakers: André Platzer and Sarah Loos.

Dates: January 20-24, 2014.

Web page.

Local contact: Filippo Bonchi

Registration

Registration is free, but the number of participants is limited. Registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before January 10, 2014 by clicking on this link, filling the form and sending the e-mail message. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the form below, then send it by e-mail to nicole.meftah@ens-lyon.fr with the subject line “Registration form — research school 3”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address:

wishes to attend the research school “Logic of Dynamical Systems”, taking place at ENS Lyon, from Jan. 20 to Jan. 24, 2014.

ER03: Constraint satisfaction problems

January 21-25, 2013, ENS Lyon.

Speakers: Manuel Bodirsky, Michael Pinsker.

A Constraint Satisfaction Problem (CSP) is a computational problem in which the task is to decide, for a given finite set of variables and constraints, whether there is an assignment to the variables that satisfies all constraints. Many computational problems in various areas of theoretical computer science can be modeled in this way.

If the variables take their values in some finite domain, there is a rich universal-algebraic theory that describes those constraint languages whose CSP is NP-hard, and those whose CSP is in P. In this course, we present a powerful generalization of this approach to CSPs where the underlying domain is infinite.
After introducing the necessary background in model theory, the course will also cover recent applications of structural Ramsey theory that have led to complexity classification results for large classes of CSPs.
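For the finite-domain case, the decision problem described above can be made concrete in a few lines. The following backtracking search is an illustrative sketch, not part of the course; the graph-colouring instance and all names are ours:

```python
def csp_satisfiable(variables, domains, constraints):
    """Decide whether some assignment of domain values to variables
    satisfies every constraint.  A constraint is a pair (scope, predicate)
    and is checked as soon as all variables in its scope are assigned."""
    def backtrack(assignment):
        if len(assignment) == len(variables):
            return True
        var = variables[len(assignment)]
        for value in domains[var]:
            assignment[var] = value
            consistent = all(
                pred(*(assignment[v] for v in scope))
                for scope, pred in constraints
                if all(v in assignment for v in scope))
            if consistent and backtrack(assignment):
                return True
            del assignment[var]
        return False
    return backtrack({})

# Graph 3-colourability as a CSP: a triangle plus a pendant vertex
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
variables = ["a", "b", "c", "d"]
domains = {v: [0, 1, 2] for v in variables}
constraints = [((u, v), lambda x, y: x != y) for u, v in edges]
print(csp_satisfiable(variables, domains, constraints))  # → True
```

On finite domains this brute force is exponential in the worst case; the algebraic theory discussed in the course classifies which constraint languages admit polynomial-time algorithms instead.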

Outline of the course

Monday: Examples of CSPs from Artificial Intelligence. Primitive positive formulas and NP-hardness proofs. Some efficient algorithms. The Feder-Vardi conjecture.

Tuesday: The universal-algebraic approach. The Inv-Pol Galois correspondence. The core of a structure. The tractability conjecture. Cyclic operations. Exercises.

Wednesday: CSPs on infinite domains. Fraïssé limits, the Ryll-Nardzewski theorem. The Inv-Pol correspondence on infinite domains. The topology of pointwise convergence. Henson digraphs and other examples.

Thursday: Introduction to Ramsey theory. Canonical functions. The classification of Graph-SAT problems.

Friday: Open problems, proposals of research projects. Exam and discussion.

Local contact: Pascal Koiran

Registration

Registration is free, but the number of participants is limited, and registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before Monday, December 3, by clicking on this link and filling in and sending the e-mail form. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the following form and send it by e-mail to research.school.1@gmail.com, with subject “Registration form — research school 3”


First Name:
Last Name:
Institution:
Position (MSc student, PhD student, researcher, etc.):
E-mail address:

wishes to attend the research school ‘Constraint satisfaction problems’, taking place at ENS Lyon, from Jan. 21 to Jan. 25.

ER02: Semantics and tools for low-level concurrent programming

January 14-18, 2013, ENS Lyon.

Speakers

Francesco Zappa Nardelli (INRIA), Mark Batty (University of Cambridge), Alastair Donaldson (Imperial College London), and Martin Vechev (ETH Zurich)

Slides

Available from here

Abstract

Optimisations performed by the hardware or by high-level language compilers can reorder memory accesses in various subtle ways. These reorderings can sometimes be observed by concurrent programs, exacerbating the usual problems of concurrent programming. The situation is even worse when we abandon traditional shared memory concurrency in favor of graphics processing units (GPUs) or exotic hardware. In these lectures we will cover recent progress in understanding and specifying the behaviour of some major processor architectures and the impact of optimisations performed by high-level programming language compilers, establishing a solid basis for shared memory programming. We will then explore techniques which can help programmers develop reliable concurrent software by automatically discovering the necessary synchronization. We will also discuss techniques for verification of multicore software, including GPU kernels.

Schedule:

  • Monday:
    • 10:30-12:30: Welcome + introduction
    • 14:00: Batty: x86TSO
    • 15:30-16:30: tutorial: x86TSO
  • Tuesday:
    • 9:00-10:30: Batty + Zappa: ARM & Power
    • 11:00-12:30: Zappa: Compiler Optimisations + DRF
    • 14:00-15:00: tutorial: ARM & Power
    • 15:30-16:30: tutorial: Compiler Optimisations + DRF
  • Wednesday:
    • 9:00-10:30: Batty: C11
    • 11:00-12:30: Vechev (1)
    • 14:00-15:00: tutorial: C11
    • 15:30-16:30: tutorial: Vechev
  • Thursday:
    • 9:00-10:30: Vechev (2)
    • 11:00-12:30: Donaldson (1)
  • Friday:
    • 9:00-10:30: Donaldson (2)
    • 11:00-12:30: tutorial: Donaldson
    • 14:00-15:30: Mini-exam

For the practical sessions

Please bring your laptop with you. If you do not have one, please contact Filippo Bonchi (ens-lyon.fr).

Contact: Filippo Bonchi.

Registration

Registration is free, but the number of participants is limited, and registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before Monday, January 8, by clicking on this link and filling in and sending the e-mail form. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the following form and send it by e-mail to research.school.1@gmail.com, with subject “Registration form — research school 2”

First Name:

Last Name:

Institution:

Position (MSc student, PhD student, researcher, etc.):

E-mail address:

wishes to attend the research school ‘Semantics and tools for low-level concurrent programming’, taking place at ENS Lyon, from Jan. 14 to Jan. 18.

ER01: Fault tolerance for high performance computing

George Bosilca and Thomas Hérault,

University of Tennessee Knoxville

December 10-14, 2012, ENS Lyon.

During this course, we will discuss fault tolerance techniques for high performance computing systems.

Motivation.

In June 2008, LANL's Road Runner computing system was the first to cross the milestone of one Petaflop on the Linpack benchmark (reaching $1.026 \cdot 10^{15}$ double-precision floating-point operations per second). Today, only 4 years later, the fastest computing system, the Sequoia BlueGene/Q at LLNL, sustains 16.324 Petaflops. Continuing on the same path, computing systems are expected to reach the 1 Exaflop milestone ($10^{18}$ flops) as early as 2022. Since the multicore revolution of the 2000s, such performance has been achievable only by multiplying the number of computing components (cores) in the parallel machine: while the 2008 Road Runner machine had 122,400 cores distributed over 20,000 heterogeneous nodes, today's Sequoia system has no fewer than 1,572,864 cores distributed over 100,000 nodes. The drawback of this exponential increase in the number of components is a disruptive decrease in the reliability of the entire system. Indeed, the Mean Time To Interruption (MTTI), defined as the average duration for which all the components of the platform behave as specified, is inversely proportional to the number of components in the platform. Whereas in 2009 the MTTI of the largest supercomputers was estimated at close to 4 days, it dropped below a single day (19h04min) in 2011, and is projected to continue its descent in the following years (less than 4 hours by 2015, less than 2 hours by 2018).

While technological breakthroughs will hopefully keep the MTTI in a tolerable range, long-running applications need a more flexible execution environment, one that allows them to survive for longer periods of time, either algorithmically or by taking advantage of specialized capabilities of the programming model. Unfortunately, current programming paradigms do not account for such hardware instability: the failure of a single component (be it a memory or computing unit, a hard drive or a network card) has a drastic and lasting effect on the computation. Even without accounting for the implicit energy costs, given the expected MTTI of next-generation computing platforms, such approaches cannot reasonably lead to sustainable software solutions.
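The inverse-proportionality claim is easy to make concrete: under the standard assumption of independent components with exponentially distributed failure times, the minimum of N component lifetimes is again exponential, with N times the failure rate. The figures below are illustrative, not the actual Sequoia data:

```python
def system_mtti_hours(component_mttf_hours, n_components):
    """MTTI of a system of n independent components, each failing after an
    exponentially distributed time: the minimum of n exponentials is
    exponential with n times the rate, hence 1/n of the mean."""
    return component_mttf_hours / n_components

# Component MTTF of 50 years, expressed in hours (illustrative figure)
mttf = 50 * 365 * 24
for n in (20_000, 100_000, 1_000_000):
    print(f"{n:>9} components -> MTTI = {system_mtti_hours(mttf, n):.2f} h")
```

At a 50-year component MTTF, 100,000 components already bring the system MTTI down to about 4.4 hours, matching the order of magnitude of the trend described above.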

Overview.

This course will highlight existing solutions and present parallel algorithms designed to provide correct results despite the presence of failures. We will show how the programming environment can be modified to increase system reliability and sustainability. The course will start with an introduction to high-performance programming, its challenges, and typical algorithms. We will then present theoretical as well as practical studies of the many fault-tolerant approaches that have been proposed over the years.

Prerequisites.

The course assumes a general knowledge of algorithms and operating systems, as well as a basic understanding of machine architecture and network systems.

Update: bring your laptop! Some hands-on exercises will be proposed during the course. Attendees are expected to bring a laptop able to establish a connection to a distant machine (typically via ssh): please write to the local contact (Yves Robert) if this is not possible for you; we should be able to lend a few laptops to participants.

Local contact: Yves Robert.

Registration

Registration is free, but the number of participants is limited, and registration includes neither housing nor meals (though for lunch the attendees will be granted access to the student cafeteria). Registration should be made before Monday, December 3, by clicking on this link and filling in and sending the e-mail form. You will receive a confirmation as soon as possible.

Alternatively, you can copy/paste and fill the following form and send it by e-mail to research.school.1@gmail.com, with subject “Registration form — research school 1”

 

First Name:

Last Name:

Institution:

Position (MSc student, PhD student, researcher, etc.):

E-mail address:

wishes to attend the research school ‘Fault tolerance for high performance computing’, taking place at ENS Lyon, from Dec. 10 to Dec. 14.

ER02: Compressive Sensing

Lecturer: Justin Romberg (Georgia Institute of Technology, Atlanta)

Outline of the school:
“The dogma of signal processing maintains that a signal must be sampled at a rate at least twice its highest frequency in order to be represented without error. However, in practice, we often compress the data soon after sensing, trading off signal representation complexity (bits) for some error (consider JPEG image compression in digital cameras, for example). Clearly, this is wasteful of valuable sensing resources. Over the past few years, a new theory of “compressive sensing” has begun to emerge, in which the signal is sampled (and simultaneously compressed) at a greatly reduced rate.
Compressive sensing is also referred to in the literature by the terms: compressed sensing, compressive sampling, and sketching/heavy-hitters.” [by courtesy of: http://dsp.rice.edu/cs]

  • Basis decompositions and frames (3-4 hours)
    • fundamentals of basis and frame decompositions
    • the discrete cosine transform and applications to image/video compression
    • the lapped orthogonal transform
    • wavelets
    • thresholding for noise reduction
  • Sparsest decomposition from a dictionary (3-4 hours)
    • OMP and basis pursuit for selecting atoms
    • uncertainty principles and sparsest decomposition
    • the “spikes+sines” dictionary
    • general unions of orthobases
  • Introduction to compressive sampling and applications (2 hours)
  • Recovering sparse vectors from linear measurements (6 hours / 1 day)
    • review of classical least-squares theory: the SVD, pseudo-inverse, stability analysis, regularization
    • sparse recovery conditions: l1 duality, sparsest decomposition revisited (with random support)
    • the restricted isometry property and sparse recovery: l1 for perfect recovery from noise-free measurements, l1 stability, l2 stability
  • Random matrices are restricted isometries (2 hours)
  • Optimization (6 hours / 1 day)
    • conjugate gradients
    • log-barrier methods
    • first-order l1 solvers
    • greedy algorithms and iterative thresholding
    • recursive least-squares
    • the Kalman filter
    • dynamic l1 updating
  • Low-rank recovery (2 hours)
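As a small illustration of the sparse-recovery ideas in the outline, here is a bare-bones orthogonal matching pursuit (the OMP of the outline) in NumPy. It is a sketch under idealized assumptions (known sparsity, noise-free random Gaussian measurements), not a reference implementation:

```python
import numpy as np

def omp(A, y, k):
    """Greedy recovery of a k-sparse x from y = A @ x: pick the column most
    correlated with the residual, then re-fit on the chosen support by
    least squares (which orthogonalizes the residual against it)."""
    support, resid = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))   # best-matching atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef          # orthogonalized residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# A 3-sparse signal measured by a normalized random Gaussian matrix
rng = np.random.default_rng(1)
A = rng.normal(size=(64, 128)) / np.sqrt(64)
x_true = np.zeros(128)
x_true[[5, 40, 90]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k=3)
```

With 64 random measurements of a 3-sparse length-128 vector, exact recovery is expected with overwhelming probability; the school covers the conditions (such as the restricted isometry property) under which such guarantees hold.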

Dates: January 9-13, 2012.

Local contact: Paulo Goncalves