M2 2015-2016


The goal of this Master program is to offer a wide choice of high-quality courses in computer science, ranging from the most theoretical aspects to applications. It is open to students who have completed a 4th-year degree in Science (Master 1) and wish to deepen their understanding of Computer Science. The typical year starts with courses during the fall term, followed by several week-long (24-hour) winter schools, and concludes with a 20-week research internship. Courses and materials are provided in English to encourage foreign students to join our program. Academic tutoring is provided to every student throughout the year, for finding internships, choosing courses, and general guidance.

  • List of courses (for a full description, follow the CRxx link):

CR01 Advanced Cryptographic Primitives, Damien Stehlé and Benoit Libert.

CR02 Resilient and Energy-Aware Scheduling Algorithms, Anne Benoit.

CR03 Network Algorithms for Molecular Biology, Marie-France Sagot.

CR04 Quantum Information and Computation, Pascal Degiovanni, Omar Fawzi and Natacha Portier.

CR05 Tilings: between Dynamical Systems and Computability, Nathalie Aubrun and Mathieu Sablik.

CR06 Algorithmic Number Theory, Guillaume Hanrot.

CR07 Computer Science and Privacy, Benoit Libert and Frédéric Prost.

CR08 Arithmetic Circuit Complexity, Pascal Koiran and Natacha Portier.

CR09 Distributed Computing: Models and Challenges, Eddy Caron, Gilles Fedak, Christian Perez and Laurent Lefevre.

CR10 Program Analysis, Safety Program Verification, Laure Gonnord and David Monniaux.

CR11 Rule-based Modeling of Biochemical Systems, Russ Harmer.

CR12 Coinductive Methods in Computer Science, Filippo Bonchi, Daniel Hirschkoff and Damien Pous.

CR13 Implicit Computational Complexity, Patrick Baillot and Olivier Laurent.

CR14 Finite Automata in Number Theory, Boris Adamczewski.

CR15 Complex Networks, Christophe Crespelle and Marton Karsai.

CR16 Signal Processing and Networks, Pierre Borgnat, Jean-Christophe Pesquet and Nelly Pustelnik.

CR17 Probabilistic Methods, with Applications to Graphs, Louis Esperet and Stéphan Thomassé.

CR18 Advanced Compilers: Loop Transformations and High-Level Synthesis, Tomofumi Yuki and Christophe Alias.

CR19 Fundamental Algorithms in Real Algebraic Geometry, Mohab Safey El Din and Jean-Charles Faugère.

  • Winter schools: here
  • Pre-course meeting: a mandatory pre-course meeting will take place on September 11 at 9am in Amphi B. The general organisation of the year and a description of the courses will be presented.
  • Schedule: Courses start on September 14. Autumn holidays are October 26-30; winter holidays are December 21-31. Exams will be held on January 4-8, 2016. The schedule will be released soon.
  • Validation: To obtain their degree, CS Master students must complete 60 credits, including the internship (30 credits), three winter schools (2 credits each) and at least four courses (4 credits each) from the list above. A typical choice is 6 courses, 3 schools and the internship; the extra courses can be chosen from the CS courses above or from other departments. To meet the quality requirements of our program, course choices must be approved by the academic tutor and the head of the Master 2 program. Administrative registration is mandatory.
  • Complex System program.

    The “Complex Networks” M2 master program organised by IXXI/ENS Lyon provides innovative training for students interested in interdisciplinary research on complex networks and the modeling of complex systems. The program maintains a balance between disciplines by combining courses from biology, computer science, mathematics, physics and sociology. Students from all of these disciplines are welcome; each student follows both the “Complex Networks” program and part of their own M2: Computer Science, Physics, Bioscience or Mathematics. Follow this link for further details.

ER01: Randomized Algorithms (7-11 December)

Dates: 7-11 December

Teachers: Joel Ouaknine, Ben Worrell and Stefan Kiefer (Oxford).
Local contact: Pascal Koiran

Title: Probabilistic Techniques and Models in Computer Science

The schedule:
Monday: 9:30-11:30 am and 1:30-3:30 pm
Tuesday: 9-11:30 am, 1:30-3:30 pm and 4-6 pm
Wednesday: 9-11:30 am and 1:30-3:30 pm
Thursday: 9 am-12:30 pm
Friday: 9 am-12:30 pm, with the exam in the afternoon, 2-4 pm

Synopsis (more information here):

— Decision Problems

* Space-bounded interactive protocols

* Reachability and threshold problems for Markov chains

* Connections with number theory

— Stochastic Processes

* Markov-chain Monte Carlo techniques, Coupling

* Martingales, Optional Stopping Theorem, Azuma’s inequality and
applications, Lyapunov functions

* Equivalence of Markov chains, Markov decision processes

* Distance between Markov chains

* Analysis of infinite-state Markov chains

— Data Structures and Algorithms

* Luby’s algorithm

* Count-min filters

* Random rounding, packet routing

— Learning Theory

* Rademacher complexity, VC dimension

* Johnson-Lindenstrauss Lemma (illustrated by the sketch below)
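
As a small illustration of the last topic (a sketch added here, not taken from the course material), the following Python snippet uses a Gaussian random projection to check empirically that pairwise distances are roughly preserved, in the spirit of the Johnson-Lindenstrauss Lemma; the dimensions and sample sizes are arbitrary.

    import numpy as np

    # Illustrative sketch (not course material): empirical check of the
    # Johnson-Lindenstrauss Lemma using a Gaussian random projection.
    rng = np.random.default_rng(0)

    n, d, k = 50, 2000, 1000            # points, original dim, target dim (arbitrary)
    X = rng.normal(size=(n, d))         # n random points in R^d

    # Projection matrix scaled so squared norms are preserved in expectation.
    P = rng.normal(size=(d, k)) / np.sqrt(k)
    Y = X @ P                           # projected points in R^k

    def pairwise_dists(Z):
        """All pairwise Euclidean distances between the rows of Z."""
        diffs = Z[:, None, :] - Z[None, :, :]
        return np.sqrt((diffs ** 2).sum(axis=-1))

    iu = np.triu_indices(n, k=1)        # distinct pairs only
    ratios = pairwise_dists(Y)[iu] / pairwise_dists(X)[iu]
    print(f"distance ratios: min={ratios.min():.3f}, max={ratios.max():.3f}")
    # The ratios concentrate around 1: distances are approximately preserved.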

ER02: Data Mining: Statistical Modeling and Learning from Data (11-15 January)

Dates: 11-15 January 2016

Teachers: Ciro Cattuto, Laetitia Gauvin and André Panisson (ISI Torino)

Local contact: Márton Karsai (marton.karsai@ens-lyon.fr)

Venue: ENS Lyon, site Monod, Amphi B (entrance from the 4th floor)

Time: 9:30 – 16:45

External participants who have no access to the building should contact Márton Karsai (marton.karsai@ens-lyon.fr) in advance.

The main page of the course can be found here.

The course aims to provide basic skills for the analysis and statistical modeling of data, with special attention to machine learning, both supervised and unsupervised. An important objective is operational knowledge of the techniques and algorithms covered; to this end, the lectures address both theoretical and practical aspects of machine learning. For the practical part, a good knowledge of programming is required, preferably in Python. The expected outcomes include (1) understanding the theoretical foundations of machine learning and (2) the ability to use Python machine-learning libraries in simple applications.

Topics will include:

– The major paradigms of learning from data, the learning problem, the feasibility of learning
– The architecture of machine learning algorithms: model structure, scoring, and model selection
– The theory of generalization, model complexity, the approximation-generalization tradeoff, bias and variance, the learning curve
– Score functions and optimization techniques: gradient descent and stochastic gradient descent
– Validation and cross-validation: validation set, leave-one-out cross-validation, K-fold cross-validation (see the sketch after this list)
– Linear models: linear classification, linear regression, ordinary least squares, logistic regression, non-linear transformations
– Non-linear models for classification: support vector machines, tree models, nearest-neighbor methods, Naive Bayes
– Overfitting and regularization: model complexity and overfitting, commonly used regularizers, Lasso
– Unsupervised learning: cluster analysis, the K-means algorithm, hierarchical clustering
– Feature selection and dimensionality reduction: singular value decomposition, matrix factorisation
– Information retrieval, text representation and classification, term weighting
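
As an illustration of the cross-validation and linear-model items above, a minimal sketch with scikit-learn (not an excerpt from the course; the dataset and parameters are chosen arbitrarily) could look as follows:

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Sketch (not course material): 5-fold cross-validation of a linear classifier
    # on a small built-in image dataset.
    X, y = load_digits(return_X_y=True)

    model = LogisticRegression(max_iter=1000)     # linear model for classification
    scores = cross_val_score(model, X, y, cv=5)   # one accuracy score per fold

    print("fold accuracies:", np.round(scores, 3))
    print("mean accuracy:", round(scores.mean(), 3))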

An overview of the theoretical aspects of machine learning will be followed by the application of the algorithms to real problems such as image classification, text mining or spam detection. The exercises will be implemented in an interactive Python environment, using standard tools for data analysis and visualization such as the Scientific Python stack, Scikit-Learn, Pandas and NLTK.
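
For instance, a minimal text-classification pipeline in the spirit of the spam-detection exercise might look like the sketch below (not course material; the toy data and model choices are only for illustration):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Sketch (not course material): tiny spam/ham classifier, TF-IDF + Naive Bayes.
    texts = [
        "win a free prize now", "cheap meds, click here",        # spam-like
        "meeting at 10am tomorrow", "please review the report",  # ham-like
    ]
    labels = ["spam", "spam", "ham", "ham"]

    clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
    clf.fit(texts, labels)

    print(clf.predict(["free prize, click now", "see you at the meeting"]))
    # On this toy data the pipeline should predict ['spam' 'ham'].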

Evaluation: personal projects with oral presentation