Directory of Experts

Research project title

Methods and tools for the development of explainable embedded artificial neural networks for autonomous transportation

Education level

Master or doctorate


Director: Jean Pierre David

Co-director(s): Prof. Jérôme Le Ny, Prof. Mounir Boukadoum, Prof. Pierre Langlois, Dr. Freddy Lecue

End of display: December 31, 2020

Areas of expertise

Digital signal processing

Electronic circuits and devices

Adaptive, learning and evolutionary systems

Artificial intelligence

Learning and inference theories

Computer vision

Robotic control and automation

VLSI systems

Image and video processing

Distributed and parallel processing

Unit(s) and department(s)

Department of Electrical Engineering


We are looking for students at the graduate level (Ph.D., Master's) and at the undergraduate level (4-month internship).

Please send your CV to Prof. Jean Pierre David.

Detailed description

The project has three tracks:

T1: Rail scene segmentation from hybrid data sources (camera, radar, LIDAR): The objective is to find the best Machine Learning (ML) approach to enable the autonomous control of a train. This includes identifying the features that must be extracted from the data and proposing ML algorithms that minimize the error rate and the global risk of an incident. This objective does not address the SWaP constraints (see T2) nor explainability and certification (T3), but it is tightly coupled to them, since the concepts elaborated in T2 and T3 will be applied in T1 in the context of rail scenes. Using the partner's tools, the researchers will work on a better understanding of the learning process in general, and of train scenes in particular. In this way, we will meet both partners' scientific objectives: understanding the behavior of ML algorithms in order to explain and certify the decisions, especially in rail scenes, and customizing the ML algorithms through visualization and understanding of the training process. Ph.D.1, MSc1a (first 2 years) and MSc1b (last 2 years) will be supervised by Prof. Le Ny and co-supervised by Prof. Boukadoum and Dr. Lecue.

T2: Binary feature extraction and Binary Neural Networks: The objective is to implement Artificial Neural Networks (ANNs) as low-power embedded Binary Neural Networks (BNNs) that exploit binary features themselves computed on low-power embedded hardware. In this track, BNNs must be understood broadly as any binary functions/circuits that operate on binary features: multi-level combinational functions, decision trees, Look-Up Tables (LUTs) and more conventional BNNs. This objective includes the optimization of BNNs through modularization, incremental building and pruning. It is not necessarily bound to rail scenes (T1) but, when possible, such data will be used, among others, to validate the proposed tools and methodology. This objective is closely linked to explainability and certification (T3), since Boolean algebra can be used to explain the premises of a partial/global decision and to prove certain Boolean properties binding input binary features to partial/global decisions. The researchers will focus on the hardware part of the project. The partner is interested in embedding AI in the sensor itself (camera, LIDAR and radar), which requires low-power implementations. The challenge is to design, train and implement binary networks for low-power targets (typically a processor embedded in an FPGA). To do so, the students will also need to extract relevant binary features from the sensors at low power. This work builds on previous work done in Prof. David's laboratory, and the partner is interested in adapting its tools to the design and training of the BNNs developed by Prof. David. Ph.D.2, MSc2a (first 2 years) and MSc2b (last 2 years) will be supervised by Prof. David and co-supervised by Prof. Langlois and Dr. Lecue.
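To illustrate why BNNs map so well to low-power hardware: the {-1, +1} dot product at the heart of a BNN layer reduces, in logic, to an XNOR followed by a popcount. The sketch below shows the arithmetic equivalent in NumPy; the layer sizes and the sign-binarization scheme are illustrative assumptions, not the project's actual design:

```python
import numpy as np

def binarize(x):
    """Map real-valued weights/activations to {-1, +1} (sign binarization)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_layer(x_bin, w_bin):
    """One fully connected BNN layer. Over {-1, +1} values, each dot product
    is equivalent to XNOR-then-popcount, which is why BNN inference fits
    naturally into LUT-based FPGA fabric."""
    return binarize(x_bin @ w_bin)

# Toy example: 8 binary input features, 4 output units (hypothetical sizes).
rng = np.random.default_rng(0)
x = binarize(rng.standard_normal(8))
w = binarize(rng.standard_normal((8, 4)))
y = bnn_layer(x, w)  # each element is -1 or +1
```

In hardware, each column of `w` becomes a row of XNOR gates feeding a popcount and a threshold, so a whole layer costs only logic and wires rather than multipliers.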

T3: Explainability and certification of ANNs through modularity and surrogate models: The objective is to propose ways to explain the decision of an ANN, to report and explain the level of confidence in that decision, and to prove certain properties binding the input features to the decisions, with a given level of confidence that can itself be explained. This objective includes modifying ML models to that end, in particular modularizing a large ANN in order to extract local properties and local proofs that can be combined to explain and prove global decisions. This objective is not limited to rail scenes, since nearly all applications involving ANNs benefit from explaining the results of what is presently a black box. However, when possible, rail scenes will be used in order to contribute as much as possible to the global research objective. The researchers will study the use of a fuzzy decision tree as a surrogate model, due to its proximity to human reasoning and the potential it offers to insert intuitive field knowledge into the chain of reasoning. Fuzzy decision trees are immune to the threshold and excluded-middle problems associated with regular decision trees, and one such model will be used to help explain the operation of a convolutional neural network (CNN), currently the most accurate image classifier in the machine learning toolbox. One partner is interested in the opportunity to help explain/certify its systems, and the other is interested in offering such features in its toolbox. The system will be validated on image data, after registration and segmentation to provide the input features. A CNN will then be trained to provide input-output pairs for creating the fuzzy decision tree, from which the set of explanatory rules will be extracted. Ph.D.3, MSc3a (first 2 years) and MSc3b (last 2 years) will be supervised by Prof. Boukadoum and co-supervised by Prof. David and Dr. Lecue.
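The surrogate-model idea described above (querying a trained black box for input-output pairs, then distilling them into interpretable rules) can be sketched in a few lines. For brevity this uses a crisp one-level decision stump rather than a fuzzy decision tree, and a simple linear rule stands in for the trained CNN; both are illustrative assumptions only:

```python
import numpy as np

def black_box(X):
    """Stand-in for the trained CNN: a black-box classifier whose internal
    rule (hypothetical, for illustration) is unknown to the surrogate."""
    return (0.7 * X[:, 0] + 0.3 * X[:, 1] > 0.5).astype(int)

def fit_stump(X, y):
    """Fit a one-level surrogate (a decision stump) on (input, output) pairs
    queried from the black box; the project applies the same distillation
    idea with fuzzy decision trees instead."""
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            pred = (X[:, f] > t).astype(int)
            acc = (pred == y).mean()  # fidelity to the black box's decisions
            if best is None or acc > best[0]:
                best = (acc, f, t)
    return best  # (fidelity, feature index, threshold)

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = black_box(X)                      # input-output pairs from the black box
fidelity, feature, threshold = fit_stump(X, y)
# Extracted explanatory rule: "IF x[feature] > threshold THEN class 1",
# reported together with its fidelity to the black box.
```

The key quantity is fidelity: how often the extracted rule agrees with the black box. A fuzzy tree replaces the hard `>` test with graded membership functions, which removes the sharp-threshold and excluded-middle issues mentioned above.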

Financing possibility

Master's students can be funded up to $25,000/year.

Ph.D. students can be funded up to $30,000/year.

Jean Pierre David

Associate Professor
