NYUAD PFE Topics


PFE Projects

Compilers for Security: Automatic Code Optimization for Fully Homomorphic Encryption Hardware

This project sits at the intersection of compilers and security. It is about building a domain-specific compiler for Fully Homomorphic Encryption (FHE). FHE is the holy grail of privacy-preserving computation, as it enables meaningful operations to be performed directly on encrypted data. Nevertheless, its very high performance overhead hinders its wide adoption in industry. Much has been done to accelerate FHE primitives, whether with better algorithms or hardware accelerators. However, little has been done to optimize FHE applications from a higher level (i.e., the compiler). Existing solutions do not scale to large applications and fall far short of what is needed: FHE applications must be manually optimized by experts, a time-consuming and error-prone task. In this project, we will focus on developing automatic code optimization techniques designed for the area of FHE. These techniques will be implemented in our open-source FHE compiler (CHIHAB). The final goal is to help FHE programmers write faster privacy-preserving applications in less time.
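To make "operations directly on encrypted data" concrete, the following is a minimal sketch of the additive homomorphic property, using a deliberately insecure toy scheme. Real FHE schemes (e.g. BFV or CKKS) are far more involved, and this is not the scheme CHIHAB targets; it only illustrates the idea that computing on ciphertexts corresponds to computing on the underlying plaintexts.

```python
import random

# Toy, INSECURE additive scheme used only to illustrate the homomorphic
# property: adding ciphertexts adds the underlying plaintexts.
MODULUS = 2**31 - 1

def keygen():
    return random.randrange(MODULUS)

def encrypt(key, m):
    return (m + key) % MODULUS

def decrypt_sum(key, c, count):
    # A sum of `count` ciphertexts carries `count` copies of the key.
    return (c - count * key) % MODULUS

key = keygen()
c1, c2 = encrypt(key, 20), encrypt(key, 22)
c_sum = (c1 + c2) % MODULUS        # addition performed on ciphertexts
print(decrypt_sum(key, c_sum, 2))  # 42
```

A real FHE compiler must schedule such ciphertext operations while managing noise growth and expensive primitives, which is what makes automatic optimization valuable.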

Building a Tiny Language Model that Understands Code

This project is about developing a Tiny Language Model that understands code. Understanding code here is defined as the ability to perform a set of challenging tasks, such as predicting the expected output of a piece of code. The goal is to use this model to study the properties of large language models such as GPT-3. Since studying the properties of such large language models is expensive (training GPT-3 costs hundreds of millions of dollars), we will instead study the properties of tiny language models on a tiny programming language. We will build a tiny GPT-3-style model that understands code. While pursuing this goal, we will identify the ingredients necessary for large language models to understand code (and languages in general). We will use the tiny language model as a proxy to study the properties of large language models and better understand them. Results from this project will have a high impact on the field of language models.
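As a concrete illustration of the output-prediction task, a sketch like the following could generate (program, expected output) training pairs for a tiny straight-line language. The language design and function names here are hypothetical, not the project's actual setup.

```python
import random

# Hypothetical data generator: each sample is a tiny straight-line
# program plus the value it prints. A model "understands code" here if
# it predicts that value from the program text alone.
def random_program(rng, n_stmts=3):
    lines, env = [], {}
    for i in range(n_stmts):
        var = f"x{i}"
        if env and rng.random() < 0.5:
            a, b = rng.choice(list(env)), rng.randrange(1, 10)
            lines.append(f"{var} = {a} + {b}")  # reuse an earlier variable
            env[var] = env[a] + b
        else:
            v = rng.randrange(10)
            lines.append(f"{var} = {v}")        # fresh constant assignment
            env[var] = v
    lines.append(f"print({var})")
    return "\n".join(lines), env[var]

rng = random.Random(0)
prog, expected = random_program(rng)
print(prog, "->", expected)
```

Because the language is tiny, millions of exactly-labeled samples can be generated cheaply, which is what makes the proxy study tractable.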

Building a Compiler for Quantum Computing


This topic is in collaboration with Ohio State University (USA). Quantum computing is an important emerging field in which quantum computers use the laws of quantum mechanics to solve problems too complex for classical computers. In this project, we will build a compiler for quantum computing hardware and use advanced compiler methods to enable efficient code generation for this hardware.
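One classic example of the kind of optimization such a compiler performs is gate fusion: adjacent single-qubit gates can be multiplied into a single unitary so the hardware executes one gate instead of several. The sketch below, with an illustrative gate set, is not the project's compiler, just a minimal demonstration of the idea.

```python
import numpy as np

# Standard single-qubit gates as 2x2 unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X
Z = np.array([[1, 0], [0, -1]])                # Pauli-Z

def fuse(gates):
    # Applying g1 then g2 corresponds to the matrix product g2 @ g1,
    # so a whole gate sequence collapses into one unitary.
    fused = np.eye(2)
    for g in gates:
        fused = g @ fused
    return fused

circuit = [H, X, H]            # three gates applied in sequence
fused = fuse(circuit)
# H X H = Z is a textbook identity, so fusion emits a single Z gate.
assert np.allclose(fused, Z)
```

On real hardware, fewer gates means less accumulated error, so this presentation-level rewrite directly improves result fidelity.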

Building a Deep Learning Model to Predict the Best Code Optimizations in Tiramisu

This project is about building a deep learning model that predicts the best code optimizations a compiler should apply to a given code. The goal is to enable automatic code optimization in the Tiramisu compiler: to replace human intelligence in optimizing code and allow Tiramisu to optimize code automatically and generate the fastest possible code. Concretely, we will build a model that takes code as input and directly predicts which code optimizations should be applied to that code.
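To sketch the shape of such a model (this is illustrative only, not Tiramisu's actual architecture or features), one could featurize each program and train a multi-label classifier that outputs, per optimization, whether it should be applied. The feature names, labels, and synthetic data below are all hypothetical.

```python
import numpy as np

# Hypothetical features extracted from a program, and candidate
# optimizations the model chooses among (one binary label each).
FEATURES = ["loop_depth", "trip_count_log", "arith_intensity"]
OPTIMIZATIONS = ["tile", "unroll", "vectorize"]

rng = np.random.default_rng(0)
X = rng.random((200, len(FEATURES)))
# Synthetic ground truth: each optimization helps when a linear
# combination of the features is large enough.
Y = (X @ rng.random((len(FEATURES), len(OPTIMIZATIONS))) > 0.7).astype(float)

Xb = np.hstack([X, np.ones((len(X), 1))])      # add a bias column
W = np.zeros((Xb.shape[1], len(OPTIMIZATIONS)))
for _ in range(1000):                          # batch gradient descent
    P = 1.0 / (1.0 + np.exp(-(Xb @ W)))        # per-label sigmoid
    W -= 0.5 * Xb.T @ (P - Y) / len(Xb)

P = 1.0 / (1.0 + np.exp(-(Xb @ W)))
accuracy = ((P > 0.5) == Y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A production model would replace the hand-picked features with a learned code representation, but the input/output contract, code in, optimization decisions out, is the same.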

Designing Bio-inspired Neural Networks

Current neural networks have shown success in many tasks, ranging from image processing to natural language processing. They have a serious issue, though: they are computationally expensive. For example, training a state-of-the-art neural network requires a large cluster of TPUs (Google's Tensor Processing Units), and scaling to larger neural networks is becoming very challenging. To make progress in the area of deep learning, there is a pressing need for efficient neural networks. Bio-inspired neural networks are one possible solution, and the goal of this PFE topic is to explore it. Bio-inspired neural networks are inspired by our best current understanding of how biological neural networks work. In particular, recent studies [2] have shown how the fruit fly brain might use simple mechanisms to perform the same tasks that a neural network performs (extracting features and mapping them to behavior). We will build on these studies and explore building neural networks based on similar principles that still achieve high accuracy.

Using Deep Reinforcement Learning for Automatic Code Optimization in the Tiramisu Compiler

This project is about using deep learning and deep reinforcement learning to enable automatic code optimization in the Tiramisu compiler. We will use a reinforcement learning method similar to the one developed for Google's AlphaGo Zero.
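As a hypothetical sketch of the underlying decision problem: a state is the sequence of transformations applied so far, actions are candidate transformations (or stopping), and the reward is the measured speedup of the generated code. AlphaGo Zero combines tree search with a learned network; the toy problem below is small enough to solve exactly with value iteration, and `simulated_speedup` stands in for actually compiling and benchmarking a schedule.

```python
from itertools import product

TRANSFORMS = ("tile", "unroll", "vectorize")
GAMMA = 0.9      # discount factor
MAX_LEN = 2      # at most two transformations per schedule (toy limit)

def simulated_speedup(schedule):
    # Toy stand-in: pretend tiling then vectorizing is the best schedule.
    return 2.0 if schedule == ("tile", "vectorize") else 1.0

# Enumerate all states: schedules of length 0..MAX_LEN.
states = [()] + [p for n in (1, 2) for p in product(TRANSFORMS, repeat=n)]
V = {s: 0.0 for s in states}
for _ in range(10):                                 # value iteration
    for s in states:
        best = simulated_speedup(s)                 # value of stopping now
        if len(s) < MAX_LEN:
            best = max([best] + [GAMMA * V[s + (a,)] for a in TRANSFORMS])
        V[s] = best

# Greedy policy extraction: keep applying the most promising
# transformation until stopping is at least as good.
s = ()
while len(s) < MAX_LEN:
    a, val = max(((a, GAMMA * V[s + (a,)]) for a in TRANSFORMS),
                 key=lambda kv: kv[1])
    if val <= simulated_speedup(s):
        break
    s = s + (a,)
print(s)   # ('tile', 'vectorize')
```

In the real project the schedule space is far too large to enumerate, which is why a learned policy/value network guiding the search is needed in place of exact value iteration.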

Using Deep Reinforcement Learning for Automatic Code Optimization in the MLIR Compiler

This project is about using deep learning and deep reinforcement learning to enable automatic code optimization in the MLIR compiler. MLIR is an industrial compiler infrastructure developed by Google and used to compile TensorFlow code. This project is part of a collaboration with MathWorks, a US-based company that develops Matlab. As part of this collaboration, we aim to enable automatic code optimization for the Matlab compiler: MLIR will be used as a backend for Matlab, and MLIR code will be optimized using our RL-based automatic code optimization method. Success in this project will have a high impact on millions of daily Matlab users across the world.

Benefits
- The selected students will work on cutting-edge projects with a very high practical
impact.
- Each selected student will receive a salary.
- The students will participate in writing papers and submitting them to the best conferences in the field (NeurIPS, PLDI, ISCA, …).

References
[1] http://tiramisu-compiler.org/

[2] Yuchen Liang, Chaitanya Ryali, Benjamin Hoover, Leopold Grinberg, Saket Navlakha, Mohammed J. Zaki, Dmitry Krotov. Can a Fruit Fly Learn Word Embeddings?

Contact
● Dr. Riyadh Baghdadi (Assistant Professor at NYUAD, Research Affiliate at MIT):
[email protected]
