abstract = "This thesis proposes three main contributions to
advance the state-of-the-art of AutoML approaches. They
are divided into two research directions: optimization
(first contribution) and meta-learning (second and
third contributions). The first contribution is a
hybrid optimization algorithm, dubbed Mosaic,
leveraging Monte-Carlo Tree Search and Bayesian
Optimization to address the selection of algorithms and
the tuning of hyper-parameters, respectively. The
empirical assessment of the proposed approach shows its
merits compared to the Auto-sklearn and TPOT AutoML systems
on the OpenML-100 benchmark. The second contribution introduces a
novel neural network architecture, termed Dida, to
learn a good representation of datasets (i.e.,
meta-features) from scratch while enforcing invariance
w.r.t. feature and row permutations. Two
proof-of-concept tasks (patch classification and
performance prediction) are considered. The
proposed approach yields superior empirical performance
compared to Dataset2Vec and DSS on both tasks. The
third contribution addresses the limitation of Dida in
handling standard dataset benchmarks. The proposed
approach, called Metabu, relies on hand-crafted
meta-features. The novelty of Metabu is two-fold: i)
defining an oracle topology of datasets based on
top-performing hyper-parameters; ii) leveraging an Optimal
Transport approach to align a mapping of the
hand-crafted meta-features with the oracle topology. The
empirical results suggest that Metabu meta-features
outperform the baseline hand-crafted meta-features on
three different tasks (assessing the meta-feature-based
topology, recommending hyper-parameters w.r.t. the
topology, and warm-starting optimization algorithms).",