The human brain encodes stimuli from the environment
into representations that form a sensory perception of the
world. Despite recent advances in understanding visual
and auditory perception, olfaction remains an
under-explored topic in the machine learning community
due to the lack of large-scale datasets annotated with
labels related to human olfactory perception. In this work,
we ask whether transformer models
trained on chemical structures encode representations that
are aligned with human olfactory perception, i.e.,
\emph{can transformers smell like humans}? We
demonstrate, by means of three analyses, that
representations encoded by transformers pre-trained on
general chemical structures are highly aligned with
human olfactory perception. We use five datasets
and three types of perceptual representations to
show that the representations encoded by transformer
models are able to predict 1) expert-provided labels
associated with odorants; 2) ratings given by human
participants with respect to pre-defined descriptors;
and 3) similarity ratings between odorants provided by
human participants.