Finding new materials with specific properties is a key challenge in materials science. Recent advances in generative artificial intelligence (AI), such as Wasserstein Autoencoders (WAEs) and Transformers, can help accelerate this process.
In this approach, Wasserstein Autoencoders generate realistic and diverse candidate materials by learning the complex relationships between material structures and properties. This enables inverse design: proposing new materials starting from a set of desired target properties. Transformers, in turn, excel at predicting structure-property relationships from large materials datasets. Combining the two models lets us generate candidate materials, screen their predicted properties, and iteratively refine the search.
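The generate-then-screen loop described above can be sketched in a few lines. The sketch below uses illustrative stand-in functions (`decode` for a trained WAE decoder, `predict_property` for a trained Transformer property predictor); these names and the scalar "material" representation are assumptions for illustration only, not part of any real pipeline.

```python
import random

def decode(z):
    # Stand-in for a trained WAE decoder: maps a latent vector to a
    # candidate material, here abstracted as a single scalar descriptor.
    return sum(z) / len(z)

def predict_property(material):
    # Stand-in for a trained Transformer property predictor: assigns a
    # property score, here peaking at a hypothetical optimum of 0.5.
    return 1.0 - abs(material - 0.5)

def inverse_design(n_candidates=1000, latent_dim=8, seed=0):
    # Sample latent vectors, decode them into candidate materials,
    # screen each with the predictor, and keep the best candidate.
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_candidates):
        z = [rng.uniform(0.0, 1.0) for _ in range(latent_dim)]
        material = decode(z)
        score = predict_property(material)
        if score > best_score:
            best, best_score = material, score
    return best, best_score

material, score = inverse_design()
print(material, score)
```

In practice both stand-ins would be trained neural networks, and the selection step would feed the best candidates back into the search rather than simply returning a single winner.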
Training generative models works best on Graphics Processing Units (GPUs), which are substantially faster and more efficient than traditional CPUs for the dense computations these models require. This speed is essential for handling the complex calculations and large datasets involved in training. We therefore request access to the GPU hardware available on Alvis.