RA-CM3 is a retrieval-augmented multimodal model that can generate both text and images, achieving improved generation quality in both modalities while reducing training cost and model size.
DRAGON is a new foundation model pretrained jointly on text and knowledge graphs. It helps knowledge- and reasoning-intensive applications such as question answering.
LinkBERT is a new language model pretrained to capture document link knowledge, such as hyperlinks on the web. It helps knowledge-intensive applications such as question answering.
How can we use machine learning to fix source code errors (e.g., in C or Python) for us? We introduce Break-It-Fix-It, a new unsupervised method for training code repair models.
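To give a flavor of the unsupervised setup, here is a minimal sketch of a Break-It-Fix-It-style data-collection round. The critic, fixer, and breaker below are toy stand-ins (the critic is Python's own `compile()` check, and the "models" are rule-based placeholders), not the trained models from the paper; the point is only the loop structure: keep fixes the critic accepts as new fixer training pairs, and keep breaks the critic rejects as new breaker training pairs.

```python
def critic(code: str) -> bool:
    """Return True if the code passes the correctness check (here: it parses)."""
    try:
        compile(code, "<snippet>", "exec")
        return True
    except SyntaxError:
        return False

def toy_fixer(code: str) -> str:
    """Stand-in fixer: append a closing parenthesis if one is missing."""
    return code + ")" if code.count("(") > code.count(")") else code

def toy_breaker(code: str) -> str:
    """Stand-in breaker: drop a closing parenthesis to create broken code."""
    return code.replace(")", "", 1)

def bifi_round(bad_examples, good_examples):
    """One round: collect critic-verified (bad, fixed) pairs to retrain the
    fixer, and critic-verified (good, broken) pairs to retrain the breaker."""
    fixer_data, breaker_data = [], []
    for bad in bad_examples:
        fixed = toy_fixer(bad)
        if critic(fixed):              # keep only fixes the critic accepts
            fixer_data.append((bad, fixed))
    for good in good_examples:
        broken = toy_breaker(good)
        if not critic(broken):         # keep only breaks that are truly broken
            breaker_data.append((good, broken))
    return fixer_data, breaker_data

fixer_pairs, breaker_pairs = bifi_round(["print(1"], ["print(2)"])
```

In the real method the fixer and breaker are neural models retrained on these pairs each round, so the breaker learns to produce realistic errors and the fixer improves on real broken code.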
We study how to use machine learning to repair programs given error messages, and introduce a promising approach that leverages program-feedback graphs and self-supervised learning.