SPLASH 2021
Sun 17 - Fri 22 October 2021 Chicago, Illinois, United States
Fri 22 Oct 2021 11:05 - 11:20 at Zurich D - Synthesis of models, tools and programs Chair(s): Jonathan Aldrich
Fri 22 Oct 2021 19:05 - 19:20 at Zurich D - Synthesis of models, tools and programs -- mirror Chair(s): Alex Potanin

The ability to learn programs from few examples is a powerful technology with disruptive applications in many domains, as it allows users to automate repetitive tasks in an intuitive way. Existing inductive synthesis frameworks perform only syntactic manipulations: they rely on the syntactic structure of the given examples, not their meaning. Any semantic manipulation, such as transforming dates, has to be manually encoded by the designer of the inductive programming framework. Recent advances in large language models have shown these models to be very adept at performing semantic transformations of their input, given only a few examples of the task at hand. When it comes to syntactic transformations, however, these models are limited in their expressive power. In this paper, we propose a novel framework for integrating inductive synthesis with few-shot learning language models to combine the strengths of these two popular technologies. In particular, the inductive synthesizer is tasked with breaking the problem down into smaller subproblems, among which those that cannot be solved syntactically are passed to the language model. We formalize three semantic operators that can be integrated with inductive synthesizers. To minimize invoking expensive semantic operators during learning, we introduce a novel deferred query execution algorithm that treats the operators as oracles during learning. We evaluate our approach in the domain of string transformations: the combined method can automate tasks that cannot be handled by either technology alone. Finally, we demonstrate the generality of our approach via a case study in the domain of string profiling.
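The division of labor described above can be illustrated with a minimal, hypothetical sketch (the names and structure below are illustrative, not the paper's actual API): a syntactic synthesizer handles subproblems it can solve from the examples' structure, and anything it cannot solve is delegated to a semantic operator backed by a language model. Consistent with deferred query execution, the oracle is never invoked during learning; it is only queried when the learned program runs on new inputs (here the model call is stubbed out by replaying the examples).

```python
# Hypothetical sketch of syntactic synthesis with a deferred semantic oracle.
# Not the paper's implementation; names and operators are illustrative.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Example = Tuple[str, str]  # (input, output) pairs given by the user

@dataclass
class Program:
    describe: str
    run: Callable[[str], str]

def synthesize_syntactic(examples: List[Example]) -> Optional[Program]:
    """Try a few trivial syntactic programs: identity, constant, fixed prefix."""
    if all(i == o for i, o in examples):
        return Program("identity", lambda s: s)
    outs = {o for _, o in examples}
    if len(outs) == 1:
        const = next(iter(outs))
        return Program(f"const({const!r})", lambda s, c=const: c)
    lens = {len(o) for _, o in examples}
    if len(lens) == 1:
        n = lens.pop()
        if all(i[:n] == o for i, o in examples):
            return Program(f"prefix({n})", lambda s, n=n: s[:n])
    return None  # subproblem is not syntactic; hand it to the oracle

def semantic_oracle(examples: List[Example]) -> Program:
    """Stand-in for a few-shot language-model call: the examples would form
    the prompt, and each new input is sent to the model at *execution* time
    (deferred), never during learning."""
    def run(s: str, exs=tuple(examples)) -> str:
        # A real system would query the model here; this stub replays the
        # given examples and otherwise returns the input unchanged.
        for i, o in exs:
            if i == s:
                return o
        return s
    return Program("llm_oracle", run)

def synthesize(examples: List[Example]) -> Program:
    # Syntactic search first; unsolved subproblems become opaque oracle nodes.
    return synthesize_syntactic(examples) or semantic_oracle(examples)
```

For instance, `synthesize([("hello", "he"), ("world", "wo")])` is solved syntactically as a prefix extraction, whereas `synthesize([("2021-10-22", "October")])` falls through to the oracle, since mapping a date to a month name requires semantic knowledge.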

Fri 22 Oct

Displayed time zone: Central Time (US & Canada)

10:50 - 12:10
Synthesis of models, tools and programs (OOPSLA) at Zurich D +8h
Chair(s): Jonathan Aldrich Carnegie Mellon University
10:50
15m
Talk
Rewrite Rule Inference Using Equality Saturation (Distinguished Paper, Virtual)
OOPSLA
Chandrakana Nandi Certora, inc., Max Willsey University of Washington, Amy Zhu University of Washington, Yisu Remy Wang University of Washington, Brett Saiki University of Washington, Adam Anderson University of Washington, Adriana Schulz University of Washington, Dan Grossman University of Washington, Zachary Tatlock University of Washington
DOI
11:05
15m
Talk
Semantic Programming by Example with Pre-trained Models (Virtual)
OOPSLA
Gust Verbruggen KU Leuven, Vu Le Microsoft, Sumit Gulwani Microsoft
DOI
11:20
15m
Talk
One Down, 699 to Go: or, Synthesising Compositional Desugarings (Virtual)
OOPSLA
Sándor Bartha University of Edinburgh, James Cheney University of Edinburgh; Alan Turing Institute, Vaishak Belle University of Edinburgh; Alan Turing Institute
DOI
11:35
15m
Talk
Multi-modal Program Inference: A Marriage of Pre-trained Language Models and Component-Based Synthesis (In-Person)
OOPSLA
Kia Rahmani Purdue University, Mohammad Raza Microsoft, Sumit Gulwani Microsoft, Vu Le Microsoft, Daniel Morris Microsoft, Arjun Radhakrishna Microsoft, Gustavo Soares Microsoft, Ashish Tiwari Microsoft
DOI Pre-print
11:50
20m
Live Q&A
Discussion, Questions and Answers
OOPSLA

18:50 - 20:10
Synthesis of models, tools and programs -- mirror (OOPSLA) at Zurich D
Chair(s): Alex Potanin Victoria University of Wellington
18:50
15m
Talk
Rewrite Rule Inference Using Equality Saturation (Distinguished Paper, Virtual)
OOPSLA
Chandrakana Nandi Certora, inc., Max Willsey University of Washington, Amy Zhu University of Washington, Yisu Remy Wang University of Washington, Brett Saiki University of Washington, Adam Anderson University of Washington, Adriana Schulz University of Washington, Dan Grossman University of Washington, Zachary Tatlock University of Washington
DOI
19:05
15m
Talk
Semantic Programming by Example with Pre-trained Models (Virtual)
OOPSLA
Gust Verbruggen KU Leuven, Vu Le Microsoft, Sumit Gulwani Microsoft
DOI
19:20
15m
Talk
One Down, 699 to Go: or, Synthesising Compositional Desugarings (Virtual)
OOPSLA
Sándor Bartha University of Edinburgh, James Cheney University of Edinburgh; Alan Turing Institute, Vaishak Belle University of Edinburgh; Alan Turing Institute
DOI
19:35
15m
Talk
Multi-modal Program Inference: A Marriage of Pre-trained Language Models and Component-Based Synthesis (In-Person)
OOPSLA
Kia Rahmani Purdue University, Mohammad Raza Microsoft, Sumit Gulwani Microsoft, Vu Le Microsoft, Daniel Morris Microsoft, Arjun Radhakrishna Microsoft, Gustavo Soares Microsoft, Ashish Tiwari Microsoft
DOI Pre-print
19:50
20m
Live Q&A
Discussion, Questions and Answers
OOPSLA