SPLASH 2021
Sun 17 - Fri 22 October 2021 Chicago, Illinois, United States
Sun 17 Oct 2021 11:10 - 11:30 at Zurich G - BCNC Session 2 Chair(s): Ahmed ElBatanony, Giancarlo Succi

Deep-learning-based coding assistance is an emerging topic that has recently attracted much attention in the software development community. To provide such assistance in a compact way, we focus on neural machine translation (NMT), which lets users translate natural language descriptions into expressions in a programming language such as Python. A major obstacle here is the limited availability of parallel corpora, which are essential for training better NMT models.

To overcome this problem, we propose transcompiler-based back-translation, a data augmentation method that generates parallel corpora from the many available source code repositories. In this paper, we present initial experimental results comparing several NMT models trained on existing corpora and on our augmented corpora. The resulting BLEU scores indicate that our proposed model is accurate enough to support coding assistance in the future.
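The abstract does not spell out how back-translation produces parallel data, so the following is a rough, illustrative sketch of the general idea only, not the authors' actual transcompiler or rules: code-only data is mined from repositories, a rule-based code-to-description step produces synthetic natural language, and the two sides are paired. The helper names `describe` and `mine_pairs` and the toy translation rules are hypothetical.

```python
import ast

# Illustrative sketch of transcompiler-style back-translation (assumed, not the
# paper's implementation): render a Python expression as an English-like
# description, then pair the synthetic description with the original code.

def describe(node: ast.AST) -> str:
    """Translate a small subset of Python expression nodes into English."""
    if isinstance(node, ast.Call):
        func = describe(node.func)
        args = ", ".join(describe(a) for a in node.args)
        return f"call {func} with {args}" if args else f"call {func}"
    if isinstance(node, ast.Attribute):
        return f"{describe(node.value)}'s {node.attr}"
    if isinstance(node, ast.Subscript):
        return f"element {describe(node.slice)} of {describe(node.value)}"
    if isinstance(node, ast.BinOp):
        ops = {ast.Add: "plus", ast.Sub: "minus",
               ast.Mult: "times", ast.Div: "divided by"}
        op = ops.get(type(node.op), "combined with")
        return f"{describe(node.left)} {op} {describe(node.right)}"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return repr(node.value)
    return ast.unparse(node)  # fall back to the code itself


def mine_pairs(source: str):
    """Yield synthetic (description, code) pairs from expression statements."""
    tree = ast.parse(source)
    for stmt in ast.walk(tree):
        if isinstance(stmt, ast.Expr):
            yield describe(stmt.value), ast.unparse(stmt.value)


if __name__ == "__main__":
    snippet = "len(lines)\nmath.sqrt(x + 1)\nwords[0]\n"
    for desc, code in mine_pairs(snippet):
        print(f"{desc!r:45} -> {code}")
```

The synthetic pairs generated this way would then serve as additional training data for the description-to-code NMT model, which is the role back-translation plays in the proposed method.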

Sun 17 Oct

Displayed time zone: Central Time (US & Canada)

10:50 - 12:10
BCNC Session 2 at Zurich G
Chair(s): Ahmed ElBatanony Innopolis University, Giancarlo Succi Innopolis University
10:50
20m
Full-paper
The Pareto Distribution of Software Features and No-Code
BCNC
Ahmed ElBatanony Innopolis University, Giancarlo Succi Innopolis University
11:10
20m
Talk
Is Neural Machine Translation Approach Accurate Enough for Coding Assistance?
BCNC
Yuka Akinobu Japan Women's University, Momoka Obara Japan Women's University, Teruno Kajiura Japan Women's University, Shiho Takano Japan Women's University, Miyu Tamura Japan Women's University, Mayu Tomioka Japan Women's University, Kimio Kuramitsu Japan Women's University
11:30
20m
Full-paper
Towards the No-Code Era: A Vision and Plan for the Future of Software Development
BCNC
Ahmed ElBatanony Innopolis University, Giancarlo Succi Innopolis University