Fix Bugs with Transformer through a Neural-Symbolic Edit Grammar

We introduce NSEdit (neural-symbolic edit), a novel Transformer-based code repair method. Given only the source code that contains bugs, NSEdit predicts an editing sequence that can fix them. The edit grammar is formulated as a regular language, and the Transformer uses it as a neural-symbolic scripting interface to generate editing programs. We modify the Transformer and add a pointer network to select the edit locations. An ensemble of rerankers is trained to re-rank the editing sequences generated by beam search. We fine-tune the rerankers on the validation set to reduce over-fitting. NSEdit is evaluated on various code repair datasets and achieves a new state-of-the-art accuracy ($24.04\%$) on the Tufano small dataset of the CodeXGLUE benchmark. NSEdit performs robustly when programs vary from package to package and when buggy programs are concrete. We conduct a detailed analysis of our method and demonstrate the effectiveness of each component.