The transfer learning concept was combined with the sequence-to-sequence (seq2seq) or Transformer model for prediction and verification.[2]
The results demonstrated that the accuracy of retrosynthetic analysis by the seq2seq and Transformer models was significantly improved after pretraining.[3]
The seq2seq and Transformer models, both of which are based on an encoder-decoder architecture, were originally developed for language translation tasks.
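To illustrate this translation framing, the sketch below treats retrosynthesis as SMILES-to-SMILES translation with a small encoder-decoder Transformer in PyTorch. It is a minimal sketch, not the authors' implementation: the vocabulary size, model dimensions, and toy token tensors are illustrative assumptions, and positional encoding is omitted for brevity.

```python
# Minimal sketch: retrosynthesis as SMILES-to-SMILES translation with an
# encoder-decoder Transformer. All sizes and tensors are illustrative.
import torch
import torch.nn as nn

VOCAB_SIZE = 64   # assumed size of a SMILES token vocabulary
D_MODEL = 128     # assumed embedding/model width

class Seq2SeqTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        # Encoder reads the product SMILES; decoder emits the reactant SMILES.
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.out = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, src_tokens, tgt_tokens):
        src = self.embed(src_tokens)
        tgt = self.embed(tgt_tokens)
        # Causal mask so each decoder position attends only to earlier tokens.
        mask = self.transformer.generate_square_subsequent_mask(tgt_tokens.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(hidden)  # per-position logits over the vocabulary

model = Seq2SeqTransformer()
product = torch.randint(0, VOCAB_SIZE, (1, 20))    # toy tokenized product SMILES
reactants = torch.randint(0, VOCAB_SIZE, (1, 24))  # toy tokenized reactant SMILES
logits = model(product, reactants[:, :-1])
# Standard cross-entropy objective against the target shifted by one position.
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB_SIZE), reactants[:, 1:].reshape(-1)
)
print(loss.item())
```

In this framing, pretraining on a large reaction or molecule corpus and then fine-tuning on the target retrosynthesis dataset follows the same transfer learning recipe described above; only the training data changes, not the architecture.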