The transfer learning concept was combined with the sequence-to-sequence (seq2seq) or Transformer model for prediction and verification.²
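The pretrain-then-fine-tune workflow behind this idea can be pictured with a short sketch. The PyTorch code below is a minimal illustration only: the vocabulary size, layer sizes, learning rates, and the random tensors standing in for tokenized reaction SMILES are all assumptions, not the paper's actual setup. The same training loop is reused for both phases; only the data and the (smaller) learning rate change, which is the essence of the transfer-learning scheme.

```python
# Minimal pretrain-then-fine-tune sketch in PyTorch. All names, sizes,
# and the random stand-in batches are illustrative assumptions.
import torch
import torch.nn as nn

class Seq2SeqModel(nn.Module):
    """Tiny encoder-decoder Transformer over token sequences (e.g. SMILES)."""
    def __init__(self, vocab_size=64, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            dim_feedforward=256, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        # Causal mask keeps the decoder from peeking at future target tokens.
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.embed(src), self.embed(tgt), tgt_mask=mask)
        return self.out(h)

def train(model, batches, lr, epochs):
    """One generic loop reused for both pretraining and fine-tuning."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for src, tgt in batches:
            logits = model(src, tgt[:, :-1])  # teacher forcing
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           tgt[:, 1:].reshape(-1))
            opt.zero_grad(); loss.backward(); opt.step()

# Random integer tensors stand in for tokenized reactant/product pairs.
def fake_batches(n=4):
    return [(torch.randint(1, 64, (8, 20)), torch.randint(1, 64, (8, 20)))
            for _ in range(n)]

model = Seq2SeqModel()
train(model, fake_batches(), lr=1e-4, epochs=1)  # 1) pretrain on a large, general corpus
train(model, fake_batches(), lr=1e-5, epochs=1)  # 2) fine-tune on the smaller target reactions
```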
The results demonstrated that the accuracy of retrosynthetic analysis by the seq2seq and Transformer models was significantly improved after pretraining.³
The seq2seq and Transformer models, both of which are based on an encoder-decoder architecture, were originally developed for language translation tasks.
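To make the translation analogy concrete, the sketch below shows how such an encoder-decoder model maps one token sequence to another: the encoder digests the source once into hidden states, and the decoder emits target tokens one at a time while attending to those states. The vocabulary, special-token ids, and untrained weights are assumptions for illustration, so the output is not a meaningful translation.

```python
# Hedged sketch of encoder-decoder greedy decoding; sizes and token
# ids are assumed for illustration, and the model here is untrained.
import torch
import torch.nn as nn

embed = nn.Embedding(64, 128)                 # assumed 64-token vocabulary
model = nn.Transformer(d_model=128, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
out = nn.Linear(128, 64)

BOS, EOS = 1, 2                               # assumed special-token ids

@torch.no_grad()
def translate(src_ids, max_len=20):
    """Encode the source once, then emit target tokens one at a time."""
    memory = model.encoder(embed(src_ids))    # encoder: source -> hidden states
    tgt = torch.full((src_ids.size(0), 1), BOS)
    for _ in range(max_len):
        h = model.decoder(embed(tgt), memory) # decoder attends to the memory
        next_tok = out(h[:, -1]).argmax(-1, keepdim=True)
        tgt = torch.cat([tgt, next_tok], dim=1)
        if (next_tok == EOS).all():           # stop once every sequence ends
            break
    return tgt

print(translate(torch.randint(3, 64, (1, 10))))
```

In the retrosynthesis setting, the same mechanism "translates" a product SMILES string into reactant SMILES strings rather than one natural language into another.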