HyperAI초신경
Text Based De Novo Molecule Generation
Evaluation Metrics
BLEU
Exact Match
Frechet ChemNet Distance (FCD)
Levenshtein
MACCS FTS
Morgan FTS
Parameter Count
RDK FTS
Text2Mol
Validity
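As a quick illustration of the string-level metrics above, here is a minimal pure-Python sketch of Exact Match, Levenshtein distance, and a Tanimoto similarity over fingerprint bit sets (the basis of the MACCS/Morgan/RDK FTS scores). Note this is an illustrative sketch only: the leaderboard's FTS numbers are computed over real chemical fingerprints (typically generated with RDKit), which plain Python sets merely stand in for here.

```python
def exact_match(pred: str, ref: str) -> bool:
    # Exact Match: the generated SMILES string equals the reference verbatim.
    return pred == ref

def levenshtein(a: str, b: str) -> int:
    # Levenshtein: character-level edit distance between two SMILES strings
    # (classic dynamic programming, one row at a time).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def tanimoto(fp_a: set, fp_b: set) -> float:
    # Tanimoto similarity on fingerprint bit sets: |A ∩ B| / |A ∪ B|.
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)
```

For example, `levenshtein("CCO", "CCN")` is 1 (one substitution), and two fingerprints sharing 2 of 4 total set bits give a Tanimoto similarity of 0.5.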
Evaluation Results

Performance of each model on this benchmark:
| Model | BLEU | Exact Match | FCD | Levenshtein | MACCS FTS | Morgan FTS | Parameter Count | RDK FTS | Text2Mol | Validity | Paper |
|---|---|---|---|---|---|---|---|---|---|---|---|
| MolT5-Large | 85.4 | 30.2 | 1.20 | 16.07 | 83.4 | 68.4 | 770M | 74.6 | 55.4 | 90.5 | Translation between Molecules and Natural Language |
| GIT-Mol-caption | 75.6 | 5.1 | - | 26.315 | 73.8 | 51.9 | - | 58.2 | - | 92.8 | GIT-Mol: A Multi-modal Large Language Model for Molecular Science with Graph, Image, and Text |
| Text+Chem T5 base | 75 | 21.2 | 0.061 | 27.39 | 87.4 | 69.7 | 220M | 76.7 | - | 79.2 | Unifying Molecular and Textual Representations via Multi-task Language Modelling |
| MolReGPT (GPT-4-0413) | 85.7 | 28.0 | 0.41 | 17.14 | 90.3 | 73.9 | - | 80.5 | 59.3 | 89.9 | Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective |
| BioT5+ | 87.2 | 52.2 | 0.353 | 12.776 | 90.7 | 77.9 | 252M | 83.5 | 57.9 | 100 | BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning |
| LDMol | 92.6 | 53.3 | 0.20 | 6.750 | 97.3 | 93.1 | - | 95.0 | - | 94.1 | LDMol: Text-to-Molecule Diffusion Model with Structurally Informative Latent Space |
| MolT5-small | 75.5 | 7.9 | 2.49 | 25.988 | 70.3 | 51.7 | 60M | 56.8 | 48.2 | 72.1 | Translation between Molecules and Natural Language |
| MolReFlect | 90.3 | 51.0 | - | 11.84 | 92.9 | 81.3 | - | 86.0 | - | 97.7 | MolReFlect: Towards Fine-grained In-Context Alignment between Molecules and Texts |
| BioT5 | 86.7 | 41.3 | 0.43 | 15.097 | 88.6 | 73.4 | 252M | 80.1 | 57.6 | 100 | BioT5: Enriching Cross-modal Integration in Biology with Chemical Knowledge and Natural Language Associations |
| Text+Chem T5 small | 73.9 | 15.7 | 0.066 | 28.54 | 85.9 | 66 | 60M | 73.6 | - | 77.6 | Unifying Molecular and Textual Representations via Multi-task Language Modelling |
| MolFM-Small | 80.3 | 16.9 | - | 20.868 | 83.4 | 72.1 | 13.62M | 66.2 | 57.3 | 85.9 | MolFM: A Multimodal Molecular Foundation Model |
| MolReGPT (GPT-3.5-turbo) | 79.0 | 13.9 | 0.57 | 24.91 | 84.7 | 62.4 | - | 70.8 | 57.1 | 88.7 | Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective |
| Text+Chem T5-augm small | 81.5 | 19.1 | 0.06 | 21.78 | 86.4 | 67.2 | 60M | 74.4 | - | 95.1 | Unifying Molecular and Textual Representations via Multi-task Language Modelling |
| TGM-DLM | 82.6 | 24.2 | 0.77 | 17.003 | 85.4 | 68.8 | 180M | 73.9 | 58.1 | 87.1 | Text-Guided Molecule Generation with Diffusion Language Model |
| MolXPT | - | 21.5 | 0.45 | - | 85.9 | 66.7 | 350M | 75.7 | 57.8 | 98.3 | MolXPT: Wrapping Molecules with Text for Generative Pre-training |
| MolT5-Large-HV | 81.0 | 31.4 | 0.44 | 16.758 | 87.2 | 72.2 | 770M | 78.6 | 59.0 | 99.6 | Translation between Molecules and Natural Language |
| MolT5-base | 76.9 | 8.1 | 2.18 | 24.458 | 72.1 | 52.9 | 220M | 58.8 | 49.6 | 77.2 | Translation between Molecules and Natural Language |
| MolFM-Base | 82.2 | 21.0 | - | 19.445 | 85.4 | 75.8 | 296.2M | 69.7 | 58.3 | 89.2 | MolFM: A Multimodal Molecular Foundation Model |
| Text+Chem T5-augm base | 85.3 | 32.2 | 0.05 | 16.87 | 90.1 | 75.7 | 220M | 81.6 | - | 94.3 | Unifying Molecular and Textual Representations via Multi-task Language Modelling |
| TGM-DLM w/o corr | 82.8 | 24.2 | 0.89 | 16.897 | 87.4 | 72.2 | 180M | 77.1 | 58.9 | 78.9 | Text-Guided Molecule Generation with Diffusion Language Model |