HyperAI超神経

Machine Translation

Natural Language Processing (NLP) is a branch of artificial intelligence focused on enabling computers to understand, interpret, and generate human language. Its goal is to bridge the communication gap between humans and machines and to improve the efficiency and quality of information exchange. NLP has broad practical value, with applications such as intelligent customer service, sentiment analysis, machine translation, and text summarization, which have significantly advanced the digitization of society and the intelligent transformation of enterprises.
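Most of the machine-translation benchmarks listed below rank systems by BLEU, which combines clipped n-gram precision with a brevity penalty. The sketch below is an illustrative, simplified sentence-level BLEU (uniform weights, single reference); it is not the scoring code used by these leaderboards, which typically rely on standardized tools such as SacreBLEU.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(reference, hypothesis, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped
    1..max_n-gram precisions, times a brevity penalty.
    `reference` and `hypothesis` are lists of tokens."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_counts = ngrams(hypothesis, n)
        ref_counts = ngrams(reference, n)
        overlap = sum((hyp_counts & ref_counts).values())  # clipped matches
        total = max(sum(hyp_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # real implementations smooth instead of zeroing out
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: punish hypotheses shorter than the reference.
    if len(hypothesis) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(hypothesis), 1))
    return bp * geo_mean

ref = "the cat sat on the mat".split()
hyp = "the cat sat on the mat".split()
print(sentence_bleu(ref, hyp))  # identical sentences score 1.0
```

Corpus-level BLEU, as reported on these leaderboards, aggregates n-gram counts over the whole test set before taking precisions, so it is not simply the average of per-sentence scores.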

Benchmarks and best-performing models (where a model is listed):

20NEWS: tensorflow/tensor2tensor
ACCURAT balanced test corpus for under resourced languages Russian-Estonian: Multilingual Transformer
ACCURAT balanced test corpus for under resourced languages Estonian-Russian
ACES: HWTSC-Teacher-Sim
Alexa Point of View: T5
Arba Sicula: Larger
Business Scene Dialogue EN-JA
Business Scene Dialogue JA-EN: Transformer-base
FLoRes-200: GenTranslate-7B
flores95-devtest eng-X: SeamlessM4T Large
flores95-devtest X-eng
FRMT (Chinese - Mainland)
FRMT (Chinese - Taiwan)
FRMT (Portuguese - Brazil)
FRMT (Portuguese - Portugal)
Itihasa
IWSLT 2017: GPT-4o (HPT)
IWSLT2014 English-German
IWSLT2014 German-English: PiNMT
IWSLT2015 Chinese-English: BP-Transformer
IWSLT2015 English-German: PS-KD
IWSLT2015 English-Vietnamese: EnViT5 + MTet
IWSLT2015 German-English: PS-KD
IWSLT2015 Thai-English: Seq-KD + Seq-Inter + Word-KD
IWSLT2015 Vietnamese-English
IWSLT2017 Arabic-English
IWSLT2017 English-Arabic: Transformer base + BPE-Dropout
IWSLT2017 English-French: Transformer base + BPE-Dropout
IWSLT2017 French-English: NLLB-200
IWSLT2017 German-English: Adaptively Sparse Transformer (alpha-entmax)
Multi Lingual Bug Reports: ChatGPT
slone/myv_ru_2022 myv-ru: slone/mbart-large-51-myv-mul-v1
slone/myv_ru_2022 ru-myv
Tatoeba (EL-to-EN)
Tatoeba (EN-to-EL): PENELOPIE Transformers-based NMT (EN2EL)
V_A (trained on T_H): M_C
V_B (trained on T_H)
V_C (trained on T_H)
WMT 2017 English-Chinese: DynamicConv
WMT 2017 English-Latvian
WMT 2017 Latvian-English
WMT 2018 English-Estonian: Multi-pass backtranslated adapted transformer
WMT 2018 English-Finnish: Transformer trained on highly filtered data
WMT 2018 Estonian-English: Multi-pass backtranslated adapted transformer
WMT 2018 Finnish-English
WMT 2022 Chinese-English: Vega-MT
WMT 2022 Czech-English
WMT 2022 English-Chinese
WMT 2022 English-Czech
WMT 2022 English-German
WMT 2022 English-Japanese
WMT 2022 English-Russian: Vega-MT
WMT 2022 German-English
WMT 2022 Japanese-English
WMT 2022 Russian-English
WMT2014 English-Czech: Evolved Transformer Big
WMT2014 English-French: Transformer+BT (ADMIN init)
WMT2014 English-German: Transformer Cycle (Rev)
WMT2014 French-English
WMT2014 German-English: Bi-SimCut
WMT2015 English-German: ByteNet
WMT2015 English-Russian: C2-50k Segmentation
WMT2016 Czech-English: Attentional encoder-decoder + BPE
WMT2016 English-Czech
WMT2016 English-French: DeLighT
WMT2016 English-German
WMT2016 English-Romanian: DeLighT
WMT2016 English-Russian: Attentional encoder-decoder + BPE
WMT2016 Finnish-English
WMT2016 German-English
WMT2016 Romanian-English: fast-noisy-channel-modeling
WMT2016 Russian-English
WMT2017 Chinese-English: StrokeNet
WMT2017 English-Finnish: OmniNetP
WMT2017 English-French: OmniNetP
WMT2017 English-German: OmniNetP
WMT2017 Finnish-English
WMT2017 Russian-English: OmniNetP
WMT2017 Turkish-English
WMT2019 English-German: Facebook FAIR (ensemble)
WMT2019 English-Japanese
WMT2019 Finnish-English
WMT2019 German-English