What? The "all-powerful" Transformer is about to be knocked off its pedestal...

Google research scientist David Ha: "Transformer is the new LSTM."

Paper: Taming Transformers for High-Resolution Image Synthesis
Link: https://arxiv.org/pdf/2012.09841v1.pdf
Paper: TransTrack: Multiple-Object Tracking with Transformer
Link: https://arxiv.org/pdf/2012.15460v1.pdf
Paper: Compound Word Transformer: Learning to Compose Full-Song Music over Dynamic Directed Hypergraphs
Link: https://arxiv.org/pdf/2101.02402v1.pdf
Paper: Dance Revolution: Long-Term Dance Generation with Music via Curriculum Learning
Link: https://arxiv.org/pdf/2006.06119v5.pdf
Paper: Self-Attention Based Context-Aware 3D Object Detection
Link: https://arxiv.org/pdf/2101.02672v1.pdf
Paper: PCT: Point Cloud Transformer
Link: https://arxiv.org/pdf/2012.09688v1.pdf
Paper: Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting
Link: https://arxiv.org/pdf/1912.09363v3.pdf
Paper: VinVL: Making Visual Representations Matter in Vision-Language Models
Link: https://arxiv.org/pdf/2101.00529v1.pdf
Paper: End-to-end Lane Shape Prediction with Transformers
Link: https://arxiv.org/pdf/2011.04233v2.pdf
Paper: Deformable DETR: Deformable Transformers for End-to-End Object Detection
Link: https://arxiv.org/pdf/2010.04159v2.pdf



