Machine Learning Driven Channel Thickness Optimization in Dual-Layer Oxide Thin-Film Transistors for Advanced Electrical Performance


Abstract

Machine learning (ML) shortens development time and improves performance in practical electronic device design through adaptive learning. Herein, Bayesian optimization (BO) is successfully applied to the design of optimal dual-layer oxide semiconductor thin-film transistors (OS TFTs). This approach effectively manages the complex correlation and interdependency between the two oxide semiconductor layers, enabling an efficient design of experiments (DoE) and reducing trial and error. By considering field-effect mobility (μ) and threshold voltage (V_th) simultaneously, the dual-layer structure designed by the BO model yields OS TFTs with remarkable electrical performance while greatly reducing the number of experimental trials (only 15 data sets are required). The optimized dual-layer OS TFTs achieve an enhanced field-effect mobility of 36.1 cm² V⁻¹ s⁻¹ and show good stability under bias stress, with a negligible shift in threshold voltage compared with conventional IGZO TFTs. Moreover, the BO algorithm can be customized to individual preferences by applying weight factors to both the field-effect mobility (μ) and the threshold voltage (V_th).
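The preference-weighted, two-objective optimization described in the abstract can be illustrated with a minimal sketch. Everything here is a hypothetical stand-in: `simulate_tft` replaces real device fabrication and measurement, the thickness ranges and coefficients are invented, and an exhaustive grid search replaces the paper's Bayesian optimization (which reaches an optimum with only 15 experiments). The sketch shows only how weight factors on μ and V_th scalarize the two objectives into one score.

```python
import itertools

# Hypothetical surrogate for device behaviour: maps the two channel-layer
# thicknesses (nm) to (mobility, threshold voltage). In the actual workflow
# these values come from fabricated and measured TFTs, not a formula.
def simulate_tft(t_front, t_back):
    mobility = 30.0 + 0.5 * t_front - 0.02 * (t_front - 8) ** 2 \
               - 0.3 * abs(t_back - 20)
    v_th = 1.0 + 0.05 * (t_back - 20) - 0.02 * t_front
    return mobility, v_th

# Weighted scalarization: higher mobility is better; V_th closer to a
# target (here 0 V) is better. w_mu and w_vth encode user preference,
# mirroring the weight factors mentioned in the abstract.
def score(mobility, v_th, w_mu=1.0, w_vth=1.0, v_target=0.0):
    return w_mu * mobility - w_vth * abs(v_th - v_target)

# Stand-in for the BO loop: exhaustively score a grid of candidate
# thickness pairs and return the best one under the chosen weights.
def optimize(w_mu=1.0, w_vth=1.0):
    candidates = itertools.product(range(2, 15), range(10, 31))
    return max(candidates,
               key=lambda t: score(*simulate_tft(*t), w_mu, w_vth))
```

Changing `w_mu` relative to `w_vth` biases the selected thickness pair toward higher mobility or toward a threshold voltage nearer the target, which is the customization behaviour the abstract attributes to the BO model.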
