TUNet: A Block-online Bandwidth Extension Model based on Transformers and Self-supervised Pretraining

Viet-Anh Nguyen Anh H. T. Nguyen Andy W. H. Khong

Abstract

We introduce a block-online variant of the temporal feature-wise linear modulation (TFiLM) model to achieve bandwidth extension. The proposed architecture simplifies the UNet backbone of the TFiLM to reduce inference time and employs an efficient transformer at the bottleneck to alleviate performance degradation. We also utilize self-supervised pretraining and data augmentation to enhance the quality of bandwidth-extended signals and reduce the sensitivity with respect to downsampling methods. Experimental results on the VCTK dataset show that the proposed method outperforms several recent baselines in both intrusive and non-intrusive metrics. Pretraining and filter augmentation also help stabilize and enhance the overall performance.
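To illustrate the core idea behind temporal feature-wise linear modulation, the sketch below shows a simplified, hypothetical TFiLM-style operation in NumPy: features are pooled over time blocks, a per-block scale (gamma) and shift (beta) are derived from the pooled summaries, and each block is modulated feature-wise. In the actual TFiLM model the modulation parameters are produced by a recurrent network over the pooled blocks; here a plain linear map (`W_gamma`, `W_beta`, both assumed names) stands in for that step purely for demonstration.

```python
import numpy as np

def tfilm_block(x, block_size, W_gamma, W_beta):
    """Simplified TFiLM-style modulation (illustrative only).

    x        : (channels, time) feature map
    block_size: number of time steps per block
    W_gamma, W_beta: (channels, channels) linear maps standing in for
                     the recurrent parameter generator of the real model
    """
    C, T = x.shape
    n_blocks = T // block_size
    # Split the time axis into blocks and summarize each block by max-pooling.
    xb = x[:, : n_blocks * block_size].reshape(C, n_blocks, block_size)
    pooled = xb.max(axis=2)                      # (C, n_blocks)
    # Derive per-block, per-channel scale and shift from the summaries.
    gamma = W_gamma @ pooled                     # (C, n_blocks)
    beta = W_beta @ pooled                       # (C, n_blocks)
    # Apply feature-wise linear modulation within each block.
    y = gamma[:, :, None] * xb + beta[:, :, None]
    return y.reshape(C, n_blocks * block_size)
```

Because each block only depends on its own summary (and, in the recurrent version, on past blocks), this formulation is compatible with the block-online processing described in the abstract.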

