HyperAI

Online Tutorial: The World's First MoE Video Generation Model! Alibaba Open-Sources Wan2.2, Bringing Cinematic AI Video to Consumer Graphics Cards


Alibaba's Tongyi Wanxiang Lab recently open-sourced Wan2.2, its advanced AI video generation model. The release includes three models — text-to-video, image-to-video, and unified video generation — with 27B parameters in total. As the world's first MoE-based video generation model, Wan2.2 significantly improves both generation quality and computational efficiency, and it runs efficiently on consumer-grade graphics cards such as the NVIDIA RTX 4090, sharply lowering the hardware barrier to high-quality video generation. It also pioneers a cinematic aesthetic control system that precisely controls lighting, color, composition, and other aesthetic attributes, producing film-like visuals and broadening the artistic range of AI-generated video. In multiple internal and external benchmarks, Wan2.2 outperforms existing open-source and closed-source video generation models by a clear margin.
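Wan2.2's MoE design splits the denoising trajectory between two experts: a high-noise expert handles early steps (overall layout), and a low-noise expert handles later steps (fine detail), so only one expert is active at any step and inference compute stays that of a single expert despite the larger total parameter count. The toy sketch below illustrates only this timestep-gated routing idea; all names, shapes, and the 0.5 threshold are illustrative assumptions, not Wan2.2's actual implementation.

```python
import numpy as np

def make_expert(seed):
    # Stand-in "expert": a fixed random linear map (illustrative only;
    # in Wan2.2 each expert is a full ~14B-parameter diffusion transformer).
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((4, 4)) * 0.1
    return lambda x: x @ w

high_noise_expert = make_expert(0)  # early, high-noise steps (global layout)
low_noise_expert = make_expert(1)   # late, low-noise steps (fine detail)
BOUNDARY = 0.5  # illustrative threshold; Wan2.2 derives its switch point from noise level

def denoise_step(x, t):
    # Route to exactly one expert by normalized timestep t in [0, 1],
    # where t near 1 means high noise (early in sampling). Only one
    # expert runs per step, so active compute equals one expert's cost.
    expert = high_noise_expert if t >= BOUNDARY else low_noise_expert
    return x - expert(x)  # toy update; the real model predicts noise/velocity

x = np.ones((1, 4))  # toy "latent"
for t in np.linspace(1.0, 0.0, 10):
    x = denoise_step(x, t)
print(x.shape)  # (1, 4)
```

The design choice worth noting is that, unlike token-routed MoE layers in language models, this routing depends only on the denoising timestep, so expert selection is deterministic and adds no routing overhead per token.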

This open-source release not only makes the technology broadly accessible but also injects fresh vitality into the open ecosystem of AI applications. In the short term, open-sourcing Wan2.2 will attract developers to jointly build tool chains and accelerate industry innovation; in the long term, combined with growing global demand for AI compute and the digitalization of the film and television industry, it will help move the industry from tool empowerment toward ecosystem co-construction.

Wan2.2: An Open Advanced Large-Scale Video Generation Model is now available in the Tutorials section of the HyperAI website (hyper.ai). Come experience being the cameraman for cinematic-quality videos!

Tutorial Link:

https://go.hyper.ai/JMQSN

Demo Run

1. On the hyper.ai homepage, select the Tutorials page, choose Wan2.2: An Open Advanced Large-Scale Video Generation Model, and click Run this tutorial online.

2. After the page jumps, click "Clone" in the upper right corner to clone the tutorial into your own container.

3. Select NVIDIA RTX A6000 48GB as the compute resource and PyTorch as the image, choose Pay-As-You-Go or a Daily/Weekly/Monthly plan as needed, and click Continue. New users can register via the invitation link below to receive 4 hours of free RTX 4090 time and 5 hours of free CPU time!

HyperAI exclusive invitation link (copy and open in browser):

https://openbayes.com/console/signup?r=Ada0322_NR0n

4. Wait for resources to be allocated. The first clone takes about 3 minutes. When the status changes to "Running," click the arrow next to "API Address" to open the Demo page. Note that users must complete real-name verification before the API address can be used.

Effect Demonstration

Text-to-Video Generation

On the Demo page, enter a prompt and click "Generate Video."

Prompt: A cinematic shot of a boat sailing on a calm sea at sunset.

Image-to-Video Generation

On the Demo page, upload an image, enter a prompt, and then click "Generate Video."

Prompt: Take off the cat's glasses with both hands.

That concludes this tutorial recommendation from HyperAI. Everyone is welcome to try it out!

Tutorial Link:

https://go.hyper.ai/JMQSN