The Future Brought by Nvidia's New AI Chip: The Day "Rubin CPX" Redefines Video Generation and "Long-Context AI" - Swallowing Videos and Code Whole

NVIDIA has announced the "Rubin CPX," a new GPU built on its next-generation "Rubin" architecture. Its goal is to handle "million-token" long-context inference, such as video generation and understanding of large codebases, quickly and efficiently. Rubin CPX integrates video decoding/encoding and long-context inference into a single system, aiming to remove bottlenecks for developers and creators. Availability is expected by the end of 2026. The NVL144 CPX platform is touted as delivering 8 exaFLOPS of AI performance, 100TB of fast memory, and 1.7PB/s of memory bandwidth, along with a new monetization pitch of "$5B in 'token revenue' for every $100M invested." On social media, reactions range from welcoming voices saying "the landscape of video generation will change" to cautious ones arguing that "a 50x ROI sounds exaggerated."
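
The "50x" figure that skeptics mention follows directly from the numbers NVIDIA quoted: $5B in claimed token revenue against a $100M investment. A minimal back-of-envelope sketch of that arithmetic is shown below; the dollar figures are NVIDIA's own marketing claims, not measured results.

```python
# Back-of-envelope check of the ROI multiple implied by NVIDIA's claim.
# Both figures are NVIDIA's stated numbers, not independently verified.

investment_usd = 100e6      # claimed capital investment: $100M
token_revenue_usd = 5e9     # claimed resulting "token revenue": $5B

roi_multiple = token_revenue_usd / investment_usd
print(f"Implied ROI multiple: {roi_multiple:.0f}x")  # -> 50x
```

Whether that multiple holds in practice depends on assumptions (utilization, token pricing, operating costs) that the announcement does not spell out, which is exactly what the cautious reactions are pointing at.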