Alibaba Cloud has revealed a modular datacenter architecture it claims will help it satisfy demand for AI infrastructure by improving performance and shortening build times for new facilities.
Announced at its annual Apsara conference yesterday, the "CUBE DC 5.0" architecture was described as using "prefabricated modular designs" plus "advanced and proprietary technologies such as wind-liquid hybrid cooling system, all-direct current power distribution architecture and smart management system."
Alibaba Cloud hasn't explained those techs in depth, but claims the modular approach reduces deployment times by up to 50 percent compared to traditional datacenter building techniques.
The Register has asked Alibaba to explain the workings of a "wind-liquid hybrid cooling system." For what it's worth, machine-translating the term into Mandarin and feeding it into search engines produced results describing cold plate cooling – a technique in which thin reservoirs of cooled liquid are placed on hardware, with heat removed by circulating the liquid and/or blowing air across the plates.
Whatever the term describes, Alibaba Cloud Intelligence CEO Eddie Wu told the conference his company "is investing heavily in building an AI infrastructure for the future."
"These enhancements are not just about keeping up with AI demands but about setting a global standard for efficiency and sustainability."
Other steps towards that goal include a scheduler said to better manage hardware resources so they achieve utilization rates of up to 90 percent.
Alibaba Cloud's IaaS offering, the Enterprise Elastic Compute Service (ECS), has reached its ninth generation. Conference attendees were told it is better equipped for AI applications, having improved recommendation engine speeds by 30 percent and database read/write queries per second by 17 percent.
Also at the conference, Alibaba Cloud announced an "Open Lake data utility" that integrates multiple big data engines so they can be used by generative AI applications. Another new offering, "DMS: OneMeta+OneOps," apparently combines and manages metadata from 40 different data sources.
It's 2024, so Alibaba Cloud also announced some AI news: the release of its Qwen 2.5 multimodal models, available in sizes from 0.5 to 72 billion parameters, supporting 29 languages and tuned for the needs of sectors including automotive and gaming. The new models are said to have "enhanced knowledge [and] stronger capabilities in math and coding."
A text-to-video AI model that works with both Chinese and English prompts, Tongyi Wanxiang, was also released.
"The new model is capable of generating high-quality videos in a wide variety of visual styles from realistic scenes to 3D animation," boasted Alibaba Cloud execs.