Traditional data center architecture that integrates storage and computing faces many challenges. Learn how storage-compute decoupling offers an effective solution.
By Shu Jiwu, Fellow, Director of Information Storage Technology Committee, China Computer Federation; Professor, Tsinghua University; Dean, School of Information, Xiamen University
The continuous advancement of digitalization drives progress in IT infrastructure, including computing and storage. The cloud and connectivity industries have built the largest IT infrastructure platform in China, storing and processing the majority of data across all industries. It is estimated that by 2025, China will have 300 EFLOPS of computing power, while the country's data volume will reach 48.6 ZB[1]. China's ongoing Eastern Data and Western Computing project sets ever higher requirements for data centers to be green, intensive, and independent.
Traditional big data storage solutions that integrate storage and computing are represented by server-based, hyper-converged systems that centrally manage server resources. However, a lack of alignment between storage and computing requirements causes problems such as inflexible scaling and low resource utilization. Storage-compute decoupling separates storage and compute resources into independent modules, which offers significant advantages for the efficient sharing of storage resources. This solution has been applied in numerous scenarios, strengthening storage systems in terms of data sharing and flexible scaling.
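The scaling problem described above can be illustrated with a minimal capacity model. The sketch below uses hypothetical per-server figures (not from the article): in a coupled architecture, every server bundles a fixed ratio of compute to storage, so a storage-heavy workload forces overprovisioning of compute; decoupled pools scale each dimension independently.

```python
# Hypothetical capacity model contrasting coupled servers with decoupled
# storage/compute pools. All numbers are illustrative assumptions.
import math

# Each coupled (hyper-converged) server bundles a fixed resource ratio.
SERVER_COMPUTE = 32   # vCPUs per server (assumed)
SERVER_STORAGE = 20   # TB per server (assumed)

def coupled_servers(compute_need, storage_need):
    """Servers required when compute and storage must scale together."""
    return max(math.ceil(compute_need / SERVER_COMPUTE),
               math.ceil(storage_need / SERVER_STORAGE))

def coupled_utilization(compute_need, storage_need):
    """Fraction of provisioned compute and storage actually used."""
    n = coupled_servers(compute_need, storage_need)
    return (compute_need / (n * SERVER_COMPUTE),
            storage_need / (n * SERVER_STORAGE))

# A storage-heavy workload: modest compute demand, large data volume.
compute_need, storage_need = 64, 400   # 64 vCPUs, 400 TB

n = coupled_servers(compute_need, storage_need)
cpu_util, disk_util = coupled_utilization(compute_need, storage_need)
print(f"coupled: {n} servers, CPU util {cpu_util:.0%}, disk util {disk_util:.0%}")
# → coupled: 20 servers, CPU util 10%, disk util 100%

# With decoupled pools, each dimension is sized independently.
compute_nodes = math.ceil(compute_need / SERVER_COMPUTE)
storage_nodes = math.ceil(storage_need / SERVER_STORAGE)
print(f"decoupled: {compute_nodes} compute + {storage_nodes} storage nodes")
# → decoupled: 2 compute + 20 storage nodes
```

Under these assumed ratios, the coupled deployment leaves 90% of its compute idle just to satisfy the storage requirement, which is the utilization gap that decoupling is designed to close.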