What’s New: Intel today announced that Baidu is architecting the in-memory database of its Feed Stream services to harness the high-capacity and high-performance capabilities of Intel® Optane™ DC persistent memory. Paired with 2nd Gen Intel® Xeon® Scalable processors, the new memory platform built on Intel Optane DC persistent memory allows Baidu to lower its total cost of ownership (TCO) while delivering more personalized search results to users. Intel and Baidu disclosed details of this deployment and other joint collaborations on Thursday at the 2019 Baidu ABC Summit in Beijing.
“For over 10 years, Intel and Baidu have worked closely together to accelerate Baidu’s core businesses, from search to AI to autonomous driving to cloud services. Our deep collaboration enables us to rapidly deploy the latest Intel technologies and improve the way customers experience Baidu’s services.”
–Jason Grebe, Intel corporate vice president and general manager of the Cloud Platforms and Technology Group
Why It’s Important: As companies like Baidu manage explosive data growth, the ability to store and access data quickly and efficiently is imperative. With today’s news, Baidu is advancing its Feed Stream services to deliver more personalized content to its customers.
Why It’s Different: Baidu uses an advanced in-memory database called Feed-Cube to support data storage and information retrieval in its cloud-based Feed Stream services. Deploying Intel Optane DC persistent memory and 2nd Gen Intel Xeon Scalable processors enables Baidu to ensure high concurrency, large capacity and high performance for Feed-Cube while reducing TCO1.
Through close collaboration, Intel and Baidu architected a hybrid memory configuration that combines Intel Optane DC persistent memory and DRAM within the Baidu Feed Stream services. With this approach, Feed-Cube delivered faster search result response times under heavy concurrent access1. At the same time, single-server DRAM usage dropped by more than half, cutting costs across Feed-Cube’s petabyte-level storage capacity1. Intel and Baidu have published a detailed case study of this work, including examples of other applications that use Intel Optane DC persistent memory technology, such as Redis, Spark and function-as-a-service workloads.
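For illustration only, the sketch below shows one common way an application can implement this kind of hybrid configuration: using the open-source memkind library to keep a small, latency-sensitive structure in DRAM while placing a large cache in persistent memory exposed in App Direct mode as a DAX-mounted filesystem. The mount path /mnt/pmem, the pool size and the DRAM/persistent-memory split are illustrative assumptions, not details of Feed-Cube.

```c
/*
 * Minimal sketch (not Baidu's Feed-Cube code): splitting allocations between
 * DRAM and Intel Optane DC persistent memory in App Direct mode using the
 * open-source memkind library. Assumes persistent memory is exposed as a
 * DAX-mounted filesystem at /mnt/pmem (path is illustrative).
 * Build: gcc hybrid_alloc.c -lmemkind
 */
#include <memkind.h>
#include <stdio.h>
#include <string.h>

#define PMEM_POOL_MAX (1024ULL * 1024 * 1024)   /* 1 GiB pool cap for the example */
#define CACHE_BYTES   (64ULL * 1024 * 1024)     /* size of the pmem-backed cache  */

int main(void) {
    memkind_t pmem_kind = NULL;

    /* Create a persistent-memory kind backed by the DAX filesystem. */
    int err = memkind_create_pmem("/mnt/pmem", PMEM_POOL_MAX, &pmem_kind);
    if (err) {
        fprintf(stderr, "memkind_create_pmem failed: %d\n", err);
        return 1;
    }

    /* Small, latency-critical index stays in DRAM (the default kind). */
    size_t *hot_index = memkind_malloc(MEMKIND_DEFAULT, 1024 * sizeof(size_t));

    /* Large cached record block goes to the high-capacity persistent-memory tier. */
    char *record_cache = memkind_malloc(pmem_kind, CACHE_BYTES);

    if (!hot_index || !record_cache) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    hot_index[0] = 42;                      /* touch the DRAM-backed index   */
    memset(record_cache, 0, CACHE_BYTES);   /* touch the pmem-backed buffer  */
    printf("DRAM index at %p, pmem cache at %p\n",
           (void *)hot_index, (void *)record_cache);

    memkind_free(MEMKIND_DEFAULT, hot_index);
    memkind_free(pmem_kind, record_cache);
    memkind_destroy_kind(pmem_kind);
    return 0;
}
```

Optane DC persistent memory can also be configured in Memory Mode, where it appears to the operating system as ordinary volatile memory and requires no application changes; App Direct mode, as sketched above, gives software explicit control over which data lands in which tier.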
“Using Intel Optane DC persistent memory within the Feed-Cube database enables Baidu to cost-effectively scale memory capacity to stay on top of the continuously expanding demands placed on our Feed Stream services,” said Tao Wang, chief architect, Recommendation Technology Architecture at Baidu.
What’s Next: Today’s news comes on the heels of Intel and Baidu signing a new memorandum of understanding (MoU) aimed at deepening collaboration between the two companies in Baidu’s core business areas. Baidu and Intel will continue to work together to enable new products and technologies that play an increasingly important role in Baidu’s growing core internet businesses, as well as in critical applications and services. The deeper collaboration will help Baidu provide a more diverse and engaging user experience to its customers.
What Else Was Disclosed at the 2019 Baidu ABC Summit:
- New HPC Solution for Baidu ABC Storage: Intel and Baidu unveiled a new storage solution to accelerate machine learning performance in high-performance computing workloads. The new HPC solution is offered in Baidu’s cloud environment and provides users with end-to-end HPC support covering a range of capabilities, from data pre-processing, model training and evaluation to inferencing and result publishing. The Baidu Cloud ABC Storage service uses Intel® Optane™ DC SSDs and Intel® QLC 3D NAND SSDs.
- Confidential Computing Consortium: Intel and Baidu recently joined the Confidential Computing Consortium under the Linux Foundation. As part of the consortium, Intel and Baidu will work with industry partners to deploy and scale private trusted computing services in the Baidu cloud based on Intel® Software Guard Extensions (Intel® SGX) technology.