AI energy consumption is projected to double from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026, comparable to Japan's total electricity consumption.

CRAM-based machine learning inference accelerators could achieve energy improvements of up to 1,000 times over traditional methods, with some applications seeing estimated savings of 2,500 and 1,700 times.
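To put those ratios in perspective, here is a back-of-the-envelope calculation in Python (a minimal sketch; the TWh figures and improvement factors come from the article above, while the application labels are purely illustrative):

```python
# Back-of-the-envelope arithmetic for the figures quoted above.
# The TWh values and improvement factors come from the article;
# the application labels below are purely illustrative.

baseline_twh_2022 = 460.0    # estimated AI electricity use, 2022 (TWh)
projected_twh_2026 = 1000.0  # projected AI electricity use, 2026 (TWh)

growth = projected_twh_2026 / baseline_twh_2022
print(f"Projected growth, 2022-2026: {growth:.2f}x")  # ~2.17x

# Reported CRAM improvement factors for inference workloads.
for label, factor in [("typical", 1000), ("application A", 2500),
                      ("application B", 1700)]:
    remaining = 100.0 / factor  # % of baseline energy still consumed
    print(f"{label}: {factor}x improvement -> {remaining:.2f}% of baseline")
```

Even the most conservative of those factors would leave an accelerator consuming a tenth of a percent of the baseline energy for the same inference workload.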
“Our initial concept to use memory cells directly for computing 20 years ago was considered crazy,” said Jian-Ping Wang, senior author of the paper and a Distinguished McKnight Professor at the University of Minnesota.

The interdisciplinary team, comprising experts from physics, materials science, computer science, and engineering, has been developing this technology since 2003.
The research builds on patented work on Magnetic Tunnel Junctions (MTJs), nanostructured devices used in hard drives, sensors, and other microelectronic systems, including Magnetic Random Access Memory (MRAM).
CRAM leverages these advancements to perform computations directly within memory cells, eliminating the slow and energy-intensive data transfers typical of traditional architectures.
Breaking the von Neumann bottleneck

The CRAM architecture overcomes the bottleneck of the traditional von Neumann architecture, in which computation and memory are separate units and data must shuttle between them.
“CRAM is very flexible; computation can be performed in any location in the memory array,” said Ulya Karpuzcu, an Associate Professor and expert on computing architecture.

This flexibility allows CRAM to match the performance needs of various AI algorithms more efficiently than traditional systems.
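The contrast with a conventional load-compute-store pipeline can be sketched in a few lines of Python (a conceptual illustration only; the column-wise majority gate is an assumption drawn from how MTJ logic-in-memory schemes are commonly described, not the team's actual circuit):

```python
# Conceptual contrast between von Neumann-style computing and
# computing inside the memory array. Purely illustrative: the
# column-wise majority gate is an assumption based on published
# MTJ logic-in-memory schemes, not the team's actual circuit.

memory = [
    [1, 0, 1, 1],  # row 0: operand A
    [1, 1, 0, 1],  # row 1: operand B
    [0, 1, 1, 1],  # row 2: operand C
    [0, 0, 0, 0],  # row 3: reserved for the result
]

def von_neumann_and(mem, a_row, b_row, out_row):
    """Move data out of memory, compute in separate logic, write back."""
    a = mem[a_row][:]                             # load: energy-costly transfer
    b = mem[b_row][:]                             # load: another transfer
    mem[out_row] = [x & y for x, y in zip(a, b)]  # store: transfer back

def in_memory_majority(mem, rows, out_row):
    """Evaluate a 3-input majority gate column by column, in place.
    Any rows of the array may serve as operands or output."""
    for col in range(len(mem[0])):
        votes = sum(mem[r][col] for r in rows)
        mem[out_row][col] = 1 if votes >= 2 else 0

in_memory_majority(memory, rows=(0, 1, 2), out_row=3)
print(memory[3])  # [1, 1, 1, 1]: majority vote of each column
```

Here the in-memory routine never moves operands out of the array, and any rows can act as inputs or output, which is the flexibility Karpuzcu describes.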
CRAM uses significantly less energy than current random access memory (RAM) devices, which rely on multiple transistors to store data.
By employing MTJs, a type of spintronic device that uses electron spin rather than electrical charge, CRAM provides a more efficient alternative to traditional transistor-based chips.
The University of Minnesota team is now collaborating with semiconductor industry leaders to scale up demonstrations and produce the hardware necessary to reduce AI energy consumption on a larger scale.
The development of CRAM technology represents a monumental step towards sustainable AI computing.

By dramatically reducing AI energy consumption while maintaining high performance, this innovation promises to meet the growing demands of AI applications and pave the way for a more efficient and environmentally friendly future.