SK hynix's HBM3E DRAM for AI applications can process more than 230 Full-HD movies of 5GB each in a second!

SK hynix announced today that it has successfully developed HBM3E, the next generation of the highest-specification DRAM for AI applications currently available, and that customers' evaluation of samples is underway. HBM is a high-value, high-performance memory that vertically interconnects multiple DRAM chips, enabling a dramatic increase in data processing speed compared with earlier DRAM products. HBM3E is the extended version of HBM3 and the fifth generation of its kind, succeeding HBM, HBM2, HBM2E and HBM3. In terms of speed, HBM3E can process data at up to 1.15 terabytes (TB) a second, which is equivalent to processing more than 230 Full-HD movies of 5GB each in a second.
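The movies-per-second figure follows directly from the stated bandwidth. A minimal back-of-the-envelope check, assuming the decimal convention (1 TB = 1,000 GB) that bandwidth figures typically use:

```python
# Sanity check of the headline claim: 1.15 TB/s vs. 5GB Full-HD movies.
# Assumption: decimal units, i.e. 1 TB = 1,000 GB.
bandwidth_tb_per_s = 1.15            # HBM3E peak data rate, TB/s
movie_size_gb = 5                    # size of one Full-HD movie, GB

movies_per_second = bandwidth_tb_per_s * 1000 / movie_size_gb
print(movies_per_second)             # roughly 230 movies per second
```

This matches the "more than 230 movies in a second" claim, since 1,150 GB/s divided by 5 GB per movie gives 230.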


HBM3E DRAM not only meets the industry's highest speed standards, the key specification for AI memory products, but also leads in all other categories, including capacity, heat dissipation and user-friendliness. It delivers a 10% improvement in heat dissipation by adopting the cutting-edge Advanced Mass Reflow Molded Underfill (MR-MUF) technology. It also provides backward compatibility, allowing the latest product to be adopted even in systems designed for HBM3 without any design or structural modification.


The successful development of HBM3E, the extended version of HBM3 that delivers the world's best specifications, builds on SK hynix's experience as the industry's sole mass provider of HBM3. Having supplied the industry's largest volume of HBM products and achieved mass-production readiness, SK hynix plans to mass-produce HBM3E from the first half of next year and solidify its unrivaled leadership in the AI memory market.