SK Hynix achieves a critical milestone in next-generation HBM4 chips
On Friday, SK Hynix completed the development of HBM4, its next-generation memory product for ultra-high-performance AI. The South Korean company also established a mass production system for these high-bandwidth memory chips.
According to the company, this high-bandwidth memory vertically interconnects multiple DRAM dies, increasing data processing speed compared to conventional DRAM products. SK Hynix is confident that mass production of HBM4 will help it lead the artificial intelligence industry.
SK Hynix prepares for mass production of HBM4
The company developed the product in response to the recent dramatic increase in demand for AI and data processing, which requires high-bandwidth memory for greater system speed. It also points out that memory energy efficiency has become a key requirement for customers, as the energy consumed to operate data centers has risen significantly.
The semiconductor provider expects that the higher bandwidth and energy efficiency of HBM4 will be the optimal solution to meet customer needs. The production of this new generation involves stacking chips vertically to save space and reduce power consumption, which helps process large volumes of data generated by complex AI applications.
“Completing the development of HBM4 will be a new milestone for the industry. By delivering a product that meets customer needs in performance, energy efficiency, and reliability in a timely manner, the company will meet time to market and maintain a competitive position,” said Joohwan Cho, head of HBM development at SK Hynix.
The South Korean company said its new product delivers the industry's best data processing speed and energy efficiency. According to the announcement, the chip's bandwidth has doubled compared to the previous generation by adopting 2,048 I/O terminals, and its energy efficiency has improved by more than 40%.
SK Hynix also stated that HBM4 can improve AI service performance by up to 69% when deployed. The product aims to address data bottlenecks and reduce energy costs for data centers.
The firm said HBM4 exceeds the standard operating speed of 8 Gbps set by the Joint Electron Device Engineering Council (JEDEC), running at more than 10 Gbps. JEDEC is the global standardization body that develops open standards and publications for the microelectronics industry.
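The arithmetic behind the bandwidth figures above is straightforward: peak per-stack bandwidth is the I/O count times the per-pin data rate, divided by eight to convert bits to bytes. A minimal sketch, using the pin counts and speeds cited in this article purely for illustration (they are not vendor specifications):

```python
def peak_bandwidth_gb_s(io_count: int, pin_speed_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: pins * per-pin Gbit/s / 8 bits per byte."""
    return io_count * pin_speed_gbps / 8

# Doubling the I/O count from 1,024 to 2,048 doubles bandwidth at a fixed pin speed:
prev_gen = peak_bandwidth_gb_s(1024, 8.0)    # 1,024 GB/s
hbm4_jedec = peak_bandwidth_gb_s(2048, 8.0)  # 2,048 GB/s at the 8 Gbps JEDEC baseline
hbm4_fast = peak_bandwidth_gb_s(2048, 10.0)  # 2,560 GB/s at the >10 Gbps cited here
```

Running above the JEDEC baseline thus adds roughly another 25% on top of the doubling from the wider interface.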
SK Hynix also applied its Advanced MR-MUF process to HBM4, which stacks the chips and injects a liquid protective material between them to shield the circuitry and strengthen the stack.
The company stated that this process has proven more reliable and more effective at dissipating heat than the conventional method of placing film-type material between each chip in the stack.
SK Hynix shares soar
Following the HBM4 announcement, SK Hynix's stock price reached an all-time high on Friday, rising as much as 6.60% to 327,500 KRW. The stock has gained approximately 17.5% over the last five days and nearly 22% over the past month.
Kim Sunwoo, senior analyst at Meritz Securities, predicted that the company's HBM market share will remain in the low 60% range in 2026, supported by the early supply of HBM4 to key customers and the resulting first-mover advantage.
SK Hynix supplies the largest amount of HBM semiconductor chips to Nvidia, followed by Samsung Electronics and Micron, which supply smaller volumes. The head of HBM business planning at the company, Choi Joon-yong, projected that the AI memory chip market will grow by 30% annually until 2030.
Choi said end-user demand for AI is very strong and expects the billions of dollars in AI capital expenditure by cloud computing companies to be revised upward in the future.
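A 30% annual growth rate compounds quickly; a minimal sketch of what the projection implies (the starting market size is an arbitrary placeholder, not a figure from the article):

```python
def project_market(size_now: float, annual_growth: float, years: int) -> float:
    """Compound a market size forward at a fixed annual growth rate."""
    return size_now * (1 + annual_growth) ** years

# 30% annual growth over five years multiplies the market roughly 3.7x:
multiplier = project_market(1.0, 0.30, 5)  # 1.3 ** 5 ≈ 3.71
```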