
Why cryptocurrency mining was not the only culprit for wild GPU prices



Video card prices returned to normal last month, and the end of the GPU shortage is finally in sight. But why did we have shortages at all, are we really off the hook, and what will happen with the next generation of graphics cards expected in the August/September timeframe? I talked to several industry insiders to get a better idea of what happened and what we can expect for the rest of 2018, and perhaps beyond.

The easy scapegoat for the shortages is cryptocurrency miners, and we were as guilty as anyone of pointing the finger of blame in that direction. It's true that miners didn't help the situation, but are they the only cause of the problem? No, of course not.

Another equally easy target is the GPU makers, Nvidia and AMD. "If they would only produce more chips, we wouldn't have to fight for our right to play games!" That's the refrain, but from what I've gathered, it's not fair to blame them either. Sure, AMD and Nvidia could have ordered more wafers months before the shortages hit, given their perfect ability to predict the future, but even that wouldn't have fully met demand or kept prices low.

Similarly, we could try to blame the AIB (add-in board) partners: Asus, EVGA, Gigabyte, MSI, etc. Prices at their stores did go up, but all the cards still sold out. It's also worth noting that, looking back at online marketplaces in January/February, most of the cards being sold were only available from third parties. Some resellers simply bought as many GPUs as they could, marked them up, and made a quick profit. And then somebody else would buy those cards at inflated prices and drive the cost even higher!

It's no exaggeration to say that DRAM helps run our modern cities.

One of the main culprits for all this is actually something else entirely: the DRAM manufacturers. But it's not really fair to blame them, either. In 2015 and 2016, DRAM prices were at their lowest levels, and investing in additional fabs to pump out even more DRAM for an already saturated market was obviously not a particularly good idea. As an example, Samsung had completed only half of one of its newer DRAM fabs, though it's now working to finish the build-out. That takes a lot of time; you can't just flip a switch to turn on a multi-billion dollar investment, so it remains a future goal.

The larger point is that DRAM prices had dropped significantly. Meanwhile, demand for NAND was increasing. NAND isn't DRAM, but the two are often made in the same facilities. Switching between them takes time and can cost a lot of money, which means plans are laid out months or even years in advance. And those plans called for making more NAND and less DRAM (and moving to 3D NAND, but that's another topic). All the pieces were in place for DRAM supply to dip in late 2016 and 2017, and then things went south.

Smartphones began using more DRAM (and NAND), and the upgrade cycle for smartphones is shorter than for PCs. AMD launched Ryzen, and the 2017 CPU battles spurred more PC upgrades than we'd seen in recent years. Cars have become a growing market for DRAM chips: most modern cars probably have 4GB to 8GB of DRAM spread across their various boards, and models with advanced features such as lane departure warning and self-driving technology can more than quadruple that amount! There has also been a massive increase in supercomputing investments that require lots of DRAM (for both the systems and the add-in boards), and at the other end of the scale, millions of tiny IoT devices are being made, each with a small helping of DRAM.

Add it all together, and our beloved graphics cards, which also need plenty of DRAM, were left competing for supply. With more demand than supply, prices had to rise. Even if nothing else had happened, video card prices would have increased in late 2017 and early 2018; factor in the surge in demand from cryptocurrency miners and you have the makings of a perfect storm.

Hurricanes Crypto, DRAM, and GPU are moving in. Image via Wikipedia

You don't have to look far to see how DRAM prices have exploded over the past 18-24 months. I remember buying 16GB DDR4-2400 and DDR4-2666 memory kits for only $50-$60 in mid-2016. The same kits sell today for $170 or more! DDR4 may not be the same as GDDR5, GDDR5X, or HBM2, but it comes out of the same facilities and has to fight for time on the production lines.

What does that mean for graphics cards? Contract pricing for 8GB of GDDR5 was around $40 to $50 when Nvidia's GeForce GTX 1080 and 1070 launched alongside AMD's Radeon RX 480/470. Today, contract prices for the same 8GB of GDDR5 appear to be closer to $100 (give or take, depending on volume). Spot prices have risen from around $60 to $120 or more. And every level of the supply chain wants a piece of the action, so if the base cost of a graphics card goes up by $50, that usually translates into more than $100 at retail; a rough sketch of how that compounding works follows.
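As a minimal illustration of that margin stacking, here's a quick sketch. The per-tier margin percentages are assumptions chosen purely for illustration, not reported supply-chain figures; the point is only that each tier marks up the higher cost it pays, so a component increase compounds on its way to the shelf.

```python
# Minimal sketch of supply-chain margin stacking. The tier margins
# (board partner, distributor, retailer) are illustrative assumptions,
# not reported figures; each tier prices at cost plus its own margin.
def retail_impact(bom_increase: float, margins=(0.20, 0.25, 0.40)) -> float:
    cost = bom_increase
    for margin in margins:
        cost *= 1 + margin  # each tier marks up the higher cost it pays
    return cost

# A $50 jump in GDDR5 cost becomes roughly $105 at retail.
print(f"${retail_impact(50.0):.2f}")
```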

If GDDR5 is in a tough spot, things are even worse for HBM2. I've heard suggestions that 8GB of HBM2 can go for $175 (give or take), and it's already more expensive to use because it requires a silicon interposer. Based on that information, I think it's a safe bet that Vega 56 and Vega 64 won't be returning to their original MSRP targets of $399 and $499 any time soon. That's not good, because while they can keep up with the GTX 1070/1070 Ti/1080 in the performance charts, they can't do so while costing 25 to 50 percent more.
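To put a number on that value gap, here's a small sketch of performance per dollar. The equal-performance assumption and the specific prices (the $499 baseline MSRP versus a roughly 40 percent street-price premium for Vega) are illustrative, not measured figures.

```python
# Performance-per-dollar sketch. Assumes roughly equal performance
# between Vega 64 and the GTX 1080, per the article; prices are
# illustrative, with Vega carrying a ~40% street-price premium.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

gtx_1080 = perf_per_dollar(1.0, 499.0)  # baseline at its $499 MSRP
vega_64 = perf_per_dollar(1.0, 699.0)   # same performance, higher price

# Vega ends up delivering only ~71% of the GTX 1080's value.
print(f"{vega_64 / gtx_1080:.0%}")
```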

Nvidia and AMD haven't officially (meaning, publicly) raised prices on their reference graphics cards, but with Founders Edition models mostly back in stock (aside from the 1080 Ti), it's worth noting that the FE cards typically cost $50 more than the baseline MSRP. And Founders Edition models cut out at least one layer of the supply chain. Unofficially, it sounds like contract prices for the graphics card manufacturers have increased, if only to account for the higher cost of DRAM.

A look into the future of graphics cards. Image via PublicDomainPictures

The DRAM makers (Samsung, SK Hynix, Micron, etc.) are ramping up DRAM production and building new facilities. Why wouldn't they, considering prices are double or more what they once were? But even with increased production, it may be a while before DRAM prices come anywhere near 2016 levels. Whether the DRAM companies have engaged in price fixing and collusion is another factor; there are ongoing investigations, but don't expect their results to bring prices down any time soon. (At least SSD prices have declined, thanks to investments in additional NAND capacity.)

What does all this mean for new graphics cards in 2018? AMD has gone basically radio silent, and apart from a 7nm die shrink of Vega aimed at machine learning applications (where it can sell at prices that make the cost of 8GB of HBM2 a non-factor), I don't expect a major Team Red graphics card launch this year.

Nvidia is another story, with the widely rumored GTX 2080/2070 (or GTX 1180/1170) planned for an August or September launch. (Word is that Nvidia has been seeding contradictory naming material, so no one knows for sure what the final name will be.) Whether it's called the Turing or Ampere architecture (or even Volta), it looks like the new GPUs will be similar to the Volta GV100, minus the Tensor Cores and FP64 support. I've seen speculation that prices will be a notch above the current models, say $499 for the 2070 and $699 for the 2080, but what I've heard from industry insiders is that we're likely to see the 2080 in the $799 to $999 range. Whether it will be fast enough to justify such a price is anyone's guess (probably not, at least not initially).

Regardless, the reason for the price increase over the 10-series parts is the same: higher DRAM costs. Nvidia has to make money on the parts, and if manufacturing and components cost $50 to $100 more than the previous generation, the price could end up $100 to $200 higher. In the meantime, if you happen to have $16 billion lying around, investing in a new DRAM fab sounds like a great plan.

