DV Hardware - bringing you the hottest news about processors, graphics cards, Intel, AMD, NVIDIA, hardware and technology!

September 26, 2017 

Welcome to DV Hardware
Got news? : news@dvhardware.net

New Reviews:
Ewin Racing Champion gaming chair
Zowie FK mouse
BitFenix Ronin case

Latest news on DV Hardware - Older stories
Intel teases the Loihi neuromorphic AI test chip September 25, 2017 - 20:20
AMD Matisse and Picasso CPUs to arrive in 2019, Vega 20 in Q3 2018? September 25, 2017 - 18:36
Gigabyte showcases its Z370 AORUS motherboards September 25, 2017 - 18:15
Early Core i7-8700K benchmarks show 42% performance boost in multi-threaded software September 25, 2017 - 15:34
Biostar offers Plug-and-Mine solution for new cryptocurrency miners September 25, 2017 - 13:55
Kurzweil: Technology is not going to steal your job September 25, 2017 - 13:25
Intel Project Alloy headset did not meet performance targets September 25, 2017 - 11:37
Imagination Technologies gets bought out by Chinese investor for $745 million September 25, 2017 - 11:26
Palmer Luckey has a new virtual reality startup September 25, 2017 - 11:11
Mining demand for NVIDIA GPUs still very strong September 25, 2017 - 10:44
PNY no longer giving warranty on GeForce cards? September 25, 2017 - 10:25
Intel 8th Gen Core ups the core count, ships on October 5 September 25, 2017 - 09:57
AMD dumps its CrossFire brand September 22, 2017 - 22:55
Intel Project Alloy mixed reality headset gets cancelled due to low interest September 22, 2017 - 22:29
Gigabyte and MSI have little or no plans for custom-design Radeon RX Vega cards September 22, 2017 - 21:25
G.Skill adds Trident Z RGB kits with AMD Ryzen support September 22, 2017 - 14:58
First sighting of ASUS ROG STRIX Radeon RX Vega 64 September 22, 2017 - 14:51
Analysts weigh in on Globalfoundries tech day September 22, 2017 - 12:26
AMD Vega 11 GPUs could launch in time for holiday season September 22, 2017 - 12:05
MSI teases triple-fan GeForce GTX 1080 Ti Gaming X TRIO September 22, 2017 - 11:54

The Mailbox - reviews and news from other tech sites
Koolance VID-NX1080 GPU Water Block September 25, 2017 - 22:19
Intel Core i9-7980XE 18-Core Processor September 25, 2017 - 15:40
Intel Core i9-7980XE 18-Core Processor September 25, 2017 - 14:00
In Win 101 Computer Case September 25, 2017 - 12:52
Intel Core i9 7960X Linux Benchmarks September 25, 2017 - 11:24
Intel Core i9 7980XE & Core i9 7960X September 25, 2017 - 09:31
Intel Core i9-7980XE Extreme Edition – 18 cores of overclocked CPU madness September 25, 2017 - 09:24
Intel Core i9-7980XE and i9-7960X CPU September 25, 2017 - 09:24
Intel i9-7980XE and i9-7960X - Core-X Again September 25, 2017 - 09:24
Intel Core i9-7980XE And Core i9-7960X : Intel Attacks AMD Threadripper September 25, 2017 - 09:24
Reality Distortion Field: 10 Things Apple Won't Directly Say But We'll Infer Abo September 25, 2017 - 09:24
Guardians of the Galaxy Vol. 2 4K Blu-ray September 24, 2017 - 18:50
Scythe Mugen 5 SCMG-5000 CPU Cooler September 24, 2017 - 18:01
Jabra Elite Sport True Wireless Earbuds September 24, 2017 - 18:00
ASUS Tinker Board Is An Interesting ARM SBC For About $60 USD September 24, 2017 - 18:00
Seasonic FOCUS Plus 850 Gold 850W Power Supply September 23, 2017 - 09:13
Vertagear SL5000 Gaming Chair September 22, 2017 - 23:02
Areca ARC-8050T3 12-Bay Thunderbolt 3 RAID DAS September 22, 2017 - 20:13
Project Build: Carmine - Part 3 – Custom all the things September 22, 2017 - 19:41
    Zotac GTX 1080 Ti Mini – the world's smallest! September 22, 2017 - 16:24

Posted on Monday, September 25 2017 @ 20:20:42 CEST by Thomas De Maesschalck
Intel is banging the artificial intelligence drums again as the company teases Loihi, a new neuromorphic AI test chip that mimics how the brain functions by learning to operate based on various modes of feedback from the environment. The chip giant says this 14nm chip is up to 1,000 times more energy efficient than the general-purpose computing hardware required for typical neural network training systems, though it's unclear whether the comparison is against CPUs or GPUs. Intel says it will share the Loihi test chip with universities and research institutions in the first half of 2018.

Intel Loihi

Here's some more info straight from Intel:
As part of an effort within Intel Labs, Intel has developed a first-of-its-kind self-learning neuromorphic chip – the Loihi test chip – that mimics how the brain functions by learning to operate based on various modes of feedback from the environment. This extremely energy-efficient chip, which uses the data to learn and make inferences, gets smarter over time and does not need to be trained in the traditional way. It takes a novel approach to computing via asynchronous spiking.

We believe AI is in its infancy and more architectures and methods – like Loihi – will continue emerging that raise the bar for AI. Neuromorphic computing draws inspiration from our current understanding of the brain's architecture and its associated computations. The brain's neural networks relay information with pulses or spikes, modulate the synaptic strengths or weight of the interconnections based on timing of these spikes, and store these changes locally at the interconnections. Intelligent behaviors emerge from the cooperative and competitive interactions between multiple regions within the brain's neural networks and its environment.
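
The spike-and-plasticity mechanism described above can be sketched in a few lines of code. This is purely an illustrative toy (a leaky integrate-and-fire neuron with a crude spike-timing-dependent plasticity rule), not Intel's implementation; all constants here are made up for demonstration.

```python
def simulate(pre_spike_times, steps=100):
    """Toy leaky integrate-and-fire neuron with one plastic input synapse.

    The synaptic weight is adjusted online by a crude STDP rule: input
    spikes that help trigger an output spike strengthen the synapse,
    while input spikes arriving after an output spike weaken it.
    """
    v = 0.0                 # membrane potential
    w = 0.5                 # synaptic weight, learned during operation
    threshold = 1.0         # potential at which the neuron fires
    leak = 0.9              # per-step membrane leak factor
    a_plus, a_minus = 0.05, 0.06  # potentiation / depression step sizes
    last_post = None
    post_spike_times = []
    for t in range(steps):
        v *= leak                       # passive decay toward rest
        if t in pre_spike_times:
            v += w                      # input spike injects current
            if last_post is not None:   # pre after post: depress
                w = max(0.0, w - a_minus)
        if v >= threshold:              # neuron fires and resets
            v = 0.0
            post_spike_times.append(t)
            last_post = t
            w = min(1.0, w + a_plus)    # pre before post: potentiate
    return post_spike_times, w
```

Driving the neuron with a steady train of input spikes shows the two sides of the rule competing: the neuron fires repeatedly while the weight stays bounded in [0, 1], and learning happens during operation rather than in a separate training phase.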

Machine learning models such as deep learning have made tremendous recent advancements by using extensive training datasets to recognize objects and events. However, unless their training sets have specifically accounted for a particular element, situation or circumstance, these machine learning systems do not generalize well.

The potential benefits from self-learning chips are limitless. One example provides a person's heartbeat reading under various conditions – after jogging, following a meal or before going to bed – to a neuromorphic-based system that parses the data to determine a "normal" heartbeat. The system can then continuously monitor incoming heart data in order to flag patterns that do not match the "normal" pattern. The system could be personalized for any user.

This type of logic could also be applied to other use cases, like cybersecurity where an abnormality or difference in data streams could identify a breach or a hack since the system has learned the "normal" under various contexts.
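
Both use cases boil down to "learn what normal looks like, then flag deviations." Here's a minimal statistical sketch of that idea (a simple z-score baseline of my own, not a neuromorphic model; the readings are made-up resting heart rates):

```python
from statistics import mean, stdev

def learn_normal(readings):
    """Summarize baseline readings as (mean, sample standard deviation)."""
    return mean(readings), stdev(readings)

def is_anomalous(reading, model, z=3.0):
    """Flag a reading more than z standard deviations away from 'normal'."""
    mu, sigma = model
    return abs(reading - mu) > z * sigma

# Made-up resting heart rates (bpm) used to learn the "normal" pattern
baseline = [62, 64, 61, 63, 65, 60, 62, 64]
model = learn_normal(baseline)
```

A reading of 140 bpm would be flagged while 63 bpm would not; a real system would of course condition on context (after jogging, following a meal, before bed), as the article describes.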

Introducing the Intel Loihi test chip
The Intel Loihi research test chip includes digital circuits that mimic the brain's basic mechanics, making machine learning faster and more efficient while requiring lower compute power. Neuromorphic chip models draw inspiration from how neurons communicate and learn, using spikes and plastic synapses that can be modulated based on timing. This could help computers self-organize and make decisions based on patterns and associations.

The Intel Loihi test chip offers highly flexible on-chip learning and combines training and inference on a single chip. This allows machines to be autonomous and to adapt in real time instead of waiting for the next update from the cloud. Researchers have demonstrated learning at a rate that is a 1 million times improvement compared with other typical spiking neural nets as measured by total operations to achieve a given accuracy when solving MNIST digit recognition problems. Compared to technologies such as convolutional neural networks and deep learning neural networks, the Intel Loihi test chip uses many fewer resources on the same task.

The self-learning capabilities prototyped by this test chip have enormous potential to improve automotive and industrial applications as well as personal robotics – any application that would benefit from autonomous operation and continuous learning in an unstructured environment. For example, recognizing the movement of a car or bike.

Further, it is up to 1,000 times more energy-efficient than general purpose computing required for typical training systems.

In the first half of 2018, the Intel Loihi test chip will be shared with leading university and research institutions with a focus on advancing AI.

The Loihi test chip's features include:
  • Fully asynchronous neuromorphic many-core mesh that supports a wide range of sparse, hierarchical and recurrent neural network topologies, with each neuron capable of communicating with thousands of other neurons.
  • Each neuromorphic core includes a learning engine that can be programmed to adapt network parameters during operation, supporting supervised, unsupervised, reinforcement and other learning paradigms.
  • Fabrication on Intel's 14 nm process technology.
  • A total of 130,000 neurons and 130 million synapses.
  • Development and testing of several algorithms with high algorithmic efficiency for problems including path planning, constraint satisfaction, sparse coding, dictionary learning, and dynamic pattern learning and adaptation.

    What's next?
    Spurred by advances in computing and algorithmic innovation, the transformative power of AI is expected to impact society on a spectacular scale. Today, we at Intel are applying our strength in driving Moore's Law and manufacturing leadership to bring to market a broad range of products — Intel® Xeon® processors, Intel® Nervana™ technology, Intel Movidius™ technology and Intel FPGAs — that address the unique requirements of AI workloads from the edge to the data center and cloud.

    Both general purpose compute and custom hardware and software come into play at all scales. The Intel® Xeon Phi™ processor, widely used in scientific computing, has generated some of the world's biggest models to interpret large-scale scientific problems, and the Movidius Neural Compute Stick is an example of a 1-watt deployment of previously trained models.

    As AI workloads grow more diverse and complex, they will test the limits of today's dominant compute architectures and precipitate new disruptive approaches. Looking to the future, Intel believes that neuromorphic computing offers a way to provide exascale performance in a construct inspired by how the brain works.

    I hope you will follow the exciting milestones coming from Intel Labs in the next few months as we bring concepts like neuromorphic computing to the mainstream in order to support the world's economy for the next 50 years. In a future with neuromorphic computing, all of what you can imagine – and more – moves from possibility to reality, as the flow of intelligence and decision-making becomes more fluid and accelerated.

    Intel's vision for developing innovative compute architectures remains steadfast, and we know what the future of compute looks like because we are building it today.


    Posted on Monday, September 25 2017 @ 18:36:48 CEST by Thomas De Maesschalck
    Some new slides have leaked that reportedly show AMD's CPU and GPU roadmap for 2018 and 2019; they were posted by Informatica Cero. This is the sort of news you should take with a grain of salt: on one hand, I'm not sure who would come up with fake codenames like "Matisse" and "Picasso", but on the other hand there do seem to be some errors in the slides (like Polaris-based integrated graphics on Bristol Ridge). So proceed with caution.

    On the CPU front, the slide reveals there will be no major changes this year. Next year we can expect Pinnacle Ridge; it looks like AMD is adopting a tick-tock-like model here. Pinnacle Ridge will be based on the current Summit Ridge architecture but will presumably be made on the new 12LP process from GlobalFoundries.

    One surprise is that Raven Ridge seems to be delayed to 2018. These APUs will combine up to 8 Zen threads with up to 11 Vega-based CUs. Raven Ridge will use Socket AM4 on the desktop and Socket FP5 on the laptop platform.

    The 2019 column reveals AMD will switch to painter-based names. The desktop version of the Zen 2 architecture is codenamed Matisse and the slide reveals this chip will stick with Socket AM4. In the same year, we can also expect the Picasso APU. Picasso will be based on Raven Ridge but will offer higher performance/Watt.

    AMD roadmap CPU leak

    Moving on to the GPU roadmap, we can see that AMD is listing Vega 20 for a launch in Q3 2018. This slide is geared towards compute and datacenter applications, so I'm not sure if it's a good guide for the desktop version. Vega 20 will presumably be a 12nm LP version of Vega 10. The slide also reveals that the Rome-based EPYC processor platform will support PCI Express 4.0.

    AMD roadmap GPU leak

    VideoCardz also reposted a third slide from Informatica Cero that shows some benchmarks from AMD that compare the Ryzen 5 PRO mobile APU with an Intel Core i5 Kaby Lake processor. AMD's figures claim Ryzen 5 PRO Mobile will offer much higher multi-threaded CPU performance and integrated graphics performance than the Kaby Lake chip, while running at a comparable idle power level.

    AMD roadmap Ryzen 5 PRO mobile performance leak

    Posted on Monday, September 25 2017 @ 18:15:06 CEST by Thomas De Maesschalck
    Now that Intel's cat is out of the bag, motherboard makers are free to show off their upcoming Z370 motherboards. Gigabyte is one of the first to announce a model; it sent over details about the Z370 AORUS GAMING 7. The Z370 motherboards and Intel's Coffee Lake-S processors should be available on October 5.
  • Supports 8th Gen Intel® Core™ Processors
  • Dual Channel Non-ECC Unbuffered DDR4, 4 DIMMs
  • Intel® Optane™ Memory Ready
  • ASMedia 3142 USB 3.1 Gen 2 with USB Type-C™ and Type-A
  • Front USB 3.1 Gen 2 Type-C™ Header
  • Multi-Way Graphics Support with Dual Armor and Ultra Durable™ Design
  • 121dB SNR AMP-UP Audio with ALC1220 & High-End ESS SABRE 9018 DAC with WIMA audio capacitors
  • Sound BlasterX 720°, the top-of-the-line audio engine solution for 4K gaming and entertainment
  • Killer™ E2500 Gaming Network + Intel® Gigabit LAN
  • RGB FUSION with Multi-Zone Digital LED Light Show design, support digital LED & RGB LED strips
  • Swappable Overlay for Accent LED
  • Smart Fan 5 features Multiple Temperature Sensors and Hybrid Fan Headers with FAN STOP
  • Triple Ultra-Fast M.2 with PCIe Gen3 x4 interface and Thermal Guard
  • USB DAC-UP 2 with Adjustable Voltage
  • Precise Digital USB Fuse Design for Stronger Protection
  • Anti-Sulfur Resistors Design
  • Ultra Durable™ 25KV ESD and 15KV Surge LAN Protection
  • Lightning-Fast Intel® Thunderbolt™ 3 AIC Support
  • APP Center Including EasyTune™ and Cloud Station™ Utilities
  • Z370 AORUS

    Posted on Monday, September 25 2017 @ 15:34:55 CEST by Thomas De Maesschalck
    Asian tech site Expreview did it again and managed to get its hands on a sample of the Intel Core i7-8700K processor to publish an early review well before the October 5 NDA expiry. The site compared the 8700K with the 7700K and also did some tests with both processors clocked at 4.5GHz to measure if there's any IPC gain.

    Single-threaded performance is somewhat of a mixed bag; it's faster in some test cases, but in other scenarios the difference is minimal or even negative. With both chips clocked at the same frequency, there do seem to be some IPC gains, as Coffee Lake delivers more per-clock performance than Kaby Lake. But there may still be some bugs, as the tests were performed on an engineering sample rather than the final retail product.

    In multi-threaded benchmarks the 8700K really shines. The new model has two more cores than its predecessor, which helps a lot in achieving a 42% performance boost in multi-threaded apps. There are no temperature or power draw measurements yet, so it's still too early to draw firm conclusions.
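
    As a quick back-of-the-envelope check (my own arithmetic, not from the review): going from the 7700K's 4 cores to the 8700K's 6 gives at most a 50% gain for perfectly parallel code at equal clocks, and Amdahl's law shows how even a small serial fraction pulls that down toward the measured ~42%.

```python
def speedup(cores, parallel_fraction):
    """Amdahl's law: overall speedup versus a single core."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

def gain_6_vs_4(parallel_fraction):
    """Relative gain of a 6-core chip over a 4-core chip at equal clocks."""
    return speedup(6, parallel_fraction) / speedup(4, parallel_fraction) - 1.0
```

    With a fully parallel workload, gain_6_vs_4(1.0) is exactly 0.5 (+50%); with roughly 95% of the work parallelizable it drops to about +38%, in the same ballpark as the benchmark numbers.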

    Multithreaded performance of Core i7 8700K

    H/T: VideoCardz

    Posted on Monday, September 25 2017 @ 13:55:47 CEST by Thomas De Maesschalck
    The overall computer motherboard market is doing quite poorly, so it's easy to understand why Biostar is banging the cryptocurrency drums so hard. The company claims its motherboards are found in one out of every five mining systems around the world and just released a new "Plug-and-Mine" solution to make Ethereum mining easier than ever. The company partnered with ethOS to create a USB flash drive that comes preloaded with ethOS; when you plug this into one of the Biostar motherboards, you're ready to mine some cryptocoins.

    BIOSTAR continues to lead the way in Crypto Mining with an estimated installation of 1 in every 5 mining systems around the world; thanks to its comprehensive line-up of Intel and AMD dedicated mining motherboards, price-performance positioning and ease-of-use. BIOSTAR now makes it even easier for miners by partnering with ethOS to create a total-solution with its Crypto Mining series motherboards and the ethOS USB flash drive preloaded with ethOS system. With the Plug-and-Mine solution, BIOSTAR brings miners an easy-to-follow, simple to set-up package, welcoming more people to join the crypto mining community.

    BIOSTAR’s Price-Performance Mining Setup with ethOS
    The ethOS Mining System is one of the most straight-forward operating systems for mining ZCash, Monero, Ethereum and other GPU-minable coins. The setup is simple Plug-and-Mine, without needing a SSD. When paired with BIOSTAR mining boards, building a heavy yet stable mining rig of up to 12 GPUs can be done with ease. These have been tried and tested by both BIOSTAR and ethOS. Watch the video installation guide for easy setup: https://www.youtube.com/watch?v=LWHgeMbsuvc

    BIOSTAR Crypto Mining Motherboards
    BIOSTAR offers a wide array of crypto mining-oriented motherboards powered by the Intel B250, B85, H81, H110, as well as AMD B350 and A320 chipsets offering great stability, ease of use and high return on investment. These motherboards all support heavy mining setups using 6 to 12 GPUs. BIOSTAR’s current crypto mining line-up includes Intel based BIOSTAR TB250-BTC PRO, BIOSTAR TB250-BTC+, BIOSTAR TB250-BTC, BIOSTAR TB85, BIOSTAR H81A and BIOSTAR H110M-BTC and AMD based TB350-BTC and TA320-BTC.

    BIOSTAR Blockchain Technology Lab
    BIOSTAR is an expert in crypto mining and runs its own ‘Blockchain Technology Lab’ to ensure the best stability and performance from its crypto-mining series motherboards. All BIOSTAR crypto mining motherboards have been tested and verified in this lab, along with the ethOS plug-and-mine solution. BIOSTAR’s Blockchain Technology Lab has the ultimate crypto mining setup with its un-manned operating configuration (UOC), excellent cooling design, and automated reboot system.

    BIOSTAR continues to make mining easier through its cooperation with ethOS by providing a USB flash drive with a pre-loaded OS, making it Plug-and-Mine. BIOSTAR has also proven its motherboards and systems are well tested for superb stability with the Blockchain Technology Lab.

    BIOSTAR’s Recommended Mining Solution:
  • ethOS USB (or BIOSTAR S120 SSD)
  • BIOSTAR TB250-BTC PRO
  • BIOSTAR RX 470 Graphics Cards (Expandable up to 12 Graphics Cards)
  • Dual 800W Power Supply
  • 16GB DDR4 DRAM

    How to install ethOS
    With the ethOS USB, miners can set up in just a few simple steps:
  • Step 1: Set up your crypto mining rig before installation (no need to enable CSM Support for NVIDIA graphics cards; enable CSM Support for AMD graphics cards)
  • Step 2: Enter the ethOS mining system automatically
  • Step 3: Input the specific command to turn off the remote config
  • Step 4: Set up your wallet & mining pools
  • Step 5: Reboot the system to make sure all the settings are completed

    With the ethos USB flash drive, users can be the first to experience the latest ethOS 1.2.5 updates, which has been thoroughly tested to ensure compatibility, stability and graphics card overclocking support when using with BIOSTAR’s crypto mining product lines. This version also includes BIOSTAR themes for all BIOSTAR motherboards. http://ethosdistro.com/changelog/


    Posted on Monday, September 25 2017 @ 13:25:28 CEST by Thomas De Maesschalck
    These days a lot of people are concerned that the ongoing trends in the field of automation and artificial intelligence will make a lot of current jobs redundant. Humanity has been through this process in the past: before the industrial revolution, farmers accounted for about 90 percent of the labor force, but these days farmers make up a small single-digit figure. Machines have made a lot of farmers redundant, and nowadays a lot of people are sitting in offices all day long doing God knows what.

    The big worry this time is that there may be no new jobs to replace the ones that are eliminated by new technology. For example, we can look at self-driving car technology and predict that sometime in our lifetime, this technology will put millions of truck drivers, cab drivers and delivery men out of work. Famous futurist Ray Kurzweil, who joined Google in 2012, offers some more nuance and believes AI will do far more good than harm.

    Kurzweil argues that for every job that gets eliminated, more jobs will be created at the top of the skill ladder. But he admits the situation creates a difficult political issue because as of right now nobody knows what those new jobs will be, because they don't exist yet:
    How will artificial intelligence and other technologies impact jobs?
    We have already eliminated all jobs several times in human history. How many jobs circa 1900 exist today? If I were a prescient futurist in 1900, I would say, “Okay, 38% of you work on farms; 25% of you work in factories. That’s two-thirds of the population. I predict that by the year 2015, that will be 2% on farms and 9% in factories.” And everybody would go, “Oh, my God, we’re going to be out of work.” I would say, “Well, don’t worry, for every job we eliminate, we’re going to create more jobs at the top of the skill ladder.” And people would say, “What new jobs?” And I’d say, “Well, I don’t know. We haven’t invented them yet.”

    That continues to be the case, and it creates a difficult political issue because you can look at people driving cars and trucks, and you can be pretty confident those jobs will go away. And you can’t describe the new jobs, because they’re in industries and concepts that don’t exist yet.
    You can read more about his views at Fortune.


    Posted on Monday, September 25 2017 @ 11:37:24 CEST by Thomas De Maesschalck
    A couple of days ago we reported about the cancellation of Project Alloy, Intel's standalone mixed reality headset. Now we have a bit more information about why the project got shelved as Intel admitted to PC World that processing power was one of the major hurdles.

    The chip giant wanted to put a low-power Atom processor in Project Alloy but eventually realized this did not result in an optimal mix of price and performance. Basically, the performance of Project Alloy was underwhelming compared with traditional tethered headsets.

    Project Alloy failed to gain traction with Intel's partners and now the company is focusing its VR efforts on WiGig. This wireless technology promises to make it possible to link a high-end desktop computer to an untethered VR headset:
    Instead, “we realized that this isn’t necessarily the optimum form factor,” Pallister said. In examining Alloy, Pallister said, the company didn’t believe it had the right mix of price and performance. In the meantime, separate projects to build a wireless WiGig link into a VR headset—a partnership struck with HTC — felt like they would prove to be more successful.

    “We said that the best way to deliver a high-performance PC experience is to wirelessly talk to a high-performance PC plugged into a wall outlet and do it that way,” Pallister said.
    Project Alloy

    Posted on Monday, September 25 2017 @ 11:26:59 CEST by Thomas De Maesschalck
    Imagination Technologies has had a rocky year as the company's confession that Apple would dump its PowerVR intellectual property resulted in a huge hit to its future business prospects. Apple was Imagination's biggest customer so the loss of this contract was devastating.

    This summer the company said it was exploring a sale and now news rolls in that Imagination Technologies has been sold to a Chinese-backed investment firm. The bid represents a 41.6 percent premium versus Imagination's share price on Friday.

    Canyon Bridge is taking over most of Imagination Technologies' assets for 550 million pounds, about $744 million. The investment group was also looking to buy US chipmaker Lattice Semiconductor for $1.3 billion, but that buyout was shot down by the Trump administration over national security concerns.

    One part of Imagination that will not be sold to Canyon Bridge is the MIPS division. It's not known why this unit is being sold separately, but Imagination said Tallwood Venture Capital offered $65 million for MIPS.

    At the moment, it's not clear whether Tallwood simply offered a better price or whether the deal was explicitly structured this way to ease regulatory approval. Either way, it's no secret that China wants to get its hands on more MIPS assets, so a lot of questions will be raised about this.

    Posted on Monday, September 25 2017 @ 11:11:14 CEST by Thomas De Maesschalck
    Oculus founder Palmer Luckey was forced out of his company about half a year ago, but now the 25-year-old entrepreneur is back in the spotlight with a new virtual reality startup.

    Luckey appeared on the HTC Vive stage during the Tokyo Game Show 2017 conference and hinted he's working on some very exciting things. He refused to talk about specifics but told the audience to think of him as a VR person, and not just as an Oculus person.

    Road to VR speculates Palmer may still be experimenting with the idea of VR having significant consequences:
    Back in a May interview, Luckey offered a hint into what his next project might entail; he spoke of his interest in the anime Sword Art Online, in which characters become trapped in a VR MMORPG—if they die in the game, they die in real life. Luckey said that he liked the idea of VR having significant consequences.

    “This concept of [significant consequences] is part of one of the projects I am working on. But I won’t talk about any details,” he said, and added that it was still “very early” for the project.
    Obviously, he's not trying to make a game where the outcome for losing would be death. But he does want something that hits a "sweet spot" between current games and Sword Art Online.

    Posted on Monday, September 25 2017 @ 10:44:50 CEST by Thomas De Maesschalck
    Mizuho analyst Vijay Rakesh talked to a bunch of suppliers in Asia-Pacific and got the sense that demand for NVIDIA's GPUs is still a lot higher than expected. Sales this quarter are higher than analyst estimates as cryptocurrency demand is driving huge demand for NVIDIA's GeForce GTX 1060 and GTX 1070 video cards.

    The high demand has pushed up the pricing of NVIDIA's GPUs, but Rakesh warns demand may cool next quarter. There are two problems on the horizon: first, the bans on crypto may lower demand from Ethereum miners, and second, the analyst expects the DRAM shortage will also affect GPU shipments:
    Our checks with the leading GPU and motherboard OEMs indicate SepQ GPU card trends are very strong, with card shipments coming in ~30-50% ahead of flat q/q expectations on strength from cryptocurrency mining. Cryptocurrency demand is driving strength in NVDA's GTX 1060/1070 cards. The GPU/motherboard OEMs also noted GPU pricing was up ~25% in the last six months. As we have noted prior (Link - Pricing), the strong pricing and unit trends point to strong OctQ upside to NVDA's muted cryptocurrency expectations. The OEMs also noted zero inventory of GPUs in the channel and constrained short DRAM supply and pricing also affecting GPU shipments. Coming off a very strong SepQ, there are also expectations in the supply chain that DecQ GPU sales could be muted in pricing and demand on recent cryptocurrency bans and DRAM shortages.


    DV Hardware - Privacy statement
    All logos and trademarks are property of their respective owner.
    The comments are property of their posters, all the rest © 2002-2017 DM Media Group bvba