Posted on Tuesday, July 17 2018 @ 22:27:14 CEST by Thomas De Maesschalck
There's a new specification that will enable future virtual reality headsets to connect with PCs and other devices using a single connector. Called VirtualLink, this new spec is supported by NVIDIA, Oculus, Valve, AMD, and Microsoft. It uses a single USB Type-C connector, instead of the current mess that involves multiple connectors.
VirtualLink is an Alternate Mode of USB Type-C: it connects VR headsets using four high-speed HBR3 DisplayPort lanes and a USB 3.1 data channel, and can deliver up to 27W of power. It's a very welcome spec that not only makes VR headsets more convenient but also significantly increases the available bandwidth (to 32.4Gbps).
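The 32.4Gbps figure follows directly from the DisplayPort spec: HBR3 runs at 8.1Gbps per lane, and VirtualLink carries four lanes. A quick sanity check:

```python
# DisplayPort HBR3 runs at 8.1 Gbit/s per lane; VirtualLink carries
# four such lanes, which is where the 32.4 Gbit/s figure comes from.
HBR3_GBPS_PER_LANE = 8.1
LANES = 4

total_gbps = HBR3_GBPS_PER_LANE * LANES
print(f"{total_gbps:.1f} Gbit/s")  # 32.4 Gbit/s
```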
A new industry consortium led by NVIDIA, Oculus, Valve, AMD, and Microsoft today introduced the VirtualLink™ specification -- an open industry standard that enables next-generation VR headsets to connect with PCs and other devices using a single, high-bandwidth USB Type-C™ connector, instead of a range of cords and connectors.
This new connection, an Alternate Mode of USB-C, simplifies and speeds up VR setup time, avoiding key obstacles to VR adoption. It also brings immersive VR experiences to smaller devices with fewer ports, such as thin and light notebooks.
To fulfill the promise of next-generation VR, headsets will need to deliver increased display resolution and incorporate high-bandwidth cameras for tracking and augmented reality. VirtualLink connects with VR headsets to simultaneously deliver four high-speed HBR3 DisplayPort lanes, which are scalable for future needs; a USB3.1 data channel for supporting high-resolution cameras and sensors; and up to 27 watts of power.
Unlike other connectivity alternatives, VirtualLink is purpose-built for VR. It optimizes for the latency and bandwidth demands that will enable headset and PC makers to deliver the next generation of VR experiences.
The consortium also announced the publication of an advance overview of the VirtualLink specification, available to companies that wish to receive details ahead of the upcoming VirtualLink 1.0 specification.
Posted on Tuesday, July 17 2018 @ 17:45:02 CEST by Thomas De Maesschalck
NZXT rolls out its new E Series Digital power supplies. The firm offers 500W, 650W, and 850W models, all with 80Plus Gold certification. These units are based on the SeaSonic Focus+ Gold platform and offer real-time power monitoring and control. The E Series offers active monitoring for the 12V, 5V, and 3.3V rails and has a fully modular design. They're available immediately in the US for $124.99 (500W), $139.99 (650W), and $149.99 (850W). NZXT offers a 10-year warranty.
NZXT today announces a new line of digital ATX PSUs designed to provide PC builders with real-time power monitoring and control. The E Series lineup includes 500W, 650W, and 850W models, all 80 PLUS Gold certified. For these PSUs, NZXT partnered with Seasonic, one of the most highly-regarded PSU manufacturers, and worked with them to enhance their Focus+ Gold platform and PMBus architecture, adding a powerful Texas Instruments DSP and USB connectivity.
“The PSU is a critical part of your PC. Our goal for creating a smarter PC capable of automatically responding to the demands of the application--whether you are browsing the web, drafting email, or playing your favorite game--includes adding control and monitoring to devices that are integral to the system,” said Johnny Hou, founder and CEO of NZXT. “Our E Series PSUs are another step in completing this vision. You can track the performance of the PSU in real time, monitor temperature and total power-on hours, and even enable multi-rail OCP for additional protection of key components using CAM--our innovative software solution at the heart of our smart PC.”
NZXT E Series PSUs Key Features:
Active monitoring for the 12V, 5V, and 3.3V rails
Total power-on time tracking enabled using the embedded real-time clock (RTC)
Digital multi-rail over-current protection (OCP) for all three 12V outputs, with adjustable thresholds for the 12V CPU (4+4 pin) and GPU (6+2 pin PCIe) connections
Japanese capacitors rated at 105°C
80 PLUS Gold certification
Fully modular design means you use only the cables you need
10-year Warranty and NZXT Service and Support
MONITOR YOUR PERFORMANCE
The E Series uses a powerful digital signal processor (DSP) providing real-time wattage indicators for the three 12V rails, along with total uptime and internal temperature. Compare CPU and GPU power draw to their rated TDP and track historical data using CAM.
In addition to providing high-precision monitoring, the onboard DSP lets you enable independent over-current protection (OCP) for the 12V output to the motherboard, CPU, and GPU, providing even better protection for your expensive components.
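As an illustration of what per-rail OCP amounts to in practice, here is a minimal sketch (not NZXT's actual firmware or CAM code; the rail names and current limits are hypothetical) of comparing each 12V output's current against its own adjustable threshold:

```python
# Illustrative sketch of multi-rail over-current protection: each 12V
# output gets its own (user-adjustable) current limit, and any rail
# exceeding its limit would trip protection. All values are hypothetical.
OCP_LIMITS_AMPS = {"cpu_12v": 30.0, "gpu_12v": 40.0, "mb_12v": 20.0}

def check_ocp(readings_amps):
    """Return the list of rails whose current exceeds its OCP threshold."""
    return [rail for rail, amps in readings_amps.items()
            if amps > OCP_LIMITS_AMPS[rail]]

tripped = check_ocp({"cpu_12v": 12.5, "gpu_12v": 41.2, "mb_12v": 8.0})
print(tripped)  # ['gpu_12v']
```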
RELIABLE AND EFFICIENT
All 80 PLUS Gold certified E Series PSUs are built using Japanese capacitors, rated to 105°C, providing long-term durability and reliability, and are backed by a 10-year limited warranty.
When powering loads under 100W total, the 0dB Mode provides fully silent operation. Using CAM, you can optimize your own fan curve to suit your system or choose between the Silent, Performance, or Fixed preset modes.
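A semi-passive fan policy like the one described can be sketched as follows; only the sub-100W silent threshold comes from NZXT's description, while the ramp values and PSU rating here are hypothetical:

```python
# Illustrative sketch of a semi-passive fan policy: fully stopped below
# 100 W (the "0dB Mode" threshold NZXT describes), then a simple linear
# ramp. The 30-100% duty range and 650 W rating are hypothetical values.
def fan_duty_percent(load_watts, max_watts=650):
    if load_watts < 100:
        return 0  # 0dB Mode: fan off for fully silent operation
    frac = min((load_watts - 100) / (max_watts - 100), 1.0)
    return round(30 + 70 * frac)

print(fan_duty_percent(80))   # 0   (silent)
print(fan_duty_percent(650))  # 100 (full speed at rated load)
```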
ALL THE ESSENTIALS
E Series PSUs support single- or multi-GPU builds and the fully modular design means you use just the connections you need for your build, reducing cable clutter and simplifying system building and expansion.
Posted on Tuesday, July 17 2018 @ 16:55:04 CEST by Thomas De Maesschalck
Freshly leaked Intel roadmaps reveal the chip giant's Z370 chipset is facing early retirement. Sometime this quarter, the Z370 will be replaced by the Z390. Overall, there's not much new; the biggest additions are USB 3.1 Gen 2 support and optional Wireless-AC support.
VideoCardz writes that some motherboard makers say it's basically a renamed Z370, as there's hardly any difference between the two chipsets.
Posted on Tuesday, July 17 2018 @ 16:35:44 CEST by Thomas De Maesschalck
Corsair reveals the Carbide Series SPEC-06 RGB, a new mid-tower case with a tempered glass side panel and a front panel with special RGB LED effects. The case is sold via Corsair's webshop for 84.90EUR ($89.99) and is available in black and white editions. Besides the RGB version, Corsair also offers single-color LED versions (the SPEC-06).
CORSAIR®, a world leader in PC gaming peripherals and enthusiast components, today unveiled the latest addition to its Carbide Series of enthusiast PC cases, the CORSAIR Carbide Series SPEC-06 RGB. A tempered glass side panel and RGB-lit front panel showcase and highlight your build, while the Direct Airflow Path™ design delivers cool air to the system's hottest components and keeps your PC ahead of the curve.
With a beautiful tempered glass side panel, the SPEC-06 RGB allows you to show off your system in style, highlighted by a stunning RGB-lit curve recessed into the distinctive front panel. The integrated RGB front lighting is easily controlled by a built-in three-switch controller that adds dramatic illumination to your system, with seven different colors, two lighting speeds, and five color shifting modes including color wave, heartbeat, rainbow wave and more. Let your PC shine bright with vibrant patterns or stick to solid colors to match your setup.
Beneath its bold design, the SPEC-06 RGB can accommodate a wealth of high-end PC hardware and cooling options. Two included 120mm cooling fans and CORSAIR Direct Airflow Path™ design ensure cool air is directed to the hottest components, with room for up to six 120mm fans and to install even the largest 360mm CORSAIR Hydro Series™ H150i PRO RGB Liquid CPU Cooler. Expansive storage space allows for up to four 2.5in SSDs and two 3.5in HDDs, giving you the storage capacity you need.
The SPEC-06 RGB’s builder-friendly internal layout enables quick and easy installation with intuitively placed rubber grommets and cut-outs that allow even first-time builders to create a great looking system. A plethora of cable tie-downs and a full-length PSU cover facilitate clean cable management, while removable dust filters in the roof, floor and front keep your system clean long after the build is finished.
Posted on Tuesday, July 17 2018 @ 15:09:29 CEST by Thomas De Maesschalck
Jim Keller is one of the more recognizable names in the processor industry. He played a big role in the development of the AMD K8 and Zen designs, as well as Apple's A4 and A5 SoCs. A couple of months ago, the tech community was shocked to learn that Keller took up a position at Intel. He's now six weeks into the job and granted AnandTech an interview, which you can read over here. Here's a little snippet about how he got hired:
Ian Cutress: For a number of my generation, it was a shock to hear that the famous Jim Keller was set and poised to join Intel. Is it like a Formula 1 driver wanting to drive for Ferrari - did you expect that Intel was going to be on the cards at some point in your career? Why join Intel?
Jim Keller: To be honest, I didn't really ever think about it much. In my career, you know, some people have asked me how I planned it, and the honest truth is I've really just kind of worked on the next interesting thing. And so, I've had a bunch, at least to me, of almost random jobs and interesting experiences and they've all been interesting. When this came up (at Intel), Murthy, Brian, Raja kind of pinged me after Raja joined. I talked to them at some length before I joined Tesla, well to BK [Brian Krzanich] a couple of times at least, and then they pinged me again later. They offered me an opportunity that seemed really interesting so, I took it.
Posted on Tuesday, July 17 2018 @ 11:46:09 CEST by Thomas De Maesschalck
There's a belief among a large fraction of gamers that AMD's graphics drivers are (or were) not as stable as they should be. In an effort to combat this perception, AMD hired QA Consultants to compare the stability of its graphics card drivers with those from NVIDIA.
For the test, three consumer and three professional cards from AMD, as well as six comparable models from NVIDIA, underwent 12 days of 24-hour stress testing with CRASH from the Microsoft Windows Hardware Lab Kit (HLK). No real-world games or professional rendering software were tested, so the results should be taken with a grain of salt.
Under the test conditions, QA Consultants found that AMD's video cards were more stable than those from NVIDIA. Surprisingly, consumer cards scored better than workstation parts. AMD GPUs passed 93 percent of tests, whereas NVIDIA products passed 82 percent.
You can check the full whitepaper over here (PDF).
Posted on Tuesday, July 17 2018 @ 11:13:50 CEST by Thomas De Maesschalck
NVIDIA will get rid of the separation between the PC/Mac and Shield versions of its GeForce NOW service. Starting this month, a unified version will be offered. The main implication is that owners of the Android-based Shield console will now get access to the full library of 225 gaming titles. At the moment, GeForce NOW is still in beta and is free to use. Presumably, NVIDIA will roll out a paid version in the future.
Being aimed at computers with keyboards and mice, this version of the cloud-based game streaming service supported a considerably larger library of PC games. To date, the service on PCs/Macs has gained support for 225 titles, significantly higher than the number of titles available for the first-generation service for SHIELD devices.
Starting this month, however, the differentiation between services is coming to an end. NVIDIA is essentially discontinuing the first-generation GeForce NOW service, and in the process is moving SHIELD TV devices to the second-generation service. This will serve to unify GeForce NOW across all platforms, as now PC, Mac, and SHIELD TV will all access the same service.
Posted on Tuesday, July 17 2018 @ 11:03:08 CEST by Thomas De Maesschalck
Looks like you may not need a Z390 motherboard for Intel's eight-core chips after all. We wrote about an MSI BIOS update a couple of days ago and now more beta BIOS releases are popping up. Various Z370 motherboards from ASUS, ASRock, and MSI now have beta BIOS versions with the Intel 06EC microcode revision.
Intel's changelog reveals the 06EC microcode update includes mitigation against newer Spectre variants, and an older revision of the document also mentioned support for Intel Core 9000 series processors. These new processors are expected later this summer.
Posted on Tuesday, July 17 2018 @ 10:53:49 CEST by Thomas De Maesschalck
Samsung announced it has successfully made the world's first 10nm-class 8Gb LPDDR5 memory chips. These new DRAM chips run at 6.4Gbps and consume 30 percent less power than the previous generation. Interestingly, Samsung dropped this announcement even though JEDEC hasn't finalized the LPDDR5 (or even DDR5) specification yet. Initial applications will include the mobile and AI markets.
Samsung Electronics, the world leader in advanced memory technology, today announced that it has successfully developed the industry’s first 10-nanometer (nm) class* 8-gigabit (Gb) LPDDR5 DRAM. Since bringing the first 8Gb LPDDR4 to mass production in 2014, Samsung has been setting the stage to transition to the LPDDR5 standard for use in upcoming 5G and Artificial Intelligence (AI)-powered mobile applications.
The newly-developed 8Gb LPDDR5 is the latest addition to Samsung’s premium DRAM lineup, which includes 10nm-class 16Gb GDDR6 DRAM (in volume production since December 2017) and 16Gb DDR5 DRAM (developed in February).
“This development of 8Gb LPDDR5 represents a major step forward for low-power mobile memory solutions,” said Jinman Han, senior vice president of Memory Product Planning & Application Engineering at Samsung Electronics. “We will continue to expand our next-generation 10nm-class DRAM lineup as we accelerate the move toward greater use of premium memory across the global landscape.”
The 8Gb LPDDR5 boasts a data rate of up to 6,400 megabits per second (Mb/s), which is 1.5 times as fast as the mobile DRAM chips used in current flagship mobile devices (LPDDR4X, 4266Mb/s). With the increased transfer rate, the new LPDDR5 can send 51.2 gigabytes (GB) of data, or approximately 14 full-HD video files (3.7GB each), in a second.
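Samsung's 51.2GB/s figure checks out if you assume a 64-bit DRAM interface (typical for flagship phones; the press release doesn't state the bus width): 6,400Mb/s per pin across 64 pins works out to 51.2GB/s, or roughly 14 of those 3.7GB files per second.

```python
# Sanity check of Samsung's quoted numbers, assuming a 64-bit bus
# (an assumption; the release only gives the per-pin rate).
RATE_MBPS_PER_PIN = 6400
BUS_WIDTH_BITS = 64

throughput_gbs = RATE_MBPS_PER_PIN * BUS_WIDTH_BITS / 8 / 1000
files_per_sec = throughput_gbs / 3.7  # 3.7 GB per full-HD video file
print(f"{throughput_gbs} GB/s, ~{round(files_per_sec)} files/s")  # 51.2 GB/s, ~14 files/s
```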
The 10nm-class LPDDR5 DRAM will be available in two bandwidths – 6,400Mb/s at a 1.1 operating voltage (V) and 5,500Mb/s at 1.05V – making it the most versatile mobile memory solution for next-generation smartphones and automotive systems. This performance advancement has been made possible through several architectural enhancements. By doubling the number of memory “banks” – subdivisions within a DRAM cell – from eight to 16, the new memory can attain a much higher speed while reducing power consumption. The 8Gb LPDDR5 also makes use of a highly advanced, speed-optimized circuit architecture that verifies and ensures the chip’s ultra-high-speed performance.
To maximize power savings, the 10nm-class LPDDR5 has been engineered to lower its voltage in accordance with the operating speed of the corresponding application processor, when in active mode. It also has been configured to avoid overwriting cells with ‘0’ values. In addition, the new LPDDR5 chip will offer a ‘deep sleep mode’, which cuts the power usage to approximately half the ‘idle mode’ of the current LPDDR4X DRAM. Thanks to these low-power features, the 8Gb LPDDR5 DRAM will deliver power consumption reductions of up to 30 percent, maximizing mobile device performance and extending the battery life of smartphones.
Based on its industry-leading bandwidth and power efficiency, the LPDDR5 will be able to power AI and machine learning applications, and will be UHD-compatible for mobile devices worldwide.
Samsung, together with leading global chip vendors, has completed functional testing and validation of a prototype 8GB LPDDR5 DRAM package, which is comprised of eight 8Gb LPDDR5 chips. Leveraging the cutting-edge manufacturing infrastructure at its latest line in Pyeongtaek, Korea, Samsung plans to begin mass production of its next-generation DRAM lineups (LPDDR5, DDR5 and GDDR6) in line with the demands of global customers.
Posted on Tuesday, July 17 2018 @ 10:04:02 CEST by Thomas De Maesschalck
Intel and Micron terminated their NAND flash memory joint-venture in January and now the firms announce they'll do the same thing with their 3D XPoint development partnership. The two chip makers will continue to jointly develop second-generation 3D XPoint, which is expected to be ready in the first half of 2019, but further development will be performed independently. The official line is that Intel and Micron want to optimize the technology for their respective product and business needs:
Micron and Intel today announced an update to their 3D XPoint™ joint development partnership, which has resulted in the development of an entirely new class of non-volatile memory with dramatically lower latency and exponentially greater endurance than NAND memory.
The companies have agreed to complete joint development for the second generation of 3D XPoint technology, which is expected to occur in the first half of 2019. Technology development beyond the second generation of 3D XPoint technology will be pursued independently by the two companies in order to optimize the technology for their respective product and business needs.
The two companies will continue to manufacture memory based on 3D XPoint technology at the Intel-Micron Flash Technologies (IMFT) facility in Lehi, Utah.
"Micron has a strong track record of innovation with 40 years of world-leading expertise in memory technology development, and we will continue driving the next generations of 3D XPoint technology," said Scott DeBoer, executive vice president of Technology Development at Micron. "We are excited about the products that we are developing based on this advanced technology, which will allow our customers to take advantage of unique memory and storage capabilities. By developing 3D XPoint technology independently, Micron can better optimize the technology for our product roadmap while maximizing the benefits for our customers and shareholders."
"Intel has developed a leadership position delivering a broad portfolio of Optane products across client and data center markets with strong support from our customers," said Rob Crooke, senior vice president and general manager of Non-Volatile Memory Solutions Group at Intel Corporation. "Intel Optane's direct connection to the world's most advanced computing platforms is achieving breakthrough results in IT and consumer applications. We intend to build on this momentum and extend our leadership with Optane, which combined with our high-density 3D NAND technology offers the best solutions for today's computing and storage needs."