Archive for the ‘NVIDIA’ Category

D3D raytracing no longer exclusive to 2080, as Nvidia brings it to GeForce 10, 16

March 19th, 2019
A screenshot of Metro Exodus with raytracing enabled. (credit: Nvidia)

Microsoft announced DirectX Raytracing a year ago, promising to bring hardware-accelerated raytraced graphics to PC gaming. In August, Nvidia announced its RTX 2080 and 2080 Ti, a pair of new video cards built around the company's new Turing RTX processors. In addition to the regular graphics-processing hardware, these new chips include two additional sets of cores: one designed for running machine-learning algorithms, the other for computing raytraced graphics. These cards were the first, and so far the only, cards to support DirectX Raytracing (DXR).

That's going to change in April, as Nvidia has announced that 10-series and 16-series cards will get some degree of raytracing support with next month's driver update. Specifically, that means 10-series cards built with Pascal chips (the GTX 1060 6GB or higher), Titan-branded cards with Pascal or Volta chips (the Titan X, Xp, and V), and 16-series cards with Turing chips (Turing, in contrast to Turing RTX, lacks the extra cores for raytracing and machine learning).

Unsurprisingly, the performance of these cards will not match that of the RTX chips. The RTX chips use both their raytracing cores and their machine-learning cores for DXR graphics: to achieve a suitable level of performance, they simulate relatively few light rays and use machine-learning-based antialiasing to flesh out the raytraced images. Absent that dedicated hardware, DXR on the GTX chips will run as 32-bit integer operations on the same CUDA cores already used for compute and shader workloads.
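
For game code, the fallback is invisible: a title asks Direct3D 12 whether raytracing is available, and with the new driver that same query can begin succeeding on GTX hardware. Below is a minimal sketch of the standard DXR capability check (error handling trimmed for brevity):

```cpp
// Minimal sketch: querying DirectX Raytracing (DXR) support at runtime.
// The same check covers both dedicated RT hardware and the new driver
// fallback, since both report a raytracing tier through D3D12.
// Link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No Direct3D 12 device available.\n");
        return 1;
    }

    // DXR support is reported via the OPTIONS5 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    const bool dxrSupported =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::printf(dxrSupported ? "DXR tier 1.0+ available.\n"
                             : "DXR not supported on this device/driver.\n");
    return 0;
}
```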

Read 4 remaining paragraphs | Comments

Posted in 2080, direct3d, DirectX, dxr, Gaming & Culture, NVIDIA, raytracing, rtx, rtx 2080, rtx2080 ti, Tech, Windows | Comments (0)

Nvidia CEO warns of “extraordinary, unusually turbulent, disappointing” Q4

January 28th, 2019
Nvidia's RTX 2080 Ti launched in September 2018 at an MSRP of $1,199. Nvidia's latest financial estimates indicate that this thing didn't exactly fly off store shelves. (credit: Nvidia)

On Monday, Nvidia took the unusual step of offering a revised fiscal Q4 2019 financial estimate ahead of its scheduled disclosure on February 14. The reason: Nvidia had already predicted low revenue numbers, and the hardware producer is now confident that even that low estimate was too high.

The original quarterly revenue estimate of $2.7 billion has since dropped to $2.2 billion, a cut of roughly 19 percent. A few new data points factor into that revision. The biggest consumer-facing issue, according to Nvidia, is "lower than expected" sales of its new RTX line of graphics cards. This series, full of proprietary technologies like dedicated raytracing hardware, kicked off in September 2018 with the $1,199 RTX 2080 Ti and the $799 RTX 2080.

"These products deliver a revolutionary leap in performance and innovation with real-time raytracing and AI, but some customers may have delayed their purchase while waiting for lower price points and further demonstrations of RTX technology in actual games," Nvidia said in a statement. As of press time, only one retail game, Battlefield V, has tapped into the RTX-only raytracing system.

Read 4 remaining paragraphs | Comments

Posted in Gaming & Culture, geforce rtx, NVIDIA | Comments (0)

Alienware’s new Area-51m brings a new design, RTX GPUs, and 9th-gen Intel chips

January 8th, 2019

Valentina Palladino

CES 2019 is officially underway, which means the time has come for every gaming PC maker under the sun to introduce new hardware. Dell and its Alienware subsidiary are no exception, and on Tuesday, the latter announced a new flagship gaming laptop called the Area-51m.

The 17-inch notebook will be available on January 29 starting at $2,549. That's expensive, but the Area-51m appears to pack the kind of high-end power you'd expect for that price. Alienware says it will include Intel's 9th-generation desktop CPUs and what the company calls the "full-fat" versions of Nvidia's GeForce RTX GPUs. The entry-level model comes with a six-core Core i7-8700 chip and an RTX 2060 with 6GB of GDDR6 video memory, but those can be upgraded to an eight-core i9-9900K and an RTX 2080 with 8GB of GDDR6 if desired.

Read 15 remaining paragraphs | Comments

Posted in Alienware, Area-51m, CES, dell, gaming laptops, Laptops, NVIDIA, Tech | Comments (0)

These 12 FreeSync monitors will soon be compatible with Nvidia’s G-Sync

January 7th, 2019
An Nvidia graphic that depicts what G-Sync technology is like, I guess...

In addition to the entirely expected news about the upcoming RTX 2060, Nvidia's CES presentation this weekend included a surprise about its G-Sync display standard. That tear-eliminating, input-lag-smoothing technology will soon work with select monitors designed for VESA's competing DisplayPort Adaptive-Sync protocol (which is used in AMD's FreeSync monitors).
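
From an application's side, G-Sync and Adaptive-Sync monitors look alike on Windows: a game checks whether the display stack supports tearing-capable presentation (which variable refresh requires) and creates its swap chain accordingly. Here is a minimal sketch of that generic check, using standard DXGI rather than any Nvidia-specific API:

```cpp
// Minimal sketch: asking DXGI whether tearing-capable presentation is
// supported, a prerequisite for variable refresh rate (VRR) output on
// G-Sync and Adaptive-Sync/FreeSync displays. Link against dxgi.lib.
#include <dxgi1_5.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory5> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
    {
        std::printf("IDXGIFactory5 unavailable on this system.\n");
        return 1;
    }

    BOOL allowTearing = FALSE;
    const bool vrrCapable =
        SUCCEEDED(factory->CheckFeatureSupport(
            DXGI_FEATURE_PRESENT_ALLOW_TEARING,
            &allowTearing, sizeof(allowTearing))) &&
        allowTearing;

    // A VRR-aware game would then create its swap chain with
    // DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING and pass
    // DXGI_PRESENT_ALLOW_TEARING when presenting.
    std::printf(vrrCapable ? "Tearing/VRR presentation supported.\n"
                           : "Tearing/VRR presentation not supported.\n");
    return 0;
}
```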

In announcing the move, Nvidia said the gaming experience on these variable refresh rate (VRR) monitors "can vary widely," so the company has gone to the trouble of testing 400 different Adaptive-Sync monitors to see which ones are worthy of being certified as "G-Sync Compatible."

Of the 400 monitors tested, Nvidia says only 12 have met its standards so far. They are:

Read 5 remaining paragraphs | Comments

Posted in AMD, freesync, Gaming & Culture, gsync, NVIDIA | Comments (0)

Our favorite (and least favorite) tech of 2018

December 26th, 2018
The Apple Watch Series 4 on a wooden table. (credit: Valentina Palladino)

Farewell, 2018. You brought us Facebook scandal after Facebook scandal, vastly more devices with Alexa and Google Assistant than anyone needs, a nosedive for net neutrality, endless political and regulatory challenges for Chinese smartphone makers, and oh so many notch-equipped smartphones.

We're ready for about two months of accidentally writing "2018" every time we're supposed to write 2019 in our first drafts—the adjustment always takes a while. And since our minds aren't quite out of 2018 yet, let's take this opportunity to look back on the year—specifically, our favorite and least-favorite products from the year.

Every member of the Ars Technica reviews team—Ron Amadeo, Peter Bright, Jeff Dunn, Valentina Palladino, and Samuel Axon—chimed in with personal picks and a little bit of explanation for why we picked what we did.

Read 44 remaining paragraphs | Comments

Posted in A12X, apple, Facebook, Facebook Portal, Features, google, Nokia, NVIDIA, Steam, Tech, wear OS | Comments (0)

Final Fantasy 15 on PC: Has Square Enix lost its way, or do graphics really matter?

August 25th, 2017


In a tech demo that debuted at Nvidia’s GPU Technology Conference in May, famed Japanese developer Square Enix recreated a cinema-quality, computer-generated character inside a video game. Nyx Ulric, voiced by Aaron Paul in the CGI film Kingsglaive: Final Fantasy XV, had previously been confined to the silver screen, where the complexity of producing detailed computer graphics is offloaded to vast farms of computers one frame at a time (each taking hours to render), before 24 of them are pieced together to create a single second of film.

With top-of-the-line PC hardware from Nvidia (the server-grade Tesla V100, no less), Square Enix pulled character models and textures from the film and displayed them in real time using Luminous Studio Pro, the same engine that powers Final Fantasy XV on the Xbox One, PlayStation 4, and—with the upcoming release of Final Fantasy XV: Windows Edition in 2018—PC. Like any good tech demo, the Kingsglaive demo is as impressive as it is impractical, featuring authentic modelling of hair, skin, leather, fur, and lighting that no PC or console on the market today can display (at least in 4K).

The Xbox One X, Microsoft’s “most powerful console in the world,” sports around six teraflops of processing power (FP32, for those technically inclined) to push graphics at 4K resolution—four times as many pixels as a typical HD television. The Kingsglaive tech demo requires over 12 teraflops of processing power, more than is found in Nvidia’s $1000/£1000 Titan Xp graphics card.
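
As a back-of-the-envelope illustration of what those figures mean per pixel (the 60fps target here is an assumption for the sake of the arithmetic, not a figure from the demo):

```cpp
// Back-of-the-envelope arithmetic for the comparison above.
#include <cstdio>

int main()
{
    const double hdPixels  = 1920.0 * 1080.0;  // "typical HD television"
    const double uhdPixels = 3840.0 * 2160.0;  // 4K UHD

    // 4K pushes four times the pixels of 1080p.
    std::printf("4K / HD pixel ratio: %.0fx\n", uhdPixels / hdPixels);

    // FLOPs available per pixel per frame, assuming a 60fps target.
    const double fps      = 60.0;
    const double xboxOneX = 6.0e12;   // ~6 TFLOPS FP32
    const double techDemo = 12.0e12;  // >12 TFLOPS required by the demo

    std::printf("Xbox One X: ~%.0f FLOPs per 4K pixel per frame\n",
                xboxOneX / (uhdPixels * fps));
    std::printf("Tech demo:  ~%.0f FLOPs per 4K pixel per frame\n",
                techDemo / (uhdPixels * fps));
    return 0;
}
```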

Read 18 remaining paragraphs | Comments

Posted in AMD, Final Fantasy VII Remake, final fantasy XV, Gaming & Culture, NVIDIA, PC gaming, Square Enix | Comments (0)

Final Fantasy 15: Windows Edition coming to PC in 2018

August 21st, 2017

Final Fantasy XV is coming to PC in the form of Final Fantasy XV: Windows Edition in “early 2018,” Square Enix announced today.

To make up for the delay following the release of the PlayStation 4 and Xbox One versions of the game in November of 2016, Final Fantasy XV: Windows Edition includes all the DLC and updates previously released on console, as well as some PC-exclusive graphical enhancements.

Read 4 remaining paragraphs | Comments

Posted in gamescom 2017, gameworks, Gaming & Culture, NVIDIA, Square Enix | Comments (0)

What kind of gaming rig can run at 16K resolution?

August 3rd, 2017

The consumer gaming world might be in a tizzy about 4K consoles and displays of late, but that resolution standard wasn’t nearly enough for one team of PC tinkerers. The folks over at Linus Tech Tips have posted a very entertaining video showing off a desktop PC build capable of running (some) games at an astounding 16K resolution. That’s 15360×8640, for those counting: over 132 million pixels pushed every frame, 64 times the raw pixel count of a standard 1080p display and 16 times that of a 4K display.
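
The pixel arithmetic is easy to check:

```cpp
// Verifying the 16K pixel counts quoted above.
#include <cstdio>

int main()
{
    const long long k16 = 15360LL * 8640;  // 132,710,400 pixels
    const long long fhd = 1920LL * 1080;   //   2,073,600 pixels (1080p)
    const long long uhd = 3840LL * 2160;   //   8,294,400 pixels (4K)

    std::printf("16K frame: %lld pixels\n", k16);
    std::printf("= %lldx 1080p, %lldx 4K\n", k16 / fhd, k16 / uhd);
    return 0;
}
```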

The key to the build is four Quadro P5000 video cards provided by Nvidia. While each card performs similarly to a consumer-level GTX 1080 (8.9 teraflops, 2560 parallel cores), these are pro-tier cards designed for animators and other high-end graphics work, often used for massive jumbotrons and other multi-display or multi-projector installations.

The primary difference between Quadro and consumer cards is that these come with 16GB of video RAM apiece. Unfortunately, the multi-display Mosaic technology that syncs the images together means the memory is mirrored rather than stacked: the four cards effectively share a single 16GB pool, which is the rig’s most significant bottleneck. All told, the graphics cards alone would cost over $10,000, including a Quadro Sync card that ties them all together to run a single image across 16 displays.
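
A rough sketch of why a mirrored 16GB pool gets tight at this resolution (the 4-byte RGBA8 format and the buffer count are illustrative assumptions, not details from the video):

```cpp
// Rough sketch: VRAM consumed by bare 16K framebuffers alone.
// Assumes 4 bytes per pixel (RGBA8); real pipelines add G-buffers,
// textures, and other render targets on top of this.
#include <cstdio>

int main()
{
    const double pixels        = 15360.0 * 8640.0;  // one 16K frame
    const double bytesPerPixel = 4.0;               // RGBA8 assumption
    const double frameMiB = pixels * bytesPerPixel / (1024.0 * 1024.0);

    std::printf("One 16K color buffer: ~%.0f MiB\n", frameMiB);  // ~506 MiB
    std::printf("Two color buffers + 32-bit depth: ~%.1f GiB of the "
                "16GB pool\n", 3.0 * frameMiB / 1024.0);         // ~1.5 GiB
    return 0;
}
```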

Read 5 remaining paragraphs | Comments

Posted in 16K, gaming, Gaming & Culture, NVIDIA, Resolution | Comments (0)

Nvidia and Remedy use neural networks for eerily good facial animation

August 1st, 2017


Remedy, the developer behind the likes of Alan Wake and Quantum Break, has teamed up with GPU-maker Nvidia to streamline one of the more costly parts of modern games development: motion capture and animation. As showcased at Siggraph, by using a deep learning neural network—run on Nvidia’s costly eight-GPU DGX-1 server, naturally—Remedy was able to feed in videos of actors performing lines, from which the network generated surprisingly sophisticated 3D facial animation. This, according to Remedy and Nvidia, removes the hours of “labour-intensive data conversion and touch-ups” that are typically associated with traditional motion capture animation.

Aside from cost, facial animation, even when motion captured, rarely reaches the same level of fidelity as other animation. That odd, lifeless look seen in even the biggest blockbuster games often comes down to the limits of facial animation. Nvidia and Remedy believe their neural network solution is capable of producing results as good as, if not better than, those produced by traditional techniques. It’s even possible to skip the video altogether and feed the neural network a mere audio clip, from which it can produce an animation based on prior results.

The neural network is first fed a “high-end production facial capture pipeline based on multi-view stereo tracking and artist-enhanced animations,” which essentially means feeding it information on prior animations Remedy has created. The network is said to require only five to 10 minutes of footage before it’s able to produce animations based on simple monocular video capture of actors. Compared to results from state-of-the-art monocular and real-time facial capture techniques, the fully automated neural network produces eerily good results, with far less input required from animators.

Read 7 remaining paragraphs | Comments

Posted in AI, AMD, deep learning, game development, Gaming & Culture, neural networks, NVIDIA, Tech | Comments (0)

RX Vega 64 and RX Vega 56: AMD will “trade blows” with GTX 1080 for $499

July 31st, 2017


RX Vega—AMD’s long-awaited follow-up to the two-year-old Fury and Fury X high-performance graphics cards—launches on August 14 in two core versions: the $499 Radeon RX Vega 64 and the $399 Radeon RX Vega 56 (UK prices TBC).

A limited edition version of RX Vega 64, which features a slick aluminium shroud, costs $599 as part of a bundle that includes discounts on a FreeSync monitor and an X370 motherboard, as well as free games. A watercooled version of RX Vega 64, dubbed Radeon RX Vega 64 Liquid Cooled Edition, also comes in a similar bundle pack priced at $699.

According to those in attendance at Siggraph, where AMD made its RX Vega announcements, much of the focus was on the value proposition of RX Vega bundles and features like FreeSync, rather than all-out performance. AnandTech has been told Vega 64 will “trade blows” with Nvidia’s GeForce GTX 1080, which launched way back in May 2016. The launch of Vega Frontier Edition (a production-focused graphics card) in June hinted at such levels of performance—RX Vega 64 and RX Vega 56 are based on the same Vega 10 GPU and architecture.

Read 10 remaining paragraphs | Comments

Posted in AMD, Gaming & Culture, GPUs, graphics cards, NVIDIA, PC gaming, RX Vega, Tech, vega | Comments (0)