Archive for the ‘NVIDIA’ Category

Alienware’s new Area-51m brings a new design, RTX GPUs, and 9th-gen Intel chips

January 8th, 2019

Valentina Palladino

CES 2019 is officially underway, which means the time has come for every gaming PC maker under the sun to introduce new hardware. Dell and its Alienware subsidiary are no exception, and on Tuesday, the latter announced a new flagship gaming laptop called the Area-51m.

The 17-inch notebook will be available on January 29 starting at $2,549. That's expensive, but the Area-51m appears to pack the kind of high-end power you'd expect for that price. Alienware says it will include Intel's 9th-generation desktop CPUs and what the company calls the "full-fat" versions of Nvidia's GeForce RTX GPUs. The entry-level model comes with a six-core Core i7-8700 chip and an RTX 2060 with 6GB of GDDR6 video memory, but those can be upgraded to an eight-core Core i9-9900K and an RTX 2080 with 8GB of GDDR6 if desired.

Posted in Alienware, Area-51m, CES, dell, gaming laptops, Laptops, NVIDIA, Tech

These 12 FreeSync monitors will soon be compatible with Nvidia’s G-Sync

January 7th, 2019
An Nvidia graphic that depicts what G-Sync technology is like, I guess...

In addition to the entirely expected news about the upcoming RTX 2060, Nvidia's CES presentation this weekend included a surprise about its G-Sync display standard. That technology, which smooths out screen tearing and input lag, will soon work with select monitors designed for VESA's competing DisplayPort Adaptive-Sync protocol (which is used in AMD's FreeSync monitors).

In announcing the move, Nvidia says the gaming experience on these variable refresh rate (VRR) monitors "can vary widely," so the company has gone to the trouble of testing 400 different Adaptive-Sync monitors to see which ones are worthy of being certified as "G-Sync Compatible."

Of those 400 monitors tested, Nvidia says only 12 have met its standards so far.

Posted in AMD, freesync, Gaming & Culture, gsync, NVIDIA

Our favorite (and least favorite) tech of 2018

December 26th, 2018
The Apple Watch Series 4 on a wooden table. (credit: Valentina Palladino)

Farewell, 2018. You brought us Facebook scandal after Facebook scandal, vastly more devices with Alexa and Google Assistant than anyone needs, a nosedive for net neutrality, endless political and regulatory challenges for Chinese smartphone makers, and oh so many notch-equipped smartphones.

We're ready for about two months of accidentally writing "2018" every time we're supposed to write 2019 in our first drafts—the adjustment always takes a while. And since our minds aren't quite out of 2018 yet, let's take this opportunity to look back on the year—specifically, our favorite and least-favorite products from the year.

Every member of the Ars Technica reviews team—Ron Amadeo, Peter Bright, Jeff Dunn, Valentina Palladino, and Samuel Axon—chimed in with personal picks and a little bit of explanation for why we picked what we did.

Posted in A12X, apple, Facebook, Facebook Portal, Features, google, Nokia, NVIDIA, Steam, Tech, wear OS

Final Fantasy 15 on PC: Has Square Enix lost its way, or do graphics really matter?

August 25th, 2017

In a tech demo, which debuted at Nvidia’s GPU Technology Conference in May, famed Japanese developer Square Enix recreated a cinema-quality, computer-generated character inside a video game. Nyx Ulric, voiced by Aaron Paul in the CGI film Kingsglaive: Final Fantasy XV, had previously been confined to the silver screen, where the complexity of producing detailed computer graphics is offloaded to vast farms of computers one frame at a time (each taking hours to render), before 24 of them are pieced together to create a single second of film.
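
For a sense of what that offline approach costs, here's a rough back-of-the-envelope sketch in Python; the hours-per-frame figure is an assumption for illustration, since the article only says each frame takes "hours":

```python
# Illustrative offline-rendering math: film CG is rendered one frame at a time,
# then played back at 24 frames per second. The hours-per-frame value below is
# a made-up example, not a figure from the article.
hours_per_frame = 6            # assumption for illustration
frames_per_second_of_film = 24

machine_hours = hours_per_frame * frames_per_second_of_film
print(f"One second of film: {machine_hours} machine-hours of rendering")  # 144
```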

With top-of-the-line PC hardware from Nvidia (the server-grade Tesla V100, no less), Square Enix pulled character models and textures from the film and displayed them in real time using Luminous Studio Pro, the same engine that powers Final Fantasy XV on the Xbox One, PlayStation 4, and—with the upcoming release of Final Fantasy XV: Windows Edition in 2018—PC. Like any good tech demo, the Kingsglaive demo is as impressive as it is impractical, featuring authentic modelling of hair, skin, leather, fur, and lighting that no PC or console on the market today can display (at least in 4K).

The Xbox One X, Microsoft’s “most powerful console in the world,” sports around six teraflops of processing power (FP32, for those technically inclined) to push graphics at 4K resolution—four times as many pixels as a typical HD television. The Kingsglaive tech demo requires over 12 teraflops of processing power, more than is found in Nvidia’s $1,000/£1,000 Titan Xp graphics card.
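
To put that "four times" comparison in concrete terms, here is the pixel arithmetic, assuming the standard 1920×1080 and 3840×2160 resolutions (not figures from the article):

```python
# Pixel math behind the "4K is four times HD" comparison.
hd = 1920 * 1080       # Full HD television: 2,073,600 pixels
uhd_4k = 3840 * 2160   # 4K UHD: 8,294,400 pixels

print(f"4K / HD pixel ratio: {uhd_4k // hd}x")  # -> 4x
```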

Posted in AMD, Final Fantasy VII Remake, final fantasy XV, Gaming & Culture, NVIDIA, PC gaming, Square Enix

Final Fantasy 15: Windows Edition coming to PC in 2018

August 21st, 2017

Final Fantasy XV is coming to PC in the form of Final Fantasy XV: Windows Edition in “early 2018,” Square Enix announced today.

To make up for the delay following the release of the PlayStation 4 and Xbox One versions of the game in November of 2016, Final Fantasy XV: Windows Edition includes all the DLC and updates previously released on console, as well as some PC-exclusive graphical enhancements.

Posted in gamescom 2017, gameworks, Gaming & Culture, NVIDIA, Square Enix

What kind of gaming rig can run at 16K resolution?

August 3rd, 2017

The consumer gaming world might be in a tizzy about 4K consoles and displays of late, but that resolution standard wasn’t nearly enough for one team of PC tinkerers. The folks over at Linus Tech Tips have posted a very entertaining video showing off a desktop PC build capable of running (some) games at an astounding 16K resolution. That’s 15360×8640, for those counting: over 132 million pixels pushed every frame, or 64 times the raw pixel count of a standard 1080p display and 16 times that of a 4K display.
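
Those ratios check out with a quick script (using the standard 1080p, 4K, and 16K resolutions):

```python
# Sanity check on the pixel counts quoted above.
res_1080p = 1920 * 1080
res_4k = 3840 * 2160
res_16k = 15360 * 8640

print(f"16K pixels per frame: {res_16k:,}")   # 132,710,400
print(f"vs 1080p: {res_16k // res_1080p}x")   # 64x
print(f"vs 4K:    {res_16k // res_4k}x")      # 16x
```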

The key to the build is four Quadro P5000 video cards provided by Nvidia. While each card performs similarly to a consumer-level GTX 1080 (8.9 teraflops, 2,560 parallel cores), these are pro-tier cards designed for animators and other high-end graphics work, and they are often used for massive jumbotrons and other multi-display or multi-projector installations.

The primary difference between these Quadros and consumer cards is that each comes with 16GB of video RAM. Unfortunately, the multi-display Mosaic technology that syncs the images together mirrors memory rather than pooling it, so that RAM doesn’t stack, which is the rig’s most significant bottleneck. All told, the graphics cards alone would cost over $10,000, including a Quadro Sync card that ties them all together to run a single image across 16 displays.
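
For a rough sense of scale, the snippet below works out the raw size of a single 16K color buffer, assuming 32 bits per pixel (an illustrative figure, not one from the video); because Mosaic mirrors memory rather than pooling it, each card's 16GB has to hold its working data on its own:

```python
# Raw size of a single 32-bit color buffer at 16K resolution.
res_16k = 15360 * 8640   # pixels per frame
bytes_per_pixel = 4      # 32-bit color, illustrative assumption

buffer_gib = res_16k * bytes_per_pixel / 1024**3
print(f"One 16K color buffer: {buffer_gib:.2f} GiB")  # ~0.49 GiB
```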

Posted in 16K, gaming, Gaming & Culture, NVIDIA, Resolution

Nvidia and Remedy use neural networks for eerily good facial animation

August 1st, 2017

Remedy, the developer behind the likes of Alan Wake and Quantum Break, has teamed up with GPU-maker Nvidia to streamline one of the more costly parts of modern games development: motion capture and animation. As showcased at Siggraph, by using a deep learning neural network—run on Nvidia’s costly eight-GPU DGX-1 server, naturally—Remedy was able to feed in videos of actors performing lines, from which the network generated surprisingly sophisticated 3D facial animation. This, according to Remedy and Nvidia, removes the hours of “labour-intensive data conversion and touch-ups” that are typically associated with traditional motion capture animation.

Aside from cost, facial animation, even when motion captured, rarely reaches the same level of fidelity as other animation. That odd, lifeless look seen in even the biggest of blockbuster games is often down to the limits of facial animation. Nvidia and Remedy believe their neural network solution is capable of producing results as good as, if not better than, those produced by traditional techniques. It’s even possible to skip the video altogether and feed the neural network a mere audio clip, from which it’s able to produce an animation based on prior results.

The neural network is first trained on data from a “high-end production facial capture pipeline based on multi-view stereo tracking and artist-enhanced animations,” which essentially means feeding it information on prior animations Remedy has created. The network is said to require only five to 10 minutes of footage before it’s able to produce animations based on simple monocular video capture of actors. Compared to results from state-of-the-art monocular and real-time facial capture techniques, the fully automated neural network produces eerily good results, with far less input required from animators.
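
Neither company has published the model details, but the workflow described (train on high-quality capture data, then predict per-frame facial animation from plain video or audio) amounts to a supervised regression problem. The sketch below is purely illustrative: the window size, feature dimensions, blendshape count, and network shape are assumptions, not Remedy's pipeline.

```python
import torch
import torch.nn as nn

# Illustrative stand-in: regress per-frame facial rig parameters (e.g. blendshape
# weights) from a short window of input features. The features could come from
# video frames or audio; all dimensions here are invented for the example.
WINDOW = 8           # frames of context per prediction (assumption)
FEAT_DIM = 512       # per-frame input feature size (assumption)
N_BLENDSHAPES = 96   # output rig parameters per frame (assumption)

class FaceAnimRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                       # (B, WINDOW, FEAT_DIM) -> (B, WINDOW * FEAT_DIM)
            nn.Linear(WINDOW * FEAT_DIM, 1024),
            nn.ReLU(),
            nn.Linear(1024, 256),
            nn.ReLU(),
            nn.Linear(256, N_BLENDSHAPES),      # predicted blendshape weights
        )

    def forward(self, x):
        return self.net(x)

# Training pairs would come from the existing capture pipeline:
# (window of video/audio features, artist-approved rig parameters).
model = FaceAnimRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

features = torch.randn(32, WINDOW, FEAT_DIM)  # fake batch standing in for capture data
targets = torch.randn(32, N_BLENDSHAPES)

loss = loss_fn(model(features), targets)
loss.backward()
optimizer.step()
```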

Posted in AI, AMD, deep learning, game development, Gaming & Culture, neural networks, NVIDIA, Tech

RX Vega 64 and RX Vega 56: AMD will “trade blows” with GTX 1080 for $499

July 31st, 2017

RX Vega—AMD’s long-awaited follow-up to the two-year-old Fury and Fury X high-performance graphics cards—launches on August 14 in two core versions: the $499 Radeon RX Vega 64 and the $399 Radeon RX Vega 56 (UK prices TBC).

A limited-edition version of RX Vega 64, which features a slick aluminium shroud, costs $599 as part of a bundle that includes discounts on a FreeSync monitor and an X370 motherboard, plus free games. A watercooled version of RX Vega 64, dubbed Radeon RX Vega 64 Liquid Cooled Edition, also comes in a similar bundle priced at $699.

According to those in attendance at Siggraph, where AMD made its RX Vega announcements, much of the focus was on the value proposition of RX Vega bundles and features like FreeSync, rather than all-out performance. AnandTech has been told Vega 64 will “trade blows” with Nvidia’s GeForce GTX 1080, which launched way back in May of 2016. The launch of Vega Frontier Edition (a production-focused graphics card) in June hinted at such levels of performance—RX Vega 64 and RX Vega 56 are based on the same Vega 10 GPU and architecture.

Posted in AMD, Gaming & Culture, GPUs, graphics cards, NVIDIA, PC gaming, RX Vega, Tech, vega

Nvidia and Bosch team up to build an AI supercomputer for your self-driving car

March 15th, 2017

A cutaway image of the Bosch/Nvidia car supercomputer. (credit: Nvidia)

It seems that barely a day goes by without news of a tech company teaming up with the auto industry to advance the art of self-driving vehicles. On Tuesday, it was Nvidia and Bosch’s turn. In an announcement at Bosch Connected World in Berlin, Germany, the two companies revealed that they are collaborating on an onboard computer capable of running the AI necessary for self-driving.

Based on Nvidia’s Drive PX technology—which also powers semi-autonomous Teslas—the Bosch computer will also use Nvidia’s forthcoming “Xavier” AI system-on-chip. Nvidia says that Xavier is capable of 20 trillion operations per second while drawing just 20 watts of power, meaning the Bosch car computer should be smaller and cheaper than Nvidia’s current Drive PX 2 unit.

“We want automated driving to be possible in every situation. As early as the next decade, driverless cars will also be a part of everyday life. Bosch is advancing automated driving on all technological fronts. We aim to assume a leading role in the field of artificial intelligence, too,” Bosch CEO Dr. Volkmar Denner said in a statement.

Posted in Bosch, Cars Technica, NVIDIA, self-driving

The most detailed maps of the world will be for cars, not humans

March 11th, 2017

The weight of the automotive and tech industries is fully behind the move toward self-driving cars. Cars with “limited autonomy”—i.e., the ability to drive themselves under certain conditions (level 3) or within certain geofenced locations (level 4)—should be on our roads within the next five years.

But a completely autonomous vehicle—capable of driving anywhere, any time, with human input limited to telling it just a destination—remains a more distant goal. To make that happen, cars are going to need to know exactly where they are in the world with far greater precision than currently possible with technology like GPS. And that means new maps that are far more accurate than anything you could buy at the next gas station—not that a human would be able to read them anyway.

Posted in autonomous driving, Cars Technica, Civil Maps, HD maps, here, NVIDIA