Archive for the ‘NVIDIA’ Category

Final Fantasy 15 on PC: Has Square Enix lost its way, or do graphics really matter?

August 25th, 2017


In a tech demo that debuted at Nvidia’s GPU Technology Conference in May, famed Japanese developer Square Enix recreated a cinema-quality, computer-generated character inside a video game. Nyx Ulric, voiced by Aaron Paul in the CGI film Kingsglaive: Final Fantasy XV, had previously been confined to the silver screen, where the complexity of producing detailed computer graphics is offloaded to vast farms of computers one frame at a time (each taking hours to render), before 24 of them are pieced together to create a single second of film.

With top-of-the-line PC hardware from Nvidia (the server-grade Tesla V100, no less), Square Enix pulled character models and textures from the film and displayed them in real time using Luminous Studio Pro, the same engine that powers Final Fantasy XV on the Xbox One, PlayStation 4, and—with the upcoming release of Final Fantasy XV: Windows Edition in 2018—PC. Like any good tech demo, Kingsglaive is as impressive as it is impractical, featuring authentic modelling of hair, skin, leather, fur, and lighting that no PC or console on the market today can display (at least in 4K).

The Xbox One X, Microsoft’s “most powerful console in the world,” sports around six teraflops of processing power (FP32, for those technically inclined) to push graphics at 4K resolution—four times as many pixels as a typical HD television. The Kingsglaive tech demo requires over 12 teraflops of processing power, more than is found in Nvidia’s $1000/£1000 Titan Xp graphics card.
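A rough back-of-the-envelope sketch puts those figures in context (the 60fps target below is our assumption, not Square Enix’s):

```python
# Rough sketch: per-pixel compute budget at 4K, given the figures above.
HD_PIXELS = 1920 * 1080      # typical HD television
UHD_PIXELS = 3840 * 2160     # 4K / UHD

print(UHD_PIXELS / HD_PIXELS)          # 4.0 -- four times the pixels of HD

XBOX_ONE_X_TFLOPS = 6.0                # FP32, as quoted above
KINGSGLAIVE_TFLOPS = 12.0              # the demo's stated requirement

# FP32 operations available per pixel per frame at 4K/60fps (assumed target)
budget = XBOX_ONE_X_TFLOPS * 1e12 / (UHD_PIXELS * 60)
print(f"{budget:,.0f} flops per pixel per frame")   # ~12,000
```

Doubling the teraflops, as the demo demands, doubles that per-pixel budget—the kind of headroom that convincing hair, fur, and skin shading eats up.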


Posted in AMD, Final Fantasy VII Remake, final fantasy XV, Gaming & Culture, NVIDIA, PC gaming, Square Enix | Comments (0)

Final Fantasy 15: Windows Edition coming to PC in 2018

August 21st, 2017

Final Fantasy XV is coming to PC in the form of Final Fantasy XV: Windows Edition in “early 2018,” Square Enix announced today.

To make up for arriving more than a year after the PlayStation 4 and Xbox One versions of the game, which launched in November 2016, Final Fantasy XV: Windows Edition includes all the DLC and updates previously released on console, as well as some PC-exclusive graphical enhancements.


Posted in gamescom 2017, gameworks, Gaming & Culture, NVIDIA, Square Enix | Comments (0)

What kind of gaming rig can run at 16K resolution?

August 3rd, 2017

The consumer gaming world might be in a tizzy about 4K consoles and displays of late, but that resolution standard wasn’t nearly enough for one team of PC tinkerers. The folks over at Linus Tech Tips have posted a very entertaining video showing off a desktop PC build capable of running (some) games at an astounding 16K resolution. That’s 15360×8640, for those counting—over 132 million pixels pushed every frame, 64 times the raw pixel count of a standard 1080p display and 16 times that of a 4K display.
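The arithmetic is easy to check:

```python
# Sanity check on the pixel-count claims above.
p_1080p = 1920 * 1080     #   2,073,600 pixels
p_4k    = 3840 * 2160     #   8,294,400 pixels
p_16k   = 15360 * 8640    # 132,710,400 pixels

print(p_16k / p_1080p)    # 64.0 -- 64x a 1080p display
print(p_16k / p_4k)       # 16.0 -- 16x a 4K display
```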

The key to the build is four Quadro P5000 video cards provided by Nvidia. While each card performs similarly to a consumer-level GTX 1080 (8.9 teraflops, 2560 parallel cores), these are pro-tier cards designed for animators and other high-end graphics work, often used for massive jumbotrons and other multi-display or multi-projector installations.

The primary difference between Quadro and consumer cards here is memory: each P5000 comes with 16GB of video RAM, double the GTX 1080’s 8GB. Unfortunately, the Mosaic technology that syncs the image across multiple displays mirrors memory across the cards rather than pooling it, leading to the rig’s most significant bottleneck. All told, the graphics cards alone would cost over $10,000, including a Quadro Sync card that ties them all together to run a single image across 16 displays.
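A quick sketch shows why the mirroring matters at 16K—even the bare framebuffers eat into a single card’s 16GB (the RGBA8 format and double buffering below are illustrative assumptions):

```python
# Why mirrored (non-pooling) memory hurts at 16K: framebuffers alone are huge.
# Assumes 4 bytes per pixel (RGBA8) and double buffering -- illustrative only.
W, H = 15360, 8640
BYTES_PER_PIXEL = 4

framebuffer_mb = W * H * BYTES_PER_PIXEL / 1024**2
print(f"one 16K framebuffer: {framebuffer_mb:,.0f} MB")     # ~506 MB
print(f"double-buffered:     {2 * framebuffer_mb:,.0f} MB") # ~1,013 MB

# With assets mirrored across all four cards, the rig still has only one
# card's 16 GB to work with -- not 64 GB -- for textures and render targets.
```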


Posted in 16K, gaming, Gaming & Culture, NVIDIA, Resolution | Comments (0)

Nvidia and Remedy use neural networks for eerily good facial animation

August 1st, 2017


Remedy, the developer behind the likes of Alan Wake and Quantum Break, has teamed up with GPU-maker Nvidia to streamline one of the more costly parts of modern games development: motion capture and animation. As showcased at Siggraph, by using a deep learning neural network—run on Nvidia’s costly eight-GPU DGX-1 server, naturally—Remedy was able to feed in videos of actors performing lines, from which the network generated surprisingly sophisticated 3D facial animation. This, according to Remedy and Nvidia, removes the hours of “labour-intensive data conversion and touch-ups” that are typically associated with traditional motion capture animation.

Aside from cost, facial animation, even when motion captured, rarely reaches the same level of fidelity as other animation. That odd, lifeless look seen in even the biggest of blockbuster games is often down to the limits of facial animation. Nvidia and Remedy believe their neural network solution is capable of producing results as good as, if not better than, those produced by traditional techniques. It’s even possible to skip the video altogether and feed the neural network a mere audio clip, from which it’s able to produce an animation based on prior results.

The neural network is first trained on data from a “high-end production facial capture pipeline based on multi-view stereo tracking and artist-enhanced animations,” which essentially means feeding it prior animations Remedy has created. The network is said to require only five to 10 minutes of footage before it’s able to produce animations from simple monocular video capture of actors. Compared to results from state-of-the-art monocular and real-time facial capture techniques, the fully automated neural network produces eerily good results, with far less input required from animators.
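Neither company has published the network’s architecture, but the general idea—regressing facial rig parameters from an input signal—can be sketched in a few lines. Everything below (layer sizes, the audio features, the use of blendshape weights) is an illustrative assumption, not Remedy’s pipeline:

```python
# Toy sketch: regress per-frame facial blendshape weights from audio features.
# All dimensions are invented for illustration; this is not Remedy's network.
import torch
import torch.nn as nn

N_AUDIO_FEATURES = 32   # e.g. mel-spectrogram bands per audio frame (assumed)
WINDOW = 16             # audio frames of context per animation frame (assumed)
N_BLENDSHAPES = 50      # facial rig parameters to predict (assumed)

model = nn.Sequential(
    nn.Conv1d(N_AUDIO_FEATURES, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv1d(64, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),                        # -> (batch, 64 * WINDOW)
    nn.Linear(64 * WINDOW, N_BLENDSHAPES),
)

# Training pairs come from an existing capture pipeline: audio windows in,
# artist-approved animation parameters out.
audio = torch.randn(8, N_AUDIO_FEATURES, WINDOW)   # a batch of 8 windows
target = torch.randn(8, N_BLENDSHAPES)             # matching rig parameters

loss = nn.MSELoss()(model(audio), target)
loss.backward()   # a standard optimiser step would follow from here
```

The same shape of model covers the video case: swap the audio features for per-frame image features and the output stays the same.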


Posted in AI, AMD, deep learning, game development, Gaming & Culture, neural networks, NVIDIA, Tech | Comments (0)

RX Vega 64 and RX Vega 56: AMD will “trade blows” with GTX 1080 for $499

July 31st, 2017


RX Vega—AMD’s long-awaited follow-up to the two-year-old Fury and Fury X high-performance graphics cards—launches on August 14 in two core versions: the $499 Radeon RX Vega 64 and the $399 Radeon RX Vega 56 (UK prices TBC).

A limited edition version of RX Vega 64, which features a slick aluminium shroud, costs $599 as part of a bundle that includes discounts on a FreeSync monitor and an X370 motherboard, as well as free games. A water-cooled version of RX Vega 64, dubbed Radeon RX Vega 64 Liquid Cooled Edition, comes in a similar bundle priced at $699.

According to those in attendance at Siggraph, where AMD made its RX Vega announcements, much of the focus was on the value proposition of RX Vega bundles and features like FreeSync, rather than all-out performance. AnandTech has been told Vega 64 will “trade blows” with Nvidia’s GeForce GTX 1080, which launched way back in May 2016. The launch of Vega Frontier Edition (a production-focused graphics card) in June hinted at such levels of performance—RX Vega 64 and RX Vega 56 are based on the same Vega 10 GPU and architecture.


Posted in AMD, Gaming & Culture, GPUs, graphics cards, NVIDIA, PC gaming, RX Vega, Tech, vega | Comments (0)

Nvidia and Bosch team up to build an AI supercomputer for your self-driving car

March 15th, 2017

A cutaway image of the Bosch/Nvidia car supercomputer. (credit: Nvidia)

It seems that barely a day goes by without news of a tech company teaming up with the auto industry to advance the art of self-driving vehicles. On Tuesday, it was Nvidia and Bosch’s turn. In an announcement at Bosch Connected World in Berlin, Germany, the two companies revealed that they are collaborating on an onboard computer capable of running the AI necessary for self-driving.

Based on Nvidia’s Drive PX technology—which also powers semi-autonomous Teslas—the Bosch computer will use Nvidia’s forthcoming “Xavier” AI system-on-chip. Nvidia says that Xavier is capable of 20 trillion operations per second while drawing just 20 watts of power, meaning the Bosch car computer should be smaller and cheaper than Nvidia’s current Drive PX 2 unit.

“We want automated driving to be possible in every situation. As early as the next decade, driverless cars will also be a part of everyday life. Bosch is advancing automated driving on all technological fronts. We aim to assume a leading role in the field of artificial intelligence, too,” Bosch CEO Dr. Volkmar Denner said in a statement.


Posted in Bosch, Cars Technica, NVIDIA, self-driving | Comments (0)

The most detailed maps of the world will be for cars, not humans

March 11th, 2017


The weight of the automotive and tech industries is fully behind the move toward self-driving cars. Cars with “limited autonomy”—i.e., the ability to drive themselves under certain conditions (level 3) or within certain geofenced locations (level 4)—should be on our roads within the next five years.

But a completely autonomous vehicle—capable of driving anywhere, at any time, with human input limited to a destination—remains a more distant goal. To make that happen, cars are going to need to know exactly where they are in the world with far greater precision than is currently possible with technology like GPS. And that means new maps that are far more accurate than anything you could buy at the next gas station—not that a human would be able to read them anyway.


Posted in autonomous driving, Cars Technica, Civil Maps, HD maps, here, NVIDIA | Comments (0)

Taking a ride in Nvidia’s self-driving car

January 7th, 2017


Sitting in the passenger seat of a car affectionately known at Nvidia as “BB8” is an oddly terrifying experience. Between me and the driver’s seat is a centre panel covered in touchscreens detailing readings from the numerous cameras and sensors placed around the car, and a large red button helpfully labelled “stop.”

As BB8 pulls away to take me on a short ride around a dedicated test track on the north side of the Las Vegas convention centre—with no-one in the driver’s seat—it’s hard to resist keeping a hand hovering over that big red button. After all, it’s not every day that you consciously put your life in the hands of a computer.

The steering wheel jerks and turns as BB8 sweeps around a corner at a cool 10 miles per hour, neatly avoiding a set of traffic cones while remaining within the freshly painted white lines of the makeshift circuit. After three smooth laps, two Nvidia employees wheel an obstacle—a large orange panel—into the middle of the track, which BB8 deftly avoids.


Posted in AI, Cars Technica, CES, CES 2017, NVIDIA, self driving cars | Comments (0)

From Paintbox to PC: How London became the home of Hollywood VFX

May 1st, 2016

In a darkened room on the backstreets of London’s red light district, Mike McGee stared at a screen. Surrounded by a thick wall of cigarette smoke and impatient chain-smoking clients, he swiped a pen across the table, his movement replicated with surprising accuracy as a pixel-perfect line on the screen above. The clients—TV producers from the BBC—were impressed. In just a few short minutes, McGee had transformed a single frame of video into the beginnings of a title sequence. In a world where labour-intensive optical effects and manual rotoscoping were the norm, this was a revelation.

For 12 years, McGee worked out of this room, painting onto the screen, his eyes left bloodshot and burning from the smoke as runners dashed in and out to empty overflowing ashtrays. It was a painstaking process; the Quantel Paintbox and its pressure-sensitive stylus were groundbreaking pieces of technology when they were released in 1981, but they had their limitations. The huge 14-inch platter hard drive could store 160MB of data, enough for just over six seconds of video at 25 FPS. Longer pieces required playing out each frame to tape before wiping the hard drive, a risky process that resulted in McGee and his staff working eight-hour shifts around the clock to minimise cockups.
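Those numbers are in the right ballpark. A rough check, assuming PAL standard-definition frames (720×576)—the bytes-per-pixel figure is our guess, not Quantel’s spec—brackets the quoted capacity:

```python
# Back-of-the-envelope check on the Paintbox's ~6-second capacity.
# Frame format is an assumption: PAL SD (720x576) at 2-3 bytes per pixel.
W, H, FPS, DISK_MB = 720, 576, 25, 160

for bytes_per_pixel in (2, 3):   # 8-bit 4:2:2 vs full RGB
    frame_mb = W * H * bytes_per_pixel / 1024**2
    seconds = DISK_MB / (frame_mb * FPS)
    print(f"{bytes_per_pixel} B/px: {seconds:.1f} s")   # 8.1 s and 5.4 s
```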

The Paintbox and its multi-frame follow-up, Harry—which could store up to 30 seconds of footage and manipulate multiple frames of animation at once—would come to dominate the TV industry throughout the 1980s and early ’90s, defining the decade’s iconic visual style (see Dire Straits’ “Money For Nothing” video). It would even star in its own TV show on the BBC, Painting with Light, alongside artists like David Hockney. And for McGee, a graduate of the famous Central Saint Martins College of Art and Design, the Paintbox would spark a career in the visual effects industry spanning nearly three decades.


Posted in 3d animation, Autodesk, Features, Framestore, MPC, NVIDIA, Pixar, Quantel, The Multiverse, vfx | Comments (0)