Last week, SIGGRAPH 2022 kicked off and, to everyone’s surprise, Intel made an appearance at the annual conference with its top-tier A770 graphics card in tow. The event is usually a chance for brands in the industry to showcase their latest software and, in most cases, hardware; for Intel, this included a demonstration of the card running the Blender software.
Specifically, Intel was showing off the A770 and its capabilities via Blender Cycles, a physically based path-tracing render engine that was introduced to the world more than a decade ago. Old as it is, the Blender Foundation has, over the years, expanded its support from CPU rendering to GPU rendering, with the latter able to use NVIDIA’s CUDA or OptiX backends, as well as HIP acceleration for AMD’s GPUs, which replaced the older OpenCL backend as of Blender 3.0.
Example of what I showed at #SIGGRAPH2022. Dense scene with indirect lighting in #B3D, rendering Cycles raytracing in realtime viewport with live denoising, rendered on #IntelArc A770 GPU in a NUC 11 Extreme PC pic.twitter.com/db89fSuGgP
— visualbob 🖱️🎨 (@bobduffy) August 13, 2022
That said, Blender is releasing its 3.3 LTS update, which also adds support for Intel’s oneAPI, thus allowing the chipmaker’s Arc GPUs to show off their prowess as well.
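For those curious, once a oneAPI-capable Blender build is installed, the Cycles compute backend can also be switched from Blender’s own Python API. The snippet below is a rough sketch using `bpy`’s Cycles preferences; the exact device names and what gets detected depend on the installed Blender build and Arc drivers, and it must be run from inside Blender (e.g. its Python console), not a standalone interpreter.

```python
# Sketch: selecting Cycles' oneAPI backend via Blender's Python API (bpy).
# Assumes a Blender 3.3+ build with oneAPI support and Intel Arc drivers installed.
import bpy

# Cycles ships as an add-on; its preferences hold the compute backend choice.
cycles_prefs = bpy.context.preferences.addons["cycles"].preferences
cycles_prefs.compute_device_type = "ONEAPI"  # other options: "CUDA", "OPTIX", "HIP"

# Refresh the device list and enable only the detected oneAPI (Arc) devices.
cycles_prefs.get_devices()
for device in cycles_prefs.devices:
    device.use = device.type == "ONEAPI"

# Point the active scene's Cycles session at the GPU rather than the CPU.
bpy.context.scene.cycles.device = "GPU"
```

The same preferences panel is reachable interactively under Edit → Preferences → System, where oneAPI appears as a new backend tab alongside CUDA, OptiX, and HIP in supported builds.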
Moving along, the A770 was reportedly demoed on an older Tiger Lake system, specifically a NUC 11 Extreme. The pairing is a little odd, given how vocal Intel has been about matching its Arc GPUs with its 12th-generation Alder Lake processors and later. The demo also showed the GPU running Blender Cycles with ray tracing and live denoising in the viewport.
As Intel’s flagship GPU, the Arc A770 will feature the full-fat ACM-G10 die with 32 Xe-cores, a 256-bit memory bus, and 16GB of GDDR6 memory. The chipmaker has already released snippets of the card’s performance, most recently just last month, when it gave popular TechTuber Linus Sebastian a chance to conduct some limited performance tests and runs on the GPU.