
> Do you have experience with SYCL?

No, I bought an Nvidia card and just use CUDA.

> OpenCL had just a weird dance to perform to get a kernel running...

Yeah, but if you step back and think big picture, that entire list probably isn't the problem. Programmers have a predictable response to that sort of silliness: build a library over it and abstract it away. The sheer number of frameworks out there is awe-inspiring.
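
To make that concrete, here is roughly what the "weird dance" looks like against the raw OpenCL C API: a minimal vector-add sketch, error checking and resource cleanup omitted, names purely illustrative.

    // Sketch of the OpenCL ceremony: platform -> device -> context ->
    // queue -> program -> kernel -> buffers -> launch -> readback.
    #include <CL/cl.h>
    #include <stdio.h>

    static const char *src =
        "__kernel void vadd(__global const float *a,"
        "                   __global const float *b,"
        "                   __global float *c) {"
        "    size_t i = get_global_id(0);"
        "    c[i] = a[i] + b[i];"
        "}";

    int main(void) {
        enum { N = 1024 };
        static float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        cl_platform_id plat;  clGetPlatformIDs(1, &plat, NULL);
        cl_device_id   dev;   clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context       ctx  = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q    = clCreateCommandQueue(ctx, dev, 0, NULL);
        cl_program       prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel        k    = clCreateKernel(prog, "vadd", NULL);

        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);
        printf("c[3] = %f\n", c[3]);  // expect 9.0
        return 0;
    }

Compare that with CUDA, where the equivalent is a cudaMalloc, a cudaMemcpy and a kernel<<<blocks, threads>>>(...) launch; the frameworks people build over OpenCL exist largely to hide this ceremony.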

I gave up on OpenCL on AMD cards. It wasn't the long, complex process that got me; it was the unavoidable crashes along the way. I suspect that is a more significant issue than I realised at the time (when I assumed it was just me), because it goes a long way towards explaining AMD's pariah-like status in the machine learning world. The situation is more one-sided than a single well-optimised library can explain. I've personally seen more success implementing machine learning frameworks on AMD CPUs than on AMD's GPUs, and that is a remarkable thing. Although I assume the state of the game in 2024 has changed a lot from when I was actively investigating the situation.

I don't think CUDA itself is the problem here; math libraries are commodity software that give a relatively marginal edge. The lack of CUDA is probably a symptom of deeper hardware problems once people stray off an explicitly graphical workflow. If the hardware worked to spec, I expect someone would just build a non-optimised CUDA clone and we'd all move on. But AMD did build a CUDA clone (HIP) and it didn't work, for me at least - and the buzz suggests something is still going wrong for AMD's GPGPU efforts.



> Programmers have a predictable response to that sort of silliness: build a library over it and abstract it away

Impossible. GPGPU runtimes are too close to hardware, and the hardware is proprietary with many trade secrets. You need support from GPU vendors.

BTW, if you want reliable cross-vendor GPU compute, just use Direct3D 11 compute shaders. Modern videogames use a lot of compute, to the point that UE5 even renders triangle meshes with compute shaders. AMD hardware is totally fine; it's the software ecosystem that's the problem.
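
For the curious, a bare-bones D3D11 compute dispatch looks something like this: a sketch in C++ that doubles each element of a buffer, with error handling and the staging-buffer readback omitted.

    // Minimal D3D11 compute sketch: create device, compile HLSL from a
    // string, bind a structured buffer as a UAV, dispatch. Windows-only;
    // link d3d11.lib and d3dcompiler.lib.
    #include <d3d11.h>
    #include <d3dcompiler.h>
    #include <cstring>

    static const char *hlsl =
        "RWStructuredBuffer<float> data : register(u0);"
        "[numthreads(64, 1, 1)]"
        "void main(uint3 id : SV_DispatchThreadID) {"
        "    data[id.x] *= 2.0f;"
        "}";

    int main() {
        ID3D11Device *dev = nullptr;
        ID3D11DeviceContext *ctx = nullptr;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx);

        ID3DBlob *cs = nullptr;
        D3DCompile(hlsl, strlen(hlsl), nullptr, nullptr, nullptr,
                   "main", "cs_5_0", 0, 0, &cs, nullptr);
        ID3D11ComputeShader *shader = nullptr;
        dev->CreateComputeShader(cs->GetBufferPointer(), cs->GetBufferSize(),
                                 nullptr, &shader);

        // 256 floats in a GPU-writable structured buffer.
        float init[256];
        for (int i = 0; i < 256; i++) init[i] = (float)i;
        D3D11_BUFFER_DESC bd = {};
        bd.ByteWidth           = sizeof init;
        bd.Usage               = D3D11_USAGE_DEFAULT;
        bd.BindFlags           = D3D11_BIND_UNORDERED_ACCESS;
        bd.MiscFlags           = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
        bd.StructureByteStride = sizeof(float);
        D3D11_SUBRESOURCE_DATA sd = { init, 0, 0 };
        ID3D11Buffer *buf = nullptr;
        dev->CreateBuffer(&bd, &sd, &buf);
        ID3D11UnorderedAccessView *uav = nullptr;
        dev->CreateUnorderedAccessView(buf, nullptr, &uav);

        ctx->CSSetShader(shader, nullptr, 0);
        ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
        ctx->Dispatch(256 / 64, 1, 1);  // 4 groups x 64 threads
        // Readback would go via a D3D11_USAGE_STAGING copy + Map().
        return 0;
    }

It's still ceremony, but this path is exercised by every shipping game, which is exactly why the parent suggests it: the D3D drivers get vastly more real-world testing than the compute-only stacks.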



