[WIP] Add support for AMD GPUs via ROCArrays #938

Closed
jpsamaroo wants to merge 4 commits

Conversation

@jpsamaroo (Contributor) commented Nov 19, 2019

Adds support for AMD GPUs via ROCArrays.jl

At the moment, ROCArrays doesn't have the equivalent of CUDNN/CURNN wired up (MIOpen is the ROCm counterpart), so rocm/rocm.jl is empty. I also blindly copied everything else referencing CuArrays/CUDA and made ROCm equivalents, which could be entirely the wrong thing to do (a rough sketch of this mirroring appears after the TODO list below).

I'm posting this here mainly so I don't forget about it; it has not been tested yet. I expect to do a lot more work on this PR and on ROCArrays before it's ready for review. That said, feel free to try it out and let me know if it does what you would expect 😄

Addresses #173 partially and #310 fully

TODO:

  • Get Flux to load correctly
  • Wire up ROCm GitLab CI
  • Add ROCArrays tests
  • Make said tests pass
  • Update documentation
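
For a sense of what that mirrored glue amounts to, here is a minimal sketch (not the actual code in this PR). It assumes ROCArrays.jl exports a `ROCArray` type analogous to `CuArray`, and the `rocm` helper name is made up purely for illustration:

```julia
# Minimal sketch (not this PR's code): move a Flux model's parameters onto an
# AMD GPU, mirroring the CuArrays-based `gpu` conversion.
# Assumptions: ROCArrays.jl exports a `ROCArray` constructor analogous to
# `CuArray`; the `rocm` helper name is illustrative only.
using Flux
using ROCArrays

rocm(x) = Flux.fmap(a -> a isa AbstractArray{<:AbstractFloat} ? ROCArray(a) : a, x)

m = Chain(Dense(10, 5, relu), Dense(5, 2))
m_amd = rocm(m)                              # weights now live in ROCArrays
y = m_amd(ROCArray(rand(Float32, 10)))       # forward pass on the GPU
```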

@jpsamaroo mentioned this pull request on Nov 19, 2019
@MikeInnes (Member)

Nice! Very much looking forward to seeing this develop.

@jpsamaroo changed the title from "[WIP] Add support for ROCArrays" to "[WIP] Add support for AMD GPUs via ROCArrays" on Nov 21, 2019
@bjarthur commented Oct 3, 2020

Is AMD support still planned at some point? I'd like to train models on macOS. It would be a great selling point for Flux, as I don't believe TensorFlow or PyTorch officially support it.

@jpsamaroo (Contributor, Author)

Yes, I'm just working on integrating more of the GPUArrays interface into AMDGPU.jl, which will be required for good Flux support. I'll pick this PR back up once AMDGPU.jl is ready.

Also, macOS support will not come via this PR. Apple has divested itself of support for non-Metal APIs, so AMDGPU.jl will not work on Macs.

@jpsamaroo (Contributor, Author)

Closing until AMDGPU.jl is ready to integrate into Flux.

@jpsamaroo closed this on Oct 3, 2020
@bjarthur commented Oct 3, 2020

So for Flux to take advantage of the GPUs in Macs, we'd need a MetalArrays.jl? And that would work for the various cards/chips Apple uses: AMD Radeon and Intel Iris? Is anyone working on this?

@jpsamaroo (Contributor, Author) commented Oct 3, 2020

Yes, that's about right: you'd need to get MetalCore to full functionality and put an array implementation together. It would, in theory, work for all GPUs that Metal supports.

EDIT: MetalCore is not actively maintained.

@0x0f0f0f

@jpsamaroo I managed to get AMD's nasty stuff up and running in a chroot. Is there any way I could help with implementing the GPUArrays interface in AMDGPU.jl?

@jpsamaroo (Contributor, Author)

Sure, we need a working GPUArrays.mapreducedim! implementation, which together with a broadcast impl. borrowed from CUDA.jl, should get us most of the functionality we need to support Flux.
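
For context, here is a plain-CPU sketch of the semantics that a `GPUArrays.mapreducedim!(f, op, R, A)` method has to provide; the actual AMDGPU.jl version would express this as a GPU kernel, and the borrowed broadcast machinery is a separate piece. It assumes `R` and `A` have the same number of dimensions, with `size(R, d) == 1` along the reduced dimensions:

```julia
# CPU-only reference for the semantics GPUArrays.mapreducedim!(f, op, R, A)
# must implement: apply `f` elementwise, reduce with `op` along every
# dimension where size(R, d) == 1, accumulating into R's existing contents.
# Assumes ndims(R) == ndims(A); the real GPU method does this in a kernel.
function mapreducedim_reference!(f, op, R::AbstractArray, A::AbstractArray)
    for I in CartesianIndices(A)
        # collapse indices along reduced dimensions down to index 1
        J = CartesianIndex(ntuple(d -> min(I[d], size(R, d)), ndims(A)))
        R[J] = op(R[J], f(A[I]))
    end
    return R
end

A = rand(Float32, 4, 3)
R = zeros(Float32, 1, 3)                    # reduce over the first dimension
mapreducedim_reference!(abs2, +, R, A)
@assert R ≈ sum(abs2, A; dims = 1)
```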

@Abogical commented Feb 27, 2021

@jpsamaroo

Sure, we need a working GPUArrays.mapreducedim! implementation, which together with a broadcast impl. borrowed from CUDA.jl, should get us most of the functionality we need to support Flux.

I was just looking at this to see if I could help, and it seems you just solved it: JuliaGPU/AMDGPU.jl#104. I guess this PR is ready to go? Is there anything missing?

@jpsamaroo (Contributor, Author)

That is correct, @Abogical. However, I would recommend starting a new PR, since Flux has seen many, many commits since I last touched this one.

@jpsamaroo (Contributor, Author)

I'll be creating a FluxAMDGPU.jl glue package in the near future to implement this support.

@doriantsolak commented May 25, 2021

Thanks very much for your work on this. Is there a rough time frame for when AMD GPU support may become available? (I would also like to contribute, but I consider myself too inexperienced in Julia and in package development generally. However, if you think there is anything that could be done even by someone like me, let me know.)

Edit: My bad for commenting here. I have found #1566, which pretty much answers my question.
