I am a C++ programmer who develops image and video algorithms. Should I learn NVIDIA CUDA, or is it one of those technologies that will disappear?
CUDA is currently a single-vendor technology from NVIDIA, so it doesn't have the multi-vendor support that OpenCL does.
However, it's more mature than OpenCL, has great documentation, and the skills learnt using it transfer easily to other parallel data-processing toolkits.
As an example of this, read the "Data Parallel Algorithms" paper by Steele and Hillis and then look at the NVIDIA tutorials: there's a clear link between the two, yet the Steele/Hillis paper was written over 20 years before CUDA was introduced.
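To make the link concrete, here is a minimal sketch (in plain C++, not actual CUDA) of the Hillis/Steele inclusive scan, the prefix-sum pattern from that paper which also underpins the scan examples in NVIDIA's tutorials. The sequential loops stand in for what each GPU thread would do in parallel per pass:

```cpp
#include <cstddef>
#include <vector>

// Hillis/Steele inclusive scan (prefix sum), written sequentially.
// In pass with stride `offset`, each element i >= offset adds the
// value that element (i - offset) held BEFORE the pass started.
// On a GPU, one thread would handle each index i per pass, so the
// whole scan takes only O(log n) passes.
std::vector<int> inclusive_scan(std::vector<int> x) {
    for (std::size_t offset = 1; offset < x.size(); offset *= 2) {
        std::vector<int> prev = x;  // snapshot of the previous pass
        for (std::size_t i = offset; i < x.size(); ++i)
            x[i] = prev[i] + prev[i - offset];
    }
    return x;
}
```

For example, scanning `{1, 2, 3, 4}` yields the running sums `{1, 3, 6, 10}`. Once you recognise this doubling-stride pattern, the CUDA scan tutorials read as a direct translation of the 1986 paper.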
Finally, the FCUDA project is working to allow CUDA projects to target non-NVIDIA hardware (FPGAs).