I’m about to start a MapReduce project that will run on AWS, and I have a choice between Java and C++.
I understand that writing the project in Java would make more functionality available to me; however, C++ could pull it off too, through Hadoop Streaming.
Mind you, I have little background in either language. A similar project has been done in C++, and its code is available to me.
So my question: is this extra functionality available through AWS, or is it only relevant if you have more control over the cloud? Is there anything else I should bear in mind when making the decision, such as the availability of Hadoop plugins that work better with one language than the other?
Thanks in advance
You have a few options for running Hadoop on AWS. The simplest is to run your MapReduce jobs via their Elastic MapReduce service: http://aws.amazon.com/elasticmapreduce. You could also run a Hadoop cluster on EC2, as described at http://archive.cloudera.com/docs/ec2.html.
If you suspect you’ll need to write your own input/output formats, partitioners, or combiners, I’d recommend using Java with the latter setup (your own Hadoop cluster on EC2). If your job is relatively simple and you don’t plan to use your Hadoop cluster for any other purpose, I’d recommend choosing the language with which you are most comfortable and using Elastic MapReduce.
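For context on what a "relatively simple" streaming job looks like: Hadoop Streaming only requires programs that read lines from stdin and write tab-separated key/value lines to stdout, in any language (your existing C++ code would follow the same shape). Here is a minimal word-count sketch, shown in Python purely for brevity; the function names are illustrative, not part of any Hadoop API:

```python
import sys


def map_line(line):
    """Mapper logic: yield one "word<TAB>1" line per token.

    Hadoop Streaming treats everything before the first tab as the key
    and the rest as the value.
    """
    for word in line.split():
        yield f"{word}\t1"


def reduce_counts(sorted_lines):
    """Reducer logic: sum counts for consecutive identical keys.

    Between the map and reduce phases, Hadoop sorts records by key, so
    all lines for a given word arrive contiguously.
    """
    current, total = None, 0
    for line in sorted_lines:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                yield f"{current}\t{total}"
            current, total = key, 0
        total += int(value)
    if current is not None:
        yield f"{current}\t{total}"


if __name__ == "__main__":
    # Run as either the mapper or the reducer of a streaming job,
    # e.g. `python wordcount.py map` / `python wordcount.py reduce`.
    if len(sys.argv) > 1 and sys.argv[1] == "reduce":
        for out in reduce_counts(sys.stdin):
            print(out)
    else:
        for line in sys.stdin:
            for out in map_line(line):
                print(out)
```

Anything beyond this pattern — custom input formats, partitioners, combiners — is where the native Java API starts to pay off.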
Either way, good luck!
Disclosure: I am a founder of Cloudera.
Regards,
Jeff