I am writing a C++ application and would like to request several data files through an HTTP GET request simultaneously. Where should I look to get started? (It needs to be cross-platform.)
- Run Application
- Create a list of URLs { "http://host/file1.txt", "http://host/file2.txt", "http://host/file3.txt" }
- Request all the URLs simultaneously and load the contents to variables (don’t want disk writes). Each file has about 10kB of data.
What libraries would you recommend I use? libcurl? curlpp? Boost.Asio? Would I need to roll my own multithreading to request all the files simultaneously? Is there an easier way?
Edit: I will need to make about 1000 GET requests simultaneously. Most likely I will do this in batches (100 at a time, creating new connections as existing ones complete).
I would recommend libcurl. I’m not super-familiar with it, but it does have a multi-interface for performing multiple simultaneous HTTP operations.
Depending on what solution you go with, it's possible to do asynchronous I/O without using multithreading. The key is to use the select(2) system call. select() takes a set of file descriptors and tells you if any of them have data available. If they do, you can then proceed to use read(2) or recv(2) on them without worrying about blocking.
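As a small POSIX-only sketch of that idea (Winsock has its own select() variant), the helper below asks select() whether a descriptor has data ready within a timeout, so a subsequent read() is guaranteed not to block. The readable() name and the pipe used in the demo are just for illustration:

```cpp
#include <sys/select.h>
#include <unistd.h>

// Return true if fd has data ready to read within timeout_ms milliseconds.
// A true result means a following read(2)/recv(2) will not block.
bool readable(int fd, int timeout_ms) {
    fd_set readfds;
    FD_ZERO(&readfds);
    FD_SET(fd, &readfds);
    timeval tv{timeout_ms / 1000, (timeout_ms % 1000) * 1000};
    return select(fd + 1, &readfds, nullptr, nullptr, &tv) > 0 &&
           FD_ISSET(fd, &readfds);
}
```

In a real client you would put every open socket into the fd_set, call select() once, and service only the descriptors it marks ready; that is essentially what libcurl's multi interface does for you internally.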