
The Archive Base


Asked by Editorial Team on May 13, 2026

Greetings All!

I'm having some trouble working out how to execute thousands upon thousands of requests to a web service (eBay). I have a limit of 5 million calls per day, so there are no problems on that end.

However, I'm trying to figure out how to process 1,000 to 10,000 requests every one to five minutes.

Basically the flow is:
1) Get list of items from database (1,000 to 10,000 items)
2) Make an API POST request for each item
3) Accept return data, process data, update database

Obviously a single PHP instance running this in a loop would be impossible.

I am aware that PHP is not a multithreaded language.

I tried the multi-cURL solution, basically:
1) Get list of items from database
2) Initialize a multi-cURL session
3) For each item, add a curl session for the request
4) Execute the multi-cURL session

So you can imagine 1,000-10,000 requests occurring at once…
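The multi-cURL flow described above can be sketched roughly as follows. This is a minimal illustration only; the endpoint URL, the payload shape, and the batch contents are placeholders, not eBay's actual API:

```php
<?php
// Minimal multi-cURL sketch of the flow above: fire one POST per item
// concurrently, then collect every response. $url and the payloads are
// placeholders, not a real eBay endpoint.
function postBatch(array $items, string $url): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($items as $id => $payload) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => http_build_query($payload),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 30,
        ]);
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Drive all transfers; curl_multi_select() blocks until there is
    // socket activity instead of busy-looping.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $id => $ch) {
        $results[$id] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results; // item id => raw response body
}
```

In practice you would call `postBatch()` on smaller slices of the item list rather than all 10,000 at once, both to cap concurrent sockets and to bound memory use.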

This was OK: around 100-200 requests were completing in about a minute or two. However, only 100-200 of the 1,000 items actually processed, so I'm thinking I'm hitting some sort of Apache or MySQL limit?

It also adds latency; it's almost like performing a DoS attack on myself.

I'm wondering how you would handle this problem. What if you had to make 10,000 web service requests and 10,000 MySQL updates from the returned data, and it all had to be done within five minutes?

I am using PHP and MySQL with the Zend Framework.

Thanks!



1 Answer

Answered by Editorial Team on May 13, 2026 at 12:28 am

    I've had to do something similar, but with Facebook, updating 300,000+ profiles every hour. As grossvogel suggested, you need to use many processes to speed things up, because the script spends most of its time waiting for a response.
    You can do this with forking, if your PHP install supports it (pcntl), or you can just execute another PHP script via the command line:

    exec('nohup php /path/to/script.php >> /tmp/logfile 2>&1 & echo $!', $output);
    $processId = (int) end($output);

    You can pass parameters (via getopt) to the PHP script on the command line to tell it which “batch” to process. The master script can then do a sleep/check cycle, watching the process IDs to see whether the workers are still running. I've tested up to 100 scripts running at once in this manner, at which point the CPU load can get quite high.
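A master script along those lines might look like the sketch below. The paths, batch size, log location, and `worker.php` itself are assumptions for illustration, and `posix_kill` requires the POSIX extension:

```php
<?php
// Master-script sketch: chunk the items, launch one background worker per
// chunk, then poll until every worker has exited. Paths and batch size
// are placeholders.
$items   = range(1, 10000);          // stand-in for the database result
$batches = array_chunk($items, 500); // ~20 workers for 10,000 items

$pids = [];
foreach ($batches as $i => $batch) {
    $cmd = sprintf(
        'nohup php /path/to/worker.php --batch=%d >> /tmp/worker.log 2>&1 & echo $!',
        $i
    );
    $pids[] = (int) exec($cmd); // exec() returns the last output line: the PID
}

// Sleep/check cycle: drop PIDs as their processes disappear.
// posix_kill($pid, 0) sends no signal; it only tests whether the
// process still exists (POSIX extension required).
while ($pids) {
    sleep(5);
    $pids = array_values(array_filter($pids, fn ($pid) => posix_kill($pid, 0)));
}
```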

    Combine multiple processes with multi-curl, and you should easily be able to do what you need.
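On the worker side, a hypothetical `worker.php` could pick up its batch number via `getopt` like this. The DSN, credentials, and table/column names are placeholders:

```php
<?php
// Hypothetical worker.php: reads its batch number from the command line,
// loads that slice of items, and processes it. DSN, credentials, and the
// items table are assumed for illustration.
$opts  = getopt('', ['batch:']);
$batch = (int) ($opts['batch'] ?? 0);
$size  = 500; // must match the master script's chunk size

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$stmt = $pdo->prepare('SELECT id, payload FROM items ORDER BY id LIMIT ? OFFSET ?');
$stmt->bindValue(1, $size, PDO::PARAM_INT);
$stmt->bindValue(2, $batch * $size, PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    // Make the web-service call here (e.g. via the multi-cURL approach),
    // then write the processed result back, e.g.:
    // UPDATE items SET status = ... WHERE id = :id
}
```

Each worker handling a few hundred items with multi-cURL inside its own process keeps any one process's socket count modest while the fleet as a whole covers the full 10,000 in parallel.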



© 2021 The Archive Base. All Rights Reserved