I have a script that consumes 17 MB of memory when run (logged using memory_get_peak_usage()).
The script is run 1 million times per day.
Total daily memory consumption: 17 million MB
86400 seconds in a day.
17000000 / 86400 = 196.76
Assumption: Running this script 1 million times each day will require at least 196.76 MB of dedicated memory.
Is my assumption correct?
If the script were running 1,000,000 copies at the same time, you would need 17 million MB, but since each copy releases its memory when it completes, the usage does not accumulate into a daily total.
You need to know how many copies run at the same time and multiply that number by 17 MB. That is your maximum memory usage.
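To make "how many copies run at the same time" concrete, here is a minimal sketch in PHP, assuming a hypothetical average runtime of 0.5 seconds per run (the question does not state one). It uses the standard Little's Law estimate: average concurrency = arrival rate × average duration. Note that your 196.76 figure is exactly this estimate for the special case where every run lasts one second.

    <?php
    // Rough estimate of memory needed from concurrency (a sketch; the
    // average-runtime figure is an assumption, not a measured value).
    $runsPerDay    = 1000000; // invocations per day (from the question)
    $peakPerRunMb  = 17;      // memory_get_peak_usage() per run, in MB
    $avgRuntimeSec = 0.5;     // assumed average script duration

    // Little's Law: average concurrency = arrival rate x average duration
    $arrivalRate    = $runsPerDay / 86400;           // ~11.57 runs/second
    $avgConcurrency = $arrivalRate * $avgRuntimeSec; // ~5.8 copies at once

    $avgMemoryMb = $avgConcurrency * $peakPerRunMb;  // ~98 MB on average
    printf("Average concurrency: %.1f, average memory: %.1f MB\n",
           $avgConcurrency, $avgMemoryMb);

Keep in mind this is an average: real traffic is rarely uniform, so actual peak concurrency (and therefore peak memory) can be several times higher. Measure concurrency under real load and provision headroom accordingly.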