I’m trying to create an XML sitemap using CakePHP, from a table which currently has more than 50,000 records, each record corresponding to a URI in the sitemap. The problem I’m facing is that CakePHP runs out of memory while generating it, for two reasons:
- A `find('all')` is building a huge associative array of the entire set of 50,000 URIs.
- Since I don’t want to output HTML from the controller itself, I’m transferring the associative array containing the URI, priority, change frequency, etc. to the view with a `$this->set()` call, which is again huge, containing 50,000 indices.
Is it possible at all to do this while following MVC and CakePHP conventions?
Are you sure you have to run out of memory on 50,000 records? Even if a row is 1 KB in size (pretty huge), that is only about 50 MB of data; my P1 had enough RAM to handle that. Set `memory_limit` in php.ini higher than the default, and consider raising `max_execution_time` as well.
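If you don’t want to touch php.ini globally, you can raise the limits for just this request. The values below are assumptions; tune them to your actual data size:

```php
<?php
// Raise the limits for the current request only, instead of editing php.ini.
// Defaults vary by PHP version, so these values are illustrative assumptions.
ini_set('memory_limit', '128M');      // enough headroom for ~50 MB of row data
ini_set('max_execution_time', '300'); // seconds; the CLI SAPI has no limit by default
```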
On the other hand, if you consider the data set too huge and processing it too resource-intensive, you should not serve that page dynamically; it is perfect DDoS bait. At the very least, cache it heavily.
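Heavy caching can be as simple as writing the generated XML to disk and serving that file until it goes stale. A minimal file-based sketch, with a hypothetical path and TTL:

```php
<?php
// Serve a cached copy of the sitemap if it is fresh enough.
// $cacheFile and $ttl are assumptions; TMP is CakePHP's tmp-directory constant.
$cacheFile = TMP . 'sitemap.xml';
$ttl = 6 * 3600; // regenerate at most every six hours

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    header('Content-Type: application/xml');
    readfile($cacheFile); // no database work at all on a cache hit
    exit;
}
// ...otherwise fall through to the expensive generation code,
// writing the result to $cacheFile before sending it.
```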
You could schedule a cron job to regenerate the page every X hours with a server-side script that is free from the MVC penalty of handing all the data to the view at once; it could process the rows sequentially.
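Such a script can stream rows in batches and write the XML incrementally with `XMLWriter`, so memory use stays flat regardless of table size. A sketch, assuming a `pages` table with `slug` and `modified` columns and illustrative connection details and paths:

```php
<?php
// Stand-alone cron script: stream rows in batches, write the sitemap as we go.
// Table, column, and connection details below are assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$writer = new XMLWriter();
$writer->openUri('/var/www/app/webroot/sitemap.xml');
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('urlset');
$writer->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');

$batch = 1000;
for ($offset = 0; ; $offset += $batch) {
    $stmt = $pdo->query("SELECT slug, modified FROM pages LIMIT $batch OFFSET $offset");
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more rows: we are done
    }
    foreach ($rows as $row) {
        $writer->startElement('url');
        $writer->writeElement('loc', 'http://example.com/pages/' . $row['slug']);
        $writer->writeElement('lastmod', date('Y-m-d', strtotime($row['modified'])));
        $writer->writeElement('changefreq', 'daily');
        $writer->writeElement('priority', '0.5');
        $writer->endElement(); // </url>
    }
    $writer->flush(); // push the buffer to disk after every batch
}
$writer->endElement(); // </urlset>
$writer->endDocument();
$writer->flush();
```

A crontab entry like `0 */6 * * * php /var/www/app/scripts/sitemap.php` (paths assumed) would regenerate it every six hours. Note also that the sitemap protocol caps a single file at 50,000 URLs, so at your current size you will soon need to split the output into multiple sitemap files referenced from a sitemap index.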