For a single web request, we fetch more than 1,000 rows of data from an external system, and we need to reformat the data into a better presentation format before we display it to the end user on the web page.
When we loop through the data in our ASPX pages, it creates a hard loop (tight loop?) that consumes more than 95% of the CPU.
How do we avoid the CPU spikes while still completing the processing as fast as possible? TIA
Are you duplicating any effort here? Is it the same 1,000 rows on subsequent requests for the same data? Then maybe you should cache the results. Are some of the calculations the same, over and over? Then maybe you should store the results of the calculations along with the raw data.
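As a minimal sketch of that caching idea, assuming classic ASP.NET and `HttpRuntime.Cache`: the expensive fetch-and-format work runs once, and subsequent requests for the same data reuse the stored result. The names `FetchRows`, `FormatForDisplay`, the cache-key scheme, and the five-minute lifetime are all hypothetical placeholders to adapt to your app.

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class ReportCache
{
    public static List<string> GetFormattedRows(string queryKey)
    {
        string cacheKey = "formatted:" + queryKey;       // hypothetical key scheme
        var cached = HttpRuntime.Cache[cacheKey] as List<string>;
        if (cached != null)
            return cached;                               // reuse prior work, no loop

        var rows = FetchRows(queryKey);                  // expensive external call
        var formatted = new List<string>(rows.Count);
        foreach (var row in rows)
            formatted.Add(FormatForDisplay(row));        // pay the CPU cost once

        // Keep the result for five minutes; tune this to how fresh the data must be.
        HttpRuntime.Cache.Insert(cacheKey, formatted, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
        return formatted;
    }

    // Placeholders for the external fetch and the per-row formatting.
    static List<string> FetchRows(string queryKey) { return new List<string>(); }
    static string FormatForDisplay(string row) { return row; }
}
```

If the underlying data changes unpredictably, a `CacheDependency` (the `null` argument above) can invalidate the entry instead of relying on a fixed expiration.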
Bottom line: don’t do a tight loop in the presentation layer. It doesn’t belong there.
It’s also hurting your scalability, BTW. You’re tying up worker threads that could be handling other requests, requests that aren’t sitting in tight CPU loops.
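One way to keep the loop out of the presentation layer, sketched here under the assumption of a Web Forms page with a data-bound grid: the page binds already-formatted view models produced by a lower layer, so the ASPX markup does no per-row formatting at all. `RowViewModel`, `ReportGrid`, and `ReportService.GetFormattedRows` are hypothetical names.

```csharp
using System;
using System.Web.UI;

// Each row arrives at the page pre-formatted by the service layer.
public sealed class RowViewModel
{
    public string Name { get; set; }
    public string Amount { get; set; }   // already formatted, e.g. "$1,234.56"
}

public partial class ReportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;

        // The service layer (not the page) fetches, formats, and caches the rows.
        ReportGrid.DataSource = ReportService.GetFormattedRows("sales-q1");
        ReportGrid.DataBind();           // no hand-written loop in the ASPX
    }
}
```

The page stays a thin view; the CPU-heavy work lives where it can be cached, profiled, and tuned independently of request handling.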