
The Archive Base


Editorial Team
Asked: May 11, 2026


I am working on an API to query a database server (Oracle in my case) to retrieve massive amounts of data. (This is actually a layer on top of JDBC.)

The API I created tries to limit, as much as possible, how much queried data is loaded into memory. I mean that I prefer to iterate over the result set and process the returned rows one by one instead of loading all rows into memory and processing them later.
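This row-at-a-time pattern can be sketched as follows. The cursor here is a plain Iterator standing in for a JDBC ResultSet (in the real layer, hasNext/next would delegate to ResultSet.next() plus a row-mapping function); the class and method names are illustrative, not part of any actual API.

```java
import java.util.Iterator;
import java.util.function.Consumer;

/**
 * Minimal sketch of a row-by-row processing layer (hypothetical names).
 * In a real JDBC-backed implementation the cursor would wrap a ResultSet;
 * using an Iterator keeps the idea runnable without a database.
 */
final class StreamingLayer {

    /** Feed each row to the handler as it arrives; never buffers rows. */
    static <R> long forEachRow(Iterator<R> cursor, Consumer<R> handler) {
        long count = 0;
        while (cursor.hasNext()) {
            handler.accept(cursor.next()); // only the current row is referenced
            count++;
        }
        return count;
    }
}
```

Because the handler only ever sees the current row, memory use stays constant no matter how many rows the query returns.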

But I am wondering whether this is best practice, since it has some drawbacks:

  • The result set is kept open during the whole processing; if processing takes as long as retrieving the data, the result set stays open twice as long.
  • Running another query inside my processing loop means opening a second result set while the first is still in use; opening too many result sets simultaneously may not be a good idea.

On the other hand, it has some advantages:

  • I never hold more than one row of a result set in memory; since my queries tend to return around 100k rows, this may be worth it.
  • Since my framework is heavily based on functional programming concepts, I never rely on multiple rows being in memory at the same time.
  • Starting to process the first rows while the database engine is still returning the rest is a great performance boost.
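Given the functional-programming angle mentioned above, such a cursor can be exposed as a lazy java.util.stream.Stream, so a pipeline of filters and maps pulls rows one by one as the database delivers them. A minimal sketch (the RowStream name is hypothetical):

```java
import java.util.Iterator;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

// Hypothetical adapter: expose a row cursor as a lazy, sequential Stream.
// Nothing is buffered; each downstream operation pulls one row at a time.
final class RowStream {
    static <R> Stream<R> of(Iterator<R> cursor) {
        Spliterator<R> split =
            Spliterators.spliteratorUnknownSize(cursor, Spliterator.ORDERED);
        return StreamSupport.stream(split, false); // false = not parallel
    }
}
```

A pipeline such as RowStream.of(cursor).filter(...).forEach(...) then processes rows as they arrive, without ever materialising the full result set.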

In response to Gandalf, here is some more information:

  • I will always have to process the entire result set
  • I am not doing any aggregation of rows

I am integrating with a master data management application and retrieving data in order to either validate it or export it in many different formats (to the ERP, to the web platform, etc.).



1 Answer

  1. Editorial Team
     Added an answer on May 11, 2026 at 9:26 pm

    There is no universal answer. I personally implemented both solutions dozens of times.

    This depends on what matters more to you: memory or network traffic.

    If you have a fast network connection (LAN) and a poor client machine, then fetch data row by row from the server.

    If you work over the Internet, then batch fetching will help you.

    You can set the prefetch count in your database layer properties and find a golden mean.

    A rule of thumb: fetch as much as you can hold without noticing it.
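    In JDBC, for instance, the prefetch count is a per-statement hint set via Statement.setFetchSize (Oracle's JDBC driver defaults to 10 rows per round trip). A sketch, with purely illustrative sizing rules derived from the advice above:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.function.Consumer;

// Sketch: row-by-row iteration with a tuned prefetch count.
// The numbers below are illustrative, not recommendations.
final class PrefetchTuning {

    /** Small batches when memory is tight; large ones to amortise slow links. */
    static int chooseFetchSize(boolean fastNetwork, boolean smallHeap) {
        if (smallHeap) return 10;          // keep almost nothing buffered
        return fastNetwork ? 100 : 1000;   // batch harder over a slow network
    }

    static void run(Connection conn, String sql, Consumer<ResultSet> handler)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            // Hint to the driver: rows to fetch per server round trip.
            ps.setFetchSize(chooseFetchSize(true, false));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    handler.accept(rs); // still one logical row at a time
                }
            }
        }
    }
}
```

    The application code stays row-by-row; only the driver's transfer granularity changes.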

    If you need a more detailed analysis, there are six factors involved:

    • Row generation response time / rate (how soon Oracle generates the first row / last row)
    • Row delivery response time / rate (how soon you can get the first row / last row)
    • Row processing response time / rate (how soon you can show the first row / last row)

    One of them will be the bottleneck.

    As a rule, rate and response time are antagonists.

    With prefetching, you can control the row delivery response time and row delivery rate: a higher prefetch count improves the rate but worsens the response time, and a lower prefetch count does the opposite.

    Choose which one is more important to you.

    You can also do the following: create separate threads for fetching and processing.

    Select just enough rows to keep the user amused in low-prefetch mode (fast response time), then switch into high-prefetch mode.

    It will fetch the rows in the background and you can process them in the background too, while the user browses over the first rows.
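    The fetch/process split can be sketched with a bounded queue, where the queue capacity plays the role of the prefetch window. The fetch is simulated from a list so the sketch runs without a database; all names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the fetch/process split: one thread pulls rows (here simulated
// by a list) into a bounded queue, the caller's thread consumes them.
final class PipelinedFetch {
    private static final String END = "\u0000END"; // poison pill: end of fetch

    static List<String> run(List<String> rows, int window) {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(window);
        List<String> processed = new ArrayList<>();
        Thread fetcher = new Thread(() -> {
            try {
                for (String r : rows) {
                    queue.put(r); // blocks once it is `window` rows ahead
                }
                queue.put(END);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        fetcher.start();
        try {
            for (String r; !(r = queue.take()).equals(END); ) {
                processed.add(r); // processing overlaps the ongoing fetch
            }
            fetcher.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RuntimeException(e);
        }
        return processed;
    }
}
```

    Once the fetcher gets `window` rows ahead it blocks, so memory stays bounded while fetching and processing overlap.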


© 2021 The Archive Base. All Rights Reserved