
The Archive Base


Editorial Team
Asked: May 13, 2026

I have an algorithm which currently allocates a very large array of doubles, which it updates and searches frequently. The size of the array is N^2/2, where N is the number of rows on which the algorithm is operating. I also have to keep a copy of the entire thing for purposes associated with the application surrounding the algorithm.
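The N^2/2 size suggests a packed triangular layout. Assuming the usual row-major lower-triangle packing (an assumption; the poster doesn't describe the layout), the flat index and total size work out as:

```java
public class TriIndex {
    // Flat index of element (i, j), i >= j, in a packed lower-triangular
    // array: rows 0..i-1 contribute 1 + 2 + ... + i = i*(i+1)/2 entries.
    static long index(long i, long j) {
        return i * (i + 1) / 2 + j;
    }

    // Total entries for N rows: N*(N+1)/2, i.e. roughly N^2/2 as in the question.
    static long size(long n) {
        return n * (n + 1) / 2;
    }
}
```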

Of course this imposes a limit on the number of rows that my algorithm can handle as I have the heap limitation to contend with. Up to this point I have got away with asking the people using the algorithm to update the -Xmx setting to allocate more space, and that has worked fine. However, I now have a genuine problem where I need this array to be larger than I can fit into memory.
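For reference, the stopgap the poster describes is the standard JVM maximum-heap flag on the launch command; `app.jar` below is a placeholder for whatever launches the algorithm:

```shell
# Raise the maximum heap to 8 GiB for this run (app.jar is a placeholder)
java -Xmx8g -jar app.jar
```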

I already have plans to change my algorithm to mitigate the need for this large array, and I have some promising results in that domain. However, it is a fundamental alteration to the process and will require a lot more work before it reaches the highly polished condition of my current code, which has been operating successfully in production for several years.

So, while I am perfecting my new algorithm I wanted to extend the life of the existing one and that means tackling the heap limitation associated with allocating my huge array of doubles.

My question is: what is the best way of dealing with it? Should I use an NIO FileChannel and a MappedByteBuffer, or is there a better approach? If I do use the NIO approach, what sort of performance hit should I expect compared to an in-memory array of the same size?

Thanks
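Since the question names them, here is a minimal sketch of the FileChannel/MappedByteBuffer route (file names and sizes are illustrative). The mapped pages live outside the Java heap, so -Xmx is no longer the limit; note that a single mapping is capped at 2 GiB (about 268 million doubles), so a truly huge array needs several mappings stitched together:

```java
import java.io.IOException;
import java.nio.DoubleBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedDoubles {
    // Maps `count` doubles in the file at `path` and returns a DoubleBuffer
    // view. The mapping stays valid after the channel is closed; the OS pages
    // the data in and out on demand.
    static DoubleBuffer map(Path path, int count) throws IOException {
        try (FileChannel ch = FileChannel.open(path,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            return ch.map(FileChannel.MapMode.READ_WRITE,
                          0, (long) count * Double.BYTES).asDoubleBuffer();
        }
    }

    public static void main(String[] args) throws IOException {
        Path path = Files.createTempFile("doubles", ".bin");
        DoubleBuffer buf = map(path, 1_000_000);
        buf.put(0, 3.14);         // writes land in the page cache and are
        buf.put(999_999, 2.72);   // flushed to disk by the OS
        System.out.println(buf.get(0) + " " + buf.get(999_999));
        Files.deleteIfExists(path);
    }
}
```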



1 Answer

  1. Editorial Team
     Added an answer on May 13, 2026 at 6:38 am

    If you’re running on PCs, page sizes for mapped files are likely to be 4 kilobytes.

    So the question really starts with: if I start swapping the data out to disk, how random is my access to the RAM-that-is-now-a-file?

    And, if I can, how should I order the doubles to maximise cases where doubles within a 4K page are accessed together, rather than a few at a time per page before the next 4K disk fetch?
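To make that concrete, one way to bias access toward whole pages is a tiled layout. This is a sketch under assumptions the answer doesn't spell out: a square row-major matrix, 4 KB pages (512 doubles), and a 22×22 tile (484 doubles) chosen to fit in roughly one page:

```java
public class TiledIndex {
    static final int TILE = 22; // 22*22 = 484 doubles, just under one 4 KB page

    // Linear offset of element (row, col) in a tiled layout: each tile is
    // stored contiguously, so neighbours within a tile share a page.
    static long offset(int row, int col, int nTilesPerRow) {
        int tr = row / TILE, tc = col / TILE;   // which tile
        int r = row % TILE,  c = col % TILE;    // position inside the tile
        long tileIndex = (long) tr * nTilesPerRow + tc;
        return tileIndex * TILE * TILE + (long) r * TILE + c;
    }
}
```

With this layout, a scan over a small square neighbourhood touches one or a few pages instead of one page per row.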

    If you use standard IO, you probably still want to read and write in chunks, but the chunks could be smaller. Sectors will be at least 512 bytes and disk clusters bigger, but what size of read is best, given that there is a kernel round-trip overhead for each IO?
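A sketch of that chunked standard-IO approach: one positioned read fetches a whole block of doubles per kernel call instead of one at a time (the chunk size here is an illustrative guess, not a tuned value):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ChunkedRead {
    static final int CHUNK = 8192; // 64 KiB per read: few syscalls, modest buffer

    // Reads up to CHUNK doubles stored at `path`, starting at element `first`.
    static double[] readChunk(Path path, long first) throws IOException {
        ByteBuffer bb = ByteBuffer.allocate(CHUNK * Double.BYTES);
        try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
            ch.read(bb, first * Double.BYTES); // single positioned read
        }
        bb.flip();
        double[] out = new double[bb.remaining() / Double.BYTES];
        bb.asDoubleBuffer().get(out);
        return out;
    }
}
```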

    I'm afraid your best next steps depend to a great extent on the algorithm and the data you are using.


© 2021 The Archive Base. All Rights Reserved