
The Archive Base Latest Questions

Editorial Team
Asked: May 13, 2026

I’m working in Java with a large (millions) HashMap that is actually built with


I’m working in Java with a large HashMap (millions of entries) that is built with a capacity of 10,000,000 and a load factor of 0.75, and is used to cache some values.

Since cached values become useless over time (they are not accessed anymore), and I can’t remove the useless ones along the way, I would like to empty the cache entirely when its performance starts to degrade. How can I decide when it’s a good time to do that?

For example, with a capacity of 10 million and a load factor of 0.75, should I empty it when it reaches 7.5 million elements? I have tried various threshold values, but I would like an analytic one.

I’ve already verified that emptying it when it’s quite full gives a performance boost: the first 2-3 algorithm iterations after the wipe just fill it back up, and then it runs faster than it did before the wipe.
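To make the question concrete, here is roughly the wipe strategy I have in mind (WipingCache, getOrCompute and expensiveDotProduct are names invented for this sketch, not my real code). If I understand HashMap correctly, it rounds the requested capacity up to the next power of two, so a 10,000,000-capacity map actually gets a 16,777,216-slot table and an internal resize threshold of 12,582,912 entries, not 7,500,000:

```java
import java.util.HashMap;
import java.util.Map;

public class WipingCache {
    final int wipeThreshold;
    private final Map<Long, Float> cache;

    WipingCache(int capacity, float loadFactor) {
        // HashMap rounds capacity up to the next power of two, so the real
        // resize point is tableSize * loadFactor, not capacity * loadFactor.
        int tableSize = (capacity <= 1) ? 1 : Integer.highestOneBit(capacity - 1) << 1;
        this.wipeThreshold = (int) (tableSize * loadFactor);
        this.cache = new HashMap<>(capacity, loadFactor);
    }

    float getOrCompute(long key) {
        Float cached = cache.get(key);
        if (cached != null) {
            return cached;
        }
        // Wipe everything just before the map would otherwise rehash.
        if (cache.size() >= wipeThreshold) {
            cache.clear();
        }
        float value = expensiveDotProduct(key);
        cache.put(key, value);
        return value;
    }

    // Stand-in for the real tag-vector dot product.
    private float expensiveDotProduct(long key) {
        return key * 0.5f;
    }
}
```

Wiping at the resize point guarantees the table never rehashes, but I don’t know whether that is also the point where lookup performance actually starts to degrade.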

EDIT: ADDITIONAL INFO

The HashMap has long keys and float values. It contains cached correlations between contents; since each correlation is a dot product of tag vectors, I wanted to cache them (to improve performance).

So basically, I compute a long key using the hash codes of the two contents:

static private long computeKey(Object o1, Object o2)
{
    int h1 = o1.hashCode();
    int h2 = o2.hashCode();

    // Order the hashes so the key is symmetric: computeKey(a, b) == computeKey(b, a)
    if (h1 < h2)
    {
        int swap = h1;
        h1 = h2;
        h2 = swap;
    }

    // Mask h2 to 32 bits: without the mask, a negative h2 sign-extends to
    // 64 bits and its high bits clobber h1 in the OR.
    return ((long)h1) << 32 | (h2 & 0xFFFFFFFFL);
}

and use it to retrieve stored values. What happens is that, since this is hierarchical clustering, contents are merged and their correlation values with other contents are not needed any more. That’s why I want to wipe the HashMap from time to time: to avoid degradation due to the useless values inside it.

Using a WeakHashMap would unpredictably wipe out data even when it is still needed; I would have no control over it.
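To illustrate what I mean: in a sketch like this (WeakCacheDemo is a made-up name), nothing else holds a strong reference to the boxed Long keys, so any garbage collection may clear entries regardless of whether I still need the values:

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakCacheDemo {
    public static void main(String[] args) {
        Map<Long, Float> cache = new WeakHashMap<>();
        cache.put(1_234_567L, 0.9f); // boxed key; no other strong reference kept
        System.gc();                 // eviction now depends on GC timing, not on need
        // May print true or false -- that is exactly the unpredictability:
        System.out.println(cache.containsKey(1_234_567L));
    }
}
```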

Thanks



1 Answer

  1. Editorial Team
     Added an answer on May 13, 2026 at 10:28 pm

    Why not use an LRU Cache?
    From Java’s LinkedHashMap documentation:

    A special constructor is provided to
    create a linked hash map whose order
    of iteration is the order in which its
    entries were last accessed, from
    least-recently accessed to
    most-recently (access-order). This
    kind of map is well-suited to building
    LRU caches. Invoking the put or get
    method results in an access to the
    corresponding entry (assuming it
    exists after the invocation
    completes). The putAll method
    generates one entry access for each
    mapping in the specified map, in the
    order that key-value mappings are
    provided by the specified map’s entry
    set iterator. No other methods
    generate entry accesses. In
    particular, operations on
    collection-views do not affect the
    order of iteration of the backing map.

    So basically: every once in a while, when your map gets too big, just delete the first x values that the iterator gives you.
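    That manual trim could be sketched like this (trimTo is a hypothetical helper; it assumes the map was constructed with accessOrder = true, so its iterator yields the least-recently-accessed entries first):

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruTrim {
    // Removes entries in least-recently-accessed order until the map
    // holds at most maxSize entries.
    static <K, V> void trimTo(LinkedHashMap<K, V> map, int maxSize) {
        Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
        while (map.size() > maxSize && it.hasNext()) {
            it.next();
            it.remove(); // removing via the iterator avoids ConcurrentModificationException
        }
    }
}
```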

    See documentation for removeEldestEntry to have this done for you automatically.

    Here is code that demonstrates:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public static void main(String[] args) {
        // An access-ordered LinkedHashMap that evicts its least-recently-used
        // entry whenever it grows past maxCapacity.
        class CacheMap<K, V> extends LinkedHashMap<K, V> {
            private final int maxCapacity;

            public CacheMap(int initialCapacity, int maxCapacity) {
                super(initialCapacity, 0.75f, true); // true = access order
                this.maxCapacity = maxCapacity;
            }

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxCapacity;
            }
        }

        int[] popular = {1, 2, 3, 4, 5};
        CacheMap<Integer, Integer> myCache = new CacheMap<>(5, 10);
        for (int i = 0; i < 100; i++) {
            myCache.put(i, i);
            for (int p : popular) {
                myCache.get(p); // touching the popular keys keeps them recent
            }
        }

        System.out.println(myCache);
        // {95=95, 96=96, 97=97, 98=98, 99=99, 1=1, 2=2, 3=3, 4=4, 5=5}
    }
    

© 2021 The Archive Base. All Rights Reserved