
The Archive Base


Editorial Team
Asked: May 15, 2026


I am doing some optimizations on an MPEG decoder. To ensure my optimizations aren’t breaking anything, I have a test suite that benchmarks the entire codebase (both optimized and original) and verifies that both produce identical results (basically just feeding a couple of different streams through the decoder and CRC32-ing the outputs).
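The harness described above might look roughly like this. This is only a sketch: the `decode` method is a hypothetical stand-in for the real MPEG decoder, but the `ByteArrayInputStream`/`CRC32`/`System.nanoTime` combination matches what the question describes.

```java
import java.io.ByteArrayInputStream;
import java.util.zip.CRC32;

public class DecoderBench {
    // Hypothetical stand-in for the real MPEG decoder under test.
    static byte[] decode(ByteArrayInputStream in) {
        byte[] out = new byte[in.available()];
        in.read(out, 0, out.length);
        return out;
    }

    // CRC32 over the produced PCM, used to verify identical output.
    static long checksum(byte[] pcm) {
        CRC32 crc = new CRC32();
        crc.update(pcm, 0, pcm.length);
        return crc.getValue();
    }

    public static void main(String[] args) {
        byte[] stream = new byte[]{1, 2, 3, 4}; // stand-in for an MPEG stream
        long t0 = System.nanoTime();
        long ref = checksum(decode(new ByteArrayInputStream(stream)));
        long t1 = System.nanoTime();
        System.out.println("crc=" + Long.toHexString(ref) + " ns=" + (t1 - t0));
        // In the real suite, ref from the original decoder is compared
        // against the optimized decoder's CRC for each stream.
    }
}
```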

When using the “-server” option with the Sun 1.6.0_18 JVM, the test suite runs about 12% slower on the optimized version after warmup (in comparison to the default “-client” setting), while the original codebase gains a good boost, running about twice as fast as in client mode.

At first this seemed to be simply a warmup issue, so I added a loop to repeat the entire test suite multiple times. Execution times then become constant for each pass starting at the 3rd iteration of the test, yet the optimized version stays 12% slower than in client mode.

I am also pretty sure it’s not a garbage collection issue, since the code involves absolutely no object allocations after startup. The code consists mainly of bit manipulation operations (stream decoding) and lots of basic floating-point math (generating PCM audio). The only JDK classes involved are ByteArrayInputStream (which feeds the stream to the test and excludes disk I/O from the measurements) and CRC32 (to verify the result). I also observed the same behaviour with Sun JDK 1.7.0_b98 (only there it’s 15% instead of 12%).

Oh, and the tests were all done on the same machine (single core, WinXP) with no other applications running. While there is some inevitable variation in the measured execution times (using System.nanoTime, btw), the variation between different test runs with the same settings never exceeded 2%, usually less than 1% (after warmup), so I conclude the effect is real and not purely induced by the measuring mechanism/machine.

Are there any known coding patterns that perform worse on the server JIT? Failing that, what options are available to “peek” under the hood and observe what the JIT is doing there?

  • Maybe I misworded my “warmup” description. There is no explicit warmup code. The entire test suite (consisting of 12 different MPEG streams, containing ~180K audio frames in total) is executed 10 times, and I regard the first 3 runs as “warmup”. One test round takes approximately 40 seconds of 100% CPU on my machine.

  • I played with the JVM options as suggested, and using “-Xms512m -Xmx512m -Xss128k -server -XX:CompileThreshold=1 -XX:+PrintCompilation -XX:+AggressiveOpts -XX:+PrintGC” I could verify that all compilation takes place in the first 3 rounds. Garbage collection kicks in every 3-4 rounds and takes 40ms at most (512m is extremely oversized, since the tests run fine with 16m). From this I conclude that garbage collection has no impact here. Still, comparing client to server (other options unaltered), the 12/15% difference remains.



1 Answer

  1. Editorial Team
     Answered: May 15, 2026 at 2:54 am

    As you’ve seen, the JIT can skew test results, since it runs in a background thread, stealing CPU cycles from the main thread running your test.

    As well as stealing cycles, the JIT is also asynchronous, so you cannot be sure it has finished its work when you complete warmup and start your test for real. You can use the nonstandard -Xbatch option to force JIT compilation onto the foreground thread, so you can be sure the JIT has finished when your warmup completes.
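    Besides -Xbatch, you can also check from inside the test whether the JIT is still busy. A small sketch using the standard java.lang.management.CompilationMXBean (sample it before and after each round; once the cumulative compilation time stops growing, warmup is done):

```java
import java.lang.management.CompilationMXBean;
import java.lang.management.ManagementFactory;

public class JitProbe {
    public static void main(String[] args) {
        CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
        if (jit != null && jit.isCompilationTimeMonitoringSupported()) {
            // Cumulative time (ms) spent in JIT compilation so far. If this
            // number still climbs between warmup rounds, compilation has not
            // finished and the timed rounds are being disturbed by it.
            System.out.println(jit.getName() + ": "
                    + jit.getTotalCompilationTime() + " ms compiling");
        }
    }
}
```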

    HotSpot doesn’t compile methods right away, but waits until a method has been executed a certain number of times. On the page for the -XX options, it states that the default for -server is 10000 invocations, while for -client it is 1500. This could be a cause of the slowdown, particularly if your warmup ends up invoking many critical methods between 1500 and 10000 times: with the -client option they will be JITed during the warmup phase, but with -server, compilation may be delayed into the timed execution of your profiled code.

    You can change the number of method invocations needed before HotSpot compiles a method by setting -XX:CompileThreshold. I chose twenty so that even vaguely hot spots (luke-warm spots?) are compiled during the warmup, even when the test is run just a few times. This has worked for me in the past, but YMMV, and different values may give you better results.
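    To make the threshold effect concrete, here is a sketch (the method and call counts are hypothetical; the 1500/10000 defaults are as quoted above). A method invoked ~2000 times during warmup has crossed -client’s default threshold but not -server’s, so under -server it may still be interpreted when the timed region starts. Running with something like java -server -Xbatch -XX:CompileThreshold=20 WarmupSketch should make that difference disappear.

```java
public class WarmupSketch {
    // Hypothetical stand-in for a hot decoder inner-loop method.
    static double hot(double x) {
        return x * 1.0001 + 0.5;
    }

    public static void main(String[] args) {
        double acc = 0;
        // Warmup: 2000 calls -- above -client's default CompileThreshold
        // (1500) but below -server's (10000), so under -server this method
        // may enter the timed region below still uncompiled.
        for (int i = 0; i < 2000; i++) acc = hot(acc);

        long t0 = System.nanoTime();
        for (int i = 0; i < 1000000; i++) acc = hot(acc);
        long elapsed = System.nanoTime() - t0;
        System.out.println("timed ns=" + elapsed + " acc=" + acc);
    }
}
```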

    You might also check the HotSpot VM Options page for the other settings that differ between -client and -server, particularly the garbage collector defaults, as these differ considerably.

    See

    • Java HotSpot VM Options


© 2021 The Archive Base. All Rights Reserved
