Don't worry so much about not understanding that headline. We don't blame you. The Dalvik Virtual Machine is a behind-the-scenes tool that most of you never see, let alone need to worry about. Same goes for the Just-in-Time compiler -- aka the JIT. But those two things are among the main reasons Android 2.2 -- Froyo -- is leaps and bounds faster than its predecessors. (Check out our own benchmarking tests if you need proof.)

Google's Dan Bornstein recently took to the Android Developers Blog to explain more about Dalvik and the JIT. And he turns it into plain English far better than I could:

We added a Just In Time (JIT) compiler to the Dalvik VM. The JIT is a software component which takes application code, analyzes it, and actively translates it into a form that runs faster, doing so while the application continues to run. If you want to learn more about the design of the Dalvik JIT, please watch the excellent talk from Google I/O 2010 given by my colleagues Bill Buzbee and Ben Cheng, which should be posted to YouTube very soon.

To be clear, the differences aren’t always dramatic, nor do they apply uniformly to all applications. Code that is written to run the CPU all-out can now do more in the same amount of time (running faster), and code that is written to be rate-limited can get its work done using less time and less of the CPU (using less battery). On the performance front in particular, we have seen realistic improvements of 2x to 5x for CPU-bound code, compared to the previous version of the Dalvik VM. This is equivalent to about 4x to 10x faster than a more traditional interpreter implementation.

OK, I take it back. I understood parts of that. But I especially understood the part where Dan explained that things run "4x to 10x faster." There's no interpretation needed there. And on top of the speed increases, the JIT is light on RAM, too. It's a win-win. Check out Dan's entire post for the whole nitty-gritty on the JIT and why you'll love it. [Android Developers Blog]
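If you're curious what "CPU-bound code" actually looks like, here's a minimal, hypothetical sketch (the class and method names are made up for illustration, not from Dan's post). It's pure arithmetic in a hot loop -- exactly the kind of code where a JIT shines, because instead of re-interpreting each bytecode on every pass through the loop, the JIT translates the loop into native machine code once and runs that:

```java
// Hypothetical example of CPU-bound code that benefits from a JIT.
// An interpreter re-decodes each bytecode on every loop iteration;
// a JIT notices this loop is "hot" and compiles it to native code.
public class HotLoop {
    // Sum of squares from 1 to n -- pure arithmetic, no I/O,
    // so the CPU is the only bottleneck.
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumOfSquares(1000)); // prints 333833500
    }
}
```

The math the loop does is the same either way; the JIT just gets it done in far fewer machine instructions per iteration, which is where those 2x-to-5x numbers come from.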