Rico Mariani's Performance Tidbits

Implying no warranties and conferring no rights: "AS IS" since 1988

Visual Studio: Why is there no 64 bit version? (yet)


Disclaimer: This is yet another of my trademarked "approximately correct" discussions

From time to time customers or partners ask me about our plans to create a 64 bit version of Visual Studio. When is it coming? Why aren’t we making it a priority? Haven’t we noticed that 64 bit PCs are very popular? Things like that. We just had an internal discussion about “the 64 bit issue” and so I thought I would elaborate a bit on that discussion for the blog-o-sphere.

So why not 64 bit right away?

Well, there are several concerns with such an endeavor.

First, from a performance perspective the pointers get larger, so data structures get larger, and the processor cache stays the same size. That basically results in a raw speed hit (your mileage may vary).  So you start in a hole and you have to dig yourself out of that hole by using the extra memory above 4G to your advantage.  In Visual Studio this can happen in some large solutions but I think a preferable thing to do is to just use less memory in the first place.  Many of VS’s algorithms are amenable to this.  Here’s an old article that discusses the performance issues at some length: http://blogs.msdn.com/joshwil/archive/2006/07/18/670090.aspx
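To make the data-growth point concrete, here is a minimal sketch (the Node layout is invented for illustration, not taken from VS): the same pointer-heavy structure roughly doubles in size when you recompile it for 64 bit, while the processor cache stays exactly the same size.

    #include <cstdio>

    // A typical pointer-heavy node, the kind you find all over a symbol
    // table or a syntax tree.  Only the pointer fields change size.
    struct Node
    {
        Node*       left;     // 4 bytes on x86, 8 bytes on x64
        Node*       right;    // 4 bytes on x86, 8 bytes on x64
        const char* name;     // 4 bytes on x86, 8 bytes on x64
        int         flags;    // 4 bytes either way
    };

    int main()
    {
        // Prints 16 for an x86 build and 32 for an x64 build (the extra
        // 4 bytes on x64 are alignment padding) -- twice the footprint
        // for exactly the same information.
        std::printf("sizeof(Node) = %u bytes\n", (unsigned)sizeof(Node));
        return 0;
    }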

Secondly, from a cost perspective, probably the shortest path to porting Visual Studio to 64 bit is to port most of it to managed code incrementally and then port the rest.  The cost of a full port of that much native code is going to be quite high and of course all known extensions would break and we’d basically have to create a 64 bit ecosystem pretty much like you do for drivers.  Ouch.

[Clarification 6/11/09: The issue is this:  If all you wanted to do was move the code to 64 bit then yes the shortest path is to do a direct port.  But that’s never the case.  In practice porting has an opportunity cost; it competes with other desires.  So what happens is more like this:  you get teams that have C++ code written for 32 bits and they say “I want to write feature X, if I port to managed I can do feature X plus other things more easily, that seems like a good investment” so they go to managed code for other reasons.  But now they also have a path to 64 bit.  What’s happening in practice is that more and more of Visual Studio is becoming managed for reasons unrelated to bitness. Hence a sort of net-hybrid porting strategy over time.]

So, all things considered, my feeling is that the best place to run VS for this generation is in the 32 bit emulation mode of a 64 bit operating system; this doubles your available address space without taking the data-space hit and it gives you extra benefits associated with that 64 bit OS.  More on those benefits later.

Having said that, I know there are customers that would benefit from a 64 bit version but I actually think that amount of effort would be better spent in reducing the memory footprint of the IDE’s existing structures rather than doing a port.  There are many tradeoffs here and  the opportunity cost of the port is high.

Is it expensive because the code is old and of poor quality?

It’s not so much about the quality of the code – a lot of it is only a few releases old – as it is about the sheer amount of code involved.  Visual Studio is huge and most of its packages wouldn’t benefit from 64 bit addressing, but nearly all of it would benefit from using lazier algorithms – the tendency to load too much about the current solution is a general problem which results in slowness even when there is enough memory to do the necessary work.  Adding more memory to facilitate doing even more work that we shouldn’t be doing in the first place tends to incent the wrong behavior.  I want to load less, not more.
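To illustrate what I mean by lazier, here is a hedged sketch (the names are invented for illustration, this is not actual Visual Studio code): instead of eagerly loading everything about a project when the solution opens, the expensive data is computed the first time a feature actually asks for it.

    #include <memory>
    #include <string>
    #include <vector>

    // Hypothetical illustration only.  An idle project costs almost nothing;
    // the expensive per-project data is built on first use and then cached.
    class ProjectInfo
    {
    public:
        explicit ProjectInfo(std::string path) : path_(std::move(path)) {}

        const std::vector<std::string>& Symbols()
        {
            if (!symbols_)                        // first request pays the cost
                symbols_ = LoadSymbolsFromDisk(); // later requests are free
            return *symbols_;
        }

    private:
        std::unique_ptr<std::vector<std::string>> LoadSymbolsFromDisk()
        {
            // Stand-in for the real (slow, memory hungry) work.
            return std::make_unique<std::vector<std::string>>();
        }

        std::string path_;
        std::unique_ptr<std::vector<std::string>> symbols_; // empty until needed
    };

    int main()
    {
        ProjectInfo info("c:\\src\\big.sln");  // path is made up for the example
        info.Symbols();   // the load happens here, not at construction time
        return 0;
    }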

Doesn’t being a 64 bit application save you all kinds of page faults and so forth?

A 64 bit address space for the process isn’t going to help you with page faults except in maybe indirect ways, and it will definitely hurt you in direct ways because your data is bigger.  In contrast a 64 bit operating system could help you a lot!  If you’re running as a 32 bit app on a 64 bit OS then you get all of the 4G address space and all of that could be backed by physical memory (if you have the RAM) even without you using 64 bit pointers yourself.   You’ll see potentially huge improvements related to the size of the disk cache (not in your address space) and the fact that your working set won’t need to be eroded in favor of other processes as much.  Transient components and data (like C++ compilers and their big .pch files) stay cached  in physical memory, but not in your address space.  32 bit processes accrue all these benefits just as surely as 64 bit ones.
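You can see the address space part of this for yourself.  A 32 bit process can ask Windows how much virtual address space it was actually given; the minimal sketch below (plain Win32, nothing VS-specific) reports roughly 2G for a default 32 bit build, and roughly 4G when the executable is linked with /LARGEADDRESSAWARE and runs under WOW64 on a 64 bit OS.

    #include <cstdio>
    #include <windows.h>

    int main()
    {
        MEMORYSTATUSEX status = { sizeof(status) };
        GlobalMemoryStatusEx(&status);

        // Roughly 2048 MB for a plain 32 bit build; roughly 4096 MB for a
        // /LARGEADDRESSAWARE 32 bit build running on a 64 bit OS.
        std::printf("virtual address space: %u MB\n",
                    (unsigned)(status.ullTotalVirtual / (1024 * 1024)));
        return 0;
    }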

In fact, the only direct benefit you get from having more address space for your process is that you can allocate more total memory, but if we’re talking about scenarios that already fit in 4G then making the pointers bigger could cause them to not fit and certainly will make them take more memory, never less.  If you don’t have abundant memory that growth might make you page, and even if you do have the memory it will certainly make you miss the cache more often.  Remember, the cache size does not grow in 64 bit mode but your data structures do.  Where you might get savings is if the bigger address space allowed you to have less fragmentation and more sharing.  But Vista+ auto-relocates images efficiently anyway for other reasons so this is less of a win.  You might also get benefits if the 64 bit instruction set is especially good for your application (e.g. if you do a ton of 64 bit math).
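To put a rough number on the cache point, here is a back-of-the-envelope sketch using the hypothetical Node from the earlier example: against a typical 32KB L1 data cache, the 64 bit build keeps only half as many nodes hot, so a pointer-chasing traversal starts missing sooner.

    #include <cstdio>

    int main()
    {
        // Back-of-the-envelope only; real caches and allocators are messier.
        const int l1_data_cache = 32 * 1024;  // a typical L1 data cache
        const int node_x86      = 16;         // sizeof(Node) in the x86 build
        const int node_x64      = 32;         // sizeof(Node) in the x64 build

        std::printf("nodes resident in L1 (x86): %d\n", l1_data_cache / node_x86); // 2048
        std::printf("nodes resident in L1 (x64): %d\n", l1_data_cache / node_x64); // 1024
        return 0;
    }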

So, the only way you’re going to see serious benefits is if you have scenarios that simply will not fit into 4G at all.  But, in Visual Studio anyway, when we don’t fit into 4G of memory I have never once found myself thinking “wow, System X needs more address space”; I always think “wow, System X needs to go on a diet.”

Your mileage may vary and you can of course imagine certain VS packages (such as a hypothetical data analytics debugging system) that might require staggering amounts of memory, but those should be handled as special cases. And it is possible for us to do a hybrid plan that includes some 64 bit slave processes.

I do think we might seem less cool because we’re 32 bit only, but I think the right way to fight that battle is with good information and a great product.

Then why did Office make the decision to go 64 bit?

This section is entirely recreational speculation because I didn’t ask them (though frankly I should). But I think I can guess why. Maybe a kind reader can tell me how wrong I am :)

First, some of the hardest porting issues aren’t about getting the code to run properly but are about making sure that the file formats the new code generates remain compatible with previous (and future) versions of those formats. Remember, the ported code now thinks it has 64 bit offsets in some data structures.  That compatibility could be expensive to achieve because these things find their way into subtle places – potentially any binary file format could have pointer-size issues. However, Office already did a pass on all its file formats to standardize them on compressed XML, so they cannot possibly have embedded pointers anymore. That’s a nice cost saver on the road to 64 bit products.
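Here is a hedged sketch of the classic way pointer sizes leak into a binary file format (the Record layout is invented for illustration): a struct written straight to disk silently changes shape when it is recompiled for 64 bit, so old readers and new writers no longer agree on where one record ends and the next begins.

    #include <cstdio>

    // Hypothetical record from an old binary format.  Because the struct is
    // written verbatim, the on-disk layout depends on pointer size: 12 bytes
    // in an x86 build, 24 bytes (including padding) in an x64 build.
    struct Record
    {
        int   id;
        char* cachedName;   // should never have been persisted, but it was
        long  fileOffset;
    };

    int main()
    {
        Record r = { 42, 0, 0 };
        FILE* f = std::fopen("data.bin", "wb");
        if (!f) return 1;
        std::fwrite(&r, sizeof(r), 1, f);  // sizeof(r) differs between builds
        std::fclose(f);
        return 0;
    }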

Secondly, on the benefit side, there are customers out there that would love to load enormous datasets into Excel or Access and process them interactively. Now in Visual Studio I can look you in the face and say “even if your solution has more than 4G of files I shouldn’t have to load it all for you to build and refactor it” but that’s a much harder argument to make for say Excel.

In Visual Studio if you needed to do a new feature like debugging of a giant analytics system that used a lot of memory I would say “make that analytics debugging package 64 bit, the rest can stay the way they are” but porting say half of Excel to 64 bits isn’t exactly practical.

So the Office folks have different motivations and costs and therefore came to different conclusions -- the above are just my personal uninformed guesses as to why that might be the case.

One thing is for sure though: I definitely think that the benefits of the 64 bit operating system are huge for everyone. Even if it were nothing more than using all that extra memory as a giant disk cache, that alone would be fabulous, and you get a lot more than that!


  • Thanks a lot for this great explanation Rico... this was a common question I think and your explanation is clear.

    However, I think that many of us are waiting for a 64 bit version of VS in the future...

  • You could do with making some of the existing slave processes 64bit. fsi.exe is one example. You can use corflags to patch it, just ignore the warning about the stripping of the strong name since it still works despite this. (though if you strip the strong name by hand then it *does* break it)

  • The pointer size increase is a bane.  However, the 64-bit ISA's advantages (larger and more registers) can cause a significant performance boost.  An obvious compromise would be to use just 4gb of address space in a 64-bit process and simply make do with 32-bit pointers.  Although code size would be slightly larger due to instruction size issues, this otherwise looks like an almost pure win (it'd be nice to have this option for other programs too).

    With fancier tricks, some people seem to think that it's possible to use 32-bit pointers even in 64-bit Java programs that use more than 4GB of memory:

    http://docs.thinkfree.com/docs/view.php?dsn=855108

  • Back when I used to do mixed model tricks with 16 bit pointers addressing 64k segments we had interesting compiler magic that would let you use more than one pointer size, sort of: far and near pointers, based pointers.  You could imagine that world all over again with 64 bit -- 32 bit 'near' pointers in a 4G segment.  I guess it could work but boy it was no fun then.  Maybe some special cases or something; there's a rough sketch of the idea at the end of this comment.

    More registers can make a difference but as I was saying the other day, the regular 32 bit instruction set actually has a lot more registers than you might think.  With L1 being what it is, [EBP+4], [EBP+8], and [EBP+12] are all nearly as good as registers.

    Interestingly, when we changed our code-gen patterns in VS2008 so that EBP was no longer used as a general purpose register, we reduced the actual number of available registers by 1 (which sounds like a big deal when you consider that it takes you from 7 to 6, depending on how you count it; that's a significant percentage).  But actually there was no difference.

    Sometimes there is, but really with out of order instructions, register renaming, and a good L1 cache it isn't nearly as bad as it used to be.
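    Here is the rough sketch I mentioned of the "32 bit near pointers in a 4G segment" idea; all the names are invented for illustration.  Objects live in one big arena and link to each other with 32 bit offsets from the arena base, so a link stays 4 bytes even in a 64 bit process (MSVC's __based pointers are essentially compiler support for the same trick).

        #include <cassert>
        #include <cstdint>
        #include <cstdlib>

        // One big "segment": a single full-size base pointer plus a bump allocator.
        struct Arena
        {
            char*    base;
            uint32_t used;
            uint32_t capacity;
        };

        // A 4 byte "near" reference: just an offset from Arena::base.
        typedef uint32_t NearRef;

        static NearRef ArenaAlloc(Arena& a, uint32_t bytes)
        {
            assert(a.used + bytes <= a.capacity);
            NearRef ref = a.used;
            a.used += bytes;
            return ref;
        }

        template <typename T>
        static T* Resolve(const Arena& a, NearRef ref)
        {
            return reinterpret_cast<T*>(a.base + ref);
        }

        struct Node
        {
            NearRef next;   // stays 4 bytes even in a 64 bit process
            int     value;
        };

        int main()
        {
            Arena arena = { static_cast<char*>(std::malloc(1 << 20)), 0, 1 << 20 };

            NearRef first  = ArenaAlloc(arena, (uint32_t)sizeof(Node));
            NearRef second = ArenaAlloc(arena, (uint32_t)sizeof(Node));

            Resolve<Node>(arena, first)->value  = 1;
            Resolve<Node>(arena, first)->next   = second;
            Resolve<Node>(arena, second)->value = 2;
            Resolve<Node>(arena, second)->next  = 0;

            std::free(arena.base);
            return 0;
        }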

  • "That basically results in a raw speed hit (your mileage may vary)"

    Have you measured the performance hit? What was it?

  • Makes me long for the days when 640KB was the memory limit.  Or maybe we should go back to punched cards.  On second thought, we'd just end up debating the merits of 90-column round-hole cards vs. 80-column rectangle-hole cards. :-)

  • Everyone is saying "Go 64bit!". I have listened to them and now I have driver problems, compatibility problems, etc. If you have a 64bit OS then you have to have 64bit (native) tools, native drivers, etc. Plain and simple as that.


  • Maybe Visual Studio doesn't need to be 64-bit -- but please give us an environment that brings parity for 64-bit and 32-bit development!

    So for example bring the improvements made to the 32-bit-JIT-compiler in 3.5 SP1 to 64-bit. And bring Historical Debugging to 64-bit...

  • Apart from memory there is another reason to want a 64 bit VS IDE: it gets WOW out of the way in VS extensions.

    One disadvantage of using 32 bit Visual Studio on a 64 bit OS is that VS extensions also have to run on the 32 bit WOW layer.

    WOW gets in the way when you want to automate development tasks against 64 bit server products that run on your development box - e.g. SharePoint - from within Visual Studio.

    E.g. when you want to access the registry you get the 'wrong' one (the 32 bit view). There are also differences in Program Files and the GAC.

    We encounter this problem in our Factory Guide VS extension for our "Macaw Solutions Factory".

    So the lack of a 64 bit VS IDE is hindering integration when developing for 64 bit MS server products. (A small illustration of the registry redirection follows below.)
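    A minimal Win32 sketch of the redirection mentioned above (the subkey is just for illustration): by default a 32 bit process gets the WOW64-redirected view of HKLM\Software, though KEY_WOW64_64KEY is the documented way to explicitly request the native 64 bit view.

        #include <cstdio>
        #include <windows.h>

        #pragma comment(lib, "advapi32.lib")

        int main()
        {
            HKEY key;

            // From a 32 bit process this opens the WOW64-redirected (32 bit)
            // view of HKLM\Software by default...
            if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, "Software", 0,
                              KEY_READ, &key) == ERROR_SUCCESS)
                RegCloseKey(key);

            // ...while adding KEY_WOW64_64KEY asks for the native 64 bit view,
            // which is one way a 32 bit extension can reach 64 bit settings.
            if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, "Software", 0,
                              KEY_READ | KEY_WOW64_64KEY, &key) == ERROR_SUCCESS)
                RegCloseKey(key);

            return 0;
        }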


  • I see two issues with this argument:

    1. Things don't work the same in WOW. For example, much noise was made about Edit and Continue for C# (and already for VB) in recent versions. It doesn't work in WOW.

    2. Lots of other apps have been built to run in the VS shell under the VSIP program. A good example is the SQL toolset. Can you really see SQL Server Management Studio staying as a 32 bit tool for many years to come?

  • Porting the VS shell to 64 bit would be a much more interesting project.  Many people could use it (like SSMS) and it's much less code.

    Could be a great incremental step.

    Some folks pointed out that *runtime* support for 64 bit isn't as good as it could/should be (e.g. full edit and continue support is desired).  I totally agree, but of course that isn't because the IDE itself isn't 64 bit.

  • @vincent:

    For VS extensions that really need to be 64 bit (e.g. because of 64 bit services they use) we have been recommending that the extension be split into a 64 bit part that does the heavy lifting and a 32 bit part that hooks into Visual Studio; there's a rough sketch below.  This is the strategy for integrating with 64 bit Office for us.  This actually has many architectural benefits as well but it is more work.  Sorry :(
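    The sketch I mentioned, with all names invented for illustration: the 32 bit piece that runs inside Visual Studio just launches a separate 64 bit helper process and reads its answer over a pipe, so only the helper ever touches the 64 bit-only services.

        #include <cstdio>
        #include <string>

        int main()
        {
            // Hypothetical 64 bit helper exe that talks to the 64 bit server
            // product (SharePoint, say) and prints its answer on stdout.
            // The name and arguments are made up for this example.
            FILE* helper = _popen("MyHelper64.exe --list-sites", "r");
            if (!helper) return 1;

            std::string output;
            char buffer[256];
            while (std::fgets(buffer, sizeof(buffer), helper))
                output += buffer;

            _pclose(helper);
            std::printf("helper replied:\n%s", output.c_str());
            return 0;
        }

    A named pipe or shared memory works the same way; the point is simply that the address-space-hungry work happens in a process that has its own 64 bit address space.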
