Dimitri // Mar 13, 2008 at 10:39 am
Windows has been 64-bit for over two years now, and for almost that long it has been darned near impossible to buy an office computer that does not use a 64-bit multicore processor. You can buy RAM at about $100 for 4GB. If you run 32-bit GIS products in 32-bit Windows you are throwing away years of technical advances to emulate 1990s machines with single-core, 32-bit processors limited to 1 or 2 GB of RAM.
That makes no more sense than buying an expensive eight-cylinder automobile to get hot performance and then disconnecting the ignition to all but four of the cylinders because your mechanic finds eight-cylinder cars too intellectually challenging to work on. Get a new mechanic!
In fact, I’ll bet that just about everyone posting on this blog who has acquired a computer in the last year or so is actually writing on a computer that contains a 64-bit processor, probably at least a dual core processor to boot. If you are running a 64-bit processor you should also be running a 64-bit operating system and 64-bit applications. In the mainstream, this is totally routine.
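Not sure what you are actually running? Here is a minimal sketch in Python (standard library only) that reports both the architecture the machine presents and the bitness of the running process — the two can differ, which is exactly the mismatch being complained about here:

```python
import platform
import struct

# Width of a C pointer in the *running process*:
# 8 bytes on a 64-bit build, 4 bytes on a 32-bit build.
process_bits = struct.calcsize("P") * 8

# Architecture reported by the machine/OS, e.g. "AMD64" or "x86_64"
# on 64-bit hardware.
machine = platform.machine()

print(f"Process is {process_bits}-bit; machine reports {machine!r}")
# Note: a 32-bit program runs happily on 64-bit Windows (via WOW64),
# so process_bits can be 32 even when the machine itself is 64-bit.
```

If the first number comes back 32 on a machine reporting a 64-bit architecture, you are in precisely the "64-bit hardware, 32-bit software" situation described above.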
64-bit operation has huge benefits:
64-bit Windows is much more reliable than 32-bit Windows. Unless you enjoy cracking out your “what to do when it crashes” cheatsheet, you will go 64-bits just for greater reliability.
64-bit applications likewise tend to be much more reliable than 32-bit applications. The main reason is that the application is not confined to the very small 1GB/2GB process spaces of 32-bit Windows, so application errors that would exhaust a cramped 32-bit address space and crash instead have enough slack in a 64-bit application to survive.
64-bit GIS applications tend to run much faster, often by a factor of four or more even for routine desktop work, because modern GIS data sets are routinely big enough to require paging to disk within the small memory spaces allowed by 32-bit Windows. Whenever you page out to disk you trade microsecond RAM response for millisecond disk response, a thousand times slower. Using 64-bit GIS applications will often avoid that paging for phenomenally faster response. I have yet to meet a GIS user who complains about snappier response.
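The thousand-fold figure is simple arithmetic. A sketch, using the round order-of-magnitude latencies from the paragraph above (illustrative figures, not benchmarks):

```python
# Round latency figures from the argument above, in nanoseconds.
# These are order-of-magnitude illustrations, not measurements:
# once a GIS working set pages out, each access that would have
# been a ~microsecond RAM hit becomes a ~millisecond disk hit.
ram_latency_ns = 1_000        # ~1 microsecond
disk_latency_ns = 1_000_000   # ~1 millisecond

slowdown = disk_latency_ns // ram_latency_ns
print(f"Paging to disk is ~{slowdown}x slower than RAM")  # ~1000x
```

That ratio is why simply fitting the whole data set in RAM, which 64-bit address spaces allow, dwarfs most other tuning you could do.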
This effect is especially important if you are doing analytics, because many modern GIS algorithms are recursive, so the ability to actually utilize lots of RAM translates into dramatically faster operation.
Modern RAM is so cheap that it is dumb, dumb, dumb not to pop 4GB or 8GB of RAM into your machine. Keep in mind that Windows itself is getting bigger and bigger, and the amount of crud loaded into your computer gets bigger and bigger (looked at your Processes tab lately?), so it is often the case that the limited memory space you have within 32-bit Windows is not fully available for your GIS process. Pop 4 GB or 8 GB of memory into a 32-bit Windows system and you are wasting it, since Windows won’t be able to use more than 1GB or 2GB of it effectively. Pop 4 GB or 8 GB into a 64-bit Windows system and it can all be used seamlessly. To take advantage of lots of cheap RAM, you *must* run 64-bit Windows.
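The arithmetic behind that waste is straightforward: a 32-bit pointer can address at most 2^32 distinct bytes, no matter how much RAM you install. A sketch:

```python
# A 32-bit pointer can address at most 2**32 distinct bytes.
gib = 2 ** 30  # one GiB

print(f"32-bit address limit: {2 ** 32 // gib} GiB")  # 4 GiB total
# On 32-bit Windows a user process normally sees only about half of
# that (the kernel reserves the rest), so 8 GB of installed RAM is
# largely out of reach of any one application.

# A 64-bit pointer, by contrast, can address 2**64 bytes:
print(f"64-bit address limit: {2 ** 64 // gib} GiB")  # ~17 billion GiB
```

So installing 8 GB under 32-bit Windows buys a single process nothing it can actually touch, while under 64-bit Windows the address space is nowhere near the limiting factor.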
That memory effect is especially profound in IMS servers, where the least expensive way to dramatically increase the number of visitors a web server can host is simply to go 64-bit Windows, 64-bit IMS and toss tons of RAM into a multicore server. Servers with two sockets are now routine and cheap: install two quad-core processors and you get eight cores with lots of RAM that will eat a 32-bit ArcGIS Server installation alive at a tenth of the price. Multiprocessor access to a spatial DBMS is also essential to performance, so lacking that kills ESRI performance on top of not being able to use 64-bit memory spaces.
No doubt the phenomenal disparity between glacial, unreliable 32-bit operation and fast, reliable 64-bit multicore operation is one reason ESRI seeks to go 64-bit with their IMS / ArcGIS Server products first.
And yes, they are getting their heads kicked in by vendors who offer 64-bit, multicore IMS / server products - many users who wouldn’t even think of looking at someone other than ESRI have made the decision to try an alternative because they feel forced into it by the lack of 64-bit and multi-core support in the ESRI product line.
Those customer losses are especially painful to ESRI because it is exactly those customers who need 64-bit performance who are normally the most willing to pay excessive ESRI prices and who are normally the most insulated from poaching by ESRI competitors. They are the last people ESRI wants to get a taste of life with modern price/performance, because once they see for themselves the power and reliability of 64-bit operation in IMS or DBMS apps, they realize that they are probably being failed by ESRI in other GIS product areas as well. It has often been the case that a single customer transition to non-ESRI, 64-bit IMS has ended up costing ESRI millions in lost GIS business overall as the rest of that customer’s GIS usage also transitions. It therefore has not escaped ESRI’s attention that for their GIS competitors, selling 64-bit product against ESRI is like shooting fish in a barrel.
ESRI is caught between a rock and a hard place, because even once they finally introduce some 64-bit server products, all they will succeed in doing is emphasizing how obsolete the rest of their product line is. No sensible person will be happy with 32-bit ArcInfo once they start getting a taste of 64-bit life with, say, a quasi-64-bit ArcGIS Server offering. What’s ESRI going to say… Go 64-bit with ArcGIS Server because it is essential, but stick to being a 32-bit laggard with ArcInfo?
If you don’t use 64-bit GIS products in 64-bit Windows you are making a decision to give up the benefits of modern hardware, modern price/performance and modern reliability. That is especially wasteful given that your hardware is almost certainly 64-bit already. 64-bit hardware and 64-bit Windows have been mainstream for two years now. Don’t buy any excuses from a vendor who is so incompetent at software development that they are years behind what even your kids now think is old hat.
If your vendor is so senile they cannot keep up with mainstream progress, go find one that can, and don’t wait until 2009 or 2010! (…. 2010? … they have got to be kidding…)