There was a time when 64 KB was considered a lot of memory. Then again, a million dollars was once a lot of money (as cheerfully depicted in the first 'Austin Powers' movie, when a freshly thawed Dr. Evil demands one million dollars, only to be corrected to one billion by his cohorts in crime). It's funny how things change through the years. Today, in computing terms, 64 KB (and now even 4 GB) is not that much memory. 4 GB is the limit of how much memory a 32-bit processor can address, and an iPhone I can hold in my hand has at least 8 GB. Wow!
To take full advantage of 64-bit processors, applications need to be compiled as 64-bit code and run on a 64-bit operating system. Fortunately for both developers and end users, this technology has been around for a while: in the Windows marketplace, the MSVC compiler has been able to produce 64-bit code since 2005 (and maybe even a bit before). Better still, if a program doesn't do a lot of low-level pointer manipulation, it can often just be recompiled.
For many server applications, it makes sense to compile and run as 64-bit code for the performance advantages, both from the larger addressable memory and from the processor operating on 64-bit data internally. How is this better? For a colorful illustration, imagine two large dinosaurs standing next to each other enjoying a nice meal. Let's assume they both have the same size teeth, but one has a much larger jaw with 64 teeth, while the other has only 32. Which one would devour a larger volume of his meal in the same time period?
If there is one thing I have learned in all my years working in software and hardware engineering, it's that rapid advances in technology have become the norm. Vinyl records and cassettes (not to mention 8-track tapes, remember those?) were replaced by CDs, and then music went all digital, accessible by online download. Flat-screen monitors and TVs replaced CRTs. And does anyone remember using hard-copy encyclopedias for reports or research? (Not me, of course.)
I generally don't make public predictions about where technology is headed, but I do believe that 64-bit applications will replace 32-bit ones at a rapid pace, since porting a 32-bit application is not very difficult; it just takes a little time.
Someday, maybe I will be telling my grandchildren, "I remember when 4 GB was a lot of memory" and, "A billion dollars used to be a lot of money."
What do you think the next advance will be? How long will it take for 64-bit to become the standard?