
View Full Version : 64 bit cpus



fergie
03-02-2003, 05:47 PM
Hi ya,
Just reading an article about when AMD is releasing its new CPUs.
Anyway, what are 64-bit CPUs?
What are the current ones - 32-bit?
What does this mean?
Are they going to be the next best thing - and will all current CPUs become ancient overnight or something?
cheers

Gorela
03-02-2003, 06:25 PM
Basically it's talking about the CPU (Central Processing Unit) of the computer. As you said, current computers run 32-bit CPUs. For the record, a byte is 8 bits - the "7 data bits plus one parity bit" arrangement is from old serial transmission, where the parity bit is an error check rather than part of the data.

So the "bits" figure is basically a measure of how wide the CPU's registers and data paths are, and therefore how many bits it can move per cycle. The MHz rating is the clock speed: a P4 at 3000 MHz runs 3000 million cycles per second, and cycles per second times bits per cycle gives you a rough ceiling on bits per second.

The main problem is that you need an operating system that can actually make use of this type of CPU. Supposedly Microsoft don't have anything yet, but luckily a number of Linux suppliers (SuSE for one) are working towards a 64-bit operating system.
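To make the "bits per cycle" idea above concrete, here is a rough sketch in Python (my own illustration, not anything from the original posters): Python integers are arbitrary-precision, so we emulate fixed-width CPU registers by masking results to 32 or 64 bits and watch a sum wrap around on the narrower machine.

```python
# Emulate fixed-width registers by masking: a real CPU silently
# discards any bits above its word size.
MASK32 = (1 << 32) - 1  # largest value a 32-bit register can hold
MASK64 = (1 << 64) - 1  # largest value a 64-bit register can hold

def add32(a, b):
    """Add as a 32-bit CPU would: results wrap past 2**32 - 1."""
    return (a + b) & MASK32

def add64(a, b):
    """Add as a 64-bit CPU would: the same sum fits easily."""
    return (a + b) & MASK64

big = 3_000_000_000  # ~3 billion, already close to the 32-bit ceiling
print(MASK32)            # 4294967295
print(add32(big, big))   # 1705032704 -- wrapped around
print(add64(big, big))   # 6000000000 -- no wraparound
```

This is also why a 32-bit OS matters: the software has to know how wide the registers really are to use them.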

kiwibeat
03-02-2003, 07:01 PM
The current 32-bit CPUs will be around for a long time yet. The benefits of 64-bit are mainly in massive number crunching - simulations, spreadsheets, etc.

Chilling_Silence
03-02-2003, 11:33 PM
AMD are supposed to be releasing a 64-bit CPU that is backwards-compatible with 32-bit software etc...

DangerousDave
04-02-2003, 08:09 AM
lol, when AMD releases the Athlon 64... come to think of it, it was supposed to be out in the 4th quarter last year I think, or maybe that was the Barton. BTW I reckon the new Barton core will kick any P4 ;)

- David

Graham L
04-02-2003, 02:48 PM
Of course, if you have an Alpha workstation (made by Digital ... alas now dissolved into Compaq, and now the HP signs are going up on that building ... :D) you already have a 64 bit processor. It's the word length. It affects the (native) precision of arithmetic. IBM selected 32 bits for the 360 series, which was "not quite enough" precision for scientific floating point work, though enough for financial computations (then -- :D).
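The precision point above can be seen directly. A quick Python sketch (my illustration - and note it uses modern IEEE 754 floats, not the hexadecimal floating point the IBM 360 actually used): a 32-bit float has a 24-bit significand, so it cannot count past 2**24 exactly, while a 64-bit double still can.

```python
import struct

def roundtrip32(x):
    """Store x in a 32-bit IEEE float and read it back."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

n = 2**24 + 1  # 16777217: one past what a 32-bit float can hold exactly
print(roundtrip32(n))   # 16777216.0 -- the last unit is lost
print(float(n))         # 16777217.0 -- a 64-bit double keeps it
```

That lost unit is exactly the "not quite enough precision for scientific work" problem, just at a different word size.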

One of the Control Data big machines used 60 bits, Burroughs large systems used 48 bits, and some of the early DEC ones used 36.

The biggest advantage these days is probably in addressing, as well as for inflation-adjusted financial stuff.
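The addressing advantage is easy to put numbers on (again my own illustration, assuming pointer width equals word width, which real CPUs don't always follow): a 32-bit pointer can distinguish 2**32 byte addresses, a 64-bit pointer 2**64.

```python
# How much memory a flat pointer of each width can address.
GIB = 2**30  # one gibibyte

print(2**32 // GIB)   # 4 -- a 32-bit pointer spans 4 GiB
print(2**64 // GIB)   # 17179869184 GiB (16 EiB) for a 64-bit pointer
```

So even before any speed argument, 64-bit machines can simply see far more RAM than the 4 GiB ceiling that 32-bit addressing imposes.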