kareem_nasser wrote: if you have PC hardware (GPU or CPU) with a 64-bit memory bus width (read & write) clocked at 1 GHz, while another is 128-bit (read & write) at 500 MHz, by calculation (theoretically) they are the same, but what is the difference?
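Just to make that "by calculation" part concrete, here is a quick sketch of the theoretical peak bandwidth arithmetic. It assumes exactly one transfer per clock cycle, which real memory buses (burst modes, DDR, latency) don't actually behave like:

```c
#include <stdio.h>
#include <stdint.h>

/* Theoretical peak bandwidth = (bus width in bytes) * (clock rate).
 * Simplifying assumption: one full-width transfer per clock cycle. */
static uint64_t peak_bytes_per_sec(uint32_t bus_bits, uint64_t clock_hz)
{
    return (uint64_t)(bus_bits / 8) * clock_hz;
}

int main(void)
{
    /* 64-bit bus at 1 GHz vs 128-bit bus at 500 MHz */
    printf("64-bit  @ 1 GHz  : %llu bytes/s\n",
           (unsigned long long)peak_bytes_per_sec(64, 1000000000ULL));
    printf("128-bit @ 500 MHz: %llu bytes/s\n",
           (unsigned long long)peak_bytes_per_sec(128, 500000000ULL));
    return 0;
}
```

Both come out to 8,000,000,000 bytes/s, which is why they look identical on paper.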
Let me explain the difference between 128-bit, 64-bit, 32-bit systems, and so on, because these terms apply to computing systems in general, not just PCs.
First we have the computation cycles of the processor, whose rate is measured in hertz: one hertz is one computation cycle per second, i.e. in one second the processor completes one cycle. Hence 1 GHz is 1,000 megacycles per second, or 1,000,000,000 cycles per second.
Now let's talk about 32-bit vs 64-bit. 64 bits are 64 zeroes and ones that enter the CPU for computation. For example, 1 in decimal is 00000000000000000000000000000001 in 32-bit binary notation, so the biggest number I can fit is 11111111111111111111111111111111, which equals 4294967295 (2^32 - 1) in decimal. The biggest number that fits in 64 bits is 1111111111111111111111111111111111111111111111111111111111111111, or 18446744073709551615 (2^64 - 1) in decimal.
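If you want to see those limits on a real machine, here's a minimal sketch using the standard UINT32_MAX / UINT64_MAX constants:

```c
#include <stdio.h>
#include <inttypes.h>

int main(void)
{
    /* Largest unsigned values for 32 and 64 bits:
     * 2^32 - 1 and 2^64 - 1 respectively. */
    printf("32-bit max: %" PRIu32 "\n", UINT32_MAX);  /* 4294967295 */
    printf("64-bit max: %" PRIu64 "\n", UINT64_MAX);  /* 18446744073709551615 */
    return 0;
}
```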
128 bits, you ask?
1111111111111111111111111111111111111111111111111111111111111111
1111111111111111111111111111111111111111111111111111111111111111
It's just that.
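Standard C doesn't even have a portable 128-bit integer type, so a common trick (just a sketch of the idea) is to carry a 128-bit value around as two 64-bit halves, exactly like the two rows of ones above:

```c
#include <stdio.h>
#include <inttypes.h>

/* A 128-bit value as two 64-bit words: value = hi * 2^64 + lo. */
typedef struct { uint64_t hi, lo; } u128;

int main(void)
{
    /* The all-ones pattern from above: 2^128 - 1. */
    u128 max = { UINT64_MAX, UINT64_MAX };
    printf("128-bit max = 0x%016" PRIx64 "%016" PRIx64 "\n", max.hi, max.lo);
    return 0;
}
```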
The main advantage of a 64- or 128-bit system over a 32-bit one shows up when the CPU calculates with huge numbers: where the 32-bit processor has to split the number into two halves and work on them in separate steps, the 64-bit processor works with the number as is, as the sketch below shows.
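To make that "split into two halves" concrete, here's a rough sketch of adding two 64-bit numbers using only 32-bit arithmetic, roughly the add / add-with-carry pair a compiler would emit on a 32-bit CPU (the function itself passes uint64_t values around just for illustration):

```c
#include <stdio.h>
#include <inttypes.h>

/* Add two 64-bit numbers using only 32-bit operations. */
static uint64_t add64_on_32bit(uint64_t a, uint64_t b)
{
    uint32_t a_lo = (uint32_t)a, a_hi = (uint32_t)(a >> 32);
    uint32_t b_lo = (uint32_t)b, b_hi = (uint32_t)(b >> 32);

    uint32_t lo    = a_lo + b_lo;
    uint32_t carry = (lo < a_lo);       /* did the low half overflow? */
    uint32_t hi    = a_hi + b_hi + carry; /* second add absorbs the carry */

    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t x = 0xFFFFFFFFULL;         /* low half all ones */
    uint64_t y = 1;
    printf("%" PRIu64 "\n", add64_on_32bit(x, y));  /* prints 4294967296 */
    return 0;
}
```

A 64-bit CPU does the same addition in a single instruction, which is exactly the advantage described above.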