How do computers multiply two numbers?

Suppose I have to perform the multiplication 100*100 … so how many multiplications are actually performed inside the computer to do this? I searched on Stack Overflow, but I didn't understand their logic…! Help!

Hi va1ts7_100,

Computers use Booth's algorithm for multiplication. This process is slightly different from how we
do normal multiplication; there is also shifting of bits. There are many videos on Booth's algorithm
on YouTube, and it is easy to understand.

Booth’s multiplication algorithm wiki link

I recall from taking an organization course that it's done in a 32-bit or 64-bit register (depending on your processor): the two numbers are first converted to n-digit binary, and then shifted copies are summed up to n times, as in the sketch below.
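
To make that concrete, here is a minimal Python sketch of that shift-and-add idea (the function name shift_and_add is mine, just for illustration; real CPUs implement this in hardware, often with faster variants). For the original question: 100 is 1100100 in binary, which has three 1 bits, so this scheme needs at most three shifted additions.

    def shift_and_add(a, b):
        # Grade-school binary multiplication: for every 1 bit in the
        # multiplier b, add a copy of the multiplicand a shifted left
        # into that bit's position.
        product = 0
        shift = 0
        while b:
            if b & 1:                   # this multiplier bit is 1
                product += a << shift   # add the shifted multiplicand
            b >>= 1
            shift += 1
        return product

    print(shift_and_add(100, 100))      # 10000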

For more detail on this, you can look at the CO201 Computer Architecture and Organization course;
this course is generally taught in the 3rd semester of B.Tech (at NITs).

For multiplication there are two major algorithms:

* Robertson's Multiplication Algorithm
* Booth's Multiplication Algorithm

These algorithms are really simple; just take a look and you will definitely understand them very quickly.
They are quite straightforward.

There are two major operations here (we do the multiplication at the binary level, e.g. 101 * 011):

  • compare

    • multiplying bits 1 and 1 gives 1
    • multiplying bits 0 and 1, 1 and 0, or 0 and 0 gives 0
  • shift

    • just as we use a cross mark ‘x’ as a placeholder while shifting in the grade-school
      multiplication method, here we shift the bits

    0101
  x  011
  ------
    0101
   01010
  000000
  ------
  001111

Now you see how simple it is… the computer does in binary exactly what we do in decimal. The sketch below reproduces this example.
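
Here is a small Python sketch (the names are mine, just for illustration) that prints the compare-and-shift rows from the example above:

    def print_partial_products(a, b, bits=3, width=6):
        # One row per multiplier bit: "compare" picks the row's value,
        # "shift" moves it left by the bit's position.
        product = 0
        for i in range(bits):
            partial = (a << i) if (b >> i) & 1 else 0
            print(format(partial, '0%db' % width))
            product += partial
        print('-' * width)
        print(format(product, '0%db' % width))

    print_partial_products(0b0101, 0b011)   # 000101, 001010, 000000 -> 001111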

Go through these for more detail:

1). Robertson's Multiplication Algorithm

2). Booth's Multiplication Algorithm
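
Since both answers point to Booth's algorithm, here is a minimal Python sketch of the radix-2 version for n-bit two's complement operands (the register names A, Q, Q-1 and M follow the usual textbook description; this illustrates the steps, it is not literally what the hardware executes):

    def booth_multiply(m, q, n=8):
        # Booth's radix-2 multiplication of two n-bit two's
        # complement numbers, kept in textbook-style registers.
        mask = (1 << n) - 1
        M = m & mask          # multiplicand
        A = 0                 # accumulator
        Q = q & mask          # multiplier
        Q_1 = 0               # extra bit to the right of Q

        for _ in range(n):    # always exactly n iterations
            pair = (Q & 1, Q_1)
            if pair == (0, 1):             # 01 -> A = A + M
                A = (A + M) & mask
            elif pair == (1, 0):           # 10 -> A = A - M
                A = (A - M) & mask
            # arithmetic right shift of the combined A:Q:Q-1 register
            Q_1 = Q & 1
            Q = (Q >> 1) | ((A & 1) << (n - 1))
            A = (A >> 1) | (A & (1 << (n - 1)))   # preserve A's sign bit

        result = (A << n) | Q              # 2n-bit product
        if result >> (2 * n - 1):          # interpret as signed
            result -= 1 << (2 * n)
        return result

    print(booth_multiply(5, 3))     # 15
    print(booth_multiply(-4, 7))    # -28

Note that the loop runs exactly n times whatever the operands are, with at most one addition or subtraction per step; that fixed count is the kind of answer the original question was after.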

***Happy coding***

Thanks @deepakmourya