How does a computer do division? Please explain in layman's terms.
Also, does division take longer as the number gets bigger? Would it take only a second for a supercomputer to divide a number with millions of digits by a small number? Please give me an answer other than "you're not being specific enough" or "it depends". Thank you.