Do you measure Internet speed in megabytes or megabits?

Either megabits or megabytes can be used. A byte is equal to exactly eight bits. So, the conversion between the two can be a little tricky. One megabit equals one million bits. One megabyte equals one million bytes, which is equal to eight million bits.

Also, to be technically correct, the speed of data transmission must be measured as an amount of data *per unit of time.* The unit of time is usually the second. The most common unit of measure for internet speed is Mbps, which stands for megabits per second.

For example, when a 1 Mbps connection is advertised, it usually means that the maximum achievable download bandwidth is 1 megabit/s (million bits per second), which is actually 0.125 MB/s (megabyte per second).
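The conversion described above is simple enough to sketch in a few lines of Python (the function name here is my own, chosen for illustration):

```python
def mbps_to_mb_per_s(mbps):
    """Convert megabits per second to megabytes per second.

    A byte is exactly eight bits, so divide megabits by 8
    to get megabytes.
    """
    return mbps / 8

print(mbps_to_mb_per_s(1))    # a 1 Mbps connection -> 0.125 MB/s
print(mbps_to_mb_per_s(100))  # a 100 Mbps connection -> 12.5 MB/s
```

This is why a "100 Mbps" connection downloads files at roughly 12.5 megabytes per second at best, not 100.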

Internet speed can be measured in either megabytes or megabits. However, internet speeds are mostly quoted in megabits, because the number is more accurate, and accuracy is important when you are measuring internet speed.

Why is megabits more accurate?
Technically, megabits per second and megabytes per second should be equally accurate. A byte is exactly eight bits, so a megabit is one million bits, and one million bits is 125,000 bytes (one million divided by eight). A bit and a byte are different units of measurement, but both should be equally accurate.