Megabit. A unit used to measure digital data. A megabit is 1 million bits, ⅛ the size of a megabyte (MB).
The megabit is a multiple of the unit bit for digital information. The prefix mega (symbol M) is defined in the International System of Units (SI) as a multiplier of 10^6 (1 million), and therefore
1 megabit = 10^6 bits = 1,000,000 bits = 1,000 kilobits.
The megabit has the unit symbol Mbit.
The megabit is closely related to the mebibit, a unit multiple derived from the binary prefix mebi (symbol Mi) of the same order of magnitude, which is equal to 2^20 bits = 1,048,576 bits, approximately 5% larger than the megabit. Despite the definition of these binary prefixes for binary-based quantities of storage by international standards organizations, memory semiconductor chips are still marketed using the metric prefix names to designate binary multiples.
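The roughly 5% gap between the two units follows directly from the definitions; a minimal sketch (constant names are illustrative):

```python
# Decimal (SI) and binary (IEC) bit multiples
MEGABIT = 10**6   # 1 Mbit, SI prefix "mega"
MEBIBIT = 2**20   # 1 Mibit, IEC prefix "mebi"

# How much larger a mebibit is than a megabit
extra_bits = MEBIBIT - MEGABIT
ratio = MEBIBIT / MEGABIT

print(extra_bits)                    # 48576 extra bits
print(f"{(ratio - 1) * 100:.2f}%")   # 4.86%, i.e. "approximately 5%"
```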
Using the common byte size of eight bits and the standardized metric definition of megabit and kilobyte, 1 megabit is equal to 125 kilobytes (kB) or approximately 122 kibibytes (KiB).
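Those byte figures can be checked with a short calculation, assuming the common 8-bit byte:

```python
BITS_PER_MEGABIT = 10**6
BITS_PER_BYTE = 8  # common byte size

bytes_per_megabit = BITS_PER_MEGABIT // BITS_PER_BYTE  # 125,000 bytes
kilobytes = bytes_per_megabit / 1000   # metric kilobytes (kB)
kibibytes = bytes_per_megabit / 1024   # binary kibibytes (KiB)

print(kilobytes)           # 125.0 kB
print(f"{kibibytes:.1f}")  # 122.1 KiB (approximately 122)
```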
The megabit is widely used when referring to data-transfer rates of computer networks or telecommunications systems. Network transfer rates and download speeds often use the megabit as the amount transferred per unit of time, e.g., a 100 Mbit/s (megabit per second) Fast Ethernet connection or a 10 Mbit/s Internet access service, whereas the sizes of the data units (files) transferred over these networks are often measured in megabytes. To achieve a transfer rate of one megabyte per second, one therefore needs a network connection with a transfer rate of eight megabits per second.
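This bits-versus-bytes distinction matters when estimating transfer times. A small sketch of the arithmetic (the function name, file size, and link speed are illustrative, not from the text):

```python
def download_time_seconds(file_megabytes: float, link_megabits_per_s: float) -> float:
    """Time to transfer a file: convert megabytes to megabits (x8), then divide by link rate."""
    file_megabits = file_megabytes * 8
    return file_megabits / link_megabits_per_s

# A 100 MB file over a 10 Mbit/s link: 800 Mbit / 10 Mbit/s = 80 s
print(download_time_seconds(100, 10))  # 80.0

# Sustaining 1 megabyte per second requires an 8 Mbit/s link
print(download_time_seconds(1, 8))     # 1.0
```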