Bit time

Bit time is a concept in computer networking.[1] It is defined as the time it takes for one bit to be ejected from a network interface controller (NIC) operating at some predefined standard speed, such as 10 Mbit/s.[2] The time is measured from the moment the logical link control sublayer receives the instruction from the operating system until the bit actually leaves the NIC. Bit time depends only on the internals of the NIC; it is unrelated to the time a bit takes to travel across the network medium. It typically serves as the smallest usable reference time, often also referred to as the 'minimum time quantum'.[3][4]

To calculate a NIC's bit time, take the reciprocal of its speed in bits per second:

        bit time = 1 / NIC speed

To calculate the bit time of a 10 Mbit/s NIC, apply the formula:

        bit time = 1 / (10 × 10^6 bit/s)
                 = 10^-7 s
                 = 100 × 10^-9 s
                 = 100 nanoseconds

The bit time for a 10 Mbit/s NIC is therefore 100 nanoseconds; that is, a 10 Mbit/s NIC ejects one bit every 0.1 microsecond.
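
The arithmetic generalizes to any line rate. As an illustration (a minimal sketch, not drawn from the cited sources), the following Python snippet applies bit time = 1 / NIC speed to a few common Ethernet rates; the function name bit_time_seconds and the chosen speeds are this example's own:

        # Bit time = 1 / NIC speed (speed in bits per second).
        # Illustrative sketch; the rates below are common Ethernet speeds
        # chosen for the example, not values taken from the article.
        def bit_time_seconds(nic_speed_bps: float) -> float:
            """Return the bit time in seconds for a NIC of the given speed."""
            return 1.0 / nic_speed_bps

        for label, speed_bps in [("10 Mbit/s", 10e6),
                                 ("100 Mbit/s", 100e6),
                                 ("1 Gbit/s", 1e9)]:
            ns = bit_time_seconds(speed_bps) * 1e9  # seconds -> nanoseconds
            print(f"{label}: bit time = {ns:.0f} ns")

Running it prints 100 ns, 10 ns, and 1 ns respectively; the first case matches the hand calculation above.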

Bit time is distinct from slot time, which is the time taken for a pulse to travel through the longest permitted length of network medium.
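
To make the distinction concrete, the sketch below compares the two quantities under stated assumptions (none of these figures come from the cited sources): a 2,500 m maximum network span, roughly that of classic repeated 10 Mbit/s Ethernet, and a signal propagation speed of about two-thirds the speed of light, typical for copper cable:

        # Bit time depends only on the NIC's speed; propagation time over the
        # medium depends on the medium's length and signal speed.
        # All constants below are assumptions chosen for illustration.
        C = 3.0e8               # speed of light in a vacuum, m/s
        VELOCITY_FACTOR = 0.66  # assumed fraction of c in copper cable
        MAX_SPAN_M = 2500.0     # assumed maximum network span, metres

        bit_time_s = 1.0 / 10e6  # 10 Mbit/s NIC -> 100 ns
        propagation_s = MAX_SPAN_M / (VELOCITY_FACTOR * C)

        print(f"bit time: {bit_time_s * 1e9:.0f} ns")
        print(f"one-way propagation: {propagation_s * 1e6:.1f} us over {MAX_SPAN_M:.0f} m")

Under these assumptions the one-way propagation time is about 12.6 microseconds, two orders of magnitude larger than the 100 ns bit time, which illustrates why the two quantities should not be conflated.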

References

  1. "Definition of bit time". PCMAG. Retrieved 2025-06-23.
  2. Dye, Mark; McDonald, Richard; Rufi, Antoon (2007-10-29). Network Fundamentals, CCNA Exploration Companion Guide. Cisco Press. ISBN 978-0-13-287743-5.
  3. Paret, Dominique (2007-06-13). Multiplexed Networks for Embedded Systems. John Wiley & Sons. p. 90. ISBN 978-0-470-51170-1.
  4. Natale, Marco Di; Zeng, Haibo; Giusto, Paolo; Ghosal, Arkadeb (2012-01-19). Understanding and Using the Controller Area Network Communication Protocol. Springer Science & Business Media. p. 5. ISBN 978-1-4614-0314-2.