Convert Microsecond (µs) to Century (century) instantly.
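The conversion itself is simple arithmetic once a year length is fixed. Below is a minimal sketch, assuming the common converter convention of a 365.25-day (Julian) year, so one century = 100 × 365.25 × 86,400 s = 3.15576 × 10¹⁵ µs; the function names and constant are illustrative, not taken from any particular library.

```python
# Conversion sketch: microseconds <-> centuries.
# Assumes a Julian year of 365.25 days (a common convention for
# calendar-agnostic unit converters); other year definitions shift
# the factor slightly.

MICROSECONDS_PER_CENTURY = 100 * 365.25 * 86_400 * 1_000_000  # 3.15576e15

def microseconds_to_centuries(us: float) -> float:
    """Convert a duration in microseconds to centuries."""
    return us / MICROSECONDS_PER_CENTURY

def centuries_to_microseconds(centuries: float) -> float:
    """Convert a duration in centuries to microseconds."""
    return centuries * MICROSECONDS_PER_CENTURY

if __name__ == "__main__":
    print(microseconds_to_centuries(1))   # ≈ 3.16888e-16 centuries
    print(centuries_to_microseconds(1))   # 3.15576e+15 µs
```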
About these units
Microsecond (µs)
A microsecond equals one millionth of a second (10⁻⁶ s) and is the working scale of electronics, high-speed computation, radar systems, and signal processing. In digital electronics, microseconds describe the switching times of microcontrollers, the bit periods of serial communication at common baud rates, and the pulse widths used in pulse-width modulation (PWM). Flash memory access times, low-latency database operations, and embedded real-time systems are all commonly measured at µs resolution. In aviation and radar, a microsecond is the time it takes a radio wave to travel roughly 300 meters, which is what makes µs timing central to echo ranging. In biology, the auditory system localizes sound by resolving interaural time differences of only tens to hundreds of microseconds. The microsecond is essential for understanding everything from machine communication to some of the fastest events in living organisms.
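To make the electronics examples concrete, here is a small sketch (plain arithmetic, illustrative values only, not drawn from any specific datasheet) that prints the bit period in microseconds for a few common UART baud rates, since one bit lasts 10⁶ / baud µs.

```python
# Bit period in microseconds for common UART baud rates:
# one bit lasts 1e6 / baud microseconds.
for baud in (9_600, 115_200, 1_000_000):
    bit_period_us = 1_000_000 / baud
    print(f"{baud:>9} baud -> {bit_period_us:8.3f} µs per bit")
```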
Century (century)
A century equals 100 years and is a major unit of historical, demographic, and civilizational analysis. Historians frequently divide narratives into centuries to highlight long-term transformations: technological revolutions, the rise and fall of empires, or artistic movements. Sociologists study century-scale changes in population, urbanization, and cultural evolution. Although human lifespans rarely exceed one century, the unit is large enough to encompass sweeping societal change, making it ideal for macrohistorical studies.