Microseconds To Seconds Conversion

1 µs = 0.000001 s

How to convert microseconds to seconds (µs to s)

The formula for converting microseconds to seconds is: s = µs ÷ 1e+6. To express a value in microseconds as seconds, substitute it into the formula and perform the division. For example, to convert 1 microsecond to seconds we follow these steps:

s = µs ÷ 1e+6

s = 1 ÷ 1e+6

s = 0.000001

In other words, 1 microsecond is equal to 0.000001 seconds.
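If you prefer to do this conversion in code, the calculation is a one-line division. The sketch below is a minimal Python illustration; the function name microseconds_to_seconds is an illustrative choice, not part of any particular library.

```python
def microseconds_to_seconds(us: float) -> float:
    """Convert a duration in microseconds to seconds (s = µs ÷ 1e+6)."""
    return us / 1e6

print(microseconds_to_seconds(1))  # 1e-06, i.e. 0.000001 s
```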


Example Conversion

Let's take a look at an example. The step-by-step process to convert 3 microseconds to seconds is:

  1. Understand the conversion formula: s = µs ÷ 1e+6
  2. Substitute the required value. In this case we substitute 3 for µs so the formula becomes: s = 3 ÷ 1e+6
  3. Calculate the result using the provided values. In our example the result is: 3 ÷ 1e+6 = 0.000003 s

In summary, 3 microseconds is equal to 0.000003 seconds.
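The same three steps map directly onto a couple of lines of Python; the snippet below simply repeats the 3 µs calculation as a sanity check.

```python
us = 3        # value in microseconds
s = us / 1e6  # apply the formula s = µs ÷ 1e+6
print(s)      # 3e-06, i.e. 0.000003 s
```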


Converting seconds to microseconds

To convert the other way around, i.e. seconds to microseconds, use the following formula: µs = s × 1000000. Substitute the value in seconds into the formula and perform the multiplication. For example, to convert 1 second to microseconds we follow these steps:

µs = s × 1000000

µs = 1 × 1000000

µs = 1000000

In other words, 1 second is equal to 1000000 microseconds.
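The reverse conversion is the matching multiplication. Below is a minimal Python sketch; again, the function name seconds_to_microseconds is purely illustrative.

```python
def seconds_to_microseconds(s: float) -> float:
    """Convert a duration in seconds to microseconds (µs = s × 1,000,000)."""
    return s * 1_000_000

print(seconds_to_microseconds(1))  # 1000000
```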


Conversion Unit Definitions

What is a Microsecond?

A microsecond (μs) is a unit of time measurement that represents one millionth (1/1,000,000) of a second. It is an extremely small unit of time and is commonly used in various scientific, technological, and computing applications.
To provide an example of a microsecond, let's consider the time it takes for an electrical signal to travel along a wire or through a circuit. Electrical signals typically propagate at speeds close to the speed of light, which is approximately 299,792,458 meters per second (m/s) in a vacuum. By calculating the time it takes for an electrical signal to cover a certain distance, we can determine the duration in microseconds.
For instance, suppose an electrical signal travels along a wire or through a circuit for a distance of 300 meters. Using the formula Time = Distance / Speed, we can calculate:
Time = 300 meters / 299,792,458 meters per second ≈ 0.0000010007 seconds ≈ 1.0007 microseconds
Therefore, it takes approximately 1 microsecond for an electrical signal to travel a distance of 300 meters.
Microseconds are used in various applications that require precise timing and fast operations. They are commonly encountered in fields such as telecommunications, digital signal processing, computer networking, and high-speed computing. For example, in computer systems, the response times of memory operations and the execution times of certain instructions are often measured in microseconds.
In summary, a microsecond (μs) is a unit of time that represents one millionth of a second. The example of the time it takes for an electrical signal to travel 300 meters demonstrates how microseconds are used to measure extremely short durations, particularly in scientific, technological, and computing contexts.
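The propagation-time example can be reproduced with a short calculation. The sketch below assumes the idealised case of a signal travelling at the vacuum speed of light; real signals in copper or fibre are somewhat slower, so treat the result as an upper-bound illustration.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # vacuum speed of light
distance_m = 300                      # distance from the example above

time_s = distance_m / SPEED_OF_LIGHT_M_PER_S  # time = distance / speed
time_us = time_s * 1_000_000                  # convert seconds to microseconds
print(f"{time_us:.4f} µs")                    # ≈ 1.0007 µs
```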

What is a Second?

A second (s) is the base unit of time measurement in the International System of Units (SI). It is defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between two hyperfine levels of the ground state of the cesium-133 atom.
To provide an example of a second, let's consider a simple action like snapping your fingers. The time it takes for the sound of a finger snap to occur is typically on the order of milliseconds, which is a fraction of a second. However, if we zoom in further, a second can be divided into smaller units such as milliseconds, microseconds, and nanoseconds.
For instance, if we take 1 second and divide it into smaller intervals of 1 millisecond each, we would have 1,000 milliseconds in a second. Each millisecond represents a thousandth of a second. This level of precision is often used in fields that require accurate time measurement, such as scientific experiments, computing, and telecommunications.
In everyday life, we use seconds as a fundamental unit of time to measure durations, intervals, and clock time. For example, when you count "1...2...3...," each count represents a second. When you check the time on a clock, it displays the hours, minutes, and seconds elapsed since midnight.
Additionally, seconds are crucial in measuring the speed of events, such as the time it takes for a car to accelerate from 0 to 60 miles per hour or the duration of a short video clip.
In summary, a second (s) is the base unit of time in the SI system. It represents the duration of 9,192,631,770 periods of the radiation corresponding to the transition between two hyperfine levels of the cesium-133 atom. The example of snapping your fingers highlights how seconds are used to measure everyday durations, and they can be further divided into smaller units like milliseconds for more precise time measurement.

Microseconds To Seconds Conversion Table

Below is a lookup table showing common microseconds to seconds conversion values.

Microsecond (µs)    Second (s)
1 µs                0.000001 s
2 µs                0.000002 s
3 µs                0.000003 s
4 µs                0.000004 s
5 µs                0.000005 s
6 µs                0.000006 s
7 µs                0.000007 s
8 µs                0.000008 s
9 µs                0.000009 s
10 µs               0.00001 s
11 µs               0.000011 s
12 µs               0.000012 s
13 µs               0.000013 s
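For reference, the values in the table above can be regenerated with a short loop; this is just an illustrative sketch, not how the table was originally produced.

```python
for us in range(1, 14):  # 1 µs through 13 µs, as in the table
    print(f"{us} µs = {us / 1e6:.6f} s")
```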

Microseconds To Seconds Conversion Chart