My question relates to sound travel: "thunder from a distant storm," for example. Assuming I understand this correctly, as the temperature rises, molecules bump into each other faster, allowing the energy from the sound wave to travel faster. That being said, on two given days with different temperatures, and a storm at the same distance with the same lightning strike, will the sound arrive faster on the warm day? If so, does that skew the "one thousand one" sound count a little?
Posted May 22, 2007
MIKE MOSS SAYS: Michael, to give a range of typical near surface temperatures that might be experienced with thunderstorms in the vicinity, consider that sound will travel at roughly 755 mph through 50 degree air, 770 mph through 70 degree air, and 785 mph through 90 degree air.
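Those figures line up with the standard dry-air approximation for the speed of sound, v ≈ 331.3 m/s × sqrt(T / 273.15), with T in kelvins. That formula isn't part of the answer above, but as an illustrative sketch it reproduces the quoted speeds to within a couple of mph:

```python
import math

def speed_of_sound_mph(temp_f):
    """Approximate speed of sound in dry air, using the standard
    v = 331.3 m/s * sqrt(T_kelvin / 273.15) relation, converted to mph."""
    temp_k = (temp_f - 32) * 5 / 9 + 273.15    # Fahrenheit -> Kelvin
    v_ms = 331.3 * math.sqrt(temp_k / 273.15)  # meters per second
    return v_ms * 2.23694                      # m/s -> mph

for f in (50, 70, 90):
    # Prints roughly 755, 770, and 785 mph for 50, 70, and 90 degrees
    print(f"{f} F: {speed_of_sound_mph(f):.0f} mph")
```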
On the other hand, the "rule of thumb" we've all learned is to figure about 1 mile for every 5 seconds as the distance to a lightning strike. If you convert this to miles per hour, you come up with 720 (this equates to the speed of sound at about 2 degrees F, a temperature typical this time of year at about 20,000 feet above the ground), a notable difference from the numbers given above. So why do we use the 5 seconds per mile rule?
First, it is simple, easy to remember, and easily calculated. Second, since lightning may occur completely inside a cloud, or may extend from high on the side of a cloud down to the ground in some strikes, parts of the lightning channel emit sound waves that travel through cooler air at higher altitudes along part of their trip to your ear, so basing the speed simply on the temperature near the surface isn't always appropriate. In fact, for cloud-to-ground strikes more than a few miles away, the sound from the lowest portion of the strike will be refracted upward and may not even be heard at your location; instead, the sound from somewhat higher above the ground is more likely to reach you.
Finally, this is all intended to be a rough estimate anyway, not a precise method of pinpointing the distance. Consider that going from 720 mph to 770 mph is only about a 7% increase. That means if you counted 15 seconds and determined that a strike was about three miles away using the 720 mph-based rule, in reality it might be about 3.2 miles away, since faster-moving sound covers a little more ground in the same count. This difference is not very meaningful in the context of the accuracy with which you can distinguish and time a given strike in most instances, and for purposes of doing a quick estimate in your head, it is much easier to use 5 seconds per mile than, for example, the 4.7 seconds per mile that would equate to about 770 mph. Likewise, in terms of nice round numbers, using 5 seconds (720 mph) comes much closer to reality than either 4 seconds (900 mph) or 6 seconds (600 mph).
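To see the size of that error concretely, here is a small illustrative sketch comparing the 5-seconds-per-mile estimate against the distance sound actually covers at the two speeds quoted above (720 mph for the rule of thumb, 770 mph for 70-degree surface air):

```python
def actual_distance_miles(seconds, speed_mph):
    """Distance the thunder actually traveled during the count."""
    return speed_mph / 3600 * seconds  # mph -> miles per second, times count

seconds = 15
rule_estimate = seconds / 5  # 5-seconds-per-mile rule gives 3.0 miles
for mph in (720, 770):
    d = actual_distance_miles(seconds, mph)
    # 720 mph matches the rule exactly; 770 mph gives about 3.2 miles
    print(f"{mph} mph: {d:.1f} miles (rule of thumb says {rule_estimate:.1f})")
```

At the warm-day speed the strike is roughly 0.2 miles farther than the rule suggests, which is well inside the uncertainty of timing thunder by counting.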