Science of D-Day: Forecasts played big role in Allied invasion
Posted June 5
Updated June 6
You might remember Greg Fishel’s Veterans Day discussion of the role weather forecasting played in the D-Day invasion on June 6, 1944.
Group Captain James Stagg, the British Royal Air Force meteorologist, spotted a break in the weather that the German meteorologists had missed, enabling the landing of 150,000 Allied troops along a 50-mile stretch of beach during the limited daytime window.
Another British scientist, Oceanographer Arthur Thomas Doodson, was also instrumental in making D-Day the surprise invasion it was.
The cover of night was needed for the crossing of the English Channel, but Allied air forces favored a late-rising moon to navigate by. The Army favored high tide, which would reduce the stretch of exposed beach soldiers would have to cross under fire.
German Field Marshal Erwin Rommel anticipated the invasion and the cover high tide would provide and ordered the installation of hidden obstacles that would rip out the bottoms of landing craft. This forced the Navy to insist on a landing time closer to low tide when obstacles were not submerged and at first light to enable demolition teams to open up corridors for the following landing craft.
The Navy had learned the importance of accurate tidal forecasts the previous year in the Pacific Theater at the Battle of Tarawa, when landing craft were stranded on a reef by neap tides on Nov. 20, 1943. Neap tides occur when the moon sits at right angles to the sun as seen from Earth, so the two bodies' gravitational pulls partially cancel and the tidal range is at its smallest.
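The spring/neap effect can be sketched with back-of-the-envelope numbers. In this illustrative snippet (not the Navy's actual model), the lunar (M2) and solar (S2) semidiurnal constituents add when sun and moon are aligned (spring tides) and partially cancel at quadrature (neap tides); the amplitude ratio S2/M2 ≈ 0.46 is the standard theoretical value.

```python
# Illustrative comparison of spring vs. neap tidal range using the
# theoretical ratio of solar to lunar semidiurnal forcing (S2/M2 ~ 0.46).
M2 = 1.00   # lunar semidiurnal amplitude (normalized)
S2 = 0.46   # solar semidiurnal amplitude relative to M2

spring_range = M2 + S2  # constituents in phase: largest tidal range
neap_range = M2 - S2    # constituents opposed: smallest tidal range

print(f"spring/neap range ratio ~ {spring_range / neap_range:.1f}")
```

On this simple account, the tidal range at spring tide is roughly 2.7 times the range at neap tide, which is why a neap-tide landing left so little water over the reef at Tarawa.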
Doodson calculated that only June 5, 6 or 7, 1944, had the right combination of the tides and moonlight.
The complex process of predicting tides dates back to Isaac Newton's Principia, published in 1687. Improvements to the equations were made throughout the 1700s, and more accurate representations of lunar motion were applied around the turn of the 20th century by Doodson.
The equations have remained pretty much the same since, though the calculators have changed through the years.
Purpose-built analog computers appeared as early as the 1800s to carry out the complex trigonometry contained in these equations. Each machine consisted of a drive wheel and a dozen or more gears and pulleys that could be adjusted to a station's tidal constituents (parameters such as mean water level, amplitude and speed).
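The trigonometry those machines mechanized is a sum of cosine terms, one per constituent, added to the mean water level. Here is a minimal sketch of that harmonic method; the constituent amplitudes and phases below are hypothetical, not real values for any station (only the speeds, in degrees per hour, are the standard ones for M2, S2 and K1).

```python
import math

# Harmonic tide prediction: h(t) = H0 + sum_i A_i * cos(speed_i * t + phase_i)
# Each gear/pulley pair on the analog machines contributed one cosine term.

H0 = 1.5  # mean water level in meters (hypothetical)
constituents = [
    # (amplitude m, speed deg/hr, phase deg) -- amplitudes/phases are made up
    (0.50, 28.984, 30.0),   # M2: principal lunar semidiurnal
    (0.12, 30.000, 60.0),   # S2: principal solar semidiurnal
    (0.08, 15.041, 120.0),  # K1: lunisolar diurnal
]

def tide_height(t_hours):
    """Predicted water level t_hours after the reference epoch."""
    return H0 + sum(
        amp * math.cos(math.radians(speed * t_hours + phase))
        for amp, speed, phase in constituents
    )

# Sample the first 12 hours at hourly steps
heights = [tide_height(t) for t in range(13)]
```

The prediction never strays from the mean level by more than the sum of the amplitudes, and adding a constituent is just adding one more term to the sum, which is why machines could grow from 8 constituents to 26 to NOAA's 37 without changing the underlying equation.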
Early machines accounted for just eight constituents. The machine designed by Doodson and used for D-Day calculations was capable of computing 26 constituents. Today, NOAA bases its calculations on 37 constituents (example: Cape Hatteras Fishing Pier).
These early computers were so important to the war effort that the machines and their data were classified (publication of tide prediction tables was banned during WWII), and the British kept two machines running seven days a week. The machines were kept separated at the Liverpool Tidal Institute at Bidston Observatory to minimize the chance of losing both as bombs fell in the area.
Doodson’s calculations of the fast-rising tides at each landing beach produced diagrams of tidal and illumination conditions for military planners, confirming that only June 5-7 offered the right combination of tides, moonlight and twilight.
NOAA (then the U.S. Coast and Geodetic Survey) continued to use mechanical computers to predict tides until the mid-1960s, when mainframe computers running software written in FORTRAN took over the job.
Tony Rice is a volunteer in the NASA/JPL Solar System Ambassador program and software engineer at Cisco Systems. You can follow him on Twitter @rtphokie.