That's a question that comes up from time to time, since we often mention near-normal, above-normal, or below-normal values for temperature, precipitation, and so on, and since during some parts of the year, more than a day or two at a time of "normal" anything is actually something of a rarity, although this can depend a little on how you interpret things.
At its most basic level, the term "normal" in a climatological sense refers to the average value of a quantity over a 30-year time span ending with the most recent "zero" year. The World Meteorological Organization settled on this averaging period as a reasonable compromise: a much shorter period would leave the normals too sensitive to short-term fluctuations and variability, while a much longer one would fail to realistically capture and represent meaningful climate shifts. In addition, updating the normals every ten years keeps them reasonably stable for planning and engineering purposes. This 30-year definition is why it's important that we specify the term "normal" rather than using the more generic "average," since that term leaves in question whether it's a six-week, one-year, or 50-year average (among infinite possibilities) that's being discussed. Admittedly, we aren't always as careful in hewing to this distinction on-air as we should be.
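For the curious, the bookkeeping behind "ending with the most recent zero year" is simple enough to sketch in a few lines of code. This is just an illustration of the windowing convention described above; the function name is my own, and it assumes the new normals take effect starting the year after the zero year's data is complete:

```python
def normals_period(year):
    """Return the (start, end) of the 30-year normals window in use
    during the given year. The window ends with the most recent
    completed "zero" year and is updated once per decade."""
    end = (year - 1) // 10 * 10   # most recent completed "zero" year
    return end - 29, end

# For example, through most of the 2010s the normals in use were 1981-2010,
# while in 2009 they were still 1971-2000:
print(normals_period(2013))  # -> (1981, 2010)
print(normals_period(2009))  # -> (1971, 2000)
```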
So, how closely should we expect the conditions on any given day to match the normal for that day? It depends somewhat on the time of year. In wintertime, and to some extent during the late fall and early spring months, the passage of fairly strong frontal systems, low- and high-pressure areas, and associated variations in cloud cover, airmass properties, and precipitation fields means that we can go for days or weeks swinging back and forth between "above normal" and "below normal" temperature and precipitation without many days actually falling close to the 30-year average values. In the dog days of summer, on the other hand, we frequently have significant stretches of time with temperatures that fall within just a few degrees of the normal value each day, thanks to slow-changing pressure patterns and persistent warm, humid airmasses over the region.
As a result of those seasonal characteristics, one can gauge the likelihood of falling somewhere close to the normal value by looking at the standard deviation of temperature for a given day, which is a measure of the statistical variability that attaches to some averaged quantity. As an example, this link shows a table of the daily raw 30-year average high and low temperature and precipitation for the Raleigh-Durham airport, along with the standard deviations of the highs and lows. Note that around late January and early March, the standard deviation for maximum temperature is almost +/- 12 degrees, meaning that (assuming the highs are roughly normally distributed) there is about a 68 percent historical chance that the actual high on a given date will fall within 12 degrees of the average value, and likewise about a 95 percent chance that the high will fall within 24 degrees (two standard deviations) of that average, and so on. Conversely, around mid-July the standard deviation is only a little over +/- 5 degrees, indicating much more year-to-year consistency in the high temperature at that time of year.
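To make that standard-deviation arithmetic concrete, here's a short sketch using only Python's standard library. The list of high temperatures is invented for illustration (it is not the actual RDU record); the roughly 68 and 95 percent figures follow from assuming the highs are approximately normally distributed:

```python
from statistics import NormalDist, fmean, stdev

# Hypothetical high temperatures (deg F) for one winter date across
# 30 years -- illustrative numbers only, not actual RDU data.
highs = [51, 38, 62, 45, 55, 40, 70, 33, 49, 58,
         44, 66, 36, 52, 47, 60, 41, 54, 35, 63,
         48, 57, 39, 50, 68, 42, 56, 46, 61, 37]

mu = fmean(highs)      # the "raw" 30-year average for this date
sigma = stdev(highs)   # the standard deviation of the highs

# Under a normal-distribution assumption, the chance of a high falling
# within one (or two) standard deviations of the average:
dist = NormalDist(mu, sigma)
p_within_1sd = dist.cdf(mu + sigma) - dist.cdf(mu - sigma)
p_within_2sd = dist.cdf(mu + 2 * sigma) - dist.cdf(mu - 2 * sigma)

print(round(p_within_1sd, 3))  # -> 0.683
print(round(p_within_2sd, 3))  # -> 0.954
```

Note that the 68/95 percentages don't depend on the particular numbers in the list; they are a property of the normal distribution itself, which is why the same rule of thumb applies whether the standard deviation is 12 degrees in winter or 5 degrees in July.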
Does all this mean that to get the "normal" temperatures for a particular date, we can simply average the observations from the appropriate 30-year period and get the numbers you see on our Almanac screen online and on air? Not quite, as the National Climatic Data Center (NCDC) applies a couple of additional steps in order to provide normal values that change smoothly from day to day and season to season in a logical manner. Even with a 30-year dataset, there will be some "noise" in the averaged data that we'd rather not see. For example, in late spring we would expect temperatures to climb gradually from day to day and week to week, but the "raw" averages may show two days climbing, then a cooler average, then warmer, then a lot warmer, and so on, though in most cases the raw numbers do follow the expected trend rather closely. To smooth out this effect, NCDC actually averages all the temperatures for a given month over thirty years, then fits a smooth curve (called a "cubic spline") through those twelve values such that the curve reaches its lowest value at the appropriate point in winter and its highest at the appropriate point in summer. The normal values for each day of the year are then taken from that curve and rounded to the nearest whole degree, so they may vary a bit from the raw averages for each day. To see how this works out, just compare the raw average link from the previous paragraph to this link showing the "normal" daily values for RDU.
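The monthly-averages-to-daily-curve step can be sketched with a periodic cubic spline. The twelve monthly values below are invented for illustration (loosely shaped like a Raleigh-area annual cycle, not actual NCDC output), and this uses SciPy's general-purpose CubicSpline rather than NCDC's exact fitting procedure:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical 30-year monthly mean highs (deg F), Jan..Dec --
# illustrative values only, not actual NCDC data.
monthly_means = [51, 55, 63, 72, 79, 86, 89, 88, 82, 73, 63, 54]

# Anchor each monthly value at the midpoint (day of year) of its month,
# then repeat January one year later so the curve closes on itself.
midpoints = [15.5, 45.0, 74.5, 105.0, 135.5, 166.0,
             196.5, 227.5, 258.0, 288.5, 319.0, 349.5]
x = midpoints + [midpoints[0] + 365]
y = monthly_means + [monthly_means[0]]

# A periodic cubic spline through the 12 monthly points varies smoothly
# through the year, including across the December/January seam.
spline = CubicSpline(x, y, bc_type="periodic")

# "Daily normals": evaluate the curve at every day of a 365-day year
# (wrapping early-January days around the seam) and round to whole degrees.
days = np.arange(1, 366, dtype=float)
eval_days = np.where(days < x[0], days + 365, days)
daily_normals = np.rint(spline(eval_days)).astype(int)

print(daily_normals[196 - 1])  # mid-July value, near the annual peak
```

The point of the exercise is visible in the output: the daily values rise and fall smoothly through the year, peaking in midsummer and bottoming out in midwinter, with no day-to-day jitter of the kind the raw 30-year averages can show.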
Finally, for those of you interested in more details on the procedures used by NCDC, including how precipitation normals are generated, see http://www.ncdc.noaa.gov/oa/climate/normals/usnormalsprods.html#CLIM84.