The U.S. is home to some of the world's top hospitals and doctors. Yet Americans pay more for health care without reaping the benefits.