WASHINGTON — Most Americans say kids should be vaccinated to attend school. However, as Florida plans to become the first state to eliminate childhood vaccine mandates, U.S. adults are also less likely to think these immunizations are important than they…