Florida plans to become the first state to eliminate vaccine mandates, a longtime cornerstone of public health policy for keeping schoolchildren and adults safe from infectious diseases.