Florida Plans To Become First State To Eliminate All Childhood Vaccine Mandates

Florida plans to become the first state to eliminate vaccine mandates, a longtime cornerstone of public health policy that has helped keep schoolchildren and adults safe from infectious diseases.
