Florida is now working to become the first U.S. state to eliminate vaccine mandates. Earlier today, the State Surgeon General called current requirements in schools and elsewhere "immoral" intrusions on people's rights. According to the Associated Press, Florida's move is a departure from research showing that vaccines are the most effective way to stop the spread of communicable diseases. LiveNOW's Austin Westfall discusses the history of vaccine mandates with Dr. Michael Burgess, a former congressman (R-TX) and physician.