Biden Officially Announces Vaccine Mandate
We broke the story that the DOJ had announced it would allow vaccine mandates, and on Thursday Biden announced the mandates are on the way!
Biden was asked if he would like to see companies and schools move in the direction of mandating vaccines, and he responded by saying he had his DOJ approve such a thing.
That means your job, your child's education, going to the grocery store, going out to eat or a night of fun, driving, and every other essential part of life could be yanked from you if you aren't willing to get the same vaccine Biden and Harris said they wouldn't take last fall.
The actions of the left time and time again prove this is nothing more than a power grab. If it were really about keeping the American people safe from a virus, they wouldn't have spent last fall telling people not to take the vaccine because they wanted to make President Trump look bad.
But as soon as they stole the office, all of a sudden the vaccine is safe and they're demanding everyone take it. These are the actions of people who care about nothing but enriching themselves at the cost of our country.
What do you think of the coming vaccine mandate? Let us know in the comments below.
Biden on states, private companies, and schools mandating COVID-19 vaccines: "I would like to see them continue to move in that direction... it's still a question whether the Federal government can mandate the whole country." pic.twitter.com/ZSFd1JCB6s — Benny (@bennyjohnson) July 29, 2021