Why did things have to get this obvious before people realized the truth?
It feels like topics I used to only see on r/conspiracy—like Epstein and the deep state—are now all over mainstream subreddits.
The US is doing what it has always done, only now the pretexts are weaker than ever. Did things really have to get this obvious before people finally realized that Western governments only care about what's best for the oligarchy?