Tag: War on Drugs

“The War on Drugs” is an American term usually applied to the United States government’s campaign of drug prohibition, military aid, and military intervention, with the stated aim of reducing the illegal drug trade.

10 Top Drug War Stories of 2013

By Editors' Choice

2013 will go down in history as the beginning of the end of our disastrous war on drugs. Fifty-eight percent of Americans support marijuana legalization, and world leaders like former U.N. Secretary-General Kofi Annan are calling for an end to the drug war.
