Once again.

America didn't have a Civil War. A civil war is a conflict between two factions fighting for control of the same country.

That didn't happen.

The war was the Federal Government imposing itself on a people who no longer wanted to be ruled by it.

It's called tyranny.