overlooking a major effect of the Civil War
The end of slavery is the thing everyone remembers, and it mattered that it happened, even if it took a war.
The thing everyone seems to overlook is that the Civil War ended the idea of states' rights being superior to federal authority. The northern states probably didn't foresee that either, but the Civil War established that the federal government is stronger and its authority supersedes that of the individual states. That trend has continued.
As for the idea that there wasn't much difference between the sides, so the country simply reunited: the losing states were essentially occupied and run by puppet governments for a while and "pacified" back into the Union. Don't get me wrong, I'm not advocating "the South shall rise again" or wishing the Confederacy had won. The US probably gained strength from the Civil War in the long run, but it wasn't a case of everyone getting over it, coming together after the surrender, and living happily ever after.
it was just a matter of which had enough money, men, and arms to sway it in their direction
Pragmatic and true. Even though on the question of slavery the moral side happened to win, that statement acknowledges that being right or wrong doesn't determine who wins.