I may not be a history major, but I know more than most. Take, for example, only the wars America has been involved in.
Does anyone see a trend here?
These wars were fought between secular governments, and negotiations came only after one side had been defeated (there's that "D" word again!). What makes the Left think there's the slightest chance that negotiations would work when dealing with absolute religious fanatics who haven't been defeated yet?
As usual, they haven't thought about it. It just sounds right, so it has to be right. Right?
Only if you're an emotional child with no ability to actually think it through.