Do you think Hollywood has an obligation to tell the truth? In other words, should filmmakers work harder to make sure the stories they tell are accurate to the history they're exploring?