I recently re-read Fire Logic, the first volume in the fabulous Elemental Logic series by Laurie J. Marks, and found myself focusing on a key aspect of the legal system. In Shaftal, disputes and criminal activities are resolved by a Truthken, an air witch who can determine whether someone is telling the truth.
Not only can Truthkens discover whether someone is lying, but they can also deal with the subtle problem of someone who is technically telling the truth but is being fundamentally dishonest. Take this passage:
“In any case, I would refuse to hear him as Zanja’s accuser, for he only loves the justice that serves his interests, and only sees the Law as a tool to achieve his desires.”
How wonderful it would be if we could resolve all disputes and solve all crimes with such insight. However, unlike the people of Shaftal, we do not have elemental magic at our disposal. For us, figuring out the real truth is a difficult and flawed process.
Human beings lie. They dissemble. They leave things out – sometimes intentionally, sometimes unintentionally. And – perhaps most importantly – they color the truth in their own favor, often unconsciously.
Disinterested witnesses to an action are hard to come by, and even they may have prejudices that color what they report.
It’s really hard to figure out any truth, much less the whole truth and nothing but. So, since we humans don’t have Truthkens available to evaluate what people say, we turn to technology.
So far technology hasn’t been all that successful.
Take “lie detectors” – polygraph machines that are supposed to figure out whether someone is lying. Results from those tests are almost never admissible in court because their accuracy is only so-so. I’ve had friends tell me they could pass a polygraph while lying, and while I probably couldn’t, I don’t doubt them.
There are drugs. There are all kinds of psychological theories. These days there are fMRI lie detectors. But so far none of these systems has anything like a perfect scorecard. That’s why they’re not allowed in court.
Science fiction comes up with effective lie detectors on a regular basis, though it seems to me that a lot of those stories are about dystopias. And in a lot of cases, I think the machines are good at finding technical truth, but not the whole truth and nothing but. I’d be hard-pressed to write a science fiction story in which technological lie detection was a good thing.
But assuming brain research allows us to build a perfect lie detector – one that would not only analyze the actual truth of a statement, but whether it was a complete truth – would we really want it?
I can think of circumstances where it would have saved people from untold misery. Recently in Texas, we had several people released from prison after many years because new evidence suggested – or proved – that they were not guilty. One case involved people who ran a day care center and were convicted of performing Satanic sex rituals – one of many fantastical cases that arose after people began to realize that sexual abuse of children was a real problem. Getting at the truth in these cases is difficult. I often wonder if actual cases of molestation – ones that might have had nothing to do with the day care workers – were overlooked in the rush to come up with a more exotic crime.
Another involved several lesbians who were convicted of abusing a young girl, who as an adult now says she was forced to testify against them by a homophobic relative and that nothing happened. Then there was Michael Morton, an innocent man who served 25 years in prison for the murder of his wife. Another man has now been convicted of that crime.
An effective lie detector might have protected all those people. It might also make it possible to lock up some dangerous people who are skilled at getting off. Something that could determine whether a shooter in Florida was really standing his ground because he feared for his life, or just responding in anger to disrespect, might prove valuable. Though even that probably couldn’t tell whether his fear for his life was reasonable.
I’m not sure accurate lie detection would lead to a more perfect world. There’s something to be said for a little fuzziness as to the truth. Especially the whole truth. I certainly don’t want everyone to know what I’m thinking even as I say something mild and noncommittal.
Besides, how can we come up with good stories if everyone knows the truth about everyone else?