Prison, Innovation and Counterintuition

Imagine sitting across from a convicted felon.  He'd violently beaten his girlfriend, breaking her arm and two ribs and blinding her in her right eye.  You've seen pictures of the victim.  The person in front of you was a monster.

But that was 17 years ago.  Since then he's been in prison, where he has an exemplary record of good behavior. He was only 19 when he committed his crime, and although he came from an abusive home himself, he had no prior history of violence.  The night of the battery he'd learned, during a drunken argument, that his girlfriend had been cheating on him and had knowingly given him HIV.  Now he's ashamed and remorseful.  Six years ago he founded a group to counsel fellow inmates on anger management.  He's also earned an online associate's degree in social work.

As a member of his Parole Board, what are you going to do?  Keep him in jail another 8 years?  Set him free?  Take a second to think about it.  What information would you need before deciding his fate?

Given this scenario, a few people think he should never get out.  A few others think he should be released immediately.  Yet most people want to know how remorseful he is and whether he’s “safe” to release back into society.  They want to know if he’s “really” sorry.  Is his remorse genuine?  When he says he’ll never hurt anyone again, they want to look into his eyes and know he’s telling the truth.  This is the very essence of a parole hearing.

In 1928, a sociologist named Ernest Burgess took a different approach.  Instead of doing interviews, he looked at the parole outcomes for 3,000 parolees.  He then came up with a list of 21 objective factors (e.g., chronological age, number of previous offenses) and simply counted the factors present in each case.  He next figured out which scores tended to predict good or bad parole outcomes.

When Burgess's short checklist was compared with the clinical judgment of prison psychiatrists (using their combined expertise, education and intuitions), his method was a lot more accurate at predicting parole failure.[i]  Since then, statistical and algorithmic methods have continued to be refined and are now used in varying degrees by most parole boards in the US with increasing success.
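The mechanics of a Burgess-style checklist are almost embarrassingly simple: give one point per favorable factor, add them up, and compare the total to a cutoff.  Here is a minimal sketch of that idea; the factor names, the cutoff, and the example case below are illustrative assumptions, not Burgess's actual 21 factors or scoring.

```python
# A minimal sketch of a Burgess-style unit-weighted checklist.
# The factor names and the cutoff are hypothetical, chosen only to
# illustrate the "count the factors" method described above.

FAVORABLE_FACTORS = [
    "no_prior_offenses",
    "stable_work_history",
    "first_incarceration",
    "exemplary_conduct_record",
    "completed_education_program",
]

def burgess_score(case: dict) -> int:
    """Count how many favorable factors are present (one point each)."""
    return sum(1 for factor in FAVORABLE_FACTORS if case.get(factor, False))

def predict_parole_success(case: dict, cutoff: int = 3) -> bool:
    """Predict success when the unit-weighted score meets the cutoff."""
    return burgess_score(case) >= cutoff

# An illustrative case, not drawn from any real dataset.
case = {
    "no_prior_offenses": True,
    "stable_work_history": False,
    "first_incarceration": True,
    "exemplary_conduct_record": True,
    "completed_education_program": True,
}

print(burgess_score(case))           # 4
print(predict_parole_success(case))  # True
```

The striking finding is that even this crude unit-weighting, with no interviews and no clinical nuance, outpredicted expert judgment.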

Now imagine sitting across from your colleague at work.  She's a Stanford MBA and has worked at your company for 20 years.  She's excelled in her career and is well liked and respected.  She wants your help raising a $20 million budget to start a new corporate group.  They plan to commercialize an innovative new technology that you find exciting.  Finance says the numbers look good, but of course they're all based on future forecasts (and the future is always uncertain).  There seems to be a need in the market, but it's always hard to know in advance how much (and how fast) customers will buy.

As a member of the Funding Committee, what are you going to do?  Vote to fund the opportunity?  Say no?  Take a second to think about it.  What information would you need before deciding the innovation's fate?

Given this scenario, a few people always say "no."  A few others always say "yes."  Yet most people want to know how good the idea really is and whether it's the right team to make it happen.  They want to know if it's "really" a good innovation.  Is the leader credible?  Are the market need and financial forecast genuine?  When she says it's a great opportunity, they want to look into the leader's eyes and know she's telling the truth.  This is the very essence of a funding review.

In 2006, some researchers took a different approach.  Instead of relying on traditional due diligence alone, they began looking at outcomes for (ultimately) thousands of new businesses.  They came up with a list of objective factors (e.g., price, performance) and simply counted the factors present in each case.  They next figured out which combinations of factors tended to predict good or bad investment outcomes.

When their algorithms were compared with the investment judgment of senior managers and executives (using their combined expertise, education and intuitions), the statistical methods were a lot more accurate at predicting business survival and failure. Since then, these statistical and algorithmic methods have continued to be refined and are now used by leading businesses and investment firms with increasing success.

In both parole and innovation, empirical rigor and quantitative probabilities have demonstrated consistent, significant, objective improvements in our ability to predict critical outcomes.  In the words of Grove and Meehl, pioneers in the study of statistical prediction, whose work demonstrated its consistently higher accuracy over intuitive clinical judgment:

“…a practitioner who claims not to need any statistical or experimental studies but relies solely on clinical experience as adequate justification, by that very claim is shown to be a nonscientifically minded person whose professional judgments are not to be trusted.  Further, when large amounts of… money are expended on personnel who employ unvalidated procedures… even a united front presented by the profession involved should be given no weight in the absence of adequate scientific research to show that they can do what they claim to do.

Regardless of whether one views the issue as theoretically interesting, it cannot be dismissed as pragmatically unimportant.  Every single day many thousands of predictions are made by parole boards, deans’ admission committees, psychiatric teams, and juries hearing civil and criminal cases.  Students’ and soldiers’ career aspirations, job applicants’ hopes, freedom of convicted felons or risk to future victims, millions of taxpayer dollars expended by court services, hundreds of millions involved in individual and class action lawsuits… and so forth – these stakes are high indeed.  To use the less efficient of two prediction procedures in dealing with such matters is not only unscientific and irrational, it is unethical.”[ii]

There are many prisons.  Some are literal.  Others come from an inability to accept even the most beneficial forms of change.  However the good news is, improvement happens.  We learn.  We unlock our world’s mysteries one generation after another, one piece of data at a time, even when it comes to prison, innovation and counterintuition.

[i] Grove & Meehl, Comparative Efficiency of Informal (Subjective, Impressionistic) and Formal (Mechanical, Algorithmic) Prediction Procedures: The Clinical-Statistical Controversy, Psychology, Public Policy, and Law, 2, 293-323 (1996); see also Burgess, Factors determining success or failure on parole, In A. A. Bruce (Ed.), The workings of the indeterminate sentence law and the parole system in Illinois (pp. 205-249). Springfield, IL: Illinois Committee on Indeterminate-Sentence Law and Parole (1928).

[ii] Grove & Meehl, Comparative Efficiency of Informal (Subjective, Impressionistic) and Formal (Mechanical, Algorithmic) Prediction Procedures: The Clinical-Statistical Controversy, Psychology, Public Policy, and Law, 2, 293-323 (1996).

This Post Has 4 Comments

  1. Alastair

    An interesting article – reinforces the need to have defined key evaluation criteria as part of an innovation process.

  2. Toby

Thanks Thomas. For a forthcoming piece on early stage investing, one of the firms I will be examining is Google Ventures, which heavily publicizes its use of statistical tools to make decisions. It would be great to know which other corporate venturing units people in the group think use the most convincing scientific techniques to invest in small companies.

  3. Kermit

Coming from a perspective that is probably way out of scope, this contains an excellent nugget of information for anyone trying to prove a case. In my situation, this would probably be one of convincing my customers to change their current infrastructure approach and employ something new. If I can show a case study that's qualitatively AND quantitatively similar, and that case study reports significant improvements in desired areas, I'll have a much better chance of selling the customer on the idea. The trick, then, is to find a qualitatively and quantitatively similar case study to draw from. Not so easy in the HPC realm, but maybe not impossible.