Keeping our Clients Safe

Reynold Greenlaw
Asimov’s laws - 3 laws of robotics

At the recent ContrOCC workshop on ad-hoc reporting, John Boyle was reminded of a paper he wrote in 1984 suggesting that the ‘on’ button of a simple calculator (one that could perform addition, subtraction, multiplication and division) should be replaced by a ‘test’ button. On pressing the test button, the screen would display a simple but appropriate test: for example, 7 * 8. Only if the user entered the correct answer would the calculator switch on and allow them to proceed. The same thought occurred during the workshop when an expert from Hertfordshire described the dangers of users building, and believing, incorrect budget reports in software applications. It seemed to John that, before being allowed to run their report in ContrOCC, users should have to answer a suitable question such as “What is the 2010/11 budget for cost centre ‘Older People – Day Care’?” Only if they get the answer right to within, say, 10%, thereby demonstrating an understanding of their data, should they be allowed to use the report.
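As a minimal sketch, such a gate could look like the Python below. The question text, the 10% tolerance and the budget figure of 1,250,000 are all invented for illustration; none of this is taken from ContrOCC itself:

    def within_tolerance(answer, expected, tolerance=0.10):
        """Return True if the answer is within the given fraction of the expected value."""
        return abs(answer - expected) <= tolerance * abs(expected)

    def challenge_user(question, expected, tolerance=0.10):
        """Ask a sanity-check question; only return True if the reply is close enough."""
        reply = input(question + " ")
        try:
            answer = float(reply)
        except ValueError:
            return False  # a non-numeric reply fails the test outright
        return within_tolerance(answer, expected, tolerance)

    # Hypothetical example: gate a budget report behind a comprehension check.
    expected_budget = 1_250_000  # invented figure, purely illustrative
    if challenge_user("What is the 2010/11 budget for cost centre "
                      "'Older People - Day Care'?", expected_budget):
        print("Running report...")  # user has demonstrated understanding of the data
    else:
        print("Answer too far out - please check the budget figures first.")

The point of the sketch is the shape of the check, not the numbers: the user must show a rough grasp of the data before the tool will act on their behalf.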

This concern, that the systems we create should protect us from ourselves, was most famously addressed by Asimov in his 3 laws of robotics. By adapting Asimov’s laws, we can surely justify implementing such tests in the OCC products.

Asimov’s Laws

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

An interesting case of applications failing to embed the 3rd law was highlighted by the major USA power cuts in 2003. The cause of the problem turned out to be engineers overriding instructions from their power management system. Instead of isolating an “area of failure” as their IT system recommended, the engineers thought they could fix the problem before it cascaded. Three times the engineers overrode the IT before the problem got out of control. Given that OCC’s National Grid software helps engineers manage their grid maintenance, perhaps these laws should appear in more applications!
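As a thought experiment, a third-law-style guard might look something like the sketch below: a recommendation that resists being silently overridden, escalating after a limited number of attempts. The class, the threshold of two overrides and the area name are entirely hypothetical and bear no relation to OCC’s actual National Grid software:

    class IsolationAdvisor:
        """Illustrative only: an advisor that protects its own recommendation."""

        def __init__(self, max_overrides=2):  # hypothetical threshold
            self.overrides = 0
            self.max_overrides = max_overrides

        def recommend(self, failed_area):
            return f"Isolate area {failed_area} to stop the fault cascading."

        def override(self, engineer, reason):
            """Record an override; refuse once the limit has been reached."""
            if self.overrides >= self.max_overrides:
                raise PermissionError(
                    "Override limit reached - supervisor sign-off required.")
            self.overrides += 1
            print(f"Override {self.overrides} by {engineer}: {reason}")

    advisor = IsolationAdvisor()
    print(advisor.recommend("E-17"))
    advisor.override("engineer-1", "believe fault is local")
    advisor.override("engineer-1", "still fixing")
    # A third attempt would now raise PermissionError instead of quietly giving way.

Had the 2003 system contained even this crude measure of self-protection, the third override would have required someone else to take responsibility before the problem cascaded.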