009: The Ethics of Automation: Who Is Responsible When Robots Make Mistakes? (KS5)

Explore the ethical and legal implications of autonomous systems, focusing on liability, the Trolley Problem, and algorithmic accountability.

The Algorithmic Judge


The Scenario
You are a Lead Ethics Consultant for a major autonomous vehicle manufacturer. A prototype vehicle was involved in a complex accident where it had to choose between two unavoidable collision paths. Legal action is pending. The Board of Directors needs a detailed briefing note explaining the ethical logic governing the car's decision-making system and where the legal liability likely sits based on current frameworks.

The Persona: The Responsible Innovator
You explore the intersection of Ethics, Law, and Society. You move beyond asking "Can we build this?" to "Should we build this, and who is responsible?" You apply rigorous ethical frameworks to technical problems.

1
Researching the Legal Framework

To advise the board, you must first understand the current classification of automation.
Research the SAE Levels of Driving Automation (Levels 0-5).
Define the specific point at which legal liability shifts from the human driver to the automated system; a simplified sketch of this mapping follows this list.
Investigate the concept of Algorithmic Accountability and the challenges in proving "negligence" in software development.
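To make the classification concrete, the sketch below sets out the six SAE levels and a simplified, illustrative mapping of who performs the dynamic driving task while the system is engaged. The mapping is an assumption for this exercise, not a statement of law; real liability depends on jurisdiction and the facts of the case.

# A simplified sketch of the SAE J3016 levels and who performs the
# dynamic driving task while the system is engaged. The responsibility
# column is an illustrative assumption for this exercise, not legal advice.

SAE_LEVELS = {
    0: ("No Automation",          "human driver"),
    1: ("Driver Assistance",      "human driver"),
    2: ("Partial Automation",     "human driver"),   # driver must supervise at all times
    3: ("Conditional Automation", "system"),         # driver must take over on request
    4: ("High Automation",        "system"),         # no takeover expected within its domain
    5: ("Full Automation",        "system"),
}

def driving_task_responsibility(level: int) -> str:
    """Return who performs the dynamic driving task while the system is engaged."""
    name, responsible = SAE_LEVELS[level]
    return f"Level {level} ({name}): {responsible} performs the driving task"

if __name__ == "__main__":
    for level in range(6):
        print(driving_task_responsibility(level))

Note how the shift happens between Levels 2 and 3: that boundary is where most arguments about transferring liability from driver to manufacturer begin.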

2
The Trolley Problem in Engineering

The accident involved a "forced choice" scenario. You need to explain the ethical theory behind the car's programming.
Compare Utilitarian ethics (minimising total harm) against Deontological ethics (following strict rules/duty); a toy comparison in code follows this step.
Find a real-world case study or a theoretical framework (such as the German Ethics Commission on Automated and Connected Driving) that proposes rules for these scenarios.
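To see how the two frameworks can diverge on the same inputs, here is a toy Python sketch. The paths, harm scores, and the rule chosen for the deontological agent are invented for illustration; a real system's decision logic would be far more complex.

from dataclasses import dataclass

# Toy model of a "forced choice" between two collision paths.
# Harm scores and the "never break traffic law" duty are invented for illustration.

@dataclass
class Path:
    name: str
    expected_harm: float       # e.g. predicted injury severity x probability
    breaks_traffic_law: bool   # e.g. requires crossing a solid line

def utilitarian_choice(paths):
    """Pick the path that minimises total expected harm, whatever it takes."""
    return min(paths, key=lambda p: p.expected_harm)

def deontological_choice(paths):
    """Honour the duty first (never break traffic law), then minimise harm
    among rule-compliant options; fall back only if no option complies."""
    lawful = [p for p in paths if not p.breaks_traffic_law]
    return min(lawful or paths, key=lambda p: p.expected_harm)

options = [
    Path("swerve across solid line", expected_harm=0.2, breaks_traffic_law=True),
    Path("brake in lane",            expected_harm=0.5, breaks_traffic_law=False),
]

print("Utilitarian picks:  ", utilitarian_choice(options).name)   # lower harm, breaks rule
print("Deontological picks:", deontological_choice(options).name) # rule-compliant, more harm

The two agents choose differently on identical inputs, which is precisely why the manufacturer's choice of framework is an ethical decision, not a neutral engineering one.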

3
Drafting the Liability Briefing

Open a new document titled Liability Briefing Note.

Section 1: Classification. Explain the SAE level of the vehicle (assume Level 4) and what this implies for liability.
Section 2: The Ethical Logic. Describe the decision matrix the car likely used. Was it Utilitarian? Why is this controversial?
Section 3: The Verdict. Based on your research, advise who is likely liable: the Manufacturer (hardware failure), the Developer (logic error/bias), or the Passenger (failure to intervene). Justify your answer with reference to the "Black Box" problem in AI; a toy illustration of that problem follows this list.
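The "Black Box" problem is easiest to see in code. In the toy network below, every weight is invented for illustration, standing in for values produced by training rather than written by an engineer. The decision is reproducible, but no single line or weight encodes a rule that could be pointed to as negligent.

import math

# Toy illustration of the "Black Box" problem: the decision emerges from
# numerical weights (invented here, as if learned from crash data), not from
# any human-readable rule. Asking "which line of code was negligent?"
# has no clear answer.

W_HIDDEN = [[0.83, -1.42, 0.07], [-0.51, 0.96, 1.10]]  # hypothetical learned weights
W_OUT = [1.31, -0.77]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decide(obstacle_distance, speed, lateral_clearance):
    """Return 'swerve' or 'brake' from a tiny fixed neural network."""
    inputs = [obstacle_distance, speed, lateral_clearance]
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in W_HIDDEN]
    score = sum(w * h for w, h in zip(W_OUT, hidden))
    return "swerve" if score > 0.5 else "brake"

# The output is deterministic, yet inspecting any single weight tells you
# nothing about why the car chose one path over the other.
print(decide(obstacle_distance=0.4, speed=0.9, lateral_clearance=0.2))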

Outcome
A professional briefing note using correct legal and technical terminology.
Accurate definition of SAE levels and their impact on liability.
A clear analysis of Utilitarian vs Deontological approaches to the Trolley Problem.
A justified conclusion on liability referencing the complexity of AI decision-making.