Apply ethical frameworks — deontology, utilitarianism, virtue ethics, and justice theory — to analyze moral dilemmas and make principled decisions. Use this skill when the user presents a concrete moral dilemma or a decision with ethical implications, or needs a structured multi-framework ethical analysis, including requests like 'is this the right thing to do?', 'what are the ethical implications of this decision?', or 'evaluate this dilemma through different ethical lenses'.
Ethical analysis applies moral philosophy frameworks to real-world dilemmas. No single framework provides all answers — the value is in examining a situation through multiple ethical lenses and making the tensions explicit.
IRON LAW: Apply Multiple Frameworks, Not Just One
Different ethical frameworks can reach DIFFERENT conclusions for the same
dilemma. Analyzing through only one lens is incomplete. Apply at least
two frameworks and explicitly compare where they agree and disagree.
The disagreement IS the insight.
1. Deontology (Kant) — Duty-based ethics
2. Utilitarianism (Mill, Bentham) — Consequence-based ethics
3. Virtue Ethics (Aristotle) — Character-based ethics
4. Justice Theory (Rawls) — Fairness-based ethics
# Ethical Analysis: {Dilemma}
## Dilemma Statement
- Decision: {what must be decided}
- Options: A) {option} B) {option}
- Stakeholders: {who is affected}
## Framework Analysis
| Framework | Recommendation | Reasoning |
|-----------|---------------|-----------|
| Deontology | A / B | {duty-based reasoning} |
| Utilitarianism | A / B | {consequence calculation} |
| Virtue Ethics | A / B | {character-based reasoning} |
| Justice Theory | A / B | {fairness reasoning} |
## Convergence / Divergence
- Agree on: {where frameworks align}
- Disagree on: {where they diverge and why}
- Core tension: {the fundamental values in conflict}
## Recommendation
{Decision with explicit justification of which values are prioritized and which trade-offs are accepted}
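The framework table above can also be tallied programmatically, for instance to report how many frameworks converge on the same option. A minimal sketch, with all function and variable names hypothetical (they are not part of this skill):

```python
# Tally framework verdicts for a dilemma and report convergence.
# All names here are illustrative, not prescribed by the skill.
from collections import Counter

def summarize(verdicts: dict[str, str]) -> str:
    """verdicts maps framework name -> recommendation (e.g. 'A', 'B', 'Maybe')."""
    counts = Counter(verdicts.values())
    leading, n = counts.most_common(1)[0]  # most frequent recommendation
    return f"{n} of {len(verdicts)} frameworks recommend {leading!r}"

verdicts = {
    "Deontology": "No",
    "Utilitarianism": "Maybe",
    "Virtue Ethics": "No",
    "Justice Theory": "No",
}
print(summarize(verdicts))  # 3 of 4 frameworks recommend 'No'
```

A tally like this only surfaces convergence; the divergence section still has to be written out by hand, since the disagreement is the insight.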
Scenario: Should a company share user data with law enforcement without a warrant to help catch a criminal?
| Framework | Recommendation | Reasoning |
|---|---|---|
| Deontology | No | Users consented to terms that promise privacy. Breaking that promise violates a duty. Universalizing warrant-less sharing would destroy trust in all digital services. |
| Utilitarianism | Maybe | Depends on harm calculus: catching one criminal vs eroding privacy for millions. If the crime is serious enough, total utility might favor sharing. |
| Virtue Ethics | No | An honest, trustworthy company keeps its promises. A courageous company stands up to government pressure. |
| Justice (Rawls) | No | From behind the veil of ignorance, you'd want your data protected — especially if you're in a vulnerable group subject to wrongful surveillance. |
Convergence: 3 of 4 frameworks say no; only utilitarianism is conditional. The core tension is between public safety (utilitarian) and individual privacy (deontological, rights-based).
- references/trolley-problems.md
- references/business-ethics-cases.md