Engineering Ethics Tutorial


Engineering Ethics - Chernobyl’s Case Study

The Chernobyl disaster was a nuclear accident that occurred at the Chernobyl Nuclear Power Plant on April 26, 1986. A nuclear meltdown in one of the reactors caused a fire that sent a plume of radioactive fallout across much of Europe.

The Chernobyl nuclear power plant, built on the banks of the Pripyat River in Ukraine, had four reactors, each capable of producing 1,000 MW of electric power.

On the evening of April 25, 1986, a group of engineers planned an electrical engineering experiment on the Number 4 reactor. With limited knowledge of nuclear physics, they set out to test how long the turbines would keep spinning and supplying power to the main circulating pumps after a loss of the main electrical power supply.


What Led to the Disaster?

Let us now see what led to the disaster.

Reactor unit 4 was to be shut down for routine maintenance on 25 April 1986. It was decided to take advantage of this shutdown to determine whether, in the event of a loss of station power, the slowing turbine could provide enough electrical power to operate the main core cooling water circulating pumps until the diesel emergency power supply became operative. The aim of the test was to determine whether cooling of the core could continue in the event of a loss of power.

Due to the misconception that this experiment belonged to the non-nuclear part of the power plant, it was carried out without a proper exchange of information between the testing department and the safety department. Hence, the test began with inadequate safety precautions, and the operating personnel were not alerted to the nuclear safety implications of the electrical test and its potential danger.

The Experiment

According to the test plan, the Emergency Core Cooling System (ECCS) of the reactor, which provides water for cooling the reactor core, was deliberately shut down.

For the test to be conducted, the reactor had to be stabilized at about 700-1,000 MW prior to shutdown, but it fell to about 500 MW due to an operational phenomenon. Later, an operator working the night shift committed an error by inserting the reactor control rods too far. This caused the reactor to go into a near-shutdown state, dropping the power output to around 30 MW.

Since this low power was insufficient for the test and made the reactor unstable, it was decided to restore the power by withdrawing control rods, which stabilized the power at about 200 MW. This was actually a violation of safety rules, because of the positive void coefficient of the reactor. A positive void coefficient means that reactivity increases as the cooling water in the core boils into steam (voids). It was decided to carry out the test at this power level.

The reactor was highly unstable at this low power level, primarily owing to the control rod design and the positive void coefficient, which accelerated the nuclear chain reaction and the power output if the reactor lost cooling water.
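
To make that feedback loop concrete, here is a minimal, purely illustrative Python sketch. It is not a reactor model; the constants (void_per_power, reactivity_per_void) and the linear relationships are invented solely to show the direction of the effect: more power produces more steam voids, a positive void coefficient turns those voids into added reactivity, and the added reactivity raises the power further.

```python
# Toy illustration of a positive void feedback loop.
# NOT a physical reactor model: every constant below is invented for illustration.

def positive_void_feedback(steps=10, void_per_power=0.05, reactivity_per_void=0.4):
    """Return relative power over time when steam voids ADD reactivity."""
    power = 1.0                                   # relative power, arbitrary start
    history = [power]
    for _ in range(steps):
        voids = void_per_power * power            # more power -> more boiling (voids)
        reactivity = reactivity_per_void * voids  # positive coefficient: voids add reactivity
        power *= 1.0 + reactivity                 # added reactivity raises power again
        history.append(round(power, 3))
    return history

print(positive_void_feedback())  # the power keeps climbing with no external input
```

With a negative void coefficient (a negative second constant), the same loop would damp a power rise instead of amplifying it, which is why the sign of this coefficient matters so much for stability.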


At 1:23 a.m. on April 26, 1986, the engineers continued with their experiment and shut down the turbine to see whether its inertial spinning would power the reactor’s water pumps. In fact, it did not adequately power the water pumps, and without sufficient cooling water the power level in the reactor surged.

The water pumps began pumping at a slower rate, and this, together with the entry of slightly warmer feedwater into the core, may have caused boiling (void formation) at the bottom of the core. This, along with xenon burn-off, might have increased the power level in the core. The power level rose to 530 MW and continued to rise. Fuel elements ruptured, leading to further steam generation, which, through the positive void coefficient, drove the power output higher still.

The high power output alarmed the engineers, who tried to insert all 200 control rods, the conventional procedure for bringing the core reaction under control. But the rods became blocked about halfway down because of their graphite-tip design. Before the control rods, with their five-meter absorbent sections, could penetrate the core, 200 graphite tips entered the core simultaneously, which increased the reaction instead of damping it. The resulting surge caused an explosion that blew off the 1,000-ton steel and concrete lid of the reactor, jamming the control rods, which were halfway down the reactor. As the channel pipes began to rupture, massive steam generation occurred as a result of the depressurization of the reactor cooling circuit.
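
The effect of the graphite tips can be sketched in the same illustrative spirit. The rod worths and tip length below are invented numbers, not RBMK design data; the sketch only shows how a graphite-tipped rod, inserted from the fully withdrawn position, first adds reactivity (the tip displaces water) before its absorbing section begins to remove it.

```python
# Toy illustration of a graphite-tipped control rod.
# The tip length and worths are invented; they are NOT RBMK design figures.

def rod_reactivity(depth, tip_length=0.1, tip_worth=+0.5, absorber_worth=-2.0):
    """Reactivity contribution of one rod at insertion depth between 0.0 and 1.0.

    The graphite tip enters first and displaces water, adding reactivity;
    only the deeper, absorbing section removes it.
    """
    tip_part = min(depth, tip_length) * tip_worth
    absorber_part = max(depth - tip_length, 0.0) * absorber_worth
    return tip_part + absorber_part

for depth in (0.0, 0.05, 0.10, 0.25, 0.50, 1.00):
    print(f"insertion depth {depth:.2f} -> reactivity {rod_reactivity(depth):+.3f}")
```

The sketch shows the sign of the contribution flipping with depth: shallow insertion is briefly positive, so when many such rods start moving together from the top of an already unstable core, the first effect of the scram can be a reactivity increase rather than a decrease.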

As a result, two explosions were reported. The first was the initial steam explosion. After two to three seconds, a second explosion took place, possibly from the build-up of hydrogen due to zirconium-steam reactions.

Fuel, moderator, and structural materials were ejected, starting a number of fires, and the destroyed core was exposed to the atmosphere. In the explosion and ensuing fire, more than 50 tons of radioactive material were released into the atmosphere, where they were carried by air currents. This was about 400 times the amount of radioactive material released by the Hiroshima bombing.

Fatal Effects of the Disaster

The Chernobyl Nuclear Power Plant disaster in Ukraine is the only accident in the history of commercial nuclear power to cause fatalities from radiation.

There were many fatal effects due to the radiation released. A few of them are listed below:

  • Two plant workers died on the night of the accident: one was killed in the explosion itself, while the other died of his injuries at the hospital within a few hours of admission.
  • 28 emergency workers and staff died within four months of the accident due to thermal burns and the effects of radiation on their bodies.
  • The accident has been linked to about 7,000 cases of thyroid cancer.
  • Acute radiation syndrome (ARS) was diagnosed in 237 people who were on-site and involved in the clean-up.
  • The land, air, and groundwater were all contaminated to a great extent.
  • Direct and indirect exposure to radiation led to many severe health problems, such as Down syndrome, chromosomal aberrations, mutations, leukemia, thyroid cancer, and congenital malformations.
  • A number of plant and animal populations were destroyed as an after-effect.


Using the Chernobyl Incident to Teach Engineering Ethics

  • Original Paper
  • Published: 15 December 2011
  • Volume 19, pages 625–640 (2013)


  • William R. Wilson


This paper discusses using the Chernobyl Incident as a case study in engineering ethics instruction. Groups of students are asked to take on the role of a faction involved in the Chernobyl disaster and to defend their decisions in a mock debate. The results of student surveys and the Engineering and Science Issues Test indicate that the approach is very popular with students and has a positive impact on moral reasoning. The approach incorporates technical, communication and teamwork skills and has many of the features suggested by recent literature.



Abbreviations

DIT2: Defining Issues Test, version 2

ESIT: Engineering and Science Issues Test

ABET: Accrediting Board for Engineering and Technology

RBMK: Reaktor Bolshoy Moshchnosti Kanalniy

NSPE: National Society of Professional Engineers


Acknowledgments

The author would like to thank Jason Borenstein and Matt Drake—co-authors of the ESIT—for their assistance in its use.

Author information

Authors and Affiliations

Muskingum University, New Concord, OH, USA

William R. Wilson


Wilson, W.R. Using the Chernobyl Incident to Teach Engineering Ethics. Sci Eng Ethics 19, 625–640 (2013). https://doi.org/10.1007/s11948-011-9337-4


Received: 16 July 2011

Accepted: 24 November 2011

Published: 15 December 2011

Issue Date: June 2013

DOI: https://doi.org/10.1007/s11948-011-9337-4


  • Engineering ethics
  • Ethics education
  • Constructive controversy

MIT OpenCourseWare: Engineering Ethics, 10: Case Studies: Chernobyl, Three Mile Island (cont.)

  • Instructor: Dr. Taft Broome
  • Department: Engineering Systems Division
  • Topics: Business Ethics, Engineering

Lessons of Chernobyl


Published June 7, 2019

National Review Online

By Mona Charen

On the morning of April 28, 1986, an employee of the Forsmark nuclear power plant, an hour north of Stockholm, was returning from a restroom break when his shoes set off the radiation alarm. Soon klaxons were sounding everywhere. Technicians scoured the plant. No leak. After performing chemical and other analysis, they determined that the radiation wasn’t coming from Forsmark. It wasn’t even coming from Sweden. It was fallout from Chernobyl, 700 miles away.

The excellent HBO series Chernobyl offers an overdue glimpse into the leviathan of lies that was the Soviet system. Some have nitpicked that a composite character was created or that the trial scene at the end wasn’t historically accurate, but these exercises in poetic license don’t detract from the overall impact, and are, in any case, openly acknowledged, not concealed.

Lies were the Soviet regime’s native tongue. From the annual “record” grain harvests to the Katyn Forest massacre; from the Gulag to Lysenkoism to the shootdown of KAL 007 to the Holodomor; the system was a black hole from which truth could not escape.

When Sweden demanded an explanation of the radiation, the Soviets denied that anything was amiss. Days later, when other European governments were detecting fallout and when U.S. satellites had photographed Chernobyl’s smoldering roof, the official news agency grudgingly acknowledged that an accident had killed “two people” but that “the situation had now been stabilized and [is] under control.” A later (post-Soviet) Ukrainian parliamentary report described such reassurances as “disinformation” of “almost Mephistophelean proportions.” The reactor was open and spewing radioactivity. Scores were already dead, more were dying an agonizing death from acute radiation poisoning, and thousands were inhaling and eating and drinking the radioactive isotopes that would cause miscarriages, stillbirths, and thyroid and other cancers. As The Economist summarized, “Chernobyl led to thousands of deaths, including the Soviet Union.”

While Communist Party officials were bundling their families out of Ukraine, the people were kept in the dark. Doses of potassium iodide, if administered within ten days, can protect against thyroid cancer. But there was no program to distribute the medicine. The Ukrainian surgeon general asked that people be warned to at least stay inside, wash their vegetables, avoid drinking milk, and take other precautions, but the Central Committee demanded that May Day parades proceed as usual, along with outdoor weddings and bike races. On May 1 in Kiev, as invisible fallout rained down, children in shirtsleeves marched past reviewing stands that usually held Soviet officials. They stood empty.

The Soviets kept their eye on the ball — deceit. Phone lines were cut to Chernobyl to prevent unauthorized truth from escaping, while the official machinery of propaganda revved up. As Robert McConnell reported in National Review, a Soviet television “news” report showed a photo of the damaged reactor and explained, “As you can see for yourself, there is no enormous destruction that some Western agencies are writing about, or no great fires, as there are no thousands of dead . . .” The air and water in Kiev were safe, the Soviet government said. For good measure, the TASS news agency added that the U.S. had experienced “2,300 nuclear accidents and breakdowns in 1979 alone.”

While the world reeled from reports of a massive nuclear plume dropping poison whichever way the wind blew; while frantic but secret efforts were underway in Soviet Ukraine to stanch the flow; and while 220,000 people had to flee their homes (leaving livestock and pets behind as the HBO series dramatizes in sad detail), the official statements from the Kremlin were as noxious as the site itself. Denouncing a “Poisoned Cloud of Anti-Sovietism,” Moscow News pointed the finger at a “premeditated and well-orchestrated campaign” to “cover up criminal acts of militarism by the USA and NATO against peace and security.”

The catastrophe was a direct result of official lies. The Chernobyl reactor, like others of Soviet design, lacked several key safety features — but operators were kept in ignorance.

It is hard to know just how many people were killed outright or suffered later cancers and other pathologies because of the accident. The Soviet state directed that only the most severe cases of radiation sickness be noted in patients’ records. Estimates of deaths vary from several thousand to hundreds of thousands. As tragic as they were, those deaths were a tiny fraction of the millions starved, shot, and worked to death by the Soviet Union.

It was among the most soul-crushing of regimes in human history, and lies were at the core of its corruption. The Soviet Union is gone, but deceit lives on. The Chernobyl series is a timely reminder and a metaphor: Lying, when it becomes a way of life, is radioactive.

© 2019 Creators.com

Mona Charen is a senior fellow at the Ethics and Public Policy Center.





The Ethics of Scientific Advice: Lessons from “Chernobyl”


The recently released HBO miniseries Chernobyl highlights several important moral issues that are worth discussing. For example, what should we think about nuclear power in the age of climate change? What can disasters tell us about government accountability and the dangers of keeping unwelcome news from the public? This article will focus on the ethical issues concerning scientists’ potential to influence government policy. How should scientists advise governments, and who holds them accountable for their advice?

In the second episode, the Soviet Union begins dumping thousands of tons of sand and boron onto the burning nuclear plant at the suggestion of physicist Valery Legasov. After consulting fellow scientist Ulana Khomyuk (a fictional character who represents the many other scientists involved), Legasov tells Soviet leader Gorbachev that in order to prevent a potential disaster, drainage pools will need to be emptied from within the plant in an almost certain suicide mission. “We’re asking for your permission to kill three men,” Legasov reports to the Soviet government. It’s hard to imagine a more direct example of a scientist advising a decision with moral implications.

Policy makers often lack the expertise to make informed decisions, and this provides an opportunity for scientists to influence policy. But should scientists take ethical or policy considerations into account when offering advice?

On one side of this debate are those who argue that scientists’ primary responsibility is to ensure the integrity of science. This means that scientists should maintain objectivity and should not allow their personal moral or religious convictions to influence their conclusions. It also means that the public should see science as an objective and non-political affair. In essence, science must be value-free.

This value-free side of the debate is reflected in the mini-series’ first episode. It ends with physicist Legasov getting a phone call from Soviet minister Boris Shcherbina telling him that he will be on the commission investigating the accident. When Legasov begins to suggest an evacuation, Shcherbina tells him, “You’re on this committee to answer direct questions about the function of an RBMK reactor…nothing else. Certainly not policy.”

Those who argue for value-free science often argue that scientists have no business trying to influence policy. In democratic nations this is seen as particularly important since policy makers are accountable to voters while scientists are not. If scientists are using ethical judgments to suggest courses of action, then what mechanism will ensure that those value judgments reflect the public’s values?

In order to maintain the value-free status of science, philosophers such as Ronald N. Giere argue that there is an important distinction between judging the truth of scientific hypotheses and judging the practical uses of science. A scientist can evaluate the evidence for a theory or hypothesis, but they shouldn’t evaluate whether one should rely on that theory or hypothesis to make a policy decision. For example, a scientist might tell the government how much radiation is being released and how far it will spread, but they should not advise something like an evacuation. Once the government is informed of relevant details, the decision of how to respond should be left entirely to elected officials.

Opponents of this view, however, argue that scientists do have a moral responsibility when offering advice to policy makers and believe that scientists shouldering this responsibility is desirable. Philosopher Heather Douglas argues that given that scientists can be wrong, and given that acting on incorrect information can lead to morally important consequences, scientists do have a moral duty concerning the advice they offer to policy makers. Scientists are the only ones who can fully appreciate the potential implications of their work. 

In the mini-series we see several examples where only the scientists fully appreciate the risks and dangers from radiation, and are the strongest advocates of evacuation. In reality, Legasov and a number of other scientists offered advice on how to proceed with cleaning up the disaster. According to Adam Higginbotham’s Midnight in Chernobyl: The Untold Story of the World’s Greatest Nuclear Disaster , the politicians were ignorant of nuclear physics, and the scientists and technicians were too paralyzed by indecision to commit to a solution.

In the real-life disaster, the scientists involved were frequently unsure about what was actually happening. They had to estimate how fast various parts of the core might burn and whether different radioactive elements would be released into the air. Reactor specialist Konstantin Fedulenko was worried that the boron drops were having limited effect and that each drop was hurling radioactive particles into the atmosphere. Legasov disagreed and told him that it was too late to change course. Fedulenko believed it was best to let the graphite fire burn itself out, but Legasov retorted, “People won’t understand if we do nothing…We have to be seen to be doing something.” This suggests that the scientists were not simply offering technical advice but were making judgments based on additional value and policy considerations. 

Again, according to Douglas, given the possibility for error and the potential moral consequences at play, scientists should consider these consequences to determine how much evidence is enough to say that a hypothesis is true or to advise a particular course of action. 

In the mini-series, the government relies on monitors showing a low level of radiation to initially conclude that the situation is not bad enough to warrant an evacuation. However, it is pointed out that the radiation monitors being used likely had a limited maximum range, and so the radiation could be much higher than the monitors would indicate. Given that they may be wrong about the actual amount of radiation and the threat to public health, a morally responsible scientist might conclude that evacuation should be suggested to policy makers.

While some claim that scientists shouldn’t include these considerations, others argue that they should. Certainly, the issue isn’t limited to nuclear disasters either. Cases ranging from climate change to food safety, chemical and drug trials, economic policies, and even the development of weapons, all present a wide array of potential moral consequences that might be considered when offering scientific advice. 

It’s difficult to say a scientist shouldn’t make morally relevant consequences plain to policy makers. It often appears beneficial, and it sometimes seems unavoidable. But this liberty requires scientists to practice judgment in determining what a morally relevant consequence is and is not. Further, if scientists rely on value judgments when advising government policy, how are scientists to be held accountable by the public? Given these benefits and concerns, whether we want scientists to make such judgments and to what extent their advice should reflect those judgments presents an important ethical dilemma for the public at large. Resolving this dilemma will at least require that we be more aware of how experts provide policy advice.



Chernobyl Disaster and Engineering Ethics

This paper examines the ethical issues of the Chernobyl disaster. Specifically, it analyses the engineering ethics involved, what went wrong, and how similar disasters can be avoided in the future.

Contents

  • Chernobyl Disaster
  • Chernobyl Ethical Issues Analysis
  • Annotated Bibliography

This Chernobyl ethics case study reflects on the events that occurred at the Chernobyl Nuclear Power Plant. The disaster at the Chernobyl Nuclear Power Plant was the largest nuclear power accident in the history of the world. Nuclear fuel fission products discharged from the destroyed reactor into the atmosphere were carried by air over considerable areas, contaminating territory within the borders of Ukraine, Russia, and Belarus. The contamination affected people, flora, and fauna, causing many incurable diseases and even deaths.

Thousands of people were relocated from the exclusion zone. The engineers who made mistakes in the design of the nuclear reactor were found culpable for the accident. In this regard, the paper examines the engineering ethics issues in the Chernobyl disaster and argues for the necessity of increasing engineers’ awareness of their essential mission, in particular the improvement of human lives.

On April 26, 1986, the fourth unit of the Chernobyl nuclear power plant, located on the territory of Ukraine, was destroyed. The destruction was of an explosive nature: the reactor was completely destroyed, releasing enormous quantities of radioactive substances into the environment. The accident is regarded as the largest of its kind in the history of nuclear energy, both in the estimated number of dead and injured people and in economic damage. Chernobyl was the most powerful nuclear power plant in the USSR.

There were various explanations of the Chernobyl accident, but only two of them stand out as the most scientific and reasonable. The essence of the first is that the staff of the fourth block of Chernobyl flagrantly violated the regulations for preparing and carrying out the electrical test; in other words, the rules for the safe operation of the reactor were violated. This explanation also notes negligence in the management of the reactor plant and the personnel’s lack of understanding of the technological processes in a nuclear reactor.

The second state commission, consisting mainly of operating staff, gave another explanation of the causes of the Chernobyl accident. It determined that the fourth reactor had structural deficiencies that led to the explosion. On this view, the scientists and engineers who created and designed the reactor, its graphite elements, and its fuel are to blame for the accident.

The consequences and ramifications of the accident were terrifying. The Chernobyl incident resulted in an exclusion zone of approximately 30 kilometers, within which hundreds of small settlements were destroyed and buried by heavy machinery (Yablokov, Nesterenko, & Nesterenko, 2010). Territories of Belarus, Russia, and Ukraine were polluted. Radiation contamination affected the human organism, leading to various diseases, including cancer, cataracts, and cardiovascular diseases, and even to death (Cardis & Hatch, 2011).

It should also be noted that the UN report on the consequences of the Chernobyl disaster states that the effect of radiation on human health was less than expected, and that the relocation of residents from the 30-kilometer zone brought more harm than good. It destroyed local communities and families, leading to unemployment and depression. Severe stress, whose effects cannot be accurately assessed, affected all the relocated people. Before the accident, the city’s population comprised 43 thousand people. Currently, the city is home only to the staff of institutions and enterprises of the exclusion zone, who work on a rotational basis.

In this regard, it becomes obvious that the professional ethics of engineering and other technical professions should involve the pursuit of high-quality work. In creating objects of material culture, the engineer cannot work without a spiritual culture. The development of the modern technosphere makes engineering ethics highly important and in demand. The case of Chernobyl makes it clear that engineers must possess ethics to avoid similar situations in the future.

Nowadays, one can observe an ethical pluralism that has brought a steady increase in attention to the problems of professionalism and competence in modern society. The ethical priorities followed by specialists are directly dependent on the risks faced by humanity. Hence, the relevance of discussing engineering ethics in the modern world has never been greater.

The disaster at the Chernobyl Nuclear Power Plant was the largest nuclear power accident in the history of the world. Nuclear fuel fission products discharged from the destroyed reactor into the atmosphere were carried by air over considerable areas, contaminating territory within the borders of Ukraine, Russia, and Belarus. Thousands of people were relocated from the exclusion zone. The engineers who made mistakes in the design of the nuclear reactor were found culpable for the accident, as they violated the code of ethics, in particular the morality of the engineering profession, resulting in harm to society.

The Chernobyl disaster changed the scientific view of the world. Consequently, problems of technological risk came to be discussed not only by scientists but also by the general public. The very way safety problems concerning nuclear reactors are discussed has changed.

The responsibility of politicians, engineers, designers, and operators, as well as scientists, began to be taken into account. It was stated that no social, economic, technical, or scientific interest can justify harm to humans and the environment (Harris, Pritchard, Rabins, James, & Englehardt, 2013). The growth of humanity’s technical and technological capacities creates an entirely new ethical situation, one that requires not only an evaluation of how technologies are used but also the ability to anticipate disasters and to prevent, or at least minimize, them.

As shown by the accident at Chernobyl, the development of nuclear power technology does not in itself make people’s lives safer. The accident occurred because of a number of violations of the rules for operating reactor facilities. On the fourth power unit, during its shutdown for scheduled maintenance at night, several experiments were conducted involving the study of turbo-generator operating modes. Workers did not provide appropriate oversight and did not take all the necessary safety measures. Ultimately, fault also lay with the engineers who made mistakes in the design of the reactor.

It goes without saying that consequences of nuclear energy such as the Chernobyl disaster cannot always be predicted. Nevertheless, engineers should try to anticipate them in new projects by conducting relevant research and listening to the views of the opposition (Wilson, 2013).

It is expected that technical work will always contain a necessary component of technology assessment, and that not everything that is technically feasible should necessarily be created. Within the framework of utilitarian ethical theory, the principal task of the engineer is not only to design and calculate but also to create a safe environment grounded in ethical and social responsibility. People are more significant than technology. Therefore, technology should satisfy the requirements of a variety of values, namely the criteria of economy, improvement of living standards, safety, health, environmental quality, and the social environment.

Cardis, E., & Hatch, M. (2011). The Chernobyl Accident – An Epidemiological Perspective. Clinical Oncology, 23 (4), 251-260.

“The Chernobyl Accident – An Epidemiological Perspective” by Cardis and Hatch investigates the effect of the catastrophe on the human organism. The authors state that there is a clear connection between that exposure and the risk of thyroid cancer. Ionizing radiation affects people every day through many natural sources, including cosmic rays and naturally occurring radioactive materials in food, drink, and air; this is considered natural radiation. However, at the elevated radiation levels caused by the accident, there is evidence of disease. The article is useful for the research as it emphasizes the radiation levels of the Chernobyl incident and their effects.

Harris, C. E., Pritchard, M. S., Rabins, M. J., James, R., & Englehardt, E. (2013). Engineering ethics: Concepts and cases (5th ed.). Belmont, CA: Wadsworth.

The book “Engineering ethics: Concepts and cases” by Harris, Pritchard, Rabins, James, and Englehardt focuses on the system of moral principles applied in the practice of engineering. The authors of the book are recognized scholars, so the credibility of the source is not in question. Combining theory and practice, the book presents many cases as well as a code of ethics and its application in practice. In addition, the book clearly explains the impact of engineering solutions on the health, safety, and welfare of society. It was very significant for the research, as it promoted comprehension of engineers’ morality.

Wilson, W. R. (2013). Using the Chernobyl Incident to Teach Engineering Ethics. Science and Engineering Ethics, 19 (2), 625-640.

The article “Using the Chernobyl Incident to Teach Engineering Ethics” by Wilson presents an approach to teaching ethics to engineers. It demonstrates that when students are combined into groups and asked to act and speak like the workers of the Chernobyl Nuclear Power Plant, they comprehend the issue better and suggest concise, well-argued solutions. The author is a professor at Muskingum University, New Concord, OH, USA. The intended audience of the article is engineering students. This credible source was rather helpful for the provided analysis, as it contributed to the development of possible decisions concerning ethical issues in engineering.

Yablokov, A. V., Nesterenko, V. B., & Nesterenko, A. V. (2010). Chernobyl: Consequences of the Catastrophe for People and the Environment . Boston, MA: New York Academy of Sciences.

“Chernobyl: Consequences of the Catastrophe for People and the Environment” by Yablokov, Nesterenko, and Nesterenko reflects the various consequences of the disaster. The authors clearly and systematically point out all the ramifications, including general morbidity, disability, non-malignant diseases, and other illnesses that occurred as a result of the accident. What is more, the book describes the catastrophe’s consequences for the environment; in particular, it covers the radioactive effects on the atmosphere, soil, and water. The book contributed to an understanding of the scale of the disaster. It would be interesting and helpful both for students and for the average reader.


Engineering Ethics of Chernobyl and the Three Mile Island Research Paper


Contents

  • Introduction
  • The Background of the Two Nuclear Accidents
  • The Causes of Disasters
  • The Principles of Engineering Ethics
  • Breach of Ethics in Chernobyl and Three Mile Island
  • The Comparison of Ethical Implications in the Two Cases

Introduction

Ethical obligations are intrinsic to any profession; nonetheless, people often take little account of them, giving priority to the efficiency and accomplishments of their work. In occupational spheres where moral qualities are not considered important, a person’s mistake usually has minor, short-term, and reversible consequences. However, in some cases, a breach of ethics can lead to significant detrimental outcomes.

Negligence or an irresponsible attitude to work poses a threat to people’s safety, as happened at Three Mile Island, and can result in long-term catastrophic repercussions, as illustrated by the Chernobyl tragedy. The two accidents serve as valuable lessons and warnings, showing the essential role of the human factor in the nuclear power industry.

Well-functioning mechanisms and automated processes cannot guarantee stability and safety, because the individuals who make major decisions are still prone to error. To review the mentioned events in the context of engineering ethics, it is necessary to study their history, causes, the underlying code of conduct, and professionals’ relative adherence to it.

The Chernobyl disaster is known for its drastic impact not only on the Ukrainian population and environment but also on the whole world. As Plokhy (2019) explains, the engineers working at the nuclear power plant were tasked with improving the automatic shutdown mechanisms and, consequently, establishing a new emergency safety system.

The ministry prompted them to run a corresponding test that would imitate the conditions of a power failure (Plokhy, 2019). None of the workers expected that the procedure would disable protective systems. However, on April 26, 1986, an abrupt power surge during the reactor system test caused an intense explosion (United States Nuclear Regulatory Commission, 2018). The resulting fire contributed to the destruction of Unit 4 and the extensive spread of radiation (United States Nuclear Regulatory Commission, 2018). Thus, the intention to enhance the plant's emergency capability turned into the worst nuclear catastrophe in history.

Despite the significant efforts of the responders to the Chernobyl accident, its consequences included the creation of the exclusion zone, people’s deaths, and worsening of the population’s health. The remaining reactors were eventually stopped, and the area within 30 kilometers of the plant was closed (United States Nuclear Regulatory Commission, 2018). Shortly after the disaster, 28 workers died, many others suffered from the radiation, and millions of people were exposed to the adverse impact in the contaminated areas (United States Nuclear Regulatory Commission, 2018).

The drastic event has also affected the mental health of the affected individuals. They are prone to depression, addictive behaviors, and anxiety; some people experience unrecognized physical symptoms, overestimate their conditions, and make negative predictions regarding their life span (United States Nuclear Regulatory Commission, 2018). The disaster's widespread outcomes should motivate society to pay more attention to safety within the industry.

The Three Mile Island accident is considered the most serious nuclear accident to have occurred in the United States. The personnel working at the plant on March 28, 1979, were unaware that the emergency feedwater valves had been left closed after a maintenance procedure, which decisively shaped their subsequent actions (Filburn & Bullard, 2016).

The loss of main feedwater stopped the turbine and set off a chain of complicated events and alarms that misled the operators into making wrong decisions (Filburn & Bullard, 2016). They also missed the inadequate cooling of the core and the failure of the pilot-operated relief valve, which resulted in radiation releases and the accumulation of hydrogen in the reactor vessel (Filburn & Bullard, 2016). The accident did not bring catastrophic consequences because a large explosion was avoided. Nonetheless, coming so close to catastrophe makes that day at Three Mile Island a warning.

The implications of the nuclear plant accident in the United States were realized later. According to Filburn and Bullard (2016), although people's exposure to the radiation was insignificant and they did not suffer from long-term health issues, the event entailed economic losses. Rectifying the consequences demanded both time and money: roughly ten years and approximately one billion dollars (Filburn & Bullard, 2016).

Removing the fuel debris and carefully examining the damaged components helped investigate the matter, and the proper disposal of radioactive waste ensured safety on the plant's territory (Filburn & Bullard, 2016). Moreover, Three Mile Island aims at full decontamination of Unit 2 by 2034 (Filburn & Bullard, 2016). The accident did not harm the environment in the region or worsen residents' health, but it caused financial troubles for the parties involved.

The Chernobyl catastrophe is viewed as unprecedented because of its unique circumstances and contributing factors. First of all, the RBMK reactor used at the plant combined a graphite moderator with a water coolant and was considered unacceptable outside the Soviet Union because of its instability (The Nuclear Energy Institute, 2019). This means that the respective authorities ignored the risks associated with the design. The reactor's behavior was difficult to predict at low power because of its peculiarities: the loss of cooling water sped up the nuclear chain reaction and raised the power output (The Nuclear Energy Institute, 2019); a toy numerical sketch of this feedback loop is given after the next paragraph.

Secondly, the plant was less protected than comparable facilities elsewhere in the world. It lacked a reinforced containment structure, which allowed radioactive materials to escape into the environment (The Nuclear Energy Institute, 2019). Consequently, the Chernobyl power station was unprepared for emergency situations, which made it much more dangerous than other nuclear plants.
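To make the instability described above more concrete, the following minimal Python sketch illustrates the feedback loop created by a positive void coefficient: steam voids add reactivity, reactivity raises power, and higher power produces more voids, so a small disturbance grows instead of dying out. This is an illustrative toy, not a reactor model; all numbers are invented.

    # Toy illustration only (not a reactor model): with a positive void
    # coefficient, steam voids add reactivity, reactivity raises power, and
    # higher power boils more coolant, creating still more voids.

    def simulate(void_coefficient: float, steps: int = 10) -> list:
        power = 1.0           # relative power, arbitrary units
        void_fraction = 0.05  # fraction of coolant that is steam
        history = [round(power, 2)]
        for _ in range(steps):
            reactivity = void_coefficient * void_fraction
            power *= (1.0 + reactivity)                       # reactivity feedback on power
            void_fraction = min(1.0, void_fraction * (1.0 + 0.5 * reactivity))  # toy boiling relation
            history.append(round(power, 2))
        return history

    print("positive void coefficient:", simulate(+0.8))   # power keeps rising
    print("negative void coefficient:", simulate(-0.8))   # feedback damps instead of amplifying

Running the sketch shows the positive-coefficient case diverging while the negative-coefficient case damps out, which is why water-moderated designs with a negative void coefficient are comparatively self-stabilizing.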

There is no doubt that the causes of the Chernobyl tragedy are also related to the human factor. As stated in “Low Safety Culture” (2019), at that time, economic and sociopolitical aspects in the atomic energy sphere were not legally regulated. There was no person fully responsible for the safety of nuclear power utilization (“Low Safety Culture,” 2019).

The plant's Unit 4 was scheduled for maintenance, and the operators had to perform procedures to determine whether the equipment could supply enough power for the cooling system during the transition phase (Plokhy, 2019). However, the workers did not take proper precautions when performing the system test because they were not aware of the existing risks. Most importantly, they were under pressure from the deputy chief engineer, focused on accomplishing the task, and could not prioritize safety (“Low Safety Culture,” 2019). Thus, the system of seniority prevailed over protective measures, leading to the disaster.

Regarding Three Mile Island, the reasons for the accident included minor equipment failures and inadequate control instrumentation. The chain of events began when a malfunction in the secondary cooling circuit raised the temperature, the reactor subsequently tripped, and the relief valve failed to close (World Nuclear Association, 2020). However, the control room instrumentation did not reveal the problem.

The unnoticed sticking of the pilot-operated relief valve (PORV) led to a series of misconceptions, which kept the staff from taking timely and effective action and shifted their focus in the wrong direction. They underestimated the significance of the PORV and the block valve because the manufacturer had been unaware of their safety function under accident loads (Rosztoczy, 2019). The flaws in the design of the plant's systems led to confusion in handling them.
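One widely cited detail of the Three Mile Island instrumentation, assumed in the toy sketch below, is that the control-room indication reflected the close command sent to the PORV rather than its measured position, so a stuck-open valve looked closed. The class and method names here are invented for illustration only.

    # Minimal sketch of the indication problem: the indicator is driven by the
    # command signal, not by a position sensor, so a stuck valve reads "CLOSED".

    from dataclasses import dataclass

    @dataclass
    class ReliefValve:
        commanded_open: bool = True
        actually_open: bool = True

        def command_close(self) -> None:
            self.commanded_open = False
            # Fault: the valve sticks open even though it was told to close.
            self.actually_open = True

        def control_room_light(self) -> str:
            # Reflects the command sent to the valve, not its actual position.
            return "OPEN" if self.commanded_open else "CLOSED"

    valve = ReliefValve()
    valve.command_close()
    print("Indicator reads:", valve.control_room_light())  # CLOSED
    print("Valve actually open:", valve.actually_open)     # True: coolant keeps escaping

In other words, the operators were not simply careless in reading the indicator; the indicator answered a different question than the one they needed answered.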

The Three Mile Island accident was not free of human error either. Because of deficient indicator instrumentation and the lack of training for a similar situation, the operators could not decide on an appropriate course of action. Rosztoczy (2019) explains that, unable to find the problem, they improvised and relied on the pressurizer water level data. The employees made several mistakes, such as shutting off the emergency core cooling system, opening the letdown line, missing the overheating of the core, and injecting radioactive water into the auxiliary building (Rosztoczy, 2019).

These errors were corrected too late, after the partial meltdown had already occurred. Moreover, the operators' training had prepared them only for the worst-case scenario. Treating the situation as a minor one kept them from seeing it as potentially dangerous and from responding to it more seriously (Filburn & Bullard, 2016). Therefore, the flawed approach to staff training subjected the power station to a harrowing experience.

Individuals involved in the nuclear industry should recognize the basic moral values connected with radiological protection. The first pair is beneficence, the promotion of good, and nonmaleficence, the avoidance of doing harm. These principles are realized in protecting society from the adverse influence of radiation and in minimizing the likelihood of threats (Cho et al., 2018). The second value, prudence, means the ability to make informed and well-considered choices about what can be done in the given circumstances (Cho et al., 2018).

This virtue is tied to the obligation to monitor radiological conditions and make sound decisions even in the face of uncertainty (Cho et al., 2018). The third value is dignity; it implies that every individual deserves unconditional respect and has the right to accept or reject a risk (Cho et al., 2018). The final principle, justice, calls for an equitable distribution of benefits and losses, meaning that people's exposure to radiation should be limited, but not at the expense of others (Cho et al., 2018). These underlying ethical values should become the norm for workers at nuclear power plants.

The practical application of moral principles can be reflected in such aspects as accountability, transparency, and inclusiveness. The first concept obliges a person or an organization to take responsibility for their actions and related consequences, as well as the provided advice, given orders, and developed requirements (Cho et al., 2018). The second one means the establishment of explicit procedures and demands, ensuring unimpeded access to the information which impacts society and the environment (Cho et al., 2018).

The last aspect reflects the freedom of stakeholder participation, allowing people other than radiological-protection specialists to be involved because it serves their interests (Cho et al., 2018). Collaboration between experienced professionals and stakeholders helps them approach an issue comprehensively and contributes to mutual understanding. Adherence to accountability, transparency, and inclusiveness is key to upholding safety within the nuclear industry.

Engineers are respectable professionals; given the nature of their work, they are expected to follow the most demanding guidelines of ethical conduct and conform to the corresponding code. The National Society of Professional Engineers offers a comprehensive, detailed, and well-organized set of requirements for the individuals choosing this occupation. It outlines the rules of practice, dividing them into five categories (National Society of Professional Engineers, 2019).

According to the code, engineers should prioritize the public’s safety, health, and welfare, work only in the spheres of their competence, and deliver unbiased information to the population (National Society of Professional Engineers, 2019). Furthermore, the representatives of this profession should be faithful to their employers and clients and refrain from fraudulent acts (National Society of Professional Engineers, 2019). The engineering practice means complying with the highest ethical standards.

People working in the nuclear industry are prepared to undertake a number of obligations connected to their job. For example, engineers make honesty and integrity their moral priority, serve the public interest, avoid potentially misleading actions, and preserve the confidentiality of the clients or employees (National Society of Professional Engineers, 2019). They do not allow conflicting interests to hinder their judgment and do not purposefully worsen the reputation of colleagues to obtain a promotion (National Society of Professional Engineers, 2019).

Instead, these professionals recognize the proprietary rights of others and accept personal responsibility for their work (National Society of Professional Engineers, 2019). The National Society of Professional Engineers recognizes that engineers' activities directly influence people's quality of life, and so it sets high demands on their virtues. Such an attitude is commendable and promising in terms of the safety of nuclear power stations.

The tragedy of Chernobyl and its causes can be explained by the absence of certain ethical standards, from government officials down to ordinary personnel. The desire to keep pace with other countries prompted the Soviet Union to build its own nuclear power plants. Nikita Khrushchev ignored the engineers' warnings that it would be dangerous to use uranium-graphite channel-type reactors to produce electricity (“Low Safety Culture,” 2019). The alternative options were out of reach because of the technological complexity of producing reactor vessels (“Low Safety Culture,” 2019).

Thus, one can perceive the leader's decision as contradicting the principles of nonmaleficence, accountability, and inclusiveness: he neither prevented harm, nor took responsibility, nor considered the public interest. Moreover, Khrushchev was acting beyond his area of competence, refusing to accept the advice of specialists in the field. Such a breach of ethics served as a prerequisite for the future disaster, given the flawed design of the reactor.

A further breach of ethics can be seen right before the accident at the Chernobyl station and even afterward. As evident from the background description, the plant operators were forced to perform a risky procedure, prioritizing the deputy chief engineer's orders over safety (“Low Safety Culture,” 2019). The head manager did not adhere to the principles of prudence and dignity, threatening everyone's well-being and showing no respect for his employees. He also did not demonstrate a willingness to ensure public welfare. When the catastrophe occurred, despite the reactor's flawed design, the officials could still have limited radioactive exposure (The Nuclear Energy Institute, 2019).

Nonetheless, the plant operators hid the news from the authorities and the affected population, which led to a late evacuation alert and people's consumption of contaminated food (The Nuclear Energy Institute, 2019). Such actions amount to a breach of nonmaleficence, justice, and transparency, as well as to providing false information, ignoring public interests, and avoiding accountability. The continued disregard of ethics resulted in long-term and extensive consequences.

The Three Mile Island accident presents slightly different aspects, given that its consequences were manageable and reversible. Firstly, the safety function of the PORV was compromised because such a requirement was not indicated in its purchase order (Rosztoczy, 2019). The supplier did not understand the client's needs, which illustrates a violation of beneficence, prudence, and inclusiveness caused by poor communication and a failure to reach mutual understanding.

Secondly, the design of the control room was not well thought out, because the operators found it difficult to notice and interpret the indicators (World Nuclear Association, 2020). This means that the individuals responsible for this work did not establish explicit procedures and did not strive to provide favorable working conditions for the employees. These shortcomings affected the personnel involved, causing rather erratic actions and making their diagnosis of the problem increasingly complicated.

The plant's staff played the greatest role in the Three Mile Island disaster because of their non-adherence to the expected ethical conduct. Some of the operators were not experienced enough to handle emergencies, and their previous training did not prepare them for such situations. It covered the mitigation of anticipated accidents, but very small ones, such as a PORV failure, had not been analyzed by the designer and were consequently excluded from the program (Rosztoczy, 2019).

Without considering the station's response to such a breakdown, instructors had taught the operators to rely on the pressurizer water level indication to measure the water level in the reactor coolant system (Rosztoczy, 2019). The individuals who developed the operator training breached accountability and beneficence because they did not offer the comprehensive preparation that would have proven their professionalism and benefited their students. In turn, the operators did not act within their area of competence, as their knowledge proved insufficient during the accident. The company's staff should have been educated on the engineering code of conduct to comply with the necessary requirements.

The difference in the consequences of the two disasters is significant because of the variations in the ethical culture of the United States and the Soviet Union. While such issues were regulated in the U.S. at the time of its accident, the USSR did not have clearly developed rules of professional conduct in the nuclear industry (The Nuclear Energy Institute, 2019). In addition, the economic progress of foreign countries placed the Soviet state in an unfavorable position.

To withstand the competition, the Soviet Union had to make fast decisions and focus on efficiency and production volume, ignoring ethical obligations, including safety (“Low Safety Culture,” 2019). The authoritarian regime discouraged initiative and demanded adherence to the system of seniority. Conversely, the U.S. continued to promote democratic values and recognized the importance of collaboration. Therefore, ethical standards and better technical conditions helped avert tragedy at Three Mile Island, whereas placing competition and authority before safety caused the Chernobyl tragedy.

The dissimilar outcomes of the accidents can also be explained by the different attitudes toward precautions, which can be partly attributed to mentality. The Chernobyl disaster was provoked by a breach of transparency and accountability: the workers performed a dangerous test without being aware of the potential consequences because their leaders ignored or concealed these details (The Nuclear Energy Institute, 2019). The test was not accompanied by proper safety measures because no one was solely responsible for this aspect (“Low Safety Culture,” 2019).

At the Three Mile Island plant, the employees did not conduct questionable procedures under pressure; they were simply confronted with an unexpected situation. They took corrective action as soon as possible, and the most serious consequences of the accident were avoided (Rosztoczy, 2019). Thus, taking safety measures means following ethical standards and preventing detrimental events.

Nevertheless, both disasters involved breaches of engineering ethics and revealed a need for improvement in this realm. The accidents exposed gaps in the accountability, prudence, and transparency of both leaders and subordinates, which formed the prerequisites for the problems that occurred (“Low Safety Culture,” 2019). Some individuals acted beyond their area of competence and provided misleading information, for instance, the chief engineer at the Chernobyl nuclear station and the managers who organized employee training at Three Mile Island.

Furthermore, both accidents illustrated violations of the principles of beneficence, nonmaleficence, and inclusiveness, revealing the professionals' inability to empathize and communicate effectively. The underlying standards of honesty and integrity were not upheld either. Evidently, every breach of ethical obligations can undermine the safety of an enterprise, which is why it is crucial to uphold the existing codes of conduct within the nuclear industry.

Analyzing the engineering ethics of the disasters at Chernobyl and Three Mile Island requires examining their backgrounds and contributing factors, reviewing the accepted principles and standards, and assessing the compliance of the parties involved. The former tragedy happened when a system test resulted in a massive explosion, people's deaths, diseases, and considerable harm to the local environment. The latter accident involved a partial melting of the reactor core due to an unnoticed problem and an incorrect series of actions; it entailed mostly economic losses.

The causes of the Chernobyl event included the poor design of the reactor and the human factor, while the Three Mile Island incident occurred due to minor equipment deficiencies and the insufficient preparation of the operators. In this light, the obligatory aspects of industry ethics encompass nonmaleficence, prudence, justice, dignity, accountability, transparency, and inclusiveness. In the Chernobyl case, nearly all of the individuals involved violated at least one ethical principle, while in the other event the breach was limited to the designer's fault and the inadequate training of employees. Comparing the disasters leads to the conclusion that improving the ethical culture will help enhance nuclear power safety.

Cho, K. W. et al. (2018). ICRP Publication 138: Ethical foundations of the system of radiological protection. Annals of the ICRP, 47 (1), 1-65.

Filburn, T., & Bullard, S. (2016). Three Mile Island, Chernobyl and Fukushima: Curse of the nuclear genie. Cham, Switzerland: Springer.

Low safety culture of the entire system – The cause of the Chernobyl accident. (2019).

National Society of Professional Engineers. (2019). Code of ethics for engineers [PDF document].

Plokhy, S. (2019). Chernobyl: History of a tragedy. London, UK: Penguin Books.

Rosztoczy, Z. R. (2019). Root causes of the Three Mile Island accident. Nuclear News.

The Nuclear Energy Institute. (2019). Chernobyl accident and its consequences.

United States Nuclear Regulatory Commission. (2018). Backgrounder on Chernobyl nuclear power plant accident.

World Nuclear Association. (2020). Three Mile Island accident.


