Ethics of Chemical Weapons Research
| Substance | boiling point (°C) | LD50, skin (mg/kg) | LD50, subcutaneous (mg/kg) | LC50, inhalation (ppm) | LCLo, inhalation (ppm) |
| Chloroacetone | 119 | 141b | | 262 (1 h) | |
| Chlorine | -34.0 | | | 1373 (1 h) | 800d (30 min) |
| Arsenic trichloride | 130 | 80 | | | 200e (20 min) |
| Phosgene | 8.2 | | | | 190e (15 min) |
| Hydrocyanic acid | 26 | | 3.0c | 160 (30 min) | |
| Chloropicrin | 112 | | | | 111a,b (20 min) |
| Phenyldibromoarsine | 265 | 15 | | | |
| Mustard agent | 216 | 5 | 1.5 | 40.0a,b (10 min) | |
| Lewisite | 197 | 15 | 1.0 | | 6.0f (30 min) |
| Sarin | 147 | 2.5 | 0.103 | 0.81a,c (30 min) | |
| Soman | 198 | 7.8 | 0.071 | 0.13a,c (30 min) | |
| VX | 300 | 0.25b | 0.012 | | |
LD50 and LC50 are the lethal dose per kg body weight and the lethal air concentration, respectively, that kill 50% of a sample of test animals; they are differentiated by the species of the animal, the uptake route (skin absorption, subcutaneous, oral, inhalation, etc.), and, in the case of LC50, the exposure time. LCLo is the lowest concentration reported to have caused the death of certain animals after a certain exposure time. a converted from mg/m3; b rabbit; c mouse; d dog; e cat; f human.
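For readers who wish to compare the footnoted values with the literature, the conversions marked "a" presumably follow the standard relation between mass concentration and volume mixing ratio of a gas, sketched here for 25 °C and atmospheric pressure (molar volume about 24.45 L/mol): c(ppm) = c(mg/m3) × 24.45 / M, where M is the molar mass in g/mol. For mustard agent (M ≈ 159 g/mol), for example, 40.0 ppm corresponds to roughly 40.0 × 159 / 24.45 ≈ 260 mg/m3; a different reference temperature in the original sources would shift this figure slightly.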
In August 1919, shortly after the formal peace Treaty of Versailles was signed, Fritz Haber escaped to neutral Switzerland, fearing persecution by the Allies for war crimes. Two months later he received a message from Sweden that he would be awarded the Nobel Prize for chemistry, retrospectively for 1918, for his achievements in ammonia synthesis. However, by then Haber's invention had not been much used for the manufacture of fertilizers, which became feasible and cheap only through massive catalytic improvements that earned Carl Bosch, the former CEO of BASF, the Chemistry Nobel Prize as late as 1931. Instead, by 1918, ammonia from the Haber process was mostly used (oxidized via nitrogen dioxide to nitric acid for the nitration of toluene, glycerol, and cellulose) for the large-scale production of high explosives for shells, such as trinitrotoluene (TNT), trinitroglycerin (TNG), and nitrocellulose. Moreover, in order to obtain hydrogen for ammonia synthesis by electrolysis of aqueous NaCl solutions (the chloralkali process), equal amounts of chlorine were produced to be used as poison gas. Thus, immediately after the war, the Nobel committee honored the crucial chemical reaction that enabled the mass production of both high explosives and poison gas: a "cynical prize for chemical warfare", as contemporary critics called it.
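The stoichiometric basis of the "equal amounts" claim is the overall chloralkali reaction, 2 NaCl + 2 H2O → 2 NaOH + H2 + Cl2, which yields one mole of chlorine for every mole of hydrogen produced by electrolysis.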
With many other Nobel Prizes, which were even then considered the highest international awards in science, the Swedish Academy honored major figures of German chemical weapons research and development (Van Der Kloot 2004). Richard Willstätter, head of the national gas mask research unit, received the Nobel Prize for chemistry as early as 1915. Walther Nernst, who like Haber fled after the war out of fear, first to Sweden and then, after selling his estate in Germany, to Switzerland, was awarded the same prize in 1920. Haber's most talented recruits to his poison gas team, Gustav Hertz and James Franck, were the physics Nobel Laureates of 1925. Heinrich Wieland, the German co-father of mustard agent and Adamsite, won the chemistry prize in 1927. Otto Hahn received his, retrospectively for 1944, only after his co-discovery of nuclear fission had been developed into the atomic bomb.
Despite notable exceptions, beginning with Nobel Laureate Hermann Staudinger during and after WWI, the scientific community has never seriously questioned the reputation of these scientists on account of their engagement in chemical weapons research. Instead, their names have been upheld as models for future generations. For instance, the Kaiser Wilhelm Institute that Haber once turned into the biggest weapons research unit worldwide is now named the Fritz Haber Institute of the Max Planck Society. The German society for physical chemistry (the Bunsen-Gesellschaft) calls its highest award the Walther Nernst Medal and its young scholars' award the Nernst-Haber-Bodenstein Prize. The top award of the German Chemical Society for organic chemistry is named after Emil Fischer.
The honoring of former heroes of chemical weapons research is not confined to Germany (Freemantle 2014, pp. 44 ff., 219 ff.). For instance, the medical chemist Fritz Pregl, a leading figure in the Austrian chemical warfare project, received the chemistry Nobel Prize in 1923. When the International Union of Pure and Applied Chemistry (IUPAC) was founded in 1919, it elected as its first President Charles Moureu, who had been head of the French offensive chemical warfare department during the war. Both Moureu and Nobel Laureate Victor Grignard, who was the major scientific innovator of French chemical warfare, are still honored by numerous monuments in France. In 1922 the British chemist Sir William Jackson Pope, who was knighted for his chemical warfare achievements, followed Moureu as President of IUPAC. Just as Pope had made a name for himself with the synthesis of mustard agent in Britain, so had James B. Conant as a young scholar in the US; historians of science know Conant mainly as the mentor of Thomas S. Kuhn. After the war, Conant rapidly advanced from chemistry professor to Harvard University President to one of the most influential science policy advisors in the US during and after WWII, particularly on nuclear weapons research and deployment. Before the rise of nuclear weapons physics, chemical weapons research appears to have been one of the most promising fields in which to make a career in science and science administration.
Ethics or moral philosophy, one of the oldest philosophical disciplines, serves various purposes. One is to justify or criticize new and existing national and international laws on the basis of accepted ethical principles. For instance, before a new law comes into force in democratic societies, ethical deliberations and debates are usually conducted to ensure that it is in accordance with the prevailing ethical standards. Equally important is the role of ethics in providing moral guidance in areas not covered by law. Because the law cannot, and for various reasons had better not, control all human behavior, ample room is left for ethics. For instance, chemical weapons research is not forbidden by any law. If you research a potential substance for chemical warfare, you could always argue that you are just studying the compound to understand its interesting chemical structure, or to find a new agent for pest control. No judge would be able to read your mind, although your colleagues might guess what you are after.
Ethical theories aim to make impartial judgments about what is morally right or wrong, regardless of any personal, corporate, or national interest, i.e. their first principle is impartiality. Like any theory, they do so by providing general principles and methods to derive judgments for particular cases. All ethical theories for the moral assessment of human actions fall into two main groups, utilitarianism and deontology, which both provide respectable moral positions. They differ to some extent in their moral judgments, but not, as we will see, about weapons research.
Utilitarian theories are based on a single principle, a normative rule for actions: act so that the consequences of your actions maximize the benefit of all people. Theories differ greatly in what they understand by benefit, how to calculate it and balance it against harmful consequences, how to distribute it best, and to what extent "all people" includes future generations and nonhuman living beings. In the present context, these differences are unimportant. What matters is that utilitarian (and more generally consequentialist) theories judge actions in retrospect only according to their actual beneficial and harmful consequences, including the unintended adverse ones; naive good will that brings about harm is thus a major cause of moral failure.
Deontological theories (from Greek deon: duty, obligation) are based on two or more principles that are all general normative rules or duties. These duties are frequently organized by values (commandments of what one should strive for) and evils (prohibitions of what should be avoided). They all incorporate the utilitarian norm in the form of the commandment of benevolence and the prohibition of doing harm, which is typically of higher rank such that the prospective benefit rarely justifies doing harm. Unlike in utilitarianism, however, benefit and harm cannot simply balance each other out. On the one hand, there are additional absolute prohibitions, e.g. of harming human integrity or dignity. On the other, benefit and harm should each be fairly distributed according to values of justice. For specific contexts, such as biomedical ethics, core lists of further principles have been developed, albeit with vague priority rankings, which is one of the weaknesses of deontology.
The most famous deontological approach, by Immanuel Kant, provides a meta-rule for deriving normative rules in each kind of context, which is a sophisticated version of the Golden Rule: choose only those general rules that you can reasonably want to become universal rules applicable to anyone. In deontology, actions are judged not on the basis of their actual consequences but according to whether one acts out of ethically justified duties, which of course includes the duties to foresee, based on the available knowledge, any possible harm and to avoid it. Here again, as in utilitarianism, naivety is a significant moral failure, not an excuse.
Compared to ordinary life activities, science and engineering are special in that they potentially create entirely new entities that did not exist before, say a new chemical substance or a new weapon. Moreover, they discover, and usually make public, the ways in which these new entities can be made. Research, development, and publication are the actions for which scientists and engineers are to be held responsible and which are to be ethically assessed.
Let us begin with utilitarianism and ask what the likely consequences of successful weapons research are. They are of two kinds. First, those who get access to your knowledge will try to build the weapon and use it to threaten other people, and some will deploy it in order to kill or harm, if only to demonstrate their power. That has been true throughout history, and it includes the hydrogen bomb that a growing number of countries have rebuilt. In general, despite all efforts to classify it as secret, scientific and engineering knowledge about powerful weapons quickly leaks, by espionage or the analysis of weapon tests, to the rest of the world, including your enemies, rogue states, and terrorists. It is difficult to find any exception to that historical law.
Therefore, according to utilitarianism, every future deployment of your weapon by anyone belongs to the consequences of your research, by which your action is morally judged. You might have wanted your weapon to be used only as a means of deterrence, or for a certain one-time deployment by good guys in a special situation, but that naivety is no excuse in ethics. Even if there are such special situations in which the possession or use of a weapon by one party has beneficial consequences, the overall future consequences, which include any deployment by any party, are by all reasonable foresight harmful and outweigh any possible positive consequence. Thus from a utilitarian point of view weapons research is clearly morally wrong.
Moreover, as scientific research strictly builds on itself, which is an almost unique feature of science among all cultural activities, so does weapons research. The second type of consequences includes further research by others who modify and improve your weapon, making it more effective. If that is done by your enemy, it becomes a step in an arms race that develops ever more devastating weapons, of which poison gas research during WWI is a particularly instructive example because it escalated quickly once the international ban had been broken. In this case your original research does not literally cause the follow-up research, but it enables it, such that subsequent steps in an arms race are consequences of your research activity for which you are co-responsible. You might desire to reach an immediate advantage for the good, but you actually contribute to an ever-worsening development of weaponry. It is difficult to imagine another research situation in which the utilitarian verdict, no matter which specific theory, is as clear as in weapons research.
Much of what has been discussed above also applies to deontological ethics because the prohibition of doing or causing predictable harm is a major duty in all systems. Because all weapons research causes easily predictable harm in the future, it is forbidden.
And yet, the appeal to duty has frequently been abused to justify weapons research and other crimes. For instance, Haber argued in the interwar period that he had performed his projects out of duty to his home country. Even Heinrich Himmler, leader of the Nazi SS, claimed in his notorious Posen speech (1943) that the extermination of Jews was a "moral right", a "duty to our people". However, the alleged duties to one's nation, corporation, or gang are not moral duties but only rules that a group imposes on itself. They all violate the principle of impartiality that defines the scope of moral rules. In contrast, a moral duty is a duty owed to anybody regardless of group membership, or to humanity as a whole. Hence, patriotism is not to be confounded with morality.
Kant's meta-rule, which implements the principle of impartiality, is a useful test instrument for moral rules: choose only those general rules that you can reasonably want to become universal rules applicable to anyone. The rule to be examined is thus not "My chemical weapons research is permitted" but "Chemical weapons research is generally permitted". Can you reasonably want anyone else to follow this rule, now and forever? If you think that there are irresponsible people who should not be allowed to do that, then you consider the rule morally wrong. Moreover, if you think that such unrestricted weapons research leads to an arms race to make ever more poisonous substances that threaten the existence of humanity and all living beings, then you would oppose the rule even more strongly. Only a suicide candidate might want that, but that does not count as a reasonable volition.
In sum, chemical weapons research and development is morally wrong according to all major ethical theories, all of which were well known during WWI. This includes not only the synthesis of new poisons but also the research and development of effective deployment methods in the form of actual weapons. All chemists who contributed to that during WWI and thereafter failed morally.
Note that you do not need to be a pacifist, as some have argued (Kovac 2013), to accept this conclusion. Even if you are willing to support the use of weapons under certain circumstances, you can strictly disagree with weapons research in general for ethical reasons.
Since WWI, chemical weapons researchers have offered various excuses that, strangely enough, have become popular vindications for moral wrongdoing in science. Because their ethical refutations are less well known, it is worthwhile to point out the misconceptions underlying the thirteen most common excuses.
Surely most weapons researchers felt some obligation during their work, including patriotism and commitment to their research unit, and thus considered it only right to fulfill their duties. However, as has been shown above, a "moral duty to my nation" (or to any other group) is a contradiction in terms, because morality implies impartiality whereas patriotism includes a nationalistic bias. This position thus confounds patriotism with ethics. You cannot take a holiday from ethics, not even in wartime.
This most common misunderstanding in science takes weapons research to be ethically neutral, as if only the deployment were to be blamed. However, the assumption is wrong by any ethical theory. Everybody is co-responsible for the consequences of one's actions. Hence, the creators of new weapons are also co-responsible for any future deployment of their creations, because such deployments are consequences of their actions. There is no ethical theory that would allow for an exemption or excuse.
Even though there might be situations where the deployment of a weapon prevents greater harm than it causes, the argument confuses research with deployment. It focuses on one specific deployment as the consequence of one's research but neglects all future uses and misuses of the weapon, which must be considered in an ethical assessment too. This is the standard form of moral naivety that neglects the unintended but easy-to-foresee consequences.
This more sophisticated form of the previous excuse refers to Just War theories, according to which a particular war can be morally justified under certain conditions. Since these conditions include several elaborations of the Hague Conventions and the prohibition of weapons of mass destruction (i.e. nuclear, biological, and chemical weapons), the argument is pointless for chemical weapons. In general, Just War theories are irrelevant to the moral assessment of weapons research, unless all possible hostile parties, including terrorists, will always comply with those rules, which is more than unlikely. Just War theories can only be applied to very particular war situations, whereas weapons researchers increase the arsenal of weaponry for any party in any future war.
After WWI many chemists, including Haber, argued that chemical weapons are more humane because of their lower death toll compared to other weapons.[9] On the one hand, it is impossible to calculate and compare the degree of humanity of different weapons. For instance, is killing slowly over years more humane than killing fast? Moreover, during WWI and thereafter chemical weapons employed ever more toxic substances that were dropped or sprayed from airplanes to kill anyone living beneath. This indiscriminate, uncontrolled effect made them weapons of mass destruction, like biological and nuclear weapons. On the other hand, the argument again confuses weapons research with deployment. The research and development of any new weapon adds another tool for killing and harming people and is morally wrong as such, regardless of what other weapons already exist.
If chemists produce chemical weapons that existed before, they do no original research and development but only production work, so the arguments above do not apply. However, as producers they are co-responsible for the use of the stockpile they create, even if they have no control over its particular deployment. And they help their country commit war crimes, for which they are co-responsible. If, on the other hand, they develop new chemical weapons for retaliation, then they engage in morally wrong research and, even worse, contribute to an arms race that develops ever more devastating weapons.
In times of war the enemy is usually blamed by downplaying one's own activity as purely defensive and exaggerating that of the enemy as aggressive. This creates the dangerous constellation of an arms race, in which each further step is justified as an allegedly defensive or responsive measure. Systemic forces seem to take over the responsibility of the individuals. However, systemic forces cannot assume ethical responsibility. Only individuals can be held ethically responsible, either alone or as members of a group who share the responsibility, such as a weapons research team or all chemists involved on either side. Pointing to systemic forces is therefore no moral excuse but a way of shifting responsibility to an abstract entity. Moreover, using this argument to justify ad hoc research in an arms race once again confuses the different responsibilities of weapons research and deployment.
A frequent excuse tries to downplay one's own role by arguing that one was a replaceable actor: my refusal would have made no difference, since others would have done the work in my position, such that the consequences were unavoidable. What at first glance looks like a moral argument is actually not one. Imagine a man lying unconscious on the street with money in his hand. You steal his money, thinking, "If I don't do it, somebody else will." If you were tried for the crime, this excuse would make no impression because it is morally irrelevant to the judgment of your culpability in committing a robbery. Pointing to other possible criminals neither diminishes one's responsibility nor excuses wrongdoing, be it robbery or weapons research.
Many weapons researchers have tried to diminish their responsibility in retrospect, arguing that they had no other choice and were forced or ordered to do the work. However, there is not a single reported modern case of forced research; it is even questionable whether creative research is possible under coercion. The social pressure on weapons researchers is usually no different from that on any other employee. Leaving a weapons program might bring some disadvantage, for example in one's research career or earnings. But that does not count as a moral excuse, just as the need for money does not excuse a thief. The question is rather: why does somebody get involved in such a research program in the first place?
The Chemical Weapons Convention permits the small-scale production of highly toxic substances for medical, pharmaceutical, and protective research (see above). However, that permission can easily be abused. First, as we have seen above, protective devices such as special gas mask filters are part of offensive equipment: they allow attackers to use poisons while remaining protected themselves. Second, any new highly toxic substance that might be researched for some pharmaceutical effect is at the same time, by its very toxicity, a potential new chemical weapon. Thus, if you were the first to synthesize it, you are co-responsible for its possible military or terrorist abuse by anyone in the future.
As a weapons researcher you are supposed to know at least what anybody else knows: that knowledge about powerful weapons readily leaks out to the rest of the world, including terrorists. You might not have intended terrorists to use your weapon, but that is an unintended consequence of your research that is easy to foresee and for which you are co-responsible.
According to standard Cold War rhetoric, a war between two enemies becomes unlikely if both are equally equipped with weapons of mass destruction that are ready to be deployed in retaliation for any possible first strike by the other. Apart from the general deficits of this argument, it can hardly be applied to weapons research. The argument presupposes a balance of weapons; however, research on either side to create more sophisticated or more devastating weapons is an attempt to destroy exactly that balance, which triggers an arms race rather than creating stable conditions for peace.
The development of complex weapons systems requires a division of labor: various individuals or research groups each work on a small element of the entire system. If the project is secretly coordinated, it is possible that some researchers, particularly young scientists, are not aware of the overall goal of their individual work. However, such conditions hardly apply to chemical weapons research aimed at poisons or explosives, where the military purpose can hardly be missed. If, nonetheless, senior researchers trick young scientists into weapons research projects without their knowledge and consent, they commit a major ethical offence.
As we have seen in Section 2.5, many leading figures of chemical warfare research in WWI had excellent careers afterwards. Moreover, they received numerous Nobel Prizes and have been honored by the scientific community in the names of educational buildings, scientific institutes, and awards to the present day. No doubt they all made other important contributions to science that are worth commemorating. However, they also failed morally according to all major ethical theories. And many were honored not despite but because of their warfare engagement.
Using someone's name for a scientific institution or an award honors the person's integrity as a whole rather than a particular achievement. For instance, Germany's biggest research institute for physical chemistry is called the Fritz Haber Institute, not the Nitrogen Fixation Institute, because it honors and commemorates Haber's entire lifetime work beyond his contribution to nitrogen fixation. It singles out the person as an outstanding role model for a younger generation, to be admired and copied. How can this still be justified in today's world?
One could argue that Haber's scientific achievements and his unquestionable personal commitment to his employees outweigh his moral failure. But how does one balance these factors? Does not such a compromise deliver a dangerous message: that your scientific achievements can outweigh your moral failure? In the same vein one could honor the Nazi physiologists who, by brutal or lethal experiments on concentration camp prisoners, produced valuable physiological knowledge.
It seems more likely that many chemists take moral failures to be marginal and ignore them. For instance, biographies of WWI warfare chemists (except for Haber) written by fellow chemists typically omit their war engagement or mention it only in passing, as if nobody should know about it. They thereby miss the chance of engaging young chemists in the historical and ethical issues of their discipline and the chance to draw valuable lessons. Moreover, they further isolate chemistry from a civil society that learns about such topics from the public media. Keeping moral failure an open secret leaves the impression that the chemical community has not come to terms with ethics since WWI.
The story of poison gas in WWI is an instructive example of the academic-industrial-military-governmental complex: actors from different fields collaborated in a network with the aim of committing a war crime. Such complexes invite confusion about who is responsible for what. Thus the first task of an ethical analysis is to disentangle the network and define the primary responsibilities according to the different kinds of actions and decisions involved: scientists and engineers research and develop the new weapon; industry produces it; the government has the ultimate decision on its deployment; and the military decides when and how exactly it is deployed. In the case of Haber, parsing the responsibilities is particularly difficult because he held positions in virtually all fields of the academic-industrial-military-governmental complex. The next step consists in setting aside all non-ethical duties and commitments, such as those arising from patriotism, public pressure, business contracts, and local law. On the basis of an ethical theory one can then perform an ethical assessment of the individual contributions and, on a more advanced level, of the interactions of the actors.
In the case of weapons research, conceptual confusion abounds, particularly between ethics and patriotism and between research and deployment. On the basis of patriotism, weapons researchers have constructed a pseudo-moral legitimation for their work. And moral debates on weapons research have either made researchers responsible only for certain deployments or rejected any responsibility for deployment at all. Ethically, however, the creators of a new weapon are co-responsible for all subsequent uses and misuses of their creation, which they enable and whose overall harmful consequences they are supposed to foresee. There is no excuse of not knowing or not intending.
Since WWI, governments have employed or contracted scientists on a large scale to research new weapons that would soon spread worldwide. While politicians might feel responsible only for their own use of these weapons, they tempted scientists into becoming ethically responsible for all possible future uses and misuses of their creations. By upholding conceptual confusions about responsibilities, or by having a blind spot for weapons research, scientific societies have never adequately responded to that large-scale abuse of science (e.g. by condemning it in their codes of conduct).[10] This has made science, and chemistry in particular, suspect in the eyes of the public, and rightly so according to all major ethical theories. Obviously there are still important lessons to be learned from WWI.
I would like to thank Jeffrey Kovac, Tom Børsen, Tami Spector, and two anonymous referees for their useful comments.
There are numerous books on poison gas in WWI. Freemantle 2014 provides a broad and up-to-date view; Friedrich et al. 2017 is the latest anthology. Still worth reading is the classic Haber 1986, written by the son of Fritz Haber. For a comprehensive history of chemical warfare, see Tucker 2006. The best-researched Haber biography, with numerous valuable insights, is Szöllösi-Janze 1998; for a very short English essay, see Szöllösi-Janze 2017. Papers on the ethics of chemical weapons research typically confuse research with deployment; for an introduction see Kovac 2016 and Schummer 2001. Some of the standard excuses are dealt with in Ryberg 2003.
[1] For data, see the OECD database (http://stats.oecd.org).
[2] See http://avalon.law.yale.edu/19th_century/dec99-02.asp.
[3] See http://avalon.law.yale.edu/20th_century/hague04.asp#art23.
[4] Freemantle 2014, p. 197, on the industrial production see Johnson 2017.
[5] This section mainly draws on Szöllösi-Janze 1998 to which the following page references refer if not otherwise indicated.
[6] For the text and the dates of signatures and ratifications, see http://disarmament.un.org/treaties/t/1925.
[7] See https://www.opcw.org/chemical-weapons-convention/.
[8] The nine countries are Albania, India, Iraq, Japan, Libya, Russia, (presumably) South Korea, Syria, and the United States. Japan, which declared abandoned chemical weapons from WWII located in China, is also behind schedule. Syria declared the destruction complete in August 2014, after which numerous chemical weapons deployments have been confirmed.
[9] Note that during the war, Haber argued that chemical weapons are more humane because they would save lives by ending the war faster, meaning that they would bring a swift German victory by their devastating effect (Szöllösi-Janze 1998, p. 327).
[10] The only code that mentions chemical weapons at all is the one by the German Chemical Society, but it condemns only their production and not their research.
Freemantle, M.: 2014, The Chemists' War: 1914-1918, Cambridge: Royal Society of Chemistry.
Friedrich, B.; Hoffmann, D.; Renn, J.; Schmaltz, F. & Wolf, M. (eds.): 2017, 100 Years of Chemical Warfare: Research, Deployment, Consequences, Heidelberg: Springer.
Haber, L.F.: 1986, The Poisonous Cloud: Chemical Warfare in the First World War, Oxford: Clarendon.
Irwin, W.: 1921, The Next War, New York: Dutton.
Johnson, J.A.: 2017, Military-Industrial Interactions in the Development of Chemical Warfare, 1914-1918: Comparing National Cases Within the Technological System of the Great War, in: B. Friedrich et al. (eds.), 100 Years of Chemical Warfare: Research, Deployment, Consequences, Heidelberg: Springer 2017, pp. 135-149.
Kovac, J.: 2013, Science, Ethics and War: A Pacifist's Perspective, Science and Engineering Ethics, 19, 449-460.
Kovac, J.: 2016, Ethics of Chemical Weapons Research, Bulletin for the History of Chemistry, 41, 56-63.
OPCW: 2016, Report of the OPCW on the Implementation of the Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on Their Destruction in 2015 [available online at https://www.opcw.org/documents-reports/annual-reports/, accessed 5 October 2017].
Rauchensteiner, M.: 2014, The First World War and the End of the Habsburg Monarchy, 1914-1918, Wien: Böhlau.
Ryberg, J.: 2003, Ethics and Military Research: On the Moral Responsibility of Scientists, in: B. Booss-Bavnbek, & J. Høyrup (eds.), Mathematics and War: Basel: Birkhäuser/Springer, pp. 352-364.
Schummer, J.: 2001, Ethics of Chemical Synthesis, Hyle: International Journal for Philosophy of Chemistry, 7, 103-124.
Schummer, J.: 2019, Art and Representation, in P. Morris (ed.), A Cultural History of Chemistry, vol. 6, London: Bloomsbury, forthcoming.
Szöllösi-Janze, M.: 1998, Fritz Haber (1868-1934): Eine Biographie, München: Beck.
Szöllösi-Janze, M.: 2017, The Scientist as Expert: Fritz Haber and German Chemical Warfare during the First World War and Beyond, in: B. Friedrich et al. (eds.), 100 Years of Chemical Warfare: Research, Deployment, Consequences, Heidelberg: Springer, pp. 11-23.
Tucker, J.B.: 2006, War of Nerves: Chemical Warfare From World War I to Al-Qaeda, New York: Pantheon.
Van Der Kloot, W.: 2004, April 1915: Five future Nobel Prize-winners inaugurate weapons of mass destruction and the academic-industrial-military complex, Notes and Records of the Royal Society, 58, 149-160.
Zecha, W.: 2000, Unter die Masken!: Giftgas auf den Kriegsschauplätzen Österreich-Ungarns im Ersten Weltkrieg, Wien: öbv.
Joachim Schummer:
Berlin, Germany; js@hyle.org