9

Daniel Kuebler

The current communal scientific enterprise is often associated with a specific set of scientific virtues including, among others, honesty, objectivity, skepticism, curiosity, meticulousness, fortitude, and collegiality. These virtues are seen as essential for the maintenance and continual development of the practice of modern science.[1] However, the historical and social context in which a scientist is embedded can alter the perceived value of these virtues and make it more or less difficult to put them into practice.[2] The modern landscape of scientific research, with its hypercompetitive environment for government funds and academic positions[3] as well as its growing ties between industry and academia, has made the maintenance of these virtues more challenging.

The increasing number of academic fraud cases[4] as well as the documented difficulties of reproducing published results[5] are just two troubling issues that have emerged from this environment. The scientific community has attempted to address these issues in a variety of ways,[6] but the remedies tend to focus on legalistic and policing solutions rather than examining how best to cultivate virtuous habits.[7] While training that stresses normative rules and accountability can influence behavior, it has its limits. The difficulty of policing the rapidly expanding body of scientific research means that the risks associated with unethical behavior are low relative to the perceived benefits individual scientists can acquire in terms of money and career advancement. In such an environment, explaining how to be ethical in research and describing the consequences of getting caught do not necessarily help develop the requisite internal rationale for why one should behave in a virtuous manner. Without an appreciation of the intrinsic value of virtuous behavior,[8] the temptation to cut corners can prove too alluring. New systems are needed to address these challenges. In particular, there is a need to develop curricula that focus on the intrinsic value of living a virtuous life and to implement them long before individuals embark upon a scientific career.[9] In addition, there is a need to explore different financial models that can alleviate the considerable pressure on practicing scientists to act unethically.

While the practice of science in the late twentieth and early twenty-first centuries has created unique challenges, it is important to recognize that the practice of science has always been influenced, for better or worse, by cultural and social pressures and norms. While some believe that prior to the twentieth century, scientists tended to be dispassionate intellectuals, concerned only with the pursuit of knowledge and unencumbered by ego and reputation, the history of science proves otherwise. Galileo’s classic treatise The Assayer is a case in point. While today it is widely regarded as a pivotal work outlining the modern understanding of science and the scientific method, it was largely written to belittle the views of the Jesuit astronomer Orazio Grassi. Grassi, unlike Galileo, correctly believed that comets were distant astronomical objects moving beyond the moon. Rather than producing a dispassionate treatise on astronomy, Galileo used the text to lash out at Grassi and his views. As Richard Westfall put it, The Assayer is “one of the all-time masterpieces of sarcastic invective,”[10] while Dom Paschal Scotti has described the book’s author as “a querulous old man more interested in scoring debating points than enjoying truth for its own sake.”[11] Despite the brilliance of its contents, the book is clearly not an example of the contemporary scientific virtues of objectivity, skepticism, curiosity, or collegiality.

As long as science remains a human enterprise, the pride and envy on display in The Assayer will continue to plague the practice. However, in addition to these age-old human vices, practitioners of science in the twenty-first century must operate under unprecedented levels of financial pressure. Whether it is attracting government research dollars amidst growing competition, publishing papers to secure and hold an academic position, holding onto intellectual property, or maintaining industry collaborations, financial concerns have come to dominate the scientific landscape.

This is not to deny that monetary concerns have always impacted the practice of science. For example, in Galileo’s time, astronomy was essential for the refinement of calendars and the development of navigational tools. This made the study of astronomy extremely valuable to both church and state leaders, the leading patrons of science at the time. In addition, scientists were subject to the dictates of their financial patrons. As Galileo lamented while employed by the Venetian republic, “It is impossible to obtain wages from a republic, however splendid and generous it may be, without having duties attached.”[12] Replace the word “republic” with “university,” and Galileo’s line would not seem out of place if spoken by a faculty member at a modern R1 university.

Despite this, the fact that until the twentieth century a significant percentage of scientists were either financially independent, well-supported by patrons, or clergymen meant that financial issues had a fundamentally different impact on the scientific research of the time. While researchers had to be responsive to their patrons or the church, many had the freedom to pursue topics of their choosing without financial concerns. Darwin, for instance, never had to worry about his next paycheck, which afforded him ample time to spend researching and writing about the life of worms.[13]

Such a situation stands in stark contrast to the financial pressures facing today’s academic scientists. First, there are far more scientists being trained than there are academic positions available. Given that a single faculty member may train twenty or more Ph.D students during his or her career, unless there is an exponential increase in faculty positions, this scenario will continue to produce a glut of well-qualified Ph.Ds on the market. If the number of positions merely held steady, simple arithmetic dictates that only about one trainee in twenty could ever inherit a faculty post. In fact, studies have found that only 20% of recent biomedical Ph.D graduates have ended up in academic positions, and the average age at which they find their first tenure-track job is 37.[14]

Even if one does land a faculty position, the financial pressures do not necessarily ease, as the prospects of acquiring and maintaining grant funding are equally daunting. While the NIH grant funding rate was near 30% through the early 2000s, it has been stuck at or below 20% since 2009, hovering around 19% for the last four years even as the absolute number of funded NIH grants has seen a modest increase.[15]

A PNAS article authored by four eminent scientists, three of whom are members of the National Academy of Sciences, summed up the adverse effects this hypercompetitive environment is having on science:

As competition for jobs and promotions increases, the inflated value given to publishing in a small number of so-called ‘high impact’ journals has put pressure on authors to rush into print, cut corners, exaggerate their findings, and overstate the significance of their work. Such publication practices, abetted by the hypercompetitive grant system and job market, are changing the atmosphere in many laboratories in disturbing ways. The recent worrisome reports of substantial numbers of research publications whose results cannot be replicated are likely symptoms of today’s highly pressured environment for research.[16]

In addition to what has been called the “replication crisis” in science[17] (although there is considerable debate over whether this is an actual crisis or merely a normal by-product of doing large amounts of science in a complex world),[18] there has been an increase in scientific papers being retracted for fraud, suspected fraud, plagiarism, and duplicate publication over the past twenty years.[19] Some of this rise is attributable to the growing number of publications coupled with the additional scrutiny that institutions and journals have implemented recently. However, even when controlling for these factors, a recent study on retractions concluded that the rise is at least in part due to changes in the behavior of individual authors.[20]

Another study on scientific retractions found that the majority were the result of scientific misconduct, with fraud (43%) leading the way, followed by duplicate publication (14%) and then plagiarism (10%).[21] The authors of this study found that only 21% of retractions were due to error. A study investigating retractions in cancer research came to a similar conclusion, finding both a marked increase in the percentage of cancer research papers retracted over the past twenty years and a high percentage, over 60%, retracted for research misconduct.[22]

What is even more troubling is that, if survey results are to be believed, most cases of data manipulation and fraud likely go undetected. In a 2012 study of scientists who had published in or peer reviewed for the journal BMJ, 13% indicated they had knowledge of their colleagues “inappropriately adjusting, excluding, altering, or fabricating data” in order to publish.[23] A similar meta-analysis published in PLoS One found that an even higher percentage of scientists, roughly 20%, had witnessed questionable research practices. In addition, 2% of respondents admitted to fabricating data themselves, and 10% admitted to other questionable practices such as plagiarism.[24]

The willingness of scientists to deliberately fabricate data is deeply troubling.[25] In one study of University of California-San Francisco post-doctoral fellows, 17% indicated a willingness “to select or omit data to improve their results.”[26] Another study found that 81% of biomedical research trainees at the University of California-San Diego indicated a willingness to do the same in order to gain grant funding or publish a paper.[27]

Given the discrepancy between the number of papers retracted for fraud and the percentage of scientists who admit to having manipulated data, it appears that the vast majority of fraud and data manipulation cases go undetected. If the survey data are accurate, a not insignificant portion of scientists believe that the risks associated with unethical behavior—the likelihood of getting caught—are low enough relative to the perceived benefits in career advancement and financial gain to make cutting corners seem worthwhile.

Another factor impacting this willingness to selectively modify data is the amount of industry money influencing scientific research. With diminishing government funds to support research, many investigators are turning to industry collaborations.[28] While there is nothing intrinsically wrong with this type of arrangement, incentives to rush a publication or to withhold publication for financial gain can be hard to resist. One study of eight hundred biotech faculty found that 47% had performed industry consulting work, 25% had received industry grants, and, even more concerning, 8% had an ownership stake in a company associated with their research.[29] Another study found that among the department chairs at medical schools and teaching hospitals, 60% had relationships with industry, with 27% serving as consultants, another 27% sitting on scientific advisory boards, 14% being paid speakers, 11% being board members, 9% being founders, and 7% being company officers.[30]

While conflict of interest disclosures are now standard when giving scientific presentations or publishing papers, this level of influence raises significant concerns about data integrity, due to both conscious and subconscious bias. In The Scientific Life: A Moral History of a Late Modern Vocation, Steven Shapin summed up the biggest issues associated with these collaborations:

From the 1970s, concerns were…expressed that the intrusion into the university of commercial considerations and commercial ties would lead to a wall of secrecy where once there had been an unchallenged commitment to openness. Others feared for the objectivity of science. Scientists would, it was thought, produce not Truth but the results wanted by their sponsoring commercial concerns. Commercial sponsorship or subvention might be the condition for certain research programs being carried out at all, and so academics with conflicts of interest had the motive to produce biased knowledge.[31]

This is not to implicate all industry/academic collaborations, which offer potential benefits such as the synergistic use of resources and expertise. However, the money involved certainly raises concerns regarding objectivity.

From the lack of academic positions to the reduction of available grant funding to the influence of industry money, the current academic scientific landscape tends to work against the ethical practice of science. While effectively reversing this trend will require a multi-pronged approach, any attempt to reinvigorate a communal sense of the scientific virtues among academic scientists would help address the problem. The implementation of modified responsible conduct of research (RCR) training represents one option to do this. In many cases, RCR training simply focuses on sets of rules and regulations that are necessary for the effective practice of science.[32] However, this type of training is unlikely to work in the current environment, as it does little to foster an inherent motivation for ethical behavior. It may provide an external rationale, the fear of punishment by regulatory or university ethics committees, but when the likelihood of being caught is low, particularly for cases of subtle data manipulation, the external imposition of rules and regulations provides little motivation or incentive. In fact, short of repeating every experiment that is submitted, it is virtually impossible for reviewers or funding agencies to police all cases of data manipulation or fraud. As a result, rules-based training and a reliance on enforcement hold little promise for curbing the current problems.

With such a lopsided risk/reward calculus, any RCR program that can aid in developing an internal motivation for ethical behavior is key. A promising option on this front is the virtue-based RCR training described by Robert Pennock and Michael O’Rourke.[33] In virtue-based programs, an emphasis is placed on the benefits that accrue to individuals and the scientific community when one lives out the scientific virtues, the character traits that allow the practice of science to flourish. As I mentioned in the introduction, while there is no universally agreed-upon list of scientific virtues, they typically include such things as honesty, objectivity, skepticism, curiosity, meticulousness, and fortitude.

If scientists take the time during RCR training to reflect on how certain behaviors, for example, transparency, objectivity, and honesty, are critical for the goals of their chosen discipline, they are much more likely to generate an internal motivation to act ethically. They do so not out of fear of consequences or repercussions, but out of a desire to advance their disciplines and to be exemplary practitioners within them. Such internal motivations, though, are much more likely to develop if this training comes from a mentor or expert within the field who is recognized as an exemplary practitioner of science. Such role models are sorely needed and can have a huge impact. As Prof. Dr. Hanno Würbel, Director of the University of Bern’s Division of Animal Welfare, points out:

Role models, not guidelines, nourish and sustain the lack of transparency that has created a crisis of confidence in research. As a young scientist, you do not plan to fish for data that confirms a premise and then publish irreproducible study results upon which others build new experiments. You go into a laboratory and watch what more senior people do; listen to the language; observe practices, learn about the importance to your colleagues and the institution—and career—of multiple high-profile publications. Young scientists do not set out to harm science; they just learn the unwritten rules.[34]

This is similar to Pennock and O’Rourke’s proposed exemplar approach “centered on exemplary persons who embody the relevant virtues.” However, they advocate using famous scientists (Darwin, McClintock, Feynman, Einstein) who displayed key virtues as exemplars. While this can be helpful, even Pennock and O’Rourke acknowledge that the training would be more impactful if a real-life role model who is seen as invested in the virtue-based approach leads these reflections.[35] While famous scientific exemplars can be influential, when real-world dilemmas inevitably arise, such figures can too often be discounted as quaint practitioners out of touch with the unique challenges facing modern scientists. Incorporating real, embodied exemplars who work in the same university, building, or lab can have a more profound impact on the career of a trainee and the decisions he or she might make. It also affords the possibility of repeated interactions and mentoring opportunities when young scientists are faced with critical decisions.

While scientific virtue-based RCR training has benefits over rules-based methods, it does have its limitations and is not the only tool needed to help reinvigorate the practice of the scientific virtues. Even Pennock and O’Rourke, who advocate this type of approach, recognize its limits: “We do not claim that presenting a set of scientific virtues will be sufficient in and of itself to produce ethical behavior in science… [O]ne cannot side-step the practical and political complexities that researchers must confront in messy real-world circumstances as well as external pressures that can threaten even core scientific values.”[36]

It is worth emphasizing two specific pressures/real-world circumstances that must be addressed in order to cultivate the scientific virtues within twenty-first century scientific practice. The first has to do with factors affecting the culture as a whole. While there are a number of issues internal to scientific practice that can be seen as drivers of the increased number of fraud and scientific malpractice cases, this increase has roughly coincided with an increase in a variety of white-collar crimes. The US Federal Trade Commission’s Consumer Sentinel Network collected a total of 3,083,379 consumer complaints in 2015, an 850% increase since the network began reporting in 2001. Likewise, a recent white-collar crime victimization study (NW3C’s 2010 National Public Survey on White Collar Crime) found that 24.2% of American households reported experiencing at least one form of white-collar crime, defined to include credit card fraud, price fraud, repair fraud, internet fraud, business fraud, securities fraud, and mortgage fraud.[37] Similarly, the level of health care fraud has increased over the past twenty years.[38]

While there is debate regarding how much of this increase is real and how much is an artifact of increased scrutiny, the incidence of white-collar fraud seems to be rising in parallel with cases of scientific fraud. The fact that fraud is a significant issue across a number of professional disciplines suggests that it is part of a larger societal problem that needs to be addressed long before the onset of scientific training. In fact, many have argued that waiting until college or graduate school to begin professional ethics training is too late.[39] Starting a virtue ethics curriculum at an early age may be more effective in the long run for a variety of reasons. First, it exposes students to these concepts at a formative age, when they are just starting to internalize ethical decision-making frameworks and address questions like “why should I act in a certain manner?” or “what is the end or goal of life?” Second, this early education approach provides a foundation upon which scientific virtue-based RCR training can later build. Repeated presentation of and engagement with these topics is more likely to have a lasting effect on behavior than one class or seminar in graduate school.

At present, a variety of programs have been developed and piloted to introduce either ethical theory in general or practical ethical decision making at the high school, middle school, and even grade school levels.[40] Some of these programs specifically focus on virtue ethics, while others look at a range of ethical theories. While these programs have met with mixed results, there is clear interest in developing materials for this age group, and much work remains to be done in identifying effective programs.[41]

While there is a need for this type of education early on, it should not cease once students reach the undergraduate level. A philosophy course that deals with ethical theories, particularly virtue ethics, should be required for all students entering scientific professions. Unfortunately, under distributed core requirements it is all too common for scientific practitioners to complete the entirety of their education without ever being exposed to a philosophy course, let alone an ethics course. Equally important, though, would be the inclusion of ethical components in courses across the science curriculum. The Society for Ethics Across the Curriculum has been holding conferences for the past twenty years looking at how to integrate ethics across all disciplines at the undergraduate and graduate levels. In fact, there appears to be a burgeoning effort to develop and implement programs that integrate ethics across a variety of different disciplines.[42] The key is to provide opportunities for students to reflect upon virtue ethics within their field and apply what they have learned from ethics courses. This can facilitate the development of an appreciation of the intrinsic value of virtuous behavior, something that is learned over time through repeated reflection, interaction, and examples. The hope is to transform ethics/virtue training from a hurdle one must clear into a default manner of thinking about and viewing the various complex situations one encounters in science. Having ethics/virtue training as part of the curriculum from high school onward would help foster this type of environment.

This leads into the second major issue that needs to be addressed if scientific virtue-based RCR training is to be effective: the financial model of twenty-first century academic science. While one may agree that certain virtuous behaviors are essential for the practice and advancement of science, when one’s livelihood is on the line, one’s moral calculus has a tendency to become much more malleable. Translating ethical ideals into practice in the hypercompetitive world of modern science is fraught with compromise. Some of this pressure could be alleviated if different models of financial compensation and funding were explored. Many authors have proposed alterations to the grant funding system in order to address this issue. Some have advocated for a random allocation of awards to applicants who meet a certain scientific threshold.[43] Others have advocated for everything from simplified application systems, to equal funding of eligible scientists, to a peer-voting system that would allow researchers to increase their relative share of grant funding.[44] Beyond changes to the grant system, others have advocated for changes to the academic hiring system through the creation of additional permanent staff research positions rather than trapping individuals in low-pay, long-term post-doc positions.[45]

While these ideas have their merits, they do not address one of the biggest financial issues associated with biomedical research: the need for faculty members to garner “soft money” (that is, grants) to sustain their salaries. Currently, a significant portion of a researcher’s salary at large research institutions is funded by grants, rather than by the institution. This precarious situation leaves scientists dependent upon grant funding, not only to support their research, but even to draw a full paycheck. As Paula Stephan pointed out in her book How Economics Shapes Science, universities “lease the facilities to faculty in [exchange for] indirect costs on grants and buyout of salary. In many instances, faculty ‘pay’ for the opportunity of working at the university, receiving no guarantee of income if they fail to bring in a grant. Those who land funding staff their labs with students enrolled in their department’s graduate program, or with postdocs.”[46] The students and postdocs are often paid out of the same grant, so that all are dependent on the PI’s continued success in the grant system.[47]

While an abrupt transition off the soft-money model would likely be too disruptive, a gradual transition toward the model that prevails in the humanities, where institutions are responsible for faculty salaries and soft money provides, instead, time and resources for research, would have numerous benefits. First, it would allow more grants to be funded, as salaries represent a significant component of most NIH grants. Second, it would give faculty a measure of financial stability. The major funding institutions, such as NIH and NSF, would have to champion this transition, as it runs counter to the research output metric that fuels the rankings of R1 universities. However, given that these government agencies largely hold the purse strings, they have some ability to incentivize universities to migrate in this direction.

Because this new model would likely reduce the number of faculty positions at research universities, other measures, such as limiting the number of graduate students or providing supplemental training, would have to be put in place in order to help reset the already distorted scientific labor market.[48] The most promising option would be to prepare, in a more deliberate fashion, the current glut of biomedical Ph.Ds for alternative careers in fields such as research administration, science policy, regulatory affairs, industry R&D, and science writing/advocacy. More formal training collaborations between industry, government, and academia would aid in transitioning Ph.Ds from the academy into government or industry labs, settings that operate under different conceptual orientations and implement more rigid laboratory practices than most academic labs. For example, designing Ph.D tracks that specifically prepare scientists for industry with good laboratory practice, good manufacturing practice, and business training would benefit both industry and the current crop of graduate students. Unfortunately, while there has been much talk regarding modifying graduate education to foster academia/industry/governmental collaborations, there is no general consensus on how best to do this in practice, particularly given that each entity is driven by its own distinct interests.[49]

Given the challenges, none of these proposed changes would be easy to implement, and all would face significant institutional resistance. Yet despite these hurdles, alternative models need to be considered, because the current funding pressures on academic scientists have become so pervasive and onerous that the virtues of objectivity, patience, skepticism, and meticulousness are very difficult to put into practice, regardless of the level of one’s internal motivation.

In conclusion, living out the virtues associated with good scientific practice has always been challenging, given the human component of the scientific endeavor. However, the hypercompetitive environment that academic scientists face in the twenty-first century, coupled with the influence of corporate money, has created a situation where new solutions are needed to help foster scientific integrity. Given that the risks associated with unethical behavior appear low relative to the perceived benefits individual scientists can acquire in terms of money and career advancement, there is a need to develop an internal motivation for scientists to act virtuously, even when doing so may compromise their professional advancement. Scientific virtue ethics training programs that incorporate real-world mentors should be considered, as they can foster the requisite internal rationale for virtuous behavior among young scientists. For this to be maximally effective, however, there is a need to expose students to virtue-based ethics long before graduate school and to explore different financial models that lessen the pressure on practicing scientists to act in a manner contrary to the scientific virtues.

DANIEL KUEBLER currently serves as the Dean of the School of Natural and Applied Sciences at Franciscan University of Steubenville and is the co-Director of the Franciscan Institute of Science and Health. His research examines the effects various biologics have on human adult mesenchymal stem cells as well as the use of bone marrow and placental tissue to treat orthopedic disorders. His publications include The Evolution Controversy (Baker Academic) as well as popular articles on science, politics, culture, and religion. He received a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley, and earned an M.Sc. in Cell and Molecular Biology from the Catholic University of America. He also holds a B.A. in English from the Catholic University of America.

Bibliography

  • Alberts, Bruce, Marc W. Kirschner, Shirley Tilghman, and Harold Varmus. “Rescuing US Biomedical Research From Its Systemic Flaws.” Proceedings of the National Academy of Sciences USA 111.16 (2014): 5773–77.
  • Baker, Monya. “Is There a Reproducibility Crisis?” Nature 533 (2016): 452–54.
  • Begley, C. Glenn, and Lee M. Ellis. “Raise Standards for Preclinical Cancer Research.” Nature 483 (2012): 531–33.
  • Bird, Stephanie J. “Mentors, Advisors and Supervisors: Their Role in Teaching Responsible Research Conduct.” Science and Engineering Ethics 7.4 (2001): 455–68.
  • Bollen, Johan, David Crandall, Damion Junk, Ying Ding, and Katy Börner. “From Funding Agencies to Scientific Agency: Collective Allocation of Science Funding as an Alternative to Peer Review.” EMBO Reports 15.2 (2014): 131–33.
  • Bozzo, Anthony, Kamil Bali, Nathan Evaniew, and Michelle Ghert. “Retractions in Cancer Research: A Systematic Survey.” Research Integrity and Peer Review 2 (2017): 1–7.
  • Campbell, Eric G., et al. “Institutional Academic-Industry Relationships.” JAMA 298.15 (2007): 1779–86.
  • Darwin, Charles. The Formation of Vegetable Mould, Through the Action of Worms, with Observations on Their Habits. New York: D. Appleton and Company, 1915.
  • Daston, Lorraine, and Peter Galison. Objectivity. Cambridge, MA: MIT Press, 2007.
  • Eastwood, Susan, Pamela Derish, Evangeline Leash, and Stephen Ordway. “Ethical Issues in Biomedical Research: Perceptions and Practices of Postdoctoral Research Fellows Responding to a Survey.” Science and Engineering Ethics 2.1 (1996): 89–114.
  • Englehardt, Elaine E., and Michael S. Pritchard, eds. Ethics Across the Curriculum—Pedagogical Perspectives. Basel, Switzerland: Springer, 2018.
  • Fanelli, Daniele. “How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data.” PLoS One 4.5 (2009): e5738.
  • ——. “Is Science Really Facing a Reproducibility Crisis, and Do We Need It To?” Proceedings of the National Academy of Sciences USA 115.11 (2018): 2628–31.
  • Fang, Ferric C., and Arturo Casadevall. “Research Funding: The Case for a Modified Lottery.” mBio 7.2 (2016): e00422–16.
  • Fang, Ferric C., R. Grant Steen, and Arturo Casadevall. “Misconduct Accounts for the Majority of Retracted Scientific Publications.” Proceedings of the National Academy of Sciences USA 109.42 (2012): 17028–33.
  • Galilei, Galileo. Discoveries and Opinions of Galileo. Translated by Stillman Drake. New York: Anchor Books, 1957.
  • Gülcan, Nur Yeliz. “Discussing the Importance of Teaching Ethics in Education.” Procedia–Social and Behavioral Sciences 174 (2015): 2622–25.
  • Ioannidis, John P.A. “Fund People Not Projects.” Nature 477 (2011): 529–31.
  • Kalichman, Michael W., and Paul J. Friedman. “A Pilot Study of Biomedical Trainees’ Perceptions Concerning Research Ethics.” Academic Medicine 67 (1992): 769–75.
  • Khurana, Rakesh. From Higher Aims to Hired Hands: The Social Transformation of American Business Schools and the Unfulfilled Promise of Management as a Profession. Princeton, NJ: Princeton University Press, 2007.
  • Krimsky, Sheldon. Science in the Private Interest: Has the Lure of Profits Corrupted Biomedical Research? Lanham, MD: Rowman & Littlefield, 2003.
  • Liska, Adam. “The Myth and the Meaning of Science as a Vocation.” Ultimate Reality and Meaning 28.2 (2005): 149–64.
  • Lo, Bernard, and Marilyn J. Field, eds. Conflict of Interest in Medical Research, Education, and Practice. Washington, DC: National Academies Press, 2009.
  • Niederjohn, M. Scott, Kim Nygard, and William C. Wood. “Teaching Ethics to High School Students: Virtue Meets Economics.” Social Education 72.2 (2009): 76–78.
  • Pellegrino, Edmund D. “Toward a Virtue-Based Normative Ethics.” Kennedy Institute of Ethics Journal 5.3 (1995): 253–77.
  • Pennock, Robert T. “Scientific Integrity and Science Museums.” Museums and Social Issues 1.1 (2006): 7–18.
  • ——, and Michael O’Rourke. “Developing a Scientific Virtue-Based Approach to Science Ethics Training.” Science and Engineering Ethics 23.1 (2017): 243–62.
  • Resnik, David B., Elizabeth Wager, and Grace E. Kissling. “Retraction Policies of Top Scientific Journals Ranked by Impact Factor.” Journal of the Medical Library Association 103.3 (2015): 136–39.
  • Scotti, Dom Paschal. Galileo Revisited: The Galileo Affair in Context. San Francisco: Ignatius Press, 2017.
  • Shapin, Steven. The Scientific Life: A Moral History of a Late Modern Vocation. Chicago: University of Chicago Press, 2009.
  • Steen, R. Grant, Arturo Casadevall, and Ferric C. Fang. “Why Has the Number of Scientific Retractions Increased?” PLoS One 8.7 (2013): e68397.
  • Steneck, Nicholas H. ORI Introduction to the Responsible Conduct of Research. Revised edition. Washington, DC: U.S. Government Printing Office, 2007.
  • Stephan, Paula. How Economics Shapes Science. Cambridge, MA: Harvard University Press, 2012.
  • Tavare, Aniket. “Scientific Misconduct Is Worryingly Prevalent in the UK, Shows BMJ Survey.” BMJ 344 (2012): e377.
  • Westfall, Richard S. Essays on the Trial of Galileo. Notre Dame, IN: University of Notre Dame Press, 1989.

  1. Robert T. Pennock and Michael O'Rourke, "Developing a Scientific Virtue-Based Approach to Science Ethics Training," Science and Engineering Ethics 23.1 (2017): 243–62; Robert T. Pennock, "Scientific Integrity and Science Museums," Museums and Social Issues 1.1 (2006): 7–18.
  2. Lorraine Daston and Peter Galison, Objectivity (Cambridge, MA: MIT Press, 2007).
  3. Bruce Alberts, Marc W. Kirschner, Shirley Tilghman, and Harold Varmus, "Rescuing US Biomedical Research From Its Systemic Flaws," Proceedings of the National Academy of Sciences USA 111.16 (2014): 5773–77.
  4. Ferric C. Fang, R. Grant Steen, and Arturo Casadevall, "Misconduct Accounts for the Majority of Retracted Scientific Publications," Proceedings of the National Academy of Sciences USA 109.42 (2012): 17028–33; Daniele Fanelli, "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLoS One 4.5 (2009): e5738.
  5. Monya Baker, "Is There a Reproducibility Crisis?," Nature 533 (2016): 452–54; C. Glenn Begley and Lee M. Ellis, "Raise Standards for Preclinical Cancer Research," Nature 483 (2012): 531–33.
  6. David B. Resnik, Elizabeth Wager, and Grace E. Kissling, "Retraction Policies of Top Scientific Journals Ranked by Impact Factor," Journal of the Medical Library Association 103.3 (2015): 136–39.
  7. Pennock and O'Rourke, "Developing a Scientific Virtue-Based Approach to Science Ethics Training," 243–62; Nicholas H. Steneck, ORI Introduction to the Responsible Conduct of Research, revised edition (Washington, DC: U.S. Government Printing Office, 2007).
  8. Edmund D. Pellegrino, "Toward a Virtue-Based Normative Ethics," Kennedy Institute of Ethics Journal 5.3 (1995): 253–77.
  9. Pennock and O'Rourke, "Developing a Scientific Virtue-Based Approach to Science Ethics Training," 243–62; Nur Yeliz Gülcan, "Discussing the Importance of Teaching Ethics in Education," Procedia–Social and Behavioral Sciences 174 (2015): 2622–25.
  10. Richard S. Westfall, Essays on the Trial of Galileo (Notre Dame, IN: University of Notre Dame Press, 1989).
  11. Dom Paschal Scotti, Galileo Revisited: The Galileo Affair in Context (San Francisco: Ignatius Press, 2017).
  12. Galileo Galilei, Discoveries and Opinions of Galileo, translated by Stillman Drake (New York: Anchor Books, 1957).
  13. Charles Darwin, The Formation of Vegetable Mould, Through the Action of Worms, with Observations on Their Habits (New York: D. Appleton and Company, 1915).
  14. Biomedical Research Workforce Working Group Report (Bethesda, MD: National Institutes of Health, 2012).
  15. Mike Lauer, "FY 2017 by the Numbers," March 7, 2018, https://nexus.od.nih.gov/all/2018/03/07/fy-2017-by-the-numbers/.
  16. Alberts, Kirschner, Tilghman, and Varmus, "Rescuing US Biomedical Research From Its Systemic Flaws," 5773–77.
  17. Baker, "Is There a Reproducibility Crisis?," 452–54.
  18. Daniele Fanelli, "Is Science Really Facing a Reproducibility Crisis, and Do We Need It To?," Proceedings of the National Academy of Sciences USA 115.11 (2018): 2628–31.
  19. Fang, Steen, and Casadevall, "Misconduct Accounts for the Majority of Retracted Scientific Publications," 17028–33; Aniket Tavare, "Scientific Misconduct Is Worryingly Prevalent in the UK, Shows BMJ Survey," BMJ 344 (2012): e377; R. Grant Steen, Arturo Casadevall, and Ferric C. Fang, "Why Has the Number of Scientific Retractions Increased?," PLoS One 8.7 (2013): e68397.
  20. Steen, Casadevall, and Fang, "Why Has the Number of Scientific Retractions Increased?"
  21. Fang, Steen, and Casadevall, "Misconduct Accounts for the Majority of Retracted Scientific Publications," 17028–33.
  22. Anthony Bozzo, Kamil Bali, Nathan Evaniew, and Michelle Ghert, "Retractions in Cancer Research: A Systematic Survey," Research Integrity and Peer Review 2 (2017): 5.
  23. Tavare, "Scientific Misconduct Is Worryingly Prevalent in The UK, Shows BMJ Survey."
  24. Fanelli, "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data."
  25. Fanelli, "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data."
  26. Susan Eastwood, Pamela Derish, Evangeline Leash, and Stephen Ordway, “Ethical Issues in Biomedical Research: Perceptions and Practices of Postdoctoral Research Fellows Responding to a Survey,” Science and Engineering Ethics 2.1 (1996): 89–114.
  27. Michael W. Kalichman and Paul J. Friedman, “A Pilot Study of Biomedical Trainees' Perceptions Concerning Research Ethics,” Academic Medicine 67 (1992): 769–75.
  28. Committee on Conflict of Interest in Medical Research Education and Practice, "Conflicts of Interest in Biomedical Research," in Conflict of Interest in Medical Research, Education, and Practice, edited by Bernard Lo and Marilyn J. Field (Washington, DC: National Academies Press, 2009).
  29. Sheldon Krimsky, Science in the Private Interest: Has the Lure of Profits Corrupted Biomedical Research? (Lanham, MD: Rowman & Littlefield, 2003); Adam Liska, "The Myth and the Meaning of Science as a Vocation," Ultimate Reality and Meaning 28.2 (2005): 149–64.
  30. Eric G. Campbell et al., "Institutional Academic-Industry Relationships," JAMA 298.15 (2007): 1779–86.
  31. Steven Shapin, The Scientific Life: A Moral History of a Late Modern Vocation (Chicago: University of Chicago Press, 2009).
  32. Steneck, ORI Introduction to the Responsible Conduct of Research.
  33. Pennock and O'Rourke, "Developing a Scientific Virtue-Based Approach to Science Ethics Training," 243–62.
  34. Helen Kelly, "Rigor and Transparency in Biomedical Research: How the NIH Is Taking No Prisoners," Laboratory Equipment, May 20, 2017, https://www.laboratoryequipment.com/article/2017/05/rigor-transparency-biomedical-research.
  35. Stephanie J. Bird, "Mentors, Advisors and Supervisors: Their Role in Teaching Responsible Research Conduct," Science and Engineering Ethics 7.4 (2001): 455–68.
  36. Pennock and O'Rourke, "Developing a Scientific Virtue-Based Approach to Science Ethics Training," 243–62.
  37. Rodney Huff, Christian Desilets, and John Kane, "National Public Survey on White Collar Crime, 2010" (Fairmont, WV: National White Collar Crime Center, 2010), https://www.nw3c.org/docs/research/2010-national-public-survey-on-white-collar-crime.pdf?sfvrsn=8.
  38. "Prosecutions of Health Care Fraud Law Reach New High in FY 2013," TRACReports, January 14, 2014, https://trac.syr.edu/whatsnew/email.140114.html.
  39. Rakesh Khurana, From Higher Aims to Hired Hands: The Social Transformation of American Business Schools and the Unfulfilled Promise of Management as a Profession (Princeton, NJ: Princeton University Press, 2007).
  40. Nancy Matchett and Mark Overmeyer, "Youth Ethics Series Curriculum," 2011, http://www.coloradohumanities.org/sites/default/files/youth_ethics_series_curriculum.pdf; R. Peeler, "Children and the Development of Ethical Decision-Making," May 1, 2015, http://rockethics.psu.edu/this-is-the-rock/news/children-and-the-development-of-ethical-decision-making; James Arthur, Tom Harrison, Emily Burn, and Francisco Moller, "Schools of Virtue: Character Education in Three Birmingham Schools" (Birmingham, UK: Jubilee Centre, 2017), https://www.jubileecentre.ac.uk/userfiles/jubileecentre/pdf/Research Reports/SchoolsOfVirtueResearchReport.pdf; M. Scott Niederjohn, Kim Nygard, and William C. Wood, "Teaching Ethics to High School Students: Virtue Meets Economics," Social Education 72.2 (2009): 76–78.
  41. Niederjohn, Nygard, and Wood, "Teaching Ethics to High School Students: Virtue Meets Economics," 76–78.
  42. Elaine E. Englehardt and Michael S. Pritchard, eds., Ethics Across the Curriculum—Pedagogical Perspectives (Basel, Switzerland: Springer, 2018).
  43. Ferric C. Fang and Arturo Casadevall, "Research Funding: The Case for a Modified Lottery," mBio 7.2 (2016): e00422–16.
  44. Johan Bollen, David Crandall, Damion Junk, Ying Ding, and Katy Börner, "From Funding Agencies to Scientific Agency: Collective Allocation of Science Funding as an Alternative to Peer Review," EMBO Reports 15.2 (2014): 131–33; John P.A. Ioannidis, "Fund People Not Projects," Nature 477 (2011): 529–31.
  45. Muhammad Z. Ahmed, "Opinion: The Postdoc Crisis," The Scientist, January 4, 2016, https://www.the-scientist.com/opinion/opinion-the-postdoc-crisis-34259.
  46. Paula Stephan, How Economics Shapes Science (Cambridge, MA: Harvard University Press, 2012).
  47. Beryl Lieff Benderly, "Academia’s Crooked Money Trail," Science, January 6, 2012, https://www.sciencemag.org/careers/2012/01/academias-crooked-money-trail.
  48. Alberts, Kirschner, Tilghman, and Varmus, "Rescuing US Biomedical Research From Its Systemic Flaws," 5773–77.
  49. Beryl Lieff Benderly, "Fraying Ties among Academia, Industry, and Government Hurt Scientists and Science," Science, December 6, 2017, https://www.sciencemag.org/careers/2017/12/fraying-ties-among-academia-industry-and-government-hurt-scientists-and-science?r3f_986=https://www.google.com/.
