Science Careers Blog

Dan Albert

Medical students are selected as much for their character as for their knowledge. The trait most valued (or that should be the most valued) is empathy. Ironically, studies show an erosion of empathy during medical school. Why does this happen, and what can we do about it?

During medical school orientations just a few decades ago, it was common for the Dean or another senior speaker to say to the assembled freshman class: "Look to the right of you...look to the left of you... [and in solemn tones]...in four years one of you three will not be here." Happily for today's medical students, dire threats and gloomy predictions have long since become unacceptable, and medical schools strive for their students' success. However, in many ways the transition from undergraduate to medical student is more challenging today than ever. Medical schools have a complex selection process that carefully vets applicants and admits only students it judges to have the intellect and ability to succeed. Still, in each class some medical students struggle, particularly in the first 2 years. Why do some students do poorly or even fail and what should new or prospective medical students do to raise their odds of success?

A major reason medical school is challenging is that students are exposed to a new way of learning, which differs from the methods of most undergraduate programs in two ways:

  1. Case-based learning largely replaces the conventional didactic lectures.
  2. Learning is centered on groups of medical students working as teams.

This is a radical departure from the "Lone Ranger" method of individual learning that undergraduates are used to. In the team system, medical students work together to tackle problem sets. They teach and learn from each other with faculty input and supervision. The team approach, combined with case-based learning, has proven pedagogically superior; working in teams stimulates learning and increases retention (see references 1 and 2 below). Another advantage is that while individual accountability and responsibility remain essential, medical care increasingly depends upon teams of caregivers; if you don't learn to work in teams in medical school, where will you learn this skill?
 
Another reason students struggle is that the pace of learning in medical school is much faster than what they're used to. There's more material to master in less time. Medical knowledge grows rapidly, so each new class has more to learn. The knowledge needed to be a physician is voluminous and complex, often requiring intensive concentration and study to be fully understood. Effective teamwork and case-based learning facilitate the learning process--but they also call for flexibility, adaptability, and maturity beyond what is needed to excel as an undergraduate. Sub-par performance results in remedial work; weak students may even need to repeat the school year.
 
So how does one assure success in making this transition? There is no single answer, no magic rule. However, here are 10 suggestions that address the major obstacles to success:
 
  1. Make the necessary emotional and psychological adjustment to deal with 4 extremely tough academic years. Yes, this is within your control. Get used to having less time for family and friends, recreation and social life.
  2. The summer before you start medical school, obtain a reading list and possibly a textbook or two. Get a head start on your work and adjust your frame of mind towards serious study.
  3. Once medical school has started, take an engaged and active role in your teams and study group. When working in a group, there's a tendency to focus on your own contributions more than those of your teammates. Avoid it; you need to know all the material, so focus on the contributions of others at least as much as you focus on your own. 
  4. Limit distractions. There is no end of attractive opportunities--committees on class affairs, clinics for indigent patients, community teaching, and other laudable efforts--but school work comes first.
  5. Think realistically about ways of incorporating research into your medical school experience. If you affiliate with a laboratory, make a commitment that does not encroach too much on your medical class work. If you have a passion for research, consider taking a year off to work in a lab. In addition, many schools with M.D.-Ph.D. programs will consider allowing interested and qualified students to transfer into the program through the end of their second year. This is an attractive opportunity for some.
  6. Although exam scores and grades are usually not entered on your official transcript until the 2nd or even 3rd year, pay close attention to how you are doing on quizzes and exams, and respond when there are indications that you may need to study more or get help.
  7. Familiarize yourself with and take advantage of mentoring and other academic support services. These are extensive, accessible, and well-organized at most institutions. Periodically review your progress and discuss any concerns with your assigned mentor.
  8. Medical schools are not monasteries or cloisters. Close friendships and relationships develop and can be rewarding--but maintain stability in your personal life.
  9. Don't spend time and mental energy worrying about future decisions such as what specialty to select or where you will do your postgraduate training. The curriculum is designed to give you the experience and information you need to make a carefully considered decision regarding specialty choice, residency, and fellowship--at the appropriate time.
  10. Eat well, sleep sufficiently, and exercise regularly. The ancient Roman poet Juvenal was right--a sound mind in a sound body is what it takes.

It cannot be stressed enough that the first two years of medical school provide the foundation for the knowledge you will need throughout your medical career. Not only your success as a medical student, but also the health and well-being of your future patients, depend on the preparedness, commitment, and hard work you bring to bear during those first two years. Make them count!
 
References
1. Michaelsen, L., and B. Richards. 2005. Drawing conclusions from the team-learning literature in health-sciences education: a commentary. Teaching and Learning in Medicine 17(1): 85-88.
2. Williams, B. 2005. Case based learning--a review of the literature: is there scope for this educational paradigm in prehospital education? Emergency Medicine Journal 22: 577-581. doi:10.1136/emj.2004.022707.

July 19, 2012

How You Write Matters

Recently our department received an unusual application for an ophthalmology residency. The applicant seemed impressive, a top student at a prestigious medical school with several publications and strong letters of recommendation. What made the application unusual was that the word "ophthalmology" was consistently misspelled throughout the application. I never had the opportunity to inquire how or why this happened, because the student wasn't invited for an interview.

I work at my university as a mentor for undergraduates applying to medical schools and for medical students applying to internships and residencies. I also review manuscripts submitted to the peer-reviewed journal I edit by physicians and scientists at the start of their careers. In all these areas, accurate spelling, correct grammar, and even proper punctuation greatly influence how seriously I--and doubtless others in similar positions--consider the submitted material. What you write is a proxy for who you are, and careless, sloppy writing strongly shapes the impression the admissions committee or reviewers will form of you.

A small but poignant dramatic moment occurs whenever a researcher--young or old--opens the e-mail containing a decision letter about his or her latest research article and learns that its fate has been largely decided by the advice of anonymous reviewers. Inevitably, we bless or curse those reviewers--but I suggest that you join them, and do it early in your career.

So new is the field of Environmental Science (ES) that, according to the U.S. Bureau of Labor Statistics (BLS), the first generation of scientists in the field are beginning to reach retirement age -- one reason that some people anticipate growth in the number of available jobs in the field.

ES came into being as a real branch of science in the 1960s and 1970s, answering a need to understand complex environmental problems and deal with a flood of new environmental laws that required specific environmental investigation protocols. The ultimate cause was public awareness and concern about the environment raised by the 1962 publication of Rachel Carson's exposé Silent Spring, and later reinforced by the energy crisis, global warming, Hurricane Katrina, and the Gulf oil spill, among other changes and events. (1)

"The aging process is not fun, but when it begins decades ahead of schedule, it's tragic" (See Editor's Choice: Splicing Therapy Comes of Age, Science 11 Nov 2011, Vol. 334, p. 739). Substitute the word "retirement" for "aging" and you have the problem facing many career scientists today. While scientific journals and the science establishment bemoan the macroeconomics of "U.S. Science and Austerity" (See News Focus: U.S. Science and Austerity, Science 11 Nov 2011, Vol. 334, pp. 750-759), those working in university laboratories, research institutes, and hospitals see the tragic results among themselves and their colleagues as grant funding dries up. This "professional progeria" unexpectedly and increasingly strikes talented researchers with track records of success and accomplishment. As John Lennon noted: "Life is what happens to you while you're busy making other plans." Still, for individuals entering science careers, it's wise to make those plans now to prepare for when the grants stop coming. If that never happens, so much the better.

There is increasing awareness that for optimum intellectual function, a balanced relationship must exist between the use of our electronic linkages and our brains. Excessive use of, and over-dependence on, the Internet, e-mail, smartphones, and Web-enabled tablets can hinder rather than help progress toward a successful science career.

This is a complex and contentious issue. It was addressed by Nicholas Carr, a respected technology writer, in a 2008 article in The Atlantic titled "Is Google Making Us Stupid?" and expanded upon in his recent book, The Shallows: What the Internet Is Doing to Our Brains (W. W. Norton, 2010). Carr believes that "we are trading away the seriousness of sustained attention for the web's frantic superficiality." As a researcher, advisor, and student mentor, I see this happening in many struggling students and young scientists.

The striking characteristics of the nutritional sciences are its long and colorful history, its broad scope and complexity, its ability to integrate with other scientific disciplines, and the excellent opportunities it offers for a scientific career.

Its long history includes the first written nutritional research study -- reported in the Book of the Prophet Daniel, in the Bible. In Chapter 1, Daniel and his companions, captives of King Nebuchadnezzar, disdain the food and wine they are offered from the royal table and request a diet of vegetables and water. After a 10-day "clinical trial," they look healthier and better fed than the "control" group eating from the royal table. As a reward, the King admits the group into his service.

The broad scope of the nutritional sciences is well documented by the information provided by the more than 160 graduate programs offering advanced degrees in the field. Nutritional sciences encompass all aspects of an organism's interaction with food, and can be investigated at levels ranging from molecules to populations.

It is common to hear undergraduates and recent college graduates preparing for a career in science complain: "I think I wasted a lot of time in college being forced to take humanities classes that had nothing to do with my area of study." This is one of many manifestations of the ongoing centuries-long battle over the relationship between the sciences and the humanities.

From a historical point of view, until the mid-19th century, the humanities (i.e., grammar, rhetoric, history, literature, languages, and moral philosophy) held the upper hand. At Oxford and Cambridge Universities, the gold standard models for American education, the areas of study consisted mainly of classics, mathematics, or divinity.

However, in 1847 Yale College broke with this tradition and formed the School of Applied Chemistry. This became the Yale Scientific School and in 1861 it was renamed the Sheffield Scientific School. Sheffield's 3-year undergraduate program focused on chemistry, engineering, and independent research. It offered the best scientific training in America. The "Sheffs" studied and lived apart from other undergraduates taking the classic curriculum and roomed together in the "college yard." The two groups did not mingle. The old truism that a classical education assured success was being challenged. Science had begun its separation and was ascending vis-a-vis the liberal arts in American universities.

The need for science majors to take courses in the humanities has been contentious ever since. The required core curriculum at most colleges and universities has atrophied over the years, while at the same time governmental funds for the support of new research in the humanities have dried up. Authorities both within and outside of science have expressed concern that scientists do not learn enough about the humanities -- to the detriment of society.

In this environment, it's difficult for the undergraduate to determine the desirability of taking courses in the humanities -- or which and how many to take. In fact, some applicants to college regard a strong core curriculum requirement as a negative factor, opting instead for programs with a minimum number of required core courses and maximum flexibility.

All this considered, I would offer the following 10 reasons why students pursuing science careers should augment their education with a strong foundation in the humanities.

One of the most promising and rewarding career choices for the student contemplating a professional life in clinical medical research is that of a primary care physician scientist. As U.S. managed care evolves, the primary care physician (PCP) is assuming an increasingly prominent and responsible role in the healthcare system. The PCP is generally the first medical practitioner contacted by a patient. The PCP does the initial evaluation; acts as the collaborator and facilitator in referrals to specialists; coordinates the care among specialists, clinics, laboratories, and hospitals; and provides long term management of most patients. From an economic standpoint, the PCP is the "gatekeeper" who regulates access to costly consultations, studies, and procedures. Approximately one-third of practicing physicians in the United States are PCPs.

So where does the research come in? Modern medicine is increasingly oriented toward evidence-based practice; that is, treating patients according to methods and means that have been scientifically demonstrated to be the best choices. To achieve evidence-based practice requires practice-based evidence. PCPs are in the right place to provide this -- to study the effectiveness of medical care and the translation of innovation from the laboratory to the bedside and the community. However, to accomplish this, prospective PCP-scientists need additional training and research capability that goes beyond the standard training provided to become a PCP, training that is both available and well supported.

Those embarking on a career in medicine or medical research are doing so at a time of tremendous change and challenge. In the span of 4 years, how are medical schools to simultaneously teach the exponentially expanding body of knowledge garnered by medical and basic science research, introduce students to new technologies and drugs that are revolutionizing diagnostic and therapeutic options, and train physicians to work effectively in an increasingly complex health care system? Last year the Carnegie Foundation for the Advancement of Teaching published a book based on an in-depth study and blueprint for a major overhaul of medical education, Educating Physicians: A Call for Reform of Medical School and Residency, by Molly Cooke, David Irby, and Bridget O'Brien, three well-credentialed medical educators (Jossey-Bass, San Francisco, 2010).

This study is compelling because it follows a template established a century earlier when the Carnegie Foundation carried out a study that revolutionized American medical education. Known as the Flexner Report, it established the basis for the curriculum and the standards for medical education that continue to the present day. To understand the changes called for in the 2010 Carnegie Foundation study, it is necessary to review the 1910 Flexner Study, including why it was done, what it reported, and the structure it created.

Introduction
 
Albert Einstein (1879-1955) is widely regarded as the father of modern physics. For those of us old enough to have seen him in person, heard him speak in public or on the radio, and read his writings when they were current, these memories are precious. In addition to being a great theoretical physicist, he was looked upon as a philosopher and statesman. His intellectual interests and profound observations extended widely into the other sciences and the social aspects of human endeavor. In the 21st century he remains one of the most influential and iconic thinkers of all time.

Einstein is possibly the most frequently quoted figure in the history of science, but as is often noted, many of these quotations are of dubious authenticity. Alice Calaprice, a senior editor at the Princeton University Press, has worked with the Einstein papers at the Institute for Advanced Study for more than 30 years. In 1996, she published a volume entitled The Quotable Einstein, a comprehensive, meticulously referenced, annotated, and carefully arranged compilation of Einstein's quotes. For 2011, Calaprice has enlarged this to The Ultimate Quotable Einstein, a nearly 600-page volume of approximately 1600 quotations--the "final" and definitive edition.

On a recent visit to Princeton, I had the good fortune to obtain an advance copy of this work and delighted in it as I have in few other books. I have selected and arranged these quotations to simulate an interview with Einstein, circa 1955, on the topic of science careers.

1.    Learn on whose shoulders you stand.

On February 5, 1676, Isaac Newton wrote a letter to his rival Robert Hooke, which he concluded with his famous aphorism: "If I have seen further it is by standing on the shoulders of giants." Two hundred years later another great scientist, the French physiologist Claude Bernard, enlarged on these words: "Great men have been compared to giants upon whose shoulders pygmies have climbed, who nevertheless see further than they. This simply means that science makes progress subsequently to the appearance of great men, and precisely because of their influence. The result is that their successors know many more scientific facts than the great men themselves had in their day. But a great man is, none the less, still a great man, that is to say, a giant."

Timothy Cordes graduated as valedictorian of his class at the University of Notre Dame. He was admitted to the University of Wisconsin (UW) where he recently earned an M.D. and a Ph.D. in biomolecular chemistry. He is currently a resident physician in the psychiatry department while fulfilling his role as a husband and father. Tim is blind. That this uncommon constellation of accomplishments can occur is notable. Ten or twenty years ago, it would have been impossible.

Twenty years ago, while serving as a faculty pre-med advisor at Harvard College, I was assigned a candidate for medical school admission: a graduate of the Bronx High School of Science, an outstanding student, a basketball player, personable, and impressive in every way. He was a "dream" candidate with one exception: He had been born deaf. Our efforts to gain his admission to medical school were a nightmare. Despite personal communications to medical school deans and admission directors, as well as letters from his professors attesting to his abilities, all doors were closed. He entered a Ph.D. program in pathology at the University of Pennsylvania and, following postdoctoral training at the National Institutes of Health (NIH), now does exciting research in the field of congenital deafness.
    
Recently, I was reminded of this experience when I sat down for a mentoring session with a second-year medical student here in Madison. "I'll need to see your lips," the student cautioned me as we began our conversation. "I'm congenitally deaf."

A first-year medical student I mentor recently asked me: "What's the point in buying textbooks? Sure, I could pull one from a shelf in the library--or save time and just Google it. But wouldn't I learn more from a Google search or a Medline search than by reading all those pages?"

This seems to be the current thinking of students in general. Our medical bookstore recently removed a large portion of its book shelves, replacing them with a bigger area for computer hardware and software; the store's manager tells me that textbook sales have declined by 10-15% in each of the past several years. The medical library has replaced a major portion of its text reference section with computer carrels. Fewer new textbooks are being published, and new editions of standard texts are appearing less frequently. And the financial news tells us of the loss of profitability and financial difficulties faced by publishers.

Each year, the American Academy of Ophthalmology meets in a major U.S. city and attracts around 25,000 ophthalmologists from the United States and abroad. Among the courses frequently offered at the meeting is one entitled "Key to Successful Publications in Peer Review Literature," taught by the editors of the three leading clinical journals in our field. The course tends to attract residents and young doctors at the start of their careers. I have enjoyed being involved in teaching this course for a number of years and have come to anticipate the questions that will be asked. These include the following four, all related to the first steps in research publishing:

1.    Why should clinicians do research?
2.    Is the scientific method optional for current medical reports?
3.    What is "peer review" and why is it important?
4.    How do I select a journal to publish in?

These questions apply to anyone interested in participating in medical research, and I will discuss each briefly here.

As the practice of medicine and the delivery of health care have grown in complexity and content over the last half-century, training to be a practicing physician has gone from being analogous to a cross-country track event to a virtual marathon in time, effort, and expense. With the explosion of knowledge and technical complexity in the biological sciences, M.D.-Ph.D. candidates preparing for a life of medical research face a triathlon by comparison. Those who cross the finish line are well equipped to take on the challenges of being a physician-scientist.

Based on personal involvement within my own institution, discussion with program directors, and interaction with M.D.-Ph.D. program graduates, I will present a brief overview of this challenging pathway to a science career.

Why was the M.D.-Ph.D. program created?

In recent days, sports writers and broadcasters have focused on the death of former UCLA basketball coach John Wooden. Wooden's impressive record of 10 NCAA championships in 12 years and the Bruins' 88-game winning streak received much attention. However, the emphasis from his former players was "...how he taught us about life....he had the perspective of what was really important, and he always reinforced what he said by what he did," as Andy Hill says, quoted in the June 7 NY Times. In other words, his players insisted he was more than a coach--he was a mentor.

Lorraine Stomski, an expert in leadership education and coaching, explains, "People often confuse coaching with mentoring.  Coaching, which provides specific feedback, can be used within mentoring.  But mentoring is more holistic than coaching, in that it develops the whole individual - through guidance, coaching and development opportunities" (June 6 NY Times).

This was of particular interest to me because I was recently asked to give a talk to medical students on the topic "Optimizing Your Mentored Experience." In preparation, I spent a weekend perusing material in print and online. My initial impression was that everything that can be said has been said and is readily available, so the best one can do is summarize succinctly and emphasize a few key points from the mind-numbing expanse of material. But reflecting on my more than 50 years' experience as a mentor and mentee (protégé is now the preferred term), I want to share my perspective and firsthand observations.

Healthcare in America has been in the spotlight for a number of months. The picture portrayed in the media is of a giant "black box" into which 16% of our gross domestic product (GDP) goes and out of which comes healthcare whose quality and quantity are under debate. Within that black box is a complex mix of healthcare workers and their organizations, hospitals, insurance companies, government agencies, private agencies, big and small pharma, instrument corporations, citizen groups, corporate executives, politicians, lobbyists, and scientists, each with its own agenda and goals.

Of these components of healthcare, none is more attractive and respected than the group committed to the "protection, promotion, and optimization of health and abilities; prevention of illness and injury; alleviation of suffering through the diagnosis and treatment of human responses; and advocacy in healthcare for individuals" -- the nurses. (The definition is from the American Nurses Association.)  Holding a special position in the nursing profession, a carefully chosen group carries out research -- a career choice that merits consideration by qualified young individuals seeking a research career in healthcare.  A nursing Ph.D. program is an excellent way to enter such a career.

Because I work as a physician-scientist at the University of Wisconsin (UW) School of Medicine and Public Health, I have long been aware of the increasingly important role nurse researchers and nurse Ph.D.'s play in modern healthcare. Barbara J. Bowers, Associate Dean for Nursing Research at UW-Madison, kindly gave me a generous amount of her time to convey an in-depth view of the opportunities and challenges of nursing Ph.D. programs and research careers in nursing. Much more information about nursing Ph.D. programs and research careers is available on institutions' Web sites.

My discussion with Dr. Bowers made it clear that -- to paraphrase an old General Motors ad -- this is not your mother or grandmother's career in nursing.  For starters, nursing is no longer a gender-specific profession.  Nearly 15% of the entering class in nursing at the UW is men, a number that reflects the national average -- and that is increasing.

Let's look at how nursing education has changed in the last couple of decades. In decades past, a great number of nurses entered the profession with an associate degree -- a technical degree conferred after 2 to 3 years in a community or technical college or hospital. With additional coursework, these nurses could earn a bachelor's degree. Those interested in a research career could later earn a master's degree, and eventually a Ph.D., most commonly in education, psychology, or sociology.

In present-day nursing education, many students begin with a 4-year Bachelor of Science in Nursing (BS or BSN) program. These programs are often competitive; at UW-Madison, about 400 applications are received for 150 places. Those accepted generally have GPAs of 3.5 or higher. Nursing-bound high school students need courses in mathematics, science, social studies, humanities, foreign languages, and communication skills. Strong preparation in the physical and social sciences is essential.

The curriculum of the nursing baccalaureate program at UW-Madison is representative of most programs. The first 2 years concentrate on general education and include prerequisite courses in the sciences, humanities, and social studies. Applied skills are acquired during the junior and senior years via core lecture, laboratory, and clinical courses, plus elective courses that allow students to pursue individual interests.

UW-Madison offers an innovative option for top students interested in entering a research career in nursing: the early-entry Ph.D. program selects first-year students who are invited to plan, in conjunction with a faculty advisory committee, an individualized program of study and research. The program includes early and intensive research training, clinical practice, and required and recommended coursework. Each student works closely with a senior faculty member whose research matches the student's own interests. This research is combined with graduate courses in the student's selected area and completion of the required and recommended undergraduate and graduate courses in nursing and related disciplines. Students completing the program receive 3 degrees: B.S., M.S., and Ph.D.

Research areas of current students in this program include the ethnocultural influences on pain and pain management, effects of global environmental change on human health, and symptom management for patients.  Students publish their findings in major professional journals and present their work at research conferences.

More traditional post-baccalaureate Ph.D. nursing programs -- my university has one of those, too -- offer a strong emphasis on research training in nursing through an apprenticeship model.  Students work closely with their nursing school preceptor and faculty committee to follow an individualized, research-driven program of study.  The preceptor advises the student on the selection of courses and serves as a liaison to the major department and other departments in the graduate school. 

Students in both programs receive financial support through graduate assistantships and traineeships. Stipends usually run about $30,000, plus tuition remission.

A major financial supporter of these programs is the National Institute of Nursing Research (NINR), part of the National Institutes of Health (NIH).  NINR currently has fourteen "priority areas":

  1. Research related to low birth weights
  2. HIV infection care delivery
  3. Long-term care for older adults
  4. Management of pain and other symptoms associated with acute and chronic illness
  5. Nursing informatics to enhance patient care
  6. Health promotion for older children and adolescents
  7. Technology dependence across the lifespan
  8. Community-based nursing models
  9. Preventing HIV/AIDS in women
  10. Preventing diabetes, obesity, and hypertension
  11. Cognitive impairment
  12. Coping with chronic illness
  13. Families at risk for violence
  14. Behavioral factors related to immunosuppression

This list illustrates the scope and importance of some of the key issues that are the focus of nursing research.

At a time when job opportunities in general can be hard to come by, graduates of nursing Ph.D. programs are in demand in a variety of educational, clinical, and governmental settings. Ph.D. nurses hold faculty appointments or positions as research scientists or research directors. Faculty positions usually start at the assistant professor level, on the tenure track, with annual salaries of about $70,000 to $80,000.

What's unique about this type of research career, Bowers stresses, is its involvement with people -- living and working with them and dealing with the challenges their health problems present.  It's a stable and rewarding career with a range extending from gerontology to health policy.  Graduates entering into it can expect to remain engaged, satisfied, and see their research funded. 

Are there special characteristics that identify people particularly well suited for a career in nursing research?  Bowers cites an interest in finding more effective ways to solve complex care problems and a high level of curiosity. Physicians tend to be more interested in diagnosis and treatment, while nurses focus more on preventing poor health outcomes by changing lifestyles and helping patients and their families live with disease.  Beyond this, the attributes she associates with students suited for a career in nursing research are commitment to improved health and more effective health care delivery, initiative, a desire to "push the envelope," and, above all, a feeling of excitement when carrying out research.

If you are a qualified individual who wants a "hands-on" career dealing with people and their health problems, consider nursing research.  Few other careers can match its challenges and rewards.


The recent book The Immortal Life of Henrietta Lacks, by Rebecca Skloot, has become a "must read" bestseller among scientists and non-scientists (Science, vol. 327, 26 February 2010, p. 1081).  When I read the book and heard Skloot speak at length about the "HeLa experience," it led me to think about the role these cells played in my own career and the lessons I subsequently learned from them.

When HeLa cells became available in the 1950s, I was an undergraduate biology major in need of a project for my honors thesis.  My older brother, a medical student in the Baltimore area, alerted me that I could obtain a unique immortal line of cancer cells, the HeLa cells, to use as a basis for my project.  I got caught up in the excitement among researchers and in the lay press about the fact that this cell line was the successful culmination of decades of effort to keep cancer cells alive outside of the body.  In Skloot's words, "cell culture was going to save the world from disease and make man immortal."

In the Doctor Dolittle children's series by Hugh Lofting, the amazing Doctor Dolittle gains the gift of talking to the animals and learning their secrets. Today there are real "Doctor Dolittles" -- veterinarians learning the secrets of animals and their genetics, immunologic mechanisms, brain and nerve functioning, pathogenesis of malignancies, and much more -- by state-of-the-art scientific means. And the knowledge they gain is extremely important to humans and for understanding human diseases.

The University of Wisconsin School of Veterinary Medicine is one of the most research-oriented veterinary schools in the United States. The veterinary school is located a stone's throw from the University of Wisconsin School of Medicine and Public Health, and there is much interaction and collaboration between the two schools.

Having partnered in research with some academic veterinarians and communicated with many others, I thought I had a pretty good idea what they did. I assumed their research training and research prospects were analogous to ours in the human-focused medical field.

However, in preparation for writing this blog entry I sat down with one of their most distinguished veterinarian-scientists, Richard Dubielzig, and was surprised to learn that while many parallels exist, there are also striking differences between their careers and those of their medical school counterparts.

The curriculum at veterinary medicine schools is very similar in concept to the medical school curriculum: an intensive 4-year course divided between basic science and clinical learning. As at medical schools, the amount of research emphasis varies widely by institution. The following is a partial list of veterinary schools that have built solid reputations for doing research and producing quality researchers in veterinary medicine:

University of Wisconsin
University of California-Davis
Colorado State
Ohio State
Cornell
University of Pennsylvania
University of Florida
North Carolina State
Tufts
Washington State

One difference between veterinary and medical training is that veterinary school offers less opportunity to train for a research career than medical school does. So if you want to be a veterinary researcher, you'll need to earn a Ph.D. either before, during, or after veterinary school. Opportunities for combined degrees -- D.V.M.-Ph.D. -- are rarer and less well-defined than at medical schools.
 
Veterinary schools offer excellent opportunities for an introduction to veterinary research. Most schools offer summer vacations between the first and second years and between the second and third years; these are great opportunities to spend a few weeks in a research lab. Frequently, students planning a research career will take leaves of absence and work in a laboratory or on a research project between years 2 and 3, or between years 3 and 4, of veterinary school. NIH T32 grants, which pay students stipends to spend a year in a research laboratory, are available at several schools.

The usual route for veterinarians planning a research career is to earn a Ph.D. after veterinary school graduation. That graduate training may or may not have a clinical component. To qualify for an entry-level (Assistant Professor) tenure-track faculty position in a veterinary school, a candidate usually must have completed a D.V.M., a Ph.D., residency training, board certification, and postdoctoral training.

Many veterinarians with research ambitions find fulfilling careers in the safety or research and development (R&D) divisions of pharmaceutical companies. For these positions, a Ph.D. in toxicology (for safety work) or board certification in veterinary pathology (for R&D) is required. While these jobs pay more than academia, the scope of research is narrower and more directed.

Both Dr. Dubielzig and I are impressed with the enthusiasm and gratification veterinary scientists show for their research. And once on the tenure track, they usually find achieving tenure less stressful than it is in most medical schools. The greatest professional challenge is often meeting clinical responsibilities while also getting research done. This can lead to stress, insecurity, and feelings of being under-appreciated. Not surprisingly, veterinary scientists move among positions in academia and industry more often than physician-scientists do. But when you ask veterinary scientists if they would do it all again, the answer is usually an enthusiastic "yes!"

In 1925 the American author Sinclair Lewis published the Pulitzer Prize-winning novel, Arrowsmith, which inspired subsequent generations of 20th century high school and college students, including me, to consider a career in medical research.

Arrowsmith relates the tale of the bright and research-minded Martin Arrowsmith, from a small town in a fictional counterpart of Wisconsin, who progresses through medical school, private practice, and a position as a regional health official to become a dedicated medical researcher. In the story, his research talent is recognized by his medical school mentor, and Martin attains a position in a Rockefeller-like institute in New York where he discovers a phage that destroys the bacteria causing bubonic plague.

There's much more to the plot, and if you haven't read it yet I suggest you do. The issue at hand, however, is, could Martin, as a medical student and physician interested in research, follow the same path and make as meaningful a contribution in the 21st century? My answer is yes, he could, but if I were Martin's mentor, I'd suggest other ways that might suit him as well or better.

It is worth noting from a historical point of view that when Arrowsmith was written, the doctor of medicine degree in the United States had only recently gained a full measure of respectability. As a result of the Flexner Report, sponsored by the Carnegie Foundation, the standards and curriculum of present-day medical schools came into being. Medical schools became integral parts of large and established universities, the length of medical education became 4 years, faculty became "true university teachers" typically with full-time university appointments, and students were required to have college or university preparation. Also, medical education came to consist of two years of basic science training followed by two years of clinical work in a teaching hospital.

In this setting, and within the confines of this curriculum, there was opportunity for individuals such as Martin Arrowsmith to identify areas of particular interest and specific researchers they wanted to work with, through the lectures they attended. The students would work in the researchers' laboratories, under the mentor's guidance, during their vacation and spare time. This led, in turn, to numerous outstanding and gratifying research careers well into the late 20th century.

It is a path still available to medical students today, though with somewhat greater difficulty and modified rewards. The alternative routes to a career as a clinician-scientist include pursuing a combined M.D.-Ph.D. curriculum, or Ph.D. training before or after earning an M.D. degree. So what are the main advantages and disadvantages of these approaches over the Arrowsmith pathway?

The Arrowsmith approach is easier and more flexible. No formal or written applications, contracts, or other types of commitment are required -- just select a researcher and a topic of research according to your desires. The amount of effort and the logistics are arranged to fit with the schedule of the student and researcher. Required reading, seminars to be attended, and courses to be audited are all determined by mutual decision. The length of medical training usually is not extended, unless the student decides to take a year or semester off to devote entirely to research. There are no additional tuition costs -- beyond the cost of medical school -- and often the trainee can obtain a student grant or be listed on an established researcher's grant to help defray his or her living expenses. The entire arrangement is "customized" to suit the student and the teacher. If either is dissatisfied, the arrangement is easily dissolved.

Because of the ongoing clinical training, the Arrowsmith pathway is particularly well suited for careers involving clinical trials, other forms of clinical research, or translational research. The medical degree is sufficient to qualify one for a postdoctoral position and ultimate consideration for a faculty appointment in a clinical academic department.

However, there are some negative aspects to this pathway. With the advent of problem-based learning as a major tool in medical school curricula, there has been a melding of clinical and basic science training. Also, with the explosion of knowledge in the biological sciences in recent decades, basic science training for the physician is less comprehensive and rigorous than it once was.

Accordingly, the Arrowsmith pathway is not ideal preparation for a career focused on basic science research. Nor is the M.D. degree alone sufficient for a faculty appointment to most medical school basic science departments. While such preparation suffices for a career in clinical medicine and for concurrent clinical and translational research, it is problematic for assurance of a long-term research career in basic science. These gaps and blind spots in one's knowledge can be filled in on the basis of individual effort, but this can be difficult.

This problem was illustrated to me in a striking way while I was a clinician-scientist on the Harvard Medical School faculty. A number of gifted undergraduates potentially interested in a physician-scientist career were working in various laboratories in our department. To motivate them to consider the Arrowsmith pathway, our Chair assembled them for a group meeting with a distinguished Ph.D. basic scientist who was also on the medical school faculty. We anticipated an inspirational talk encouraging the undergraduates to pursue such a career. To our surprise, the message was short and blunt: "If you follow this pathway to a career in basic science, the Ph.D.s will 'eat your lunch.'"

In summary then, while it's still possible to emulate Martin Arrowsmith, today his pathway is better suited for clinical or translational research than a long-term career in basic science.

An interesting footnote to the Arrowsmith novel is the fact that Lewis was greatly assisted by a Ph.D. microbiologist, Paul de Kruif, now best remembered for his book Microbe Hunters. Even though Lewis was listed as sole author, de Kruif's contributions were sufficient to merit receipt of 25% of the royalties. Thus, even in the creation of a story of a successful physician-scientist, a Ph.D. proved invaluable.

Several years ago, during an internship in the NIH Director's office while fulfilling a requirement for my Master of Health Administration, I learned an interesting fact. In an internal review of when and why researchers falter in their requests for NIH R01 grants, it was determined that revised applications become increasingly necessary after the third 3-to-5-year cycle, and rejections peak after the fourth and fifth cycles. At that point in their careers, researchers are often no longer at the cutting edge of their field and, despite the benefits of experience and accomplishments, are less competitive for NIH grants.

For these applicants, there is a need to retool: to learn new techniques, gain new skills, and get fresh insights and ideas. One of the most efficient and enjoyable ways to do this is to take a sabbatical year. The term sabbatical originally referred to a "year during which land remained fallow, observed every seventh year by the ancient Jews" (American Heritage Dictionary, 2nd College Ed. p. 1082). In modern academic parlance, it means a leave of absence, with financial support, given to a tenured faculty member for the purposes of research in a new venue, academic study and writing, and related travel. Beyond that, the working definition diverges, depending on the university and department you're with.

Having a more competitive faculty member with his or her battery recharged would seem to be in the best interests of both the institution and scientist, but institutions don't always encourage sabbaticals, or support them well. I spent the first half of my career at Harvard; I've spent the last half (so far) at the University of Wisconsin. At Harvard, sabbaticals were encouraged, facilitated, and well supported. At Wisconsin, sabbaticals are less common and, in the medical school at least, require outside funding and considerably more advance planning, personal effort, and perseverance. A university's policy toward sabbaticals depends on precedent and culture, as well as finances and manpower issues. If you have a sabbatical in mind, the time to explore an institution's policies on sabbaticals is during your recruitment.

In my experience, a successful sabbatical requires at least 3 years of planning. First, you must figure out what you expect from the experience -- your personal goals for the sabbatical. Then you have to match them up with the available opportunities, finding the best people and environment to help you achieve your goals. The best ways to investigate are professional interactions at meetings, conferences, and collaborations, and preliminary, exploratory visits. You may need to visit several labs before you find the right situation, or it may be obvious early on which situation is best. You may select a laboratory half a world away, but you could also end up in another laboratory on your own campus.

Traditionally, home institutions will support a semester away at full pay or an entire academic year at half pay. Departmental and institutional support varies widely; you may need to find supplemental support, especially for a year-long sabbatical, through the host institution or a funding agency. Happily, targeted support for sabbaticals from government agencies and foundations is generally not difficult to obtain, and is often generous. Almost all are posted on the Internet and easy to find and apply for.

Well in advance of granting leave, all institutions require that the faculty member make provisions for the supervision of his or her laboratory and the fulfillment of teaching and administrative responsibilities. You may need to twist your colleagues' arms or do some horse trading. It is also important to know what the host lab expects. Often you're expected to teach as well as learn, which can come as a shock if you haven't worked this out ahead of time.

My own experience has been that housing is not a problem if the sabbatical term is spent at a major institution. At any one time, a portion of an institution's faculty is on sabbatical, and many institutions own faculty housing, so there are good housing options available at reasonable cost; housing can be arranged through the institution's housing office. Sometimes, though, housing arrangements aren't made until the last minute; it can be disconcerting not knowing where you will live.

Your home institution is likely to require written assurance that you'll return after the sabbatical and remain for a year or two -- which could be inconvenient if your very successful sabbatical leads another institution to offer you your dream job soon after your return.

Is a sabbatical worth all the trouble? My answer is an emphatic YES. Ask the scientist who just returned from a sabbatical -- which is a very good place to start your planning.

Each year during the last week in April, more than 12,000 members of the Association for Research in Vision and Ophthalmology (ARVO) gather in Fort Lauderdale, Florida, for their week-long annual meeting.  If you ask any of the attendees what they do, they'll tell you that they're "visual scientists." But if you dig deeper you'll discover an amazingly multidisciplinary group of researchers. 

By training, the major components of this community are (1) Ph.D.s, (2) M.D. ophthalmologists, and (3) optometrists, osteopaths, and veterinarians.  A more meaningful insight into what members do in their visual science careers can be gained from the titles of the organization's 13 scientific sections:  Anatomy & Pathology; Biochemistry & Molecular Biology; Clinical & Epidemiologic Research; Cornea; Eye Movements, Strabismus, Amblyopia & Neuro-ophthalmology; Glaucoma; Immunology & Microbiology; Lens; Physiology & Pharmacology; Retina; Retinal Cell Biology; Visual Neurophysiology; Visual Psychophysics & Physiological Optics.

A pervasive presence at ARVO meetings is the leadership and staff of the National Eye Institute (NEI), an NIH institute dedicated to research on human visual diseases and disorders.  With an annual budget close to $700 million, the NEI is the principal source of funding for the research done by the eye-research community -- and presented at the ARVO meeting.  Hence, ARVO meetings provide an ideal venue for close communication between visual scientists and the government agency that pays for most of their research.  Jointly, they engage in strategic planning and set priorities and goals for vision research.  

Since its creation by Congress in 1968, the NEI, along with the vision science community, has been successful in fulfilling the NEI's stated mission to "conduct and support research, training, health information dissemination, and other programs with respect to blinding eye diseases, visual disorders, mechanisms of visual function, and the special health problems and requirements of the blind."  The commitment of vision scientists to preserving vision and preventing blindness is a vital element in the cohesion and collegiality of the vision science community.  Both the NEI and ARVO provide information about vision science as a career and about available job opportunities.  A personal visit to the ARVO annual meeting is highly recommended for anyone interested in vision-related science; it is not only informative but also inspiring.

Where do members of the vision research community work?  The vast majority are employed by universities.  Extrapolating from figures available from the University of Wisconsin - Madison, I estimate that slightly more than half are members of departments of ophthalmology in medical schools, with most Ph.D.'s holding joint or adjunct appointments in a basic-science department.  The other vision scientists are distributed among medical school basic science departments, or science departments outside the medical school.  In addition, many pharmaceutical companies have ophthalmic divisions and career opportunities for vision scientists.

From my own personal experience, I can attest to the fact that vision science is a challenging and highly gratifying career.  

More than a century ago, Sigmund Freud famously (or infamously) wrote, "The great question that has never been answered, and which I have as yet been unable to answer despite my 30 years of research, ...is, 'What does a woman want?'" In a similar vein, medical students, both male and female, engaged in research projects frequently ask, "What does my principal investigator (P.I.) want?" Unlike Freud's question, this one can be easily answered.  The answer is 'commitment.'

The major frustration for the dedicated lab head working with medical students was presented to me, as a medical student, in a talk by the great renal physiologist Homer William Smith. Smith noted that many interested, willing, and highly competent young men and women had come and gone through his laboratory, spending 2 or 3 years involved in his research. But when they launched into their medical careers they were all too frequently absorbed by their clinical activities and no longer incorporated research into their professional lives. Smith knew that the greatest payback for a senior investigator who accepts a medical student on his or her research team and spends time teaching and mentoring that student is the future research contributions that student will make throughout his or her career.  And the preceptor knows that unless real commitment is in evidence when the student first arrives, the outlook for long-term dedication to research is bleak.

This does not diminish the importance of introducing medical students without previous research knowledge or experience to the laboratory, or to some realm of clinical research, as interested observers or limited participants. Research faculty welcome such an opportunity and are pleased if the student progresses to a more significant role. But it is disheartening for scientists to take on students who express a desire to play a meaningful research role and accept responsibility for a portion of a project, and then fail to fulfill those responsibilities.

How does a student beginning work on a research project manifest commitment? The research faculty in medical schools are well aware of the rigorous schedule medical students face and understand that only a limited amount of time can be devoted to research. Moreover, they are forgiving when genuine conflicts arise and the time scheduled for research is of necessity missed.

Rather, it is the student's seriousness, level of interest, and intensity of effort that are of primary concern. Students who initiate and maintain a dialogue about the research, ask questions, and show evidence of related outside reading and independent thinking are highly regarded. In fact, committed students who are new to a research discipline are especially valuable because they ask basic questions and do not accept fixed ideas and dogma as sacred and beyond questioning.  In addition, because of their concurrent medical training, they are often in a good position to recognize previously unappreciated clinical implications and significance for the research they are undertaking. Original ideas and suggestions for advancing the research, a good learning curve for the technical aspects of the project, careful data keeping, courtesy and thoughtful behavior to all members on the team -- including technicians and assistants -- and participation in the group's social activities are all important to the student's success.

The short-term endpoint the preceptor wants is not merely a student-co-authored publication or presentation, or a favorable evaluation or letter of recommendation for the student; rather, it is for the student to be able to formulate a hypothesis, design a sound protocol to test it, and experience the challenge and rewards of gaining new knowledge that come with a hands-on research effort. Hopefully that initial effort and commitment will lead to research becoming an integral part of that student's later professional life, whether at the bench, in translational research, or in clinical studies.

A scientific career is, for many of us, one of the most intense endeavors that we undertake.  It both captures and defines our lives.  As a scientist, you often think and worry about your work when you are out of the laboratory -- even while lying in bed at night.  You experience waves of enthusiasm, rebelliousness, and even self-doubt as you continually weigh your efforts.  You are sometimes haunted by the feeling that your results justify your existence.  Yet no matter the extreme stresses that come with the work, the benefits of gaining new knowledge and insight into the natural world bring rewards that make other aspects of everyday life dull and drab in comparison. 

A successful career in science depends on many factors beyond native abilities, skills, education, and experience.  The importance of mentors and mentoring has been greatly emphasized in academic centers;  the need for wise and dedicated counselors and teachers is self-evident.  The necessity for colleagues, collaboration, and networking is well understood.

But the importance of friendship can often be taken for granted.  The intangibles that build professional relationships into the knowledge, trust, and bonds of friendship are complicated and deep.  Certainly they involve elements of equality, unselfishness, and concern.  The components of friendship are many and hard to define.  Out of the multitude of definitions for friendship, a favorite of mine is "one who knows all about you and loves you all the same."  It contains more than a grain of truth.

I was strongly reminded of the importance of friendship in my own career as a visual scientist by the recent death of Ruth Kirschstein (See Retrospective, Science 13 November 2009: Vol. 326, p. 947, and Beryl Benderly's recent tribute in Science Careers).  My years as a Clinical Fellow at the NIH in the 1960s provided me with training and direction that were important in ensuing decades.  Although assigned to an ophthalmology and visual science section at the NIH, I was given sufficient "elective" time to find my way to the laboratory of Alan Rabson, a rising star in experimental pathology.  Here I was introduced to viral oncology and became grounded in the fundamentals of pathology, tissue culture, viral transformation, and electron microscopy.

To know and be mentored by Al Rabson was to know Ruth Kirschstein, his wife, for they were an inseparable team.  She too was in the early stages of her distinguished career in research and administration.  Their interest and support, as well as gracious hospitality, made my years at the NIH a very special time that has been an inspiration for me ever since. 

I later returned for visits to NIH and stayed in contact with both of them.  We shared scientific and personal updates and sought and offered advice to each other as opportunities and adversities presented themselves.  As the decision tree in my scientific career unfolded, their friendship was a resource I came to treasure. 

In recent years, when my own career took an administrative turn and I spent time meeting my Master of Health Administration requirement with an internship in the NIH Director's Office, I came to appreciate fully and benefit from their idealism and vision.  Their friendship, and the friendship of others, has been a major factor in the advancements and enjoyments that I have experienced throughout my career.
 
For individuals starting a scientific career today, the stresses and complexities of science are certainly more intense than 40 years ago.  But the opportunities to build friendships are there if one takes the time and makes the effort.  When such an opportunity arises, I implore you to go beyond a mentor-student, role-model, or colleague-to-colleague relationship and build a lasting friendship. Such friendships will enrich and support your career.  Friendships formed and continued early in your career have a strength and value that more than justifies the effort.  

A few weeks ago, the visual science journal I edit received a manuscript from an outstanding group of research physicians at a major university center.  In addition to two senior authors, the manuscript had four additional authors, each of whose credentials and roles in the study were well defined.  On reading the manuscript, my impression was that the study was well done and contained useful new information.  Accordingly, I assigned it for more detailed review to two reviewers who are authorities in the field.

The first review advised acceptance after some minor changes.  The second review was basically in agreement, but stated that the discussion contained a plagiarized paragraph from a previously published work.  The reviewer cited the original source.  On checking, I found this was indeed the case. I called the second reviewer and asked her how she had recognized the plagiarism.  She said the literary style of the paragraph in question was different from the rest of the paper.  She entered the paragraph into Google and it brought up the publication in which the paragraph first appeared.

We contacted the senior author of the submitted manuscript, and he was surprised and embarrassed to learn of the plagiarism. He said the paragraph of concern had been contributed by one of the authors, a medical student working on the study.  This student called me soon after and, in the ensuing tearful conversation, explained that she had copied the paragraph verbatim with the intention of rewriting it in her own words but was caught up in exams.  Under pressure from the senior author to contribute "her portion" of the manuscript, and in the heat of the moment, she inadvertently submitted the original material without quotation marks or citation of its source.

She was obviously frightened and contrite and, indeed, had reason to be.  Plagiarism, even if inadvertent, is a serious offense in science.  It is the usual policy of our journal and many others to report plagiarism to the respective university authorities in the case of academic research. Disciplinary action usually follows.  The submitted paper is rejected and editors of other journals in the field are alerted to the plagiarism and the names of those involved.  Future works from the offending authors are handled with intense scrutiny and caution.

This was explained to the author responsible for the plagiarism, as well as her fellow authors.  After rejecting the paper and carefully investigating the matter, we were finally convinced that this was a bitterly regretted "first offense" for the responsible author and a unique occurrence for all the authors involved.  Consequently, we did not carry it beyond the rejection and stern warning. 

It is nonetheless a cautionary tale because plagiarism can derail a scientific career at any stage.

As the fall ends and winter approaches, next summer may seem far away. But this is the ideal time to arrange for next summer's research position.  This is especially true for first-year medical students who, at most schools, have the prospect of a long summer break.

A good place to find out about available opportunities is from your Dean's office or from the Associate Dean for Research.  Most medical schools offer a wealth of opportunities.  How do you choose?  A good place to begin is by determining what area interests and excites you the most: Neuroscience? Reproductive biology? Robotic surgery? Space medicine? AIDS research? It's all out there.