NSF Conference on Computing in the Life Sciences
Pomona College - April 25, 1998
The computing revolution continues, and its impact on the life sciences is growing. The conference was designed to highlight new uses of computing in teaching and research in the life sciences. A series of short presentations provided focus and information for the discussions. The first discussion period highlighted the interaction between research and teaching, asking whether computing was making the interface more permeable. The second discussion period highlighted needs that were identified in life sciences computing, especially where NSF might be able to make an impact.
Introduction: Laura Hoopes, Pomona College and NSF Biology Advisory Committee; John Fray, NSF
Laura Hoopes welcomed the attendees to the conference and briefly described the plan of the day. She then introduced John Fray. John Fray explained that this conference was focused at the leading edge of NSF's activities for the year. NSF wanted input for planning its year 2000 budget, so it sponsored this conference to listen to practitioners. He highlighted features of the 1998 and 1999 budgets, especially the Knowledge and Distributed Intelligence (KDI) theme. Biology is seeing this theme in knowledge networking, learning and intelligent systems, and as aspects of such areas as genomics and life in earth's environments.
A question was raised about NSF's role in the purchase of computers. John Fray discussed NSF's general position that colleges and universities should supply computers, but told the group that presenting computers as a tool to solve a problem could now lead to funding to purchase some computers. Use in the Arabidopsis genome project was an example he gave. Another question was raised about computational bioinformatics. For Life in Earth's Environments, the whole scientific literature needs to be accessed and used, but traditional library methods are not easily manipulated. Does the NSF have any role in developing new library systems for such work? John told the conferees that the databases were seen as very important to support in order to preserve data. Examples were the results of the Arabidopsis genome project and data on global changes.
Speakers on the research/teaching interface in life sciences: Nancy Hamlett, Biology Department, Harvey Mudd College; Newton Copp, Joint Sciences Department for Claremont McKenna College, Scripps College, and Pitzer College; Howard Towner, Biology Department, Loyola Marymount University; Beth Braker, Biology Department, Occidental College; Elizabeth Rega, Joint Sciences; and Stuart Sumida, California State University at San Bernardino.
Nancy Hamlett described the courses in biology she teaches at HMC. The pedagogy used in her department de-emphasizes lectures and examinations and emphasizes critical thinking and synthesis. Computing enhances the courses in three ways. First, computer techniques are used by the students in data analysis. Such packages as DNAStar, PowerPoint, Chart, Spike2, etc. are used to analyze data. Databases such as GenBank are accessed and used by the students. Second, demonstrations and simulations that are hard to do in real time are provided via computers. Examples include evolutionary calculations and protein purification design. Third, computing is used in the teaching techniques of the course. Syllabi and notes are posted on a web page; reading quizzes are on the web; good links to web resources are highlighted on the course web page as well. In the introductory biology class, there are no laboratories, but experiences relating to laboratory work are built into the course. The material relates to society via group projects on "biology in the news," in which the students research current topics and provide output in the form of posters, a web page, or another means they devise. They must create something that can serve as an educational resource for the topic.
Nancy has begun to assess the effects of these computer enhancements to her courses. She now has the results of two student surveys from the introductory course. Of those who responded (38 students), 95% created a web page. The reasons given were: 68% easy, fast; 22% reaches a wider audience; 13% more effective presentation; other reasons: wanted to learn HTML, would get better grades. A few said that they were 'sick of web pages.' With regard to the syllabus, 85% said they preferred it on the web so they would not lose it. Asked if the projects reinforced the lectures, 40% said yes, 30% thought they did somewhat, and 30% thought not. Asked if the projects related to current issues in society, 66% said yes. Nancy was asked whether HMC provides assistance to students who are creating web pages. She said no, they expect them to figure it out, but she provides assistance if it is clearly needed.
Newton Copp described the development of a computer-based course in Human Physiology. Before the redesign, the course used kymographs and strip charts to present one- to two-week-long laboratories, with new equipment and experimental systems given every week. The problems with this system were:
- The students were learning equipment, not biology.
- The students collected data, then went home to prepare the report. If questions arose, there was no one to ask about data interpretation.
- Short laboratories gave little chance for students to develop their scientific abilities to ask and answer questions.
In 1991, Newton received an ILI grant from NSF to improve these laboratories. The general design was to introduce a topic, equipment, experimental design, and possible questions in the first week; have the students collect data to answer a question in the second week; and analyze and interpret the data and draft a report in the third week. The laboratory and lecture topics were decoupled to provide long enough blocks for these experiences. Computers were used to encourage and facilitate exploration, increase the variety of questions addressed, increase the speed and sophistication of analysis, better display the results, and stabilize the equipment. Newton provided examples of exercises that had been done by his students in this course. He told us that students could address questions such as signal-to-noise ratio or alternative displays to give the clearest interpretation of data. The most positive outcomes have been increased sizes of samples, increased variety and speed of analysis, better graphs, and a general shift of focus to the physiological question being investigated. But the need to deal with equipment did not really recede. Students struggle with how to save files, open windows, and analyze data while collecting data. The capabilities of the system are often underutilized by the students, and the savings realized, amounting to about half the course budget, were not banked to allow equipment and software replacement when needed.
Howard Towner teaches an introductory biology class concerning population biology, evolution, and ecology to about 160 students each year. In 1994, he received an ILI grant from NSF to create a laboratory with 11 computers to support the students' projects, presentations, and graphing. The students are encouraged to use the simulations and data analysis packages on their own. Howard described the evolution of the computers and software used by biologists in the 1980s and 1990s, culminating in the web resources available today. In 1998, he has added information about plants and about the Ballona Wetlands to the page. He uses a camcorder to capture images of wildlife for the page. He also recommended other sites, such as the Marine Biology site run by California Polytechnic University at San Luis Obispo. To Howard, the great flexibility of access to the web by people with a variety of platforms is one of its greatest strengths, compared to the great variety of other computer-based systems he has used. Howard identified several hurdles for those wishing to author such sites. It requires access to a server on which the information can be stored. One must also acquire original materials or pay attention to copyrights. Material must be put into files with appropriate formats that are accessible across platforms (for example, GIF files). The author must also spend a good deal of time to learn about software and hardware, and it is not clear that this endeavor will be favorably received by faculty personnel committees.
Beth Braker described the use of computers in field work in ecology and evolution. She told us that students perceive field biology as a "non-instrumented science" using clipboards and pencils. Her experience is that data collection is facilitated a great deal by computers. In ecology, research efforts depend on data acquisition, storage, and analysis. These endeavors are facilitated by the accuracy and efficiency of computer methods. Beth uses dataloggers for studying the forest canopy, useful in studying photosynthetic rates and nutrient flow patterns. The GIS systems available today enable investigators and their students to create accurate maps of resource placement. Image analysis is also becoming important in ecology today. It can be used to count pollen grains on a stigma, obtain population density of animals, analyze shapes (such as leaf shape as an index of predation), and measure areas (for example, that of fungal blotches on leaves). Beth described how she and her students used image analysis of leaf area in a native tropical tree plantation to examine whether or not such a plantation was more susceptible to herbivores and natural enemies. They obtain a video image of each leaf and analyze it. Using four such samples improved the variance of the mean (0.19 for the video analysis compared to 0.7 for the hand analysis). It also improved the time per sample (8 minutes compared to 20 minutes). Beth gave an example of a local student project asking how ants affect leaf predation that was accomplished using the same techniques. Results indicated that ants may remove or negatively affect herbivores. Some difficulties with the method involve reaching leaves of interest, keeping the leaves flat, not damaging leaves, reflection from leaves, analysis of compound leaves, and analysis of very large leaves. In the question period, Beth was asked about security and backup for the data.
She has not totally solved this problem, but can download data to a laptop or a PC card and back it up. She was also asked about non-linearity of intensity on videotapes, and had not addressed this problem. She told us her software was ImageTool or NIH Image.
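The core of the leaf-area measurement described above is thresholding a video frame and counting pixels. A minimal sketch of that idea, assuming a grayscale image where leaf tissue is dark; the threshold, the calibration factor, and both function names are illustrative assumptions, not Beth's actual ImageTool/NIH Image workflow:

```python
import numpy as np

def leaf_area(image, threshold=128, mm2_per_pixel=0.01):
    """Estimate leaf area from a grayscale video frame.

    Pixels darker than `threshold` are counted as leaf; the count is
    scaled by a calibration factor (area per pixel) set from a ruler
    or stage micrometer imaged under the same optics.
    """
    mask = np.asarray(image) < threshold   # leaf pixels are dark
    return mask.sum() * mm2_per_pixel      # convert pixel count to mm^2

def damaged_fraction(leaf_mask, hole_mask):
    """Fraction of the original leaf outline removed by herbivory,
    given boolean masks for intact tissue and for eaten regions."""
    total = leaf_mask.sum() + hole_mask.sum()
    return hole_mask.sum() / total
```

Averaging the area over several frames of the same leaf is what improves the variance of the mean relative to a single hand measurement.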
Elizabeth Rega and Stuart Sumida discussed enhancing the teaching and learning of human anatomy via computers. Elizabeth teaches about 60 students at a time; Stuart teaches about 220 at a time. They both use images to enhance the laboratory, such as the Virtual Human. They identify a number of challenges, including inadequate textbook support (e.g., no color pictures of structures), expense, student understanding of 2D to 3D conversion of information, and developing real-life motor skills of dissection. They argued that this subject is not all memorization, but requires a great deal of three-dimensional visualization, understanding of the development of morphology, and understanding of individual variation. Elizabeth highly recommended the site http://visembryo.ucsf.edu where embryonic development can be examined. She uses web lectures with links to important sites and has virtual office hours on the internet. Stuart said that Elizabeth had developed a number of these methods and had shared them with him; he uses them in a different setting. He said that the computer makes us better scientists and researchers and works with the students to develop 2D/3D understanding. He finds the available resources are better than labeled embryo slides. He also uses the Visible Human Male project and the Visible Human Female site, and uses his class web page to point students to those sites. One of the issues of concern to Stuart is variability between humans of the same structures. He is not sure that computer-based methods can currently address this issue adequately. Another concern is that developing such methods is a lot of work, in some ways resembling producing a text but without royalties or academic recognition. Third, Stuart reminded us that errors can creep into the best sources, and presence on the web site of a reliable institution is no guarantee of absolute accuracy.
Fourth, he also finds that 2D/3D mental conversion is hard for his students, though computer methods do improve this visualization. Fifth, he shares Elizabeth's concern about development of motor skills. He felt that these methods make teaching and research very close together, and pointed to the usefulness of the methods of his course to paleontologists.
In the question period, Rich Cardullo said that computer-driven real-life motor skills were becoming important too. For example, students using micromanipulators in his laboratory have benefited from video game experience. But this experience does not help them with phlebotomy. Beth Braker commented that variability is also an issue in ecology, and that science often presents an idealized 'the way it is' view that may contrast with individually observed examples but still reflect an overall tendency. A question about public web pages elicited the comment that putting up a web page on a topic with medical implications, such as human anatomy, leads to a large number of inquiries from the public about their medical problems. It is important to decide how these should be handled. Elizabeth told us that her virtual office hours software was a module of FrontPage. She has found it very easy to use, but with a few bugs.
Discussion of whether or not research and teaching are brought together via computer resources in the life sciences. T.J. Mueller commented that increasing student problem solving, which is facilitated by computer approaches, is much more proximate to research. Nancy Hamlett commented that there are a number of research areas where leading edge researchers post their advance class notes on the web, and these can provide a real current approach to students. She noted that Norm Pace at UCB has provided such a site. Howard Towner raised the issue of intellectual property for such sites, and said that not all researchers are so generous with their new work. T.J. Mueller noted an upcoming symposium in Claremont on intellectual property aspects of the web.
Beth Braker said that she found that teaching techniques useful in research were very attractive to students at an undergraduate college. There was no comparison between student interest in sorting insects versus downloading video images from a computer. Howard Towner agreed, saying that his students can digitize sound spectrograms and use them to analyze research questions; this has enlivened classes as well as research.
Diana Linden raised the question of troubleshooting and student support for students venturing into new computing areas. T.J. Mueller said that infrastructure needs to include support staff, but he has found that the needed level of support is very spotty, especially for web and multimedia methods. Rich Cardullo commented that the best people in these areas are immediately hired by Disney and Warner. Stuart Sumida said that in introductory biology, he is able to ease students into web use by asking a question that links the students to the course web site to prepare for laboratory work, etc. John Fray commented that this was opening students' eyes and educating them. Newt Copp said that this is also direct preparation for research; classroom computing is being used to develop familiarity with tools in their classes and projects. He still questions how best to stimulate curiosity and promote thinking. Web-based information is disorganized, not peer reviewed or moderated, and without feedback to authors or users. Credit for web-based work is a murky area. He thinks more professional societies are beginning to evaluate and review educational/research resource sites, but such sites change frequently, so reviews may be out of date.
Nancy Hamlett noted that students can use computer resources to get into dialogue with major researchers. In one project in her introductory class, the students contacted a major researcher who was developing DNA-based computers, and he allowed them to e-mail him 3 questions that he answered via e-mail. Stuart Sumida compared the impact of computers in the life sciences with that of photography in movement analysis. One could not know if all four of a horse's feet left the ground at once until stop-motion photography was developed. Computers and their diverse and multiplying methodologies are having a similar or even greater impact. Howard Towner pointed out that people depend upon sensory input and that there is no real substitute for animals and plants; the computer is a preview and a tool, not a substitute for reality. Diana Linden noted that the translation from 2D to 3D thinking is helped by computing but really requires analysis of real organisms. Wayne Steinmetz noted that once we weighed the peaks from gas chromatograms to integrate them, but now we have much more sophisticated computer methods. He advised us to expect improvements in computing, not judge what is possible based on today's technology.
Bruce Telzer said that as educators, professors need to provide on-the-job training in thinking to the students; he finds computers useful in focusing on thinking rather than repetitive task performance. In some cases, such as in a graphing program, students may accept the computer's line when they know it is absurd. He asks students to make a hand-drawn graph and provide an approximate answer before leaving the laboratory, in order to combat their lack of trust in their own observations. Howard Towner said that specific applications could enhance students' critical thinking. In some courses, he uses a PC generic problem-solving program. T.J. Mueller asked Newt Copp whether his students were thinking more about experimental design or trying random experiments. Newt said that they can test ideas quickly, but becoming critical of ideas is hard to obtain from the students. They tend to stop the analysis as soon as one fit is obtained, rather than critiquing the fit. Diana Linden described a program that models the squid giant axon and allows students to change all the parameters and then get data. This program does not help, of course, with real axons that give data unlike the characteristic waveforms. Nancy Hamlett said that it was good to create experiments in which a large number of students could participate. She prefers for them to get a real experience rather than a virtual research experience.
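The habit described above, accepting the computer's first fit rather than critiquing it, can be countered by making goodness-of-fit explicit. A minimal sketch of that idea, not tied to any particular classroom package; the function name and the choice of R-squared as the diagnostic are illustrative assumptions:

```python
import numpy as np

def fit_and_critique(x, y):
    """Fit a straight line and report R^2 alongside the coefficients.

    A high R^2 alone does not prove a model is right, but a low one is
    a clear warning that the fitted line should not be accepted at
    face value without looking at the residuals.
    """
    slope, intercept = np.polyfit(x, y, 1)
    predicted = slope * x + intercept
    ss_res = np.sum((y - predicted) ** 2)      # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)     # total sum of squares
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared
```

Asking students to compare the reported R^2 against their hand-drawn estimate is one way to connect the computed fit back to their own observations.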
John Fray asked what could be measured in order to see the answer to the question, does computing bring research and teaching together, in the past and the present? Does integration of research and teaching via computing improve education? Will we create long term learners in this way? Newt Copp said that interdisciplinary work and ability to work across boundaries could be examined; he aims to produce undergraduates who can bridge gaps between fields. Wayne Steinmetz advocated for examination of Nancy Hamlett's teaching methods and the student outcomes achieved; he believes empowering students is more important by far than assimilation of ever-larger texts. Rich Cardullo said that it is a challenge to teach well in spite of excessive information; achieving depth of learning requires good designs. We need to assess how well our teaching methods encourage students to question.
John Fray then asked how long a professor can track a student. Three years was suggested by Nancy Hamlett, but others thought some examination of alumni/ae could be possible as well. John suggested that hands-on learning, moving back and forth between concrete and abstract, was an important skill to assess, as well as open-endedness allowing for student initiative and ownership. Several people agreed with these ideas. Wayne Steinmetz suggested transcripts looking at types of courses selected before and after certain courses might be useful. Stuart Sumida brought up the problem of professor evaluation systems, and thought the kind of intensive teaching strategies being discussed might result in denial of tenure. John Fray asked how such perceptions might be changed. Stuart thought one could cluster junior people so they could support each other; others suggested tracking tenure processes for creative teachers. They also suggested assessing student learning via subsequent selection of courses, career choices, attitudes towards science, and self-reported problem-solving skills 5-10 years after graduation. John Fray noted that science literacy is important as well, but it is currently not tracked. Nancy Hamlett said that since Harvey Mudd established a biology major very recently, they are able to track all of their alumni/ae. There are only 5 years of data as yet; alumni/ae are sent by e-mail a group of open-ended questions, and 2/3 respond. They ask about what in their education served them well or was important in their careers, strengths, areas to improve, and any messages they would like to give to prospective students. They are particularly positive about writing-intensive courses and research experiences so far.
Speakers on infrastructure concerns: Arieh Warshel, University of Southern California; Steve Mayo, California Institute of Technology; Fred Lee, Pomona College; Rich Cardullo, University of California, Riverside; Bruce Telzer, Pomona College; T.J. Mueller, consultant (formerly Harvey Mudd College); Diana Card Linden, Occidental College
Arieh Warshel began the afternoon session with a talk on understanding enzyme actions through modeling. The protein structure is able to tell a scientist where certain things may take place, but not the energetics or how the reaction works. These features cannot be understood without computers. Using them, the researcher can approach complexity, build models and check them against known parameters, and thus discern what is happening functionally. He gave an example of the Ras p21 system and its complex interactions with GTP and GDP, an important control point for carcinogenesis. X-ray crystallographic data indicated that Gln61 in the protein could possibly act as a base in the attack on GTP to form GDP, but further analysis using modeling showed that this model was untenable. Additional examples of the application of this method were given. In the question period, Arieh told us that he began this type of work on simple PCs, but has now moved to SGI workstations. Asked how to avoid the kinds of mistakes in modeling that he had described, he said, "keep an open mind, model, and aspire to get the actually measured numbers."
Steve Mayo then explained his combinatoric approach to protein design, a very interdisciplinary field using biology, chemistry, computer science, and other fields as contributors. He is looking for insights into protein folding that will facilitate biotechnology for the next century, enabling scientists to design proteins that will fold in particular, known ways. First, the backbone position must be designed; then, the sequence of amino acids must be selected to get folding and functionality. His method involves consideration of the hydrophobicity of the core residues; boundary region design features such as burial of residues, packing, and electrostatics; and surface features such as hydrophilic groups. He uses a design automation cycle. First, he analyzes, then designs, then simulates the desired result using computers. Then, he synthesizes the apparently optimal molecule and examines its real properties. Information from these experiments forms the basis of the next analysis, beginning the cycle again. He explained the enormously complex system and the simplifying assumptions he makes for his calculations. Then he presented results in the design of a modified region from a zinc-finger transcription factor. The design for FSD-1 enabled synthesis of a domain that folded in close approximation of the real domain but was not homologous to the original sequence. In the question period, Steve told us that he is sure that he has found the optimal solution within the assumptions about rotamer libraries, etc. that he had made. Another question focused upon thermal stability. Steve said that hyperthermophilic designs are possible to make.
Fred Lee then discussed bringing to biochemistry and molecular biology students the use of the interdisciplinary tools of computer molecular modeling. The connection between structure and properties is his focus, and his courses are designed based upon the assumption that the best way to equip students is via hands-on experience. He builds up their basic skills in Unix and in using internet information resources. Then, the students undertake problems that can be tackled using a Unix cluster and the computational biology server that are available for student use. He has built web pages that collect web resources and enhance self-learning on the part of his students. Tutorials on Unix, protein design editors, compilers, and on-line help are on his web page. He has the students send in their papers via FTP. They learn to use visualization software and to plot phi/psi maps and contact maps for residues. He wants them to do semiempirical/ab initio calculations in solution, not in the gas phase, and to look at protein-ligand interactions, mechanisms, and computer-based protein design. So far, he has found that the students need more programming and physical chemistry skills in order to do this successfully. One way this situation could be improved is to develop a computer science track for biology majors. Another way is to increase the interest of non-biology majors in this interdisciplinary field. In the question period, Fred was asked whether his field is an appropriate track for undergraduates in biology, or whether biology should provide broad training rather than such a focus. He responded that this direction would not be for all biology students, but would be for students who have completed physical chemistry by junior year. He feels strongly that students who want to go on in this area should have the option available.
Fred reminded us that NMR, UV/visible spectrophotometry, analysis of data, and writing of papers are all now computer-linked; students are comfortable with many aspects of computer use. However, computer languages such as Pascal, C, and Visual Basic would be helpful for today's science students. Wayne Steinmetz commented that 15 years ago, a task force recommended that every student take a course with extensive computer methods. Today, we no longer have this requirement, but students are required to master statistics and analytical reasoning. Fred thought that some computer language work could be included in laboratory work for introductory biology and chemistry students. Chris Rohlman noted that the liberal arts approach to science has led to students with all different backgrounds in science courses. Steve Mayo told us that his graduate students originate from chemistry, physics, and biology programs and all work together on the same projects.
Rich Cardullo then talked about image processing in cell biology. He raised the question: how much do we expect students to understand the black boxes? Rich described the confocal microscopes that optically section the specimen under study. He classified the uses of computing in image analysis into two main functions. First, contrast enhancement can be done via computing. The contrast can be enhanced temporally using such methods as signal averaging. It can also be done spatially, using various filters and convolution kernels. The second use is in quantitative measurement. Morphometry is one direction that such studies can take. Another is the study of dynamic behavior of molecules, for example on cell surfaces. He presented examples of the analysis of the interaction of the egg zona pellucida and receptors on the surface of sperm, an interaction that provides species specificity in fertilization. He also described Xenopus egg activation. The egg is loaded with a calcium-sensitive dye and observed every thirty seconds. A wave of calcium ion is detected that moves from the fertilization pole of the egg to the opposite point on the egg surface over a five-minute period. Stimulation of the hEGF receptor by adding Epidermal Growth Factor resulted in pulses of calcium ion release in eggs, rather than waves as produced by fertilization. Another example Rich gave was of studying quantitative molecular motions on cells via Fluorescence Recovery After Photobleaching (FRAP). This technique uses Nanovid microscopy and fluorescence correlation spectroscopy. It can be used to follow movement of molecules in three dimensions using VIFRAP, leading to the conclusion that lipids in membranes have 100% mobility. In the question period, Wayne Steinmetz asked how one might best prepare undergraduate biology majors to use these techniques.
Rich told us that they are generally ready by junior year, but that he also finds some physics majors can participate in these experiments. His groups are composed of students with different backgrounds, since his work is very interdisciplinary. He pointed out that one problem is that few students arrive today from high school having taken biology, chemistry, and physics courses. He finds almost none with chemistry or physics, and believes this trend is a real problem.
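The two contrast-enhancement routes Rich distinguished, temporal signal averaging and spatial filtering with kernels, can be sketched as follows. This is a generic illustration rather than his actual software, and the sharpening kernel shown is one common choice, not necessarily the one he used:

```python
import numpy as np

def signal_average(frames):
    """Temporal contrast enhancement: average N noisy frames of the
    same scene. Uncorrelated noise falls roughly as 1/sqrt(N)."""
    return np.mean(np.stack(frames), axis=0)

def apply_kernel(image, kernel):
    """Spatial filtering: slide a small kernel over the image and sum
    the weighted neighborhood at each position (no padding, so the
    output is slightly smaller). For symmetric kernels this is the
    same as convolution."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A common 3x3 sharpening kernel: boosts each pixel relative to its
# four neighbors while leaving uniform regions unchanged (weights sum to 1).
SHARPEN = np.array([[ 0., -1.,  0.],
                    [-1.,  5., -1.],
                    [ 0., -1.,  0.]])
```

In practice one averages first to suppress temporal noise, then applies spatial filters to the averaged frame before making quantitative measurements.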
Bruce Telzer described building a cell biology laboratory around computer-assisted data acquisition and analysis. His goals are to introduce image processing, allow universal access to class data by all students, and save time. Some examples of his uses of computers are: capture of EM negatives, light microscopic images, gels, blots, etc.; developing print images from video images via software and LaserJet printers; scanning for quantitation of gels and blots; statistical and graphical analysis; and storing of data files on servers. He uses a computer with a framegrabber, the Image1 program, a microscope with a video camera allowing high-intensity and low-intensity work, power supplies for lamps, fiber optic cables for high-resolution imaging, and an Argus 10 image processor. Students use Image1 and a computer to acquire raw images from microscopy, from a VCR, or from black-and-white cameras. Bruce told us that students could look at an axoneme in sperm or cilia and see microtubule assembly using light microscopy images, although these events are ten-fold below the resolution of the microscope. Also, this type of analysis takes minutes, not hours, and one sees a live image, as compared to what must be done to observe using an electron microscope. The image is sent to the Argus 10 to subtract background. Then frame averaging of the image is used to decrease the effects of Brownian motion. Bruce showed us clear images in which his students had been able to record assembly of microtubules over time using this technique. Current students engage in a five-week analysis of red blood cell membrane proteins during which he is fairly directive. Then, he has them select and study a question of their own choice in cell biology, for which he is a resource for the remainder of the semester. Examples of such projects he gave were to purify microtubules from brain, to recombine dynein and outer fibers, to reactivate axonemal microtubule sliding,
to assess microtubule protein ratios via Western blots or via immunoprecipitation, and to examine microtubules in cellular cytoskeletons via immunofluorescence. He recommended the use of laptops with PC cards that enable network access; his students use them as digital laboratory notebooks and can also network with other students in the laboratory.
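The two enhancement steps described above, background subtraction followed by frame averaging, can be sketched in software. This is a generic illustration of those two operations under the stated assumptions (a stored background frame, grouped averaging), not the actual Argus 10/Image1 pipeline:

```python
import numpy as np

def enhance(frames, background, n_average=8):
    """Subtract a stored background frame from each raw video frame,
    then average successive groups of frames so that random
    frame-to-frame fluctuations (e.g., Brownian jitter) tend to cancel
    while stationary structures are reinforced."""
    cleaned = [np.asarray(f, dtype=float) - background for f in frames]
    groups = [cleaned[i:i + n_average]
              for i in range(0, len(cleaned), n_average)]
    return [np.mean(np.stack(g), axis=0) for g in groups]
```

Each returned frame represents one averaged time point, so a sequence of them records slow processes such as microtubule assembly.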
T.J. Mueller described a neurobiology course he taught in the fall of 1994, involving active, student-driven learning that he calls "lecture-free learning". He noted that such methods do not decrease the professor's workload, but do create a learning environment that engages the students with the subject. He compared this computer-based course favorably with the neurobiology course he had taught for 13 years at Harvey Mudd College using lectures. T.J. put the course material onto multimedia machines and also let the students put material there. On the first day of class, T.J. gave an overview of the research issues, the methods and what they are used for, etc. The first assignment given was to look in the newspaper and think of a question you'd like answered by the class. From this question, a research project was developed, then a story was produced, put into multimedia format, and presented to peers. Examples of such questions were given: Why am I colorblind? My grandmother has Alzheimer's Disease. What has happened to her brain? What is motor memory? How does the brain work? The latter question was considered too broad; T.J. had to work with some students to obtain a question that could be addressed by this method. He gave the students information about library research. After considering the question each student had raised, T.J. assigned the students into groups with related topics, such as the vision team, the motor team, the memory team, etc. Each team worked together to develop the understanding necessary to answer their questions. A first presentation from each team was scheduled in mid-semester, allowing feedback to the teams. They then improved their presentations for the final peer and instructor evaluation. Class time was used to follow team progress, followed by discussion. After a time, the students asked for lectures on five particular topics and followed them with great interest.
A problem with the course was the first-generation ILS software used, which was very buggy, but the software for such a course is now much better. In the question period, T.J. was asked to compare this experience with that of writing a term paper. It is similar in providing depth rather than breadth, but real-world stories, the emphasis on being educational, engaging, and convincing to peers, the use of the more engaging computer/multimedia materials, and pseudoexperimentation set it apart. There was no wet laboratory with this course, although T.J. thought a wet lab would be an asset to such a course. There were 20 students in this course.
Diana Card Linden described taking a neurobiology course from a lecture-only class to a laboratory course using intracellular recording. She had an ILI grant in 1990-1992 to develop this course. The goals were to promote students' understanding of elusive topics in biophysics and to have them understand intracellular recording techniques. Included were early laboratories involving repetition of tasks so that students learned to set up, troubleshoot, and carry out experiments on intracellular recording from amphibian nerves. At the end of the semester, teams carried out projects that they designed based on the earlier experiences in the laboratory. The software used was MacLab; at the time, there were few systems and this was the easiest to use. The intracellular recording apparatus was hard-wired to the modem port for recording potential differences. The program contains sophisticated routines for analysis of data. Students maintained their data on a diskette that they could take home to analyze further via spreadsheets, etc. In addition to collecting their own data, Diana provides an action potential program, a sheep brain program, and a neurosimulation program on neural networks for exploration by the students. She gave several examples of projects the students had performed. One involved physostigmine effects on spontaneous synaptic potentials. Another examined the effect of 2 mM extracellular Rb+, K+, or Li+ on Vrest. The system she uses works reasonably well, but the Macintoshes need to be updated and more memory is needed. Strong points of the laboratory work included the development of student expertise with both the equipment and the dissection, the open-ended experiments that students could "own", and the increased excitement they have about neurobiology. Diana insists that the students present all of their data in a research-style report.
On evaluations, the students say they understand the concepts and that the best part of the course is the final project. In the discussion, Karen Parfitt suggested invertebrate preparations and recommended "the crawdad workshop" at Cornell University. Diana said that the amphibian preparations were not very hard to do, and only take five minutes when you are skilled; they just required the students to practice to develop their skills. Diana told us she groups the students into pairs and that the juniors and seniors take this course. She worries because she is unlikely to get another ILI grant to replace her old equipment, and Occidental does not support or replace Macintoshes. She is considering applying to ILI to develop laboratories on voltage clamping.
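One analysis students might run on exported recording data is detecting action potentials by threshold crossing. The sketch below is a generic illustration of that idea, not a MacLab routine; the `detect_spikes` function, the threshold value, and the synthetic trace are all assumptions made for demonstration.

```python
import numpy as np

def detect_spikes(trace, dt, threshold=-20.0):
    """Return the times (s) at which the membrane potential crosses
    the threshold in the upward direction.

    trace: membrane potential samples in mV
    dt: sampling interval in seconds
    threshold: crossing level in mV (illustrative; adjust per preparation)
    """
    above = trace >= threshold
    # A spike onset is a sample below threshold followed by one at/above it.
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return crossings * dt

# Synthetic trace: resting at -70 mV with two brief depolarizations to +20 mV.
dt = 1e-4                        # 10 kHz sampling
trace = np.full(5000, -70.0)
trace[1000:1020] = 20.0
trace[3000:3020] = 20.0

spike_times = detect_spikes(trace, dt)
print(spike_times)               # upward crossings at 0.1 s and 0.3 s
```

From the crossing times one can go on to compute firing rates or interspike intervals, the sort of follow-up analysis the students did in spreadsheets.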
Discussion of infrastructure needs that arise from increased use of computing in the life sciences, particularly the role of NSF in addressing such needs.
T.J. Mueller said that the Mellon grant that is supporting computer and multimedia based learning at the Claremont Colleges does not aim to provide basic equipment, but just to allow models and breakthroughs. He thought NSF is pursuing a similar strategy, and thought NSF might put pressure on institutions not just to match but to continue the programs. Others thought that institutions are currently pouring funds into this area and may not be able to keep up with demand. Bruce Telzer said that 5 years of life for a computer system is long, and plans must be made for replacements. T.J. said that personnel (computer support people) must be increased to allow faculty to use computing optimally. Asking faculty to add learning multimedia or computer data acquisition programs to their expertise needs support; otherwise faculty will not do it. He also suggested that courses and/or departments should develop a way to bank budget dollars so that computers and software could be replaced when needed.
Bruce Telzer noted that when a centrifuge is purchased, it lasts 10-20 years, and one can ask for a replacement on a research grant or a teaching grant after a long period of use. With the short lifespan of a computer, replacements may be needed within the time of one research grant. He thinks the equivalent of a service contract for an HPLC might be to simply lease computers and always include the lease cost on every grant proposal. Elizabeth Rega raised the issue of discrepancies between disciplines. Institutions may replace computers for science or mathematics but not for English or Art, leading to resentments. Stuart Sumida said that ways are needed to integrate curricula, for example through a core program, so that all can be supported. But standing budget line items for computer replacement are needed and are almost never there.
John Fray noted that it would be valuable to NSF to discuss short-term versus long-term approaches to computing. He suggested, for example, that a novel idea might enable people to apply for a new ILI grant. NSF might also consider developing mechanisms to support long-term, multi-user computer facilities. Stuart Sumida reiterated the severe lack of support personnel faced by professors at his institution. He thought perhaps the NSF should not grant equipment without a guarantee of support personnel from the institution. A good model for funding would be for NSF to provide, or require as part of the match, 1 or 2 years of personnel support and funding for replacement of the computers in year 5.
John Fray then emphasized that the NSF has moved away from its view that undergraduate institutions focus just on education. They now see such institutions taking on a major role in the integration of research and education. The two new review criteria used for all proposals to the NSF enable reviewers to focus on whether research results are being disseminated and whether education is being provided in some context. The National Science Board, NSF's oversight body, has recommended that NSF move grant lengths from about three years to a 4-5 year span on the average. There is a great deal of new growth in life sciences using computing, for example in genomics, that John thinks offers an opening for innovative proposals. The community at this workshop would be a great locus for emphasis on research/education interfaces as well as life sciences computing. He recommended that proposals to the KDI initiative be seriously considered. He told the group that computer upgrades could be proposed in the budget, but institutions would need to identify their potential effects as a model to other colleges and universities.
Stuart Sumida raised a concern about the development of Internet II and the commitment of his institution to Internet I. John Fray said that NSF's support for Internet I has ended; the wave of the future is Internet II. Wayne Steinmetz noted that access to scientific library information is very important; soon, paper journals will be a thing of the past and electronic libraries will be the norm. John Fray said that proposals relating to this need could be considered by the Biological Infrastructure Division if a paradigm could be developed to integrate the proposal with research/education interfacing. Diana Linden and Wayne Steinmetz told the group that budget cuts now make electronic approaches to the literature essential for advanced classes. Elizabeth Rega noted that temporary faculty are present in large numbers, and often get minimal computer support.
Jon Strauss brought up the pressure from society on higher education to improve productivity. John Fray suggested that educational institutions could be transformed via computing into research colleges. Jon Strauss said that Harvey Mudd and the other Claremont Colleges have no desire to let go of their educational goals; instead, they want to provide cutting-edge research opportunities while still interfacing research and teaching. John Fray said that great opportunities exist at NSF for developing uses of computing in life sciences at the research/teaching interface, and colleges like these must take the lead in preparing proposals of this kind. Stuart Sumida noted that consortia are very important; state universities, community colleges, etc. can establish formal or informal linkages to facilitate such efforts via faculty training and shared resources. He thought that faculty release time was not as effective as sabbatical support in getting faculty up to speed on these techniques. T.J. Mueller suggested a mechanism such as the Mellon grant one, with an umbrella grant and small subgrants provided to faculty in a consortium. John Fray told us that this type of mechanism is very hard for NSF to use.
After-dinner lecture: David Laidlaw, California Institute of Technology. "Impressionistic Image Analysis".
David Laidlaw described to the group a way to use multilayered images to provide a great deal of integrated information about what is going on in a cell or a tissue. His technique uses bioinformatics, neuroinformatics, and Diffusion Tensor Imaging, and summarizes diffusion information with ellipsoids. The images produced can be viewed in 3-D using special glasses. Rotation of the structure being studied is followed via video. The images produced can enable the investigator to view 6 coded values at once in their different locations within a structure. For example, a view of the spinal cord showed, in 'underpainting', the basic shape with a grid for the resolution of the data. Next were ellipses showing anisotropy or spheres showing isotropy. The orientation of the ellipses shows the direction of the fastest diffusion in the plane. A reddish tint is used to show extension of the ellipses or spheres out of the plane. Opacity and striping can be used to indicate other variables. The results resemble paintings by Van Gogh, but convey an enormous amount of data. Comparisons of spinal cords of normal people with those having Multiple Sclerosis revealed a number of changes, for example much faster diffusion in the diseased areas. There is also a subtle change in the gray matter structure that can be seen. David is now applying this technique to mouse mutant strains such as shiverer, a genetic mutant lacking adequate myelination of nerves.
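The ellipsoid summary of a diffusion tensor rests on standard linear algebra: the tensor's eigenvectors give the ellipsoid's axes, its eigenvalues give their lengths, and a scalar anisotropy index distinguishes sphere-like from elongated diffusion. The sketch below illustrates that math generically; it is not David's software, and the `tensor_glyph` function and example tensors are invented for demonstration.

```python
import numpy as np

def tensor_glyph(D):
    """Summarize a 3x3 symmetric diffusion tensor as an ellipsoid.

    Returns (axes, directions, fa): semi-axis lengths (eigenvalues),
    their orientations (eigenvectors, one per column), and fractional
    anisotropy, a 0-to-1 index that is 0 for a sphere (isotropic
    diffusion) and approaches 1 for a needle (highly anisotropic).
    """
    evals, evecs = np.linalg.eigh(D)              # D is symmetric
    md = evals.mean()                              # mean diffusivity
    fa = np.sqrt(1.5 * ((evals - md) ** 2).sum() / (evals ** 2).sum())
    return evals, evecs, fa

# Isotropic voxel: equal diffusion in all directions -> sphere, FA = 0.
_, _, fa_iso = tensor_glyph(np.eye(3) * 0.7)

# Anisotropic voxel: fast diffusion along x, as inside a myelinated tract.
axes, dirs, fa_fiber = tensor_glyph(np.diag([1.7, 0.3, 0.3]))
print(fa_iso, fa_fiber)      # 0 for the sphere, a value near 0.8 for the fiber
```

The largest eigenvalue's eigenvector (here, the x axis) is the direction of fastest diffusion, which is what the orientation of the painted ellipses encodes.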
Those in attendance at the NSF Conference on Computing in the Life Sciences are listed below:
John Fray, NSF
Laura Hoopes, Pomona College
Nancy Hamlett, Harvey Mudd College
Jack Mueller, consultant
Wayne Steinmetz, Pomona College
Howard Towner, Loyola Marymount University
Lynne Mizuno, Glendale College
Richard Cardullo, University of California, Riverside
Bruce Telzer, Pomona College
Elizabeth Braker, Occidental College
Diana Linden, Occidental College
Arieh Warshel, University of Southern California
Chris Rohlman, Pomona College
Stephen Mayo, California Institute of Technology
Julie Archer, California Institute of Technology
T.J. Mueller, consultant, RR1, RFD 317, Little Deer Isle, ME 04650
(formerly Harvey Mudd College)
Eugene Wu, Harvey Mudd College
Jack Strauss, Harvey Mudd College
David Laidlaw, California Institute of Technology
Stuart Sumida, California State University, San Bernardino
Fred Lee, Pomona College
Karen Parfitt, Pomona College
Elizabeth Rega, Joint Sciences for Pitzer, Claremont McKenna,
and Scripps College
Jon Strauss, President, Harvey Mudd College