Professional Society Presentations

Mary Funke, American Chemical Society
Nick Claudy, American Geological Institute
Mike Neuschatz, American Institute of Physics
James Maxwell, American Mathematical Society/The Mathematical Association of America
Jessica Kohout, American Psychological Association
Carla Howery, American Sociological Association
Herb Maisel, Computing Research Association and the Association for Computing Machinery
Richard Ellis, Engineering Workforce Commission of the American Association of Engineering Societies
Howard Garrison, Federation of American Societies for Experimental Biology
Vin O'Neill, Institute of Electrical and Electronics Engineers, Inc.
Alan Fechter, Catherine Gaddy, and Eleanor Babco, Commission on Professionals in Science and Technology

Mary Funke, American Chemical Society

I'd like to introduce two of my colleagues: Corinne Arasco, who's a senior research associate from my office; and Joan Burrelli, who left ACS about four months ago to join the NSF staff. And most of the studies that I'm going to be talking about today were actually conducted by Joan.

The ACS Department of Career Services exists to enhance the professional and economic status of chemists and chemical engineers, and to that end we provide one-on-one services, direct contact with employers, and information on trends and issues in the workplace to assist members in career decision-making. And, of course, that last item -- information on employment data, trends, and issues that help chemists in career decision-making -- is what we're here to talk about today.

I've brought copies of our brochure that outlines all of our programs, including all of our workforce studies and publications. I, of course, can't cover them all in 10 minutes, but I'm sure you'll take a copy and feel free to contact us back at the office if you would like more information than what we're covering here today.

The ACS conducts two annual studies: the annual salary survey and the starting salary survey. These are the two that you're probably most familiar with. The objective of both studies is to collect and analyze data on salaries and occupational status of chemists and chemical engineers. The salary survey covers experienced chemists and chemical engineers, and the starting salary survey covers new graduates. All degree areas are covered in both the salary survey and the starting salary survey.

Who are we surveying? In the salary survey, we are surveying our members. We have slightly over 150,000 members, and every five years (including 1995), we survey all of our members under the age of 70 who have full and associate status in the organization and mailing labels indicating they live in the U.S. This year that translates to 93,000 members. We start to collect our data in February. We measure salaries as of March 1. We have an internal cutoff date for the data in May, and we produce the report in July or August.

Out of that five-year report, we're able to look at small minority groups, e.g., we produce a very large report on women chemists, and a report on chemists with disabilities. We can also produce reports on other minority groups or special interest groups. For example, we're able to break out the Asian population.

In the intervening years between the five-year surveys, we measure a smaller number of our members (about 20,000), usually targeting a special project area. For example, last year we looked at fringe benefits in the chemical industry and in academe. In 1993, we looked at health and safety. In 1992, we didn't look at a special focus group. And then in 1991 we looked at duties and responsibilities of chemists and chemical engineers.

We compare salaries against factors such as years since the B.S., work specialty, work function, type of employment, sex, and degree area. We also look at employment statistics: unemployment rates, employment by work specialty and work function, et cetera.

For the starting salary survey, we start to collect our data in July. Departments that have ACS-approved programs generate lists of students for us, and that's the target of the starting salary survey. We survey graduates from September of the previous year through July of the present year. For 1994, that translated to about 8,000 chemists.

Every year we also do some sort of special study. This year we are completing a survey of doctoral mentors to obtain information about the current employment of Ph.D. graduates. Last year we did a piece on current trends in chemical technology, business, and employment; the year before that (1993), retired chemical professionals; and in 1992, domestic status and discrimination and career opportunities for men and women chemists. I'm just going to address the first two because of the time constraints.

The Ph.D. study was designed to look at two factors. Everyone, of course, is very concerned about what is really happening with post-docs and temporary employment of new Ph.D.s, and that's what this study was designed to do: to look at what's really happening with the pipeline for post-docs and what's really happening with temporary employment of American versus foreign-born scientists.

The other objective was to determine the quality of programs, because in chemistry, as in other scientific fields, many are concerned about an oversupply of new chemistry Ph.D.s. There is a premise that there is an oversupply and perhaps we should be influencing programs that are not strong to stop producing. There was also a premise that if you had an overabundance of doctoral students or people in temporary employment for long periods of time, that meant your program wasn't strong.

You can, of course, argue whether the premises of the study were valid. But guess what? We found no difference between the top 25 percent and the bottom 25 percent of producers of Ph.D. chemists in terms of how long their graduates had been in post-docs.

We've reached the deadline for gathering information on this, and we're in the process of writing the report. It's actually being written by Dave Montgomery from the University of Maryland. He's a physicist by academic training.

We will have a final report in June.

The interesting thing is that this report was not intended to be published, but the data is so strong, we're going to ask that it be published.

We targeted chemistry faculty from the ACS Directory of Graduate Research, which consisted of about 4,500 faculty members. We sent them this survey, which, when we do it again, will look different. But we did send them this.

Interestingly enough, we have received back about 30 percent of the surveys, but they accounted for 62 percent of the doctoral students that we were trying to measure in this time frame. So we're incredibly pleased with that, and we feel that it's very, very strong data.

I'm not going to stand up here showing you table after table, but I do want to show you two tables that we found extremely interesting. This is the current status table. This is the employment status for U.S. citizens by year of graduation. What this is saying is that 7.9 percent of the U.S. citizens that graduated in 1988 are still in post-docs and 3.3 percent are in temporary positions. (See Exhibit 1)

A lot of things could have happened between 1988 and 1995. We are leaning towards thinking these graduates have been in temporary positions since that graduation date. That's hard to tell from this, but assuming that is the case, you're looking at over 10 percent of that population.

If you look at those in this country on temporary visas, you see that 16.7 percent of 1988 graduates are in post-docs in 1995 and 5.1 percent are in temporary positions. This is pretty startling data.

The other special study I wanted to discuss briefly was the one on current trends. Current trends is a qualitative study, but to me it is a must-read. This is the last great work done by Dr. Joan Burrelli before she left ACS. It is truly a must-read for people in all degree fields, at all degree levels, and at all levels of experience, because it covers trends in the chemical technology, business, and employment of chemical professionals. It talks about growth areas and decline areas. It talks about federal policies affecting R&D funding and about what employers are looking for. While it is qualitative, it's excellently researched and excellently written, and there is no other publication like it right now.

We did telephone interviews with 114 people representing 100 different organizations -- large and small industrial and academic organizations and government agencies. We also did library research and looked at corporate annual reports.

To summarize, we do our two annual surveys -- the starting salary survey and the salary survey. We think that both the salary and employment data coming from those two surveys are strong and consistent and should be and are utilized by young Ph.D.s in all degree areas to make some determinations about their career directions. We also do special studies, some of which are likely to be repeated. In particular, we know that we'll be doing current trends every five years.

That concludes my remarks. I'll be glad to try to answer any questions.

SUTER: Is your survey of your membership, or does it extend beyond your membership?

FUNKE: No; just of the members. Under the age of 70, full and associate, and on the five-year survey it's all of them, and on the intervening years it's about 25 percent of them.

FECHTER: The survey of mentors, though, was not only of members, was it?

FUNKE: No, I'm sorry. I thought you were referring to the salary surveys. No, the survey of mentors went to graduate faculty of chemistry departments listed in our Directory of Graduate Research -- approximately 4,500 faculty members in the U.S.

SPAR: Putting the foreign-born question on that questionnaire intuitively says to me it's going to lower your response rate. Have you done any kind of testing? Have you split out whether or not it's affecting the types of responses?

FUNKE: We're splitting out several tables according to U.S. versus foreign-born. We have not made a judgment about whether the higher foreign-born numbers at some institutions affected the response rate.

KOHOUT: What is the capture rate of the ACS? How many chemists are members of ACS?

FUNKE: That's a good question, and, Joan or Corinne, you might be able to answer it. There are three different measurements of chemists -- of how you define who is a chemist.

BURRELLI: Well, relying on old NSF data, it's somewhere around 45 percent. While members are about 45 percent of the chemists, it's much higher for the Ph.D. population and much lower for the bachelor's level chemists.

FECHTER: To follow up on that, has anybody done any analysis of whether the characteristics of the members differ significantly from the characteristics of the non-members?

BURRELLI: In terms of degree, yes. And, of course, degree affects function, in the sense that B.S. chemists do different functions in industry than Ph.D. chemists, they--

FECHTER: Within the Ph.D. chemists, are the Ph.D.s who are society members different in their characteristics from the non-members in the NSF survey?

BURRELLI: No.

FECHTER: An analysis has been done of that?

BURRELLI: Yes -- based on the 1986 NSF U.S. scientists and engineers data.

FECHTER: Has it been done for the doctorates?

BURRELLI: There are isolated tables here or there on that, but nothing that's in a format that could be handed out.

KRUYTBOSCH: This could be done with the currently emerging data. You wouldn't know whether they were a member of ACS or not. The only question is: Do they belong to a professional association? Then you can look at all the characteristics, the age and sex and all that.

KIMSEY: You pointed out that 11 percent of the U.S. citizens who got Ph.D.s in 1988 or in 1990 now are in post-doc or temporary positions? Do you have similar data from any other year or years and/or is it possible to pull out of your salary surveys from other years what those probable numbers would have been, say, over the last 10 years or over some trend?

FUNKE: No. This would be our starting point to collect this kind of data.

KIMSEY: Could you get this information from your salary survey?

FUNKE: The salary survey would tell us how many post-docs there are in that given year, but you wouldn't know when they got their degree, whether they left the post-doc after that, or how soon after they left the post-doc. We have never tried to follow up on those post-docs after the year of the salary survey. And at this point it would be very hard to backtrack because, as you know, they move. It would be almost impossible to have the current addresses.

VOYTUK: In terms of the current trend study, that's essentially a demand study, actually. Is that correct?

FUNKE: Well, no. We looked at three different factors. We looked at what are the growth and decline areas technologically. We looked at hiring trends. And we looked at factors affecting the business of chemistry. So we did not try to look at supply and demand, but to try to understand what was actually happening in the industry in terms of employment.

BROWN: All right. Thank you. That was a good start, and I noted that one of the questions that will probably occur with all of the speakers is how well your membership reflects the entire profession out there.

Our next speakers are Nick Claudy and Marilyn Suiter of the American Geological Institute.

Nick Claudy, American Geological Institute

The American Geological Institute conducts basically four surveys, and in comparison with what you have heard and will hear, we probably are woefully behind in our data collection and our analysis thereof. We do have some long-standing studies that have helped develop information that is probably derivative of the actual information we are trying to gather.

We feel that our 27 member societies represent pretty much the complete spectrum of those in the geosciences and earth sciences -- solid, water, and air. Many of the career guidance and directions that I will talk about are performed by our member societies rather than by the American Geological Institute itself.

With all of that as a preamble, let me go over briefly the kinds of information we gather. Our first survey is an annual student enrollment and degrees granted survey that we've been doing since about 1956. We gather information on enrollments of freshman and sophomore majors through Ph.D. candidates. We are somewhat slower than the other sciences to gather data on post-docs. From what we can gather from National Science Foundation data, there are far fewer post-docs in the geosciences now than in other physical sciences. That will probably change as the employment picture becomes more and more bleak.

Student enrollment information is collected across 20 sub-disciplines that are pretty well representative of the whole spectrum of the geoscience discipline.

We cover associate degrees through Ph.D.s. In light of employment trends as they seem to be evolving (at least anecdotally), associate and bachelor's level scientists are becoming increasingly important; we probably have enough Ph.D.s at this point.

We get information by gender, by ethnicity, and by citizenship--that is, U.S. and international students. We can look at the trends in student enrollments and at least note where major events have occurred in the geoscience workforce. For example, the downturn in the oil and gas industries, which began in about 1983, is reflected in the student enrollment trends. The data also document the rise in such fields as environmental science and all of the derivatives of those studies, such as hydrology and waste disposal.

At one point, those who were pursuing earth science teaching (mostly for training for the K-12 levels) through Earth Science Departments became nearly extinct. I think part of the saving of that group had to do with acknowledging that this was the situation. We at least informed academic decision-makers that there was not enough going on. The Education Departments and Geoscience Departments began to interact a great deal more, and now there is hope that there will be a good supply of good teachers rather than what was often the case of the football coach being the earth science teacher.

We've been doing our faculty salary survey since 1983, and, again, probably the least important data are the actual salaries themselves. We survey professors, associate professors, assistant professors, and instructors. We do base compensation only and not compensation that's received through consulting or other kinds of research grants. We do it on a 9-month versus 12-month pay rate. We do it by $250 increments, which generates quite a bit of resentment among those who have to fill the forms out, but it does give us a pretty exact salary study.

We do it by gender and by ethnicity, and, again, the most interesting thing from these data, I think, is what they reveal about the composition of the faculty workforce. Our most recent study seems to indicate that somewhat more than 50 percent of the total academic workforce is at the professor level, about 25 percent at the associate professor level, and about 15 percent at the assistant professor level, and the remaining are the instructors, the adjuncts, and so forth. That tells us clearly that the population is aging, and that not enough are coming along in the pipeline to replace those who are retiring. If that is, indeed, occurring--where are the new instructors coming from?

That would seem to indicate something about a trend in not hiring tenured professors anymore and using adjuncts and temporary faculty for the money savings and other reasons.

We did an employment and hiring survey for four years, and then, through lack of resources, we had to suspend that. Our employment and hiring survey went out to approximately 300 different employers across six different employer categories. And the questions were very simple. We asked: "How many do you currently have aboard? How many did you hire the previous year, by degree and by whether they were new--that is, freshly degreed--or whether they had experience? How many do you project will be hired in the coming year, again, by degree and by new versus experienced? How many do you project will be lost during the current year due to retirement, resignation, or termination?"

We asked for starting salaries for the new hires, that is, the inexperienced new hires, those who had not had a professional position before. And that information, again, helped us track where things were happening in the major employment categories.

One of the problems we had was that among the fastest growing employment categories were environmental consulting and consulting in general. However, consulting may be a sort of buzz word, a fancy word possibly for under-employment. We also noted that perhaps because filling out questionnaires is not billable, consultants didn't have the time to fill out the questionnaires. So the very sector we wanted the most information from was the least forthcoming.

But we did oil and gas, mining and minerals, federal and state governments, consulting firms, academia, research institutes, and national laboratories. So at least up through '91 we have pretty good data about the trends. However, the flux in the oil and gas industry, with the loss of some 450,000 jobs over the past 10 years, seems to indicate that things are still not stable now. While we can identify some trends, it's far from settling into a pattern that we'll be able to predict in terms of where jobs will be, and certainly not at the Ph.D. level.

Finally, we did a survey in 1986-87, over two years, which was literally a census of practicing geoscientists in the U.S. and Canada. This went to individuals. We got about as much baseline information as we could to establish the total population and obtained all of the types of information that we could not develop through employers or from going to a particular employment sector. For example, we got things like age, salary, years of experience, education, and relative rank in a particular current employment.

These last two are studies that we would like to have assistance in reinstating, because right now so much is going on in terms of employment: the downturn in employment, especially for Ph.D.s; the rise in importance of the associate's degree and the bachelor's degree; and the fact that the qualifying degree -- at least in geoscience, except for obvious places like research institutes and academia -- is the master's degree. Are we doing a disservice to our students by encouraging them to go for a Ph.D. when we know that the job market is fair to bleak?

In summary of where we think we're going now, the outlook appears to be strong for the associate's and the bachelor's degree holders in the sense that there's a great deal of geotechnical work coming up for which they are ideally suited. The phrase that I keep hearing is "the blue-collar astronaut," "the service station in space," that sort of thing. That is exactly the kind of work where this cohort will find employment.

The labor market is certainly strong for the master's degree and not very strong for the Ph.D. due to budget cuts. In academia, as we gather information for a directory of geoscience departments, we are finding departments being either entirely eliminated, forces being cut, or departments being combined (e.g., geoscience with geography). There are also cuts in the federal sector and threats to things like the U.S. Geological Survey. The competition for research dollars is certainly not abating.

We think we have adequate numbers of Ph.D.s available, so the loan rate from other sciences will decrease, because we have quite an adequate supply. However, we do run into new and experienced geoscientists now competing for the same jobs.

What I think we need now is a good assessment of where exactly we are, straight talk about the possibilities of careers, and the preparation perhaps not for alternative careers but what you can do in the geosciences and where you may end up with the skills you have as a geoscientist applied in other areas.

I think that will do it for now, and I'll take any questions.

SUTER: I was interested in your employer survey. Have you thought of asking employers about the satisfaction that they have with the graduates? I was interested especially in associate's degrees and bachelor's degrees.

CLAUDY: We haven't. In fact, we haven't even been able to get enough from employers. We get a lot of grumbling about filling out surveys. So we don't know, but we would like very much to do studies of associate degrees and the role of community colleges in preparing students--obviously not only for college, but for rewarding careers below the Ph.D. level of research and academia.

SPAR: In your discussions with employers, have you had any indication that having a Ph.D. is a detriment?

CLAUDY: Only to the extent that such a person may be over-qualified, over-specialized. At least, I think, in oil and gas and mining and minerals, what they're looking for is a well-rounded individual that a master's degree probably indicates and then they will train that person into more specialized work. The Ph.D. can, in that scenario certainly be a detriment.

I think anyone would be glad to have a Ph.D. if they could afford one, but that doesn't seem to be the current trend.

BABCO: The Ph.D. is a much smaller component of the geosciences. What is the ratio between the degree levels?

CLAUDY: We think the total population of geoscientists is somewhere in the 70,000 to 80,000 range; probably 10 percent total have Ph.D.s, 20 to 25 percent with a master's, and the rest bachelor's and/or associate's, or are on loan from other sciences.

KOHOUT: Can you tell me how you developed your pool of employers to go out to?

CLAUDY: The employer categories themselves are easily defined, and then it's simply a matter of developing what we think is a representative sample. For instance, of oil companies, we have what used to be the Seven Sisters and the independents.

Again, the most troublesome area is the consulting firms, which seem to be growing very quickly and becoming more specialized themselves. It's difficult to keep up with trends in consulting and trends in environmental science, because so much seems to be going on in so many different directions.

Mike Neuschatz, American Institute of Physics

I'm here with my colleague, Roman Czujko, and we have a fairly active program in research. As a matter of fact, I apologize. We hogged about three-quarters of the space on the tables outside with our publications.

I'm not going to go through too much detail about the first five studies listed (see Appendix A, p. 2.1). We do a fairly extensive survey of the supply pipeline from high school up through graduate school. The one hole up until now has been the two-year college sector, and we have a proposal with NSF to fill in that gap. That will give us a pretty good picture of the way the supply moves through--moves up the academic ladder, if you want to think of it that way, with important offshoots at each position.

Because of the topic that this discussion was centered on, I'm going to focus on the graduate students going on into their work, their occupations. The first thing that is of interest for that is number five, the graduate student survey, in which we try to, and usually do, get responses from all of the approximately 170 Ph.D. departments in physics in the United States. They give us data on the Ph.D. degrees that they've granted that year. It differs somewhat in definition from the SDR, the NAS's effort, because ours is driven by the department rather than the individual in terms of defining who's a physicist or who got a physics degree. But there's a very large overlap.

From the departments, we get numbers of degrees and demographic data on gender and citizenship. And we also survey all the graduate students in physics each year, and that includes people who are on the point of getting their degrees. We survey them in the late spring and early summer. So we get information from current Ph.D. recipients about their educational experience in the past and also their plans for employment. A fair number of them have already secured employment, and so we have a section--I believe the questionnaire is up to four pages now, and a page of it, approximately, is devoted to information about their upcoming job.

For the people who respond and give us addresses, we go out again six months later in what we call--it's number six--the initial employment follow-up and ask again about their work situation. That includes some people who didn't have specific information about their job; they were just in the process of seeking a job or taking a post-doc.

We get about a 60 percent response rate from the people whose names and addresses we have. We are working now to try and increase that through various methods. That gives us a fairly good snapshot of what the employment situation is for Ph.D. recipients six months after their degree.

Unfortunately, like most societies, I think we tend to lose them at that point. What we pick up again are people who join member societies, and, again, the question is what proportion of the people who got their Ph.D.s ultimately become society members, and that's a tough one to answer. I think every society has the same problem.

We don't know exactly, although I've been trying to do some back-of-the-envelope computations in the last couple weeks, and I'm coming up with quite a high percentage. But it's very difficult to define because society members include people who got their Ph.D.s in other countries. Some of the people who got their Ph.D.s in this country then go abroad. There's a certain proportion of foreign students who go back immediately after their Ph.D.

There's a lot of coming and going in that period, and it's difficult. And it's crucial because we feel that one of the key issues is that transition to the first job. And in most cases, in physics at least, that's delayed by the post-doc, so that you really don't capture a good sense of where people are heading to until you get beyond the post-doc. And that can be, as we've seen for some of the other disciplines--and it's true in physics, too--a quite extended period of time.

One of the things we've done to get a snapshot--though it's only barely adequate--is a membership survey (that's number eight on the list) that gives us the data for our salary report. The survey is similar to what Chemistry does. We also use it for our society membership profiles. We have 11 member societies. We have a four-page questionnaire with a response rate around 65, 70 percent, usually. We ask questions about employment situation and about salary. We ask a number of questions about background, and in the most recent one, we've targeted the issue of under-employment. We're asking quite a number of questions. We're trying to get at it in several different ways to find out to what degree people have both subjective and objective markers of under-employment.

We're just starting to analyze the data now, so we don't really have any output results on that. But I think it will be fairly interesting.

In addition, we just recently completed a study as part of a Sloan Foundation project that looked at members of the undergraduate physics honor society, Sigma Pi Sigma, many years later. Actually, it was the whole membership. But it allowed us to focus, for both undergraduates and people who went on to higher degrees, on what exactly they ended up doing. We could look at the whole range of people throughout their career because this society has been going on for, I don't know, 50 years or 40 years. We did a cutoff to try and get people who were occupationally active at this point.

It's not a representative study because this is a special selected group, but it certainly gave us a better picture than we've had before, especially for undergraduates, though there was information on graduates, too, in terms of what kinds of careers they ultimately go into, what range of skills they use, and to what extent their education played into and became important in providing the skills that they ultimately needed.

A couple of other studies that we've done--I could go on and on and on. I won't. But we did do a young faculty survey a few years ago--and it was a repeat of one we had done in the late '70s--that looked at people in the professorate within 10 years of degree at Ph.D. departments. That was a follow-up that tried to get at the period of time right after the degree during which people (or many of them) took post-docs and tried to establish themselves as full members of the working physics community.

We also do some demand side surveys of institutions. We do a survey of academic departments, and we did a survey of National Labs. In both cases, we got quite a high response rate. In the academic workforce survey, perhaps 95 percent of the Ph.D. departments participated.

While the total number of National Labs was fairly small (about 30), the number of Ph.D. physicists represents a fairly sizable proportion of the total Ph.D. workforce. Some of these labs are quite big.

That basically covers most of the data that we currently collect. We feel that it would be important to try and do some kind of study that tracks Ph.D. recipients over several years after their Ph.D.s in order to get a better handle on several key questions, like: What proportion of foreign Ph.D. recipients leave immediately, and what proportion leave after their post-doc? I think that is a key question that nobody seems to have really good numbers on. It would also be good to ask questions about how long people spend in post-docs and how their early career looks once they get through the post-doc period.

Again, I did some back-of-the-envelope computations to try and figure out whether there is any sense that I could make out of how long people stay in post-docs. I'm very interested in the figures that ACS just put up. But I was getting numbers somewhere around three-and-a-half years for U.S. citizens on average. There was a great range, from one to eight or nine years, with a higher number for foreign students -- maybe in the range of four-and-a-quarter years on average. But the averages are very iffy, and as the job market deteriorates, if that's one of the key places that people are absorbed, that's one of the areas where we feel we'd like to put our efforts to get answers.

KRUYTBOSCH: Yes, I'm curious, and the question would be for Mary Funke as well. There is a Department of Education rule that says that university departments are not supposed to give out information that isn't already public. How do you get your lists of addresses, names, telephone numbers or whatever, from the departments? Do they just give them to you?

NEUSCHATZ: We get names from departments, and in most cases we get addresses, too. In some cases, we have to go to the department and have them distribute our questionnaires to their students, and then the students supply their own addresses. But in most cases, we get addresses from departments.

FUNKE: For our starting salary survey, the departments that have ACS-approved programs supply the names and addresses.

James Maxwell, American Mathematical Society/The Mathematical Association of America

I'm here today wearing the hat of an associate executive director at the American Math Society. I've been on leave since last August, spending a year as a consultant working in the Division of Mathematical Sciences at NSF, and also sneaking downstairs from Floor 10 to Floor 9 to dig into the treasures in Science Resources Studies. I'm doing some work there which I have really found enjoyable.

My role at the AMS in connection with the data is to provide senior staff liaison to a committee of volunteers drawn from three societies: American Mathematical Society, the Institute of Mathematical Statistics, and the Mathematical Association of America. Quite a mouthful.

These volunteers -- about nine of them -- oversee our survey effort and really participate seriously in it. All three societies provide financial support for the direct expenses of the survey.

The surveys I'll describe very briefly to you are conducted in-house by AMS staff, and then I have in parentheses here: one person, part-time. Guidance is provided by the Data Committee, and the reports that are circulated to our full membership are, in fact, drafted by the Chairman of the Data Committee, with help provided by myself and the staff person. But it's very much a volunteer involvement with it.

We have three annual surveys that we do: a survey of new doctorates, a faculty salary survey, and a departmental profile survey. Now, just reading those to you points out that we really focus all of our effort on the academic setting. That's the easiest group to go after, and it's where the vast majority of Ph.D. mathematicians are employed.

Now, a few details about each of these, or at least the first and the last. The survey of new doctorates is conducted very similarly to the way AIP just described. We send forms in May of each year to the doctorate-granting mathematical sciences departments and ask the departments to report to us individual data on each new doctorate awarded through that department during the past year. We use the same 12-month period as NSF. We also ask them to tell us what they know about the employment plans for the upcoming fall.

We also ask the department to give us summary data on the gender, ethnicity, and citizenship of the new doctorates. It's actually on a separate form. They give us just a tabulation in a matrix form.

Once we get individual data from the departments, we follow up with a mailing to each new doctorate to gather more detailed information about that individual, and in particular, more information about their employment plans.

This survey goes out in May and data comes in over the summer and into the fall. We actually prepare a first report from this data that is published in a November publication that goes out to our full membership. It's about 30,000 individuals worldwide. So it's a very, very short time interval and that does introduce some problems if you dig into the details of what goes on.

Now, that's a brief description of the survey of new doctorates. The salary survey -- there's really very little to say there. We gather a profile on academic year salaries by rank for the upcoming year, with a survey form that goes out in June to the departments. This is reported on in the same November report.

In the fall, we send out a department profile survey which gathers data on fall course enrollments, data on undergraduate majors and graduate students, profile data on faculty, and in the last four years--information on the faculty recruitment by the department for the prior year. This data is presented to the membership the following summer, and that same report includes some updates on the employment information that was presented in the November report.

In addition to these three annual surveys, we do, like some of the other societies, conduct ad hoc surveys from time to time. One recent survey of this sort which I thought would be of particular interest to this group is called "Employment Experiences of 1990-1991 Doctoral Recipients in Mathematical Sciences." The survey collected data on the employment experience since graduation and the fall 1993 employment plans for a sample of individuals from the doctoral class of 1990-91.

This is the class as defined by the AMS survey as opposed to the survey of earned doctorates, significantly funded by NSF and conducted by the NRC.

We began collecting this data on a sample of this Ph.D. class from 1990-91 in late spring of '93, and we followed up throughout the summer. In fact, we were still in some cases chasing individuals in the sample into the winter of 1994.

The report on this has been prepared, and I've got some pre-prints back there. It's going to appear in July, and so there's a note up there that asks you not to distribute it.

I'll give you an indication of some of the information provided by the survey -- three bullets I took from it.

Among the 920 1990-91 doctoral recipients who were in the U.S. in the fall of 1991, we had an estimated 400 that were not in the same position in the fall of 1993 that they had held in the first fall after they received their doctorate. An estimated 150 individuals moved out of U.S. academic positions, and an estimated 24 individuals moved from outside of academia into U.S. academic positions.

Another highlight was that among those who held non-tenure-eligible positions in their first position after receiving their degree, 17 percent were unemployed and seeking employment for fall 1993. This is for those whose first position after they got their Ph.D. was in academia, but the position was non-tenure-eligible. It thus includes people in post-doc or post-doc-like positions. Seventeen percent of those who responded to the survey said they were unemployed or seeking employment.

A third bullet here is that in the first fall after receiving their doctorate, we had an estimated 79 percent of the individuals in the U.S. in academic positions. In the fall of 1993 we still had 76 percent of those whose employment status we knew were in academic positions in the U.S.

As I said, this survey is going to be in a July issue of our membership publication called "Notices." It took a year to get it written, not because it was that complicated but because a very meticulous volunteer was working on it in between other things. We're at the point now where it's in '95, and this is going to be something that they're going to want to follow up on.

We have not done very many surveys of the sort of this most recent one, using a really carefully drawn sample and doing very intensive follow-up. It was no surprise to me, based on what I hear from talking with survey specialists, other professional societies, and people here at NSF, that this kind of survey is very resource-intensive, particularly for an operation of the sort I described to you -- one individual who spends part of her time on this survey. So we're going to be scrounging for resources, and I can certainly relate to the remarks that Ken made about this: at the same time that resources are tight--and they are in our professional societies--the need for some of this information is greatly increasing.

That's all I wanted to pass on. I'll be happy to try to answer questions.

RAPOPORT: You said that 17 percent of those who were initially in non-tenure-track positions were unemployed in your follow-up study, but do you know offhand what percentage of that group who went into academe were in non-tenure-track positions?

MAXWELL: Yes, I looked that one up particularly after I found that highlight there. Of the ones who went into academic positions, I think it was 46 went into non-tenure track and 56 went into tenure track.

FECHTER: Based on the last two numbers you just gave us, Jim, does that mean that if you wanted to get something that looked more like a standard unemployment rate, that rate would be more like about 9 percent rather than 17 percent?

MAXWELL: Well, yes. Also, we ask each year about the employment plans of the new doctorates. Starting with our earlier surveys in the '70s, that got labeled an unemployment rate, and that's really not what it is. The numbers we report get labeled that way and get remembered that way, but, in fact, they always overstate true unemployment. And one of my plans when I go back is to try to clear that up. Certainly when you stay around NSF and SRS, you get very insecure about your methodologies. They do an excellent job.

HORVITZ: I was interested in this last point about whether you had unemployment or not. What proportion of the 950 actually were followed up?

MAXWELL: We followed up on a sample of about 350 -- stratified across several characteristics. Among those who took U.S. employment in academia, we got virtually 100 percent response. We had a total sample of 365 out of 921 that were in the U.S., either employed in the U.S. or for whom we had a U.S. address. We got an 82 percent overall response rate.

Among the group who went into academic employment, it ranged from 86 percent for those in research institutes to 98 and 99 percent for academic departments.

HORVITZ: Those response rates are very good. I guess the point I wanted to make was that when you're estimating something like whether they're employed or not, which may be a sensitive situation, it may be that non-respondents are unemployed to a higher degree than the respondent group. You have to be concerned sometimes whether the issue is a sensitive one -- and relates to the response rate.

MAXWELL: As I said earlier, as I have had more time to dig into some of this and have had to answer some questions during the presentation this past fall at CPST, it's made me realize how easily I had fallen into the routine of thinking of that number, about new mathematicians, new Ph.D.s unemployed. That's really not what it is. It's a picture of employment plans, and it's information that starts coming in June and keeps coming in throughout the rest of the year. But it's closer to being plans than to actual traditional employment information.

Our community, the mathematics community, thinks that these surveys should be sent out one week and reported on two weeks later. So the expectation--

BROWN: Thank you, Jim. If we were to give a prize for the rosiest employment outlook, you wouldn't win it.

Jessica Kohout, American Psychological Association

Good morning. I am with the APA, and we have been lucky enough to be renamed the Research Office. We used to be the Office of Demographic, Employment and Educational Research, which is what we do, but we also do so much more than that, and our efforts really go beyond just those three areas. This slide (see Appendix A, p. 3.1) gives you an idea of what we do, our routine efforts that support the kind of work that we're talking about here today. The biennial doctorate employment survey is mailed to new doctorates, both Ph.D.s and Psy.D.s. It's probably the one that has the most direct relevance to the issues of the employment situation for new doctorates.

About three-fourths of licensed psychologists are members of the American Psychological Association. Unfortunately, a much smaller proportion of research academics are members. I can't even give you a percent on it. So we have a real skew in terms of the representation of the field at the APA, and we're much less representative of master's than we are of the doctoral folks.

The doctorate employment survey is fairly extensive. We get about 4,000 names from the Chairs of psychology departments each fall, and it's mailed to these individuals very early in the following year. It's about eight pages long. The response rate is about 57 percent. It slipped a little bit when we extended it in support of a NIDA grant on research interest. We hope to bring it back up again.

We use the data for tables in a standard report. We run special analyses for faculty, for students, for employers -- people who are interested. The data are presented at meetings. We use them in papers. And we hope to display some of these data on the Internet. I think they'll be of interest to students and to employers. And APA is just starting to go up on the Internet and pull its materials together, so I'm hoping to have that available.

Of course, to do our job, we rely on outside sources, including very strongly NRC/NSF data and NCES at the Department of Education for master's data, and other data below the doctoral level. ACE also puts out some good reports that we use in describing the context for education. It's used in reports, papers, and presentations to update our tables and publications.

I've been intrigued--and you'll hear that reflected in some of the questions that I've been asking today--about how you identified your pools of employers. We're trying to pull together an employer survey. It's going to be difficult, and we don't have the resources to do it yet, but we hope to come up with some resources to pull that off within the next year. I'd say 1995-96.

The really important one that I want to get at is the career survey. We've managed to get a bunch of folks from previous doctorate employment surveys to give us current addresses so we can go back to them two and three years down the line and find out what they're doing, not just a year later but two and three years later, and what their salaries are and what their employment situation is. So those are exciting. Unfortunately, we have so much to do, they sort of get put to one side.

In the recent past, because of interest in the questions and a lack of data, we redid an undergraduate department survey in 1993. The last one had been done in 1984. The master's employment survey got an abysmal response rate simply because we just don't represent the master's folks; it came out of APA, and nobody responded to it. But the data line up with the national data available on master's degrees in psychology, so there's a sort of validity check in that. We're working on an undergraduate employment survey, and I'm hoping the response rate is somewhat better on that.

You asked that we talk about some issues for careers for psychologists. These last two slides (see Appendix A, pp. 3.2 and 3.3) are pulled directly from a report that Dr. Catherine Gaddy and I gave to the Council of Graduate Departments in February. They were interested in the career opportunities and the employment forecasts for psychology in the 21st century, and these are some of the areas that we've identified where psychology can make contributions.

Now we need to try to find data to wrestle with the issue of supply and demand. We don't have those data at APA at this time in as fine a form as I would like. But we're working toward that goal. Because the questions that are coming in from the Chairs are: Where can my people find work? How do we prepare them for the changing labor market in psychology?

We were surprised that the national unemployment rates from the NSF indicate about a 1.2 percent unemployment rate in psychology. Of course, a lot of people are employed part-time or as post-docs. Our data indicate about 3 percent unemployment among new doctorates and a bleak outlook in terms of the perception of the job market. We're going to continue to track that. We only have three years, '89, '91, and '93, so we need to keep looking. It's been dropping, and the drop in those since '89 is fairly substantial. It's significant, and it concerns us because the national data are better than our data -- and they're more positive.

But when we looked at the data that were out there from NSF, from NRC, and BLS, the same growth is obvious. If we look at the statistics, we've been having growth-- particularly at the master's level; the doctoral level is fairly steady. There is some reason for optimism.

But there are some issues that need to be dealt with within the association. We're looking at defining roles and functions at different degree levels. We're having a lot of disagreements about differences between the master's and doctoral roles that need to be clarified. So, collection of data to indicate what the appropriate roles are at the different levels of education and training is important.

We need to be more outward-looking, educating the consumers of psychologists, so that they understand what psychologists do and how this is different from other fields. I don't think that's out there yet.

Linkages -- how do students get jobs? Students shouldn't think they're going to walk into an academic position, because the jobs are not going to be there.

And the skills -- the important ones that keep coming up are the computer and research skills. These skills are needed to be educated consumers of research as well as to do the research. Grantsmanship, internships, getting out into the field, getting practical experience, and other hands-on experiences are important. So, this is the kind of thing we're looking at, trying to find data in these areas to talk about these issues a little more in detail.

BROWN: Well, I thought that was a particularly good presentation in terms of what you do with the data after you have it. Clearly, you want to target it on students trying to make decisions, much in line with the COSEPUP report.

SYVERSON: One of the interesting pieces of data that we've seen from the survey of our own doctorates is that students in clinical psychology tend to have a fairly high amount of debt as they graduate with their Ph.D.s.

KOHOUT: Yes.

SYVERSON: And you're doing an employment survey. It would be very interesting in that survey to ask them questions about debt and how they're paying it off and what kinds of burdens, if any, that's putting on their finances.

KOHOUT: We do ask about debt levels, and it can be broken out by field, so we can get clinical, experimental, and so on. You're right. Clinical psychology is very high right now.

FECHTER: A clarification question which may kick off some discussion for this afternoon, too. You made the comment earlier that the national data for the doctorates show an unemployment rate of about 1 to 2 percent; whereas, your data are showing 3 percent, if I recall correctly. And I'm wondering whether you were comparing apples with oranges in the sense that the national data on unemployment that give you the 1 to 2 percent range would be for all doctorates at all age levels and experience levels; whereas, you --

KOHOUT: New doctorates.

FECHTER: You were looking at new doctorates.

KOHOUT: New doctorates.

FECHTER: And that the unemployment rate for new doctorates will tend to be much higher than the overall unemployment rate. There is a relationship between experience and unemployment.

KOHOUT: We have looked at national data on the new docs compared to our new docs, because those two--

FECHTER: So it wasn't apples to oranges?

KOHOUT: No, it isn't.

FECHTER: Okay.

KOHOUT: It's as close as we can get it, and we keep pulling out any of our people who aren't included or who are not likely to be included in that national group, and still we have a difference. So I'm going to keep an eye on that.

We see a trend toward increasing part-time employment. The post-docs have risen substantially in the last two or three years, also, so people are using those.

SHETTLE: As a follow-up to Alan's question, one of the things that we speculate on when we look at your data versus our data is the time frame. The "new" doctorate recipients in our survey are at least a year or so old. And we think there's a steep decline in the unemployment rate in the months following receipt of a doctorate--in other words, many people start off searching for the ideal job, but, if after six months or a year they haven't found it, they decide to take something that's not quite up to their original expectations.

Have you compared your data to ours from this perspective?

KOHOUT: I haven't compared that. I know that most of our folks tend to get their jobs within six months. We're surveying them at least six months, if not more, after. For the '93 graduates, we got the names in the fall of '93, I think, and mailed in the beginning of '94. So they've had about a year -- six months to a year -- to find a job. And when we ask them how long it took to find their position, the answer is that it took less than six months to get a job.

SHETTLE: However, it could take even longer than six months -- so I'm wondering if that's some of the disparity.

KOHOUT: Well, it would be interesting to take a look at that. I think it's feasible.

SHETTLE: It would be interesting.

Carla Howery, American Sociological Association

The American Sociological Association has about 13,500 members. Most of our members are in academic settings; about 25 percent are in non-academic settings. And as you may know about sociology, the post-doc is not a dominant opportunity. So we're pretty much dealing with people who go directly from degrees into academic and non-academic employment. Now, on our staff we have 24 people, roughly two FTEs working in research. My colleague Mike Schubert and I are here today.

We have a slightly easier road than some other groups in that the ASA is more or less the sole national society for sociologists. So we're not trying to organize and mobilize and coordinate with a lot of different groups. Thus, our colleagues do look to us for information about the profession.

We're very pleased to be a part of this conference and very pleased to have social science included in the conference, and we hope that we can reciprocate, in that our members, as well as other colleagues in social science, are not only represented on the SRS staff but also have a great deal of background in the study of the professions, in higher education, and in survey methodology. And we hope this will be useful.

Now, I make that generous offer with some embarrassment because, until two years ago, the ASA itself did not have an active research program. It was actually the classic case of the shoemaker's child with no shoes. People would call up and say to us: What is the average salary for a small liberal arts college in Indiana? We'd say: Beats me. You think sociologists have some data?

We relied heavily and gratefully on NSF and NRC and other national data sources, which served us well on many questions, but, of course, didn't ask all the questions that we wanted information about. So, I'm glad that I can come now and say that we have corrected that embarrassment and have put our professional skills to use in establishing a research program on the profession that is now a couple years old.

The goals of our research program are to systematize our internal records and transform information into data systems. I want to take a minute to talk about that issue because this was a major coup in our own internal working, and maybe other associations have already done this, or they might want to think about this as well.

For example, we produce a Guide to Graduate Study that is a catalogue of all graduate departments in sociology and all the particulars about them and their faculty and their specialties and so on. And that's been done for many, many years. The information for that was sent in to us, collected, and typeset, and a book was produced. These were non-manipulable data. This was information, not data, and so if someone asked me, as they did, how many African American full professors are there in graduate programs in sociology, I'd say: I'll get back to you on that. And I would have had to have gone page by page, assuming that I would actually know the race and gender of each of those persons, and say, ah, that's one here and one here.

Now we collect essentially the same information from departments, bring it into our office, code it, enter it as data, and then produce a book from those data. So this kind of thinking -- about how to use internal information that we were already getting in new ways and make it into manipulable data -- has been significant.
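
To make the contrast concrete, here is a minimal sketch of the idea in Python -- illustrative only, with hypothetical field names, not the ASA's actual system. Once directory entries are coded as records, a count that once required a page-by-page search becomes a one-line query:

    # Illustrative sketch: directory entries coded as records rather than typeset text.
    from collections import Counter

    faculty = [
        {"rank": "Full Professor", "race": "African American", "gender": "F"},
        {"rank": "Associate Professor", "race": "White", "gender": "M"},
        {"rank": "Full Professor", "race": "African American", "gender": "M"},
    ]

    # How many African American full professors are in graduate programs?
    count = sum(1 for f in faculty
                if f["rank"] == "Full Professor" and f["race"] == "African American")
    print(count)  # 2 in this toy example

    # Or tabulate all ranks at once.
    print(Counter(f["rank"] for f in faculty))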

We knew that changing addresses is a major function of our society, i.e., having people change their address so that they get their materials, but, of course, this was treated as essentially a clerical issue. Someone would call in and say, I used to be at the University of Alabama and now I'm at the University of Chicago, and we would change their address so their materials would arrive correctly. But in doing so, we lost important information. We lost where they had been. Now we have transformed our record-keeping system so that we retain in our archival file information about people's past affiliations, including the number of times they change their address, so that we have a sense of people's work pattern over time, simply coming from the routine process of changing addresses.
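
Here is a minimal sketch, in the same illustrative spirit and with invented identifiers and field names, of what retaining an affiliation history (rather than overwriting the address) might look like:

    # Sketch only: append each change of address to a history instead of overwriting it.
    from datetime import date

    member = {"id": 1017, "affiliations": [
        {"institution": "University of Alabama", "since": date(1988, 9, 1)},
    ]}

    def change_address(record, new_institution, effective):
        # Append rather than replace, preserving the member's work history.
        record["affiliations"].append({"institution": new_institution, "since": effective})

    change_address(member, "University of Chicago", date(1995, 1, 15))

    current = member["affiliations"][-1]["institution"]   # University of Chicago
    moves = len(member["affiliations"]) - 1               # 1 change of address so far
    print(current, moves)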

Interest areas are also something that we are trying to track changes in. We ask new members to indicate interest areas and on their membership renewal form, we ask them to update those interests. We're trying to track how people identify themselves within the specialties of our field.

So we're trying to examine many ways--and there are probably many more possibilities--of using regularly collected routine information as data sources. This has been a major internal effort to use what goes on inside the office more effectively.

We're trying now to undertake primary data collection, and I'll talk to you about some of those surveys, which are similar to what other people have presented -- of members as individuals, non-member individuals, and departments. We are also very grateful for and are trying to use the existing data from other sources. For example, we collect almost no data on institutional characteristics. We know that asking a sociology department Chair, "What is the endowment at your institution? How many library books do you have? What is the minority enrollment?" and so on will not produce clean and accurate data, but we can get that from other sources and link those data sets to our own. And so we're relying heavily on that kind of outside support.
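
As a rough sketch of that kind of linkage -- with made-up identifiers, institutions, and values, not the actual files -- departmental records carrying only an institution name can be joined to an external file of institutional characteristics keyed on the same name:

    # Illustrative join of internal departmental records to an outside institutional file.
    departments = [
        {"dept_id": "SOC-014", "institution": "Example State University"},
        {"dept_id": "SOC-027", "institution": "Example Liberal Arts College"},
    ]

    institution_data = {  # hypothetical external source of institutional characteristics
        "Example State University": {"minority_enrollment_pct": 22.5, "control": "public"},
        "Example Liberal Arts College": {"minority_enrollment_pct": 14.0, "control": "private"},
    }

    linked = [{**d, **institution_data.get(d["institution"], {})} for d in departments]
    for row in linked:
        print(row["dept_id"], row.get("control"), row.get("minority_enrollment_pct"))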

We're then trying to inform our members and Chairs, in particular, about these data sources and findings. Our association recently, in the constricting times with which we're all familiar, has focused almost entirely on making the role of the Chair a much more central and important one. The Chair to us is not the person who lost Spin the Bottle. It's the person who is representing the discipline and needs to be the well-informed leader about what the trends are within our field in order to make some intelligent planning decisions.

We have surveyed department Chairs on a biennial basis and graduate departments on an annual basis to produce the guide that I mentioned.

Our members, fortunately, are fairly tolerant of long instruments. You know, sociologists have to do a questionnaire before breakfast, so they're usually pretty good about tolerating a long instrument and returning it, so we have some advantage there, albeit with some grumbling.

In this kind of instrument, we ask quite a bit of information about individual faculty, about students, and about new Ph.D. cohorts. We have done one survey of a sample of members, and we'll do further samples of individual members. In those surveys we have included information such as work history and the nature of people's professional work, i.e., what actual activities they do in their role as sociologist.

We hope to do future surveys of non-members -- and not only to get them to become members. However, the problem is figuring out who these people are and where they are. The K-12 teacher pool is certainly one; the huge pool of B.A. graduates is another one. Non-member Ph.D.s and M.A.s are slightly easier to identify.

We do surveys about member satisfaction on specific things, which are pretty straightforward, about how people feel about the services they receive. I anticipate in the future that we will also do an early employment survey of the new Ph.D.s.

In looking ahead, I would say that the direction we will most likely go, beyond continuing these kinds of surveys, is to do more qualitative work, to do more focus groups, to do more probing of what we see as particularly troubling patterns within the data that we have found. For example, for whatever reason, we're finding that minority women in sociology do not advance through the career ladder as quickly as minority men or non-minority men or women. They are not dropping out of the field. They have positions, so it's not an unemployment issue. It's a career track issue.

So, clearly, then, we need to talk to them about why they seem to have two and three academic positions before they get tenure, why it takes so long from time of degree to getting tenure. We also want to know why, with these apparent problems, they're still in the field. We want to understand what is happening and what we can do about it.

Finally, fortunately, our own members as social scientists do a lot of research, and so we're able to capitalize on their research. Part of the advantage of having a Research Office now is that it galvanizes some of that research and brings it in. We have a journal called "The American Sociologist," which is a journal on the profession. People like to navel gaze and study their profession, and so we're trying to coordinate and keep records of all those surveys, and to some extent replicate the better ones.

As you might imagine, we disseminate this information in specific research briefs in our newsletter. We also have an annual Chair conference now where, again, we're emphasizing the centrality and importance of the Chair and try to give Chairs briefings on trends in the data and their implications so that they will use this information in their own settings. And within the ASA office, we try to have our research programs serve other components in the office. For example, we have a minority affairs program, and we're providing that program with considerable detail on minority pipeline issues.

The good news in sociology, to get to the data now (and this is NSF data), is that, by and large, this curve is going up again. The trend in M.A.s and Ph.D.s is turning up, so we're not particularly concerned per se that we have a problem with degree production. We're certainly less sure about where people are going and how satisfied they are. And I would say we have more of an under-employment problem than an unemployment problem.

Our field, like many others, is becoming more feminized. Almost 60 percent of our M.A. and Ph.D. graduates are women, and that's a whole other conference, which we have had.

In looking at our full-time positions, we have asked our departments to give us information about the number of full-time positions, and they have reported to us the number of active faculty lines. This is the average number of faculty in these various kinds of departments. These are the active faculty lines by institutional type, whether they're tenured, non-tenured, and so on.

We have asked departments about their hiring. Roughly, in 1992, one-third of departments had an open line. That line came from a number of sources: resignation, retirement, and so on. So we try to break that out as to the source of those lines. Roughly a quarter, as you can see, of those new positions were actual new lines. So there's some vitality, but it's not overwhelming us.

In that particular year, there were 302 positions open in sociology at the academic Ph.D. level. We roughly calculate that for that year 523 Ph.D.s graduated from graduate institutions. This, of course, doesn't take into account people who have had their degree and have been around and are looking for a position. So we're not concerned particularly with unemployment, as I say, but with under-employment or sort of a mismatch of employment.
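
The arithmetic behind that comparison, using only the two figures just cited:

    # Rough supply/demand comparison for that year (ignores people already in the market).
    open_positions = 302   # academic openings in sociology at the Ph.D. level
    new_phds = 523         # new Ph.D.s that year
    print(round(new_phds / open_positions, 2))  # about 1.73 new Ph.D.s per opening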

Sociology has a double-edged sword. On the one hand, we think that our graduates can do lots of things, and we're very pleased about that. But on the other hand, there are not specific slots, particularly in non-academic work, that are labeled sociology and that are particularly designed for sociologists.

Herb Maisel, Computing Research Association and the Association for Computing Machinery

I'm wearing several hats today, which leads me to tell you to correct your agenda. It's not the Computing Research Association of the Association for Computing Machinery. It's the Computing Research Association "and." I don't think CRA would like to be labeled as being a part of the ACM, necessarily.

I'm wearing several hats, as I said. One is I'm a professor of computer science at Georgetown, and I am not a statistician, although I have a doctorate in mathematical statistics. However, I will attempt to give you a survey of the kinds of things being done that relate to computer scientists, and I have been very careful to give you contact persons and telephone numbers because I will answer very few of your questions.

Finally, because I hear so many people giving some very important data about employment and unemployment, and realizing that nothing in this will speak to that--these are only data sources, not numbers--I decided I had better give you the results of a very important survey that I conducted.

In the early '70s--by coincidence, in the early '70s, early '80s, and early '90s, I was the Chair of the committee at Georgetown University that was recruiting new Ph.D.s for entry positions. In the early '70s, we got about a dozen applications, and--well, I don't want to say anything more about what the quality implications were.

CONLON: For how many positions?

MAISEL: One position. In the early '80s, we got about two dozen applications for one position. In the early '90s, we got more than 550 applications for one position. This is in computer science now. More than 150 of those applicants, most of them Ph.D.s, had gotten their degree at least a year before applying. Many of those people were unemployed, judging from their applications. That's the only data I will provide. But that experience is why I'm here, and I will have a postscript to add to that. But you'll have to wait for that.

Now let's get back to the sources for data. If you look at the first page (Appendix A, p. 5.1), I'm going to speak about five sources, the first of which I'm sure is far and away the one that's most interesting to you: the annual Taulbee survey. The second is something that I had expected Jim Maxwell to tell you about, but he didn't, so I'm going to have to say a little bit more than I planned to about the 5-year survey of Mathematics, Statistics, and Computer Science Departments by the Conference Board of the Mathematical Sciences.

The third is something very new. The ACM has instituted a data base called the member profile, and they started that in 1993.

The fourth and fifth are probably things you're not at all interested in, but they're examples of the ad hoc studies that the ACM has been making recently; in fact, what I did was just list the two most recent ones and tell you something about each of them. That gets us on to the second page.

All I'm going to give you is an overview of each of these things. As I promised you, I'm also going to give you, in each case, a contact who knows a lot more about each of them. I've also brought one copy of the most recent issue of this every-five-year study of the Conference Board of the Mathematical Sciences and, for each of the other three, information about what has been collected, including the survey forms themselves.

The next page: This is the survey that's of most interest to you. The Taulbee survey is named in honor of Aaron Taulbee, who had the foresight to begin it, I would guess something like 30 years ago. It was originally started because we were all very concerned about the fact that we were producing so few Ph.D.s that we were in great trouble. In fact, the phrase that was used was that we were "eating our seed corn," because more than half the new Ph.D.s were going to non-academic institutions. That's changed. But I told you I'm not giving you any numbers.

The kinds of data available: faculty salaries in Computer Science Departments; Ph.D. production by departmental category--the categories range from those that produce the most Ph.D.s to those that produce the fewest, so it's sort of from biggest department to smallest; Ph.D., master's, and bachelor's recipients -- the numbers overall and by gender and ethnicity; and employment of Ph.D. recipients by specialty, which I'm sure is the thing you most want.

The contact is Juan Osuna, and Juan is with the Computing Research Association, which leads me to point out that on the first page of the handout, you will see the annual survey by the Computing Research Association.

Next, the CBMS survey. This is an every-five-year kind of thing. For some of you, that means you want to flip the page immediately, right? Speaking about getting results in one week or two weeks. But it's a very comprehensive survey. It covers statistics, mathematics, and computer science departments. It gets an enormous amount of data of various kinds. Just look at how thick this is: student enrollments, degrees awarded, the number, teaching load, and other characteristics of the faculty, departmental characteristics, information about libraries, and information about instructional practices.

The survey is the responsibility of the mathematical sciences community, not of any of the organizations I am representing today.

The ACM member profile (initiated in 1993) collects and records demographic information about ACM members, things like contact addresses, employment information, the highest degree, the major technical areas of expertise and interest, and, for student members of the ACM--the expected graduation date, degree, and student chapter membership.

The name of the contact person is on the handout. She is a staff person in the ACM headquarters office in New York. And she would be the most knowledgeable about all the methodology and results associated with this profile.

As I understand it, the ACM feels that they can't give you individual data, but are willing to give you summary data from their profile.

Now, the last two, which, as I told you, are probably the least important from your point of view, are two very special studies. The first was instituted because there was some concern about membership reaction to our primary publication. Over time there have been several studies that attempt to gauge how well members like it and other ACM communications. Things are improving. Members like the communications more and more.

Finally, we have the survey of ACM members who have dropped one or more special interest groups (or SIGs). One of the major organizational approaches at the ACM is to permit people who are members to join units in which they share common interests, like interests in an area of computer science or computers in general. SIG memberships are very carefully monitored because they tell us an awful lot about where our members are going and what they're doing. Also, SIG memberships result in extra charges on top of dues -- so, we're talking about a source of income, and we're very concerned when people drop their SIG memberships. And as you can see, there is someone else at headquarters, Donna Baglio, who is in charge of the SIG area and who would be the best source for those data.

As I told you, I don't have much to say, but I do have a postscript. When I have a platform, I make an appeal. The reason why I reacted positively to this meeting at first was because I thought the primary subject on the agenda was what we can do for the people who are under-employed or unemployed, not counting but doing. And I hope that some time soon we're going to have a workshop at which we exchange information that speaks to what each of us is now doing to help those people who are hurting. Maybe some of us can learn from each other and we can do more for our members.

Now I'm done.

BROWN: Thank you very much. Your last point is a good one. It will be talked about this afternoon and, of course, is very much part of the COSEPUP report I referred to earlier.

CONLON: Can you tell me how large these organizations are?

MAISEL: The ACM has about 80,000 members. The Computing Research Association--and really its origins were in the ACM--was intended to be an association of people who are primarily interested in academic departments that produce Ph.D.s, and its membership is oriented toward those departments. I'm not sure if they have individual memberships yet. They're mainly departmental memberships. Their headquarters are in Washington, D.C., by the way, as you can see from the contact. I won't speak to the Mathematics Society.

BABCO: What's the date on the latest Taulbee survey?

MAISEL: The latest Taulbee survey covers 1993-94 and was reported in the January 1995 issue of Computing Research News.

FECHTER: Herb, my impression is that the Taulbee survey is usually undertaken and administered by somebody on campus. Is that still true?

MAISEL: No. That's undertaken and administered by the staff at the Computing Research Association.

Richard Ellis, Engineering Workforce Commission of the American Association of Engineering Societies

Often at NSF gatherings, many people know who and what we are, but I think at this point perhaps I should give a little basic background.

The American Association of Engineering Societies is an umbrella group. We are quite small. The entire staff of the association is 15 people. Our membership is also small. We have perhaps, counting every conceivable kind of member, 40 members. Every member is a society.

When we go to represent the engineering profession on the Hill, we observe to the worthies there that we represent directly, as members, approximately 800,000 people.

The profession as a whole, depending upon how you choose to count them and depending upon whether or not you count such folks as retired people who are still active in the profession--and there are a lot of those in the societies--could be numbered anywhere from 1.5 million to 2 million folks. So I think it's fair to say that our take in terms of the proportion of the potential population we represent resembles that of the other societies here today. For example, ACM's 45 percent probably wouldn't be too far off for us.

We do notice that many of our societies are experiencing a phenomenon of aging and difficulty in recruiting young people, which is not surprising given the fact that the typical member of our societies gets employer support for society dues, and a lot of those people have had difficulty finding employment right away.

I will briefly review what we do. What we do resembles that of practically every other society that has spoken here. There is in the handout, I think, a great deal of detailed information about our specific reports, the societies associated with AAES, and other things, so I will not go into any detail on those materials except to say that we do compensation studies and have done them since 1952.

We have been the primary source, I think it's safe to say, of information on enrollments and degrees of engineers since the late '60s. You will find in the handout the overall results for degrees and enrollment in 1994, including fall 1994 enrollment and including details on the distribution and development of trends of women and minorities in the profession.

The activities of the Workforce Commission, which is a group of volunteers that dates back to 1950 and which is organized as a part of AAES, encompass all of this data collection. In addition, we have recently acquired two other responsibilities that were not previously handed over to the research operation at AAES. We will now be the people who are producing for AAES the International Directory of Engineering and Related Organizations, a directory of societies. I believe many of the organizations represented here are or have been in previous editions of that directory, and we are also the people who do Who's Who in Engineering. It now falls on me to direct those projects as well as the more conventional research operations that you read about in the handout (see xx.x).

We are, in addition to our traditional studies, doing some new things. Beginning last January, we killed a long-time publication which was somewhat limited because of its format, and we began issuing something which is sort of a cross between a magazine and a journal. This is its first issue. The handout is its second issue. The reason why you do not have a printed copy is that we don't have printed copies. They're at the printer. The printed copies should be in distribution sometime in the next week or two.

The first issue, which you do not have, included among other things, as far as I know, the first publication and use in the United States of the Bureau of Labor Statistics' new historical version of its occupation-industry matrix, and so we have a decent set of data there on the breakdown of engineers by industry, sector, and discipline.

It also included extensive commentary on what is basically a situation of compensation stagnation for people in the profession, as well as a variety of other news items.

This issue, which I have handed out today, does provide the latest enrollment and degree data. It is a fairly thorough report, though anecdotal for the most part. However, I think it is a reasonably accurate one for understanding placement, including trends in placement for people with Ph.D.s. And about all I can say is that our impressions match those of the rest of you -- the outlook is bleak and getting bleaker, particularly for engineers.

Engineering in particular I think is facing a phenomenon which may not be quite as visible to some other societies here, though I am sure it is very visible, indeed, to some, including to ACM. That is -- direct competition with foreign talent. I do not mean here immigration. I am talking about the migration of work or, as Robert White of the Academy of Engineering put it last October, the migration of know-how.

It is possible these days to acquire top-flight scientific talent, Ph.D. talent, for example, in electrical engineering at 10 cents on the dollar. That phrase was used by the CEO of Intel at a talk to one of our member societies just last month. His figure, by the way, is quite accurate. We have taken a look and compared some other pieces of information we have, and the 10 cents on the dollar figure looks quite accurate to us, quite reasonable.

With those kinds of cost comparisons, I would suggest to you that many major employers of engineers simply cannot avoid considering with great seriousness the question of whether or not they should employ foreign nationals to do work that has traditionally been done here. There is no question that the talent is there -- a great deal of it -- and more is on the way.

In White's talk to the Academy of Engineering last October, he observed that in 1990, which was the year in which we produced 65,000 first-degree engineers, six Asian nations produced 250,000 first-degree engineers. And these are not slouches. They are quite talented, quite energetic, and they probably work harder than most of us do.

I suggest that we are all in for a long time, possibly a century, of intensive competition, and people had better get ready for that. We are also observing that many of these folks who have come here to get their Ph.D.s and then returned home, have now created educational establishments abroad which have their own constituencies and their own pressures to take care of their students. And so we are seeing, in addition to competition with the students that these people are training, a reduction in the participation of foreign nationals in U.S. graduate programs, particularly in engineering.

For the very first time, we have seen a reduction in the proportion of Ph.D. degrees in engineering going to foreign nationals on temporary visas. It took a long time for that figure to turn around, but we do believe it has turned around. The enrollment data we have suggest that it will continue to do so.

There was a 10 percent decline between 1993 and 1994 in enrollment of foreign nationals at the master's level. That is the sharpest decline in one year of practically any enrollment statistic I can remember in the 10 years I have been with AAES. So I would suggest that we have some signs that things are happening, and there are discouraging omens. They're mostly anecdotal, but I think there is reason for serious concern in the engineering education establishment about what's going on.

BROWN: I am reminded by many of your remarks about a report that we put out in SRS a while back, produced by Jean Johnson, who is sitting in the back of the room. Jean compiled data on education for scientists and engineers in Asia and reached many of the conclusions that you refer to right now, just this tremendous build-up of educated scientists and engineers in Asia.

Jean is working on a project right now on scientists and engineers in Europe that you should know about. Dick referred to some difficulty in recruiting new members. I was asking Jean whether there are data from professional associations in Europe, and she said, well, the French will not join professional associations--if I quoted her right. That's one problem we don't have here.

STREETER: I get a lot of calls from members of the engineering societies, and they point out that the percentage of blacks and Hispanics in the engineering field is extremely low, and they're asking NSF whether we are planning on doing anything to increase that number.

Are you planning to create any projects with schools to increase the number of blacks and Hispanics in the engineering fields?

ELLIS: You'll find some information in the next copy of Engineers. The question of trends with women and minorities in engineering is treated in a separate article.

A wide variety of these programs are in place and have been for a number of years. I have an example with me from NAMEPA (the National Association of Minority Engineering Program Administrators). I just came from their meeting down in New Orleans. It has been around for some time and deals with both blacks and Hispanics. WEPAN deals in a similar fashion with women's programs.

I would say that not only is the profession finally beginning to show some signs of the impact of these efforts, but we are in an interesting position from simply a scholarly and data point of view with respect to women. Because the entry of women into engineering came as late as it did, we have a curious opportunity to acquire a fairly decent longitudinal data base on the impact of the feminization of the profession, because we understand what is happening and have a feel for what questions to ask. And here is a large profession, by far the largest one of those represented at this table, which is changing late enough in the game that there is an opportunity to really track the effects of the change.

We began to act on this three years ago in a national sample survey we did of people in the profession for the Society of Women Engineers. We will conduct a second round with SWE participation, using a somewhat broader approach and a somewhat larger project. That is getting underway now, and we hope that we can keep this operation going indefinitely at about three-year intervals. Part of the objective is to see if we can begin to measure what happens to the profession as more and more women begin to become practitioners.

I would observe that for minorities the proportional representation has roughly doubled. The improvement is somewhat greater than that in terms of absolute numbers over a period of about a generation, which is enough of an improvement that you can't just ascribe it to demographic changes. In the case of women, the improvement is far greater than that. Women now constitute roughly 20 percent -- just a little shy of that -- of engineering enrollment and of the graduating classes. Women in engineering are here to stay. The "old boys" are going to have to learn to live with that. Many of them aren't happy about it, but that's the way it is.

FECHTER: I want to make three points in response to the question. One is that if any of the natural sciences and engineering can claim to be exemplary with respect to the progress made in bringing women and minorities into their workforces, engineering I think takes the prize. It is doing much better than any of the natural sciences when it comes to this particular aspect of its workforce.

The second point I want to make is that while the progress is exemplary, I would be hesitant to characterize what's happening as the feminization of engineering. Even though we're up by a very significant amount, less than 20 percent of the engineers that are graduating from these campuses are female, and there's a long way to go, and I don't want anybody to forget that.

Thirdly, I don't know whether Richard mentioned this, but the National Association of Minorities in Engineering is an organization that has an extensive set of programs dealing with bringing racial and ethnic minorities into engineering, and that's a group he should be interacting with in respect to those questions.

Howard Garrison, Federation of American Societies for Experimental Biology

I'd like to say that I'm very pleased to be here as a representative of the Federation of American Societies for Experimental Biology.

The Federation of American Societies for Experimental Biology, FASEB, is a federation of nine separate biomedical research organizations. It has a very interesting history: the American Physiological Society was the first society and gave birth to several of the other societies, which, as they split off, federated back with the organization so they could meet at the same time and in the same place. So the societies have had very close cooperation since their origin.

Some of the newer societies at the bottom of the list (see xx.x) are relatively recent, and their membership is growing in some areas. The cell biologists are the fastest growing society of the nine member societies.

We have a very limited program of data collection. As a federation, we obtain our data from the societies, and as this chart may suggest, the societies do not have a standardized program of anything--membership categories, data collection, or anything else. So we're limited in what we have been able to pull together.

The membership categories I think are very important. The American Physiological Society has student members. The American Society for Biochemistry and Molecular Biology does not; it has a reputation of being somewhat of an elite organization, limited to people who are more established in the profession. So when we pull our data together, we really are compiling data on a widely different group of societies.

The information that we are able to bring together, largely from our directory data base, is limited in terms of the sources. Each society collects and compiles information for the directory in a different manner. Some send the directory data on the basis of dues renewal forms. Others send special directory update cards. Our information comes in at different points in the year, in different ways, from different sources, and from different forms.

But this is the main source of standardized information we have on our society members -- the number of dues-paying members in each society. There's a total of 47,000 dues-paying members, among whom there is considerable overlap. There are about 41,000 members after unduplication of memberships.
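
A rough sketch of what that unduplication amounts to, with invented names and tiny rosters standing in for the nine societies: the same person can pay dues in more than one society, so the count of unique members is smaller than the sum of the rosters.

    # Illustrative only: dedupe people who belong to more than one member society.
    rosters = {
        "Physiology": {"Adams", "Baker", "Chen"},
        "Biochemistry": {"Baker", "Diaz"},
        "Cell Biology": {"Chen", "Evans"},
    }

    total_dues_paying = sum(len(r) for r in rosters.values())   # 7 memberships
    unique_members = len(set().union(*rosters.values()))        # 5 distinct people
    print(total_dues_paying, unique_members)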

Each of the societies has its own data collection program. Some are quite extensive, but they each collect their information in different ways. It's very difficult for us to aggregate them.

At one point last year, I went around to each of the societies and asked them how they collect information on their members. The membership forms are often quite elaborate. The American Physiological Society asks date of birth, sex, ethnicity, degrees, position, type of work, primary activity, secondary activity, other society memberships, and other professional organizations that they're members of.

The American Society of Biochemistry and Molecular Biology asks degree, year granted, title, and position. And that's the only demographic information they collect on their membership forms.

Dues notices go out once a year and for some societies are a method of collecting updated information on activities in their membership. The American Society of Pharmacology and Experimental Therapeutics every year asks the members to confirm the directory data, the type of employer, type of department they're in, and the number of pharmacologists they supervise. Other societies, like the Biochemistry Society, do not send out a formal questionnaire of any sort with their dues renewal notice. They just remind the members it's time to pay their dues.

Additionally, there are special surveys that the societies use. Some of our societies have a very active survey research program. The American Physiological Society has surveys of Physiology Departments, and that works very well for them since it is a more historic discipline. There are a lot of departments named Physiology.

Some of the other societies, for example, the Biophysical Society, have members in a large range of departments, often in Chemistry Departments, and, therefore, they're not able to identify departments as clearly. They don't do departmental surveys.

Some of the societies do membership surveys. The American Society for Cell Biology has a very nice survey instrument. I hope it's very nice -- they asked me for my advice in designing it, and they worked very hard at it. They spent a lot of time designing the instrument. And when they came to me the first time, I suggested, why don't we try to coordinate the wording of these questions with other instruments in the field? So I sent them down to the National Research Council and asked them to contact the folks working on the Survey of Doctorate Recipients to see if there's any way that that questionnaire might be adapted for use by the cell biologists.

Unfortunately, I think that the latest rendition of the SDR questionnaire was so extensive that the types of information being collected were not conducive to the survey that the cell biologists were interested in doing. It was too much detail for them at that particular moment, so they were not able to use the SDR questionnaire. But there is at least a growing interest in some of the societies in developing a survey program.

Unfortunately, it's at a very early stage. I know the cell biologists, while they spent a lot of time designing their questionnaire, did not spend as much time designing the methodology for collecting the data. Their approach was to put the survey forms on a table outside a seminar room at the annual meeting, and I believe about 86 percent of them were still there at the end of the meeting.

Unfortunately, the quality of the information that comes from informal and casual surveys like that is not very good.

So, we're really very much at the beginning of a data collection program. But there is growing interest in demographic data and demographic issues among the member societies.

For a long time, employment opportunities were relatively good in the biomedical sciences, and in addition, some of the research developments were so exciting that people were looking more at the discoveries that were coming out rather than the conditions of employment that many of the younger students were facing. The excitement of the times was really very contagious, especially among the leadership of the profession, and I think it's fair to say that we've turned our attention to demographic issues at a later date than many other societies represented around this table.

We have as a federation, also, additional constraints due to the need for coordination among societies. Different societies are experiencing employment difficulties at different stages in their growth. Many of our member societies such as the Physiological Society have been sensitive to this issue for a long time. Some of the other societies representing new, emerging "hotter disciplines" are not as concerned about this issue and don't feel it quite as intensely. So that has been a little bit of a constraint for us.

We do have leaders now who are turning more attention to the issue, and one of the things that we were able to do for the first time at the federation level was to begin a little bit of data collection and data analysis. The program that we proposed and, after many weeks and months of discussion, were finally able to get approved by the board was a merger of the FASEB data base with the NIH grant file. This permitted us to determine what percentage of our membership received NIH grants and what percentage of the NIH grants went to members of our society.
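
A minimal sketch of that kind of match -- with invented names, a simplistic matching key, and toy numbers, not the actual FASEB or NIH files -- links a membership list to a grants file and computes the two rates just described:

    # Illustrative match of a membership list to a grants file by investigator name.
    # A real linkage would key on cleaned name plus institution, not bare surnames.
    members = {"Adams", "Baker", "Chen", "Diaz", "Evans"}
    nih_grants = [
        {"grant": "R01-001", "pi": "Adams"},
        {"grant": "R01-002", "pi": "Lopez"},
        {"grant": "P01-003", "pi": "Chen"},
    ]

    grants_to_members = [g for g in nih_grants if g["pi"] in members]
    member_pis = {g["pi"] for g in grants_to_members}

    pct_grants_to_members = 100 * len(grants_to_members) / len(nih_grants)   # 66.7 in this toy example
    pct_members_with_grants = 100 * len(member_pis) / len(members)           # 40.0 in this toy example
    print(round(pct_grants_to_members, 1), round(pct_members_with_grants, 1))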

We thought it was a good idea. It was a program that had a number of advantages for us. It didn't cost very much money, since the data were already collected. And it gave us the opportunity to demonstrate to our member societies that these data wouldn't hurt -- they didn't have to be something to fear or in any way be concerned about. And we found some things that I think were very interesting to us. We found that, overall, 30 percent of the NIH grants went to FASEB members. We did particularly well in the traditional research projects, the R01 grants, and also in the project grants.

We also found out that about 20 percent of our members were principal investigators on NIH-funded research. So the information that came back was interesting. We were also able to identify what institutes funded our members' research, and I'm hoping that as a result of that the potential for a research program on the profession will have greater acceptance among member societies.

In the process of carrying out this initial research activity, we also were faced with a number of questions. When the membership saw the initial data, they came back to us, and they said, well, how many of our members are not in academia? How many of our members are employed in industry and, therefore, are not likely to apply for NIH grants?

My answer was, well, we don't know. We don't have that information. We don't collect that. And many of the people raising these questions, which were very interesting in terms of interpreting the data I just put on the screen, were surprised and were concerned that they didn't then have that basic information about who was eligible.

We did use the opportunity, therefore, to turn our attention to the directory, open it up, and do some very simple tabulations, manual coding of characteristics with all the errors inherent in that process, and ask what we can tell from these directory listings about our membership.

With a small sample, we found some things that were interesting and that I don't think anybody knew about the Federation of American Societies for Experimental Biology before; namely, that we have a large number of members who have only M.D. degrees. I was particularly interested in that because, as someone who has spent many, many years with the data on earned doctorates and doctorate recipients that the NRC collects, I was surprised to find that about 20 percent of our members from this sample were not listed in the standard Ph.D. surveys.

We have a large number of people who have only M.D. degrees. In addition, there are a few people who have other professional degrees that would not qualify them for tabulation in the NSF-funded survey documents.

We were very interested in that. We thought that said something about our membership. They represent a unique segment of the scientific community. We're different in that respect from many other groups.

It's important to note that our estimates are pretty crude. 58 of 600 people didn't specify a degree when they sent their information into the directory. They could be students, or maybe they were modest or forgetful. For whatever reason, they didn't want to reveal their degree status. While it is a very initial and very crude effort to characterize our members, one of the things I'm hoping is that this will remind the societies and the leadership that perhaps now is the time to begin collecting a little bit more detailed information, a little more systematic data on the membership.

The other thing that I think was interesting is the distribution of our membership by address (not institution of employment but address data), coded very painstakingly by Steve Heinig and myself. Again, a certain number of people didn't provide the information: in this case, 45 reported a street address without an employer name. They could be unemployed. They could be people who are disenchanted with their institution's mail delivery system, people who have changed jobs, people who are planning to change jobs, people who for some reason don't want to identify themselves as a member of a federation society at their place of employment. Perhaps they're staff members of another scientific society.

Beyond the fact that these are very crude data based on a very unreliable method, even where we have been accurate in classifying people, the classification does not get at the fact that many of our members have more than one place of employment. There are many people who are now affiliated with biotech research companies and start-up companies, who have appointments at VA hospitals and at medical schools. Joint appointments are fairly common. There's a need to be very careful with these data.

But one of the things I thought was important for us was the 62 people, approximately 10 percent of the sample, who were employed in industry. For the Federation itself, that was a striking discovery. We knew we had industry represented in some of our societies, but I don't think anybody had any idea that industry itself was a substantial segment of the federation's membership. We thought of ourselves as an academic organization, and that does not seem to be quite the case, at least for a significant minority of the members.
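
The percentages quoted in these last few paragraphs follow from the sample of roughly 600 directory listings mentioned earlier; a quick check of the arithmetic (bases are approximate):

    # Back-of-the-envelope shares from the directory sample described above.
    sample_size = 600          # approximate sample of directory listings
    no_degree_listed = 58
    no_employer_listed = 45
    industry_employed = 62

    print(round(100 * no_degree_listed / sample_size, 1))    # ~9.7 percent listed no degree
    print(round(100 * no_employer_listed / sample_size, 1))  # ~7.5 percent listed no employer
    print(round(100 * industry_employed / sample_size, 1))   # ~10.3 percent employed in industry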

That is fundamentally where we are at this point. We're hoping that this initial set of tabulations, which will be presented to the board at the end of the week for the first time, will stimulate a greater interest on the part of the federation and the member societies in a number of different activities. We have been very impressed at what other groups have been able to do. For when the members begin to discuss this issue, I have a stack of 18 copies of the American Institute of Physics demographic analyses; I'd like to be able to show our members what other societies are doing and what might be done by us. So we're hoping to go forward with this.

The future is a little bit uncertain, but I think that over the long run we will be able to participate in a greater program of data collection.

In closing, I would like to say that there is one other activity that we have been involved in. It's been an activity of a much more political nature: for various public action purposes, we have identified other biomedical research societies that have interests similar to ours -- investigator organizations, including the Society for Neuroscience and the Society for Microbiology -- and at various times we have convened meetings of the leadership of these biomedical research societies for discussions of relevant issues. The American Psychological Association was also a participant in one of our recent convocations.

When we brought this group together, we identified 35 different societies with a combined membership of 250,000. Now, we made no effort to eliminate duplication, and I know that there is significant duplication among the groups. But we were able to identify a number of research organizations, including clinical research organizations and organizations made up primarily of physician scientists, that do research and whose members may not currently be represented among the societies at this meeting today.

But I would like to thank you for bringing this group together and thank those of you who helped us in the past to begin a program of research at the federation. We hope that in the future we'll be able to work with you again.

BROWN: Thank you. An interesting picture of the beginning of a data collection effort, as well as some of the problems of an association of associations and how you bring them all together.

Vin O'Neill, Institute of Electrical and Electronics Engineers, Inc.

Good morning. I'm on the staff of the IEEE's United States Activities Board. The Institute of Electrical and Electronics Engineers describes itself as a transnational technical and professional society made up of some 320,000 electrical and electronics engineers and computer scientists worldwide. The IEEE United States Activities Board handles and represents the technology policy and professional careers issues of the approximately 240,000 IEEE members who live and work in the United States.

We have a limited data collection activity within the United States Activities Board, primarily focusing on a salary and fringe benefits survey that's conducted every two years and a periodic member opinion survey that's conducted less frequently, the most recent one being conducted in 1990.

In terms of our organizational structure, we have two relevant committees. One is the Survey Committee, made up of volunteers, that actually develops and administers the survey. The work itself is done under contract with research specialists, the oversight and direction are provided by the volunteers, and IEEE staff acts as sort of a facilitator for the administration of the surveys.

We also have a Workforce Committee, which does more in the way of monitoring the economic, demographic, and technological trends that affect education and employment opportunities for electrical, electronics, and computer engineers.

With me today is Roger Madden, who is a volunteer chairman of the IEEE USA's Workforce Committee.

Basically, I think I'm here to just give you a quick overview of the salary survey. As I said, we conduct it every two years; the current round began in January of '95, as a matter of fact, and we're nearing completion of the data analysis and preparation of a final report.

I brought with me excerpts from the 1993 survey. It includes kind of a quick overview of the survey and some of its findings, as well as an annotated copy of the survey instrument itself for those of you who are interested in it.

We sample about 10 percent of our U.S. members, so approximately 20,000 to 24,000 members are sampled, selected randomly. Approximately 30 percent of those sampled respond to the survey. We feel that gives us a very good periodic snapshot of our membership, who they are, what they do, where they work, and how well they're doing financially. It gives us a good overview of the demographic composition of our membership by age, by gender, and I certainly would concur with Alan Fechter's observation that although we're making a great deal of progress, we're certainly a long way from the state of feminization of engineering. Our most recent survey showed that our members are 95 percent male and 5 percent female.
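
The back-of-the-envelope arithmetic implied by those figures (using the approximate U.S. membership cited at the start and rounding freely):

    # Rough sample-size and response arithmetic from the figures quoted in the text.
    us_members = 240_000              # approximate U.S. membership
    sample = int(us_members * 0.10)   # about 24,000 members sampled
    responses = int(sample * 0.30)    # about 7,200 completed surveys
    print(sample, responses)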

We look at ethnicity, of course, and we also look at citizenship. The most recent survey suggests that 86 percent of our members are natural born U.S. citizens, 8 percent are naturalized citizens, 4 percent are permanent residents, and about 2 percent are here on temporary visas of one kind or another.

We look at educational background. The statistics show about 56 percent of our members have baccalaureate degrees of one kind or another in engineering, about 26 percent have master's degrees, and 18 percent have Ph.D.s.

We look pretty closely at employment sectors. This gives us an opportunity to periodically check trends or changes in the location and the kinds of work that engineers are doing. About 64 percent of our members are employed in the private sector, both defense and non-defense; 13 percent work for federal, state, or local governments; 8 percent work for educational institutions; 13 percent work for utilities; and another 3 percent work for other not-for-profit organizations.

We look at the size of employers and the types of business, and this has shown an interesting change over the past ten years or so. Ten years or so ago, most of our members tended to work for large corporations. Currently, or most recently, in '93, the distribution was about even between a third working for large corporations, a third for mid-sized businesses, and a third working for small businesses. So this documents some of the change in employment and labor market conditions affecting our membership.

We look closely at employment status, obviously a major concern, particularly with the changes in the global economy, the downsizing that's going on, and the cutbacks in defense spending.

We look at years of experience and numbers of employers that our members work for. We try to monitor registration status--that is, licensing as professional engineers.

We also look at the benefits that the members receive as well as the income and the salaries that they obtain. We're particularly interested in insurance programs -- life insurance, health insurance, and retirement income benefits. All of these things affect our efforts to influence public policy in these areas as well as to develop member services that are responsive to the needs of our members.

Just touching briefly on the opinion survey, in 1990 we conducted an international membership opinion survey which gave us a unique opportunity--I should say the first opportunity--to compare some of the demographic characteristics of IEEE's U.S. members with similar characteristics of members in other parts of the world. We also survey in that instrument some of the members' concerns, both about business conditions and employment trends, as well as give them an opportunity to evaluate services that are provided by IEEE and to suggest and make recommendations to us on services that they think an organization like ours can provide.

So that's pretty much of an overview of the data collection that we do here in the United States Activities Board.

Some of our technical societies--and I mentioned the Computer Society in particular--also do specialized surveys of various kinds. So we have a rather substantial survey research activity going on, but I'm really here to speak only on the U.S. Activities Board.

We depend, as you would expect--and I would assume most of the rest of you do--on a number of other data sources, both for hard statistical information but also for perspectives to help us evaluate the public policy implications of changing statistics affecting engineering labor markets: the Bureau of Labor Statistics for employment and unemployment statistics and occupational outlooks; the National Science Foundation particularly for its statistics affecting Ph.D. scientists and engineers; the Engineering Workforce Commission for statistics on enrollments and degrees and salaries, as well as the interpretations of the kind contained in the new newsletter; and the Commission on Professionals in Science and Technology.

So, as I say, we depend on a number of other sources, both for statistics but also for interpretive perspectives that help us to assess changes in labor market conditions affecting engineers and our members in particular.

Alan Fechter, Catherine Gaddy, and Eleanor Babco, Commission on Professionals in Science and Technology

GADDY: As Ken mentioned, we are not an association or a society but, rather, a commission of societies and corporations. I'll just tell you a little bit about that before lunch.

One background slide to give you a couple of facts (see Appendix A, p. 8.2). Some of you have known the commission for more than 40 years. For those of you who haven't, just a couple of quick facts about the background. We exist at the pleasure of our members, the societies and the corporations. We are an affiliate of AAAS and will be moving into their building at 1333 H Street in a month. That's just a little bit about us. I'll give you more specifics on that if you're interested.

We are a linking organization. We work with the professional societies, obviously, who commissioned us in the first place, with industrial corporations (we now have about 40 corporate members), non-profit research organizations, such as the NRC, government agencies and academic institutions. I think this is really exciting because we can look across organizations with different perspectives, different data, and work to help bring the perspectives together when it's appropriate.

We have three major functions. First, we serve as an information clearinghouse. Eleanor, Willie, and I skim about 50 or 60 publications every month and do a digest called "Comments" that many of you have seen. It's a very valuable service.

I don't know how deep your in basket is, but I think that's a very valuable service and will become more so, because we're just progressively inundated with more information on-line and off-line still. So we serve as an information clearinghouse, both information from the societies and our corporate members as well as government reports and a wide range of periodicals.

Secondly, we do some analysis and interpretation, such as looking at trends.

Third, we do coordination and facilitation, helping to bring folks together. We were delighted to be invited to this meeting, and we help coordinate this kind of thing with different groupings of people. We don't have a vested interest in any one particular discipline or in any one particular point of view.

Just a quick slide with a few of the representative issues that have been addressed over the past four decades (see Appendix A, p. 8.5). I tried to organize some of these because they're cradle-to-grave sorts of issues -- everything from preparation of students through retirement, with perhaps more attention now on career transitions, because of job mobility. So a range of issues has been addressed over the years, depending on their timeliness, and I just put a few here that cut across the tracks of the careers of scientists and engineers.

We've got three general areas of products and services: publications that we do to disseminate information -- and we will be on the Internet June 1st when we're over at AAAS; symposium presentations such as this; and a small component doing some grant-supported research. Right now we're finishing a grant for Sloan, taking a look at the situation for young researchers, and that will be out this summer.

Two final slides, just a few thoughts that we had as Alan and Eleanor and I talked about spending a little time with you today. One is that we're delighted to see NSF supporting collaborations such as this meeting. They have the resources to pull together a large group like this. That's great. We would like to continue in the role of facilitating this dissemination and cross-fertilization between meetings such as this.

You all need to get on with the survey research that you have to do in your jobs running professional societies and associations. I've had the privilege of doing both of those activities in my career, and I have an appreciation for how much energy that takes.

We have the privilege of being able to look across all the groups and look for trends, to connect people, to bring people together and so forth.

We would like to be able to include in our data set that we share with the larger community data from other professional societies that we don't currently have. We're talking to people at FASEB and will be talking to other organizations about that.

I also think it's extremely important that we start to bring in the perspectives of academic institutions. The academic institutions are under more and more pressure to do outcomes assessments, taking a look at where their graduates are going and using a feedback loop into their program evaluation. I think partnering with academic institutions and their organizations will be increasingly important.

Finally, we'd like to assist -- actually, compared to the one person who said they had half a person doing research, we're huge. We have three full-time staff. While we're not going to be doing huge data collection and analysis efforts, we can do some specialized analyses. I think you all have sometimes been frustrated that there are resources to collect all this data, but there's just not the time to do some of the special analyses once you've got it. While we can't do a lot of that, we can target some.

Alan Fechter, Catherine Gaddy, and Eleanor Babco, Commission on Professionals in Science and Technology

I would like to add to Cathy's comment my appreciation for pulling this together. I also want to say that as we do this project for the Sloan Foundation -- putting together a report on what we know in terms of the information bases that can tell us what's happening to investigators and where the world is going -- two things struck us. One is that we have two types of data sources: the comprehensive national data coming out of NSF and BLS and the collaboration with the NRC, which is usually accused of coming out too late; and the data that come from the societies, which seem to be more current and more focused.

One way to start this afternoon's discussion might be to ask where the complementarities are between those two types of sources and how we might put them together.

One of the things that struck me about the data is the need for more disaggregated data -- and the COSEPUP report is clearly an example of that. COSEPUP used a lot of the NSF data, but it was aggregate data, by and large. They had to go and get special tabulations to be able to do the kind of disaggregated analysis that allowed us to look at what's happening to the young investigators and the young scientists coming out.

If you look at the NSF data bases and NSF reports with respect to things like unemployment rates and under-employment rates, they don't show you much. Nothing is going on there. And the reason is that they cover the entire workforce. Whereas if you look at the society data, their most interesting data deal with people who are just out of school. The signs of problems become much clearer if you look at the data disaggregated, and I think there's a real question here as to how we should be going.
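To make that point concrete, here is a minimal sketch in Python using purely invented shares and rates (the figures below are assumptions for illustration, not survey results), showing how an aggregate unemployment rate can look unremarkable even when the rate for people just out of school is high.

    # Illustrative only: invented figures showing how aggregation can hide a
    # problem that is concentrated among recent graduates.
    segments = {
        "recent graduates": (0.05, 0.080),  # 5% of the workforce, 8.0% unemployed
        "experienced":      (0.95, 0.015),  # 95% of the workforce, 1.5% unemployed
    }

    # The aggregate rate is the share-weighted average across segments.
    aggregate = sum(share * rate for share, rate in segments.values())

    print(f"aggregate unemployment rate: {aggregate:.1%}")  # about 1.8% -- looks unremarkable
    for name, (share, rate) in segments.items():
        print(f"{name}: {rate:.1%}")  # the 8% figure only shows up disaggregated

The aggregate number washes out the subgroup signal, which is exactly why special tabulations and disaggregated analyses matter.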

The second, related point is that I think the gaps that exist now are really gaps in the utilization of what's there with respect to the NSF data. Generally speaking, the only NSF data people have to work with come from the published sources. That's very limited in terms of the kinds of information that are actually out there.

To get at issues that are important, you really have to disaggregate the data in certain ways. That requires special requests and special tabulations, and I think we have to think about how we work those things together.

BROWN: Alan, I couldn't agree more. There's no way we can put out publications that meet everybody's needs for detailed information. And so you'll hear this afternoon how we are working to improve our data bases and the accessibility of them and our ability to disseminate the data to anybody who can use it.

Are there other comments? I was just wondering, maybe you should say a few more words about your Sloan Foundation project -- I'm not sure if everybody knows about it.

FECHTER: Unfortunately, the timing was not very good. The Sloan contract was developed a couple of years ago. Betty Vetter was then the executive director of the commission, and, of course, we all regret and mourn her loss. We are very fortunate to have Catherine Gaddy come in to replace her. She brings a lot of energy and enthusiasm and, as you noticed from her presentation, a collaborative spirit to the process.

The report fell through the cracks, basically. We had a workshop last June, out at the AIP. A group similar to this one was convened to ask questions and to parse the issue in some ways. And based on that workshop and the research that Betty had done earlier to look at what's there in the way of data, we're developing the report.

The report will have major sections. One section deals with the question: What's the objective of collecting the information? It makes no sense to make recommendations about data collection unless you know what it's going to be used for. Because if you don't know what it's used for, any data will do.

The second issue that I think needs to be addressed is what the appropriate indicators are for any particular problem. We're dealing with the young investigator problem -- the young researchers and the labor market for those people. Standard measures of unemployment rates may or may not be meaningful for scientists and engineers, simply because most scientists and engineers, because of their capabilities (either acquired or genetic), are capable of finding jobs. It's not a question of their not having jobs. It's a question of whether they are using their skills most effectively in the jobs that they're working at. We therefore need indicators of market conditions that may be non-traditional.

The third issue, I think, is determining the criteria we use for deciding whether things have changed or not. I don't think we've really addressed that very well with respect to science and engineering. BLS has criteria when it comes to the unemployment rate -- or at least it used to. They say that if the unemployment rate doesn't go up by at least 1 percent, and if that increase doesn't continue for at least one month beyond that time, there's no trend, because it could be part of the statistical noise.
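As a rough illustration of that kind of criterion, here is a minimal sketch in Python of a threshold-plus-persistence rule. The 1-point threshold and one-period persistence come from the description above as the speaker recalls it; they are assumptions for illustration, not an official BLS specification.

    # Sketch of a change-detection rule: flag a change only if the unemployment
    # rate rises by at least `threshold` points AND the elevated level persists
    # for at least `persistence` more periods; otherwise treat it as noise.
    def signals_change(rates, threshold=1.0, persistence=1):
        """rates: unemployment rates in percent, in time order."""
        for i in range(1, len(rates)):
            if rates[i] - rates[i - 1] >= threshold:
                later = rates[i + 1 : i + 1 + persistence]
                if len(later) == persistence and all(r >= rates[i] for r in later):
                    return True
        return False

    print(signals_change([4.0, 4.2, 4.1, 4.3]))  # False: small moves, likely noise
    print(signals_change([4.0, 5.2, 5.3, 5.4]))  # True: a 1+ point jump that persists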

We don't have any criteria for deciding whether the world has changed with respect to science and engineering, and we paid a hell of a price for that with respect to the young investigator issue. We are behind the power curve on that issue because we didn't know what to look for. I think that's what we have to address this afternoon. That's what's going to be in the report, too, I hope.

SHOEMAKER: The American Society for Microbiology was not asked to make a presentation this morning. But I would like to bring to your attention the fact that ASM has commissioned Westat, which is a nationally recognized research survey firm, to conduct a survey of employment and training needs in the microbiological sciences. We conducted focus groups to define the issues for the survey last year, and we have now signed a contract with Westat, and the survey is underway.

We have already sent out screener surveys to construct the sample frame, and we are just about ready to send out questionnaires to employers in four different sectors: academia, government, industry, and the clinical and medical areas. We expect to have a report back in September, which I think will be of interest.

MAISEL: Both what was just said and what was said earlier lead me to a question, and I'm not sure if there's a way to get the answer. We've heard from four groups that are really umbrella groups. And in two cases, the biology umbrella group and the electrical engineering umbrella group, and perhaps in the engineering umbrella group as well, there was not awareness at the umbrella level of everything that's going on at the constituent level.

Now, is there a mechanism for being sure that we've touched all the constituent bases? For example, one of my purposes in coming to the meeting, which is what led to the question, was that I was very curious about the data in microbiology, because Rita Colwell is a personal friend of mine, and when I spoke to her about employment problems, she didn't think there were any. And I wondered whether that was because it was her experience that there weren't any or because there just aren't data that would convince her that there are.

The same thing was true of what kinds of surveys are being done by the Computer Society. I'm very interested in what their view of unemployment is as opposed to the view on the computer science side. Is there some place where there is a check to make sure all the constituent bases have been touched?

FECHTER: I think in the case of microbiology the answer is that they are at the early stages of trying to put something together. But I also think your second hypothesis is probably true, too. If you ask Rita Colwell, you're going to get Rita Colwell's perception of what's going on.

I will give you a better example. When the COSEPUP was putting together its report on graduate education, David Goodstein came to town to talk about his perceptions of what was going on. For those of you who don't know, he has a very apocalyptic view of the world, that research universities are flat, there's no more growth, we've reached the asymptote, that's the end of it, folks.

He gave his presentation, and I won't mention names, I guess, but a Nobel laureate from a very big East Coast university, a life scientist, said, "I have no problems funding my students."

Well, I think the problem is that even within a given society you have different groupings of people, depending on where they're coming from. And I think for that reason it's important for any of us who worry about surveys and survey methodologies to think about how we stratify in a way that will allow us to pick up these sub-currents that will be there.

VOYTUK: Maybe to add to your question, there are some sideline data collection activities, especially in the biological areas, such as the biomedical needs study, which is done for NIH by the NRC. That's a fairly data-intensive study. Over the past few years, information has been collected on the workforce, and we're now trying to piece together what the various data sources are and what information they have. In other words, it's not mainstream data collection, but there are some activities to try and look at these sub-organizations which might have data.

FECHTER: I just wanted to follow up on what Jim said. I think it's an important point. When the biomedical research personnel needs study committee was trying to look at the labor market conditions for life scientists, biomedical scientists, and the behavioral scientists, none of the indicators were giving strong signals. It's just not clear what's going on there.

But we played some games. We tried very hard to find things that would tell us something: the post-doc data, the unemployment rate data, the salary data.

One of the things that was intriguing, which I think needs to be talked about and followed up on, and which we talk about at the commission, is what goes on at professional society meetings in what I call the cattle market, where employers come to recruit and students -- and people who are beyond being students -- come looking for jobs. Information on the activity levels at these events could, I think, be promising. The only good source we found was the neuroscientists.

GARRISON: We tried to look at that. We found a number of variables. First of all, our meetings have different numbers of societies joining together, so the number of societies at the meeting varies from year to year. We've also found that even with comparable groups, the geographic location is important; e.g., Anaheim got a lot more job applicants than other locations did.
