Group Discussion


Improving our understanding of the job market for scientists and engineers
Plans for future work and other action


Improving our understanding of the job market for scientists and engineers

Discussion Leader: Al Tupek, Deputy Division Director, SRS

I was asked to make a number of follow-up comments on the SESTAT presentation. We just acquired a Sun workstation on Monday. The purpose of that Sun workstation is to make available the data that Carlos has described -- the SESTAT data system. All the surveys in there will be available to you so that you can analyze the data in any way you want. You will be able to use that Sun workstation interactively from wherever you are, and if you don't like the way Mark has done it, you can do it another way and find things your own way to the extent possible.

Of course, the system I mentioned earlier that is on CD-ROM will also be put up on that Sun workstation, so there will be a couple of very interesting data bases available soon.

MAXWELL: That's this CASPAR CD-ROM thing?

TUPEK: CASPAR is the CD-ROM with the data about academic institutions, but SESTAT, the data on personnel, will be available on the Sun workstation and on CD-ROM.

KRUYTBOSCH: The last page of this handout (see 11.6) has a sample of the computer screen that you will be able to reach through the World Wide Web, where you can actually select the variables you want.

TUPEK: I should also mention something about CASPAR. Apparently we charge a fair price for CASPAR and a lot of people don't want to pay it -- it is $300-and-some. You can download it through our gopher for free. Now, you do need 600 megabytes of storage to download it, but it is free if you want to go to that trouble.

???: The contractor has just informed us that ever since it became available, requests for new CD-ROMs have really dropped off, so they may not be the vehicle of choice at this point.

VOYTUK: Well, we may need parts of it, but we don't need 600 megabytes.

TUPEK: That's true, if you know what you are doing. That's a good point.

The purpose of this session is really to be a brainstorming session. If you look at our agenda, there are a number of questions that we can talk about. What is important is that we focus on your most important concerns and issues, and on the ideas you have for working closer together and in better collaboration.

This can be an open discussion. Everybody in the room should feel free to participate. During the last session, the last half hour, which Carolyn Shettle is going to be leading, we will be looking at where we will be going from here.

We don't need to come to any conclusions. Let's just talk about the issues you feel are most important. There are quite a few people here today and we only have an hour or maybe even less to discuss this, so let's try to keep the discussions as brief as possible. Let's try to give everybody an opportunity to say something about some of the issues that they feel are most important.

We are not going to structure it in any way. It's an open discussion, and any topics you feel are relevant, feel free to discuss.

KOHOUT: I have worked a lot with NRC and NSF data over the years. I consider them really good. Also, the people listen to you when you call with your concerns about how questions are asked, how the field is being described, etc. What I might suggest is that if any of the associations or professional societies have problems with the way that fields are being presented or the questions are asked that those be raised to the staff here so that they can be considered.

I would like to suggest that this group work together to come up with standardized ways of asking questions. I don't know if the minimum data set is a possibility but certainly standardizing questions.

I would like, for psychology, to raise the issue -- and I am sure there are other fields that are starting to look into this -- of a group of people called Psy.D.s who are not included at all. They are not Ph.D.s, they are Psy.D.s, and it is a fairly major group, yet they are not included in this data set. Another issue is information on multiple employment positions, for folks who are employed in more than one place or who may be combining teaching and research. Can we get that kind of data?

Those are just some of the issues -- the major issues for me as a consumer.

ELLIS: That is related to multiple employment. In the past there has been a tendency, I don't know if it is in your current system or not, to classify people's specialty in one way and one way only. You have this and this only.

And that doesn't describe real people, particularly more experienced people. Lots of people have multiple specialties. One of the consequences of this has been a serious under-count of the people who are available and who do have skills in those particular areas, because you are only picking up people who report principal skills or current skills.

And so in national manpower estimates when people look into the needs for certain kinds of esoteric specialties, there is a tendency to under-count the number of people who are there, because nobody has picked up people who have skills but are not using them at the moment.

And in modern survey research there is no need to stick to the kinds of questions that were coined back in the days of the 80-column card. Can't you make more use of multiple-response information and allow people to have more than one specialty, or a secondary specialty?

KRUYTBOSCH: When you say, specialty, what do you mean by that? You mean educational?

ELLIS: Field of practice. For example, in our IEEE salary and benefit study, there is a long list of principal areas of technical competence, and for example, radiation systems (meaning antenna development) and engineering management. Well, lots of people have both of those specialties. To force people into picking and choosing when lots of people would say, hey, I have three of these, just doesn't make sense.

It's another case in which we bean-counters are imposing our reality on somebody else's work.

KRUYTBOSCH: I think your point is well taken, but, for example, NSF has a system for classifying the various program announcements that get sent out across the approximately 80 fields that we are involved in, each of which has many sub-fields. The catalog for that is about 30 pages long, and you can't put that in the questionnaire. So the other direction we are going is to have people write in and then do some kind of auto-coding. Now, we haven't really done much on that. We are doing an experiment in '95 where we have a question for those who say they are doing some research and development. We are asking them to give us a key word or key phrase about what technology or specialty they are doing research in -- what is their line of research?

That will be a fill-in question, and we will have the problem of figuring out whether we can do auto-coding on it or not. So there is a kind of experiment on that. But I think the remarks are very well taken.
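
[To make the auto-coding idea concrete, here is a minimal sketch assuming a simple keyword match of write-in responses against a field taxonomy. The fields and keywords below are hypothetical illustrations, not NSF's actual coding scheme.]

```python
# Minimal keyword-based auto-coder for write-in specialty responses.
# Fields and keywords are hypothetical examples, not an official taxonomy.
FIELD_KEYWORDS = {
    "computer science": ["software", "algorithm", "computing"],
    "materials science": ["polymer", "alloy", "thin film"],
    "biostatistics": ["clinical trial", "epidemiolog", "survival analysis"],
}

def auto_code(response):
    """Return the first field whose keywords appear in the write-in text,
    or 'unresolved' so the case can go to a human coder."""
    text = response.lower()
    for field, keywords in FIELD_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return field
    return "unresolved"

print(auto_code("survival analysis methods for clinical trials"))  # biostatistics
```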

ELLIS: Well, particularly in the sense that these data bases are sometimes used to respond to perceptions that, particularly in times of national crisis, the nation needs particular kinds of technical expertise. Past kinds of technical expertise can certainly become needed again, and to restrict ourselves to looking only at current expertise, for example, may mean that the data base gets excessively trendy and, in a sense, overly current and overly narrow.

FECHTER: I'm delighted to hear about the coding of the fields. It is going to be difficult, as we are trying to get something richer.

But I want to, for the record, make a couple of points about additional data analyses from existing data sources. I think the most important one, again in the current situation, is much more intensive analysis of the period of time between the receipt of the doctorate and the first real job, which covers the period of post-docs and early career transitions. We have longitudinal data that will allow us to work on this; the NRC and NSF data bases are clearly a prime target for that kind of analysis.

And maybe even the recent graduates data that we are now beginning to get in a longitudinal way could be important to that. I think that is a very important gap in our understanding of what is happening in labor markets that needs to be filled.

The other point I want to make about traditional data analysis, related to the discussion that just took place, is that when I ask myself where the complementarity exists between the professional society data and the national comprehensive data, the one thing that leaps out at me is that the professional society data should give us a lot of rich information about the kinds of specialization that we've just been talking about.

One of the things that is very important to consider is how do we do that linking between the professional society data and the comprehensive national data? I think to do that well we have to start asking questions not only in terms of whether you are a member of a professional society, but what is the name of the society? You have to ask people to tell you what society they belong to, and then you can begin to do analysis about representativeness and non-representativeness that can lead you down the road to being that much more sure.

KRUYTBOSCH: The overlap issue I think has got to be important.

FECHTER: Yes, exactly.

GARRISON: I know that we have overlapping membership with the American Society for Microbiology, but all I know about that is anecdotal. I can't tell from my nine societies what the patterns of membership overlap are. Our president is currently a member of ASL, and I want to believe that your president is a member of one of our societies.

FECHTER: Yes, I think that really comes under the second question. There is a gap that needs to be filled there about professional society affiliation -- a gap that NSF, I think, can fill very nicely for all of us if they were simply to collect that information in the interim.

SHETTLE: Or, in theory, though it would be expensive and we are not committing to anything, a merge of professional society membership lists with some of our data bases. I am just throwing that out.

FECHTER: Yes, I understand.

VOYTUK: What links exist? For example, we have social security numbers.

GARRISON: We don't have social security numbers for all of our members. Only one society out of nine collects social security data.

FECHTER: And maybe it is going to be difficult as well.

KOHOUT: But that's why you want to work on a standard data base.

GARRISON: I think that would be very helpful for us for a number of reasons. First of all, even in terms of initiating a project such as this, that's the first thing people ask -- will it take a lot of time, a lot of effort? No, it won't, we've got a survey here.

Well, we may have some special questions -- we are different from the rest of those other people. Well, do you want to take the time to develop it with a committee and oversight and review and pre-test, or do you want to take this one that other people are using and that allows you to link to the NSF data base? We'll take that one; everybody will.

HOWERY: We have spoken a great deal today about the supply side -- what happens to somebody when they graduate from school -- and there is also the possibility of doing something on the demand side. One thing our American Statistical Association did to understand demand very cheaply was that one of the members went to the Amstat News, which is our news magazine, looked at the advertisements by specialty field, and just took a count by year so that you could see the trends.

For example, we saw a trend, as expected, in bio-statistics (medical statistics), which had a tremendous increase in the number of advertisements. Other societies may wish to do the same. Something like this was done very rapidly, at minimal cost, and it was quite useful.

FECHTER: There was something at one time called the high-technology recruiting index, done by Deutsche and Evans, which was a very good indicator of what was going on in the engineering and technical labor markets. They used square inches of advertising in technical journals as their indicator.

CZUJKO: A couple of things. First, I want to thank you for inviting me here. It's been a very thought provoking and exciting day and it was also an opportunity to finally meet so many of the people I have talked to over the phone over these years.

Just to run through a couple of quick lists of issues that are important to us in the physics community. One is the international flow of experienced scientists. To give you a quick example, about two years ago I did a survey for the American Physical Society, which is the largest physics society in the world. Some 8,000 of their members live outside of the U.S., and that is bigger than a lot of professional organizations.

Among the things we found out from their group was that of the people who belonged to the American Physical Society in developing countries, about 3/4 had been students here. Of those in developed countries, about 3/4 had spent time working on a scientific collaboration here and then returned home. So there's a tremendous amount of international flow, and it's not all permanently staying here or permanently leaving.

The post-doc and its outcomes have been mentioned a number of times -- it's a really important issue.

One that hasn't been mentioned yet -- I toss it out because it's a good time to start planning for it -- is the retirement process. We know from the data we've collected that it's not simply that you stop working. You start a second career on a full-time or a part-time basis, and it's a multi-step process. I think there's a lot of important data to be gathered there for the future.

The diversity of careers and the constellation of skill sets are also important, and some of the bachelor's data that we saw just a little while ago is very exciting.

I think that we need to talk about more than what your degree field is. I don't know if NSF can do that, because you already have 280 degree fields, but it is possible -- and we have been doing it over the past year now -- to develop a richer understanding of the skills and tools that people say they are expert in: programming, using sophisticated lab equipment, math, problem solving, and so on. The fungibility issue is related to that one.

On the more technical side, there are the issues of electronic data collection -- when are we going to start doing it, and what problems are going to develop as comparisons are made across data sets, obviously not only across your data sets but between our organizations' data and yours. I think this is a place where, if we can standardize some of our key questions, it would give us the ability to link not case-by-case but subgroup-by-subgroup.

And I think that may be good enough without expanding your questionnaire form for that. And that's it.

GARRISON: Well, I would just like to say that I think the employment data will be a useful source of information. We are currently starting an on-line employment bulletin board at FASEB. It will be able to identify specialties at a very fine level of specialization and be very helpful in that respect.

But I think the difficulty for using it for trend analyses of demand, at least for the immediate future, is that as a new venture we are hoping it's going to grow and reach new audiences. As a measure of change over time it's going to represent our marketing effort as much as anything else.

FECHTER: Thinking about what we've been hearing around the table, I want to throw out a first reaction to it. I think that it means that we have to think about flexibility in terms of how we undertake our surveys, particularly I think for the surveys at NSF. On the one hand, we are talking about the need to know something about the post-docs -- after the Ph.D. and into the first job, knowing more about that.

At the other end, we want to know something about leaving the work force. Now, I am a sample of one. I can tell you all about my experiences, but that wouldn't do. You need to have some information at that end. And it seems to me that we have to think about whether one survey instrument suits all needs and issues.

I don't know what to do about it, other than to say I think you have to think about that. I think we have to think about sampling strategies in a similar way. That is if you really feel there is going to be a need for information at these levels, what does that say about sampling strategies with respect to these surveys, given an era of limited resources?

MAISEL: I was waiting, but no one said it. What I heard today was an emphasis on answering the question, what was? What I haven't heard is an emphasis on answering the question, what will be? Now, from the decision maker's point of view, what will be is much more important than what was. And we have been burned with what will be -- I know that -- but we have been burned perhaps because we don't know how to answer the question.

I think research devoted to really figuring out the answer to the question, what will be, is a good investment, and I haven't heard anything about that at all.

BROWN: Now that Alan has retired and is doing three times as much as ever, you might say a few words about the modeling effort in the discipline as you see it.

FECHTER: Well, okay. I think I count myself as an on-faith believer in modeling, for the very reasons that you mentioned. We need to know about the future to make any sense of the decisions. We don't do it very well. We didn't do it very well in the past, in part, I think, because it wasn't done properly. What was done was done in the context of saying there is an answer, here it is. Not only was the number wrong, but the sign was wrong.

I think doing some scenario analyses is very important, because if you are going to do scenario analyses, you do them on the basis of something. Most times when you do these kinds of projection models, history is an important input to determine what the parameter values are that you are going to be using to project the future. Otherwise you go by the seat of your pants or you can go to experts to come up with their answers.

I think that modeling efforts would help us understand the demand side with scenarios, because we don't know what the future is going to bring with respect to demand. On the supply side, we do have leading indicators. We have Ph.D. production and B.A. and M.A. degree production. We also have information on freshman intentions, all of which are leading indicators that can tell you something about where the world is going to go with respect to domestic supply.

With respect to foreign supply, your guess is as good as mine as to what we can do. On the supply side, we can get some information based on past flows because the world doesn't change very dramatically in terms of the rates of flow that we see. We can obtain selective future characteristics of what the work force is going to look like.

We are looking at some of these kinds of efforts. I was looking at them at the National Research Council, and I am still finishing up a report that will deal with using demographic modeling as a way of projecting the future characteristics of the work force.

The demand side is more difficult, and I think it is on the demand side that we have to think about scenario analyses. Even on the supply side, the demographic models all assume we are taking an historic rate of flow and using that as a way of projecting out into the future. Those flows are obviously influenced by market conditions: in-flows will go down when the market gets bad, and out-flows will go up when the market gets bad. Some way of linking these flows to current market conditions needs to be found.
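
[A minimal sketch of the stock-and-flow projection described here: apply an historic entry flow and exit rate to the current work force and iterate forward. All numbers are invented for illustration; they are not estimates of the actual S&E work force.]

```python
# Illustrative demographic stock-flow projection with made-up parameters.
stock = 500_000       # hypothetical current doctoral work force
inflow = 25_000       # hypothetical annual entrants (historic flow)
exit_rate = 0.03      # hypothetical annual exit rate (retirement, field switching)

for year in range(1996, 2001):
    stock = stock + inflow - exit_rate * stock
    print(year, round(stock))
```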

So I think we can model the future and that's important. We can also look in a less ambitious way for leading indicators of where the market is going to go. There is freshman norms data, for example, that UCLA produces that may be a very good leading indicator of future supply.

If, in fact, intentions track well, we can do the empirical analysis and see how well they bear out with respect to what people actually do. Profit rates in industrial corporations, for example, may affect R & D expenditures which, in turn, affect the demand for Ph.D.s in particular. Maybe that is something that can be a leading indicator.

We need to find ways of at least coming up with leading indicators as a less ambitious way of dealing with the future. And I think that would be very good. Where we went wrong in the past, I think, is that we had a long period of time where things went up. And suddenly -- it happened in the early '70s and now it's happening in the early '90s -- things stopped going up. Then everybody gets excited about our ability to track labor markets. I think from the point of view of data collection and data analysis, what is important for us to know is when the rates of change change. Being able to identify when a trend changes is a very important objective that we should be thinking about.

BROWN: I believe you started talking about modeling and then also talked or implied it was important to disseminate our data faster. Is that part of your thought?

FECHTER: Well, one step before that I was answering your question about what data. What should we be looking at in terms of the data that we want to get out faster. You are right about that. I think we missed the boat in terms of these shifts in trend because we didn't get the data out fast enough or we didn't look for the right things in the data. I think one of the things we didn't look for was what is happening at the intersection between getting the Ph.D. and the early part of the career?

A lot of things are going on there that just weren't picked up because we just weren't looking at that. We have to ask the question of what is it we want to look at and then get that out fast.

And then before that, you can get leading indicators of things that might tell us when the trend is going to change. If profits were going down, for example, and that has an effect on industrial demand for scientists and engineers through R & D, could that have been an early warning signal that we should have looked at?

MAISEL: Alan, I guess what I meant was that we don't know how to predict. I really mean that. For example, I will ask the question -- which is the more important information, the fact that there are 250,000 Ph.D.s in engineering being produced in Southeast Asia who work for 10 cents on the dollar, or the number of Ph.D.s that we will produce in 1999?

FECHTER: It depends on the question.

MAISEL: Which is? I don't know. What is the more important piece of information?

FECHTER: When Gertrude Stein died, around her deathbed were a bunch of people and they were known innovators -- they knew what was happening. And they said, what is the answer? And Gertrude Stein's final words were, what is the question?

And I think you can't talk about this issue unless you answer that one first.

CONLON: Alan, how do you determine when a new Ph.D. has settled into what you call the first real job?

FECHTER: I don't know.

CONLON: Is it 10 years later that you ask for them to look back, when did you settle into your first real job?

FECHTER: It may vary a lot, is how I would answer that question. A post-doc is not a first real job.

CONLON: No.

FECHTER: Maybe some soft money positions in academia aren't really first real jobs.

CONLON: So you say redefine.

FECHTER: Contract jobs. I think you have to sit down with people in the field and ask that question. We can't answer that question.

ELLIS: I think there is a tendency to cast these questions in ways which, in a sense, are asking the wrong questions. When people ask for serious forecasts about the outlook for the employment of people, my standard answer is that I have a crystal ball on my desk. My one standard litany is that it's dependent on the U.S. economy, among other things. I ask them to tell me which way the economy is going, because I would like to get rich too. We point out what people on Wall Street have said for years: if anybody knew how to predict the economy effectively, he would be an absolute idiot to tell. It is inherent in the logic of the thing. There are probably questions that we are asked that may, in fact, be unanswerable, or at least not answerable in the sense that we think we need an answer.

And, yet, at the same time, I think there probably is a tendency to give up and that may be premature. I think there are great and worthwhile gains to be had from scenario building and intelligent playing with models in the sense that they can narrow the scope of uncertainty.

They can give people the sense of how fast trends are moving so that when people hear about developments, they can get more accurate information on whether this is something that is going to affect everybody in the whole world tomorrow or whether it is something that is going to affect a few people gradually over time, so that people have time to prepare sensibly for it.

You can begin to make discussions more rational. Now, that's a serious payoff. I think if people have more realistic expectations for what they can get out of thinking about these issues, then you have a chance of making some headway. So I think at the same time it is important for us not to demand that we reach the millennium right away in terms of our capabilities.

SRS did an excellent review of efforts to predict demand a few years ago, in the form of a little paper. It's a nice piece of work. The one thing I didn't like about it is that after looking over the record, which was not terribly reassuring, they sort of threw up their hands and said it probably can't be done at all.

And I thought that was giving up. I don't think people should quit trying. When Eleanor and I were looking at the Foundation's efforts to do crystal ball gazing a few years ago, one of the things that hit me was what a shame it was to have to say that we didn't trust what the Foundation had come up with, because it had the effect of making people distrust that process. I thought that was bad.

I would love to see what could happen if somebody could go back to work on those and just do better work.

So I don't think that we should ask quite so much out of people to begin with. I think we need a more realistic sense of what part of the research we can do.

FECHTER: I think a lot of people make a distinction between predictions and projections. I think that is an important distinction. I would agree with Herb: you can't predict. But I wouldn't agree that we can't project. I think the name of the game is being able to lay out sensible ranges of possible outcomes, on an informed basis, that we could put up there for people to know.

ELLIS: I would go one step further. I think the work of Herman Kahn proved, in his area, that it doesn't even necessarily have to be realistic to be useful. And I think sometimes discussing outrageous projections is still instructive.

KRUYTBOSCH: Often the interests of groups in careers and what have you are very specific, and trying to predict or even project the need for Ph.D.s in a given area is difficult, because their skills are so fungible -- look, for example, at the emergence of computer science.

I remember what we went through, during the whole '70s, jingling around educational categories in the doctoral survey. The computer people did the survey to try to refine the taxonomies, but we were always just a little bit behind the ball, because we didn't know what was going to happen. Then, we had to try to run and create an instrument to measure what was happening. Then you discover, well, it wasn't well calibrated and you have to change it again.

Well, now that has probably settled down but that's happening all the time in a whole variety of fields.

ELLIS: One more comment and then I will shut up. I think one of the reasons SRS -- and the Foundation in general -- has had this difficulty with this kind of game is that there's been a tendency, first off, to avoid projects that were sort of informal.

But with this kind of work, sometimes it's wise to be informal and less definite and less definitive. Because you are in an arena where you can't really be very definitive and be honest. And also for some of these issues it is a decided advantage to be small and freewheeling and to be able to do fast response work and that has never been our forte in any area, but some Federal agencies have that capability.

And I think it is something that if you want to get into this game, you are going to need to be more informal and to be able to react quickly to developing areas of interest. That is the kind of capability that the Foundation really has not paid a whole lot of attention to, although I think it is practicable. I think that is the kind of capability you need to be developing.

GARRISON: Well, I agree it's a good idea to be able to look forward to the future instead of just looking at the past. But I think one of the constraints that a lot of us who represent professional societies face is that we represent organizations with here and now interests. And I am not sure we can do all that much in projecting future trends given the missions, mandates, and organizations we have.

Carlos made the point that the fields change. That's something I observe all the time. We've got organizations, and we see more organizations emerging all the time. The Protein Society is in just one of the hottest growing fields around. We hear that they have 7,000 members; they are almost as big as one of our oldest societies, physiology. Proteins fold in very particular ways, and that is very important to how they function, so people who understand this are very much in demand. Leukocyte biology is another hot area. Immunology is sort of stable now, but the leukocyte people are going through the roof.

And from their point of view, it's a great market out there. They wish there were more of these people. You have an organization that comes along and says, we don't have any problems. And the organizations that are around have established interests as well.

So I am not sure that we can necessarily be the people pointing to the future, given where we are in our organizations. We can provide some data and it's useful data but I think--

CONLON: But you know more about your own people.

NEUSCHATZ: Yes, and we also have to remember that however chaotic the market economy is, the chaos pales and looks incredibly orderly compared to the political process. A lot of our demand is driven, especially in certain disciplines, not just by economic matters but also by political matters, which can turn on a dime, as we are finding out right now.

So I guess there also needs to be humility even in our projections. Even if we don't call them predictions, but projections, they are subject to so many unknowns that it is an important exercise to do and it is important to set out scenarios, but even our scenarios are going to be very, very rough guesses.

Especially since any prediction sometimes engenders a response which then changes the reality that you are trying to predict -- especially if it has some political clout or some movement among the decision makers.

So again, these are things that have so many uncertainties built into them that I think we really have to be careful. And it is no accident that we focus, for the most part, on the current situation and on trends that reinforce where we are in the current situation. It gets very murky out there.

FECHTER: Following up on that, let me reinforce the point I tried to make earlier which is that especially for the data collecting organizations, the strength is not so much in what they can say about the future, but what they can say about where we are and where we have been.

And what is important is that, in thinking about the future, we develop the capability for people to come together and talk about a paradigm. Physics would have a paradigm, for example, about where the future was going in terms of employment opportunities, and what needs to be done is that when the data violate the assumptions that underlie that paradigm, we have to be out there fast to say that.

That's the key. As data collectors and data disseminators, let decision makers know when their paradigm of the world is no longer the proper paradigm.

VOYTUK: Let me make just one comment on the projection modeling. It has to do with what our colleagues have said before about serving the membership, and there is an important issue here.

That is that professional organizations have a very difficult time in making projections, specifically projections which go contrary to their hope. We need some sort of other organization or body, such as NSF, to do that sort of thing.

Changing the theme -- we've talked a lot about saying what is. However, I don't think we do a very good job of gathering information about why something is happening, about what people think.

And in particular, there is this issue with regard to post-docs. We know that there are multiple types of post-docs and that people are still on post-docs after seven years, and we make conjectures as to why that is happening -- that there are no jobs out there -- but we really don't know. In the biological sciences the post-doc period may be extending simply because these people need to know more and, therefore, they're going through a longer period of additional training. That may be true in other fields or, again, that may be optimistic.

My guess is that it is probably the job market, but I don't think we have really good information about why people are doing things. We do ask questions like, is this the job that you wanted, or are you seeking some other type of employment? But really getting a handle on the wide range of particular situations is something that you don't get from a survey.

SHOEMAKER: I would just like to bring up industry, which we haven't talked about very much. We do know that scientists and engineers are probably going to be going into industry more than into universities in the future. And you have a workshop with professional societies, but I think you also need to collect some data to find out what the needs are in industry. For instance, are we training people adequately in the areas that are in need or in demand by industry?

POLLAK: I'm sure this story has been told, I haven't been here the whole time. But I want to say one word here that I think is the most important for the group here and that's the comparability of data across societies.

When I was preparing chapter three of Indicators the last go around, I wanted two tables in there. And the SRS data we had only went through 1991 -- so I looked at professional society data for those two tables and I really tried to find comparable data across societies that I could put in one table. I looked at data from the Chemical Society, from the physicists, and from the math societies and I couldn't get comparable data. The two tables I wanted were unemployment and under-employment and starting salaries for new Ph.D.s.

So that is one of the things I would like to see accomplished from these meetings, is agreement about definitions and the presentation of data, so that it can be used by the societies and the Federal Government to compare different types of fields.

BROWN: That is true -- it was Melissa's idea to have this conference in the first place, and it did grow out of her Indicators chapter. There are a couple of approaches to that. One is to bring the concepts together in the questionnaires so that X years from now, they will be comparable.

Another approach, more in the here and now, is to do additional analytic work to try to bring together the statistics that we now have and to make them comparable.

FECHTER: There is a third option and that is that the noncomparabilities have largely to do with the levels of the statistics that you are talking about. If you forget about the levels and only worry about changes in those levels, you can go to indexes. And if you compare an index of the AIP with an index of ACS it may not be a bad comparison.
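
[A minimal sketch of the index idea: rebase each society's series to a common base year so that changes, rather than levels, are compared. The two series below are invented numbers, not AIP or ACS data.]

```python
# Rebase each society's (noncomparable) series to base year = 100 so trends
# can be compared even though the underlying definitions and levels differ.
society_a = {1989: 2.0, 1991: 5.0, 1993: 8.9}   # invented rates, percent
society_b = {1989: 1.2, 1991: 2.3, 1993: 4.1}   # invented rates, percent

def to_index(series, base_year=1989):
    base = series[base_year]
    return {year: round(100 * value / base, 1) for year, value in series.items()}

print(to_index(society_a))
print(to_index(society_b))
```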

MAXWELL: I don't know if this question really fits the general theme of these discussions, but I am curious about what people think might be the explanation for why the individual society data on unemployment and, in some cases, under-employment seem so much worse than what you get from the NSF data, the more broadly based data.

The AMS is going to put out a number some time this summer, which is its final best estimate of the people who didn't have the kind of committed employment that would keep them from being viewed as unemployed -- they didn't have employment plans in place for the fall of '94 -- and that number is going to be 10.7 percent.

Now, I've already said in my presentation that that number is not appropriate -- you shouldn't refer to it as true unemployment -- but it is a number we have been collecting the same way. In 1989 and 1990 it was 2 percent. The following year, in the fall of 1991, it was 5 percent. For the fall of 1992 it was 6.7 percent. For the fall of 1993 it was 8.9 percent. And for the fall of 1994 it is 10.7 percent.

I mean, it is a trend, soft as it is. What do other people think about why the numbers that the societies put out individually seem to look so much worse?

NEUSCHATZ: Well, one thing that comes to mind is that the number that you get, and that many of us have, is a number taken soon after graduation. And if the job market is deteriorating, one of the ways that shows up is a longer period of time for people to arrange employment. So the gap between the two numbers would increase because of that.

So, in other words, the NSF numbers show an increase, but a modest increase. Our numbers show a much more galloping increase, and that might be one difference.

MITCHELL: Going to that same point -- first of all, it depends on whom you put in the group you are calling recent graduates, how long it has been since they actually graduated, and when you asked about their status.

And on the SDR that is usually a longer period, so more people have had a chance to get jobs than in the professional society surveys.

RAPPORT: I was going to say the same thing. One is measuring the length of time it takes to get a job and one is measuring unemployment for the period, so that is the difference.

HARDY: The other thing to analyze with your own data is whether you are measuring an expectation or not. If it is an expectation of some future event, it is greatly influenced by the newspaper and the general psychological climate that influences one's expectations. So you want to look at your question and decide whether it is influenced by the psychology of the moment, or whether you asked for a factual statement, such as, are you employed or are you not employed?

There are many ways to compare the employment rate. I know there are thousands of ways of taking them apart. I think it's important to go through that to understand it. There may not be anything wrong with your number versus somebody else's number, but they may be measuring different concepts.

MAXWELL: I worry that our number gets exaggerated because of the way we present it -- as the unemployment rate, rather than as a statement of how long it is taking to get employment, which is a really dramatic shift from what things were five, six, or seven years ago.

And so people are really disappointed. Their expectations have been--

MAISEL: But that is an unemployment rate, at a very specific point in a person's career and a specific point in time. It isn't misrepresentation to say it's that; at that point in their career and at that point in time, that's when they experience this great difficulty in getting what they want.

And gradually either their expectations melt away or they keep plugging away and then later on in their career that number changes. So the number is appropriate but it is appropriate just to represent that particular cross-section of time.

RAPPORT: We were talking also about the nature of jobs and the longer post-doc, and there might be a linkage between the two. One reason that people are taking post-docs may be not only that it takes them longer and longer to get a job, but that the bad job market improves the bargaining power of the people trying to hire them. Since post-docs cost less in terms of salary, benefits, and commitment, I think there is a linkage here, too.

FECHTER: You are suggesting exploitation.

RAPPORT: No, I am suggesting a market.

TUPEK: I think this is a good time to transition into the final session. Carolyn is going to lead the discussion, and we will be looking at what our next steps will be. Keep in mind some of the issues that have come up: issues related to projections, issues related to the timeliness of data and the associations having much more timely data than NSF, issues related to data consistency, which has just been brought up, and the lack of demand-side data, among other things.

Plans for future work and other action

Discussion Leader: Carolyn Shettle, Chair, SRS Planning Committee

Ken indicated at the outset of the workshop today that our objective in setting this up was in large part to try to build stronger partnerships between SRS and the professional associations. And so I want to end today's session by asking what is the best way to proceed from here, so that we don't just have a meeting with a lot of great ideas going around and then go away and never have anything happen.

I would like some sense of what is the best way to proceed. We originally said that maybe this will be the first in a series of workshops. That is a possibility, but there may be other ways of getting together that would be better.

Also, I am very interested in getting feedback from you in terms of what are the topics that you might want to discuss. Do we want to continue with this topic, do we want to branch out into other issues that are of mutual concern? Some such issues have been raised today -- international statistics, immigration, R & D funding. Also, what kind of meeting do you want?

I just want to throw it open and get some input now and see where we go. We are very open on directions.

KOHOUT: Several years ago I was lucky enough to be involved, along with social work, clinical nursing, and psychiatry, in sitting down and working with folks at CMHS to come up with a minimum data set and work on comparability of measures. It took two years to come to agreement on some of the issues and definitions for just those four fields, and the task got even harder when we started throwing in some of the counseling categories for the practitioners of psychology. However, it was a very enriching effort, because we did it as a group and we worked together and hammered out the issues and what we could live with in terms of definitions.

But if you want to use our data, and we can use your data, and things can start to mesh together, I think that is the step that has to be made, because right now I find it confusing -- the very different definitions of what is considered to be unemployment and the different ways of approaching even demographic variables. And it would be fun to be able to compare and to learn about the data, so that you have a picture of what is happening.

But it takes time, and I think if we sat down as a group and worked together face to face once a month -- that may not be feasible. It may take a little bit longer, and you may not want to be as extensive or as intensive as that. But that was one thing that we were looking at.

SHETTLE: So you're suggesting the whole group or a sub-group comes together?

KOHOUT: That's something that is open to discussion. It was difficult to do it with a small group of people and it's going to be more difficult to do with a large group but it depends on what people want to do.

FECHTER: Maybe it would be all right to identify particular indicators that you want to see consistency for, focus in on those, and maybe have a meeting to discuss those in particular, okay? And what might emerge from that meeting would be a smaller group that could get together more frequently to work out some kind of protocol to make things more consistent.

KOHOUT: That's what it is supposed to do.

BROWN: Can you be more specific on the subject -- was that comparability?

FECHTER: Yes, yes, but not so much comparability as some large, generic, monolithic thing -- comparability with respect to a particular indicator or set of indicators.

MAISEL: Only because you didn't repeat it, and I want to pitch it again: I would love to see a workshop with a similar group -- perhaps even with groups like the computer society and the IEEE invited, and others -- that speaks to what the societies are doing for their unemployed or under-employed members.

BROWN: That's good.

MAISEL: Unemployed members, or perhaps also information for students currently at the stage where they are starting their careers. Is that part of this?

CZUJKO: No, we are talking about the adjustment process.

MAISEL: What are we doing to help people?

MAXWELL: The AMS, in conjunction with the Society for Industrial Employment, has been setting up, with some support from the Sloan Foundation, this sort of help for people who are trying to make a transition. But the AMS as a society, by itself, is saying it is helping the process simply by making clear to people at an early stage in their education, as much as we can, the reality of employment.

MAISEL: So it's open-ended, anything that would help these people.

FECHTER: The Commission has been talking about putting together a -- they have a biennial symposium, and the subject the Commission is discussing for its next biennial symposium is what I will euphemistically call coping with change, which is basically getting at some of the things that Herb was talking about.

And we are at the very early planning stages of that process, but I just thought I would announce the fact that we will be talking more about it, so you should know that.

GADDY: I have been trying to think about how we can be most helpful and supportive, and I mentioned perhaps doing something between meetings to be helpful. We have a Commission meeting on Tuesday, May 16th, at the American Chemical Society Offner Building. The meeting starts at 9, but it's open from 1 o'clock until 3:30.

So if anyone from NSF or NRC or any of their organizations would like to stop by in that 1-to-3:30 window, it is open. So if there are ways that we can help we would be pleased to.

In the morning we will be debating the future of CPST, so we will keep that closed, but the afternoon will be when the societies present what is going on. We are trying to figure out ways to build better linkages, and if you can think of more specific things that we can do, please join us for that afternoon if you can.

KRUYTBOSCH: Since the constituency of NSF is academia, SRS has always had problems tracking various things going on in industry.

But it seems to me that many professional society members are in industry. That provides a pipeline into this area of work and technology and science. It's presumably a place where a lot of the changes we're very curious about are taking place. For example, we're interested in changing patterns of work, increases in the use of part-time workers, consultants, and temporary workers, in the projectization of work, in the flattening of hierarchies -- all of these things are changing the conditions of work in industry. We really don't have much of a finger on that at all. Yet several of the professional societies have half of their members in industry.

It seems to me this ought to be a resource that could be used to generate at least qualitative ideas about what's going on -- what new technologies are coming in, how things change, how hiring patterns are changing. There might be some fruitful kind of get-together along these lines, with people from the societies who are working in different kinds of industries. We might learn a lot by having some kind of meeting focused on changes going on in that sector.

POLLAK: I agree with what Alan said about looking at a few elements of people that we are talking about. I think that's a good idea and also what Carlos just got finished saying.

I know of three things that I would like to see as the first focus. One is the definition of post-doctorate. Another is unemployment among recent degree recipients -- we've all heard that. And the last one, which I haven't heard much about but which I think is really important, and which creeps into the recommendations in our recent Academy report, is starting salaries for new Ph.D.s. Then, following up on what Carlos said, what is really important is to communicate that information by sector. That is really important information to provide students: whether starting salaries are increasing or going down is a really good indicator of the employment situation out there.

FECHTER: I think that the starting salary is meaningless if you don't have the job. You need to know what the employment probabilities are.

POLLAK: Sure.

FECHTER: And then given that you are employed, what will you be making as a salary.

BROWN: As we discuss these topics, remember that the subject here is also the vehicle by which we do something. In other words, we could have, for example, another meeting with something like this but on a different topic some months from now. We could have smaller working groups on some more concerted topic. We could have some other kind of project. So, in your comments, please make reference to the vehicle by which these good things might get done or at least discussed further.

ELLIS: I won't comment directly on that. I have a sense that there's some frustration about these kinds of meetings -- a certain feeling you get, less I think at this one than at some, that we cover the same ground often, go off our separate ways, and then come back a year or two later and do pretty much the same job again.

I think that if NSF would like to make a dent in that kind of impression, it might be wise to pick out something that is relatively small and doable and actually do it. For example, we were discussing earlier the matter of trying to see if we could settle some of these conflicting definitions. And I was sitting back here muttering that if we could pick off one, just one, and come up with some noticeable progress as a result of one of these sessions, I think that would make people's ears perk up.

In a sense I think it would be nice to get a lot more specific, and a lot less grandiose. But to be able to come back and say, okay, at the beginning of such and such we did accomplish so and so.

BROWN: Would this "hot button" term, under-employment, be a good subject for what you envision?

ELLIS: I don't really know. It seems to me that you have an opportunity to maybe pick up one of those and then a small but solid accomplishment I think would not hurt at all.

BROWN: That was actually a good comment. Let's continue a little bit in that vein with some specifics that fit into Dick's comment.

FECHTER: I have one and that is looking at the topic of filling the data gap. The format may be a workshop and the specific topic is immigration and international flow.

BROWN: Good. Another possible topic might be data dissemination by electronic means, so that we get things out faster and more uniformly in some way.

SHETTLE: Which is very consistent with the COSEPUP type of recommendation.

BROWN: Or another one along those lines would be faster data, although that might be SRS talking to itself.

ELLIS: Just as a matter of clarification, I guess what I had in mind is that the appeal of the coordinated question-asking and definition-making is that it is something that would involve an agency actually encouraging and then bringing about something that requires cooperation among the group, as opposed to an initiative that one of us could undertake and report some accomplishment on.

That is something that any of us, and the Foundation, can always do. What we don't see so often is one of these things that takes joint action, as a suggestion to the group. That is why I think that if you could take a position of helping the group deal with one of these shared issues, that is the kind of thing I had in mind.

HOWERY: One thought I had about the societies' surveys is that I had no feeling for the quality of the surveys. Maybe at some point, maybe not the next meeting, given all the effort that is being put in and the effort that NSF has put in here, we could just try to get a feel for the quality of the surveys attained by different societies.

Possibly NSF could get a sample of the surveys and how they've been done, analyze it internally or have somebody else do it, and possibly recommend some better ways to do it.

FECHTER: A guide to various data bases that exist, is that what you're talking about?

HOWERY: Yes.

MAISEL: Along the lines of a workshop on dissemination, I am wondering if I am not going to want to attend a workshop on how to really effectively milk everything out of SESTAT. What I mean by that is -- is it really going to be so simple to use, is the interface going to be so incredible and the available documentation so context sensitive, that it keeps you from going off and making terrible misinterpretations of the data, which is very easy to do?

I don't know whether the system is going to be that way two years from now. I have the sense that the kind of thing you are describing could be tremendously valuable in time, but it doesn't happen the first time you sit down. It may not happen in the first two weeks that you work with it.

MAISEL: I just want to repeat your request, because I think your suggestion for a workshop is an excellent one and it has very limited objectives. It is probably a good place to start. Let's just see if we can get a definition of a post-doc among us that we can all agree on and say, this is what we mean when we use the expression post-doc.

SHETTLE: That's a major challenge.

MAISEL: Yes, but that will indicate--

FECHTER: If we do that, you are going to be guilty of doing what you accused the statisticians of doing before -- making the statisticians the major definers of the fields. Every field has its own unique culture that constitutes it and affects the nature of the post-doc.

MAISEL: At least we will have a sheet of paper that says the definition of post-doctorate for these people is this, and the definition of post-doctorate for those people is that, and we will have much less trouble making sense of the data that you have to cope with. It will be objective.

And I want to point out that when it comes to who defines things, the statisticians are probably best able to define the post-doc and least able to define under-employment. That is a real tough one; we should stay away from that one.

MAXWELL: The data committee that I described, which oversees our survey, could not agree among themselves, in repeated discussions of this issue, on writing out a one-sentence definition of what a post-doc is in mathematics. So what we decided to do was to try to identify a set of characteristics of this first employment, this first job, that you could look at and say, sometimes, that's right, it looks like or smells like a post-doc because it has these characteristics.

The individual looking at it could perhaps, you know, make some refinement, or refine statements about the nature of these jobs.

NEUSCHATZ: Sort of like designer definitions.

MAXWELL: Designer definitions.

GOLLADAY: Each of the two surveys I mentioned earlier collects information on post-doctorates. The first one goes to the academic institution and asks them, and every time we get together (and we did just last week) we talk about what the post-doc is going to be. We can say anything we want; however, what they decide to report is a characteristic of their institutional culture, since they use employment files and so forth.

And we all feel a little better at having chewed on this topic every time, because we are working on it, but we get what we get. And I think we see that there's quite a variation.

I like the notion that was mentioned: find out what specific measurable things you really want to know, e.g., the type of contract, affiliation with the department, whether it is teaching or non-teaching, and so on. Then, if somebody calls it a research associate or a project associate or a post-doc or a fellow or an assistant professor because of various employment considerations, it doesn't really matter; you have at least got the characteristics.

The other survey is the Survey of Earned Doctorates. There you are going to the person, the individual. Depending on how they feel when they are spending their 20 minutes filling out that form, they might report a post-doctoral fellowship. What they may really want is a job, some money, and a place to light, and yes, they may say they have a post-doc there. It may not be the national fellowship that they applied for and hoped to get.

So there are problems, but it has been proposed that we go out and look at actual institutions where some of these folks -- or a lot of them -- are, and see how they are hired and what the situation is.

HARDY: With the '93 surveys we had a lot of baseline data to gather, but with '95 we had a little extra room on the questionnaire, and what we decided to do was what we call the post-doc module. We basically took the designer approach, in that we don't expect the people in this room and the outside data users to agree on a single definition of post-doc.

We tried to have a wide question that would route these people into a series of other questions, from which you can choose. If I don't like number three being called a post-doc, fine, he is out. If I like number two being called a post-doc, fine, he's in.

And having heard numerous people describe committee meetings held to try to come up with a single definition, I really do think that is telling us something, isn't it? The critter does not want to be tagged in a little box. There are numerous critters. They look a little bit alike. They happen to go by the name post-doc.

Now, what is in your mind, and your mind, and your mind is slightly different, and perhaps in deference to our data users we should allow some designer definitions. As long as we recognize there are multiple definitions, you can say up front: this is what my definition is.

The same thing is going to work with under-employment. You are not going to be able to force that critter into one little box. You probably need to say: this is my definition and this is why I define it that way. I think that is pretty fair and probably pretty viable from a data point of view, because there are some critters that will not be corralled, and I think those are two of them.

LANIER: I am sorry that Carla Howery is not with us right now. John Sevagas and I worked with the American Sociological Association as they designed their questionnaire. I think many of us have worked with a lot of you. It's one of the things that I think we have a pretty good handle on -- what SRS data looks like, what Federal data looks like in general, NCES, all the other data bases. We are usually more than willing to help.

Carla mentioned at the beginning that this was their first real attempt at collecting data. We were able to help them design their questionnaire so that their taxonomies fit with national data, and they could see places where they no longer had to collect data themselves. We could also direct them toward other associations and other Federal agencies.

VOYTUK: I guess, as opposed to post-docs, I think the more burning issue is employment and under-employment -- maybe more attention should be paid to that.

And maybe it's not just unemployment and under-employment, but whether an individual is progressing in a career that he or she thinks is appropriate. How to structure questions and how to get that information is, I think, something that should be investigated.

BROWN: I think that is a good idea and a key point. One of the points you are implying is that this should not be a meeting to try to agree on the definition of something, but rather a meeting to say that under-employment is an important issue, as indicated in the COSEPUP report. Instead of trying to define it, let's see what our data have to say about this particular concept in the various rich ways that all of these data fit together.

VOYTUK: That's right.

BROWN: All right, I like that approach.

FECHTER: I would say the same thing is true for post-docs. It may well be that rather than having a meeting to define post-docs--

BROWN: The subject is not the definition of post-docs; it is understanding the issues and data about post-docs. Are they really increasing and, if so, why?

FECHTER: Well, also maybe to deepen our understanding of the differences that exist across the statistics so that when we see the existing data sets we have a better sense of their strengths and limitations.

I think of these meetings, in part, as serving an educational function. We accept the fact that, when it comes to post-docs, we can't define the term, but we know it when we see it. Accepting that, we can try to deepen our understanding, as data collectors and data users, of the various existing estimates.

And perhaps, on the basis of that understanding, make adjustments to the measures themselves.

CZUJKO: I would rather we didn't give up on the idea of trying to define post-doc -- as long as we give thought to the possibility of operationally defining it as a combination of answers to four different questions. So it is not a single yes/no question. Rather, for these three or four different kinds of combinations, we can call it a post-doc, or call it under-employment.

And I think it is doable, if you don't try to do it in one question.

FECHTER: I don't think you can necessarily do it across fields; that is what I think.

CZUJKO: I'm not sure. I think we ought to give it a shot.
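
To illustrate the kind of operational definition being discussed here, a minimal sketch in Python follows. The survey items, the five-year threshold, and the two candidate definitions are illustrative assumptions only; they are not actual SESTAT or society questionnaire items.

    # A minimal sketch of the "combination of answers" approach:
    # classify respondents from several items rather than one yes/no question.
    # Item names, threshold, and definitions are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Respondent:
        temporary_appointment: bool    # appointment is time-limited
        primarily_research: bool       # duties are mainly research or training
        years_since_phd: float         # time elapsed since the doctorate
        has_faculty_mentor: bool       # works under a senior investigator

    def is_postdoc_strict(r: Respondent) -> bool:
        # One "designer definition": all four characteristics required.
        return (r.temporary_appointment and r.primarily_research
                and r.years_since_phd <= 5 and r.has_faculty_mentor)

    def is_postdoc_broad(r: Respondent) -> bool:
        # A looser definition another data user might prefer.
        return r.temporary_appointment and r.primarily_research

    respondents = [Respondent(True, True, 2.0, True),
                   Respondent(True, True, 7.0, False)]
    print([is_postdoc_strict(r) for r in respondents])  # [True, False]
    print([is_postdoc_broad(r) for r in respondents])   # [True, True]

The point of the sketch is the design choice rather than the particular items: each data user can state up front which combination was applied, and the same respondents can be tallied under more than one definition.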

KRUYTBOSCH: The end of this discussion suggests to me that in the future we might think of actually holding combined definitional and research seminars. In other words, once our data are up and running, an excellent topic for a meeting would be to look at what the data show about post-docs in the different fields. The data can then be criticized, and maybe improved for the next time around, and the same could be done for under-employment.

Okay. Let's look at the data that we have. Somebody will be writing papers and analyses of these phenomena. Anyway, let's think about doing that in the future -- maybe not at the next meeting, but as we get the data analysis running.

BROWN: Start out, let's say, with an SRS presentation on what we have and then feed into it--

KRUYTBOSCH: Yes.

BROWN: --additional data on the same subject from the associations.

KRUYTBOSCH: Or maybe one of the people in the professional associations will want to analyze the SRS data on a particular topic and then run it by the other fields as well. I mean, we don't have to do it all.

SHETTLE: We're essentially out of time. I want to make sure to thank the planning committee, who have been working very hard -- especially Tanya, who has worked especially hard -- and Melissa, who got this started and then went off on a detail.

And I want to thank everyone who came today. Personally, I think it has been a very lively discussion. One thing that I may well do, given that we had so many good ideas, is to get back to you and say: here are some options that we think are viable, and maybe you can give us additional written input on what you would like to do next.

BROWN: That's a good idea. Thank you very much for coming; we enjoyed it.
