SRS Presentation

Afternoon Session

Alan Tupek, Deputy Division Director
Mary Golladay, Program Director, Education Program
Carlos Kruytbosch, Program Director, Personnel Program
Mark Regets, Personnel Program

Alan Tupek, Deputy Division Director

This morning I was certainly impressed with the extensive data collection efforts that the societies have underway -- an enormous amount of data collection is going on to understand the scientific and engineering labor force.

We'd like to spend the next hour talking about the data collection efforts that the National Science Foundation has underway. I am sure all of you are somewhat familiar with us, but we wanted to make sure you understand the depth and breadth of our data collection efforts. And we also want to talk about some of the new data that are just now becoming available that we haven't had for almost a decade.

Basically, the Division of Science Resources Studies produces statistical reports on the U.S. science and engineering enterprise. We do that by sponsoring surveys that are conducted by other organizations, such as the NRC and the Census Bureau and some private organizations.

We consider ourselves a Federal statistical agency, and we are very active in participating with the other Federal statistical agencies, so that we have a pretty good idea of what's going on in the other agencies that have data related to our own. Many of us worked in other statistical agencies before coming here, so we are a good contact source for information.

We sponsor about 14 surveys, either on an annual or a biennial basis. We produce about 35 statistical reports a year. Sometimes they're a two-page data brief and sometimes a fairly comprehensive report like our Science and Engineering Indicators report.

The types of data we collect include R&D funding data, especially from the Federal Government, and R&D expenditures by the academic sector and the private sector. We also collect related data on the research facilities and instruments for research at universities.

I am not going to talk too much about the education and the workforce data because the speakers that follow me will go into a lot more depth on that.

We also have a biennial survey on public attitudes toward science, engineering, and technology issues. It's disseminated through our Science and Engineering Indicators report.

One thing we have been working very hard on is electronic data dissemination. We have been putting a lot of our information on the Web and we have even more information on Gopher. It's an easy way of getting it. Of course, if you are used to using the Web, you don't even have to download it. You can use that data interactively. We would be happy to talk to your computer people and guide them through setting up your browsers so that you could use the data we have interactively on the World Wide Web. The Gopher information is a good source for downloading our current data.

On the Web, the full address is xxx. Alternately, you can go to the NSF home page (xxx) and look for information about U.S. science and engineering data.

This is probably hard to read in the back, but this is the information that's currently on the Web. We have the Characteristics of Doctoral S&E's and S&E Degrees. We have the Asian report that was mentioned earlier, and our National Patterns of R&D. We even have our old pocket data book available on the Web.

As I mentioned, on Gopher we have even more. In some cases the data comes up electronically before it's produced in paper form, so it's a good thing to be aware of.

This may not be the survey you're most interested in, but just to give you an idea of what you can get to (see xx): In addition to having the data there, we have the questionnaires, the methodology for the surveys, the imputation procedures, the limitations of the data -- everything. This includes data from maybe four or five printed publications that we have put together and linked.

We would like to begin linking our data to other data sources, including data from other government agencies and the data that you have. So I think that the potential for cross-referencing each other is enormous, and hopefully we can talk about that a little later.

For our surveys of academic institutions, which includes surveys done by the National Center for Education Statistics, we have a system called CASPAR (Computer-Aided Science Policy Analysis and Research), which basically provides a lot of information from a number of different surveys about individual academic institutions. It is a great tool for research.

That's sort of an overview. I think what is most important is that we talk about the data that relate to the scientific and engineering workforce. To begin that discussion, Mary Golladay, who is the program director for our Education Program within the Division of Science Resources Studies, is going to talk about her program.

Mary Golladay, Program Director, Education Program

I would like to add my thanks to all of you who are here today for a very interesting and informative a.m. session. I thought it stayed lively, and I certainly learned a lot. I thought I knew a lot about association data collection, but I, nonetheless, picked up a lot of information.

Based on some of the things that I found informative, I would like to tell you just a couple of things very briefly about what our program does. We are responsible within SRS for monitoring education, primarily participants but also some of the aspects of infrastructure that Al has already mentioned--the research facilities and instrumentation.

Now, to do that, to keep track of and report on education, we are into data acquisition, everything from actually directing and conducting our own surveys to acquiring data from other Federal agencies or associations, private data sources, and putting it all together and trying to use it in ways that are most informative for those making decisions.

I want to talk just briefly about two of the surveys we do, because I think they relate to some of the topics talked about here today and are closest to the other surveys that have been mentioned. And we'd like to propose, as a suggestion, an exchange -- not of data, exactly, though that is very useful too, but of actual instruments, the questionnaires themselves. There were some I heard about today that I hadn't heard about before. I will see to it that everybody here receives copies of the actual instruments for the two surveys I am going to talk about, and I would like to ask, if somebody at SRS does not yet have a copy, whether we could have a copy of your surveys. I think it would be very useful, and we will take on the responsibility within the division of assembling and cataloguing them and having them readily available.

It's one of those things we have done in an ad hoc sense with many of you over the years, and it has proven useful. Hopefully we can make that a little bit more active.

The two surveys I want to talk about are our survey of graduate students and then the survey of earned doctorates. What I'm going to do is just introduce them rather than give you the detailed information.

The survey of graduate students goes to every institution in the country that offers graduate programs, and it is a survey at the level of the academic department. When I say every graduate-level department in the country, that means it goes to over 600 institutions -- approximately 10,900 departments. And we do try to keep track of them all.

We have a listing of all those departments. We know exactly what they're called. We've heard different names tossed around today, and here they all are, all 11,000 of them. This is a fascinating list, really, and folks in the Foundation like to look at it. I will be coming back to that whole point about the nomenclature and the taxonomies.

We are very proud of the survey. It has an exceedingly high response rate. Last year it was like 99.9 percent. There was one institution that would not respond or could not respond.

I guess we have closed out and we are a little down from that this year, but in this day and age we are exceedingly proud of the spirit of cooperation we have with the institutions, which is represented by the rate of return on that survey. So it's virtually a complete survey of academic departments and keeps track of numbers of students, some general demographics -- race, ethnic, and gender information -- and then sources of support, which is how the survey got started in the first place, incidentally, trying to look at how students finance their graduate education.

That information is used in a wide variety of ways, and as I say, I'll see to it that each of you has a copy of the actual survey questionnaire.

The other survey I want to mention is a survey of earned doctorates. This survey is jointly funded by the National Science Foundation and four other Federal agencies. It is conducted by the National Research Council. And I am extremely proud to be affiliated with that survey. I go back here 10 years, and there are others around the table who go back twice that and more in their relationship to this particular survey -- Peter Syverson and Alan Fechter most notably. And we are also very privileged to have here the most recent additions to that survey team: Rob Simmons from the National Research Council, and, on our staff, Susan Hill, who directs the project for the Federal agencies.

I have to take just a couple of minutes as a commercial to say that I think this survey is truly a model of cooperative activity amongst Federal agencies, the Federal Government and institutions, the Federal Government and associations, which bless the effort, and the people at the institutions who really work very hard to see to it that we have a 95 percent response rate on the individuals filling out the form at the institution where they receive their doctorate.

The remaining 5 percent is pretty much filled in through public records, so it ends up being a 99-point-something or other file of all persons receiving Ph.D. degrees from U.S. colleges and universities. So it is a wonderfully complete data base.

A couple things about it. The information is supplied by the individual as opposed to the department, and that leads me to a couple of points that I want to raise. The first one has to do with taxonomies. We talked about overlap and how the nomenclature changes. That is another problem. The world doesn't stay still. Fields come into being and go out of being, and so time series don't mean you just always do the same thing -- because the world doesn't do the same thing. Keeping track of all those shifts is one of the maddening and exciting technical problems.

Another interesting aspect of this survey is that certainly in science and engineering fields, the fields are the most differentiated in terms of our reporting of the actual disciplines at the point of doctorate. I just did a rough count. Our list of degree fields has about 280 alternatives, i.e., the newly minted Ph.D.s choose where they are on that 280-item list. Interestingly, that doesn't always agree with the department of their affiliation or the department giving them the degree. Hence, that's one of the reasons why numbers aren't always going to exactly mesh. But it's interesting because the person is categorizing themselves, we believe, as to how they view themselves as they make that transition from academia.

So keeping track of the taxonomies across all fields is one of those areas of technical work. It's kind of grubby, but it requires a lot of attention, is something we take seriously, and is important.

Let me just share a couple facts with you also about the timeliness of data issue, because that is of concern to all of us.

Generally, it is likely to take, in overall survey processes, about a year from the "as of" date until data are completely gathered, completely checked, and distributed. That seems very difficult to accept, but in the case of the graduate student survey, for example, we want a fall snapshot. Surveys are mailed out in the fall. Institutions often close their books in, let's say, November -- every institution has a particular date on which they try to say, okay, this is going to lock in our fall statistics.

We ask for the information in January, and it often takes that long for them to get it together. Some institutions don't respond then; there are some state systems that won't respond until March or April -- and it has to do with their own reporting. We try as quickly as possible to do edit checks, and final verification, but with 11,000 departments it is a big job. It is pretty well automated, but it does take time. We try to close out that survey in June or July and then we have the task of getting the information out. So that means that we really try to get data back out to people about a year from the "as of" date.

Now, we can, with a lot of special effort, shave that a little bit, but there are limits. Likewise, in the survey of earned doctorates, the close-out is, I believe, in June, which is the cutoff for the year. We get the survey forms and code and enter them. It's a big job for those 38,000 Ph.D. forms.

FECHTER: This is an old saw, and I'm going to put it on the table again. That is, has SRS considered coming out with preliminary estimates based on the partial sample to get stuff into the hands of people earlier?

GOLLADAY: We have not recently explored what the cost or the technical effects of that would be. Actually, in the graduate student survey, we do have something we call a match sample, which is run for us in February. We look at a certain set of departments from the previous year and from the current year to see how it is going -- are we up or down this year, and what does the field shift look like. We have not heretofore shared that. It is certainly a possibility that we could look at again.

FECHTER: I guess the question is -- I know the Commerce Department used to come out with preliminary estimates of gross domestic product, for example, which ultimately get revised as more data come in. The real question here is whether, in fact, the preliminary estimates would be stable enough ultimately to be reliable.

TUPEK: The Commerce Department estimates are not really based on full survey results. They're based on partial information from some of the surveys, but not from all the surveys that the data come from. You may also recall that they used to have a flash GNP estimate--

FECHTER: Right. They got rid of that.

TUPEK: It was so unreliable.

FECHTER: I guess the question is whether you have given any kind of effort to assessing the reliability of preliminary estimates based on survey information.

TUPEK: We could do that, although I think that is pretty--I would say it's unprecedented. I don't mind being in a leadership position, but--

FECHTER: I thought that's what NSF was all about.

MITCHELL: If I could say something else, I think the direction of the real solution might be in changing methods of data collection and moving towards more electronic collection via the Internet. That will make transmission of data so much quicker. I think we can solve that problem eventually through that mechanism.

FECHTER: I think they're complementary efforts. I don't think that one necessarily precludes the other.

TUPEK: That's true.

MAISEL: Early in your talk, you spoke of the objective of these surveys and studies to be to help those making decisions. How do you get feedback? How do you know you're helping those making decisions?

GOLLADAY: We don't have an organized system in place for getting feedback. We know who uses the data and how it's used, and generally it comes back through sources. The data on graduate students are used extensively within the Foundation by the subject areas. They even are interested in a lot of the detailed information on departments.

There are a lot of people who use some of the national trends. Our Women, Minorities and Persons with Disabilities report (which is now at the printer) presents data that are used to track the progress of particular groups and their representation in types of institutions, in certain discipline areas, and in programs. Some of those uses -- their use in the debates about whether we need to do more of this, less of that, or shift resources -- are known to all of us.

We have been looking at ways of trying to pin down uses a little more specifically. I'm not sure we've come up with a good system. A lot of people like the information because it is nice to know or it's interesting. That's one kind of use or benefit; whereas, another much more important one may be that somebody may look at survey results and say we've got a real problem here with those who are in the recent Ph.D. category and there is some need for some policy intervention here. So the uses vary, everything from just the kind of casual person taking it in--

MAISEL: Can I ask that question in a different way?

GOLLADAY: Okay.

MAISEL: Do you routinely prepare executive summaries that are brief and direct and do you have a special distribution list for those?

GOLLADAY: We certainly know who--

BROWN: The answer to the first way you asked the question was yes; we have had a customer service task force in the division for more than a year. We took one kind of survey, which was an interview of a great many people, and we are now working on a more formal mail-type survey.

In answer to your second question, we have our series of data briefs which announce the availability and summarize some of the data from every survey when it comes forth. Those are available--those go to a mailing list, but they also are put onto the World Wide Web that Al described, so they get tremendously wide dissemination as well.

KOHOUT: I think several years ago you had a rip-out sheet where you could respond as to the usefulness of the data. I sent those back in because I get the publications. If you use them, that is one way of seeing whether your data are being used.

SPAR: Have you looked into the idea of supplying just the raw records with some sort of software that you could tabulate yourself? That has some nice implications. One, it saves money. Two, it speeds up the process. Three, it provides much more flexibility.

TUPEK: We do have some micro data system data bases.

SPAR: I'm talking about a micro data system.

TUPEK: The CD-ROM I mentioned is definitely micro data -- data at the level of institutions or departments, even.

GOLLADAY: Yes.

TUPEK: The earned doctorate data are available as micro data, but only through a licensing arrangement, because we cannot otherwise protect the confidentiality of the data. No matter how much we strip off that file, you can figure out who everybody is, and that's a problem. But we do have a license arrangement under which people can use that data.

Carlos Kruytbosch, Program Director, Personnel Program

In terms of the question about the executive summaries, one of the principal vehicles for displaying the treasures of SRS is the Science Indicators report. Since it is biennial, a lot of the data are not the latest, but it appears the month before the hearings on the budget and is delivered to the Congress. We always receive requests for 20 to 50 tables from OSTP as they're preparing the administration's presentation for the science and technology budget. So this is one of the main places where we display our data. These are indicators, so they are summaries of very complex surveys. If you show Congress some of these tables, they will go to sleep. You have to have a nice punchy summary with only two bars on the chart.

FECHTER: One bar.

KRUYTBOSCH: With Indicators, there's an art to the presentation, and that's the principal way in which the data get presented.

TUPEK: Carlos Kruytbosch is our program director for our personnel data, including all the data on the scientific and engineering workforce.

KRUYTBOSCH: You should all have a copy of this little flyer on the SESTAT data system and also a copy of a handout on the SESTAT system. One gives summary information; the other spells it out a little more.

The personnel group in SRS is responsible for collecting the data on science and engineering personnel in the U.S. It is composed primarily of five data collection efforts, which are listed on the back of this little brochure. Three of them are major surveys that you're probably familiar with: the doctorate survey, the survey of recent college graduates, and the national survey of college graduates. These three surveys are conducted on separate populations with separate sampling techniques, yet in the last survey year, 1993, the questionnaires were coordinated across the surveys. Prior to the surveys, focus groups were done, along with analyses of pretests of the questions.

The content of the questionnaires was coordinated and eventually provided the basis for consolidating the survey data on all levels of degrees, so we can get a composite picture of the science and engineering work force in the United States. I have a slide here (see Appendix A, p. 11.3) which I will show you in a minute.

I just want to remind you that for one of the component surveys (the recent college graduate survey), which is a survey of approximately 1 in 40 bachelor's and master's degree graduates in the U.S. in science and engineering disciplines, the names and addresses are gotten from the colleges and universities, and we survey the individuals. That's why I was interested in how you managed to get your names and addresses from colleges and universities because there are some Federal regulations about that. The second survey is a survey of doctorate recipients, which is about a 1 in 11 sample of all recipients from U.S. doctoral-granting institutions with degrees in science and engineering.

These surveys have been going on for a long time -- the doctorate survey since 1973, and I'm not sure about the recent college graduates survey, but it also goes back quite a long time. That's something I wanted to add about the graduate student survey, Mary: you have a wonderful historical survey, and that makes it very valuable. We currently have some problems because, with all the improvements we have been making in response rates and the consistency of questions, we have a break in the time series. We are working hard to develop bridges from the old series.

In other words, if you look at unemployment rates, the data that we have now reflect relatively small but nevertheless real increases in unemployment of Ph.D.s as compared to the data for the 1970s. However, to what degree the data are contaminated by changes in the response rates, the type of sampling, and so forth is not known. While these are relatively minor problems, they're irritating if you want a nice, clean time series.

So, two of the surveys have been going on biennially for two decades. The third one, the big one, the most expensive one, is the National Survey of College Graduates, and that has been conducted also historically. It started in 1962. The basis for these data is the Decennial Census. We go to the Census Bureau and they give us a sample of everyone who said on the day of the Census that they had a bachelor's degree.

Now, for some reason or other, they don't have a question in the census form about the field of the degree. It's just whether you have a degree or not. So that causes a lot of pain and worry, and we have to go find out what fields these degrees are.

SPAR: They don't, simply because there is no Federal legislation--

KRUYTBOSCH: Of course, of course. That's just a joke. If they put it in, it would save us a lot of money.

SPAR: And cause them a lot of grief.

KRUYTBOSCH: Yes, that's right. It would. On the whole, it would cost the American public a heck of a lot more to fill in to answer it than we would save by it.

So we get a sample of about 1 in 150 of all people who have a bachelor's degree in the United States. It doesn't matter where they got the bachelor's degree from. And so we send out 220,000 or so of this sample, and we go after them with, first of all, a mail questionnaire and then a second mail questionnaire, and then CATI (Computer Assisted Telephone Interviewing). After that, we sic the census field representatives on them for personal interviews.

All that effort resulted this last time around in a response rate of around 80 percent. We had hoped for more, but that was more or less acceptable for us.

So here we have this wonderful approximately 180,000 returned questionnaires from people with bachelor's degrees in the United States. So what do we do? We throw half of them away. The people who don't have degrees in science and engineering are just sitting there. If anybody wants to analyze this data, please, we'd love it. But we don't have the money to do that, and we aren't going to be following up those people in later years, the non-science and engineering people. But that data is there, and we have made some effort--not a huge amount of effort--to try to interest people in working with these data, but it just seems to me amazing that here you have a sample of, in a sense, the elite in the United States or about to be elite or whatever, and people are not interested in looking at it.
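The back-of-the-envelope figures in this passage hang together, which can be checked with a few lines of arithmetic (a sketch using the approximate numbers quoted in the transcript, not official NSF or Census counts):

```python
# Rough check of the NSCG sample arithmetic quoted above
# (approximate transcript figures, not official statistics).

sampling_fraction = 1 / 150        # ~1 in 150 bachelor's degree holders
mailed = 220_000                   # questionnaires sent out
response_rate = 0.80               # after mail, CATI, and field interviews

implied_population = mailed / sampling_fraction  # implied degree holders
returned = mailed * response_rate                # "approximately 180,000"
retained_for_se = returned / 2                   # roughly half kept for S&E

print(f"implied degree-holder population: ~{implied_population:,.0f}")
print(f"returned questionnaires: ~{returned:,.0f}")
print(f"retained S&E cases: ~{retained_for_se:,.0f}")
```

The 176,000 this yields is consistent with the "approximately 180,000 returned questionnaires" Kruytbosch mentions.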

Anyway, we extract the scientists and engineers out of that, and this gives some sense of what you'll see (see Appendix A, p. 11.3). The green represents the National Survey of College Graduates (NSCG), which was based on everybody who had a bachelor's degree in April 1990 and answered the questionnaire about their activities in April 1993. As Mary was saying, it takes a while. We didn't actually close out that survey until March of 1994, after we had gone through all these phases. They were still doing personal interviews, I think.

There were people who had gotten bachelor's degrees between 1990 and June 1992 (the cutoff date). That's where the National Survey of Recent College Graduates (RCG) done in 1993 comes in. These recent graduates can be added in order to present a complete picture of those who had bachelor's degrees in 1993.

We re-survey the respondents to both the NSCG and the RCG surveys in 1995 in order to get a continuing picture of what's going on.

One of the most interesting things about the NSCG is that we have here a group of people that we haven't surveyed in ten years -- that's the portion of the U.S. science and engineering workforce with Ph.D.s that were gotten from non-U.S. institutions. While many of these are immigrants, they're not all immigrants. Some of these are Americans who took degrees in Heidelberg and what have you. So we're very much looking forward to analyzing this group, and we think it's between 5 and 10 percent of the Ph.D. workforce. It's an enormously high number, higher than you would think.

As I say, these data are being examined, though they are not completely cleaned yet. But we haven't had a chance to get into some of the most juicy analytical things yet.

I want to say one other thing. On the middle two pages of your blue handout (see Appendix A, p. 11.4) you'll see the content, the items of content in the questionnaire. So, any analysis that you wish to do on demographic characteristics, labor force characteristics, activities, etc. can be designed and can be run. I was wondering whether or not our categories, our disciplinary categories and our occupational categories, would mesh with the concerns that you have with your constituencies and your membership groups. By and large, at least for the large associations, it would seem to me that they would mesh rather well.

There are some categories that we don't have at all, like neuroscience we don't have. Microbiology we have. Everybody with a degree in microbiology we have. I'm not sure whether we have people who are working as microbiologists. I forget whether we have that in the occupational classification.

One thing that hasn't really been mentioned here is what we call the fungibility issue: people who have training in a given discipline end up working in something rather different. Sometimes this is not by choice, which is one of the issues we talk about as under-employment. Often it is by choice -- in fact, you make a heck of a lot more money as a Ph.D. physicist in the stock market, if you're lucky and you're smart, than you would as a professor of physics.

So one of the interesting things that has never really been looked at in our science and engineering workforce is the contribution of science and engineering training to various sectors of the economy. Again, this is something that is very exciting, and we're greatly looking forward to exploring some of these patterns.

I think that Mark is going to give you a little titillation on this issue when he comes up.

TUPEK: Mark Regets was managing the National Survey of College Graduates and is now devoting his time to analyzing some of the data that's coming out of the surveys. I'll leave it up to Mark to continue.

Mark Regets, Personnel Program

REGETS: Eventually, all three major demographic surveys are hopefully going to be integrated, and we may even have a public use file of some sort for you. And we certainly hope you will be able to dial into the Internet and do some of your own tabs.

What I'm going to do today is give you an idea of some of the things that can be done just looking at the files separately, mostly working from the SDR because of the interest of this session in recent Ph.D.s.

Here are the unemployment rates (see xx.x): 1.6 percent for all S&E, and a little bit higher for the maroon bars, which are for Ph.D. graduates between 1988 and 1992. But what you do see here are differences by field; in particular, geoscience and physics jump out, with the unemployment rate rising to 4 percent for recent graduates in physics.

Now, this was in April 1993 as a reference date. Just as a comparison, for the full population the unemployment rate was 7.0 percent.

We can also look at under-employment, or at least some measures of it. Under-employment here is defined as a person working part-time because a full-time job is not available or working outside of his Ph.D. field because a job in his field was not available.
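The definition just stated can be written out as a small classification rule. This is only an illustrative sketch -- the argument names are hypothetical and do not correspond to actual SDR variable names:

```python
def is_underemployed(part_time: bool, full_time_unavailable: bool,
                     out_of_field: bool, field_job_unavailable: bool) -> bool:
    """Under-employed means: working part-time because a full-time job
    was not available, OR working outside the Ph.D. field because a job
    in the field was not available. Part-time or out-of-field work for
    other reasons (preference, family, better pay) does not count."""
    return (part_time and full_time_unavailable) or \
           (out_of_field and field_job_unavailable)

# A voluntary career changer is not counted as under-employed:
print(is_underemployed(part_time=False, full_time_unavailable=False,
                       out_of_field=True, field_job_unavailable=False))  # False
```

The key feature of the measure, as the discussion below makes clear, is that only involuntary part-time or out-of-field work counts.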

Here you actually see a slightly lower under-employment rate for the more recent Ph.D.s than for the group as a whole. There is no particular way to tell why that is so. My guess is that it's actually easier to find a job in your field if you're inexperienced than if you've proven you can do it.

CZUJKO: Actually, if I may, just look at part-time employment and what you find is that there's a very large group over the age of 50.

REGETS: That's driving that. Okay.

CZUJKO: Or part-time and would rather not be.

SPAR: Have you done this as a function of age?

REGETS: Well, right here you're seeing it broken down with recent versus all Ph.D.s. There's a lot of different ways I can slice it. This chart is about a day old.

SPAR: How are you defining fields of specialization? By original degree?

REGETS: By original degree.

SPAR: That could be wholly misleading because many people's actual field of work, once they've been out in practice for a while, they don't do what their original--

REGETS: Absolutely. I agree. We do some other--

SPAR: So you're assuming that they're under-employed, where, in fact, they may not be.

REGETS: No, no. The under-employment would be if they're outside of their field because a job in their field was not available. They have to check that box.

MITCHELL: Because they say so. May I make a clarification? If you're outside your field because of personal preference or family reasons, or because pay and promotion opportunities were better in another field, you're not counted as under-employed.

MAISEL: That's all reasonable. But it wasn't made clear initially.

SPAR: To what degree can you infer that the--how can I phrase this?--that new entrants, having learned newer technology, make it more difficult for the older engineer, if you would, who does not have those new technologies under his or her belt to be able to either find a job if they're out of work or to change jobs? That the employer wants to hire the young new engineer instead of the experienced engineer because that experience may not be doing them very much good?

Is there anything in this that you can back out of that gets you anywhere near that?

REGETS: Offhand, no. That's a fascinating research topic.

FLOOR QUESTION: Actually, the graph doesn't show that, because physics is a field where, you know, the new technology does not impact as much, and in engineering it could. And you can see it's almost the same--

REGETS: Yes.

FLOOR QUESTION: So I don't think that's an overwhelming factor.

FECHTER: Is 0.4 of a percentage point difference statistically significant?

REGETS: In this case, I think it would be.

FECHTER: Is it?

REGETS: Given the sample sizes.

HARDY: Mark, let me just add a little something. In the 1995 survey we have what we call the work history module. What we try to have is somebody go through their work history, starting with their Ph.D., and block out what they've done in large blocks, like I taught for ten years. It also includes an absence from the labor market and why. So if you were out of work for a long period of time, let's say a year, we also would know the age at which that happened. So we would know during a latter part of your career you were out of the labor force for two years because a job wasn't available in your engineering specialty. So that's the future data. That's the 1995 survey.

FLOOR QUESTION: A job in my field is not available, but my field, in my view, is limited to Southern California. I don't want to move to Chicago.

REGETS: Frankly, it's how the person checks the box.

FLOOR QUESTION: So it could be a geographic preference as well as--

REGETS: Yes. Obviously, some people will check a job in my field wasn't available when there clearly was a community college job teaching in his subject and he didn't want it. There's a lot of factors that could cause a person to check that box. It doesn't mean that there absolutely wasn't a job in the field, just that the person felt there wasn't, given what he was looking for.

ELLIS: I have to ask the question. Is there a standard definition of under-utilized?

REGETS: No. That has come up before. And we're probably going to offer a few different definitions, in fact, but this was the one that we have published in one of our data briefs.

ELLIS: One other question. The Foundation's strong suit has always been doctoral data. How close are you to being able to produce similar statistics for people at the bachelor's level? Because of the population you're supposed to serve, the biggest single chunk is B.S. level engineers.

REGETS: As the director of a survey aimed at bachelor's and master's degrees, I certainly can agree with that. But I'd say, oh, preliminary, end of next week.

We've had a lot of our data in-house for maybe about a month-and-a-half to two months, so there are a lot of things we're trying to do with it.

Let me move on. We asked people about what their occupation was in 1988, as well as in 1993, and we also asked them how closely they felt their job in 1993 was to their Ph.D. field. So we classified occupations for each Ph.D. field based upon how many people said in 1993 that that occupation was closely related. Then we've used that to categorize the 1988 occupation as well.

The reason we went through all that was so that we could try to see just using the 1993 survey, which doesn't have some of the sample selection problems of some of the earlier surveys in the SDR series, whether we could compare two different cohorts of new Ph.D.s. So we looked at 1983 to 1987 cohort in 1988, and we looked at the 1988 to 1992 cohort in 1993.

In general, rated by this closely-related measure, there really isn't much of a difference in the types of jobs the earlier cohort took as opposed to the later cohort.

That's not the only way we can compare the two cohorts, though, according to their occupations. We can also look to see what sector of the economy their occupations were in.

In this case, we're looking to see what proportion of the 1983-87 cohort was able to find jobs in education, or found a job in education, versus the 1988-92 cohort.

SYVERSON: Does this include K-12?

REGETS: Actually, it does include K-12.

SYVERSON: It does.

REGETS: Yes. The numbers are not large for K-12.

SYVERSON: So it begins with K?

REGETS: That's right.

In this case, you have a definite trend of fewer recent Ph.D.s finding jobs in education, and that's that way for most of the fields, with perhaps the interesting exceptions of math, physics, and the social sciences. Your guess is as good as mine why particularly in physics, which had the higher unemployment rates, that would be true.

Post-docs are in there. That's probably the answer.

Let me switch to the National Survey of College Graduates for another slide. As Carlos explained, the National Survey of College Graduates sent out 215,000 surveys to people with a bachelor's degree or higher. Part of what we got back were 10,000 observations on people with Ph.D.s. So unlike the SDR, we're able to make statements about everyone with a Ph.D. in the United States regardless of whether or not it came from a U.S. institution. So we can look at the percent of Ph.D.s in different fields that have Ph.D.s from a foreign school (see x.x) and the percentage of each field that are foreign-born.

There's a lot more analysis we can do for this. We haven't tried to exclude people by whether or not they were employed in science or engineering. This is strictly on the fields that they received their Ph.D.s in.

About 1 to 1.5 percentage points of each of the bars representing Ph.D.s with degrees from foreign institutions, by the way, are U.S. natives who received their Ph.D.s from foreign institutions.

NEUSCHATZ: Aren't a lot of the foreign-born people who actually got their Ph.D.s in this country?

REGETS: That's right.

NEUSCHATZ: That's the difference between--

REGETS: Yes. Most of the red bar are clearly foreign-born with degrees from the United States. It's only the blue that are degrees from foreign institutions.

This is showing over 10 percent of the Ph.D.s in life science in the United States having their degrees from foreign institutions, and the same for physical science. I'm not sure that this is something we've had data on before, and actually, I personally was surprised that you had higher percentages of degrees from foreign institutions in life sciences and physical sciences than among non-science and engineering. I pictured a lot more degrees in French or whatever.

SYVERSON: This doesn't tell us the citizenship of these--

REGETS: No.

SYVERSON: They could all be U.S. citizens.

REGETS: They could all be--

SYVERSON: By this point, if they were foreign-born, they could have been brought over as immigrants.

REGETS: Many of them are.

SYVERSON: That's important because there's enough xenophobia running around right now. It's important that many of these people could be immigrants or naturalized U.S. citizens.

REGETS: I think there's a lot of different ways that we can cut this data.

VOYTUK: Mark, I've heard statistics as well with regard to U.S. citizens who got their degrees in this country and are teaching or working in foreign countries.

REGETS: Those we would love to have more data on.

VOYTUK: That's my question, I guess. Are there any foreign data sources that--because I heard the comment that the numbers were roughly the same. In other words, that blue bar that you have there corresponding to France, for example, may be the same with regard to U.S. citizens with U.S. degrees working in France.

REGETS: Point me to a data source, and I'd love to analyze it.

SPAR: People in foundations working overseas with Ph.D.s, there's quite a number.

KRUYTBOSCH: I could say something about that. Jean Johnson, who I don't see but she was here earlier -- she could say it better herself. She looked at Germany, at France, England, and Japan at the Ph.D. degrees granted in those countries that were granted to non-nationals of those countries, and found the number to be somewhere around 30 to 40 percent for France and for England. In Japan, it was about 30 percent; it had gone up from 15 percent in one decade. In Germany, it was the lowest. It was around 8 percent or 10 percent.

What I found fascinating about that was to realize that we are not the only planet in the system, that, in fact, most of the major countries also host and provide an educational resource for the world. We're not different.

REGETS: I think in general the international flow of scientists and engineers is something that has not been well studied, but has always been important and is becoming more important.

Let me put what might be a less provocative slide up. This is to move on to the third of our major SESTAT demographic surveys, the National Survey of Recent College Graduates.

This is a survey of people that received degrees from U.S. schools, bachelor's and master's degrees in science and engineering, after the 1990 census. This fills in the gap of people that received their degrees after the Decennial Census and who, therefore, may not have been in the sample frame for the National Survey of College Graduates.

In April 1993, we looked at their status -- whether they were full-time students, employed in the field of their degree, employed in some other field or some other status.

REGETS: That shows 60 percent of people in engineering are employed and that's--

FECHTER: Now, have you taken out those who are--is that of all graduates or only of those who are employed? Is that 60 percent of all--

REGETS: All engineers.

FECHTER: All right, that explains it, then I am not surprised.

SPAR: Okay, so can we sum those two and say--

REGETS: Yes, you sum those two, so 60 percent of engineers are employed in science and engineering, another 15 percent are in school and 15 percent are in things that we classified as non-S & E jobs, although some jobs, like management consultants, sometimes can be pretty ambiguous.

ELLIS: What is the point in time for these data? 1992 to 1993?

REGETS: These are basically 1991 and 1992 graduates as of April 1993.

ELLIS: Okay, well then, a couple of things. First off, a lot of them were, in fact, unemployed because placement was extremely weak at that time, and it has since recovered some. I am curious: if somebody was working as a computer programmer, was that an S & E job? Because a lot of these people found that they could get work doing that when they couldn't do anything else.

REGETS: A computer programmer no, a computer scientist yes. And the division between those two occupations can be arbitrary depending on their response.

MAISEL: A computer programmer is no, even if all they have is a bachelor's degree?

REGETS: I believe that is right.

MAISEL: Wow.

HARDY: The jobs are counted. I mean, there is a job category and they check computer programmer. We consider their degree as S & E in our definition, but their occupation is not S & E, the reason being that most computer programmers -- the vast majority -- are below the bachelor's level.

ELLIS: That's possibly true in the data because, in the past, computer programmers might not have had that degree, but I think more and more you will find they do.

HARDY: Actually that is from 1994.

REGETS: I think this is showing that we have a lot of conversations that we should have with each other.

NEUSCHATZ: And you thought this was going to be non-controversial.

NEUSCHATZ: I have a question as to whether the residual that is not shown--we've already learned that it's not just people employed outside of S & E -- but it could also include the unemployed.

REGETS: Yes, that's true.

NEUSCHATZ: Could it also include other things like missing data or are there other categories hidden in that residual that aren't being--

REGETS: Not missing data. If we couldn't determine the person's employment status, we considered them out of scope in the survey.

NEUSCHATZ: So everyone in the residual that is not in one of those two bars is either employed out of S & E or unemployed?

REGETS: Right. And we have the chart around if you want to see the other.

CZUJKO: Are part-time students in the residual as well?

REGETS: Yes.

HOWERY: One of the things about a field that may have convinced students to major in it was that it would open up lots of possibilities. Is there a question regarding intent when a person majors in a field? For example, somebody will major in something like mathematics because that's a nice background to go into medicine, or go into the stock market or whatever afterwards.

REGETS: Oh, absolutely.

HOWERY: The person may be very successful but majored in the field as a background.

REGETS: Well, for this purpose a person working in another science and engineering field certainly would be included.

HOWERY: Suppose the person goes to business after majoring--

REGETS: If a person goes to business that would be considered non-S & E. But at the bachelor's degree level I personally wouldn't consider that to be a failure.

I think that businessmen should know more science.

ELLIS: I'm curious in one respect and that is we are discovering a lot of people are taking non-traditional jobs, and many people are agreeing this is a good thing. They see it as an extension of scientific and rigorous ways of thinking to arenas that previously haven't been subjected to this, but that can be subjected to this.

Now, an excellent example. How would you classify, with respect to S & E employment, a Ph.D. engineer who is working for J.P. Morgan using a computer to compute derivatives? Is that an S & E job or not?

FECHTER: Math or financial?

ELLIS: What I am saying essentially -- are you allowing this as an extension of scientific principles into new arenas?

REGETS: If he puts himself down as working in financial management, by our taxonomy he would be non-S & E which proves to us our taxonomy isn't perfect and we know that.

ELLIS: No taxonomy can be but I am saying that it looks to us as if this is an area that is rapidly shifting and it probably is going to bear some watching carefully, so--

REGETS: If a majority of Ph.D.s said that financial management was highly relevant to their degree, would it show--

WILKINSON: One thing I want to make clear is that anybody with an S & E degree is still in the database, and one of the things we intend to do is to fully analyze the fungibility of these 300 occupations that are not S & E.

REGETS: That's right.

KRUYTBOSCH: This is the first time that you will be able to examine in detail the relationships between educational qualifications and occupational activity. And this is going to be a big piece of work. We haven't even scratched the surface yet on this.

And I think that addresses the very issue that you raised as being very important -- the changing patterns of work in America and the relationships of education to that. We have the data here to do that, but we haven't started doing it yet.

ELLIS: But that is the thing that I think that all of us really need to look at.

MAISEL: I have two things to say. One is that it is obvious from the reaction that we all agree that we have a very wonderful data source here. I'm certainly impressed. But I must get back to the decision maker. If someone asked the question, is that person doing what they were educated to do, what is your answer? That's the underlying problem I have. If we are talking about information for decision making, it should be the decision makers not the statisticians who define it. The statisticians are coming from the wrong place.

REGETS: We have other measures that we intend to report on, such as how closely a person believes their occupation is related to their degree field. We are also going to do traditional labor economics things such as estimating earnings functions and seeing if the Ph.D. in physics is earning you more on Wall Street, whatever that tells us.

FECHTER: It goes back to what I said this morning: you have to ask the question first, "Why do you want to know?" How you present the data will depend very much on what it is you want to know. You will have to present that data in different ways to answer different questions. I think that is very clear.
