Graduate Commencement Address
University at Albany, State University of New York
May 21, 2000

Julian E. Zelizer
Associate Professor of History and Public Policy


Let me begin by offering my sincere congratulations on your wonderful achievement. Although I know, and can clearly remember, how the final days of graduate study are some of the most intense and tiring times that you will experience, today you should celebrate that you have made it. Now, you will enter an exciting new stage in your professional lives. I am honored to be part of your celebration.

While thinking about what topic would be pertinent for today, I kept returning to my impression that we are experiencing a dramatic revolution in knowledge. All around us—every year, every day, and every minute—there seems to be yet another major advance in knowledge about the human condition, the environment, and technology. We can't help but turn on the television or radio, switch on the computer, or open the newspaper and learn about changes in our understanding of the world and those who inhabit it. Computers transmit information from around the globe within nanoseconds; new medical technology enables doctors to regularly cure diseases that were once terminal, while the science of genetics is unraveling the mysteries of the human chromosome. Social scientists are providing a wealth of sophisticated data on some of our most vexing social and political problems. Things that once seemed like dreams are now commonplace. As you know, I could spend my entire time listing these advances. But I think that it's safe to assume that you are familiar with what I am talking about.

This is not the first time we have undergone a dramatic revolution in knowledge. In fact, at the dawn of the twentieth century, the United States found itself in a comparable situation. Between the 1880s and 1920s, what historians call the Gilded and Progressive eras, it seemed that pathbreaking knowledge was everywhere. Along with the creation of the modern corporation and the expansion of the federal government, these were the years in which the modern research university emerged. In this era of expanding wealth, there was a premium on education and expertise. Some universities, such as Columbia, reorganized to stress the production of faculty research and graduate training. Many states established model systems for public higher education and research. Graduate programs were established across the country in all types of fields, including economics, history, law, and medicine. Education now aimed to teach students the technical expertise needed to enter the thriving professions. The American Historical Association was founded in 1884 and the American Economic Association in 1885. Other disciplines soon followed. Outside the academy, cities overflowed with new professional associations of accountants, engineers, teachers, attorneys, doctors, and more. The American Medical Association reorganized into its current form in 1901, and lawyers formed the American Bar Association. From these organizations and associations came an endless flow of journals and books full of information about all aspects of American life, ranging from the operation of our economic markets to the physical aspects of heredity. The impact of this knowledge was felt everywhere as a result of innovations in engineering and new technology. Cities and homes literally lit up at night as a result of electricity. Elevators, railroads, and factories fueled a new industrial age. At the same time, average citizens were able to communicate over longer distances through telephones.
Businesses could now rely on adding machines, typewriters, and cash registers to make operations more efficient. Gradually, the automobile transformed how Americans conceived of travel and work.

But Americans quickly learned that improved knowledge could have its costs. In 1939, for example, the sociologist Robert Lynd captured these fears in his landmark book, Knowledge For What? In this work, Lynd pleaded with social scientists to use their newfound knowledge for socially beneficial purposes. He challenged colleagues to look up from their books and think about what they were doing with their expertise. Social scientists, he said, needed to become more active politically and to become more problem-centered in their technical investigations. The role of the scholar, Lynd said, was not just to "stand by, describe, and generalize, like a seismologist watching a volcano." He warned that if social scientists avoided the big questions, "the way ahead will be a prolonged series of emergencies." Likewise, the brilliant African-American writer W.E.B. Du Bois—who once championed the mere production of social scientific knowledge as the key to ending racism—dismissed his earlier confidence in the inherent value of information as a "young man's idealism."

These types of warnings about the limits of knowledge seemed prophetic as the world entered World War II. In the 1940s, citizens discovered that improved knowledge did not necessarily mean an improved society. After all, it was the highly advanced and sophisticated German nation that was responsible for history's most grotesque display of human evil. Eugenic scientists, whose findings on limiting "unfit" human birth had previously earned respect from prominent scientists, could now only be remembered for helping legitimize the horrific policies of Adolf Hitler. Moreover, although celebrated at the time, many critics would eventually note that it was the brightest scientific talent who brought us atomic and nuclear weapons of mass destruction.

After 1945, Americans lived in a schizophrenic state regarding the value of knowledge. During some years, we looked at new knowledge with the greatest of optimism. During the 1950s, experts were national heroes. Doctors were praised for eliminating diseases such as polio, while Americans learned how to raise their children from popular psychologists. Parents in the 1950s even turned to experts, believe it or not, to teach their school-age children how to survive the nuclear bomb.

In less exuberant periods, the nation seemed more suspicious of knowledge. Left-wing student activists reminded the nation in the 1960s that it was the "best and the brightest" who brought the nation into a disastrous and costly conflict in Southeast Asia. Conservatives in the 1980s pointed out that Lyndon Johnson's War on Poverty programs, crafted by some of the most talented economists and welfare experts in the nation, had failed to resolve the problems that faced the poor and often made things worse than before.

Since the 1980s, we have been undergoing another revolution in knowledge. The sheer amount of information that is created every day is simply breathtaking. Most striking, computers have facilitated new types of communication and the analysis of crucial data. How many of us would have guessed two decades ago that the entire world could literally watch news events unfold live from anywhere? How many of us would have thought that someone in the United States could instantly communicate in writing with a friend living in Japan or send video images all around the globe for very low costs? I would argue that our knowledge of space, society, the environment, and the distant past is more sophisticated, accessible, and comprehensible than in any other period of human history.

Not only have we seen dramatic advances in what we know, but also in how many people have access to that knowledge. The democratization of higher education has been a central development in American history since World War II. While the modern research university was created at the turn of the century, it was not until after the 1940s that large numbers of Americans were able to attend those institutions. Congress passed the GI Bill of Rights in 1944, enabling millions of Americans to obtain a college degree for the first time in their family's history. States throughout the country, including New York, created massive public higher education programs that offered undergraduate and graduate students access to quality education at a reasonable cost. By international standards, the scale and scope of our programs are impressive. Knowledge is no longer monopolized by a small elite. Breaking down class barriers, these universities have offered important economic opportunities for citizens of all social, economic, and ethnic backgrounds.

To be sure, conditions are far from perfect. Education costs have escalated, and segments of Americans still do not enjoy these resources. Although public education is increasingly perceived in many states as a vital economic asset, many politicians remain hostile to these programs. Recent studies, moreover, have demonstrated that too many portions of the population do not have access to computer technology. Until those challenges are met, our national promise will remain unfulfilled.

Nonetheless, it is important to appreciate our accomplishments in expanding access to knowledge through higher education. It was wonderful to read a recent survey reported in The Chronicle of Higher Education that found that most of those polled believed there should be no limits to how many people receive college educations, in contrast to 1993, when many Americans felt that "too many people" were going to college. We now have a system that offers unique opportunities for a majority of Americans. With an information-based economy, this is a seminal accomplishment. All of us at the University at Albany are part of this great experiment.

But access and quantity are not enough. I come back to the question raised by Robert Lynd: knowledge for what? For what purposes will we use all the new knowledge that has emerged? I fear that in our exuberance over new knowledge, in our rush to information, we are not asking for what purpose we will use it.

Today we are often bombarded with information that does not seem to have any added value. I recall a colleague who told me with excitement about the promise of the web for undergraduate and graduate students. My colleague told me how students could now easily gain access to thousands of documents within minutes and how this would revolutionize their education. Even as a person sympathetic to this outlook on the future, I could not help but wonder what good thousands of documents would do for a student who still could not understand and analyze a single piece of data. Even worse, what if they simply don't care about learning? When we scratch beneath the surface of dot-com fever, we often find a generation of young Americans who feel alienated from politics, civil society, and even their families. What will this new knowledge do for them? Knowledge for what?

My question is not what good is this new knowledge, but rather, how can we use our new knowledge for the greater good? Of all the institutions in the United States, universities need to promote not only the production of knowledge but also the questions about what should be done with that knowledge. We need the philosopher to ask why the computer scientist does what she or he does. If we don't ask these types of questions about where new knowledge is going, we might repeat mistakes that we have seen in this last century. Without that broader vision, and without taking a moment to look before we leap, society will walk blindly, armed with its great knowledge, into dangerous situations. Will advances in genetics be used to improve human health or to support a new eugenics agenda? Will computer technology result in improved communication, or will it become a tool that erodes our basic rights to privacy? As a historian, I do not pretend to be able to envision what dangers might lurk ahead. But I know enough from studying the past, from the lessons of the Holocaust to the threats that emerged from nuclear weapons, to understand that serious dangers are likely to emerge if we are not cautious. As Mark Twain is said to have observed, "History does not repeat itself. But it does rhyme." We must actively avoid these dangers by being judicious about directing our knowledge to just purposes.

We would do well to listen to Albert Einstein, one of the greatest minds in all of human history, who warned that "concern for man himself and his fate must always form the chief interest of all technical endeavors." As Einstein wrote, "Never forget this in the midst of your diagrams and equations."

One of the more disheartening trends in universities and foundations around the country has been the diminishing support for scholars who are not involved in applied research and who ask less practical questions about the human condition. Today universities are directing their funds toward applied research in the natural and social sciences. Research projects are being encouraged only if they will provide some immediate benefit to business, legislators, or the local and national community. In the last five years, steep cuts in humanities programs have been implemented in many places, ranging from George Mason University to the entire Ohio university system. Many universities, desperate for funding as public financial support keeps dropping, are understandably devoting themselves to these types of practical questions. While this type of research is absolutely essential to the mission and health of these institutions, those scholars who are likely to be asking the questions about what all this knowledge is for find themselves isolated. Even at lower levels of education, the emphasis has shifted to practical knowledge. I recently heard a powerful politician cynically ask why high school students should be required to take a course in the arts as opposed to more vocational classes that would give them the skills needed for the job market. I thought: must we starve the soul to train the hand?

I would argue that this isolation is a mistake, one that will diminish the potential of all scholarly research and training. This trend is but one example of how we sometimes fail to address the question of "knowledge for what?" in our exuberance to produce new forms of knowledge.

I do not intend for this to be a pessimistic talk about what might go wrong. Just the opposite. I feel fortunate to live in one of the greatest moments in world history. The advances in knowledge, in my opinion, are tremendous and breathtaking. Ultimately, knowledge is crucial to civilization. The scientist Robert Oppenheimer reminded us in the 1950s that "access to knowledge, the unplanned and uninhibited associations of men for its furtherance—these make a vast, complex, ever growing, ever changing, ever more specialized and expert technological world, nevertheless a world of human community."

But new knowledge must be used for the right purposes. As I look out at this group who will face this task as the new generation of professionals, many of whom I have had the privilege to meet and work with over the years, I am confident that you will help the nation move forward in the right direction. My purpose is not to give you the right answers but to urge you as professionals to keep asking the right questions. Knowledge for what? We are at a critical juncture. By asking the right questions now, and responding to the answers that we find, our future will be bright. Only then will we as a nation be able to turn knowledge into progress. As graduates entering the professions, you will now have the opportunity—and the responsibility—to make sure that future historians recall these years as a golden age.

Let me conclude by congratulating you once again on a tremendous achievement and by wishing you good luck in the coming years. I hope that your time here has been both memorable and personally rewarding. Thank you for having me participate in this great event.
