- Sector: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
Two-year college programs, often focused on vocational goals, have grown rapidly since the Second World War to encompass more than 10 million students (about half part-time) in 1,100 institutions—about 44 percent of American undergraduates. Community colleges grant nearly half a million degrees annually, plus thousands of certificates.
Programs are either terminal (AA/AS—Associate of Arts or Sciences) or preparatory for attendance at a four-year college. From Philadelphia Community College or East Los Angeles Community College to Native American institutions like Oglala Lakota College in South Dakota or New Mexico’s Navajo-based Dine College, community colleges have created unique opportunities to democratize college and incorporate diverse students into academic life.
While private junior colleges had emerged to fulfill these roles in the early twentieth century, “community” colleges were established after the Second World War as part of educational restructuring on the part of state universities to reach less-educated students, while not diluting their central campuses as research and teaching universities. They were intended to be located within commuting distance for high-school graduates as well, thus decentralizing state education, and to offer flexible schedules and cheaper classes (sometimes at the expense of professors). In the 1960s, these colleges became central to the planning of systems in California, Kentucky and Midwestern states, later expanding into the Sunbelt. Often envisioned for rural areas and small towns, they were also incorporated into urban education, including the CUNY system in New York. They have also taken on responsibilities in professional retraining, adult education and welfare-to-work programs.
In 1996 California had more than 1.8 million students enrolled in 106 public community colleges; it was followed by Illinois, Texas and Florida. Nationwide, roughly 137 community colleges are private; technical institutes and private schools owned by families or corporations constitute a rather gray area in this educational branch. Some of these junior colleges, for example, specialize in women’s education or the arts; Kilgore College in Texas has become known nationwide for its precision drill team, the Rangerettes.
While community colleges have proven immensely popular, they have also been easy targets for attack because of the non-academic nature of their vocational classes: the most popular programs tend to be in health services (registered nurse, dental hygiene, physical therapist), business, telecommunications and mechanical fields. The need for remedial programs in language and math that many of the colleges face has also led critics to denigrate students and institutions as inferior rather than service-oriented (a frequent charge in New York City reforms). Community colleges are also involved in issues of education and class, drawing poorer, minority, immigrant and working students rather than the pool of elite liberal-arts students or other four-year students—55 percent are Hispanic and Native American students. Their continuing success, on both an individual and a collective level, underscores both the opportunities and demands of American education in the early twenty-first century.
Industry:Culture
White-collar crime refers to offenses such as tax evasion, misuse of public funds, embezzlement, fraud and abuse of power. Its perpetrators are most often members of the social elite who have positions of influence in business and government. The term “white-collar crime” was first popularized in 1940 by criminologist Edwin H. Sutherland.
However, the phenomenon first became prominent in the American consciousness in the early 1970s, when President Richard M. Nixon was forced to resign amid allegations that he had abused his presidential power to cover up his involvement in the Watergate scandal.
The legal system has traditionally been lenient in its punishment of white-collar criminals, while penalties for crimes like burglary, drug dealing and murder have grown increasingly stringent since the Second World War. This trend has reinforced class divisions between white-collar and traditional criminals, who often come from poor backgrounds.
Authorities have grown more vigilant in their prosecution of white-collar offenders since a wave of highly publicized cases in the 1980s and early 1990s. Successful Wall Street traders Ivan Boesky and Michael Milken paid millions of dollars in fines and served substantial prison terms during this period for their involvement in illegal insider trading on the stock market. White-collar crime also played a role in the financial collapse of the savings and loan industry. Banking executive Charles H. Keating, Jr., whose institution was one of about 700 that went bankrupt, was prosecuted and jailed for defrauding his customers by persuading them to make high-risk investments in his bank’s parent company. Several US senators who had received large contributions from Keating were reprimanded for ethics violations after a Senate investigation found that they had lobbied federal regulators on Keating’s behalf. The federal government’s ultimate decision to help rescue the savings and loan industry with $130 billion in federal funds (as of the end of 1996) angered many taxpayers. They accused the government of protecting the interests of wealthy business people at the expense of average citizens.
The federal government focused increasing attention in the 1990s on its own victimization by white-collar criminals. Growing numbers of businesses and individuals have faced prosecution for defrauding government agencies. Financial abuses have been especially prevalent in the federal Medicare and Medicaid entitlement programs.
All forms of white-collar crime expanded in scope in the 1990s as perpetrators began to use the Internet and the World Wide Web as tools for corporate espionage and other illegal business practices.
Industry:Culture
The 229 Roman Catholic colleges and universities in the US currently educate over 600,000 students. These institutions are affiliated with such religious institutes as the Jesuits and the Sisters of Mercy or, in the case of Notre Dame, with the Congregation of Holy Cross. As of 1989, 44 percent (100) were comprehensive institutions and another 40 percent (91) were liberal-arts colleges, with the rest research institutions or junior colleges.
Religious colleges tend to enroll higher numbers of women and part-time students than do non-religious institutions. They also tend to be located in the Midwest and the Northeast part of the country.
Historically, Catholics often felt unwelcome in many Protestant-dominated private and state colleges. Catholic colleges were established to create a religious environment for higher education and professional schools. These schools trained ever-increasing numbers of immigrant children and fostered inculturation in a Catholic milieu. They were highly effective at both tasks. Even in the late twentieth century, nearly 40 percent of these students were the first in their families to attend college.
After the Second World War, the GI Bill offered tuition funds to veterans, and this influx of students caused many schools to expand greatly. The increase in the number of men with a college education fueled the postwar economy and propelled the percentage of college-educated Catholics upward. The children of these alumni often attended parochial schools and added to the expansion.
In the 1960s, the liberalizing effects of the Second Vatican Council were felt in the United States, and the increased numbers of wealthy and influential alumni contributed to a movement to make Catholic colleges more academically competitive. In the late 1960s, most institutions were turned over to boards of trustees, a majority of whom were lay people. At the same time, increasing openness to the new scientific methods shifted the focus of scholarship to a more scientific and a less specifically Catholic perspective. The percentages of priests and religious teaching in the colleges began to drop, with a concurrent hiring of lay people to respond to the increasing student population and the desire for increased professionalism. Ultimately the debate over what it means to be a Catholic, or religious, college grew. A few institutions even disaffiliated themselves from the Catholic Church. Schools based their identity on the teaching of theology and the development of faith life revolving around campus ministry programs. Concerns persist that colleges are becoming less Catholic. Attempts by the Roman Catholic Church to address the issue of higher education in the United States have centered, since 1991, on the Apostolic Constitution Ex Corde Ecclesiae of Pope John Paul II, in which he defines the nature of a Catholic university. Feverish debate continues unabated on this question of Catholic identity.
Industry:Culture
The theoretical foundation for how people and firms make decisions about consumption and production of goods and services. Alfred Marshall, a noted economist, referred to economics as the “study of mankind in the ordinary business of life” (Buchholz 1996: 4).
The concept of scarcity, the idea that one cannot have everything, underlies economic thought.
Scarcity combined with choice leads to the primary economic problem: the determination and interaction of supply and demand. All of these questions, in turn, are central to American business, government and thought. Hence economics has been a popular discipline for students and policy-makers, and Americans have dominated Nobel Prizes in the field since 1970, especially through strong departments at the University of Chicago, Harvard, MIT, Yale, Princeton and the University of California (economists are also found at major business schools like Harvard, Stanford and Wharton).
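The “determination and interaction of supply and demand” described above can be sketched numerically. A minimal, illustrative example in Python, assuming hypothetical linear demand and supply curves (the functional form and all parameter values are invented for illustration, not taken from this entry):

```python
def equilibrium(a, b, c, d):
    """Solve for the market-clearing price and quantity.

    Hypothetical linear curves: demand Qd = a - b*P falls as price rises,
    supply Qs = c + d*P rises with price. Setting Qd = Qs gives
    P* = (a - c) / (b + d), the price at which the market clears.
    """
    price = (a - c) / (b + d)
    quantity = a - b * price  # same value as c + d * price at equilibrium
    return price, quantity

# With these made-up parameters, the curves cross at P* = 30, Q* = 40.
price, quantity = equilibrium(a=100, b=2, c=10, d=1)
print(price, quantity)  # prints: 30.0 40.0
```

The point of the sketch is only that, once the curves are specified, the “primary economic problem” reduces to finding where they intersect; the hard empirical work lies in estimating curves like these from real data.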
Within the field, microeconomics deals with the behavior of individual consumers and firms. Macroeconomics focuses on the larger picture: the total level of economic activity, such as employment, national income and inflation. The primary value of both is their predictive power in the real world. Unfortunately, most economic theory requires assumptions that cannot be replicated in life. The violation of “ceteris paribus”—the notion of “all other things constant”—makes extracting the effects of a single policy from all other noise factors nearly impossible. While economists have developed rigorous models through econometrics and chaos theory, these rarely tell the entire story. For example, economics may be able to prove an association between interest rates and inflation, but it often cannot indicate which causes the other. The conservative nature of its proponents and the failure of economics to reflect reality accurately led Thomas Carlyle to refer to economics as “the dismal science.” Economic theory is often better at explaining what has happened than predicting what will happen.
At the same time, data mining—the notion that if you look at enough data, you will eventually find something that proves your point—has supported some dubious economic arguments. For example, economics is often used to support political rhetoric. In the early 1980s, Ronald Reagan promoted a plan to cut taxes in order to increase tax revenue. The plan, now referred to as supply-side economics, argued that reducing taxes can increase revenue by increasing general economic activity. Then-presidential candidate (and future supply-sider himself) George Bush referred to this plan as “voodoo economics.” Politicians of every background use economic theories to justify their preferred policies.
The blending of disciplines has created new avenues for growth in economics. Urban economics, gender economics, environmental economics and many others have found their way into the curriculums of American universities. The subdisciplines often seek to quantify that which is not easily observed or explained using more traditional methods.
Increasing globalization also presents challenges for economics because assumptions may not be consistent across cultures and data are difficult to compare across borders.
The definitions of utility, happiness and wealth, to name a few variables, are far from universally accepted, despite what economics may have one believe.
Industry:Culture
The fundamental discourse of division in American culture for the last three centuries has been that of race, specifically the division of “black” and “white.” In many ways, in fact, this framework has eclipsed fundamental issues of class and confused issues of gender and citizenship, especially in everyday discussion. Other groups—Native Americans, Asian Americans, Hispanics, Irish Americans, Jews, etc.—have “fit” into American society in terms of this division as well. Hence, terms like “ethnicity” or stereotypes of new immigrants cannot be discussed without some understanding of the history and construction of race.
Race, in turn, has fundamentally referred to the social construction of biology. Often, this has meant phenotype—what someone looks like, based on skin color and a few associated categories—inscribed on the body. Genealogy was imputed via an unbalanced cultural model; “one drop” of African American blood determined race in many states.
This also precluded the construction of intermediate groups (mestizo) found in Latin American societies; “half-breed” was an insult, not a category. In practical terms, “looking” black or Indian or Chinese was the social determinant of racial categorization.
While the category “mulatto” might be recognized (or even sanctioned in New Orleans), legally these people were defined as black—hence, the longstanding category of the “tragic mulatto” and issues of “passing” (blacks living as white) which haunt literature and film.
These categories had further implications for policy and thought in the early Republic, where the Constitution defined slaves as the equivalent of three-fifths of a human being.
Scientists also argued polygenetic versus monogenetic models of racial origin, which again made non-white races less than human. Even in areas like medicine, education and the census, pseudo-biological racial assumptions underpinned unequal treatment (as sociological assumptions later would do).
The “clear-cut” categories of race were confounded by European immigration in the nineteenth century, which produced “white” populations that differed in language, culture, class and strategy from the dominant Northern European populations. Hence, the notion of ethnicity developed out of the category of race, mingling “visible” features with other distinctions of race, religion and perceived behaviors. In the nineteenth century, for example, the Irish would have been classified as a race separate from the English, the Scottish or the Germans; through much of the twentieth century the Irish were considered an ethnic group, part of a so-called white race. This transformation is part and parcel of the story of assimilation, made possible in effect by the reality that the Irish may have been considered different from mainstream WASPs, but they were not as different as African Americans, the Chinese, or Native Americans were deemed to be. For the Irish themselves, and for other Southern and Eastern Europeans who faced hostility from native-born Americans, considerable mileage could be gained from the process of “whitening,” and by propagation of notions of ethnicity. At the same time, the idea of an “Irish” race allowed the Irish to distinguish themselves from their British colonizers and even to organize “racial” (political) action in Ireland without appearing disloyal to their new nation.
In the 1920s, immigration quotas reified certain categories of origin (older ethnics) as legitimate populations, while proscribing others as racially inferior. Here, the limitations on Asian immigration imposed between 1882 and 1943 are particularly striking. At the same time, the Great Migration of African Americans from the South to Northern cities reaffirmed the presence, meanings and tensions of the fundamental racial divide.
Race and ethnicity became major questions in politics and the social sciences from the turn of the century onward. Laws in the North and South sought to define race in terms of rights, location and boundaries in such areas as marriage. Anthropology, under the leadership of Franz Boas, developed a strong commitment to refuting race as a biological category which has continued to the present; Boas and his followers also worked with American ethnics, although this would be developed even more by sociology and anthropology in subsequent years (for example, the University of Chicago school of sociology). Many foundations were active in supporting this research and its propagation; African Americans like W.E.B. Du Bois also challenged the preconceptions of race on both intellectual and political grounds.
Political, demographic and cultural changes since the Second World War have further complicated concepts and usage of race and ethnicity. Continued migration of African Americans to Northeastern and Western cities, places populated by large numbers of descendants of immigrants, encouraged a competitive racism in which divisions among European immigrant groups became increasingly blurred at the expense of increasingly ghettoized and segregated black (and Asian or Hispanic) communities. That is, contact and even marriage across religious or “ethnic” lines—e.g. Irish Catholic and German Protestant—became more socially acceptable, even while anti-miscegenation laws forbade marriages between “white” and “other” in many states.
The massive migration out of cities into the suburbs after the war, encouraged by the GI Bill and highway construction, further accentuated the racial divisions between white suburbs and black de-industrialized inner cities. In addition, political changes occurred from the emancipation of slaves in 1865 all the way through the Civil Rights movement, which both provided a strong racial cast to political discourse and then assaulted that discourse in ways that would contribute to the status anxiety of more marginal white populations.
Within such a context the 1960s became a key period in the creation of what Steinberg has called the “ethnic myth.” In the countercultural assault on corporate America, many of the icons and mechanisms of assimilation came under attack, from the bland suburban tract to the WASP-dominated college campus and military-industrial complex.
Expression of European ethnic heritages and sometimes the “invention of traditions” became common at this time, further exaggerated once expressions of black cultural identity (connected with Black Power) became seen as politically potent weapons.
Ironically, what started out as a radical critique of mainstream American society soon turned into a white-ethnic backlash against civil-rights advances, the War on Poverty and affirmative action.
Boundaries between race and ethnicity, however, started to erode as the end of the century approached. This is particularly the case since post-1965 immigrants arriving in the United States have not fitted easily within the black-white model established in political discourse. Asian Americans have never fitted within the system, while Hispanics and immigrants from the Caribbean and Africa have brought their own concepts of race with them that often incorporate (or conceal) racial distinctions and mixtures within the ethnic group. Further, whereas in the nineteenth century native-born Americans and European immigrants often came together at the expense of African Americans, in some of the recent reactions to immigration African Americans and other native-born Americans have moved closer together in their opposition to the newly arriving immigrants. In both cases, we must be aware of ethnicity as a potential strategy to divide class interests, as well as of the efforts of those divided by race and ethnicity to come together in common causes. Sometimes, in fact, division and cohesion are closely interwoven, as in the long history of Black-Jewish relations in the US.
Categories have also become neutralized in public (multicultural) discourse, where “ethnicity” is sometimes used as a “softer” word than race to imply that all divisions are epistemologically equal. “Ethnic studies” has become a major academic field in research and teaching by both challenging and blurring categories of difference (see literature, race and ethnicity). Moreover, demographics continue to complicate simple categories of ascription or self-identification, as questions of the 2000 census already have revealed.
With the elimination of laws against miscegenation since the 1960s, as well as the presence of new racial/ethnic groups, “multi-ethnic,” “bi-racial” or “hyphenated” families and citizens, while not the norm, have a greater presence in everyday life. Others have also identified American as their ethnic, racial or heritage category.
“Race” and “ethnicity” are used in more confusing and sometimes sinister ways in popular culture. “Black” music, “black” audiences/consumers and “Hispanic television” are all assumptions made in marketing and mass media. Asian Americans have both gained and suffered from assumptions underpinning their categorization as the “model minority,” while Native Americans have had to learn to reassert a complex of biological, linguistic, historical and cultural features to claim tribal identities.
“Ethnic” can refer to established American neighborhoods, food and nostalgia, or lead to the creation of “vaguely ethnic” characters in mass media, marked by clues of food, accent or religion. Here, ethnicity sometimes stands in for other categories like class.
Ethnic is also used to refer to continual global borrowing—“ethnic chic” may take items from Russia, Mayans, Nepal and Zulus, while “world music” mixes rhythms, instruments and heritages. Race, by contrast, tends to be strongly marked in the same situations—no television character is “sort of black” or “maybe Asian,” although light-skinned African Americans often have been highlighted as models and actors (and white female actors like Katharine Hepburn portrayed Chinese women on screen). Again, in reading these characters, it is important to see where “race” is an issue and where these characters are also used as vehicles for the discussion (or concealment) of issues of class, gender and “otherness.”
Industry:Culture
Up to 1,000 cookbooks were published annually in the US in the late twentieth century.
Such proliferation represents not only a diversity of audiences and tastes, but also a recognition of cooking as cultural capital (for the middle class), as well as a realm in which domestic knowledge has given way to outside mass-mediated expertise.
Cookbooks, cooking television and other media, like food itself, thus embody critical changes in American society and culture.
This can be seen by successive editions of the classic Joy of Cooking (1931) created by Irma Rombauer. Unlike the equally popular Better Homes and Gardens Cookbook (first published 1930; multiple re-editions), linked to a publishing empire, or the Betty Crocker Cookbook (1950; multiple re-editions) of food manufacturer General Mills, this was an individual effort, originally self-published. Early editions met the needs of a world where domestic service was disappearing. Collecting recipes from friends and additional information that now seems dated in its reliance on canned soups or overcooked pasta, Rombauer later adapted easily followed recipes to new conditions like wartime rationing.
This tradition of change was continued by her daughter and her grandson, who produced the new 1997 edition. This last comprehensive volume—which ranges from beating eggs to comparing caviars—takes into account sophisticated palates and distinctions between newly available ingredients, global cuisines, health concerns and family dynamics that make pizza a meal category. Joy of Cooking emerged in a relatively limited market. Postwar prosperity and nuclear-family domesticity changed the needs and markets for cooking guidance. Magazines and the press also taught cooking (including published collections from food editors like the New York Times’ Craig Claiborne). Television food shows also appeared with the earliest stations, incorporating cooking teachers like Dione Lucas and showman cooks like James Beard, who linked his recipes to commercial endorsements as well as cookbooks. Julia Child’s inimitable PBS French Chef (1963–73), with sequels, set new standards in cookbooks and television for generations to come, re-establishing French cuisine as a goal, yet doing so with a love of food and an acceptance of kitchen mistakes that demystified haute cuisine. Her success was followed by other PBS shows and a cable food channel, again often linked to cookbook sales and celebrities.
Meanwhile, advertisers supplied recipes to enhance sales and create new uses for their products, from gelatin to cream cheese to soup. This onslaught for the food consumer increased with new machines—pressure cookers to microwaves to breadmakers—that altered the American kitchen. Some products, in fact, became identified with specific recipes: Nestle’s chocolate chips and Toll House cookies or Chex cereals and snack mixes.
Other cookbooks have expanded with affluence and leisure, as well as exposure to new immigrants and travel. Prominent among cookbook categories and television shows are those that champion the cuisines of Italy, France and Asia, as well as domestic regional/ethnic specializations like Cajun, Southwestern or soul food. Celebrity chefs become multimedia institutions with restaurants (chains), cookbooks, shows and guest appearances. Other writers incorporate the ethnography of food into their writings, like Paula Wolfert on the circum-Mediterranean or Marcella Hazan on Italy. Newspaper sections and magazines targeting affluent consumers—Saveur, Food and Wine, Gourmet—also combine narrative, pictorials and recipes. Often, these make demands on time and ingredients that set the process and results of cooking apart from everyday eating, reinforcing its cultural capital in the middle class. Other cookbooks meet specialized interests and needs, whether in preparation categories—basic, grilling, baking, speedy etc.—or nutrition and diet, featuring light, low-fat and salt-free foods.
Clubs, schools, churches and other groups also elaborate community through cookbooks and cookbook sales. Folklorists and anthropologists have examined these food ways and contributed celebrations and collections patronized by institutions like the Smithsonian. Indeed, these complexities of community and change permeate media that permit cooking for status or raise questions of identity embodied in Jeff Smith’s wry subtitle on his The Frugal Gourmet on Our Immigrant Ancestors (1990): “Recipes you should have gotten from your grandmother.”
Industry:Culture
The history profession in America has changed dramatically in the past half century. The GI Bill after the Second World War brought about rapid growth in American higher education, including history programs. In the 1950s, the dominant interpretation of American history was the consensus school, which emphasized the homogeneity and lack of conflict in American culture and history at least partly in reaction to the previous Progressive historians, who portrayed class, political and regional conflict as the motivating force in American history. Daniel Boorstin, Louis Hartz and Richard Hofstadter were among the best-known and most influential historians working in the consensus tradition.
The 1960s saw the beginnings of a profound historiographical upheaval that is still underway today. This revolutionary “paradigm shift” had several components. Perhaps the most visible and controversial development was the emergence of a New Left school of history. Influenced by a growing sense of the problems and inequalities of American society, particularly in reaction to the Vietnam War and the Civil Rights movement, New Left historians criticized the consensus school for an overly complacent and congratulatory view of America’s past, and instead emphasized the elements of exploitation, imperialism and racism in the nation’s history. New Left history produced revisionist accounts of the Cold War, slavery and abolitionism, emphasized the pervasiveness of radical events and individuals in America’s past, and often tried to write “history from the bottom up,” focusing on lower-class or previously “inarticulate” elements of society. William Appleman Williams and Eugene Genovese were among the most influential New Left historians.
Related to New Left history was a profound and long-lasting development, the emergence of a “new social history.” The most important accomplishment of social historians was to expand the focus of the discipline beyond an emphasis on white, middle-/upper-class males to a more inclusive perspective. The growth of women’s and African American history was just the tip of the iceberg. Social historians examined an almost endless array of previously ignored subgroups of American society, including Native Americans, Asian Americans, Hispanic Americans, European ethnic groups, workers, the poor, the elderly, children, adolescents, gays and bisexuals, and so on.
Following the example of the influential French Annales school of historiography, American historians also focused more attention on private behavior and everyday life, producing studies of topics from marriage and family life to the climate and environment, and nearly everything in between.
American historians have been caught up in “culture-wars” controversies between the left and right in recent years, including a bitter conflict over a planned 1995 Hiroshima memorial exhibition at the Smithsonian Institution. Historians have pondered the significance of postmodern theory for their discipline, and worried about growing specialization and loss of coherence. A chronically depressed and depressing job market has marred an otherwise unprecedentedly vibrant, challenging, innovative era in American historiography.
Industry:Culture
The National Association for Stock Car Auto Racing was founded in 1948 in Daytona Beach, Florida, to promote the products of the major automakers, first Ford and Chevrolet, and later Pontiac/General Motors. NASCAR quickly expanded from Florida through the Midwest to California in the early 1950s, the nature of the sport changing dramatically as speeds increased from an average of just over 100 mph in 1957 to current speeds of 200 mph. In addition, dirty driving (spinning other cars on the track or bumping them from behind) has become common, increasing the popularity of the sport and making crashes frequent occurrences. Dale Earnhardt, one of the leading drivers in the 1990s, has been notorious for his expertise in dirty driving.
The Winston Cup is the most important prize given to the driver who has the best record in the season’s races. While some very popular drivers have surfaced in recent years, none has triumphed like Richard Petty in the 1970s, when he won the Winston Cup on seven occasions.
NASCAR has generally been dominated by white males from the Southern states, like Virginia and the Carolinas. Its fan-base is both national and large, especially with the Daytona 500 being shown annually by the television networks (NASCAR is one of the few sports that always profits the networks), but it too is largely white (African Americans have seldom participated, Wendell Scott being the exception in 1963).
Industry:Culture
There is a rich history of the study of life in all civilizations, though current methods stem primarily from Eurocentric roots. However, there is increased attention to multicultural contributions to the understanding of biological processes. Biology is typically studied at two broad organizational levels—molecular/cellular and organismic/population. General inquiry is based on classical scientific methods, which include the development and testing of hypotheses, though feminists advocating the interconnectedness of living systems have suggested that such approaches omit critical aspects of understanding complex entities. Inclusion of chemical, physical and mathematical techniques for examining complexities at different levels has become a defining trait of biological study since the mid-twentieth century. Biology teaching has come under increasing scrutiny as technology provides alternatives to dissection and animal testing. Though some argue that substitution of computer programs for use of whole organisms creates an atmosphere of disrespect for the complexity of form and function, others contend that the destruction of living organisms for demonstration of simple principles shows equal disrespect for life.
Curricular changes are beginning to incorporate ideas of bio-ethics alongside the creative discovery of scientific principles through active learning.
Popular discussions of biological problems such as population control (P. Ehrlich's The Population Bomb, 1968), pesticide hazards (R. Carson's Silent Spring, 1962), conservation and biodiversity (E.O. Wilson's BioDiversity, 1988) and genetic engineering (Suzuki and Knudsen's Genethics, 1989) have led to questions about science as social knowledge. Social Darwinism, the application of evolutionary concepts of resource allocation to humans, persists in current social programs. The use of IQ tests in allocating access to education and other resources, as well as the potential for discrimination raised by genetic testing, continues to emerge as controversial in popular literature.
Advances in biological technology have allowed the genetic engineering of food as well as medicines to become a part of everyday life. PCR (polymerase chain reaction) has opened the door to sequencing DNA fragments, building comprehensive gene libraries (catalogues of known sequences) and constructing genetic hybrids. Public hysteria has been fanned over perceived dangers of recombinant DNA techniques, often without widespread understanding of how recombinant organisms are regulated, controlled and applied. For instance, microbial cocktails containing engineered organisms are commonly found in the grease digesters of major fast-food chains as well as on the front lines of pollution cleanup.
The fundamental question of defining life in a biological sense continues to be refined.
As technology pushes the limits of sustaining life, from premature babies weighing less than two pounds to the continued bodily functioning of brain-dead persons, questions of what constitutes life abound. Putative evidence of life on Mars, for example, was discovered in a meteorite fragment recovered in Antarctica. Though no actual life forms were found, by-products characteristic of living organisms were taken from the fragment, stimulating speculation about what conditions might have allowed life to exist on this neighboring planet and what forces might have shaped its evolution.
Biology is also linked to cultural debates where research and theory intersect with policy and change. The mapping of the human genome and progress in gene therapy have raised questions of ownership as well as impact. The specter of “biopiracy” has also been raised as corporations seek to exploit resources that have been taken as common goods.
Issues of the environment and human participation within complex ecological systems continue to keep biological knowledge and projections in the public eye.
Industry:Culture
Soul music became the voice of black America in the 1960s, but that voice was hardly a singular one. Mixing the sacred sounds of gospel with the profane sounds of the blues, plus a dash of lush pop production, soul brought black music to new heights of expressiveness. Lyrically, soul veered from complex, adult romance to optimistic anthems of black pride, a reflection of the social changes and civil rights activism taking place in the United States.
Ray Charles, Sam Cooke, Jackie Wilson and James Brown pioneered the form in the late 1950s with their mixtures of gospel and R&B. In the 1960s, major soul music scenes centered on labels, producers and studios in Detroit, MI (Motown), Memphis, TN (Stax/Volt), Philadelphia, PA (Gamble & Huff) and Muscle Shoals, AL.
Throughout the decade, artists such as Smokey Robinson, the Supremes, Ben E. King, the Temptations and Curtis Mayfield brought the sounds of sweet soul music to listeners, black and white, across mainstream America. The demand for "Respect" in the Otis Redding song, as performed by Aretha Franklin in 1967, captures as well as anything the spirit of soul.
The soul era is generally considered to have died out when the Civil Rights era ended, signified most dramatically by the assassination of Martin Luther King, Jr. in 1968.
Musically, soul lived on, spawning the new variations of funk (Sly Stone, James Brown, Parliament/Funkadelic) and disco (the Trammps, Donna Summer), as well as the singular sounds of Stevie Wonder.
Industry:Culture