Monday, 21 December 2009


From the Oxford Dictionary of English Folklore:

This word, in various spellings, means a loosely defined midwinter period (not a single day) in the early languages of most Germanic and Scandinavian countries. Bede, writing of pagan England, mentions two months, ‘early Yule’ and ‘later Yule’, corresponding to Roman December and January; after the Conversion, ‘Yule’ was narrowed to mean either the Nativity (25 December), or the twelve days of festivity beginning on this date. The word Christmas replaced ‘Yule’ in most of England in the 11th century, but not in north-eastern areas of Danish settlement, where it survived strongly till modern times as the normal dialect term for Christmas. Nineteenth-century writers took up the word as a way of denoting the ‘Christmas of olden times’, with its lavish food and secular jollity, situated in a largely invented ‘Merrie England’.

The medieval liking for pageantry and symbolism sometimes led to Yule being impersonated (cf. Father Christmas). In 1572 the Archbishop of York ordered the Mayor and Aldermen to suppress an annual parade on St Thomas’s Day (21 December) called ‘The Riding of Yule and his Wife’, because it drew ‘great concourses of people’ away from church-going, and involved disguising. The man representing Yule carried a shoulder of lamb and a large cake of fine bread; he was accompanied by his ‘wife’, carrying a distaff, and by attendants who threw nuts to the crowd.

The Yule Log (or Clog, or Christmas Block) is mentioned in folklore collections from most parts of England, but especially the West Country and the North. It would be the largest piece of wood which could fit on the family hearth, and was usually brought in on Christmas Eve with some ceremony, and put on the fire that evening; many writers, including Herrick, say it was kindled with a fragment kept from the previous year’s log. It was also generally believed that it would be very unlucky for the family if the log was allowed to go out on Christmas Day. It is not clear when the custom arose, since the first definite references are only from the 17th century, for example Aubrey’s “In the West-riding of Yorkshire on Christmas Eve at night they bring in a large Yule-log, or Christmas block, and set it on fire and lap their Christmas ale and sing ‘Yule, Yule, a pack of new cards and a Christmas stool’”.

Victorian illustrations of a medieval Christmas often show several men hauling huge trees or stumps in with ropes, but the antiquity of the word ‘Yule’ cannot prove the custom’s age. Less well known is the custom of lighting a Yule candle on Christmas Eve, first recorded by this name in 1817. These were taller than usual candles (‘half a yard in length’), and there was a tradition of chandlers and grocers giving them to their regular customers. The custom is reported chiefly from the north of the country, but its wider range is indicated by Parson Woodforde’s diary entries, in Norfolk, such as: ‘I lighted my large wax-candle being Xmas day during tea-time this afternoon for abt. an hour’ (25 December 1790). The pre-Reformation Church made a particular feature of candles at Christmas, and strong connections between the season and candles persist to this day. It was thought unlucky to light the Yule candle before dusk on Christmas Eve, and once alight it was not moved. As with the log, a small piece was kept ‘for luck’ in the coming year.

A Christmas Reminder

Friday, 18 December 2009

Kirsty MacColl

10 October 1959 – 18 December 2000

Ezra Pound on Brooks Adams

In the Greenwood Press Ezra Pound Encyclopedia:
Carta da Visita
Written in Italian, Carta da Visita was published in Rome in 1942 by Giambattista Vicari. A new edition was published in Milan in 1974 by Vanni Scheiwiller, and an English version, in a translation by John Drummond and titled A Visiting Card, was published in London in 1952. This translation was reprinted, with some revisions, in Impact (1960), pages 44–74, and without the revisions in Selected Prose 1909–1965 (1973), pages 306–335.

Carta da Visita consists of some thirty headwords, apparently disjointed but with a certain internal coherence, dealing with economics, politics, history, and culture. The concept of the nation, its control of the economy, its monetary system and policies (the right to coin and to lend money, the introduction of stamp scrip), and its falling under hostile forces such as international monopolism and usurocracy govern Pound’s choice of headwords. Characteristically Pound traces the history of his favorite conflict through positive and negative models: Monte dei Paschi di Siena and the monetary experiment of Wörgl versus the Bank of England. He also points to the affinities between the American and the Fascist Revolutions. As regards culture, he advances his “canon” (Homer, Sappho, the Latin elegists, Troubadours, Cavalcanti, Dante, etc.) in a dense but imprecise language for whose renewal and reinvigoration he is hoping.
Stefano Maria Casella
One of the thirty joints as published in Impact:
Brooks Adams

This member of the Adams family, son of C. F. Adams, grandson of J. Q. Adams, and great-grandson of J. Adams, Father of the Nation, was, as far as I know, the first to formulate the idea of Kulturmorphologie in America. His cyclic vision of the West shows us a consecutive struggle against four great rackets, namely the exploitation of the fear of the unknown (black magic, etc.), the exploitation of violence, the exploitation or monopolization of cultivable land, and the exploitation of money.

But not even Adams himself seems to have realized that he fell for the nineteenth-century metaphysic with regard to this last. He distinguishes between the swindle of the usurers and that of the monopolists, but he slides into the concept, shared by Mill and Marx, of money as an accumulator of energy.

Mill defined capital “as the accumulated stock of human labour.”

And Marx, or his Italian translator: “commodities, in so far as they are values, are materialized labour,”
so denying both God and nature.
With the falsification of the word everything else is betrayed.

Commodities (considered as values, surplus values, food, clothes, or whatever) are manufactured raw materials.

Only spoken poetry and unwritten music are composed without any material basis, nor do they become “materialized.”

The usurers, in their obscene and pitch-dark century, created this satanic transubstantiation, their Black Mass of money, and in so doing deceived Brooks Adams himself, who was fighting for the peasant and humanity against the monopolists.

“ ... money alone is capable of being transmuted immediately into any form of activity.” -- This is the idiom of the black myth!

One sees well enough what he was trying to say, as one understands what Mill and Marx were trying to say. But the betrayal of the word begins with the use of words that do not fit the truth, that do not say what the author wants them to say.

Money does not contain energy. The half-lira piece cannot create the platform ticket, the cigarettes, or piece of chocolate that issues from the slot-machine.

But it is by this piece of legerdemain that humanity has been thoroughly trussed up, and it has not yet got free.

Without history one is lost in the dark, and the essential data of modern history cannot enlighten us unless they are traced back at least to the foundation of the Sienese bank, the Monte dei Paschi; in other words to the perception of the true basis of credit, viz., “the abundance of nature and the responsibility of the whole people.”

The difference between money and credit is one of time.

Credit is the future tense of money. Without the definition of words knowledge cannot be transmitted from one man to another. One can base one’s discourse on definitions, or on the recounting of historical events (the philosophical method, or the literary or historical method, respectively).

Without a narrative prelude, perhaps, no one would have the patience to consider so-called “dry” definitions.

The war in which brave men are being killed and wounded, our own war here and now, began - or rather the phase we are now fighting began - in 1694, with the foundation of the Bank of England.

Said Paterson in his manifesto addressed to prospective shareholders, “the bank hath benefit of interest on all moneys which it creates out of nothing.”

This swindle, calculated to yield interest at the usurious rate of sixty percent, was impartial. It hit friends and enemies alike.

In the past the quantity of money in circulation was regulated, as Lord Overstone (Samuel Lloyd) has said, “to meet the real wants of commerce, and to discount all commercial bills arising out of legitimate transactions.”

But after Waterloo Brooks Adams saw that “nature herself was favouring the usurers.”

For more than a century after Waterloo, no force stood up to the monopoly of money. The relevant passage from Brooks Adams is as follows:
“Perhaps no financier has ever lived abler than Samuel Lloyd. Certainly he understood as few men, even of later generations, have understood, the mighty engine of the single standard. He comprehended that, with expanding trade, an inelastic currency must rise in value; he saw that, with sufficient resources at command, his class might be able to establish such a rise, almost at pleasure; certainly that they could manipulate it when it came, by taking advantage of foreign exchange. He perceived moreover that, once established, a contraction of the currency might be forced to an extreme, and that when money rose beyond price, as in 1825, debtors would have to surrender their property on such terms as creditors might dictate.” *
I’m sorry if this passage should seem obscure to the average man of letters, but one cannot understand history in twenty minutes. Our culture lies shattered in fragments, and with the monetology of the usurocracy our economic culture has become a closed book to the aesthetes.

The peasant feeds us and the gombeen-man strangles us - if he cannot suck our blood by degrees.

History is written with a knowledge of the despatches of the ambassador Barbon Morosini (particularly one dated from Paris, 28 January, 1723 (Venetian style), describing the Law affair), together with a knowledge of the documents leading up to the foundation of the Monte dei Paschi, and the scandalous pages of Antonio Lobero, archivist of the Banco di San Giorgio of Genoa.

We are still in the same darkness which John Adams, Father of the Nation, described as “downright ignorance of the nature of coin, credit, and circulation.”

* Brooks Adams, The Law of Civilization and Decay

From ‘Cider with Rosie’

by Laurie Lee:

The week before Christmas, when snow seemed to lie thickest, was the moment for carol-singing; and when I think back to those nights it is to the crunch of snow and to the lights of the lanterns on it. Carol-singing in my village was a special tithe for the boys; the girls had little to do with it. Like hay-making, blackberrying, stone-clearing, and wishing-people-a-happy-Easter, it was one of our seasonal perks.

By instinct we knew just when to begin; a day too soon and we should have been unwelcome, a day too late and we should have received lean looks from people whose bounty was already exhausted. When the true moment came, exactly balanced, we recognized it and were ready.

So as soon as the wood had been stacked in the oven to dry for the morning fire, we put on our scarves and went out through the streets, calling loudly between our hands, till the various boys who knew the signal ran out from their houses to join us.

One by one they came stumbling over the snow, swinging their lanterns around their heads, shouting and coughing horribly.

‘Come carol-barking then?’

We were the Church Choir, so no answer was necessary. For a year we had praised the Lord out of key, and as a reward for this service - on top of the Outing - we now had the right to visit all the big houses, to sing our carols and collect our tribute.

To work them all in meant a five-mile foot journey over wild and generally snowed-up country. So the first thing we did was to plan our route; a formality, as the route never changed. All the same, we blew on our fingers and argued; and then we chose our Leader. This was not binding, for we all fancied ourselves as Leaders, and he who started the night in that position usually trailed home with a bloody nose.

Eight of us set out that night. There was Sixpence the Tanner, who had never sung in his life (he just worked his mouth in church); the brothers Horace and Boney, who were always fighting everybody and always getting the worst of it; Clergy Green, the preaching maniac; Wait the bully, and my two brothers. As we went down the lane other boys, from other villages, were already about the hills, bawling ‘Kingwenslush’, and shouting through keyholes ‘Knock on the knocker! Ring at the Bell! Give us a penny for singing so well!’ They weren’t an approved charity as we were, the Choir; but competition was in the air.

Our first call as usual was the house of the Squire, and we trooped nervously down his drive. For light we had candles in marmalade-jars suspended on loops of string, and they threw pale gleams on the towering snowdrifts that stood on each side of the drive. A blizzard was blowing, but we were well wrapped up, with Army puttees on our legs, woollen hats on our heads, and several scarves around our ears.

As we approached the Big House across its white silent lawns, we too grew respectfully silent. The lake near by was stiff and black, the waterfall frozen and still. We arranged ourselves shuffling around the big front door, then knocked and announced the Choir.

A maid bore the tidings of our arrival away into the echoing distances of the house, and while we waited we cleared our throats noisily. Then she came back, and the door was left ajar for us, and we were bidden to begin. We brought no music, the carols were in our heads. ‘Let’s give ’em “Wild Shepherds”,’ said Jack. We began in confusion, plunging into a wreckage of keys, of different words and tempo; but we gathered our strength; he who sang loudest took the rest of us with him, and the carol took shape if not sweetness.

This huge stone house, with its ivied walls, was always a mystery to us. What were those gables, those rooms and attics, those narrow windows veiled by the cedar trees? As we sang ‘Wild Shepherds’ we craned our necks, gaping into that lamplit hall which we had never entered; staring at the muskets and untenanted chairs, the great tapestries furred by dust - until suddenly, on the stairs, we saw the old Squire himself standing and listening with his head on one side.

He didn’t move until we’d finished; then slowly he tottered towards us, dropped two coins in our box with a trembling hand, scratched his name in the book we carried, gave us each a long look with his moist blind eyes, then turned away in silence.

As though released from a spell, we took a few sedate steps, then broke into a run for the gate. We didn’t stop till we were out of the grounds. Impatient, at last, to discover the extent of his bounty, we squatted by the cowsheds, held our lanterns over the book, and saw that he had written ‘Two Shillings’. This was quite a good start. No one of any worth in the district would dare to give us less than the Squire.

Steadily we worked through the length of the valley, going from house to house, visiting the lesser and the greater gentry - the farmers, the doctors, the merchants, the majors, and other exalted persons. It was freezing hard and blowing too; yet not for a moment did we feel the cold. The snow blew into our faces, into our eyes and mouths, soaked through our puttees, got into our boots, and dripped from our woollen caps. But we did not care. The collecting-box grew heavier, and the list of names in the book longer and more extravagant, each trying to outdo the other.

Mile after mile we went, fighting against the wind, falling into snowdrifts, and navigating by the lights of the houses. And yet we never saw our audience. We called at house after house; we sang in courtyards and porches, outside windows, or in the damp gloom of hallways; we heard voices from hidden rooms; we smelt rich clothes and strange hot food; we saw maids bearing in dishes or carrying away coffee-cups; we received nuts, cakes, figs, preserved ginger, dates, cough-drops, and money; but we never once saw our patrons. We sang as it were at the castle walls, and apart from the Squire, who had shown himself to prove that he was still alive, we never expected it otherwise.

We approached our last house high up on the hill, the place of Joseph the farmer. For him we had chosen a special carol, which was about the other Joseph, so that we always felt that singing it added a spicy cheek to the night. The last stretch of country to reach his farm was perhaps the most difficult of all. In these rough bare lanes, open to all winds, sheep were buried and wagons lost. Huddled together, we tramped in one another’s footsteps, powdered snow blew into our screwed-up eyes, the candles burnt low, some blew out altogether, and we talked loudly above the gale.

Crossing, at last, the frozen mill-stream - whose wheel in summer still turned a barren mechanism - we climbed up to Joseph’s farm. Sheltered by trees, warm on its bed of snow, it seemed always to be like this. As always it was late; as always this was our final call. The snow had a fine crust upon it, and the old trees sparkled like tinsel.

We grouped ourselves round the farmhouse porch. The sky cleared, and broad streams of stars ran down over the valley and away to Wales. On Slad’s white slopes, seen through the black sticks of its woods, some red lamps still burned in the windows.

Everything was quiet; everywhere there was the faint crackling silence of the winter night. We started singing, and we were all moved by the words and the sudden trueness of our voices. Pure, very clear, and breathless we sang:

As Joseph was a walking
He heard an angel sing;
‘This night shall be the birth-time
Of Christ the Heavenly King.
He neither shall be bornèd
In Housen nor in hall,
Nor in a place of paradise
But in an ox’s stall …’

And two thousand Christmases became real to us then; the houses, the halls, the places of paradise had all been visited; the stars were bright to guide the Kings through the snow; and across the farmyard we could hear the beasts in their stalls. We were given roast apples and hot mince-pies, in our nostrils were spices like myrrh, and in our wooden box, as we headed back for the village, there were golden gifts for all.

Thursday, 17 December 2009

Christmas in America: European Inheritances

From Penne L. Restad, Christmas in America: A History (New York and Oxford: Oxford University Press, 1996):

"Shall we have Christmas?" was the way one Pennsylvanian asked the question in 1810. Throughout their colonial history and well into nationhood, not only the matter of "shall" but of "how shall" Christmas be celebrated challenged Americans. Their search for answers to these two difficult and sometimes divisive issues can be found in a chronicle of evolving customs, cultural discord, and striking invention. It begins with the first European emigres, who brought to America an ambiguous legacy concerning the holiday that was almost as old as the Christian Church itself.

Christians had wrestled for centuries with questions of if, when, and how to celebrate Jesus' birth. As a commemoration of the miracle that established the Godly paternity of Jesus, Christmas was a celebration of the event upon which the existence of Christianity depended. At the same time, the festival functioned from its inception as an end-of-year substitute for pagan rites and quickly absorbed many profane elements, ones that remain among its most attractive features. As the observance of Christmas spread, the details of its celebration became as varied as the cultures that kept it and as changeable as the history of those cultures. But the radically paradoxical mix of both the sacred and the profane remained.

The earliest Christians gave little attention to Jesus' birth. They expected the Second Coming any day, and in any case viewed birthday celebrations as heathen. As the possibility of his imminent return faded, the faithful took a more historical perspective and began to search for evidence of the day or even season of Jesus' birth. They found no clues in the Gospels. Nor could they locate any other reliable sources to pinpoint his nativity. Undeterred, some placed his birth on May 20 and others on April 19 or 20. Clement, Bishop of Alexandria (died c. 215), nominated November 18. Hippolytus (died c. 236) calculated that Christ must have been born on Wednesday, the same day God created the sun. The De Pascha Computus, written anonymously in North Africa about 243, posited that the first day of creation coincided with the first day of spring, on March 25, and contended that Jesus' birthday fell four days later, on March 28.

Sometime in the fourth century of the Common Era, the Roman Church began to celebrate a Feast of the Nativity and to do so on December 25. A variety of issues influenced the decision. Internally, heresies plagued Church authority. Arianism, one of the most threatening, regarded Jesus as a solely human agent of God. The Church insisted on his divinity. By assigning him one human quality -- a birthday -- it appropriated some of Arianism's appeal, but sustained Jesus' place in the Holy Trinity.

The Church had also grown concerned about the increasing popularity of pagan religions and mystery cults in Rome. Each year beginning on December 17, the first day of Saturnalia, and continuing through Kalends, the first day of January, most Romans feasted, gamed, reveled, paraded, and joined in other festivities as they paid homage to their deities. The Church's alarm deepened when Emperor Aurelian, noticing that the pagan rituals had begun to converge around Mithras, the solar god, decreed in 274 C.E. that December 25, the winter solstice on the Julian Calendar, be kept as a public festival in honor of the Invincible Sun. Rome's Christians challenged paganism directly by specifying December 25, rather than some other date, as the day for their Nativity Feast.

Exactly when the Church of Rome began to keep Christmas, however, is not known. The first extant reference to the Feast of the Nativity may be as old as 336, in the earliest list of martyrs of the Roman Church. Perhaps Christmas was celebrated even earlier. Some scholars believe that Emperor Constantine (ruled 312-337 C.E.), who had converted to Christianity and built the Vatican atop the hill where the Mithras cult worshipped the sun, may have instituted the festival.

In any case, by the middle of the fourth century, the Church had boldly declared its Nativity holy day to be observed on the same day as the winter solstice. The concurrence of the two celebrations gave the Church an opportunity to turn elements of the Saturnalia itself to Christian ends. For example, it used the creation of the sun, the center of the Saturnalia, to reinforce and symbolize frequent scriptural and doctrinal imagery of God as the sun, and of Jesus' role as Son of God. The creation of Christmas was thus a measure of Christianity's growing power, challenging the crowds enjoying Saturnalian revelry to join the once secretive Christians in a celebration not of the birth of the sun, but rather the birth of Jesus, the Son of God.

The overlapping of Saturnalia and the Feast of the Nativity set the terms of all future debate over the Christmas festival. Its Christian aspects, at least in their most intense form, emphasized heavenly afterlife. The heathen elements absorbed into the festival affirmed life and exalted its annual renewal. The Church made no clear separations between the two perspectives. Instead, it layered profane activities with sacred ends to answer the needs, spiritual and physical, of the total person. This combination of sacred and profane made some religious leaders uncomfortable. For example, Gregory of Nazianzen (died 389) urged that "the celebration of the [Christmas] festival [be conducted] after an heavenly and not after an earthly manner" and cautioned against "feasting to excess, dancing and crowning the doors." Indeed, the paradox of purpose forged an enduring Christmas reality. As one historian succinctly characterized it: "The pagan Romans became Christians -- but the Saturnalia remained."

The custom of honoring Jesus' birth on December 25 quickly spread to the Eastern Church. By 380, Christians in Constantinople honored it as "Theophany or the Birthday." These Christians had once observed Epiphany, January 6, as a joint Feast of the Nativity and Baptism. This was the same date that popular legends held pagan gods made themselves known to humans. "Deep in the tradition of the Church's spirituality," writes John Gunstone, "was the idea that Christ's appearance in flesh was the consummation of all epiphanies." During the Christological controversies of the fourth and fifth centuries, the celebration of Epiphany spread westward, but the Roman Church, with its celebration of the Nativity set in late December and its emphasis on Jesus' incarnation and divinity, recast it to commemorate the adoration of the Magi. In Constantinople, Epiphany continued to consecrate Jesus' baptism, but the Eastern Church began to mark December 25 as the day of his birth. The dual celebration, that of birth and baptism, that had defined the old holy day ceased to exist.

Over the next thousand years, the observance of Christmas followed the expanding community of Christianity. By 432, Egyptians kept it. By the end of the sixth century, Christianity had taken the holiday far northward and into England. During the next two hundred years in Scandinavia it became fused with the pagan Norse feast season known as Yule, the time of year also known as the Teutonic "Midwinter." Sometime around the Norman incursion in 1050, the Old English word Christes maesse (festival of Christ) entered the English language, and as early as the twelfth century "Xmas" had come into use. From the thirteenth century on, nearly all Europe kept Jesus' birth.

The tension between the folk and ecclesiastical qualities of the holy day did not ease with the advance of Christmas-keeping. Documents of the Middle Ages, Tristram Coffin has noted, were "fat with decrees against the abuses of Christmas merriment," an indication "that people at large [were] doing just what they ha[d] always done and paying little attention to the debates of the moralists." Some clergy stressed that fallen humankind needed a season of abandon and excess, as long as it was carried on under the umbrella of Christian supervision. Others argued that all vestiges of paganism must be removed from the holiday. Less fervent Christians complained about the unreasonableness of Church law and its attempts to change custom. Yet the Church sustained the hope that sacred would eventually overtake profane as pagans gave up their revels and turned to Christianity.

These conflicts continued during the Protestant Reformation, but with little promise of resolution. In England, the Anglican Church repeatedly, but with little success, tried to gain control over the day. Its custom had been to begin Christmas on December 16 (known as "O Sapientia") and celebrate for nine days. But during King Alfred's reign (871-899 C.E.), a law passed extending the celebration to twelve days, ending on Epiphany.

Celebrants devoted much of the season to pagan pleasures that were discouraged during the remainder of the year. The annual indulgence in eating, dancing, singing, sporting, card playing, and gambling escalated to magnificent proportions. By the seventeenth century, under the reigns of the Tudors and Stuarts, the Christmas season featured elaborate masques, mummeries, and pageants. In 1607, King James I insisted that a play be acted on Christmas night and that the court indulge in games. One account of an evening's "moderate dinner" noted a first course of sixteen dishes. In 1626, the Duke of Buckingham found that the captains, masters, boatswains, gunners, and carpenters of three ships had abandoned their service in favor of Christmas revels, leaving their vessels prey to any enemy. In 1633, the four Inns of Court presented a masque, "The Triumph of Peace," at a cost of £20,000.

It fell to Puritan reformers to put a stop to the unholy merriment and to bend arguments over the proper keeping of Christmas into an older and more basic one -- whether there should even be an observance of the day. Defying the decision of the Anglican Convocation of 1562 to maintain the church calendar, the Puritans struck Christmas, along with all saints' days, from their own list of holy days. The Bible, they held, expressly commanded keeping only the Sabbath. That would be their practice as well.

In taking the offensive against Christmas-keeping, Puritans distributed colorful diatribes against the excesses of the holiday. Philip Stubbes's Anatomy of Abuses (1583) condemned revelous celebrants as "hel hounds" in a "Deville's daunce" of merriment. William Prynne's Histriomastix (1633) inveighed against plays, masques, balls, and the decking of houses with greens. "Into what a stupendous height of more than pagan impiety. . . have we not now degenerated!" he lamented. Christmas, he thought, ought to be "rather a day of mourning than rejoicing," not a time spent in "amorous mixt, voluptuous, unchristian, that I say not pagan, dancing, to God's, to Christ's dishonour, religion's scandal, chastities' shipwracke and sinne's advantage."

Even as Puritan condemnation of Christmas intensified, the economic and social upheaval of the late sixteenth and early seventeenth century had begun to alter English life. The standing social order, along with the paternalism of its manor system, was crumbling. Christmas, in its role as a part of the old structure, could not escape unscathed. In some years, the lavish celebrations lapsed. In many cases, the emphases of the holiday changed. It transformed, in the words of J. M. Golby and A. W. Purdue, into "a symbol for hospitality towards the poor, an understanding between the different levels of society, and happier and more prosperous times in now neglected villages." King Charles I (1625-1649) went so far as to direct his noblemen and gentry to return to their landed estates in midwinter in order to keep up their old style of Christmas generosity.

The rise of Oliver Cromwell's Puritan Commonwealth dealt another staggering blow to England's Christmas celebrations. Parliament outlawed seasonal plays in 1642. It ordered that the monthly fast, which coincidentally fell on Christmas in 1644, be kept. Parliament purposely met on every Christmas from 1644 to 1652. In 1647, it declared Christmas a day of penance, not feasting, and in 1652 "strongly prohibited" its observance.


Ministers who preached on the Nativity risked imprisonment. Churchwardens faced fines for decorating their churches. By law, shops stayed open on Christmas as if it were any regular business day.

Yet resistance was not uncommon. One year, protesting Londoners decorated churches and shops with swags of bay, rosemary, box, holly, privet, and ivy, only to watch the Lord Mayor and City Marshal ride about setting fire to their handiwork. The populace "so roughly used" the merchants who ventured to open shop in 1646 that the shopkeepers petitioned Parliament for protection. In Canterbury, when the Lord Mayor ordered that the markets be kept open that Christmas, a "serious disturbance ensued . . . wherein many were severly hurt."

It was within this particularly turbulent era that English Christmas customs entered early Virginia and New England. Most settlers and adventurers arriving in the New World welcomed Christmas as a day of respite from the routines of work and hardship. Some observed it, at least in part, as a holy day. Others attempted to feast. On Christmas, 1608, Captain John Smith and his men, having endured for "six or seven dayes the extreame winde, rayne, frost and snow" as they traveled among the Indians of Virginia colony, "were never more merry, nor fed on more plentie of good Oysters, Fish, Flesh, Wild-foule, and good bread; nor never had better fires in England." Maryland-bound passengers aboard the Ark in 1633 "so immoderately" drank wine on Christmas that "the next day 30 sickened of feve[r]s and whereof about a dozen died afterward."

Only Dissenters tried to ignore the holiday. The Mayflower Pilgrims, who arrived at Plymouth in December 1620, spent Christmas building "the first house for commone use to receive them and their goods." Within a year, however, the Pilgrims themselves had to face dissent. On the morning of December 25, 1621, less reform-minded newcomers to the colony "excused them selves and said it wente against their consciences to work on that day." Governor William Bradford allowed the "lusty yonge" Englishmen to rest, saying he "would spare them till they were better informed." But at noon he found them playing games in the street. Angered, Bradford told the frolickers that it ran against his conscience that they should play while others worked. If they desired to keep Christmas as a matter of devotion they should stay in their houses, he said, "but ther should be no gameing or revelling in the streets." Nor did the Puritans of Massachusetts Bay Colony observe Christmas. Governor John Winthrop entered nothing in his diary on his first Christmas in America in 1630, and in succeeding years he attempted to suppress the holiday.

In the early non-English settlements, sparse evidence points to a more traditional attitude toward the holiday. In 1604, for instance, French settlers of St. Croix Island, off the coast of Maine, held religious services and spent the remainder of Christmas Day playing games. In 1686 La Salle's French colony on Garcitas Creek celebrated what was probably the first Christmas in Texas. "[W]e first kept the Christmas Holy-Days. The Midnight Mass was sung, and on Twelve-Day, we cry'd The king drinks . . . tho' we had only Water...."

As the first settlements grew into more established communities, patterns of Christmas celebration peculiar to the colonies began to appear. Geographic separation from European homelands, the proximity of disparate religious and ethnic groups to each other, and the hardship of new beginnings disrupted old habits and holidays. In Dutch New Amsterdam, early in the seventeenth century, eighteen languages could be heard among the 500 or so inhabitants. Numerous Christmases abounded, persisting as an expression of individual heritages. In large towns, where various groups lived close together, the common ground for celebration could often be found in public and secular rather than in potentially divisive religious areas. Thus, Christmas, although widely celebrated, retained little importance in society as a whole precisely because of religious and cultural diversity.

Particularly in the middle colonies, a wide range of ethnicities and religions prevented a shared ecclesiastic and religious holiday. Pennsylvania Quakers scorned Christmas as adamantly as Puritans did. Huguenots, Moravians, Dutch Reformed, and Anglicans, who also lived in the colony, all kept Christmas in their own way. Shortly after Americans had won their independence, Elizabeth Drinker, a Quaker herself, divided Philadelphians into three categories. There were Quakers, who "make no more account of it [Christmas] than another day," those who were religious, and the rest who "spend it in riot and dissipation."

"Frolicking," the name many gave to this sort of boisterous Christmas and New Year's fun, could be found throughout the colonies. In the New England countryside, revelous intruders entered houses with a speech and swords at Christmas time. Far into the eighteenth century, masked merrymakers roved Pennsylvania's Delaware Valley "making sport for everyone." Southerners shot guns, a custom similar to one practiced in northern England.


The antecedents to this seasonal phenomenon have been traced to Roman times, when early Christians, seeking to ridicule pagan superstition and the Roman custom of masquerading, masked themselves on New Year's Day. Many, however, flagged in their intent and joined in the heathens' frolics. Church officials attempted to persuade members to desist, but failed. In time, even clergy could be found in full disguise, taking part in miracle and mystery plays performed during the Christmas season.

The convention of disguising, or mumming, and performing plays and skits dispersed throughout nearly all European countries. In England, beginning under the reign of Edward III (1327-1377), it became a form of royal entertainment. It peaked in the fifteenth and sixteenth centuries, when elaborate dress and formal presentations, such as Ben Jonson's Masque of Christmas and the masque in Shakespeare's Henry VIII, were the order of the season. Enthusiasm for court masques diminished thereafter, dampened by the Puritan Directory. But the tradition of masquerading and mumming continued to thrive in more rustic forms. In parts of England, householders, family, guests, and servants donned masks and painted their faces or darkened them with soot to become "guisers," "geese-dancers," or "morris dancers." Often they dressed as animals. Sometimes men and women exchanged clothes with each other. Disguised, they played crude tricks on one another, or went from house to house and entered without permission. There they might dance, sing, feast, and act "a rude drama," mocking propriety and challenging the social order.

American colonists engaged in similar antics, though usually without the performance of even a rudimentary play. They concentrated instead on disguises, noisy good humor, and chaotic peregrinations through neighborhoods. Across the land, revelers, almost always males, gathered to shoot off firecrackers and guns, parade with musical instruments, call from house to house in garish disguise, and beg for food and drink on December 25 and, in some places, on New Year's.

Such frolics, drawn from the custom of English Anglicans, as well as those of Swedish, German, and other settlers, were especially prominent in New York and Pennsylvania. Samuel Breck remembered maskers from his Pennsylvania childhood in the late eighteenth century. "They were a set of the lowest blackguards," he wrote, "who, disguised in filthy clothes and ofttimes with masked faces, went from house to house in large companies, . . . obtruding themselves everywhere, particularly into the rooms that were occupied by parties of ladies and gentlemen, [and] would demean themselves with great insolence." As the elder Breck and his friends played cards, Samuel had watched the mummers "take possession of a table, seat themselves on rich furniture and proceed to handle the cards, to the great annoyance of the company." He could only get rid of them by "giv[ing] them money, and listening patiently to a foolish dialogue between two or more of them . . ."

Usually an informal code regulated the mummers' reception. According to one set of rules, "the proper custom" had been to ask the uninvited guests "into the house and regale them with mulled cider, or small beer, and home-made cakes," or "give the leading mummers a few pence as a dole, which . . . they would 'pool,' and buy cakes and beer." One never "address[ed] or otherwise recognize[d] the mummer by any other name than the name of the character he was assuming."

In New York, the calling ritual varied slightly. Men had gone from house to house, firing their guns, on New Year's Day since "time immemorial." At each place, after being invited in for food and drink, the men of the household joined them. "[T]hus they went on increasing their numbers until the whole neighborhood had been saluted and visited. . . . " The remainder of the day the shooters engaged in contests of marksmanship and other sports. At least one, the "very barbarous amusement" of "Shooting Turkeys," required a keen eye and sharp betting skills.

The southern colonies, largely rural, unhampered by Quaker and Puritan dissenters, and with a comparatively less diverse white population, cultivated Christmases of a very different sort. Decentralized living, a dearth of women, and a high death rate kept the holiday at bay during the first decades of settlement. As social and political conditions stabilized, southerners began to look to England for models of dress, manner, and social behavior. Their Christmas, like that of the English manor, evolved as an interval of leisure rather than a set of rituals assigned to one particular day. During the season, Virginians, Carolinians, and Marylanders especially enjoyed dancing, but also engaged in card playing, cock fighting, nine-pins, and horse racing. Anglicanism, the established religion in most of the planting colonies, did not pressure its members into sacred observance.

While southerners may have aspired to recreate a sense of the English Christmas, its authentic reproduction eluded them. No pre-Revolutionary account mentions boars' heads or wassail bowls, mummers or waits. In England those traditions had been on the wane when John Smith first ventured through Virginia, and by the 1650s had been mortally threatened by Cromwell's Parliament. A French traveler, who along with his entourage of nearly twenty stopped unannounced at the Virginia home of Colonel William Fitzhugh in 1680, left one of the few accounts from the seventeenth century. "[T]here was good wine and all kinds of beverages, so there was a great deal of carousing," the visitor wrote. For entertainment, Fitzhugh provided "three fiddlers, a jester, a tight-rope walker, and an acrobat who tumbled around." When the travelers left the next day, Fitzhugh sent wine and punch to the river's edge for them and then lent them his boat.

By the middle of the eighteenth century, tales of Virginia Christmases had spread back to England and began to create an aura of romance around the South. "All over the Colony, a universal Hospitality reigns," London Magazine reported in 1746; "full Tables and open Doors, the kind salute, the generous Detention, speak somewhat like the old Roast-beef Ages of our Fore-fathers. ... Strangers are sought after with Greediness, as they pass the Country, to be invited."

Evidence of eighteenth-century Christmas celebrations is nearly as scarce as for the seventeenth. Best known is the Christmas chronicled by Philip Vickers Fithian, a Presbyterian tutor from New Jersey. Fithian spent a single Christmas season, in 1773, at Nomini Hall, a plantation owned by Robert Carter, one of the wealthiest Tidewater planters. The first sign of the season he recorded occurred on Monday, December 18; students barred one of Fithian's colleagues from teaching school until "twelfth-day" (January 6), a custom known throughout the British Commonwealth. However, Fithian continued to teach, noting proudly that his "scholars are [of] a more quiet nature, and have consented to have four or five Days now, and to have their full Holiday in May next. . . . "

Excitement built as the holiday approached. "Nothing is now to be heard of in conversation, but the Balls, the Fox-hunts, the fine entertainments, and the good fellowship . . . ," Fithian wrote on the 18th. His entry for Christmas Day began, "Guns fired all round the House," after which the various "Servants" who regularly attended him greeted him with "Joyful Christmas." He rewarded them with the expected small change and a donation to a "Christmas Box." As for Christmas dinner, Fithian noted that it "was no otherwise than common yet as elegant" as any he had ever attended. Not until the following Sunday, December 26, did he and the Carters go to church. The minister "preach'd from Isaiah 9.6 For unto us a child is Born &c. his sermon was fifteen Minutes long! very fashionable—," but few attended. Fithian reopened his school the following Wednesday, December 29. The holidays at Nomini Hall had ended.

Not all southerners partook of the sumptuous Christmases reported in London or witnessed by Fithian. Neither ritually exacting nor regularly held, the holiday on each plantation seemed to have its own style of celebration. In 1709, William Byrd began Christmas by attending church, where he "received the sacrament with great devoutness." Afterward, he dined on roast beef with friends and "in the evening we were merry with nonsense and so were my servants. . . . " The following year he spent Christmas quite differently, reading a sermon and dining alone. Thomas Jefferson rarely mentioned Christmas. George Washington frequently spent his holiday hunting and settling such year-end financial matters as renewing the terms of indenture for his servants, and attending church.

Perhaps, as Julian Boyd has suggested, the Enlightenment, which uprooted superstitions and redefined social classes, prevented a precise duplication of an English Christmas. Indeed, there may even have been some attempt to rationalize the Christmas festival. In December 1739, the Virginia Gazette briefly recounted a history of the holiday, noting that some Christians "celebrate this Season in a Mixture of Piety and Licentiousness," others "in a pious Way only," others "behave themselves profusely and extravagantly alone." The last category comprised the many who "pass over the Holy Time, without paying any Regard to it at all." The writer concluded that "On the whole, they who will be over-religious at this Time, must be pardoned and pitied; they who are falsely religious, censured; they who are downright criminal, condemned; and the Little Liberties of the old Roman December, which are taken by the Multitude, ought to be overlooked and excused, for an Hundred Reasons. . . . "

This broadly permissive approach to Christmas contrasted sharply with prevailing attitudes in New England. Like their forebears in England, the Puritan leaders of New England sought to expunge the holiday altogether. Their struggle betokened a broader battle against growing numbers of non-Puritans in the region and periodic intervention in religious affairs on the part of the Crown.

The entry of non-Puritans began at the founding of Plymouth and Massachusetts Bay colonies, and increasingly presented a problem as displaced English workers, many of them Anglican, bolstered the labor-short economies. At first, Puritans relied on what one historian called the "informal pressure of like minded co-religionists" to quell the observance of Jesus' birth. But this strategy proved inadequate. In 1659, in an atmosphere of tension over Anglicanism, other heresies, new trade, and general disarray, the Massachusetts Bay General Court banned the keeping of Christmas by "forebearing of labour, feasting, or any other way." The law aimed to prevent the recurrence of further, unspecified "disorders" which had apparently arisen in "seuerall places ... by reason of some still observing such Festiualls," and provided that "whosoeuer shall be found observing any such day as Xmas or the like . . . " would be fined.

Pressure from England contributed to the troubled atmosphere. All of the once forbidden holiday rites had begun to be practiced once again during the Restoration in Britain, in forms more extreme than before. As early as 1665, Charles II demanded that Massachusetts rescind its anti-Christmas law to reflect these changes. Finally in 1681, Massachusetts issued a repeal, citing as a reason that a ban on Christmas would be a derogation of the King's honor. Still, in 1686, Puritan militants barred newly appointed English Governor Andros from holding his Christmas services in their meeting house and forced him to move to the Boston Town Hall.

The renewed English fervor for the raucous excesses of Christmas began to wane almost as rapidly as it had revived, while New England's Puritan leadership gave little indication that it had gained much tolerance for the holiday. "[M]en dishonour Christ more in the 12 days of Christmas than in all the 12 months besides," wrote Increase Mather in his diary. He reiterated the case against Christmas in A Testimony Against Several Profane and Superstitious Customs Now Practiced by Some in New England, a tract published in England in 1687. "In the Apostolical times," Mather wrote, "the Feast of the Nativity was not observed. ... It can never be proved that Christ was born on December 25. ... The New Testament allows of no stated Holy-Day but the Lords-day. ... It was in compliance with the Pagan saturnalia that Christmas Holy-dayes were first invented. The manner of Christmas-keeping, as generally observed, is highly dishonourable to the Name of Christ."

Increase's son Cotton escalated the rhetoric against the holiday by making more explicit the fearful connection between Christmas and sin. He even linked it to Salem's witchcraft. "On the twenty-fifth of December it was," he wrote, "that Mercy [Short] said, They were going to have a Dance; and immediately those that were attending her, most plainly Heard and Felt a Dance, as of Barefooted People, upon the Floor. . . . " Mather later denounced the holiday in more general terms. "I hear a Number of people of both Sexes, belonging, many of them to my Flock, who have had on Christmas-night, this last Week, a Frolick, a revelling Feast, and Ball, which discovers their Corruption, and has a Tendency to corrupt them yett more, and provoke the Holy One to give them up into eternal Hardness of Heart."

Despite his strong tone, Cotton Mather did not forthrightly condemn Christmas itself. Like Bradford, who in 1621 had stopped the newcomers' street revelry, he expressed more concern for the liberties taken during the celebration of Christmas than for the fact of celebration. Calling the merrymaking an "affront unto the grace of God," he tacitly turned the question of "should" to one of "how" to hallow Jesus' birth. "Can you in your consciences think that our holy saviour is honored by mirth, by long eating, by hard drinking, by lewd gaming, by rude revelling, by a mass fit for none but a Saturn or a Bacchus, or the light of Mahometan Ramadan?" he asked. "Shall it be said that at the birth of our Saviour ... we take the time to please the hellish legions and to do actions that have much more of hell than of heaven in them?"

In all, Christmas became a point at which Puritan piety and autonomy grated against English custom, British authority, and Anglican influence. Bostonians, for example, openly repudiated Anglicanism by refusing to close their businesses on Christmas. "Carts come to Town and Shops open as is usual," Judge Samuel Sewall noted on December 24, 1685 (and nearly every year after). That same year Sewall smugly noted that he thought the British colonial officials were "vexed . . . that the Body of the People profane it [Christmas]," and thanked God that there was "no Authority yet to compell them [i.e. Puritans] to keep it." The Crown-appointed governor twice took Sewall aside in 1722 to discuss recessing the General Court on Christmas. Sewall opposed adjournment, but suggested (after a discussion with Cotton Mather) that the matter be voted on by the Council and Representatives. The governor took the opposite side, arguing that "All kept Christmas" except the Puritans. Provoked, Sewall responded: "the Dissenters came a great way for their Liberties and now the [Anglican] Church had theirs, yet they could not be contented, except they might Tread all others down." Ultimately, the governor ignored Sewall's entreaty and closed the court on Saturday until the following Wednesday, December 26.

Others besides the British government challenged the Puritans on Christmas. Holiday rituals in observing churches attracted a fair number of putative Calvinists at Christmastide. Ebenezer Miller, graduate of Harvard but recently ordained an Anglican, Sewall noted in 1727, "keeps the day in his New church at Braintrey: people flock thither." On another occasion he spoke to a Mr. Newman "about his partaking with the French church on the 25th of December, on account of its being Christmas, as they abusively call it." Congregational ministers countered by ordering fasts on Christmas Day and tried in other ways to show their disregard for the festival. One spent the Sunday preceding Christmas outlining his proof that the celebration of Jesus' birth was "Popery and prelatic tyranny, a destroyer of consciences."

In the end, whether slowly in New England or more rapidly in the middle colonies and the South, the forces of pluralism and the need for social harmony shaped and encouraged Christmas celebration. Yet its status as a holiday remained haphazard and varied widely. Like the colonies in general on the eve of the Revolution, regions and communities were as notable for their different approaches to the holiday as for their commonalities. It would take the project of nation-building in the wake of the Revolution to begin to define an American conception of Christmas.


Invisible Victims: Elites, the Steamroller, and the Spiral of Silence

Excerpts from chapters 8, 9 & 10: ‘The Spiral of Silence and the New McCarthyism,’ ‘Affirmative Action, the University, and Sociology,’ & ‘Elite Accommodation and the Flaws of Affirmative Action.’

Prior posts from this book: 1, 2, 3. Bibliography for all excerpts here.

From Frederick R. Lynch, Invisible Victims: White Males and the Crisis of Affirmative Action (Westport, CT: Praeger Paperback, 1991):

Chapter 8: The Spiral of Silence and the New McCarthyism

In the winter of 1982, a student whom I shall call "Bill" informed me after class that another faculty member, Professor X, was being called "racist" by a small clique of minority students. The charges had been voiced in a class taught by another faculty member and in various informal student gatherings.

"I respect Professor X," said Bill, "and I think what they're saying is bullshit. But I think he should know what's going on before it hits the fan. If he doesn't know, you should tell him."

A couple of days later, behind closed doors, I informed Professor X of the problem. He was stunned, saddened, and, not least of all, afraid. Professor X had received tenure the year before. But tenure was a thin shield against charges of racism and we both knew it. Initially, Professor X had absolutely no idea how such charges could have come his way. We were both aware of the irony of my informing him of such accusations, for I had voiced fears to him and to others that I might be called "racist" because of my research on affirmative action. (See "Doing Affirmative Action Research in California" at the end of Chapter 9.)

There was no formal complaint, no hearing before the dean or anything else. Just name-calling, which faded away. But the effects lingered on.

In 1988, I asked Professor X to re-assess the matter. He had no trouble recalling the event. He ascribed the incident to four minority students' misperceptions of his lectures on family structure and mobility. He had cited research in class which indicated that having large numbers of children at an early age -- especially out of wedlock -- was likely to inhibit upward social mobility. The four minority students, schooled in other classes against "blaming the victim," thought Professor X was doing just that and attacking their way of life. Not coincidentally, Professor X remarked, the accusing students had received low grades in previous courses they had taken from him.

This behind-the-scenes name-calling heightened Professor X's sensitivity. Already shy about mentioning matters of race to student audiences, he now rarely, if ever, mentions race as a variable in social research in his classroom. In spite of the fact that race and ethnicity often appear in the statistical data of his subject matter, Professor X prefers to "leave such matters to the text." When it comes to race, Professor X censors himself. Professor X is hardly alone. Most other social scientists and university faculty have felt the chill.

Self-censoring, "chilled" faculty reflect the power of ideological taboos that have dominated American intellectual discourse for nearly twenty years. Yet "chilled" and "taboos" are not precise enough to explain how intellectual thought in America has been self-censored. The steamroller metaphor set forth in the first chapter comes somewhat closer to capturing what has happened. But there are more precise concepts for understanding how people have perceived what they can and cannot discuss, regardless of actual public opinion.

Attempts to ignore or suppress Stanley Greenberg's research by politicians and the mass media suggest some degree of formal censorship. Far more powerful than formal censorship has been massive, informal censorship, which German sociologist Elisabeth Noelle-Neumann has aptly termed the "spiral of silence." The spiral of silence, in turn, must be linked with another reinforcing concept, cognitive dissonance. The operation of both of these processes has produced what is best described as a New McCarthyism.


Elisabeth Noelle-Neumann's spiral of silence theory attempts to explain how people may misperceive public opinion because one faction feels much freer than others to speak out in support of its views. The more vocal group's views, in fact, may not be the majority; nevertheless, they may become the majority, or considerably increase their numbers, because the populace perceives such openly expressed views as the majority and wishes to conform to the supposed majority view (1974; 1977; 1984).

Noelle-Neumann maintains that human beings are strongly attuned to the opinions of others. This sensitivity to public opinion is like a "social skin." Public opinion can be a potent source of social control in that most people find it more comfortable to conform to majority opinion. A long history of scholarship, coupled with more recent social psychological experiments, strongly suggests that people prefer to change their views rather than differ from majority opinion. As social beings, humans fear isolation.


Noelle-Neumann acknowledged her indebtedness to Alexis de Tocqueville's concept of the "tyranny of the majority." But Noelle-Neumann wished to see how a majority opinion may arise out of a minority viewpoint. The process may build in a spiral, that is, the perceived majority group becomes even more emboldened to speak out while those not so perceived progressively go mute. Hence the spiral of silence.

Noelle-Neumann's initial interest in the spiral of silence occurred in the context of German elections in the late 1960s. Public opinion polls showed public support evenly divided between the Social Democrats and the Christian Democrats. Yet a student of Noelle-Neumann quickly removed a "Social Democrat" button from her coat because she had encountered too much hostility. The student had falsely perceived that she was being labeled as a proponent of an unpopular cause. Though she had the support of at least half the public, the student was made to feel quite otherwise. Those who supported the opposition were vociferous in their support and behaved as if public opinion were with them. In describing the political confrontation between the Social Democrats and the Christian Democrats in Germany at the end of the 1960s, Noelle-Neumann noted that originally there was an even split between the two parties.

Those who were convinced the new Ostpolitik was right thought their beliefs eventually would be adopted by everyone. So these people expressed themselves openly, and self-confidently defended their views. Those who rejected the Ostpolitik felt themselves left out; they withdrew, and fell silent.

This very restraint made the view that was receiving vocal support appear to be stronger than it really was and the other view weaker. Observations made in one context spread to another and encouraged people either to proclaim their views or to swallow them and keep quiet until, in a spiraling process, the one view dominated the public scene and the other disappeared from public awareness as its adherents became mute. This is the process that can be called a "spiral of silence." (1984: 5)

I shall not and cannot statistically test Noelle-Neumann's spiral of silence concept as it applies to public opinion on affirmative action. Nevertheless, the concept fits much of the data contained in this book and is a useful, sensitizing tool in understanding public opinion on affirmative action.

Data on public opinion cited in Chapter 2 decisively demonstrate that the majority of Americans have been overwhelmingly opposed to affirmative action as quotas or preferential treatment. Yet interview data gathered here and elsewhere show that people perceive the pro-quota viewpoint as the majority viewpoint. Why? The spiral of silence suggests an answer.

It became obvious to the student interviewers and to me that we were cutting in on the spiral of silence simply by interviewing the thirty-two white males. It was sometimes even more obvious in background conversations with corporate officials or other corporate or government insiders. People were surprised to be asked about "the other side" of affirmative action. The vast majority of views they had heard or read were either pro-affirmative action or held that it was a non-debatable issue, which had already been decided. As Noelle-Neumann had described, they had detected what they thought was the predominant opinion on the issue and acted accordingly. Thus, those interviewed for this study (and those quoted in secondary sources as well) were guarded and hesitant in stating anti-quota views even to friends and co-workers. They did not know that anti-quota views have been the majority.

Affirmative action proponents have been able to accelerate the spiral of silence because they have been able to invoke collective guilt over past treatment of blacks in the United States. They have been able to target opponents as racists. The mass media were largely silent on affirmative action or took a pro-affirmative action stance. No wonder, then, most of those interviewed for this book felt their critical views of affirmative action were not held by a majority of Americans. Indeed, only two knew of public opinion polls on the issue at all, and only one of the two had an accurate idea of the results.

To compare a hypothetical contemporary situation with that which inspired Noelle-Neumann's interest in the spiral of silence: imagine an American college student wearing a button which stated "Get Rid of Affirmative Action Quotas." The results would be at least as provocative as those encountered by the German student who wore the "wrong" political button.


Cognitive dissonance has reinforced the spiral of silence on affirmative action. Cognitive dissonance occurs when two cognitive elements clash: two opposing attitudes, or an attitude and a contradictory situation (Zajonc, 1968). In the case of affirmative action, cognitive dissonance can easily arise when the values of equality are juxtaposed with the history of unequal treatment of blacks. The ideal of equality and the legacy of slavery and discrimination contradict each other. This conflict generates tension, embarrassment, and guilt in the minds of most Americans.

People have a tendency to try to assuage guilt and resolve dissonance. Therefore, egalitarian or compensatory programs aimed at redressing past discrimination have a natural appeal in restoring cognitive balance: the imbalance and guilt caused by discrimination against blacks in the face of American commitments to equality can be eased by efforts to make up for past discrimination. Obviously, cognitive dissonance has been a powerful tool for proponents of affirmative action of all types. This is one reason affirmative action debates almost always focus upon blacks rather than the other minority groups (Hispanics, Filipinos, Pacific Islanders, Asian/Indians, and so forth) who have quietly been included in many race-conscious programs.


Cognitive dissonance over past treatment of blacks is woven into nearly every argument for and against affirmative action from settings of private discourse to the professional literature. Evidence of this same tension and guilt surfaced in varying degrees with nearly all interviews with white males conducted for this book. Manning Greene specifically used the term "cognitive dissonance" in explaining the awkward and embarrassed response of his friends. Other subjects mentioned past discrimination against blacks but protested that they were not responsible for this and that the current generation of white males should not suffer for the sins of the past. And, as mentioned in previous chapters, there were the obligatory neutralizations: "I'm not a racist, but. . . ."

The effect of this climate of intimidation and guilt upon the expression of critical views on affirmative action cannot be overstated. As was seen in the previous chapter, this social-psychological setting was fostered by the mass media's neglect of affirmative action as an issue.

Noelle-Neumann maintained that the role of the mass media in fostering a spiral of silence is complex and crucial. The previous chapter demonstrated that the mass media ignored and avoided affirmative action as an issue, and downplayed or sought to neutralize the policies' impact on white males. Public opinion polls on the issue were not well publicized. Instead, the media have highlighted the contradiction between past treatment of blacks and the ideal of equality.


The post-World War Two phenomenon known as McCarthyism would seem to be a good illustration of the spiral of silence. Though Joseph McCarthy and the anti-communist movements he spawned never obtained the support of a majority of Americans, vocal anti-communist minorities nevertheless had enormous impact. Anti-McCarthy opinion was chilled by individual fears of being labeled "communist."

The social forces suppressing criticism of affirmative action for the past twenty years have been strikingly similar to processes underpinning 1950s McCarthyism. First and foremost is the climate of fear and intimidation generated by both McCarthyism and affirmative action. The steamroller metaphor fits both phenomena well. Name-calling and guilt-by-accusation were employed with deadly effects by both anti-communist groups in the 1950s and by pro-affirmative action people in the 1970s and 1980s. In the heyday of McCarthyism, people feared being labeled "communist" or "fellow-traveler." As we have seen in previous chapters, affirmative action critics have been fearful of being called racist for so much as raising questions about such policies. […]


A conflict perspective on both affirmative action and McCarthyism is that both phenomena were manipulated by political and economic elites. Michael Rogin has argued that elites also capitalized upon McCarthyism (Rogin, 1967). Rogin contended that McCarthyism was not an irrational, agrarian, populist, anti-elitist movement produced by "structural strain." It was not a revolt against the elite, for some elites backed McCarthy. McCarthy did have "real support at the grass roots," with Catholic Democratic workers and southerners. But McCarthyism drew much of its power from conservative business elites and conservative Southern senators. McCarthy and his supporters were countenanced by moderate Republicans and even supported to some extent by liberal Democratic elites.

Fred Cook also concluded that elites either backed McCarthyism or simply withered in the face of it:

On the highest levels of intellect and leadership, the abdication of responsibility was all but complete. The Democratic-liberal establishment, which should have provided a rallying point, virtually disintegrated before the first onslaught and sought to camouflage itself by riding to the hounds with the foe. (Cook, 1971: 18)


The virulent fanaticisms of the past were truly fringe movements; they did not have behind them the power and prestige of the respectable and influential in American society. McCarthyism was different. It had the all-out support of ultraconservative big business interests in rebellion against the twentieth century. (1971: 570)

If one substitutes "Republican-business establishment" for "liberal-Democratic establishment" in Cook's first paragraph, the paragraph aptly depicts the non-response to affirmative action. In the second paragraph, "affirmative action" can be substituted for "McCarthyism" and "ultraliberal political, academic, and mass media interests" for "ultraconservative big business interests." In chapter 10, it will be seen that a small number of institutional elites (notably in civil rights, the media, and government) promoted affirmative action, while other major elites acquiesced.

Whatever the role of elites in affirmative action and McCarthyism, the general parallels are strong. There is no denying the official bullying, the baiting and labeling, the manipulation of guilt and fear, the complicity of the media and institutional paralyses produced by both McCarthyism and affirmative action. Both were personnel screening practices designed with honorable intentions, though the means were less than honorable. Both processes could dissolve into crazes with lasting results: even today, employees of state universities in California must sign a "loyalty oath," a holdover from McCarthyist fervor. And both movements could run over victims who were too startled and disheartened to raise a protest while co-workers and friends looked the other way.


Chapter 9: Affirmative Action, the University, and Sociology

"It is an unlovely spectacle," columnist George Will has written.

White lawyers and editorial writers telling blue-collar whites that promotions or jobs or seniority systems must be sacrificed in the name of racial reparations. It calls to mind Artemus Ward's jest during the Civil War: "I have already given two cousins to the war, and I stand prepared to sacrifice my wife's brother rather than the rebellion be not crushed." (1985: 96)

Will might have added sociologists to the list of those instructing white males on the goodness of self-sacrifice on the altar of group equality. White males caught in the web of affirmative action are likely to receive scorn from sociologists and like-minded academicians.

Sociologists' antagonism towards the established middle classes comes somewhat naturally. In part, it stems from the clash between the collectivist assumptions of sociology and the individualist values of American society. But there are other inherent reasons as well. Peter Berger (1967) has pointed out that sociologists tend to be inherently distrustful; their studies may debunk respectable middle-class views of the world. Furthermore, sociologists value cultural relativism, a cosmopolitan view in which no culture is seen as inherently better than another. Since Berger's analysis, anti-capitalist and anti-bourgeois sentiments among sociologists have intensified.


In analyses of affirmative action, anti-middle-class biases slide easily into anti-white bias and rationalization of whites' individual injuries.

Sociologists are ill-disposed to see any genuine grievances among whites caused by affirmative action. Indeed, some sociologists have begun to posit a "new racism" amongst whites. With this newer, more sophisticated racism, white males excuse their economic and occupational failures by scapegoating blacks. Whites who express hostility to affirmative action quotas or minorities are dismissed as believers in antiquated individualism who do not understand the collective realities of stratification and the need for collective remedies.

James R. Kluegel proclaimed this view in the title of his article, "If There Isn't a Problem, You Don't Need a Solution: The Bases of Contemporary Affirmative Action Attitudes" (1985). Stanley Greenberg suggested the "new racism" explanation for white males' opposition to affirmative action in his comments for a Los Angeles Times article on "Racism Runs Deep" (March 8, 1987). Having described Greenberg's findings as showing that "dozens of white blue-collar workers around Detroit . . . blame their own hard times on blacks," the reporters quoted Greenberg as stating, "They [white males] feel the government that was supposed to protect them has instead given everything away to the blacks: Blacks get the jobs, blacks get the welfare; blacks get the loans. . . . They have no historical memory of racism, no tolerance for present efforts to offset it."

The reporters and Greenberg conveniently forgot to add: "present efforts" at whose expense?

Even Marxist-oriented conflict theorists cannot see the class and age lines of cleavage inherent in affirmative action. They see no legitimacy in the complaints of white males toward such policies. (When I discussed the interviewing for this study, a Midwestern sociologist sniffed with disgust at the suggestion that the reverse discrimination reports of white males might have any merit: "Of course, you realize they'll probably be alibiing." And when a famous Marxist criminologist visited the campus where I taught, he asked about the nature of my current research. When I informed him about my studies of white males and affirmative action, he dismissed the results of the study in advance with a one-liner: "I'll bet your white males were angry." He then changed the subject.)


Chapter 10: Elite Accommodation and the Flaws of Affirmative Action

How did a controversial social policy that lacked public support nonetheless become institutionalized? That has been a central question of this book. Chapter by chapter, a number of contributing factors have emerged: the individual and collective silence of the victims; the influence of sex-role behavior upon silence; the collective guilt and individual fear of being labeled racist; the elusive and capricious implementation of affirmative action policies; the ideological bias and self-censorship among the mass media and the social sciences; the spiral of silence; the New McCarthyism; and the craze-like behavior.

Affirmative action has evolved as a set of programs imposed from the top down, over and against public opinion. In this chapter, I wish to focus upon the question of how and why those in positions of power and influence -- the elites -- formulated or at least went along with affirmative action and other race-preference programs.

A standard starting point in examining the role of elites in American life is C. Wright Mills's classic study, The Power Elite (1956). Mills's framework, augmented by a recent attitude survey of elites by Verba and Orren (1985), provides a useful portrait of the role of elites in affirmative action policy.

In Mills's view, by the 1950s, a century of economic and political centralization had produced three major institutional hierarchies in American society: the large corporations, the executive branch of government, and the military. By the term power elite, Mills meant "those political, economic, and military circles which as an intricate set of overlapping cliques share decisions having at least national consequences. In so far as national events are decided, the power elite are those who decide them" (1956: 18). Legislative bodies such as the Congress, Mills maintained, had been reduced to the "middle levels of power."

Mills made clear that he saw no well-organized, coherent, unified ruling class. His was a structural analysis of power, not conspiracy theory. What unity existed amongst the elite was due to two factors. First, the elite was "composed of men of similar origin and education. . . . There were psychological and social bases for their unity, resting upon the fact that they were of similar social type and leading to the fact of their easy intermingling" (1956: 19). Second, the corporate, government, and military sectors were brought together by similar economic and structural interests, most notably, "the development of a permanent war establishment by a privately incorporated economy inside a political vacuum" (1956: 19).

Mills's analysis is useful for this study in that Mills emphasized that: (1) elites are structurally interrelated, and (2) power lies in decision-making. Those who are in the top "command posts" of the major institutional hierarchies can make decisions of enormous consequence with few, if any, democratic checks on such powers. Mills was one of the first social scientists to discern the emergence of centralized, administrative power and rule in society.

Indeed, administrative law is the fastest-growing body of law in the land. The powers wielded by top corporate and government administrators have expanded greatly since Mills wrote The Power Elite. So vast are these powers that Allan Bloom has stated that an "administrative state" has replaced politics (1987: 85). Were he alive today, I suspect Mills would concur. (On the importance of administrative law, see Vago, 1988.)

Mills's emphasis on the giant administrative power wielded by the heads of major corporate and government institutions can be supplemented by a more recent study of elites by Verba and Orren (1985). Fortunately, for purposes of this study, Verba and Orren were specifically concerned with the attitudes of elites towards equality. They collected data on attitudes toward equality from 2,762 institutional leaders in nine elite sectors: business, organized labor, farming, civil rights organizations, feminist groups, political parties, intellectuals, the mass media, and a sample of college seniors at ten elite colleges (1985). Unfortunately, Verba and Orren ignored elites in the executive branch and military, two of Mills's three major organizational hierarchies. However, it can be reasonably assumed that the elites in the military have remained conservative. As for elites in the federal bureaucracy, it has already been seen that they have tended to be quite liberal and activist on matters of affirmative action.

On the other hand, Verba and Orren's addition of elites in the mass media and the intellectuals as well as the "challenging elites" in civil rights and feminist organizations is overdue. (Verba and Orren also included labor leaders in their survey of elites. Mills contended that the labor elites' powers were derived from legitimation and protection by elites in government. In the case of affirmative action, we shall see that unions usually went along with corporate or government administrators. Only rarely did unions take action on behalf of white males.)

Verba and Orren confirmed Mills's views on the social backgrounds of elites. The elites were mostly white, male, and college educated. Leaders of civil rights organizations, however, were nearly all black and the feminist elites were mostly female. Yet homogeneity of background did not produce a unified outlook on equality or programs to remedy inequality. Verba and Orren found persistent polarities of attitudes between those elites aligned with the Democrats, on the one hand, and Republicans, on the other. Democrats saw poverty and racism as a result of "the system," while Republicans saw such inequalities in individual terms. Democrat-aligned elites favored more government intervention to assist the poor, but few wished to impose equality of results through the use of quotas. In choosing between more traditional equality of opportunity versus the newer and more radical equality of results, Verba and Orren found that the majority of leaders opted for the former:

American leaders on both the left and the right share the basic view that individuals may in fact deserve to be unequal in results, because of their own failures. The equality debate in America, therefore, is not over whether anyone really deserves to be at the bottom or whether the losers are always worthy of help from the government, but over whether those currently at the bottom are the ones who deserve to be there, and thus whether the government should assist them. (1985: 83)

Quotas for blacks and women in jobs and education were overwhelmingly rejected by all elites except those of civil rights organizations and feminists. Three out of four black leaders and a majority of feminist leaders supported the use of quotas for blacks in jobs and education. On quotas for women, blacks approved, but feminist leaders split down the middle. As for other liberal groups: "Democrats, youth, labor, and intellectuals, who blame the system for black poverty almost as much as black and feminist leaders do, nevertheless deplore the use of quotas" (1985: 84).

In addition, Verba and Orren found that the elites ranked national priorities differently, and most put race and gender equality towards the bottom of the list.

American leaders give different priorities to equality as a national goal, particularly to equality for women and for blacks. . . . Gender equality appears at or near the bottom of the ratings more often than any other goal. Half of the groups place it last, and everyone but feminist leaders, who rank it first, and youth place it among the three lowest categories. Equality for blacks does not fare much better. . . . Only blacks rate it near the top. (1985: 120)



Based upon Verba and Orren's work, affirmative action quotas can hardly be seen as the product of a consensus agreed upon by national leaders. On the contrary, most elite members opposed quotas, and the elites were split on a variety of related issues.

Yet there is no question of the imposed-from-above nature of affirmative action. Capaldi (1985) even likens the process to fascist-imposed "reform." What, then, was the role of elites in the affirmative action revolution?

Verba and Orren's study contains a major flaw common to attitude studies: the gap between attitudes and actual behavior.

Did elites behave in a manner consistent with anti-quota attitudes? The data presented in Chapter 3 strongly suggest that they did not. Especially in the years since Verba and Orren's survey (1976-77), corporate and political elites appear to have yielded with minimal resistance to quotas imposed by judges or federal agencies. More than that: corporations and government agencies have initiated their own affirmative action quota procedures. Representatives of the corporate and business communities have been filing friend-of-the-court briefs exclusively on the side of pro-affirmative action forces in the Supreme Court.



A fusion of economic and bureaucratic interests can be seen in the contemporary acceptance of affirmative action procedures.

The desire to "do something" about the minority poor gained initial impetus in the wake of searing urban disturbances -- Watts, Detroit, Newark -- in the 1960s. There was a sense of crisis and fears of economic and racial polarization (Jencks, 1985b). Hiring more blacks at General Motors and in government agencies in Detroit, for example, could clearly be seen by elites as a tactic for cooling urban unrest and bringing members of the underclass into the system.

Once affirmative action rules were formulated, organizations subject to equal opportunity regulations expediently accepted such plans and procedures as routine overhead expenses. Simply in terms of risk management, it has been easier to put up with affirmative action paperwork than to risk harassment by government agencies, lawsuits, and bad press. A given number of affirmative action hires, who do not perform on a par with non-affirmative action hires, can be tolerated (Jencks, 1985b; Murray, 1984a).

A case that illustrates all of this was Sears, Roebuck and Co.'s victory in a protracted struggle with the Equal Employment Opportunity Commission. Sears prevailed largely because the corporation had implemented its own vigorous quota plan in the 1970s.

Business and government entities have learned to live with, even embrace, affirmative action. The U.S. Chamber of Commerce, whose 1,800,000 members include thousands of small businesses typically hostile to government regulation, took no position in the Santa Clara case. Many business organizations filed amicus curiae briefs supporting the program and praised the Supreme Court ruling.

The reason for the business support is simple: a tough affirmative action program now can stave off expensive discrimination suits later. Sears, Roebuck and Co. has spent 15 years fighting charges by the Equal Employment Opportunity Commission, first in administrative proceedings, then in court. The trial alone lasted 10 months in 1984-85. The 806-store retailer refuses even to reveal how much it spent on lawyers, though it gives a hint. Expenses, including hiring statisticians, sociologists and other expert witnesses, came to $6.8 million and one can guess fees to attorneys were probably at least twice that much.

Sears finally won at trial -- the judgement is on appeal -- against a charge that its female employees tend to be concentrated in lower-paying sales jobs than men because of discrimination. The company's chief litigator in the case . . . specifically credits the court victory to a race-and-sex-conscious hiring program Sears started in 1968 for its work force, now at 300,000. "They have the best," declares Morgan. "Out of every two job openings at Sears they filled one with a black or a woman."

Governments, especially in large cities, similarly view affirmative action plans as a small price to pay for political harmony. (Insight, April 27, 1987: 9)

Yet Sears's victory was a long and costly one: twelve years and $20 million. "The transcript of the 10-month trial in the last case alone is 15 feet high. At one point, Sears employed 250 people full time to compile information, some of it dating back to 1960, to meet EEOC investigatory demands" (Weiner, 1986).

Many other corporations simply capitulated to EEOC rather than drag out negotiations and court contests. When EEOC won a landmark 1973 discrimination settlement with American Telephone and Telegraph, EEOC tried to run up the score with charges against General Electric Co., Ford Motor Co., General Motors Corp. and the United Auto Workers. GE settled in 1978 for $32 million for training programs, payments to employees and other steps. Ford's 1980 settlement cost $23 million, and GM's pact in 1983, with the UAW as co-signatory, is about $42.5 million. (Weiner, 1986)

Therefore, it makes economic sense to maintain in-house quota programs, even if a corporation must tolerate a number of unqualified or incompetent workers.

In 1985, the National Association of Manufacturers voted a fresh endorsement of affirmative action. Monsanto and DuPont also have strong affirmative action plans in place and like it that way. Personnel executive David Buchanan of the Ford Motor Company has stated, "Business decided that affirmative action is the right thing to do. It has become a way of life" (U.S. News and World Report, June 17, 1985: 67).

The use of race-and-gender quotas to ward off legal action by government agencies or lawsuits by members of minority groups had a catch until 1987: the counter-threat of reverse-discrimination lawsuits by white employees. However, in 1987, by a 6-3 decision in Johnson v. Transportation Agency of Santa Clara, the Supreme Court upheld preferential treatment for women based upon numerical under-representation. That decision seemingly released employers from the threat of reverse discrimination lawsuits. But the court's most recent decision, Richmond v. Croson (1989), again opened the door to legal redress by whites. In its Richmond decision, the 6-3 majority held that even the use of "benign quotas" likely violates the rights of white males. The court also cast doubt upon the use of any preferences where there was no evidence of intentional, past discrimination against specific minorities.

Nevertheless, as emphasized in the opening chapter of this book, white males very rarely win reverse discrimination lawsuits; the Bakke and Johnson cases have been the exceptions, not the rule. Accounts in this study and in the press suggest that the EEOC takes a skeptical view of reverse discrimination claims. Employers and educational institutions have undoubtedly become aware of this. It will take several more decisions similar to Richmond before the legal odds change. In the meantime, elites and their institutions clearly have the upper hand. And they know it.


Though most of those interviewed in the Verba and Orren survey rejected quotas and equality of results, there was a split in views on whether the system or the individual was to blame for poverty. Verba and Orren found that a "system-is-to-blame" belief concerning black and female inequality was shared by majorities of Democrat-inclined elites and by smaller factions among the other elites.

Verba and Orren's finding of a split in the system-is-to-blame philosophy only partially confirms Charles Murray's more sweeping description of a transformation of elite wisdom in the 1960s. Murray argued that the shift moved elite thinking from blaming the victim to blaming the system (1984b). Verba and Orren's findings indicate that a blaming-the-system outlook gained a sufficient number of vocal, elite adherents who intimidated opponents of race-conscious remedies and inhibited organized opposition: the spiral of silence.

To draw a parallel between the behavior of elites and the white males interviewed for this study: it is probable that elites have been ignorant of -- or have misperceived -- the consensus in public opinion opposing the use of quotas. This is especially likely since, according to Verba and Orren, most of the elites ranked equality for blacks and women as a low-priority issue. Issues of low importance typically do not merit much attention and analysis. Since elites did not devote much attention to equality issues, it is reasonable to posit that, when they did, they assumed the vociferous minority and women's groups -- through a quietly sympathetic mass media -- were articulating popular views. Thus, it is likely that Elisabeth Noelle-Neumann's concept of the "spiral of silence" applies equally well to attitudes amongst elites as to attitudes amongst the public at large.

The twin forces of the spiral of silence and the New McCarthyism would have much greater impact among elites than among rank-and-file citizens. A major corporate or government leader obviously has a great deal to lose from being tagged as a "racist." The stigma might attach not only to the person but also to the corporation. (One personnel officer interviewed as a background source for this study admitted that he did not discuss affirmative action outside the corporation because he feared that a slip might get not only him but his corporation in trouble.)

In his discussion of Congress's behavior toward affirmative action, sociologist Nathan Glazer implied a spiral-of-silence situation. "In Congress a point of view that may well reflect the opinions of a minority holds sway. The protection of affirmative action is in the hands of Congressmen who care, reflecting the views of civil rights organizations; most others stay away from the issue" (1988: 107).

To be exposed to charges of "racism" is terrible public relations for a corporation or government agency. There is little doubt that corporate and government elites remain wary of such labels and judge that going along with affirmative action, whatever the attendant costs, is less costly in terms of the public's goodwill than being labeled racist or sexist.

Elite accommodation to affirmative action preferences, then, can be explained in terms of good public relations, coupled with a desire for stability and "industrial peace" with government agencies and minority/women's groups. Elites are also aware that the odds of legal action by individual white males are small and that the chances of a white male winning a reverse discrimination lawsuit are even smaller. There has been no organized recognition of white males' grievances by major political groups. Following affirmative action preference directives, then, has been the line of least resistance.

Omitted from this equation were the damage done to white workers and the continued neglect of serious legal and sociological flaws in affirmative action.



Most affirmative action debate has been centered on the black/white dimension, far less often on the male/female dimension. The increasing list of preferred non-black minorities also has been ignored by the social sciences and the print and television media. A strategic reason for focus upon blacks has been the obvious history of discrimination and deep sense of national guilt. The black/white issue is thus the strongest possible case for affirmative action advocates. Even affirmative action critics wind up fighting proponents on this emotionally charged turf (see, for example, Glazer, 1975; Murray, 1984a, 1984b; Capaldi, 1985; Bloom, 1987). High-visibility debate concerning affirmative action for non-black minorities is next to nonexistent.

Contrary to Nathan Glazer's recent reappraisal of affirmative action (Glazer, 1988), many non-black minorities have been, and continue to be, included in affirmative action programs: the numerous Hispanic groups (Mexicans, Cubans, Puerto Ricans, Central Americans, and so forth), Pacific Islanders, American Indians, Asian/Indians, Filipinos, Vietnamese, Cambodians, and homosexuals.

Why have so few people asked directly: on what basis do these groups ground their claims for redress through quotas, preferential treatment, or proportional representation? On what basis do we include some groups and exclude others? Especially when many, if not most, of these groups have only recently arrived in significant numbers in the United States?

The omission of non-black minorities from discussion of affirmative action becomes even more extraordinary in view of recent immigration history: by including non-black minorities, one obtains a curious immigration-with-preference paradox.

Most of the government-preferred non-black minorities were present in the United States only in relatively small numbers before 1970. Historical measurement of the Hispanic population is plagued with definitional problems in census procedures, especially prior to 1970. But the numbers of Hispanics in this nation were relatively small until the recent large-scale and sustained immigration from Mexico and other Hispanic nations (Moore and Pachon, 1985: 24). […]

Aside from those of Chinese and Japanese ancestry, most Asians in this nation are foreign-born: of the 2,539,800 persons living in the United States who were born in Asia, 47 percent immigrated here in the period from 1975 to 1980; 22.4 percent from 1970 to 1974; 12.0 percent from 1965 to 1969; and 6 percent from 1960 to 1964. The remaining 12.5 percent immigrated here before 1960.

Another example: for U.S. citizens born in India, 43.7 percent migrated here during 1975 to 1980, and another 33.1 percent from 1970 to 1974.

The point of all this is obvious: why should recent immigrants be given preference over native-born American citizens? What does the United States owe these newly arrived immigrants?

This curious immigration-with-preference paradox, in fact, becomes more astonishing when one realizes that the situation can operate -- and probably has operated -- on a same-day basis: a non-citizen, either legally or with forged credentials, can cross the border and immediately become eligible for affirmative action preference. Incredibly, the immigration-with-preference paradox rarely, if ever, surfaced in the public debates surrounding recent immigration reforms.

Employers have recently discovered that the new immigration act has placed them in something of a dual dilemma: they may be fined for hiring illegal aliens but sued for discrimination if they refuse to hire a worker with dark skin. On the other hand, especially when immigrants may work for lower pay than citizens, affirmative action might provide a useful device to give preference to legal or illegal immigrants and, thereby, fulfill affirmative action quotas at the expense of citizens. Only rarely has this phenomenon been examined. […]

This emperor-has-no-clothes quality regarding group preferences for non-black ethnics leads one to wonder just how social scientists have avoided the issue. […] Most social scientists and policy makers have ignored the multiplication of "protected groups" in the United States.

Thomas Sowell has wryly observed that the number of "protected groups" in the United States has multiplied to the extent that 70 percent of the population is covered by some form of affirmative action (1985: 14).

None of the white males interviewed for this study, nor any of the personnel or affirmative action officers with whom I conducted background discussions, had even thought of the immigration-with-preference dilemma. No one knew quite what to say when I raised the issue.