
Wednesday, October 5, 2011

Research Paper: Physical Development In Elementary School Age Children


Strength and coordination are two areas of physical development that seem almost to take care of themselves, which may be why they are often overlooked when curriculum planning is under way in many early childhood settings. After all, children are going to get stronger as they grow older (true); they will also become more coordinated as they grow older (also true). Unfortunately, if left to chance, children may not reach their full potential in either of these critical areas of physical development. Short of weight lifting and coordination drills, what can we do to develop children's abilities? Plenty! Young children need overall strength so that they can participate in a wide variety of activities, derive pleasure from those activities, and gain the confidence and strength to do things--particularly new things. Coordination is another ability that begins developing on its own, as infants begin to explore their bodies and their world. By coordination, we mean a series of movements, organized and timed to occur in a particular way, that bring about a particular result.

When we start thinking about and planning for strength and coordination in young children, we have to realize that, like all developmental issues, there are going to be individual differences and, in general, development is going to happen at its own rate. You cannot make development happen; you can only support it by creating the right environment for each child as he reaches a particular point on the developmental continuum. Check out your preschool playground equipment and make sure that there are other sorts of climbing activities besides playing on steps and slides. You can add tires of different sizes, placed in various patterns on the ground for "follow-the-leader" fun. Hang climbing ropes from sturdy tree limbs or swing-set frames to encourage upper-body development through climbing. (Make sure you monitor this activity closely and take the ropes down when playtime is over.) You can also tape a sand pail to each end of a 36-inch wooden dowel and have children carry different amounts of sand, water, or rocks from one place to another.
Ladders and slides aren't challenging enough for children in this age group. To add strength and coordination development opportunities, tie a one-inch natural fiber rope horizontally between two trees, about 54 inches above the ground. Let children monkey-swing on it, or ask them to try to travel hand-over-hand as far as they can along the rope. (Again, monitor this activity closely and take the rope down when playtime is over.) The next time you purchase riding toys for your wheeled-vehicle path, look for those that are hand-powered, not foot-powered. Whatever you do, don't let the children be limited by equipment. Be creative as you look for ways for children to lift things or themselves--you may be surprised at what you'll find!


Tuesday, October 4, 2011

Term Paper: Women’s Rights

In 1963, the New York Court of Appeals disrupted an entrenched male privilege in the New York Police Department (NYPD) when it cleared the way for policewoman Felicia Shpritzer and her peers to take the promotional examination for police sergeant. Yet although Shpritzer's case against the city removed a longstanding barrier to expanded roles for women, it did not argue that women were men's equals. Instead, the plaintiffs argued that supervisory duties were irrelevant to gender. Although their campaign succeeded in putting women in supervisory police roles and indicated a changed playing field favorable to women in the 1960s, the victory was imbued with greater ambiguity than David's slaying of Goliath. Policewomen advocates in the early 1960s continued to struggle with the liberating and debilitating potential of domestic ideals.
Women claimed that they were the ideal complements to male officers. Rather than embracing the democratic ideal of equal opportunity as Black men did in challenging their exclusion from police jobs, Black and White women pursuing expanded roles in the 1940s, 1950s, and early 1960s, more often than not, defined themselves as the alter ego of the patrolman. If the model male officer was physically imposing, combative, and heroic, they asserted, women officers would be nurturing, motherly, and protective. Similar to women making inroads into other areas of employment in this period, these women justified their new roles as a fulfillment of their feminine duties. World War II era concerns about social hygiene, morality, and female delinquency, as well as postwar concerns about delinquent boys and girls, provided women with a gendered wedge for working in the male space of New York's “mean streets.”
Policewomen advocates incrementally encroached upon the male domain of patrol work by illustrating the relevance of feminine skills such as prevention, sensitivity, communication, and child protection. By the beginning of the 1960s, the increasingly similar nature of policewomen's and policemen's duties, despite their quite distinctly gendered justifications, led some women to question the validity of the separate job titles, “policewoman” and “patrolman.” The problem was that the femininity campaign for policewomen advocates, which had been so successful in expanding the duties of women's work in law enforcement, further entrenched the rhetorical boundaries between policewomen and patrolmen. Although individual women may have privately embraced radical equality in gender roles, few policewomen advocates before the mid-1960s eschewed the enabling languages of motherhood and domesticity.
Race further complicated the realization of women's equal treatment in the NYPD. Black women were doubly discriminated against, and soon discovered that they could not always enjoy the protection of domestic ideology. Instead they found themselves assigned to dangerous undercover work. Formulating strategies for access to privileged jobs held by men in the police department was not as simple as choosing between arguments about women's likeness to or difference from men. It required a sorting out of racialized conceptions of gender and gendered conceptions of race.

NYPD officials employed race and gender together to structure the way they policed the city. Women were neither able to patrol certain areas of the city nor were they eligible to advance to sergeant, lieutenant, or captain. However, they could make lateral moves to specialized detective units, which often involved more challenging, albeit dangerous, tasks. NYPD officials frequently encouraged Black women to pursue such jobs because they could purportedly infiltrate dangerous neighborhoods without being detected. Also, supervisors seemed to believe that Black women could handle themselves in violent situations, while White women could not. One woman reported, “you might be asked to do something White women wouldn't be asked to do. When a White sergeant was looking at me, he wasn't looking at his mother or his sister. He might send me in a hallway or roof but he would never send a White young lady.” Whether consciously or not, White male supervisors were more protective of White policewomen.
The exclusion of most White women from dangerous detective duties reveals much about what White NYPD officials, New York politicians, and other citizens believed was the proper place of “true ladies” in the 1950s. Women were to partake in preventative police work rather than crime fighting. In exploring the ideology that undergirded this assumption, it is important to lay out the historical duties and idealized roles of policewomen in this period to illustrate how they have shifted over time. Of particular import is how these roles changed during World War II and the immediate postwar era. Despite the confining rhetoric of 1950s domestic ideology, policewomen were able to manipulate that language to their advantage. Leaders of the policewomen's movement learned that they could incrementally expand the sphere of policewomen's duties by invoking a traditionally feminine ideal in which policewomen would be responsible for the domestic welfare of the streets.

“Sober, Respectable, Women!” Police Matrons At The Turn-Of-The-Century
Advocates for women's rights embraced essentialized arguments about women's natures to provide an opening for women in police work. In New York and other cities in the late nineteenth century, women's organizations such as the Women's Prison Association, the National League of Women Voters, the American Female Guardian Society, and the Women's Christian Temperance Union encouraged social reform that would allow women to serve as matrons to care for female prisoners detained in police precincts. Before the 1890s, male officers, their wives, or the maid of the police station searched female prisoners (Brown 3). Of particular concern to women's groups was the potential for sexual abuse of female detainees. To prevent such misuses of power, reformers asserted that women, as bastions of virtue, should have a hand in police work to ensure the public good and welfare (Melchionne iv).

Male prison workers, who feared that women workers would displace them, retaliated by portraying the prison as a place of degeneracy that was unfit for women workers. Agreeing with the premise of women's clubs that women were naturally virtuous and moral, but manipulating that cultural value to contest reform within the police station, the Men's Prison Association argued that “a decent sober woman could not search a female alcoholic because she would be contaminated and demoralized by her contact with such depraved creatures” (qtd in Cohn 43). In the 1880s, Commissioner Vorhis opposed the use of women as police matrons because he thought that the job was too physically demanding for them. Policewomen in the 1900s, hearing similar claims from male patrol officers, would note that such objections were based on a fear that women would displace men in a field that had hitherto been considered a masculine domain (Cohn 43). In response to such claims, women's groups promised to ensure that candidates for the job of matron would be “of good moral character” and that they would secure letters of recommendation from 20 “respectable” women before they were appointed (Melchionne 37). Most men and women agreed that women ought to be models of virtue and morality, but disputed the implications of that outlook for women's employment.
The impingement of police matrons upon the terrain of male police work was neither understood nor promoted by women's activists as a case of women performing men's jobs. Rather, they saw themselves redefining the nature of a small aspect of police work as feminine. From their point of view, moral norms dictated that it was appropriate for women to care for women prisoners. They did not wish to challenge the conventional wisdom that men and women were physically, intellectually, or emotionally different. Indeed, it was this very difference that justified the limited incorporation of women into the station as matrons. Women's roles in police work in the NYPD were relatively static until World War I, and this pattern would hold for women aspiring to police work in the future: individuals could stretch the boundaries of their duties under creative titles and gain access to more interesting and challenging tasks. One exceptional case was Isabella Goodwin, who worked under the title of matron at the Mercer Street Station from 1895 to 1912. While serving as a matron, Goodwin made a number of shrewd observations about women prisoners, which led to a supervisor's suggestion that she try her hand at detective work. Her supervisors quickly realized that a woman could go undetected while investigating criminal activities, and so Goodwin was assigned detective duties. She gathered evidence against fortunetellers, investigated banking scams and extortion rackets, and exposed fake medical practitioners. When Commissioner Dougherty learned of her work, he appointed her to the position of Detective First Grade, formalizing her position and more than doubling her salary. While Goodwin was exceptional, it was common for police departments to use matrons in capacities other than guarding female prisoners (Segrave 11). It was obvious to police officials that some tasks were more suitable for women.
However, like other police matrons and their advocates, Goodwin remained guarded about her femininity, noting that her success in police work was due to her “women's intuition,” and that it took a toll on her work at home, where “a woman's duty is first and foremost.” Reifying her domestic primacy assured Goodwin protection from potential critics of her “public” work as a police detective.
More public roles in police work became possible for increasing numbers of women due to the physical and social dislocations of men and women during World War I. The war created the conditions under which women could make new claims about their relevance to police work. Three significant and interrelated phenomena led to these conditions: the number of men fighting overseas, the vacancy left in traditional men's jobs, and the social stresses on families due to absent parents. The incorporation of women into industry compounded the difficulties of family organization created by the heavy recruitment of men into the armed forces. Of particular concern to some Americans during and immediately after the war was the perceived decline in public morals and waywardness of America's youth. Many citizens feared that the concentration of young men at the new military recruiting centers posed dangers to vulnerable young girls. Since most politicians already viewed women as the guardians of public virtue, it made sense to solicit their help in this morality crusade.

The NYPD defended its use of women as auxiliary police reserves by pointing to women's successful substitution for men in other industries and political and social spheres. It was also a matter of a labor shortage. The city had already organized a police reserve of male officers to replace the soldiers, but by May 1918, it was clear that they were still understaffed. In order to fill the vacancies Mayor John Hylan organized the Committee of Women on National Defense, which established a small unit of women as “protective officers.” Special Deputy Commissioner Rodman Wanamaker, head of the Police Reserves, constructed a tenuous defense of the newly hired women. He said, “New York women have the vote,” and therefore “they should have an active part in enforcing the laws.” Despite such treatments of women as potential equals, Wanamaker justified the department's inclusion of them by focusing on their responsibilities for problems relating to youth and sexuality. Furthermore, he made it clear that their service was to be both “temporary and voluntary.” Wanamaker divided the city into zones that included a number of precincts, and assigned policewomen to each zone to patrol and look after the welfare of young girls who might be found in the company of men in secluded places such as parks and beaches (Melchionne 95). Although Wanamaker granted women new crime-fighting responsibilities, he reserved for men the more critical work of confronting “rough and violent lawbreakers.” As their title suggested, protective officers were assigned to do preventative rather than punitive work. Although vested with the power to do so, the more than 5,000 recruits made no arrests, contenting themselves with reporting to their superior officers only the most flagrant cases of disorder among soldiers. Fulfilling their duties as moral guardians, women protective officers scouted the streets, parks, camps, armories, recruiting stations, dance halls, motion-picture theaters, and amusement parks.
Likewise, they conducted investigative work only in places where young girls might be exploited: furnished rooming houses, places of questionable employment, restaurants, and railroad terminals (Melchionne 58).



Monday, October 3, 2011

Essay: The Child’s Brain


A child's brain is a magnificent engine for learning. A child learns to crawl, then walk, run, and explore. A child learns to reason, to pay attention, to remember, but nowhere is learning more dramatic than in the way a child learns language. As children, we acquire language -- the hallmark of being human. In nearly all adults, the language center of the brain resides in the left hemisphere, but in children the brain is less specialized. Scientists have demonstrated that until babies are about a year old, they respond to language with their entire brains, but then, gradually, language shifts to the left hemisphere, driven by the acquisition of language itself. But if the left hemisphere becomes the language center for most adults, what happens if in childhood it is compromised by disease? Brain seizures, such as those caused by epilepsy and Rasmussen's syndrome, have a devastating effect on brain development in some children.

Minor developmental delays early in life, like beginning to walk later than average, may forecast alcoholism, according to a new study. The authors suggest that such problems with early childhood brain development may in fact contribute to the disease.
The brain's cerebellum plays a crucial role in motor development and the control of fine, coordinated movements such as walking and playing musical instruments. Some researchers have proposed that the region is also involved in impulse control and that a dysfunctional cerebellum may therefore predispose to addiction. This theory led pharmacologist Ann Manzardo of the University of Kansas Medical Center in Kansas City, Kansas, to ask whether variations in early motor performance might predict alcoholism later in life. To test the theory, Manzardo's team analyzed data from a well-known Danish alcoholism study that followed 330 baby boys — two thirds of whom had alcoholic fathers — through their 40s. Looking at motor development and the frequency of alcoholism in the subjects at age 30, Manzardo and her team discovered that 77% of the alcoholics had not yet been able to walk at 12 months of age, compared to 43% of nonalcoholics. Because the cerebellum is involved in motor development, Manzardo says the region may be an additional marker for alcoholic tendencies. As such, she says, "we need to focus more on early childhood brain development to see if there are contributing factors to the development of alcoholism."

Sunday, October 2, 2011

Research Paper: Effects Of Sleep Deprivation

Literature Review - Effects Of Sleep Deprivation

Normal, healthy individuals need adequate sleep for optimal cognitive functioning (Himashree et al., 2002). Without adequate sleep, humans show reduced alertness (Penetar et al., 1993) and impairments in cognitive performance (Thomas et al., 2000, 2003). Prolonged sleep deprivation is associated with decrements in elementary cognitive abilities such as vigilance and sustained attention (Doran et al., 2001; Wesensten et al., 2004), as well as impairments in complex, higher-order cognitive processes such as verbal fluency, logical thought, decision making, and creativity (Harrison & Horne, 1997, 1999, 2000). In occupational settings such as aviation, air traffic control, and sustained military operations where constant vigilance is a necessity, extended periods of sleep loss have been associated with catastrophic accidents (Mitler et al., 1988) and may have been a factor in some friendly fire incidents (Belenky et al., 1994). Studies of sleep-deprived individuals show that errors in attention begin to emerge by 19 h of continuous wakefulness (Russo et al., 2004) and cognitive performance declines at a rate of approximately 25% for each 24-h period of wakefulness (Belenky et al., 1994).

Sleep deprivation produces global decreases in cerebral metabolism and blood flow, with the greatest declines evident in those regions critical for higher order cognitive processes (Thomas et al., 2000). These regions, the heteromodal association cortices, are associated with attention, vigilance, and complex cognitive processing, and reductions in activity within these regions are associated with decrements in these higher-order cognitive processes (Mesulam, 1999). As global blood flow and metabolic activity decline during prolonged periods without sleep, the brain appears to compensate by recruiting cognitive resources from nearby brain regions within the prefrontal and parietal cortices in order to maintain cognitive performance at acceptable levels (Drummond et al., 2001). Some evidence suggests that these compensatory activities may be particularly prominent within the right cerebral hemisphere (Drummond et al., 2001). Consistent with these reports, other studies suggest that cognitive processes mediated by the right hemisphere are more adversely affected by sleep deprivation than those mediated by the left (Johnsen et al., 2004; Pallesen et al., 2004).
Neuropsychological evidence suggests that the right cerebral hemisphere is dominant for attentional processes (Heilman & Van Den Abell, 1980; Mapstone et al., 2003). Much of the evidence supporting the dominance of the right hemisphere in attention comes from studies of patients with unilateral brain damage (Heilman & Van Den Abell, 1980; Weintraub & Mesulam, 1987). Lesions to the right cerebral hemisphere are more likely to produce contralateral hemispatial neglect than similar lesions to the left hemisphere (Behrmann et al., 2004; Mapstone et al., 2003; Mesulam, 1999). Further evidence supporting the prominent role of the right hemisphere in attentional processing comes from several functional neuroimaging studies that reveal greater right hemisphere activity in response to tasks requiring allocation of spatial attention (Fink et al., 2001; Macaluso et al., 2001). The accumulating evidence suggests that the left cerebral hemisphere allocates its attentional processing predominantly toward the contralateral (i.e., right) hemispace, whereas the right hemisphere appears to distribute attentional processing more equally between both hemispaces, and is therefore considered dominant for attention (Mesulam, 1999). Consequently, the phenomenon of contralesional neglect occurs nearly exclusively following lesions to the right hemisphere.
Given the apparently greater role of the right hemisphere in attentional processing and the preliminary evidence that the cognitive processes mediated by the right hemisphere may be more sensitive to the detrimental effects of sleep deprivation, it was hypothesized that prolonged sleep loss results in greater impairment of right hemisphere visual attention mechanisms oriented toward the contralateral (i.e., left) perceptual hemispace. Participants were assessed several times each day while remaining awake for 40 h. During each 15-min testing session, participants monitored a 150° arc of lateral visual space for periodic occurrences of brief flashes of light while simultaneously performing a continuous serial addition task.
Adequate sleep is important for both good mental and physical health. Poor sleep quality is a significant predictor of depressed mood (Mendlowicz, Jean-Louis, von Gizycki, Zizi & Nunes, 1999). Sleep deprivation has been shown to worsen depressive symptoms in some individuals (Benedetti, Zanardi, Colombo & Smeraldi, 1999; Beutler, Cano, Miro & Buela-Casal, 2003) and increase disturbed mood (Crabbe, 2002; Dinges et al., 1997). Sleep deprivation can also result in increased anxiety (Miro, Cano-Lozano, Espinosa & Buela-Casal, 2002), fatigue, confusion, and tension (Dinges et al., 1997). Furthermore, sleep deprivation affects mood to a greater degree than either cognitive or motor performance (Pilcher & Huffcutt, 1996). Regarding physical health, poor sleep quality and sleep loss are associated with decreased immune function (Cruess et al., 2003; Irwin, 2002), the pathophysiology of cardiovascular disease and diabetes (Roost & Nilsson, 2002), and also the development of overweight/obesity (Agras, Hammer, McNicholas & Kraemer, 2004).

Sleep deprivation also influences food consumption in studies of animals, although these studies have shown some conflicting results. For example, studies with rats have shown that sleep deprivation may lead to overeating (Brock et al., 1994; Tsai, Bergmann & Rechtschaffen, 1992). On the other hand, Johansson and Elomaa (1986) found a reduction in the amount of food consumed by rats when deprived of rapid eye movement (REM) sleep. In addition, some studies also demonstrated that sleep deprivation disturbs the light/dark eating pattern in rats rather than simply increasing or decreasing food intake (Elomaa, 1981; Martinez, Bautista, Phillips & Hicks, 1991). Overall, sleep deprivation seems to alter eating patterns among animals. There are relatively few studies on the effects of sleep on food consumption or food choice in humans, but several pieces of indirect evidence exist to suggest a link between sleep and food consumption. Hicks, McTighe and Juarez (1986) found that short-sleeping college students (e.g., 6 h per night) were more likely to eat more small meals or snacks than long-sleepers who averaged 8 h or more of sleep per night. There is also evidence showing that individuals with eating disorders display abnormal sleep patterns. For example, Latzer, Tzischinsky, Epstein, Klein and Peretz (1999) found that women with bulimia nervosa reported more difficulty falling asleep, more early waking, more headaches on awakening, and more daytime sleepiness than women without bulimia.

Additional evidence for an association between sleep and eating comes from studies of the hypothalamic pituitary adrenal (HPA) axis stress hormone cortisol and other studies of psychosocial stress. There is a negative association between amount of REM sleep and cortisol levels (Lauer et al., 1989) and a positive association between cortisol levels and calories consumed (Epel, Lapidus, McEwen & Brownell, 2001). In addition, sleep loss may be thought of as a source of stress for some individuals, which may subsequently influence food choice and food consumption as well. Increases in stress lead to more snacking and a decrease in the consumption of typical meal-type foods (Oliver & Wardle, 1999). In sum, there is some evidence that loss of sleep, as a stressor, may influence eating patterns, but, to date, no study has examined the effects of sleep restriction on food choice and consumption. The present study examined the association between self-imposed sleep deprivation and eating among a sample of college students. We hypothesized that individuals would change their pattern of calorie consumption on the day following partial sleep deprivation. Due to the lack of conclusive evidence, as discussed above, we did not make an a priori hypothesis regarding the direction of change in calorie intake. We also predicted that individuals would choose foods differently following partial sleep deprivation; specifically, in concordance with the Oliver and Wardle (1999) study mentioned above, we predicted that they would choose foods based less on health and weight control and based more on mood and convenience.

In that study, the effects of self-induced partial sleep deprivation among an undergraduate sample were examined. The results showed significant differences in food consumption and food choice following partial sleep deprivation as compared to nights of normal sleep. As expected, there was a change in food consumption, as measured by calories consumed, following a night of partial sleep deprivation. We found that consumption of calories decreased after sleep loss, as shown in Johansson and Elomaa’s (1986) study with rats. It is worth noting that the decrease in calories did not become statistically significant until two days after sleep deprivation rather than the day after. It could be argued that this indicates that sleep deprivation was not the cause of this decline in calories, but that some other factor played a role. One possible explanation is that people consume more calories following the weekend and eat less as the weekend approaches. However, it is important to note that the decrease in calories did not begin until after sleep loss. Also, some participants began the diaries on Monday while others began on Tuesday, making it less likely that the finding was due only to the time frame of the study. Other explanations for the observed decrease in calorie consumption could include diary fatigue and increased awareness of intake. Diary fatigue could have resulted in the participants eating the same amount but recording less in the diary, or they could have actually consumed less because of an aversion to writing in the diary. Similarly, a heightened awareness of calorie intake could have led to a decrease in food consumption due to health or weight concern reasons. Because there was no control group that kept diaries but did not experience sleep loss, the decrease in calories cannot be attributed solely to sleep deprivation.

Reference:

Agras, W., Hammer, L., McNicholas, F., & Kraemer, H. (2004). Risk factors for child overweight: A prospective study from birth to 9.5 years. Journal of Pediatrics, 145, 20–25.

Attie, I., & Brooks-Gunn, J. (1989). Development of eating problems in adolescent girls: A longitudinal study. Developmental Psychology, 25, 70–79.

Backhaus, J., Junghanns, K., & Hohagen, F. (2004). Sleep disturbances are correlated with decreased morning awakening salivary cortisol. Psychoneuroendocrinology, 29, 1184–1191.

Benedetti, F., Zanardi, R., Colombo, C., & Smeraldi, E. (1999). Worsening of delusional depression after sleep deprivation: Case reports. Journal of Psychiatric Research, 33, 69–72.

Beutler, L. E., Cano, M. C., Miro, E., & Buela-Casal, G. (2003). The role of activation in the effect of total sleep deprivation on depressed mood. Journal of Clinical Psychology, 59, 369–384.

Brock, J. W., Farooqui, S. M., Ross, K. D., Payne, S., & Prasad, C. (1994). Stress-related behavior and central norepinephrine concentrations in the REM sleep deprived rat. Physiology and Behavior, 55, 997–1003.

Buysse, D. J., Reynolds, C. F., Monk, T. H., Berman, S. R., & Kupfer, D. J. (1989). The Pittsburgh Sleep Quality Index: A new instrument for psychiatric practice and research. Psychiatry Research, 28, 193–213.

Crabbe, J. B. (2002). Effects of cycling exercise on mood and brain electrocortical activity after sleep deprivation. Dissertation Abstracts International: Section B: The Sciences & Engineering, 62, 3967.

Cruess, D. G., Antoni, M. H., Gonzalez, J., Fletcher, M. A., Klimas, N., Duran, R., et al. (2003). Sleep disturbance mediates the association between psychological distress and immune status among HIV-positive men and women on combination antiretroviral therapy. Journal of Psychosomatic Research, 54, 185–189.

Dinges, D. F., Pack, F., Williams, K., Gillen, K. A., Powell, J. W., Ott, G. E., et al. (1997). Cumulative sleepiness, mood disturbance and psychomotor vigilance performance decrements during a week of sleep restricted to 4–5 hours per night. Sleep: Journal of Sleep Research & Sleep Medicine, 20, 267–277.

Elomaa, E. (1981). The light/dark difference in meal size in the laboratory rat on a standard diet is abolished during REM sleep deprivation. Physiology and Behavior, 26, 487–493.

Epel, E., Lapidus, R., McEwen, B., & Brownell, K. (2001). Stress may add bite to appetite in women: A laboratory study of stress-induced cortisol and eating behavior. Psychoneuroendocrinology, 26, 37–49.

Saturday, October 1, 2011

Term Paper: Bloods And Crips and Hells Angels

In Los Angeles and other urban areas in the United States, the formation of street gangs increased at an alarming pace throughout the 1980s and 1990s. The Bloods and the Crips, the most well-known gangs of Los Angeles, are predominantly African American [1], and they have steadily increased in number since their beginnings in 1969. In addition, there are approximately 600 Hispanic gangs in Los Angeles County, along with a growing Asian gang population numbering approximately 20,000 members. Surprisingly, little has been written about the historical background of black gangs in Los Angeles (LA). Literature and firsthand interviews with Los Angeles residents point to three significant periods in the development of the contemporary black gangs. The first period, which followed WWII and significant black migrations from the South, is when the first major black clubs formed. The second period, which followed the Watts rebellion of 1965, was the civil rights era of Los Angeles, when blacks, including many former club members, became politically active for the remainder of the 1960s. By the early 1970s, black street gangs began to reemerge. By 1972, the Crips were firmly established and the Bloods were beginning to organize. This period saw the rise of LA's newest gangs, which continued to grow during the 1970s and later formed in several other cities throughout the United States by the 1990s. While black gangs do not make up the largest or most active gang population in Los Angeles today, their influence on street gang culture nationally has been profound.

In order to better understand the rise of these groups, I went into the original neighborhoods to document the history that led to them. There are 88 incorporated cities and dozens of other unincorporated places in Los Angeles County (LAC). In the course of this research, I visited all of these places in an attempt not just to identify gangs active in Los Angeles but also to determine their territories. Through several weeks of field work and research conducted in 1996, I identified 274 black gangs in 17 cities and four unincorporated areas in LAC.

The first major period of black gangs in Los Angeles began in the late 1940s and ended in 1965. There were black gangs in Los Angeles prior to this period, but they were small in number, and little is known about their activity. Some of the black groups that existed in Los Angeles in the late 1920s and 1930s were the Boozies, Goodlows, Blogettes, Kelleys, and the Driver Brothers. Most of these groups were family oriented, and they referred to themselves as clubs.[2] Max Bond (1936:270) wrote briefly about a black gang of 15-year-old kids from the Central Avenue area that mostly stole automobile accessories and bicycles. It was not until the late 1940s that the first major black clubs surfaced on the East side[3] of Los Angeles near Jefferson High School in the Central Avenue area, the original settlement area of blacks in Los Angeles. There were also significant black populations south of 92nd Street in Watts and in the Jefferson Park/West Adams area on the West side. By 1960 several black clubs were operating on the West side[4] of Los Angeles, an area that had restricted black residents during the 1940s.

Several of the first black clubs to emerge in the late 1940s and early 1950s formed as a defensive reaction to the white violence that had been plaguing the black community for several years. In the communities surrounding the original black ghetto of Central Avenue and Watts, and in the cities of Huntington Park and South Gate, white Angelenos were growing dissatisfied with the black population that had been migrating from the South during WWII. During the 1940s, resentment in the white community grew as blacks challenged the housing discrimination laws that prevented them from purchasing property outside the original settlement neighborhoods and from integrating the public schools. Areas outside the original black settlement of Los Angeles were covered by legally enforced, racially restrictive covenants or deed restrictions. This practice, adopted by white homeowners, was established in 1922 and was designed to maintain the social and racial homogeneity of neighborhoods by denying non-whites access to property ownership. By the 1940s, such exclusionary practices made much of Los Angeles off-limits to most minorities (Bond 1936; Davis 1990:161,273; Dymski and Veitch 1996:40). This process contributed to the increasing homogeneity of communities in Los Angeles and further exacerbated racial conflict between whites and blacks, as the latter lived in mostly segregated communities. From 1940 to 1944, the black population of Los Angeles more than doubled, and ethnic and racial paranoia began to develop among white residents. Chronic overcrowding was taking a toll, and housing congestion became a serious problem as blacks were forced to live in substandard housing (Collins 1980:26). From 1945 to 1948, black residents repeatedly challenged restrictive covenants in court in an effort to move out of the dense, overcrowded black community. These attempts resulted in violent clashes between whites and blacks (Collins 1980:30). The Ku Klux Klan resurfaced during the 1940s, 20 years after its presence had faded in the late 1920s (Adler 1977; Collins 1980), and white youths were forming street clubs to battle the integration of black residents into the community and its schools.
In 1943, conflicts between blacks and whites at 5th and San Pedro Streets resulted in a riot on Central Avenue (Bunch 1990:118). White clubs in Inglewood, Gardena, and on the West side engaged in similar acts, but the Spook Hunters were the most violent of all white clubs in Los Angeles.

Between 1973 and 1975, several of the non-Crip gangs decided to form a united federation, as many Crip gangs began indulging in intra-racial fighting with other black non-Crip gangs. Because of the sheer numbers the Crips were able to accumulate through heavy recruitment, they were easily able to intimidate and terrorize the non-Crip gangs, resulting in one of the first Crip-against-Blood gang-related homicides. A member of the LA Brims, a West side independent gang, was shot and killed by a Crip member after a confrontation (Jah & Keyah 1995:123). This incident started the rivalry between the Crips and the Brims. The Piru Street Boys, a non-Crip gang in Compton, had severed their relations with the Compton Crips after a similar confrontation, and a meeting was called on Piru Street in Compton where the Blood alliance was created. Throughout the mid-1970s the rivalry between the Bloods and the Crips grew, as did the number of gangs. In 1974 there were 70 gang-related homicides in Los Angeles, and by 1978 there were 60 black gangs in Los Angeles: 45 Crip gangs and 15 Blood gangs. In 1979 the founder of the Crips was murdered at the age of 26; Crip infighting was well established, and gang crime became more perilous. The county reported 30,000 active gang members in 1980 (Table 1.1), and gang murders reached a record high of 355 (Table 1.2). The Los Angeles District Attorney's office and its Hard Core Gang Unit began to focus their resources on prosecuting gang-related offenses during this time (Collier & Horowitz 1983:94). From 1978 to 1982, the number of black gangs grew from 60 to 155 (see Chapter 5), and by 1985 gang homicides were reaching epidemic proportions after a brief lull in activity during the 1984 Olympics. The epidemic of gang-related crime and homicides continued to soar throughout the 1980s, peaking in 1992 with 803 gang-related homicides.

In the three years after the first Crip gang was established in 1969, the number of black gangs in Los Angeles grew to 18. Table 1 reveals that in each year for which gang territory data were available, the growth in the number of gang territories was significant. In the six years between 1972 and 1978, 44 new black gangs formed, and only two gangs became inactive. In the 14 years between 1982 and 1996, 150 new gangs formed. The most dramatic growth, however, came in the four years between 1978 and 1982, when 101 new gangs formed. In addition to the number of gang territories increasing, their spatial distribution changed during these years, penetrating several new places within Los Angeles County. The dramatic increase in the number of gangs from 1978 to 1982, most evident in Los Angeles, Compton, and Inglewood, occurred at the same time that unemployment was rising because of plant closures. A major phase of deindustrialization was occurring in Los Angeles that resulted in 70,000 workers being laid off in South Los Angeles between 1978 and 1982, heavily impacting the black community (Soja et al. 1983:217). Unemployment resulting from base closures and plant relocations has been linked, among other factors, to persistent juvenile delinquency that has led to gang development (Klein 1995:103,194). Spergel (1995:61) found that gangs were more prevalent in areas with limited access to social opportunities and with social disorganization, that is, the lack of integration of key social institutions, including youth groups, family, school, and employment, in a local community. The type of community was also believed to influence the prevalence of gangs: neighborhoods with large concentrations of poor families, large numbers of youths, female-headed households, and lower incomes were key factors (Covey et al. 1997:71).
In addition, poverty that is associated with unemployment, racism, and segregation is believed to be a foremost cause of gang proliferation (Klein 1995: 194). These conditions are strongly associated with areas plagued by poverty, rather than the suburban regions identified in this study.

By the mid-1990s there were an estimated 650,000 gang members in the United States (U.S. Department of Justice 1997), including 150,000 in Los Angeles County (Figure 1.1). In addition, in 1996 there were over 600 Hispanic gangs in Los Angeles County, along with a growing Asian gang force of about 20,000. As gang membership increased, gang-related homicides in Los Angeles County reached epidemic proportions among black and Hispanic males, who represented 93 percent of all gang-related homicide victims from 1979 to 1994 (Hutson et al. 1995). From 1985 to 1992, gang-related homicides increased for eight consecutive years (Figure 1.2). However, in the year following the Los Angeles civil unrest of 1992, there was a ten percent drop in homicides, the first reduction in gang-related homicides in Los Angeles since 1984. This drop in killings was the result of a gang truce implemented by the four largest gangs in Watts: the Bounty Hunters, the Grape Streets, Hacienda Village, and PJ Watts (Perry 1995:24). Shortly before the urban unrest of April 29, 1992, a cease-fire was already in effect in Watts, and after the unrest a peace treaty was developed among the largest black gangs there. Early on, the police credited the truce for the sharp drop in gang-related homicides (Berger 1992).

Notes:

[1] A majority of the Crips and Bloods in Los Angeles are African American, with the exceptions of a Samoan Crip gang active in Long Beach, a Samoan Blood gang active in Carson, an Inglewood Crip gang whose members are mostly of Tongan descent, and a mixed Samoan/black gang active in Compton. With the exception of these four gangs, Crip and Blood gangs are predominantly African American.

[2] The groups during this time identified themselves as clubs, but the police department often characterized these groups as gangs.

[3] The East side of Los Angeles refers to the areas east of Main Street to Alameda in the City of Los Angeles. This area includes Watts, and the unincorporated area of Florence. It does not include East LA, Boyle Heights, or other eastern portions of the city. Those areas are usually referred to by their specific names.

[4] The West side of Los Angeles refers to the areas west of Main Street, an area that was off-limits to blacks in the 1940s. Over time, though, the border between east and west has moved slightly west in the "mental maps" of those who lived in this area. Later, Broadway became the border, and later still the Harbor (110) Freeway. Some today consider Vermont Avenue the division between the West side and the East side. Gangs, however, have always identified geographically with either the East side or the West side and have maintained Main Street as their point of division between the two.

[5] Main Street bounded the Central Avenue community to the west, but over time this boundary moved further west. Successful moves out of the ghetto occurred in a westerly direction, and over time Broadway became the boundary, then later Vermont.

[6] Personal interview with Raymond Wright.

[7] Organization was a Los Angeles-based black political and cultural group from the 1960s that was under the leadership of Ron Karenga (also known as Maulana Karenga).

[8] Interview with Danifu in 1996.


Monday, September 5, 2011

Research Paper: Nebraska Cornhuskers Football Team


History
Husker football began play in 1890 with a 10-0 victory over the Omaha YMCA on Thanksgiving Day, November 27. During the early years of the program, the team went by a number of nicknames: "Bugeaters", "Tree Planters", "Nebraskans", "The Rattlesnake Boys", "Antelopes" and "Old Gold Knights"; "Cornhuskers" became the sole nickname around 1900. Nebraska has claimed 46 conference championships and part or all of five national championships: 1970, 1971, 1994, 1995, and 1997. The latter run marked the first time since Notre Dame in 1946-49 that a team had won three national championships in four seasons. Nebraska posted a 60-3-0 record over the 1993-97 seasons. Famous former Huskers include Heisman Trophy winners Johnny Rodgers, Mike Rozier, and Eric Crouch. Rodgers was inducted into the College Football Hall of Fame, was voted the team's "Player of the Century" at the turn of the millennium, and had his Cornhusker jersey (No. 20) retired. Rozier was likewise inducted into the Hall in 2006. Other Husker players and coaches who are members of the College Football Hall of Fame include Forrest Behm, Bob Brown, Guy Chamberlin, Sam Francis, Rich Glover, Ken Hunter, Wayne Meylan, Bobby Reynolds, Dave Rimington, George Sauer, Clarence Swanson, Ed Weir, and Dave Noble, along with coaches Dana X. Bible, Bob Devaney, Biff Jones, Tom Osborne, Eddie "Robbie" Robinson, and Fielding Yost.

The most notable rivals of the Cornhuskers are the Oklahoma Sooners and the Colorado Buffaloes. Nebraska and Oklahoma regularly battled for the Big Eight Conference title until 1995, when the conference became the Big 12. Over the Big Eight's 89-year history, Nebraska or Oklahoma won or shared the conference championship 71 times. The Cornhuskers and Sooners also played several games during the 1970s and 1980s that decided the national championship.
The Husker defense goes by the nickname "Blackshirts," often depicted with a skull and crossbones. The nickname originated in the early 1960s as a reference to the black practice jerseys worn by first-string defensive players. The tradition developed when Bob Devaney had Mike Corgan, one of his assistant coaches, find contrasting jerseys to offset the red jerseys worn by the offense in practice. Further credit is given to George Kelly, Devaney's defensive line coach until 1968, who frequently referred to the top defensive unit by the name; eventually the rest of the coaching staff caught on, though the first mention of the Blackshirts in print did not come until 1969.

Since the 1994 season, Nebraska's home games have opened with the Tunnel Walk. Before the team enters, the HuskerVision screens light up with a burst of computer animation, and "Sirius" (an instrumental by The Alan Parsons Project) blares from the speakers. Accompanied by cheers from the crowd, the Huskers take the field. When the Cornhuskers play at home, Memorial Stadium holds more people than the population of Nebraska's third-largest city. Nebraska holds the record for the most consecutive sold-out home games, a streak that reached 285 when the Huskers hosted the Ball State Cardinals on September 22, 2007. The sellout streak dates back to November 3, 1962, during Bob Devaney's first season at Nebraska. The Huskers lost the first game of the current streak, a Homecoming game, to Missouri 16-7 before 36,501 fans.
Coaching
The coach with the most wins in Cornhusker history is Tom Osborne, who led the team for 25 seasons, from 1973 to 1997; his final record at Nebraska was 255 wins, 49 losses, and 3 ties. During his tenure, the team won three national titles, including one in his final season. Osborne-led teams won at least 9 games every season and won 12 or more five times. By the time he was finished, Osborne had compiled a winning percentage of 83.6%, higher than those of Bobby Bowden, Paul "Bear" Bryant, and Joe Paterno. After retiring from the Cornhuskers, Osborne was elected to the U.S. House of Representatives from Nebraska's Third Congressional District in 2000. Osborne's handpicked successor was Frank Solich, a Nebraska assistant coach and former player. Solich had coached freshmen from 1979 to 1983 and running backs from 1983 to 1997. This followed a tradition: Osborne himself had been a long-time Cornhusker assistant before Devaney chose him as his successor. Like Osborne, Solich had big shoes to fill. In his first season, the team got off to a 5-0 start before falling to Texas A&M 28-21. The team went on to a 9-4 record, its most losses since the 1968 season. Over the next two seasons Solich produced better results: 12-1 in 1999 and 10-2 in 2000. The 2001 season looked to be a special one, with Heisman candidate Eric Crouch at quarterback.
Winning Tradition
For more than a century, the legacy of Nebraska Football has been growing. From its humble beginning in 1890, when two games made an entire schedule, to 2007, when an established, nationally prominent program enjoys a rich history of success, Nebraska's student-athletes have entertained and excelled at all levels. Five national championships and more than 40 conference titles highlight the accomplishments of one of college football's most storied programs. There have been Heisman winners and Outland Trophy recipients, a nation-leading number of CoSIDA Academic All-Americans, a Big 12-leading graduation rate, and many professional football stars. But underlying all of the countless accolades is an organization that does not rely on wins and losses as the final indicator of excellence. Nebraska Football is much more than talented athletes and coaches taking the field to play a game: it is a family. The fans, support staff, and student-athletes are all Dedicated to Excellence through Tradition, Teamwork, and Integrity. It is this commitment on and off the field that makes Nebraska unique and assures that the rich tradition of the Huskers will keep growing for years to come.
Cornhusker Nickname
Before 1900, Nebraska football teams were known by such names as the Old Gold Knights, Antelopes, Rattlesnake Boys and the Bugeaters. In its first two seasons (1890-91), Nebraska competed as the Old Gold Knights, but beginning in 1892, Nebraska adopted Scarlet and Cream as its colors and accepted the Bugeaters as its most popular nickname until the turn of the century. Named after the insect-devouring bull bats that hovered over the plains, the Bugeaters also found their prey in the Midwest, enjoying winning campaigns in every year of the 1890s until a disappointing season in 1899. After its first losing season in a decade, it must have seemed only fitting that Nebraska move in a new direction, and Lincoln sportswriter Charles S. (Cy) Sherman, who was to gain national renown as the sports editor of the Lincoln Star and help originate The Associated Press Poll, provided the nickname that has gained fame for a century. Sherman tired of referring to the Nebraska teams with such an unglamorous term as Bugeaters. Iowa had, from time to time, been called the Cornhuskers, and the name appealed to Sherman.
History of Memorial Stadium
Nebraska’s continuing NCAA record of consecutive home sellouts reached 282 with seven home sellouts in 2006. The Huskers unveiled the newest additions to Memorial Stadium before the 2006 season. More than 6,500 seats were added to the stadium's North end, along with a massive HuskerVision screen and Skyline Suites. The additions sit atop Nebraska's new Osborne Athletic Complex, which houses the Huskers' technologically advanced athletic medicine facility, massive weight room, and sparkling new football locker room, football offices and administrative offices. The first phase of the project also provided the Huskers with a second indoor workout facility, the Hawks Championship Center. The additions pushed Nebraska's capacity above 80,000 for the first time, and in the 2006 season finale, a Memorial Stadium record crowd of 85,800 witnessed the Huskers' triumph over Colorado. Nebraska's average home attendance for 2006 was a school-record 85,044, as nearly 600,000 fans watched the Huskers in Memorial Stadium last year. Overall, nearly 1.1 million fans saw the Huskers in person last year.
Tunnel Walk Tradition
The Tunnel Walk, which began in 1994, has become an integral part of Memorial Stadium's game-day experience. It was created as a way for fans to share in the excitement of the team emerging from the locker room, something only a few could do before HuskerVision's cameras and big screens came to Memorial Stadium. The sounds of the Alan Parsons Project's "Sirius" and the roar of the 85,000 frenzied fans rock the stadium as the Huskers burst through the locker room doors and into the tunnel on their way to the Memorial Stadium FieldTurf. The players emerge through the Tunnel Walk Gates located in the northwest corner of the stadium. The gates are guarded by members of the Nebraska National Guard service units and opened by specially selected service men and women each game. From 1994 through the 2005 season, the Tunnel Walk began in the former varsity locker room in the South Stadium, and the Huskers burst onto the field from the southwest corner. With the completion of the Tom and Nancy Osborne Athletic Complex in the summer of 2006, Nebraska's locker room returned to its original home in the North Stadium, shifting the Huskers' entrance from the southwest corner to the northwest corner of the field. While walking down an interior hall in the Osborne Complex toward the field, the Huskers are led by Head Coach Bill Callahan, and all raise their hands to touch the lucky horseshoe that hangs above the door as they leave the North Stadium. The same horseshoe hung in the South Stadium tunnel, and before that above Nebraska's locker room entrance in the original North Stadium.
Greatest Fans in College Football
The Sea Of Red
The very entrance of Memorial Stadium welcomes Husker fans with the following phrase: "Through these gates pass the Greatest Fans in College Football." Nebraska football fans are perhaps the most loyal in college football. Entering the 2006 season, Nebraska owned an NCAA record streak of 275 consecutive sellouts at Memorial Stadium. Although the Huskers added more than 6,000 seats to the stadium before the start of the 2006 season, Nebraska received approximately 15,000 requests for the new seats, ensuring that the sellout streak would continue in the years to come, with capacity crowds of more than 80,000 at Memorial Stadium. Nebraska's football fans have been given the distinguished title "The Sea of Red," as waves of red-clad Husker fans follow Nebraska at home and on the road. In fact, the entire state follows the Huskers, along with an ever-growing national fan base, packing Memorial Stadium for every game since 1962. Husker fans bring their show on the road as well. More than 60,000 red-clad Huskers trekked to Pasadena for the 2002 Rose Bowl, and in 2001 more than 30,000 Huskers swarmed South Bend for a matchup with Notre Dame.
Spring Game Crowds
Perhaps the most impressive measurement of the support Husker fans give to the Nebraska football team comes from its Spring Game attendance. Over the past three seasons, more than 182,000 fans have flocked to Memorial Stadium to watch NU’s final practice of the spring, including a record 63,416 fans in 2005. The Huskers have averaged nearly 61,000 fans per Spring Game over the past three years to conclude spring practice. By comparison, seven of the Big 12 schools drew fewer than 15,000 fans for their 2006 spring games.
Fan Day
Another Husker tradition is Fan Day, which attracts nearly 10,000 fans annually to Memorial Stadium during fall camp. Every Husker player and coach signs autographs and takes pictures with fans in a tradition that dates back more than 30 years.
New Traditions
One of Nebraska’s newest traditions began in 2005, when the Huskers joined the student section in singing "There is No Place Like Nebraska" after each victory.
Economic Impact
Direct Impact
Most athletic department expenditures were associated with economic activity in Lincoln during the 2004-2005 fiscal year. A portion, however, was not. For instance, $6.6 million of expenditures were debt service payments. Debt service expenditures go to finance past projects rather than current economic activity. We also exclude recruiting costs, which are approximately $1.0 million, because a significant portion of this spending occurs in other states. Excluding recruiting costs is a conservative assumption, because a portion of this spending does occur in Lincoln. We do include payments for team travel. While much of the travel spending by University of Nebraska teams goes out of state, this is largely offset by opposing teams spending part of their own travel budgets in Lincoln. Excluding debt service and recruiting leaves $49.2 million in current expenditures for the 2004-2005 fiscal year. A portion of this $49.2 million is excluded from the direct impact of athletic department operations, because some department revenue originates from within the Lincoln metropolitan area, including the ticket purchases, donations tied to ticket purchases, and concession spending of Lincoln area fans. This conservative approach is tantamount to assuming that all Lincoln area fans would spend their income elsewhere in Lincoln if not at University of Nebraska home games, presumably on other types of local recreation or entertainment. In reality, of course, Lincoln area fans might have spent some of that money attending sporting events in Kansas City or Omaha or taking vacations.
Bureau of Business Research Estimates
The Bureau estimated that 43% of fans attending football games and 59% of fans attending men's basketball, baseball, and volleyball games live in the Lincoln metropolitan area. Combining these figures, spending by Lincoln metropolitan area fans on tickets, concessions, and donations tied to ticket purchases accounted for 28.3% of all athletic department revenue; 71.8% of revenue came from sources outside of Lincoln. The direct impact of the athletic department on the Lincoln metropolitan area economy is therefore 71.8% of $49.2 million, or $35.3 million. The direct economic impact of worker earnings is adjusted by the same proportion, as is direct employment. The same approach can also be used to estimate the direct sales tax impact. The University of Nebraska athletic department paid just over $1.7 million in state and local sales taxes in 2004-2005 on ticket, concession, and other taxable sales. One-fifth of these taxes is paid to local government; the local share is $343,000. The direct local sales tax impact is 71.8% of $343,000, or $246,000. As indicated earlier, 73% of athletic department expenditures were due to the football team. The direct impact for football is 73% of the direct impact for the athletic department, or $25.8 million. The football team accounted for just less than $1.5 million in state and local sales tax payments, yielding local sales tax revenue of $298,000. The direct local sales tax impact for football is 71.8% of this total, or $214,000.
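The adjustments described above reduce to a few multiplications. As a sketch, using only figures stated in the text (the variable names are illustrative, not from the report), the direct-impact arithmetic can be reproduced as follows:

```python
# Reproduces the report's direct-impact arithmetic for 2004-2005.
# All figures are taken from the text; rounding follows the report.

out_of_area_share = 0.718    # share of revenue from outside Lincoln
football_share = 0.73        # share of expenditures due to football

current_expenditure = 49.2   # $M, after excluding debt service and recruiting
direct_impact = current_expenditure * out_of_area_share
print(round(direct_impact, 1))                    # 35.3 ($M)
print(round(direct_impact * football_share, 1))   # 25.8 ($M, football only)

local_sales_tax = 343_000    # local-government share of sales taxes, dollars
print(round(local_sales_tax * out_of_area_share, -3))   # 246000
```

Each result matches the corresponding dollar figure given in the paragraph above, which confirms that the report applies the same 71.8% out-of-area adjustment uniformly to expenditures and to local sales taxes.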
Economic Impact of Additional Home Football Games
Note that these annual impact estimates were based on the 2004-2005 fiscal year, a season when Nebraska hosted only six home football games (plus the Spring Game). In future years, Nebraska will always host seven home games and may host eight. As a result, the athletic department's economic impact will be even higher. We estimated that additional economic impact based on the expected increase in fan spending from an additional 78,000 fans at each added home game. We also considered the impact from the earnings of event and concession workers at each additional game, but we conservatively assumed no other increases in athletic department expenditures despite the increased revenue.
• The economic impact would have been $5.5 million more, or $119.8 million a year, if Nebraska hosted seven home football games.
• The economic impact would have been $11.1 million more, or $125.4 million a year, if Nebraska hosted eight home football games.
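As a consistency check on the two bullet points (all figures in millions, taken directly from the text), both estimates should imply the same six-game baseline impact:

```python
# Back out the implied six-game baseline from each bullet point ($M).

baseline_from_seven = round(119.8 - 5.5, 1)    # seven-game total minus its increment
baseline_from_eight = round(125.4 - 11.1, 1)   # eight-game total minus its increment
print(baseline_from_seven, baseline_from_eight)   # 114.3 114.3
```

Both bullets yield the same $114.3 million baseline, so the two scenarios are internally consistent.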
Economic Impact on State of Nebraska
The UNL athletic department has a statewide impact on the Nebraska economy. While we are not able to break out the impact for specific communities such as Omaha, this section of the report presents a range of estimates of the statewide economic impact based on alternative assumptions about whether ticket purchases, game day spending, and donations by fans who live in Nebraska create an economic impact on the state. The finding is that the annual statewide economic impact on Nebraska was between $48.0 million and $155.1 million during the 2004-2005 fiscal year. Note that this estimate only included the spending of fans who attend UNL games and did not include the spending of fans who gather to watch the game in homes or in restaurants throughout Nebraska. We estimate economic impact first under the conservative assumption that ticket spending, donations tied to ticket purchases, and off-site spending from fans who live in Nebraska do not contribute an economic impact to the state. This was the same conservative assumption made for fans from the Lincoln area when calculating the economic impact of UNL athletics on the Lincoln metropolitan area.
This approach assumes that all fan spending (ticket purchases, donations tied to ticket purchases, concessions, and off-site spending) for attending UNL home games would have been spent in Nebraska in any case, presumably on other recreation and entertainment. We also provide a second impact estimate under an optimistic assumption. In that scenario, we assume that all spending and donations from fans living in Nebraska contribute to an economic impact for the state. In other words, fans attending UNL home games would have spent all of their money out of state if not spending it at UNL home games; for example, fans who live in Nebraska would instead have spent money attending sporting events in Kansas City or during out-of-state vacations. Reality lies somewhere between the conservative and optimistic assumptions. If attending UNL home games were not an option, fans likely would have directed a portion of their ticket purchases, game-day spending, and donations toward other recreation and entertainment within Nebraska and a portion out of state. Because the precise portions are uncertain, results are presented as a range: the low end is the conservative scenario, and the high end is the optimistic scenario.
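The range logic can be made concrete with a small sketch. The function below is hypothetical (the report gives only the two endpoints); it treats the share of in-state fans' spending that would otherwise have left Nebraska as a single parameter and interpolates linearly between the conservative and optimistic scenarios:

```python
def statewide_impact(share_leaving_state: float) -> float:
    """Interpolate between the conservative and optimistic statewide
    estimates for FY 2004-2005, in $ millions.

    share_leaving_state: the assumed fraction of in-state fans' spending
    and donations that would have gone out of state absent UNL home games
    (0.0 = conservative scenario, 1.0 = optimistic scenario).
    Illustrative helper only; the report states just the two endpoints.
    """
    conservative, optimistic = 48.0, 155.1
    if not 0.0 <= share_leaving_state <= 1.0:
        raise ValueError("share_leaving_state must be between 0 and 1")
    return conservative + share_leaving_state * (optimistic - conservative)

print(f"${statewide_impact(0.0):.1f}M")  # $48.0M  (conservative)
print(f"${statewide_impact(1.0):.1f}M")  # $155.1M (optimistic)
```

Any intermediate assumption (say, half of the spending leaving the state) simply lands proportionally between the two endpoints.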
Marketing Overview
The Huskers' Athletic Marketing Office generates a revenue stream that continues to flow for the Nebraska Athletic Department. Those funds, coupled with an increase in private donations and licensing royalties, have helped the self-supporting Athletic Department excel. The overall marketing and promotions philosophy of the Athletic Marketing Office is to increase attendance and support for all athletic events by providing an entertaining collegiate atmosphere while generating revenue opportunities and ticket sales. The focus is also geared toward activities and events that create a traditional collegiate atmosphere and communicate the overall quality of the highly successful NU athletic program.
Corporate Sponsorships
The Athletic Marketing Office has built many long-lasting, well-respected relationships with corporate sponsors. In 1992, the Athletic Department launched a sponsorship program that enabled corporations and businesses to support the Huskers while receiving valuable advertising exposure at NU athletic events. This popular program, which involves all sports, has become a major revenue source for the department. Alltel, First National Bank, Pepsi and adidas are some of the largest contributors to the Nebraska Athletic program. Sponsorship packages can include a combination of premium tickets, use of logos, parking, event signage, company logos on program covers and/or public address announcements during athletic events. Businesses utilize these packages to entertain clients, reward their employees and take advantage of the exposure.
HuskerVision and Marketing
To make HuskerVision a Memorial Stadium reality, and to maintain it today, the Athletic Marketing Office was charged with selling sponsorships to pay for the replay boards. Nebraska was the first college football program to have instant replay boards at its own venue, and Husker fans have thoroughly enjoyed the two 17-by-24-foot screens placed in the southeast and northwest corners of Memorial Stadium. The cost of the system was paid entirely by commercial sponsorships sold by the Athletic Marketing Office.
Football Sponsorships
A successful football team provides ample opportunity to generate revenue within its playing venue. In 1995, the Athletic Department brought the sale of scoreboard signage and game day packages in-house, which made the Athletic Marketing Office responsible for the sale of those packages. That move allowed the department to work directly with the sponsors and maintain a clean, uncluttered look in Memorial Stadium, thereby maximizing revenue for the scholarship fund and increasing the revenue for the sponsor.
Husker Nation Pavilion
The 2003 football season marked the beginning of the Husker Nation Pavilion, which has become a new tradition on Husker Football game days. The Pavilion, held on the Ed Weir Track just northeast of Memorial Stadium, is the largest pre-game tailgate in college football. Husker fans can enjoy many activities before each football game, including the live pre-game radio broadcast, a live band, the Kids Zone, Husker Power testing, many food vendors and other new exhibits and events every game day. The Athletic Marketing Office is the main operator behind the Husker Nation Pavilion, setting it up each week and coordinating the entertainment and events at each game. The Pavilion is also the place for corporate sponsors to host their own tailgate party before each game in a reserved tent. Through the Athletic Marketing Office and the Athletic Ticket Office, corporate tent packages are sold that allow many groups and organizations to become official sponsors of the Husker Nation Pavilion. Tent packages include a reserved tent spot at the Pavilion, a complete catered meal, tickets to the game and game day programs.
Olympic Sports
While a great deal of time and effort continues to be spent developing comprehensive corporate sponsorship programs for football and men's basketball, the Athletic Marketing Office also strives to develop promotional and marketing plans for all Husker Olympic sport teams. These plans are similar in nature to the football and men's basketball plans and include program ad sales, event tickets, promotions, etc. The cooperation among the coaches, student-athletes, ticket office, sports information office, business office and development offices makes it possible to produce promotions that increase attendance at these sporting events.
Red Zone
One of the most visible and highly successful additions created and implemented in 2000-2001 by the Athletic Marketing Office was the Red Zone. The Red Zone consists of men's basketball student season ticket holders with seats behind the baskets on the A-level. When students pick up their season tickets, they receive a Red Zone t-shirt and are required to wear it to every game in order to sit in the A-section. The Red Zone has been a huge success, increasing student season ticket sales significantly and creating a great basketball atmosphere. The upcoming year will bring new additions to the Red Zone, launching it into one of the best men's basketball student sections in the country.
Interns
Several part-time undergraduate students serve as interns in the marketing office. Each is assigned a specific sport to create and implement promotional plans for that particular sport. These students are a very important part of the marketing office's workforce and make a great impact during their four to six semester terms. The experience they gain has proven valuable as many have gone on to work in the professional ranks including positions with the New Orleans Saints, Kansas City Chiefs, Colorado Rockies, Sioux City Explorers and Sioux Falls Canaries.
Event Marketing
The marketing office is not only responsible for planning events and finding corporate sponsors, but is also actively involved as the events take place. Marketing staff are present at almost every home sporting event. At home events, corporate sponsors are hosted in hospitality rooms, which are available before and during every men's and women's basketball game. Sponsors are treated to a catered meal, game day programs and refreshments to make their game day experience one of a kind.
Cheerleaders and Yell Squad
Starting in 1997-98, the Athletic Marketing Office took over all responsibilities for the Spirit Squad in order to coordinate game day atmosphere more efficiently. The Spirit Squad consists of two varsity cheer squads (the All-Girl squad and the Coed Yell Squad), the varsity dance team (the Scarlets), and two mascots (Herbie and Lil' Red). The Spirit Squad promotes an athletic, fun atmosphere and provides support at athletic events. During an academic year, Spirit Squad members make more than 200 appearances at athletic, philanthropic, community and state events. Members condition and practice 10 to 20 hours per week. A major focus of the Spirit Squad is education; collectively, the current members hold a 3.2 grade point average. The yell and dance squads both earned top-10 honors at their collegiate national competition in 2000: the yell squad finished eighth, while the Scarlets Dance Team finished second in Division I-A. Nebraska's Lil' Red mascot finished fourth at the 2000 NCAA national championship.

Thursday, September 1, 2011

Sample Essay: Styles Of Management


Some people are better negotiators than others. How do the best negotiators behave differently from average negotiators? Researchers have been searching for factors that determine effective negotiation behaviors ever since the early research efforts to investigate negotiation began in the late 1950s (for detailed reviews see Bazerman et al., 2001; Thompson, 1990). Many have argued that negotiation behaviors are predicated upon conflict management styles (Kirkbride et al., 1991; Ma, 2006a; Volkema and Bergmann, 1995); however, surprisingly few studies have attempted to examine this relationship (Volkema and Bergmann, 1995), and even fewer have done so within a cross-cultural context. With the increasingly globalized world economy, cross-cultural studies of negotiation and conflict management styles have received more and more attention from both academics and practitioners (Gelfand and Dyer, 2000; Graham and Mintu-Wimsat, 1997; Tinsley, 1998). Sensitivity to cultural differences in negotiations has become an important success factor in today's business environment. As a result, it is necessary to study whether conflict management styles predict actual behaviors during negotiation and, if so, whether such a relationship exists within a cross-cultural context, in order to understand the dynamics of international negotiations.
Conflict Management Styles
Because problems and conflict occur widely in team-oriented organizations, the way in which conflict is managed may determine the success or failure of team outcomes. Organizations increasingly rely on teams to boost competitiveness and resolve conflict, so team members must be able to manage intragroup conflict effectively and constructively (Cohen & Ledford, 1994; Ilgen, 1999; Lovelace, Shapiro, & Weingart, 2001).
At a basic level, a conflict exists when confronting interests or incompatible activities exist between the parties involved in social situations (Deutsch, 1973). Thomas (1992) emphasized three basic themes underlying common definitions of conflict. First, a conflict exists only if it is perceived as conflict by the actors involved. Second, there is a level of interdependence between the actors such that they have the ability to influence each other. Finally, in any conflict, scarcity of resources (such as money, power, and prestige) may generate tensions among the actors.
Different theoretical models have been proposed to analyze the way in which individuals approach and handle conflict. Taxonomies and meta-taxonomies have been advanced using a unidimensional approach of cooperation and competition styles (Deutsch, 1949; Tjosvold, 1998), a bidimensional approach involving four styles of conflict management behavior (Pruitt, 1983), a bidimensional approach involving five styles (Blake & Mouton, 1964; Rahim & Bonoma, 1979), and even a tridimensional model of moving away, moving toward and moving against (Horney, 1945).
The most widely used model is that of Blake and Mouton (1964), who proposed a bidimensional grid for classifying the modes in which individuals handle interpersonal conflict. These two dimensions relate to the extent to which individuals show high or low concern "for production" and "for people." Later, Thomas and Kilmann (1974) and Rahim (1983), using this theoretical approach, redefined the dimensions as "concern for self" and "concern for others." The "concern for self" dimension reflects the degree to which an individual tries to satisfy his or her personal concerns or needs. The "concern for others" dimension has the same meaning but is centred on others' needs or concerns. Combining these two dimensions yields five different styles of managing interpersonal conflict.
The Dominating style involves high concern for self and low concern for others, reflecting win-lose behavior and efforts to obtain favourable solutions for oneself regardless of others. The Integrating style involves high concern for self and high concern for others, reflecting a collaborative style in which the parties in conflict seek to exchange information, examine differences, understand the problem, and show openness to each other. An integrative solution acceptable to both parties is sought in this style, which echoes the problem-solving strategy proposed by Van de Vliert and Euwema (1994) as well as the approach to integration in group dilemmas proposed by Trompenaars (2004). The Avoiding style reflects low concern for self and low concern for others. This style is related to withdrawal behavior, hiding disagreement, and sidestepping confrontations with the other party involved in the conflict. The Obliging style reflects low concern for self and high concern for the other party in the conflict. This style is related to behavior that tries to satisfy the needs of others and make concessions during the course of the conflict. Both the Obliging and Avoiding styles seek to reduce discrepancies between parties, but in very different manners. While Obliging shows a high concern for others and a willingness to accommodate and accept their wishes, Avoiding does not judge the other party as deserving any concern, and thus it may hide higher levels of aggressiveness. The Avoiding style may also be used when there is a lack of awareness of interdependency, and it may hide a lack of interest. Finally, Compromising depicts a moderate concern for self and for others. It takes a middle ground in solving conflict, where both parties should "give something" in order to "take something" (Rahim & Magner, 1995, p. 123).
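The two-dimensional grid behind the five styles can be summarized as a small lookup. The sketch below is only an illustrative encoding (the model's dimensions are continuous; the low/moderate/high labels here simply discretize them):

```python
def conflict_style(concern_self: str, concern_other: str) -> str:
    """Map the dual-concern dimensions onto the five conflict styles of
    Blake & Mouton (1964) as refined by Rahim (1983) and Thomas & Kilmann
    (1974). Illustrative only: the model treats both concerns as
    continuous dimensions, not three discrete levels."""
    # Compromising sits at the middle of both dimensions.
    if concern_self == "moderate" and concern_other == "moderate":
        return "Compromising"
    styles = {
        ("high", "low"):  "Dominating",    # win-lose, self-favoring
        ("high", "high"): "Integrating",   # collaborative problem solving
        ("low", "high"):  "Obliging",      # accommodate the other party
        ("low", "low"):   "Avoiding",      # withdrawal, sidestepping
    }
    return styles[(concern_self, concern_other)]

print(conflict_style("high", "high"))  # Integrating
print(conflict_style("low", "low"))    # Avoiding
```

The lookup makes the symmetry explicit: each style is just a cell in the concern-for-self by concern-for-others grid, with Compromising at the center.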
This bidimensional approach of five styles has been widely supported (Chanin & Schneer, 1984; Goodwin, 2002; King & Miles, 1990; Lee, 1990; Rahim, Antonioni, & Psenicka, 2001; Van de Vliert & Kabanoff, 1990).
Common Backgrounds
If high performing teams are to be built, the way in which conflict is handled in teams is of fundamental importance. Highly interdependent contexts are defined by constant controversy. Controversy may be constructive or destructive depending on the cooperative or competitive goal structure of the team (Tjosvold, 1998). However, if other factors influencing behavior are considered, the way in which individuals manage conflict in a team may be determined by their personal preferences (Drenth, Thierry, Willems, & Wolf, 1984).
From this point of view, previous studies have related team role preferences to the exercise of control in interpersonal relations. Fisher et al. (2001) found that some team roles showed a higher propensity to exert control than others. Shapers and Resource Investigators, for example, displayed behavior related to attempting control more than accepting control.
Similarly, team role preferences have been related to the cognitive styles that individuals employ while making decisions and solving problems (Aritzeta et al., 2005; Fisher et al., 1998). These studies reported that team roles like Resource Investigator, Shaper, and Plant showed a positive relationship with an innovative cognitive style. While solving problems, individuals high in innovative cognitive style tend to manipulate problems and challenge rules and do not need consensus to maintain confidence in the face of opposition. High innovators are described as abrasive, dissonance-creating, unsound, and prepared to shock their opposites (Kirton, 1989). On the other hand, team roles like Team Worker, Completer Finisher, and Implementer show a positive correlation with an adaptive cognitive style. This style is described as methodical, prudent, disciplined, conforming, and dependable. Generally, a high adaptor is a person concerned with reducing problems and seeking solutions in tried and understood ways. They are vulnerable to social pressure and authority and have a greater need for clarity.
Studies on control and cognitive styles show that different team roles can be differentially related to ways in which team members seek power in groups and approach problem solving. If a team role is related to exerting control behavior it is likely to be related to dominating conflict management behavior. Similarly, if control is accepted then avoiding conflict managing behavior will be more likely. The same can be said for different cognitive styles. As innovative cognitive style is defined by abrasive and shocking behavior, dominating rather than obliging behavior should be expected. In the same way, as adaptive cognitive style is defined by being conforming and dependable, avoiding rather than dominating styles can be predicted. Therefore, as team roles have shown to be differentially related to control behavior and cognitive styles, it can be expected that different team role preferences will also show different correlations with conflict management style.
The theoretical background developed above shows that both team role preferences and conflict management styles share common ground regarding the ways in which individuals relate to one another in a work team context. As conflict will occur in any team and as individuals have preferences regarding the way in which they approach work and interpersonal relations, it should be possible to predict how team role preferences relate to conflict managing styles.
Conflict management style and negotiation
Scholars have long studied the best way to manage conflict, resulting in an impressive literature on conflict management styles (cf. Thompson, 1990; Van de Vliert, 1997; Wall and Blum, 1991). The dominant conflict management model in this literature is the dual-concern model. Originating in the work of Blake and Mouton (1964) and further developed by many other theorists (e.g. Deutsch, 1994; Rahim, 1983; Thomas, 1976; Thomas and Kilmann, 1974), the dual-concern model has several variations, all of which assume that individuals choose different modes, strategies, or styles for handling conflict based on some variation of two primary concerns or interests: "concern for self" and "concern for other". These two dimensions define five conflict management styles: competing or dominating (high concern for self, low concern for other); collaborating or integrating (high concern for self and for other); compromising (moderate concern for self and for other); accommodating or obliging (low concern for self and high concern for other); and avoiding (low concern for self and low concern for other). The five styles reflect an individual's behavioral intentions when facing conflict situations (Womack, 1988). Subsequent studies suggest that the interrelationships among the constructs are consistent with those depicted in the model (Van de Vliert and Euwema, 1994; Van de Vliert and Kabanoff, 1990) and that the two dimensions provide the basis for the choice of conflict mode (Sorenson et al., 1999).
Among the instruments developed for assessing conflict management styles, Hall's (1969) conflict management survey, Rahim's (1983) organizational conflict inventory, and Thomas and Kilmann's (1974) conflict MODE instrument have been used extensively in academic research, training seminars, and organizational intervention and development. Yet few studies have linked the conflict management styles measured by these instruments with actual behaviors (Ma, 2006a; Volkema and Bergmann, 1995), which makes it difficult to assess the usefulness of these instruments in predicting actual conflict-resolving behaviors such as those in business negotiation. This study will explore whether the conflict management styles measured by the Thomas-Kilmann conflict MODE instrument are valid indicators of behavioral patterns in business negotiation and, further, whether such a relationship between conflict management styles and actual behaviors holds across cultures.
Conflict management has been described as a culturally bound event (Hocker and Wilmot, 1991), and consequently the relationship between conflict management styles and actual behaviors is affected by cultural values. The cultural dimension most likely to affect this relationship is contextualism, which is also the dimension most widely examined in the cross-cultural negotiation literature. Contextualism reflects the degree of sensitivity to communication context (Hall, 1976; Kirkbride et al., 1991). People from low-context cultures, such as Canadian culture, use explicit and direct language, whereas those from high-context cultures, such as Chinese culture, use implicit and indirect language in which words and phrases derive their meanings from contextual clues.
Negotiation behaviors and outcomes
Negotiation behaviors involve the dynamic interaction between negotiators by which the two parties exchange goods or services and attempt to agree upon an exchange rate by resolving incompatible goals (Carnevale and Pruitt, 1992; Wall, 1985; Wall and Blum, 1991). Among other behaviors, first offer, assertiveness, and distributiveness have been found to play important roles during negotiations and thus have often been examined in negotiation and conflict management studies (Barry and Friedman, 1998; Lewicki and Litterer, 1985; Greenhalgh et al., 1985). Their relationships with conflict management styles and with negotiation outcomes will be examined in this study.
As one of the central dimensions of negotiation behavior, the level of assertiveness during negotiation has been shown to be an important factor affecting negotiation outcomes, and its role in the negotiation process therefore cannot be overstated (Greenhalgh et al., 1985; Jaeger et al., 1999; Mnookin et al., 1996). Similarly, the level of distributiveness, or the "win-lose" intent of negotiation behavior, has also been closely related to negotiation outcomes (Lewicki and Litterer, 1985; Lewicki et al., 1994). In this study, their relationship with conflict management styles will be examined.
The effect of first offer will also be investigated in this study. In any negotiation, the decision to put the first offer on the table is a double-edged sword (Barry and Friedman, 1998). To the offerer's potential disadvantage, an initial offer conveys information about aspirations and utilities (Rubin and Brown, 1975). Depending on the underlying structure of reservation prices, this information may reduce the range of potential agreements, to the disadvantage of the offerer. On the other hand, an opening offer may lead the opponent to perceive that settlements will favor the party making the first offer. This is more likely to happen when the first offer is an extreme one (Siegel and Fouraker, 1960). For example, a seller who initially demands a high price may induce the buyer to believe that the range of potential agreements is closer to the seller's reservation price than originally thought. Moreover, extreme initial offers can signal that the party making the offer is a hard bargainer who will not be induced to retreat (Lewicki et al., 1994). When this occurs, the recipient of such an offer may moderate his or her negotiation objectives and be more inclined to offer concessions. Therefore, bargainers who make the first move may be better off starting with a relatively extreme offer, though there are limits to the effectiveness of extreme offers (e.g. offers so extreme that they discredit the bargainer who made them or reduce hope on the other side to the point of withdrawal) (Barry and Friedman, 1998).
The inclusion of the level of assertiveness, the level of distributiveness, and the level of first offer in this study reflects an internal relationship among these variables. The level of assertiveness captures the extent to which individuals are unafraid to express their needs and desires and are willing to defend their own interests, while the level of distributiveness assesses the extent to which individuals see the current situation as win-lose rather than win-win. The interaction of these two leads to the choice of different approaches to resolving conflict or negotiating. The level of first offer sets the tone for the whole negotiation process and is the manifestation of that choice.
Implications for management
The results of this study will have important implications for management practice. The first is that negotiation researchers and practitioners cannot rely on self-reported conflict management styles to predict actual behaviors; training seminars and practice-oriented workshops should therefore be adjusted accordingly, since most such interventions and organizational development efforts are based on self-administered paper-and-pencil tests. Owing to differences in cultural contextualism, the usefulness of self-reported conflict management styles in predicting negotiation behaviors largely depends on sensitivity to context clues. In low-context cultures such as that of Canada, negotiators' preferred conflict management styles predict their behaviors very well, which indicates that practitioners can count on these styles to plan and prepare for business negotiation. In high-context cultures such as Chinese culture, context plays a much more important role in determining negotiators' behaviors, and negotiators should therefore closely examine the features of the negotiation task in order to make accurate predictions about negotiation behaviors.
The second implication concerns the usefulness of a high first offer. Evidence about the important role that the first offer plays in business negotiation emerges from this study: the level of first offer is found to be the key process factor predicting individual profits both in Canada and in China. This may be an important message for negotiation practitioners. As discussed previously, a high first offer is a double-edged sword. A relatively extreme first offer can favor the offerer, as it signals that the party making it is a hard bargainer, making the recipient more likely to offer concessions; an offer that is too extreme, however, will discredit the offerer to the point of breaking off the negotiation. The current results support an extreme first offer for obtaining the best individual outcomes, which is encouraging news for the use of extreme first offers in actual business negotiations.

Reference:

Baron, R.M. and Kenny, D.A. (1986), "The moderator-mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations", Journal of Personality and Social Psychology, Vol. 51, pp. 1173-82.

Barry, B. and Friedman, R.A. (1998), "Bargainer characteristics in distributive and integrative negotiation", Journal of Personality and Social Psychology, Vol. 74, pp. 345-59.

Bazerman, M.H., Curhan, J.R. and Moore, D.A. (2001), "The death and rebirth of the social psychology of negotiation", in Clark, M. and Fletcher, G. (Eds), Blackwell Handbook of Social Psychology, Blackwell, Cambridge, MA, pp. 196-228.

Blake, R.R. and Mouton, J.S. (1964), The Managerial Grid, Gulf Publishing, Houston, TX.

Calhoun, P.S. and Smith, W.P. (1999), "Integrative bargaining: does gender make a difference?", International Journal of Conflict Management, Vol. 10 No. 3, pp. 203-24.

Carnevale, P.J. and Lawler, E.J. (1986), "Time pressure and the development of integrative agreements in bilateral negotiations", Journal of Conflict Resolution, Vol. 30, pp. 636-56.

Carnevale, P.J. and Pruitt, D.G. (1992), "Negotiation and mediation", Annual Review of Psychology, Vol. 43, pp. 531-82.

Deutsch, M. (1994), "Constructive conflict resolution: principles, training, and research", Journal of Social Issues, Vol. 50 No. 1, pp. 13-32.

Fiske, A.P. (2002), "Using individualism and collectivism to compare culture — a critique of the validity and measurement of the constructs: comment on Oyserman et al. (2002)", Psychological Bulletin, Vol. 128 No. 1, pp. 78-88.

Gelfand, M. and Dyer, N. (2000), "A cultural perspective on negotiation: progress, pitfalls, and prospects", Applied Psychology: An International Review, Vol. 49 No. 1, pp. 62-99.

Graham, J. and Mintu-Wimsat, A. (1997), "Culture's influence on business negotiations in four countries", Group Decision and Negotiation, Vol. 6, pp. 483-502.

Greenhalgh, L., Neslin, S.A. and Gilkey, R.W. (1985), "The effects of negotiator preferences, situational power, and negotiator personality on outcomes of business negotiations", Academy of Management Journal, Vol. 28, pp. 9-33.

Hall, E.T. (1976), Beyond Culture, Anchor, Garden City, NY.

Hall, J. (1969), Conflict Management Survey: A Survey of One's Characteristic Reaction to and Handling of Conflict between Himself and Others, Teleometrics International, Conroe, TX.

Hirokawa, R. and Miyahara, A. (1986), "A comparison of influence strategies utilized by managers in American and Japanese organizations", Communication Quarterly, Vol. 34, pp. 250-65.

Kirkbride, P.S., Tang, F.Y. and Westwood, R.I. (1991), "Chinese conflict preferences and negotiating behavior: cultural and psychological influence", Organization Studies, Vol. 12 No. 3, pp. 365-89.

Leung, K. (1988), "Some determinants of conflict avoidance", Journal of Cross-Cultural Psychology, Vol. 19 No. 1, pp. 125-36.

Leung, K. and Tjosvold, D. (1998), Conflict Management in the Asian Pacific: Assumptions and Approaches in Diverse Cultures, John Wiley & Sons, New York, NY.

Lewicki, R.J. and Litterer, J.A. (1985), Negotiation, Irwin, Homewood, IL.

Lewicki, R.J., Litterer, J.A., Minton, J.W. and Saunders, D.M. (1994), Negotiation, 2nd ed., Irwin, Burr Ridge, IL.

Ma, Z. (2004), "West meets Muslim: comparing Canadian and Pakistani conflict styles in business negotiations", paper presented at the annual meeting of the Academy of Management, New Orleans, LA.

Ma, Z. (2006a), "Conflict styles as indicators of behavioral pattern in business negotiations", paper presented at the annual meeting of the Academy of Management, Atlanta, GA.

Neu, J., Graham, J.L. and Gilly, M.C. (1988), "The influence of gender on behaviors and outcomes in a retail buyer-seller negotiation simulation", Journal of Retailing, Vol. 64, pp. 427-51.

Ohbuchi, K. and Tedeschi, J.T. (1994), "Cultural styles of conflict", Journal of Applied Social Psychology, Vol. 24, pp. 1345-66.

Oyserman, D., Coon, H.M. and Kemmelmeier, M. (2002), "Rethinking individualism and collectivism: evaluation of theoretical assumption and meta-analyses", Psychological Bulletin, Vol. 128 No. 1, pp. 3-72.

Rahim, M.A. (1983), "A measure of styles of handling interpersonal conflict", Academy of Management Journal, Vol. 26 No. 2, pp. 368-76.

Tinsley, C. (1998), "Models of conflict resolution in Japanese, German, and American cultures", Journal of Applied Psychology, Vol. 83, pp. 316-23.

Triandis, H.C. (1995), Individualism and Collectivism, Westview, Boulder, CO.

Trubisky, P., Ting-Toomey, S. and Lin, S.L. (1991), "The influence of individualism-collectivism and self-monitoring on conflict styles", International Journal of Intercultural Relations, Vol. 15, pp. 65-84.

Van de Vijver, F. and Leung, K. (1997), "Methods and data analysis of comparative research", in Berry, J.W., Poortinga, Y.H. and Pandey, J. (Eds), Handbook of Cross-Cultural Psychology, Vol. 1, Allyn & Bacon, Needham Heights, MA, pp. 257-300.

Van de Vliert, E. (1997), Complex Interpersonal Behavior: Theoretical Frontiers, Psychology Press, Hove.

Van de Vliert, E. and Euwema, M.C. (1994), "Agreeableness and activeness as components of conflict behaviors", Journal of Personality and Social Psychology, Vol. 66, pp. 674-87.

Weldon, E. and Jehn, K.A. (1995), "Examining cross-cultural differences in conflict management behavior: a strategy for future research", International Journal of Conflict Management, Vol. 6, pp. 387-403.

Womack, D.F. (1988), "Assessing the Thomas-Kilmann conflict model survey", Management Communication Quarterly, Vol. 1 No. 3, pp. 321-49.