ScienceDaily (Sep. 30, 2009) — In a first-of-its-kind study, epidemiologists at the University of Pennsylvania School of Medicine found that, on average, guns did not protect those who possessed them from being shot in an assault. The study estimated that people with a gun were 4.5 times more likely to be shot in an assault than those not possessing a gun. The study was released online this month in the American Journal of Public Health, in advance of print publication in November 2009.
“This study helps resolve the long-standing debate about whether guns are protective or perilous,” notes study author Charles C. Branas, PhD, Associate Professor of Epidemiology. “Will possessing a firearm always safeguard against harm or will it promote a false sense of security?”
What Penn researchers found was alarming: almost five Philadelphians were shot every day over the course of the study, and about 1 in 5 of these people died. The research team concluded that, although successful defensive gun uses are possible and do occur each year, the chances of success are low. People should rethink their possession of guns or, at least, understand that regular possession necessitates careful safety countermeasures, write the authors. Suggestions to the contrary, especially for urban residents who may see gun possession as a defense against a dangerous environment, should be discussed and thoughtfully reconsidered.
A 2005 National Academy of Science report concluded that we continue to know very little about the impact of gun possession on homicide or the utility of guns for self-defense. Past studies had explored the relationship between homicides and having a gun in the home, purchasing a gun, or owning a gun. These studies, unlike the Penn study, did not address the risk or protection that having a gun might create for a person at the time of a shooting.
Penn researchers investigated the link between being shot in an assault and a person’s possession of a gun at the time of the shooting. They randomly selected 677 cases of Philadelphia residents who were shot in an assault from 2003 to 2006, as identified by police and medical examiners. Six percent of these cases were in possession of a gun (such as in a holster, pocket, waistband, or vehicle) when they were shot.
These shooting cases were matched to Philadelphia residents who acted as the study’s controls. To identify the controls, trained phone canvassers called random Philadelphians soon after a reported shooting and asked about their possession of a gun at the time of the shooting. These random Philadelphians had not been shot and had nothing to do with the shooting. This is the same approach that epidemiologists have historically used to establish links between such things as smoking and lung cancer or drinking and car crashes.
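The matching described above is a classic case-control design, which yields an odds ratio rather than a direct risk. A minimal sketch of the arithmetic, assuming hypothetical control counts (only the 677 cases and the roughly 6% possession figure come from the article; the control numbers below are invented for illustration):

```python
# Odds ratio from a case-control 2x2 table, as used in studies of this kind.
# Case counts are derived from the article; control counts are hypothetical.

def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """Cross-product odds ratio for a 2x2 case-control table."""
    return (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

cases_with_gun = 41        # ~6% of the 677 shooting cases (from the article)
cases_without_gun = 636
controls_with_gun = 10     # invented matched-control counts
controls_without_gun = 667

print(round(odds_ratio(cases_with_gun, cases_without_gun,
                       controls_with_gun, controls_without_gun), 1))  # -> 4.3
```

With these invented control counts the odds ratio lands near the study's reported 4.5; the actual estimate also adjusted for confounders such as neighborhood and circumstances of the assault.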
“The US has at least one gun for every adult,” notes Branas. “Learning how to live healthy lives alongside guns will require more studies such as this one. This study should be the beginning of a better investment in gun injury research through various government and private agencies such as the Centers for Disease Control, which in the past have not been legally permitted to fund research ‘designed to affect the passage of specific Federal, State, or local legislation intended to restrict or control the purchase or use of firearms.’”
This study was funded by the National Institutes of Health. The authors are also indebted to numerous dedicated individuals at the Philadelphia Police, Public Health, Fire, and Revenue Departments as well as DataStat Inc, who collaborated on the study.
Therese S. Richmond, PhD, CRNP, School of Nursing; Dennis P. Culhane, PhD, School of Social Policy; Thomas R. Ten Have, PhD, MPH, and Douglas J. Wiebe, PhD, both from the School of Medicine, are co-authors.
Journal reference:
Charles C. Branas, Therese S. Richmond, Dennis P. Culhane, Thomas R. Ten Have, and Douglas J. Wiebe. Investigating the Link Between Gun Possession and Gun Assault. American Journal of Public Health, 2009; DOI: 10.2105/AJPH.2008.143099
Adapted from materials provided by University of Pennsylvania School of Medicine.
Thursday, October 1, 2009
Protection Or Peril? Gun Possession Of Questionable Value In An Assault, Study Finds.
Sunday, September 20, 2009
Ego City: Cities Are Organized Like Human Brains.
ScienceDaily (Sep. 19, 2009) — Cities are organized like brains, and the evolution of cities mirrors the evolution of human and animal brains, according to a new study by researchers at Rensselaer Polytechnic Institute.
Just as advanced mammalian brains require a robust neural network to achieve richer and more complex thought, large cities require advanced highways and transportation systems to allow larger and more productive populations. The new study unearthed a striking similarity in how larger brains and cities deal with the difficult problem of maintaining sufficient interconnectedness.
“Natural selection has passively guided the evolution of mammalian brains throughout time, just as politicians and entrepreneurs have indirectly shaped the organization of cities large and small,” said Mark Changizi, a neurobiology expert and assistant professor in the Department of Cognitive Science at Rensselaer, who led the study. “It seems both of these invisible hands have arrived at a similar conclusion: brains and cities, as they grow larger, have to be similarly densely interconnected to function optimally.”
As brains grow more complex from one species to the next, they change in structure and organization in order to achieve the right level of interconnectedness. One couldn’t simply grow a double-sized dog brain, for example, and expect it to have the same capabilities as a human brain. This is because, among other things, a human brain doesn’t merely have more “dog neurons,” but, instead, has neurons with a greater number of synapses than that of a dog – something crucial in helping to keep the human brain well connected.
As with brains, interconnectedness is also a critical component of the overall function of cities, Changizi said. One couldn’t put together three copies of Seattle (surface area of 83.9 sq. miles) and expect the result to have the same interconnectedness and efficiency as Chicago (surface area of 227.1 sq. miles). There would be too many highways with too few exits and lanes that are too narrow.
In exploring this topic, Changizi discovered evidence linking the size of a city or a brain to the number and size of its supporting infrastructure. He investigated and documented how the infrastructures scale up as the surface area of brains and cities increase.
As cities and the neocortex grow in surface area, the number of connectors – highways in cities and pyramidal neurons in brains – increases more slowly, as surface area to the 3/4 power, Changizi found. This means the number of connectors in both brains and cities increases as S^(3/4), where S is surface area. Similarly, as cities and brains grow, the total number of highway exits and synapses — which serve a similar function as terminal points along highways and neurons — increases with an exponent of about 9/8. The number of exits per highway and synapses per neuron were also closely aligned, with an exponent of approximately 3/8.
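The three exponents are internally consistent: dividing total terminals by total connectors means subtracting exponents, and 9/8 minus 3/4 is exactly 3/8, the per-connector exponent reported. A sketch with arbitrary constants (only the exponents come from the study):

```python
# Power-law scaling sketch for the city/brain analogy. The prefactor is
# arbitrary; only the exponents (3/4, 9/8, 3/8) come from the study.

def scaled(surface_area, exponent, c=1.0):
    """Quantity growing as c * S**exponent with surface area S."""
    return c * surface_area ** exponent

S = 100.0
connectors = scaled(S, 3 / 4)            # highways / pyramidal neurons
terminals = scaled(S, 9 / 8)             # highway exits / synapses
per_connector = terminals / connectors   # exits per highway, synapses per neuron

# Consistency check: 9/8 - 3/4 = 3/8, so the per-connector count
# grows as S**(3/8), matching the third exponent in the article.
assert abs(per_connector - scaled(S, 3 / 8)) < 1e-9
```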
These and other findings are detailed in the paper “Common Scaling Laws for City Highway Systems and the Mammalian Neocortex,” published this week in the journal Complexity. The complete paper may be viewed online at the Complexity Web site.
“When scaling up in size and function, both cities and brains seem to follow similar empirical laws,” Changizi said. “They have to efficiently maintain a fixed level of connectedness, independent of the physical size of the brain or city, in order to work properly.”
Marc Destefano, clinical assistant professor in the Department of Cognitive Science at Rensselaer, co-authored the paper.
Adapted from materials provided by Rensselaer Polytechnic Institute.
Monday, July 13, 2009
Swearing Can Actually Increase Pain Tolerance
ScienceDaily (July 13, 2009) — Researchers from Keele University’s School of Psychology have determined that swearing can have a ‘pain-lessening effect’, according to a new study published in the journal NeuroReport.
While swearing is often a common response to pain, Dr Richard Stephens and his colleagues, John Atkins and Andrew Kingston, were surprised to discover that no links had been established between swearing and the actual experience of physical pain. Since swearing often has a ‘catastrophising’ or exaggerating effect, serving to embellish or overstate the severity of pain, Stephens and his team hypothesised that swearing would actually decrease the individual’s tolerance of pain.
The Ice Water Test
Enlisting the help of 64 undergraduate volunteers, the team set out to test their theory. Each individual was asked to submerge their hand in a tub of ice water for as long as possible while repeating a swear word of their choice; they were then asked to repeat the experiment, this time using a more commonplace word that they would use to describe a table. Despite their initial expectations, the researchers found that the volunteers were able to keep their hands submerged in the ice water for a longer period of time when repeating the swear word, establishing a link between swearing and an increase in pain tolerance.
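The design above is a within-subject comparison: each volunteer serves as his or her own control, and the quantity of interest is the mean difference in submersion times between the two conditions. A sketch with invented times (the article does not report the raw data):

```python
# Within-subject comparison sketch for the ice-water experiment.
# All times below are invented for illustration, in seconds.

swearing_times = [93, 70, 85, 110, 60]   # hand submerged while swearing
neutral_times = [62, 55, 71, 80, 47]     # same volunteers, neutral word

# Paired differences: positive values mean longer tolerance while swearing.
diffs = [s - n for s, n in zip(swearing_times, neutral_times)]
mean_gain = sum(diffs) / len(diffs)
print(f"mean extra submersion time while swearing: {mean_gain:.1f} s")
```

In the actual study a paired statistical test over 64 volunteers would be used rather than a bare mean, but the comparison being made is the same.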
Fight-Or-Flight Response
While it isn’t clear how or why this link exists, the team believes that the pain-lessening effect occurs because swearing triggers our natural ‘fight-or-flight’ response. They suggest that the accelerated heart rates of the volunteers repeating the swear word may indicate an increase in aggression, in a classic fight-or-flight response of ‘downplaying feebleness in favour of a more pain-tolerant machismo.’ What is clear is that swearing triggers not only an emotional response, but a physical one too, which may explain why the centuries-old practice of cursing developed and still persists today.
Dr Richard Stephens said: “Swearing has been around for centuries and is an almost universal human linguistic phenomenon. It taps into emotional brain centres and appears to arise in the right brain, whereas most language production occurs in the left cerebral hemisphere of the brain. Our research shows one potential reason why swearing developed and why it persists.”
Adapted from materials provided by Keele University.
Wednesday, July 8, 2009
Evolution Guides Cooperative Turn-taking, Game Theory-based Computer Simulations Show
ScienceDaily (July 8, 2009) — It’s not just good manners to wait your turn – it’s actually down to evolution, according to new research by University of Leicester psychologists.
A study in the University’s School of Psychology sought to explain how turn-taking has evolved across a range of species. The conclusion is that there is an evolution-based “invisible hand” that guides our actions in this respect. What's more, the researchers have shown that this behavior can be simulated using a simple computer algorithm and basic genetic laws.
Professor Andrew Colman and Dr Lindsay Browning carried out the study due to appear in the September issue of the journal Evolutionary Ecology Research. The study has helped to explain the evolution of cooperative turn-taking.
Professor Colman said: “In human groups, turn-taking is usually planned and coordinated with the help of language. For example, people living together often agree to take turns washing up the dishes after meals or taking their children to school. But turn-taking has also evolved in many other species without language or the capacity to reach negotiated agreements. These include apes, monkeys, birds, and antelopes that take turns grooming each other, and mating pairs of Antarctic penguins that take turns foraging at sea while their partners incubate eggs or tend to chicks.
“It is far from obvious how turn-taking evolved without language or insight in animals shaped by natural selection to pursue their individual self-interests.”
The researchers say that playing “tit for tat” – copying in each time period whatever the other individual did in the previous period – can explain synchronized cooperation, but cannot fully explain turn-taking. “For example, many predatory animals hunt in pairs or larger groups, and this involves synchronized cooperation. ‘Tit for tat’ has been shown to work very well in initiating and sustaining this type of cooperation.”
“But where cooperation involves turn-taking, a ‘tit for tat’ instinct could sustain the pattern once it was established but could not initiate it in the first place. For example, in a mating pair of penguins who both went foraging or both incubated the eggs at the same time, ‘tit for tat’ would not be enough to evolve the habit of taking turns.”
Using evolutionary game theory and computer simulations, Professor Colman and Dr Browning discovered a simple variation of “tit for tat” that explains how turn-taking can evolve in organisms that pursue their individual self-interests robotically.
The researchers state: “Turn-taking is initiated only after a species has evolved at least two genetically different types that behave differently in initial, uncoordinated interactions with others. Then as soon as a pair coordinates by chance, they instinctively begin to play ‘tit for tat’. This locks them into mutually beneficial coordinated turn-taking indefinitely. Without genetic diversity, turn-taking cannot evolve in this simple way.”
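The mechanism the researchers describe can be sketched in a few lines (an illustrative toy, not the authors' actual genetic-algorithm model): each individual has a genetically fixed first move and thereafter plays tit for tat by copying its partner's previous move.

```python
# Toy model of turn-taking via tit for tat, loosely following the
# penguin example: 'F' = forage at sea, 'I' = incubate the eggs.

def simulate(first_a, first_b, rounds=6):
    """Return the sequence of (a, b) moves for a mating pair."""
    a, b = first_a, first_b
    history = [(a, b)]
    for _ in range(rounds - 1):
        a, b = b, a   # tit for tat: each copies the other's last move
        history.append((a, b))
    return history

# Two different genetic types that happen to coordinate on the first move
# lock into perfect alternation: turn-taking emerges and persists.
print(simulate('F', 'I'))
# A genetically uniform pair starts uncoordinated and stays stuck,
# both foraging (or both incubating) forever: no turn-taking.
print(simulate('F', 'F'))
```

This mirrors the authors' point: tit for tat sustains alternation once a lucky first-round coordination occurs, but without genetic diversity in initial moves the pattern can never get started.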
Professor Colman added: “In our simulations, the individuals were computer programs that were not only dumb and robotic but also purely selfish. Nevertheless, they ended up taking turns in perfect coordination. We published indirect evidence for this in 2004; we have now shown it directly and found a simple explanation for it. Our findings confirm that cooperation does not always require benevolence or deliberate planning. This form of cooperation, at least, is guided by an ‘invisible hand’, as happens so often in Darwin’s theory of natural selection.”
Andrew Colman is a Professor of Psychology and Lindsay Browning is a former student and Honorary Visiting Fellow of the University of Leicester. The research, which used a specially developed genetic algorithm, was funded through an Auber Bequest Award from Scotland’s National Academy, The Royal Society of Edinburgh.
Journal reference:
Andrew M. Colman & Lindsay Browning. Evolution of cooperative turn-taking. Evolutionary Ecology Research, 2009; (forthcoming)
Adapted from materials provided by University of Leicester, via AlphaGalileo.
Saturday, July 4, 2009
Why most "pedophiles" aren't really pedophiles, technically speaking
Source: Scientific American
By Jesse Bering
Michael Jackson probably wasn’t a pedophile—at least, not in the strict, biological sense of the word. It’s a morally loaded term, pedophile, that has become synonymous with the very basest of evils. (In fact it’s hard to even say it aloud without cringing, isn’t it?) But according to sex researchers, it’s also a grossly misused term. If Jackson did fall outside the norm in his “erotic age orientation”—and we may never know if he did—he was almost certainly what’s called a hebephile, a newly proposed diagnostic classification in which people display a sexual preference for children at the cusp of puberty, between, roughly, 11 and 14 years of age. Pedophiles, in contrast, show a sexual preference for clearly prepubescent children. There are also ephebophiles (from ephebos, meaning “one arrived at puberty” in Greek), who are mostly attracted to 15- to 16-year-olds; teleiophiles (from teleios, meaning “full grown” in Greek), who prefer those 17 years of age or older; and even the very rare gerontophile (from gerontos, meaning “old man” in Greek), someone whose sexual preference is for the elderly. So although child sex offenders are often lumped into the single classification of pedophilia, biologically speaking it’s a rather complicated affair. Some have even proposed an additional subcategory of pedophilia, “infantophilia,” to distinguish those individuals most intensely attracted to children below six years of age.

Based on this classification scheme of erotic age orientations, even the world’s best-known fictitious “pedophile,” Humbert Humbert from Nabokov’s masterpiece, Lolita, would more properly be considered a hebephile. (Likewise the protagonist from Thomas Mann’s Death in Venice, a work that I’ve always viewed as something of the “gay Lolita”.) Consider Humbert’s telltale description of a “nymphet.” After a brief introduction to those “pale pubescent girls with matted eyelashes,” Humbert explains:
Between the age limits of nine and fourteen there occur maidens who, to certain bewitched travelers, twice or many times older than they, reveal their true nature which is not human, but nymphic (that is, demoniac); and these chosen creatures I propose to designate as “nymphets.”
Although Michael Jackson might have suffered more disgrace from his hebephilic orientation than most, and his name will probably forever be entangled darkly with the sinister phrase “little boys,” he wasn’t the first celebrity or famous figure who could be seen as falling into this hebephilic category. In fact, ironically, Michael Jackson’s first wife, Lisa Marie Presley, is the product of a hebephilic attraction. After all, let’s not forget that Priscilla caught Elvis’s very grownup eye when she was just fourteen, only a year or two older than the boys that Michael Jackson was accused of sexually molesting. Then there’s of course also the scandalous Jerry Lee Lewis incident, in which the 23-year-old “Great Balls of Fire” singer married his 13-year-old first cousin.
In the psychiatric community, there’s recently been a hubbub concerning whether hebephilia should be designated as a medical disorder or, instead, seen simply as a normal variant of sexual orientation and not indicative of brain pathology. There are important policy implications of adding hebephilia to the checklist of mental illnesses, since doing so might allow people who sexually abuse pubescent children to invoke a mental illness defense.
One researcher who is arguing vociferously for the inclusion of hebephilia in the American Psychiatric Association's revised diagnostic manual (the DSM-V) is University of Toronto psychologist Ray Blanchard. In last month’s issue of Archives of Sexual Behavior, Blanchard and his colleagues provide new evidence that many people diagnosed under the traditional label of pedophilia are in fact not as interested in prepubescent children as they are early adolescents.
To tease apart these erotic age orientation differences, Blanchard and his colleagues studied 881 men (straight and gay) in his laboratory using phallometric testing (also known as penile plethysmography) while showing them visual images of differently aged nude models. Because this technique measures penile blood volume changes, it’s seen as a fairly objective index of sexual arousal to what’s being shown on the screen—arousal that participants attracted to children and young adolescents might verbally deny. In other words, the penis isn’t a very good liar. So, for example, in Blanchard’s study, the image of a naked 12-year-old girl (nothing prurient, but rather resembling a subject in a medical textbook) was accompanied by the following audiotaped narrative:
“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…”
Blanchard and his coauthors found that the men in their sample fell into somewhat discrete categories of erotic age orientation—some had the strongest penile response to the prepubescent children (the pedophiles), others to the pubescent children (the hebephiles), and the remainder to the adults shown on screen (the teleiophiles). These categories weren’t mutually exclusive. For example, some teleiophiles showed some arousal to pubescent children, some hebephiles showed some attraction to prepubescent children, and so on. But the authors did find that it’s possible to distinguish empirically between a “true pedophile” and a hebephile using this technique, in terms of the age ranges for which men exhibited their strongest arousal. They also conclude that, based on the findings from this study, hebephilia “is relatively common compared with other forms of erotic interest in children.”
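The categorization just described amounts to assigning each participant the age category that elicited his strongest measured response. A minimal sketch of that logic (the category labels follow the article; the response scores below are hypothetical, not the study's data):

```python
# Sketch of argmax-style classification by strongest measured response.
# Scores are hypothetical stand-ins for normalized phallometric readings.

def classify(arousal):
    """arousal: dict mapping age category -> response score."""
    labels = {"prepubescent": "pedophile",
              "pubescent": "hebephile",
              "adult": "teleiophile"}
    strongest = max(arousal, key=arousal.get)
    return labels[strongest]

print(classify({"prepubescent": 0.2, "pubescent": 0.9, "adult": 0.5}))  # -> hebephile
```

As the article notes, the real categories were not mutually exclusive; classifying by the single strongest response, as here, deliberately flattens the overlapping arousal profiles the authors observed.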
In the second half of their article, Blanchard and his colleagues argue that hebephilia should be added to the newly revised DSM-V as a genuine paraphilic mental disorder—differentiating it from pedophilia. But many of his colleagues working in this area are strongly opposed to doing this.
Men who find themselves primarily attracted to young or middle-aged adolescents are clearly disadvantaged in today’s society, but historically (and evolutionarily) this almost certainly wasn’t the case. In fact, hebephiles—or at least ephebophiles—would have had a leg up over their competition. Evolutionary psychologists have found repeatedly that markers of youth correlate highly with perceptions of beauty and attractiveness. For straight men, this makes sense, since a woman’s reproductive value declines steadily after the age of about twenty. Obviously having sex with a prepubescent child would be fruitless—literally. But, whether we like it or not, this isn’t so for a teenage girl who has just come of age, who is reproductively viable and whose brand-new state of fertility can more or less ensure paternity for the male. These evolved motives were portrayed in the film Pretty Baby, in which a young Brooke Shields plays the role of twelve-year-old Violet Neil, a prostitute’s daughter in 1917 New Orleans whose coveted virginity goes up for auction to the highest bidder.
To tease apart these erotic age orientation differences, Blanchard and his colleagues studied 881 men (straight and gay) in his laboratory using phallometric testing (also known as penile plethysmography) while showing them visual images of differently aged nude models. Because this technique measures penile blood volume changes, it’s seen as being a fairly objective index of sexual arousal to what’s being shown on the screen—which, for those attracted to children and young adolescents, the participant might verbally deny being attracted to. In other words, the penis isn’t a very good liar. So, for example, in Blanchard’s study, the image of a naked 12-year-old girl (nothing prurient, but rather resembling a subject in a medical textbook) was accompanied by the following audiotaped narrative:
“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…”
Blanchard and his coauthors found that the men in their sample fell into somewhat discrete categories of erotic age orientation—some had the strongest penile response to the prepubescent children (the pedophiles), others to the pubescent children (the hebephiles), and the remainder to the adults shown on screen (the teleiophiles). These categories weren’t mutually exclusive. For example, some teleiophiles showed some arousal to pubescent children, some hebephiles showed some attraction to prepubescent children, and so on. But the authors did find that it’s possible to distinguish empirically between a “true pedophile” and a hebephile using this technique, in terms of the age ranges for which men exhibited their strongest arousal. They also conclude that, based on the findings from this study, hebephilia “is relatively common compared with other forms of erotic interest in children.”
In the second half of their article, Blanchard and his colleagues argue that hebephilia should be added to the newly revised DSM-V as a genuine paraphilic mental disorder—differentiating it from pedophilia. But many of his colleagues working in this area are strongly opposed to doing this.
Understanding adult gay men’s attraction to young males is more of a puzzle. Evolutionary psychologist Frank Muscarella’s “alliance formation theory” is the only one that I’m aware of that attempts to do this. This theory holds that homoerotic behavior between older, high status men and teenage boys serves as a way for the latter to move up in ranks, a sort of power-for-sex bargaining chip. The most obvious example of this type of homosexual dynamic was found in ancient Greece, but male relationships in a handful of New Guinea tribes display these homoerotic patterns as well. There are also, ahem, plenty of present-day examples of this in Congress. Oscar Wilde probably would have signed on to this theoretical perspective. After all, his famous “love that dare not speak its name” wasn’t homosexuality, per se, but rather a “great affection of an elder for a younger man”:
...as there was between David and Jonathan, such as Plato made the very basis of his philosophy, and such as you find in the sonnets of Michelangelo and Shakespeare. It is that deep, spiritual affection that is as pure as it is perfect. It dictates and pervades great works of art like those of Shakespeare and Michelangelo… It is beautiful, it is fine, it is the noblest form of affection. There is nothing unnatural about it. It is intellectual, and it repeatedly exists between an elder and a younger man, when the elder man has intellect, and the younger man has all the joy, hope and glamour of life before him. That it should be so, the world does not understand. The world mocks at it and sometimes puts one in the pillory for it.
But, generally speaking, Muscarella’s theory doesn’t seem to pull a lot of weight. Not many teenage boys in any culture seem terribly interested in taking this particular route to success. Rather—and I may be wrong about this—I think most teenage boys would rather scrub toilets for the rest of their lives or sell soft bagels at the mall than become the sexual plaything of an “older gentleman.”
In any event, given the biological (even adaptive) verities of being attracted to adolescents, most experts in this area find it completely illogical for Blanchard to recommend adding hebephilia to the revised DSM-V. (Especially since other more clearly maladaptive paraphilias—such as gerontophilia, in which men are attracted primarily to elderly, post-menopausal women—are not presently included in the diagnostic manual.) The push to pathologize hebephilia, argues forensic psychologist Karen Franklin, appears to be motivated more by “a booming cottage industry” in forensic psychology, not coincidentally linked with a “punitive era of moral panic.” Because “civil incapacitation” (basically, the government’s ability to strip a person of his or her civil rights in the interests of public safety) requires that the person be suffering from a diagnosable mental disorder or abnormality, Franklin calls Blanchard’s proposal “a textbook example of subjective values masquerading as science.” Another critic, forensic psychologist Gregory DeClue, suggests that such medical classifications are being based on arbitrary distinctions dictated by cultural standards:
Pedophilia is a mental disorder. Homosexuality is not. Should hebephilia or ephebophilia or gerontophilia be considered mental disorders? How about sexual preference for people with different (or with the same) ethnic characteristics as oneself?
And Marquette University psychologist Thomas Zander points out that since chronological age doesn’t always perfectly match physical age, including these subtle shades of erotic age preferences would be problematic from a diagnostic perspective:
Imagine how much more impractical it would be to require forensic evaluators to determine the existence of pedophilia based on the stage of adolescence of the examinee’s victim. Such determinations could literally devolve into a splitting of pubic hairs.
One unexplored question, and one inseparable from the case of Michael Jackson, is whether we tend to be more forgiving of a person’s sexual peccadilloes when that individual has some invaluable or culturally irreplaceable abilities. For example, consider the following true story:
There once was a man who fancied young boys. Being that laws were more lax in other nations, this man decided to travel to a foreign country, leaving his wife and young daughter behind, where he met up with another Westerner who shared in his predilections for pederasty, and there the two of them spent their happy vacation scouring the seedy underground of this country searching for pimps and renting out boys for sex.
Now if you’re like most people, you’re probably experiencing a shiver of disgust and a spark of rage. You likely feel these men should have their testicles drawn and quartered by wild mares, be thrown to a burly group of rapists, castrated with garden shears or, if you’re the pragmatic sort, treated as any other sick animal in the herd would be treated, with a humane bullet to the temple or perhaps a swift and sure current of potassium chloride injected into the arm.
But notice the subtle change in your perceptions when I tell you that these events are from the autobiography of André Gide, who in 1947—long after he’d publicized these very details—won the Nobel Prize in Literature. Gide is in fact recounting his time in Algiers with none other than Oscar Wilde.
Wilde took a key out of his pocket and showed me into a tiny apartment of two rooms… The youths followed him, each of them wrapped in a burnous that hid his face. Then the guide left us and Wilde sent me into the further room with little Mohammed and shut himself up in the other with the [other boy]. Every time since then that I have sought after pleasure, it is the memory of that night I have pursued.
It’s not that we think it’s perfectly fine for Gide and Wilde to have sex with minors or even that they shouldn’t have been punished for such behaviors. (In fact Wilde was sentenced in London to two years hard labor for related offenses not long after this Maghreb excursion with Gide and died in penniless ignominy.) But somehow, as with our commingled feelings for Michael Jackson, “the greatest entertainer of all time,” the fact that these men were national treasures somehow dilutes our moralistic anger, as though we’re more willing to suffer their vices given the remarkable literary gifts they bestowed.
Would you really have wanted Oscar Wilde euthanized as though he were a sick animal? Should André Gide, whom the New York Times hailed in their obituary as a man “judged the greatest French writer of this century by the literary cognoscenti,” have been deprived of his pen, torn to pieces by illiterate thugs? It’s complicated. And although in principle we know that all men are equal in the eyes of the law, just as we did for Michael Jackson during his child molestation trials, I have a hunch that many people tend to feel (and uncomfortably so) a little sympathy for the Devil under such circumstances.
In this column presented by Scientific American Mind magazine, research psychologist Jesse Bering of Queen's University Belfast ponders some of the more obscure aspects of everyday human behavior. Ever wonder why yawning is contagious, why we point with our index fingers instead of our thumbs or whether being breastfed as an infant influences your sexual preferences as an adult? Get a closer look at the latest data as “Bering in Mind” tackles these and other quirky questions about human nature. Sign up for the RSS feed or friend Dr. Bering on Facebook and never miss an installment again.
Correction (posted 7/2/09): When this story was originally posted, we incorrectly stated that the DSM-IV is published by the American Psychological Association, rather than the American Psychiatric Association. Scientific American regrets the error.
ABOUT THE AUTHOR(S)Jesse Bering is director of the Institute of Cognition and Culture at Queen's University Belfast in Northern Ireland, where he studies how the evolved human mind plays a part in various aspects of social behavior. His new book, Under God's Skin, is forthcoming from W. W. Norton in spring 2010.
Thursday, July 2, 2009
Second Life Data Offers Window Into How Trends Spread
ScienceDaily (July 3, 2009) — Do friends wear the same style of shoe or see the same movies because they have similar tastes, which is why they became friends in the first place? Or once a friendship is established, do individuals influence each other to adopt like behaviors?
Social scientists don't know for sure. They're still trying to understand the role social influence plays in the spreading of trends because the real world doesn't keep track of how people acquire new items or preferences.
But the virtual world Second Life does. Researchers from the University of Michigan have taken advantage of this unique information to study how "gestures" make their way through this online community. Gestures are code snippets that Second Life avatars must acquire in order to make motions such as dancing, waving or chanting.
Roughly half of the gestures the researchers studied made their way through the virtual world friend by friend.
"We could have found that most everyone goes to the store to buy gestures, but it turns out about 50 percent of gesture transfers are between people who have declared themselves friends. The social networks played a major role in the distribution of these assets," said Lada Adamic, an assistant professor in the School of Information and the Department of Electrical Engineering and Computer Science.
Adamic is an author of a paper on the research that graduate student Eytan Bakshy will present on July 7 at the Association for Computing Machinery's Conference on Electronic Commerce in Stanford, Calif. Bakshy is a doctoral student in the School of Information.
"There's been a high correspondence between the real world and virtual worlds," Adamic said. "We're not saying this is exactly how people share in the real world, but we believe it does have some relevance."
This study is one of the first to model social influence in a virtual world, because researchers so rarely have access to records of how information, assets or ideas propagate. In Second Life, the previous owner of a gesture is listed.
The researchers also found that the gestures that spread from friend to friend were not distributed as broadly as ones that were distributed outside of the social network, such as those acquired in stores or as give-aways.
And they discovered that the early adopters of gestures who are among the first 5-10 percent to acquire new assets are not the same as the influencers, who tend to distribute them most broadly. This aligns with what social scientists have found.
"In our study, we sought to develop a more rigorous understanding of social processes that underlies many cultural and economic phenomena," Bakshy said. "While some of our findings may seem quite intuitive, what I find most exciting is that we were actually able to test some rather controversial and competing hypotheses about the role of social networks in influence."
The researchers examined 130 days' worth of gesture transfers in late 2008 and early 2009. They looked at 100,229 users and 106,499 gestures. They obtained the data from Linden Lab, the maker of Second Life. Personally identifying information had been removed.
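The headline figure above—that about 50 percent of gesture transfers occurred between declared friends—is, at bottom, a simple proportion over the transfer log. Here is a minimal sketch of how one might compute it from a list of (giver, receiver) transfer records and a list of declared friendships; the data and field layout are invented for illustration and are not Linden Lab's actual schema.

```python
# Sketch: estimate the fraction of gesture transfers that occur
# between declared friends, given a transfer log and a friend list.
# All data here is toy data; the real study's records differ.

def friend_transfer_share(transfers, friendships):
    """transfers: list of (giver, receiver) pairs.
    friendships: iterable of (user_a, user_b) declared-friend pairs."""
    friends = set()
    for a, b in friendships:
        friends.add((a, b))
        friends.add((b, a))  # friendship is symmetric
    if not transfers:
        return 0.0
    between_friends = sum(1 for g, r in transfers if (g, r) in friends)
    return between_friends / len(transfers)

transfers = [("ann", "bob"), ("bob", "cara"), ("dave", "ann"), ("eve", "bob")]
friendships = [("ann", "bob"), ("bob", "cara")]
print(friend_transfer_share(transfers, friendships))  # 0.5
```

With two of the four toy transfers passing between declared friends, the share comes out to 0.5—the same kind of summary statistic Adamic and Bakshy report for the full 106,499-gesture dataset.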
The research is funded by the National Science Foundation. The paper, "Social Influence and the Diffusion of User-Created Content," is authored by Eytan Bakshy, Brian Karrer and Lada A. Adamic.
Adapted from materials provided by University of Michigan.
Wednesday, July 1, 2009
Less Empathy Toward Outsiders: Brain Differences Reinforce Preferences For Those In Same Social Group
ScienceDaily (July 1, 2009) — An observer feels more empathy for someone in pain when that person is in the same social group, according to new research in the July 1 issue of The Journal of Neuroscience.
The study shows that perceiving others in pain activates a part of the brain associated with empathy and emotion more if the observer and the observed are the same race. The findings may show that unconscious prejudices against outside groups exist at a basic level.
The study confirms an in-group bias in empathic feelings, something that has long been known but never before confirmed by neuroimaging technology. Researchers have explored group bias since the 1950s. In some studies, even people with similar backgrounds arbitrarily assigned to different groups preferred members of their own group to those of others. This new study shows those feelings of bias are also reflected in brain activity.
"Our findings have significant implications for understanding real-life social behaviors and social interactions," said Shihui Han, PhD, at Peking University in China, one of the study authors.
Other recent brain imaging studies show that feeling empathy for others in pain stimulates a brain area called the anterior cingulate cortex. Building on these results, the study authors tested the theory that these empathic feelings increase for members of the same social group. In this case, the researchers chose race as the social group, although the same effect may occur with other groups.
The researchers scanned the brains of one Caucasian group and one Chinese group. The authors monitored participants as they viewed video clips that simulated either a painful needle prick or a non-painful cotton swab touch to a Caucasian or Chinese face. When painful simulations were applied to individuals of the same race as the observers, the empathic neural responses increased; however, responses increased to a lesser extent when participants viewed the faces of the other group.
Martha Farah, PhD, at the University of Pennsylvania, a cognitive neuroscientist and neuroethicist who was not affiliated with the study, says learning how empathic responses influence our behavior in many different situations is interesting both practically and theoretically. "This is a fascinating study of a phenomenon with important social implications for everything from medical care to charitable giving," she said.
But the finding raises as many questions as it answers, Farah said. "For example, is it racial identity per se that determines the brain's empathic response, or some more general measure of similarity between self and other?" she said. "What personal characteristics or life experiences influence the disparity in empathic response toward in-group and out-group members?"
The research was supported by the National Natural Science Foundation of China.
Adapted from materials provided by Society for Neuroscience, via EurekAlert!, a service of AAAS.
Saturday, June 27, 2009
Rating Attractiveness: Consensus Among Men, Not Women, Study Finds
SOURCE
ScienceDaily (June 27, 2009) — Hot or not? Men agree on the answer. Women don't.
There is much more consensus among men about whom they find attractive than there is among women, according to a new study by Wake Forest University psychologist Dustin Wood.
The study, co-authored by Claudia Brumbaugh of Queens College, appears in the June issue of the Journal of Personality and Social Psychology.
"Men agree a lot more about who they find attractive and unattractive than women agree about who they find attractive and unattractive," says Wood, assistant professor of psychology. "This study shows we can quantify the extent to which men agree about which women are attractive and vice versa."
More than 4,000 participants in the study rated photographs of men and women (ages 18-25) for attractiveness on a 10-point scale ranging from "not at all" to "very." In exchange for their participation, raters were told what characteristics they found attractive compared with the average person. The raters ranged in age from 18 to more than 70.
Before the participants judged the photographs for attractiveness, the members of the research team rated the images for how seductive, confident, thin, sensitive, stylish, curvaceous (women), muscular (men), traditional, masculine/feminine, classy, well-groomed, or upbeat the people looked.
Breaking out these factors helped the researchers figure out what common characteristics appealed most to women and men.
Men's judgments of women's attractiveness were based primarily around physical features and they rated highly those who looked thin and seductive. Most of the men in the study also rated photographs of women who looked confident as more attractive.
As a group, the women rating men showed some preference for thin, muscular subjects, but disagreed on how attractive many men in the study were. Some women gave high attractiveness ratings to the men other women said were not attractive at all.
"As far as we know, this is the first study to investigate whether there are differences in the level of consensus male and female raters have in their attractiveness judgments," Wood says. "These differences have implications for the different experiences and strategies that could be expected for men and women in the dating marketplace."
For example, women may encounter less competition from other women for the men they find attractive, he says. Men may need to invest more time and energy in attracting and then guarding their mates from other potential suitors, given that the mates they judge attractive are likely to be found attractive by many other men.
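One common way to quantify the kind of "consensus" Wood describes is the average pairwise correlation between raters' score vectors: the more alike two raters' rankings of the same photos, the closer their correlation is to 1. The sketch below illustrates the idea with invented toy ratings—the paper's actual statistics are more involved, and these numbers are not data from the study.

```python
# Sketch: measure rater consensus as the mean pairwise Pearson
# correlation between raters' attractiveness scores for the same photos.
# Ratings below are invented toy numbers, not data from Wood's study.

from itertools import combinations
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (pstdev(x) * pstdev(y))

def consensus(ratings):
    """ratings: list of per-rater score lists, all over the same photos.
    Returns the mean correlation across all rater pairs."""
    return mean(pearson(x, y) for x, y in combinations(ratings, 2))

# Three raters who largely agree vs. three who do not:
agreeing = [[9, 7, 3, 5], [8, 7, 2, 4], [9, 6, 3, 5]]
disagreeing = [[9, 2, 7, 4], [3, 8, 2, 9], [5, 5, 9, 1]]
print(consensus(agreeing) > consensus(disagreeing))  # True
```

On this metric, the study's finding would appear as a higher mean pairwise correlation among male raters' scores of women than among female raters' scores of men.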
Wood says the study results have implications for eating disorders and how expectations regarding attractiveness affect behavior.
"The study helps explain why women experience stronger norms than men to obtain or maintain certain physical characteristics," he says. "Women who are trying to impress men are likely to be found much more attractive if they meet certain physical standards, and much less if they don't. Although men are rated as more attractive by women when they meet these physical appearance standards too, their overall judged attractiveness isn't as tightly linked to their physical features."
The age of the participants also played a role in attractiveness ratings. Older participants were more likely to find people attractive if they were smiling.
The study, co-authored by Claudia Brumbaugh of Queens College, appears in the June issue of the Journal of Personality and Social Psychology.
"Men agree a lot more about who they find attractive and unattractive than women agree about who they find attractive and unattractive," says Wood, assistant professor of psychology. "This study shows we can quantify the extent to which men agree about which women are attractive and vice versa."
Adapted from materials provided by Wake Forest University.
Friday, June 19, 2009
Some Video Games Can Make Children Kinder And More Likely To Help
ScienceDaily (June 18, 2009) — Some video games can make children kinder and more likely to help—not hurt—other people.
That's the conclusion of new research published in the June 2009 issue of Personality and Social Psychology Bulletin.
The article presents the findings of three separate studies, conducted in different countries with different age groups, and using different scientific approaches. All the studies find that playing games with prosocial content causes players to be more helpful to others after the game is over.
The report is co-authored by a consortium of researchers from the United States, Japan, Singapore and Malaysia.
"Dozens of studies have documented a relationship between violent video games and aggressive behaviors," said lead author Douglas Gentile, an Iowa State University psychologist. "But this is one of the first that has documented the positive effects of playing prosocial games."
Prosocial video games involve characters who help and support each other in nonviolent ways.
"These studies show the same kind of impact on three different age groups from three very different cultures," said Brad Bushman, a University of Michigan co-author of the report. "In addition, the studies use different analytic approaches—correlational, longitudinal and experimental. The resulting triangulation of evidence provides the strongest possible proof that the findings are both valid and generalizable."
"These studies document that children and adolescents learn from practicing behaviors in games," said Rowell Huesmann, a U-M co-author of the report.
One study examined the link between video game habits and prosocial behavior among 727 secondary students in Singapore, with a mean age of 13. Students listed their favorite games and rated how often game characters helped, hurt or killed other characters. They also answered questions about how likely they were to spend time and money helping people in need, to cooperate with others and share their belongings, and to react aggressively in various situations.
As in numerous other studies, the researchers found a strong correlation between playing violent video games and hurting others. But the study also found a strong correlation between playing prosocial games and helping others.
The second study analyzed the long-term connection between video game habits and prosocial behavior in nearly 2,000 Japanese children ages 10 to 16. Participants completed a survey about their exposure to prosocial video games, and rated how often they had helped other people in the last month. Three to four months later, they were surveyed again, and researchers found a significant connection between exposure to prosocial games and helpful behavior months later.
"This suggests there is an upward spiral of prosocial gaming and helpful behavior, in contrast to the downward spiral that occurs with violent video gaming and aggressive behavior," said Bushman, a professor of communications and psychology and a research professor at the U-M Institute for Social Research (ISR).
For the third study, the researchers carried out an experiment with 161 U.S. college students, with a mean age of 19. After playing either a prosocial, violent, or neutral game, participants were asked to assign puzzles to a randomly selected partner. They could choose from puzzles that were easy, medium or hard to complete. Their partner could win $10 if they solved all the puzzles. Those who played a prosocial game were considerably more helpful than others, assigning more easy puzzles to their partners. And those who had played violent games were significantly more likely to assign the hardest puzzles.
"Taken together, these findings make it clear that playing video games is not in itself good or bad for children," Bushman said. "The type of content in the game has a bigger impact than the overall amount of time spent playing."
Adapted from materials provided by University of Michigan.
Labels: Behavior, Child Development, Child Psychology, Children's Health, Disorders and Syndromes, Educational Psychology, Mental Health, Neuroscience, Perception, Psychology, Relationships, Social Psychology
Friday, June 5, 2009
Be Your Best Friend If You'll Be Mine: Alliance Hypothesis For Human Friendship
ScienceDaily (June 5, 2009) — University of Pennsylvania psychologists studying the cognitive mechanisms behind human friendship have determined that how you rank your best friends is closely related to how you think your friends rank you. The results are consistent with a new theory called the Alliance Hypothesis for Human Friendship, distinct from traditional explanations for human friendship that focused on wealth, popularity or similarity.
The study, performed by Penn cognitive psychologists Peter DeScioli and Robert Kurzban, has demonstrated that human friendship is caused, in part, by cognitive mechanisms aimed at creating a ready-made support group for potential conflicts. People call on friends for help in a variety of disputes, ranging from trivial arguments to violent fights. This study suggests that people have specialized decision processes that prioritize the individuals who tend to be most helpful in conflicts, namely those with the fewest stronger commitments to others.
Researchers performed question-and-answer studies in which participants ranked their closest friends in a number of ways, including, for example, the benefits they receive from the friendship, the number of secrets shared and how long the friendship has been ongoing. Each time, whether participants were an online community, random passersby on a metropolitan street or undergraduate students in a laboratory, friendship rankings were most strongly correlated with individuals' own perceived rank among their partners' other friends.
"Historically, the main theory has been that humans build friendships in order to trade in goods and services," DeScioli, lead author, said. "The problem we focused on was that friendship involves more than exchange. People want friends who care about them and do not give just to get something back in return. We thought that theories about alliances might help explain why friends are primarily concerned with each others' needs rather than the benefits they can get in return for helping."
Traditional evolutionary approaches to explain human friendship apply the Theory of Reciprocal Altruism: Friends function as exchange partners; however, a wealth of empirical evidence from social psychology is inconsistent with the theory. For example, in prior studies it was shown that people do not keep regular tabs on the benefits given and received in close relationships. Also, people seem to help friends even when they are unlikely to be capable of repayment. For cognitive psychologists, it is unclear what humans and their complex brains are up to in creating these relationships.
The new Penn theory has origins in models of alliance building between nations, which prepare for conflict in advance but may not expect anything in return immediately.
"Friendships are about alliances," Kurzban, an associate professor, said. "We live in a world where conflict can arise and allies must be in position beforehand. This new hypothesis takes into account how we value those alliances. In a way, one of the main predictors of friendship is the value of the alliance. The value of an ally, or friend, drops with every additional alliance they must make, so the best alliance is one in which your ally ranks you above everyone else as well."
In short, the hypothesis is much more optimistic about the reasons for friendship than existing theories which point toward popularity, wealth and proximity as reasons for friendship.
"In this hypothesis," Kurzban said, "it's not what you can do for me, it's how much you like me. In this manner even the weakest nations, for example, or the least popular kid at the party with nary an alliance in the room is set up to be paired with someone looking for a friend."
More darkly, the new model also serves as an explanation for some petty human behaviors not explained by traditional friendship theories. For example, the Alliance Hypothesis explains why people are extremely concerned with comparisons to others in their social circle. It also explains how jealousies and aggression can erupt among groups of friends as alliances are shifted and maintained.
If the Alliance Hypothesis for Human Friendship is correct, then theories about alliances from game theory and international relations might help us better understand friendship. These theories suggest that people in conflict would benefit strategically from ranking their friends, hiding their friend-rankings and ranking friends according to their own position in partners' rankings. To employ these tactics in their friendships, people need to gather and store information about their friends' other friendships. That is, they have to readily understand the social world not only from their own perspective but also from the perspectives of their friends.
Although friendship is a core element of human social life, its evolved functions have been difficult to understand. Human friendship occurs among individuals who are neither relatives nor mates, so the function of this cooperative behavior is not as clear as when reproduction or genetic relatives are involved. Similar relationships have been observed in non-human species — hyenas use partners to gain access to carcasses and male dolphins employ "wingmen" to attain females for mating — and considerable progress has been made in understanding these non-human relationships. But the functions of human friendship have been more elusive.
The study, appearing in the current issue of the online journal Public Library of Science One, was conducted by DeScioli and Kurzban of the Department of Psychology in the School of Arts and Sciences at Penn.
It was supported by a fellowship from the International Foundation for Research in Experimental Economics.
Adapted from materials provided by University of Pennsylvania, via EurekAlert!, a service of AAAS.
Labels: Behavior, Consumer Behavior, Disorders and Syndromes, Educational Psychology, Mental Health, Neuroscience, Perception, Psychiatry, Psychology, Psychology Research, Relationships, Workplace Health
Basket Weaving May Have Taught Humans To Count
ScienceDaily (June 3, 2009) — Did animals teach us one of the oldest forms of human technology? Did this technology contribute to our ability to count? These are just two of the themes due to be explored at a conference on basketry at the University of East Anglia.
The event, which takes place today and tomorrow (June 5-6), is part of Beyond the Basket, a major new research project led by the university exploring the development and use of basketry in human culture over 10,000 years.
Basketry has been practised for millennia and ranges from mats for sitting on, containers and traps for hunting, to fencing and barriers for animals or land, partitions and walls - all of which have been central to culture.
Beyond the Basket is a two-and-a-half year project funded by the Arts and Humanities Research Council as part of its Beyond Text programme. The research will explore the role of basketry in human culture and focus on various parts of the world, both in the past and present, from Europe to Amazonia, central Africa and Papua New Guinea.
The aim is to identify the mechanical traditions of making and the ways in which basketry is implicated in wider patterns of understanding, for example the order of society or the design of the universe. It will also show the impact of woven forms on other media, such as pottery, painting, and stone sculpture and architecture, and look at the future of basketry and the solutions it could offer to current issues, whether technical or social.
Project leader Sandy Heslop, of the School of World Art and Museology at UEA, said: “Basketry is a worldwide technology and is the interaction between human ingenuity and the environment. It tends to make use of, and therefore has to be adapted to, local conditions in terms of resources and environment.
“Without basketry there would be no civilisations. You can’t bring thousands of people together unless you can supply them, you can’t bring in supplies to feed populations without containers. In the early days of civilisations these containers were basketry.
“We may think of baskets as humble, but other people and cultures don’t. They have been used for storage, for important religious and ceremonial processes, even for bodies in the form of coffins.”
It is about 10,000 years ago that evidence for basketry starts to appear in North America, Asia, Europe and the Middle East. Today its uses and influences are still seen, from the bamboo scaffolding often used in Asia, to contemporary architecture, for example the ‘Boiler Suit’ - the name given to the ‘woven’ steel tiles encasing the boiler room at Guy’s Hospital in London.
Mr Heslop said: “Beyond its practical uses, basketry has arguably been even more influential on our lives, since it relies on the relationship of number, pattern and structure. It therefore provides a model for disciplines such as mathematics and engineering and for the organisation of social and political life.
“Given the range of uses of basketry, the associations of the technology are very varied. Some are aggressive, others protective; some help create social hierarchies, others are recreational.”
The conference, Beyond the Basket: Construction, Order and Understanding, will look at various themes including: design and production, environmental issues, commercial and historical perspectives, weaving in architecture, and the mathematics of basketry, as well as more anthropological and archaeological topics. Among the speakers will be experts from North and South America, as well as the UK.
Beyond the Basket will culminate in an exhibition and accompanying book in 2011. The exhibition will include ancient material recovered by excavation as well as more recent examples of basketry from around the world and will enable people to experience basketry directly.
For further information about Beyond the Basket and to view images visit http://projects.beyondtext.ac.uk/beyondthebasket
Adapted from materials provided by University of East Anglia, via AlphaGalileo.
High Population Density Triggers Cultural Explosions
ScienceDaily (June 5, 2009) — Increasing population density, rather than boosts in human brain power, appears to have catalysed the emergence of modern human behaviour, according to a new study by UCL (University College London) scientists published in the journal Science.
High population density leads to greater exchange of ideas and skills and prevents the loss of new innovations. It is this skill maintenance, combined with a greater probability of useful innovations, that led to modern human behaviour appearing at different times in different parts of the world.
In the study, the UCL team found that complex skills learnt across generations can only be maintained when there is a critical level of interaction between people. Using computer simulations of social learning, they showed that high- and low-skilled groups could coexist over long periods of time and that the degree of skill they maintained depended on local population density or the degree of migration between them. Using genetic estimates of population size in the past, the team went on to show that density was similar in sub-Saharan Africa, Europe and the Middle East when modern behaviour first appeared in each of these regions. The paper also points to evidence that population density would have dropped for climatic reasons at the time when modern human behaviour temporarily disappeared in sub-Saharan Africa.
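The intuition behind such simulations can be illustrated with a minimal sketch (a hypothetical toy model in the spirit of the social-learning simulations described, not the authors' published model; the population sizes, copying-error distribution, and number of demonstrators observed are all assumptions):

```python
import random

def simulate_skill(pop_size, generations=200, n_models=5, seed=0):
    """Toy cultural-transmission sketch: each new learner copies the most
    skilled of a few observed demonstrators, but copying is noisy (usually
    a small loss, occasionally an improvement). Returns mean final skill."""
    rng = random.Random(seed)
    skills = [10.0] * pop_size  # everyone starts at the same skill level
    for _ in range(generations):
        new_skills = []
        for _ in range(pop_size):
            # observe a handful of demonstrators and copy the best one
            best = max(rng.sample(skills, min(n_models, pop_size)))
            # imperfect copying: a mean loss, with occasional lucky gains
            new_skills.append(max(0.0, best + rng.gauss(-0.5, 1.0)))
        skills = new_skills
    return sum(skills) / pop_size

small_group = simulate_skill(pop_size=10)
large_group = simulate_skill(pop_size=500)
```

In runs of this toy model, the larger population tends to maintain or raise its average skill while the small one gains far less, because a dense population gives rare copying improvements more chances to arise and be retained, echoing the paper's argument that density rather than brain power governs skill maintenance.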
Adam Powell, AHRC Centre for the Evolution of Cultural Diversity, says: "Our paper proposes a new model for why modern human behaviour started at different times in different regions of the world, why it disappeared in some places before coming back, and why in all cases it occurred more than 100,000 years after modern humans first appeared.
"By modern human behaviour, we mean a radical jump in technological and cultural complexity, which makes our species unique. This includes symbolic behaviour, such as abstract and realistic art, and body decoration using threaded shell beads, ochre or tattoo kits; musical instruments; bone, antler and ivory artefacts; stone blades; and more sophisticated hunting and trapping technology, like bows, boomerangs and nets."
Professor Stephen Shennan, UCL Institute of Archaeology, says: "Modern humans have been around for at least 160,000 to 200,000 years but there is no archaeological evidence of any technology beyond basic stone tools until around 90,000 years ago. In Europe and western Asia this advanced technology and behaviour explodes around 45,000 years ago when humans arrive there, but doesn't appear in eastern and southern Asia and Australia until much later, despite a human presence. In sub-Saharan Africa the situation is more complex. Many of the features of modern human behaviour – including the first abstract art – are found some 90,000 years ago but then seem to disappear around 65,000 years ago, before re-emerging some 40,000 years ago.
"Scientists have offered many suggestions as to why these cultural explosions occurred where and when they did, including new mutations leading to better brains, advances in language, and expansions into new environments that required new technologies to survive. The problem is that none of these explanations can fully account for the appearance of modern human behaviour at different times in different places, or its temporary disappearance in sub-Saharan Africa."
Dr Mark Thomas, UCL Genetics, Evolution and Environment, says: "When we think of how we came to be the sophisticated creatures we are, we often imagine some sudden critical change, a bit like when the black monolith appears in the film 2001: A Space Odyssey. In reality, there is no evidence of a big change in our biological makeup when we started behaving in an intelligent way. Our model can explain this even if our mental capacities are the same today as they were when we first originated as a species some 200,000 years ago.
"Ironically, our finding that successful innovation depends less on how smart you are than how connected you are seems as relevant today as it was 90,000 years ago."
Journal reference:
Adam Powell, Stephen Shennan, and Mark G. Thomas. Late Pleistocene Demography and the Appearance of Modern Human Behavior. Science, 2009; 324 (5932): 1298 DOI: 10.1126/science.1170165
Adapted from materials provided by University College London, via EurekAlert!, a service of AAAS.
Sunday, May 10, 2009
Greenland's Constant Summer Sunlight Linked To Summer Suicide Spike
ScienceDaily (May 10, 2009) — Suicide rates in Greenland increase during the summer, peaking in June. Researchers speculate that insomnia caused by incessant daylight may be to blame.
Karin Sparring Björkstén from the Karolinska Institutet, Sweden, led a team of researchers who studied the seasonal variation of suicides in all of Greenland from 1968-2002. They found that there was a concentration of suicides in the summer months, and that this seasonal effect was especially pronounced in the North of the country – an area where the sun doesn't set between the end of April and the end of August.
Björkstén said, "In terms of seasonal light variation, Greenland is the most extreme human habitat. Greenland also has one of the highest suicide rates in the world. We found that suicides were almost exclusively violent and increased during periods of constant day. In the north of the country, 82% of the suicides occurred during the daylight months (including astronomical twilight)".
The researchers found that most suicides occurred in young men and that violent methods, such as shooting, hanging and jumping, accounted for 95% of all suicides. No seasonal variation in alcohol consumption was found.
The authors speculate that light-generated imbalances in turnover of the neurotransmitter serotonin may lead to increased impulsiveness that, in combination with lack of sleep, may explain the increased suicide rates in the summer. They said, "People living at high latitudes need extreme flexibility in light adaptation. During the long periods of constant light, it is crucial to keep some circadian rhythm to get enough sleep and sustain mental health. A weak serotonin system may cause difficulties in adaptation".
Björkstén concludes, "Light is just one of many factors in the complex tragedy of suicide, but this study shows that there is a possible relationship between the two."
Journal reference:
Karin S Björkstén, Daniel F Kripke and Peter Bjerregaard. Accentuation of suicides but not homicides with rising latitudes of Greenland in the sunny months. BMC Psychiatry, 2009; 9 (1): 20 DOI: 10.1186/1471-244X-9-20
Adapted from materials provided by BMC Psychiatry, via EurekAlert!, a service of AAAS.
Friday, May 8, 2009
Babies Brainier Than Many Imagine
ScienceDaily (May 7, 2009) — A new study from Northwestern University shows what many mothers already know: their babies are a lot smarter than others may realize.
Though only five months old, the study's cuties indicated through their curious stares that they could differentiate water in a glass from solid blue material that looked very much like water in a similar glass.
The finding that infants can distinguish between solids and liquids at such an early age builds upon a growing body of research strongly suggesting that babies are not blank slates who depend primarily on others for acquiring knowledge, a common assumption among researchers in the not-too-distant past.
"Rather, our research shows that babies are amazing little experimenters with innate knowledge," Susan Hespos said. "They're collecting data all the time."
Hespos, an assistant professor of psychology at Northwestern, is lead author of the study, which will appear in the May 2009 issue of Psychological Science.
In a test with one group of infants in the study, a researcher tilted a glass filled with blue water back and forth to emphasize the physical characteristics of the substance inside. Another group of babies looked at a glass filled with a blue solid resembling water, which also was moved back and forth to demonstrate its physical properties.
Next all the infants were presented with test trials that alternated between the liquid or solid being transferred between two glasses.
According to the well-established looking-time test, babies, like adults, look significantly longer at something that is new, unexpected or unpredictable.
The infants who in their first trials observed the blue water in the glass looked significantly longer at the blue solid, compared to the liquid test trials. The longer stares indicated the babies were having an "Aha!" moment, noticing the solid substance's difference from the liquid. The infants who in their first trials observed the blue solid in the glass showed the opposite pattern. They looked longer at the liquid, compared to the solid test trials.
"As capricious as it may sound, how long a baby looks at something is a strong indicator of what they know," Hespos said. "They are looking longer because they detect a change and want to know what is going on."
The five-month-old infants were able to discriminate a solid from a similar-looking liquid based on movement cues, or on how the substances poured or tumbled out of upended glasses.
In a second experiment, the babies also first saw either liquid or a similar-looking solid in a glass that was tipped back and forth. This time, both groups of infants next witnessed test trials in which a cylindrical pipe was lowered into either the liquid-filled glass or the solid-containing glass.
The outcomes were similar to those of the previous experiment. Infants who first observed the glass with the liquid looked longer in the subsequent test when the pipe was lowered onto the solid. Likewise, the infants who looked at the solid in their first trials stared longer when later the pipe was lowered into the liquid.
The motion cues led to distinct expectations about whether an object would pass through or remain on top of the liquid or solid, the Northwestern researchers noted.
"Together these experiments provide the earliest evidence that infants have expectations about the physical properties of liquids," the researchers concluded in the Psychological Science study.
Hespos primarily is interested in how the brain works, and, to that end, her research on babies' brand new, relatively uncomplicated brains provides invaluable insights. She also is doing optical imaging of babies' brains, in which the biological measures confirm behavioral findings.
"Our research on babies strongly suggests that right from the beginning babies are active learners," Hespos said. "It shows that we perceive the world in pretty much the same way from infancy throughout life, making fine adjustments along the way."
In addition to Hespos, the co-investigators of the Psychological Science study are Alissa Ferry, a graduate student, and Lance Rips, professor of psychology, at Northwestern.
Journal reference:
Hespos et al. Five-Month-Old Infants Have Different Expectations for Solids and Liquids. Psychological Science, 2009; 20 (5): 603 DOI: 10.1111/j.1467-9280.2009.02331.x
Adapted from materials provided by Northwestern University.
Wednesday, May 6, 2009
For Your Health, Pick A Mate Who Is Conscientious And, Perhaps, Also Neurotic
ScienceDaily (May 6, 2009) — Conscientiousness is a good thing in a mate, researchers report, not just because it's easier to live with someone who washes the dishes without being asked, but also because having a conscientious partner may actually be good for one's health. Their study, of adults over age 50, also found that women, but not men, get an added health benefit when paired with someone who is conscientious and neurotic.
This is the first large-scale analysis of what the authors call the "compensatory conscientiousness effect," the boost in health reported by those with conscientious spouses or romantic partners. The study appears this month in Psychological Science.
"Highly conscientious people are more organized and responsible and tend to follow through with their obligations, to be more impulse controlled and to follow rules," said University of Illinois psychology professor Brent Roberts, who led the study. Highly neurotic people tend to be more moody and anxious, and to worry, he said.
Researchers have known since the early 1990s that people who are more conscientious tend to live longer than those who are less so. They are more likely to exercise, eat nutritious foods and adhere to vitamin or drug regimens, and are less likely to smoke, abuse drugs or take unwarranted risks, all of which may explain their better health. They also tend to have more stable relationships than people with low conscientiousness.
Most studies have found a very different outcome for people who are highly neurotic. They tend to report poorer health and less satisfying relationships.
Many studies focus on how specific personality traits may affect one's own health, Roberts said, but few have considered how one's personality can influence the health of another.
"There's been kind of an individualistic bias in personality research," he said. "But human beings are not islands. We are an incredibly interdependent species."
Roberts and his colleagues at the University of Illinois and the University of Michigan looked at the association of personality and self-reported health among more than 2,000 couples taking part in the Health and Retirement Study, a representative study of the U.S. population over age 50. The study asked participants to rate their own levels of neuroticism and conscientiousness and to answer questions about the quality of their health. Participants also filled out a questionnaire that asked them whether or not a health problem limited their ability to engage in a range of activities such as jogging one block, climbing a flight of stairs, shopping, dressing or bathing.
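The couple-level association the researchers examined can be sketched on synthetic data (the real analysis used the Health and Retirement Study; the variables, cut-offs and effect sizes below are invented purely for illustration):

```python
import random

def synth_couples(n=4000, seed=0):
    """Generate illustrative couples: each row is (own conscientiousness,
    partner's conscientiousness, self-rated health), with health depending
    on both partners' conscientiousness plus noise. Effect sizes are
    invented, not taken from the study."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        own_c = rng.random()        # own conscientiousness, 0-1
        partner_c = rng.random()    # partner's conscientiousness, 0-1
        health = 0.5 * own_c + 0.2 * partner_c + rng.gauss(0, 0.3)
        rows.append((own_c, partner_c, health))
    return rows

def partner_effect_among_high_c(rows, cut=0.7):
    """Among respondents who are themselves highly conscientious, compare
    mean health by partner conscientiousness: a positive difference is the
    kind of 'compensatory conscientiousness effect' described above."""
    high = [(p, h) for o, p, h in rows if o >= cut]
    hi_p = [h for p, h in high if p >= 0.5]
    lo_p = [h for p, h in high if p < 0.5]
    return sum(hi_p) / len(hi_p) - sum(lo_p) / len(lo_p)
```

In this sketch the partner effect remains positive even within the highly conscientious subgroup, which is the pattern reported in the study: a conscientious partner helps even those who are conscientious themselves.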
As other studies have found, the researchers found that those who described themselves as highly conscientious also reported better health and said they were more able to engage in a variety of physical activities than those who reported low conscientiousness.
For the first time, however, the researchers also found a significant, self-reported health benefit that accompanied marriage to a conscientious person, even among those who described themselves as highly conscientious.
"It appears that even if you are really highly conscientious, you can still benefit from a spouse's conscientiousness," Roberts said. "It makes sense that regardless of what your attributes are, if you have people in your social network that have resources, such as conscientiousness, that can always help."
A more unusual finding involved an added health benefit reported by women who were paired with highly conscientious men who were also highly neurotic, Roberts said. The same benefit was not seen in men with highly conscientious and neurotic female partners. While both men and women benefit from being paired with a conscientious mate, Roberts said, only the women saw a modest boost in their health from being with a man who was also neurotic.
"The effect here is not much larger than the effect of aspirin on cardiovascular health, which is a well-known small effect," he said.
Asked whether women looking for long-term mates should choose a man who is conscientious and neurotic over one who is simply conscientious, Roberts said, "I wouldn't recommend it."
Adapted from materials provided by University of Illinois at Urbana-Champaign, via EurekAlert!, a service of AAAS.
Why People Are Better At Lying Online Than Telling A Lie Face-to-face
ScienceDaily (May 5, 2009) — In the digital world, it’s easier to tell a lie and get away with it. That’s good news for liars, but not so good for anyone being deceived.
Michael Woodworth, a forensic psychologist at UBC Okanagan studying deception in computer-mediated environments, says offering up a fib in person might make you provide certain signals that you’re trying to deceive, but lying online avoids the physical cues that can give you away.
“When people are interacting face to face, there is something called the ‘motivational impairment effect,’ where your body will give off some cues as you become more nervous and there’s more at stake with your lie,” says Woodworth. “In a computer-mediated environment, the exact opposite occurs.”
The motivational enhancement effect – a term coined by Woodworth and colleague Jeff Hancock from Cornell University – describes how people motivated to lie in a computer-mediated environment are not only less likely to be detected, they are also actually better at being deceptive than people who are less motivated.
When telling a lie face-to-face, the higher the stakes of your deception, the more cues you may give out that you’re lying. So, what isn’t in a text message may have advantages for a would-be deceiver: text doesn’t transmit non-verbal cues such as vocal properties, physical gestures, and facial expressions.
Woodworth’s research is very timely as technology and deceptive practices converge.
“Deception is one of the most significant and pervasive social phenomena of our age,” says Woodworth. “On average, people tell one to two lies a day, and these lies range from the trivial to the more serious. Deception lies in communication between friends, family, colleagues and in power and politics.”
Woodworth began his exploration by looking at how to detect deception in face-to-face environments. But he soon recognized the invasion of information and communication technologies into nearly all aspects of our lives was an opportunity to study how technology affects “digital deception” – defined as any type of technologically mediated message transmitted to create a false belief in the receiver of the message.
“Given the prevalence of both deception and communication technology in our personal and professional lives, an important set of concerns have emerged about how technology affects digital deception,” says Woodworth. He points out that a growing number of individuals are falling prey to deceptive practices and information received through computer-mediated contexts such as the Internet.
“By learning more about how various factors affect detecting deceit in online communication, our research will certainly have important implications in organizational contexts, both legal and illegal, in the political domain, and in family life as more and more children go online.”
Common threads detected in psychopath texts
Michael Woodworth’s research at UBC Okanagan goes beyond deception. He also studies the personality disorder of psychopathy, looking at what secrets can be gleaned from the language used by psychopaths who have killed.
After interviewing dozens of psychopaths and non-psychopaths convicted of murder, Woodworth and colleagues used electronic linguistics analysis to automatically process the interview transcripts, paying attention to the appearance of certain words, parts of speech (verbs, adjectives, nouns), and semantics – for example, looking at how often certain topics came up.
The results were revealing.
“In the transcripts of psychopathic offenders, we found twice as many terms related to eating, and 58 per cent more references to money,” says Woodworth. “And the psychopaths were significantly more likely to discuss both clothing and drinking while discussing their homicide, compared to non-psychopathic offenders.”
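A minimal sketch of this kind of category-frequency analysis on transcripts (the category word lists and the `category_rates` helper are hypothetical; the real study used dedicated linguistic-analysis software and validated dictionaries):

```python
import re
from collections import Counter

# Hypothetical category word lists, invented for illustration.
CATEGORIES = {
    "eating": {"eat", "ate", "food", "dinner", "hungry", "lunch"},
    "money": {"money", "cash", "pay", "paid", "dollar", "dollars"},
}

def category_rates(transcript):
    """Return how often each category's words appear per 100 words of the
    transcript, the kind of frequency comparison described above."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(words)
    total = len(words) or 1  # avoid division by zero on empty input
    return {cat: 100.0 * sum(counts[w] for w in vocab) / total
            for cat, vocab in CATEGORIES.items()}
```

Comparing such per-100-word rates between groups of transcripts (e.g. psychopathic versus non-psychopathic offenders) yields the kind of "twice as many eating terms" contrast quoted above.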
Woodworth has now teamed with noted forensic psychologist and deception researcher Stephen Porter, who joined UBC Okanagan from Dalhousie University last summer, and fellow forensic psychologist Jan Cioe to build a multi-disciplinary forensic science graduate program and research centre at UBC Okanagan.
Bringing together prominent forensic psychologists will benefit both the academic and wider communities, says Woodworth.
“In the back of my mind I’m always thinking ‘how is this going to potentially have some applied value?’ whether it be the community in general, or specifically for law enforcement, or by furthering our knowledge within a certain area,” he says. “All of these applications ultimately assist with both assessment and treatment.”
This research is supported by a grant of $87,055 from the Social Sciences and Humanities Research Council in Canada.
Adapted from materials provided by University of British Columbia. Original article written by Raina Ducklow and Bud Mortenson.
“When people are interacting face to face, there is something called the ‘motivational impairment effect,’ where your body will give off some cues as you become more nervous and there’s more at stake with your lie,” says Woodworth. “In a computer-mediated environment, the exact opposite occurs.”
The motivational enhancement effect – a term coined by Woodworth and colleague Jeff Hancock from Cornell University – describes how people motivated to lie in a computer-mediated environment are not only less likely to be detected, they are also actually better at being deceptive than people who are less motivated.
When telling a lie face-to-face, the higher the stakes of your deception, the more cues you may give out that you’re lying. So, what isn’t in a text message may have advantages for a would-be deceiver: text doesn’t transmit non-verbal cues such as vocal properties, physical gestures, and facial expressions.
Woodworth’s research is very timely as technology and deceptive practices converge.
“Deception is one of the most significant and pervasive social phenomena of our age,” says Woodworth. “On average, people tell one to two lies a day, and these lies range from the trivial to the more serious. Deception lies in communication between friends, family, colleagues and in power and politics.”
Woodworth began his exploration by looking at how to detect deception in face-to-face environments. But he soon recognized the invasion of information and communication technologies into nearly all aspects of our lives was an opportunity to study how technology affects “digital deception” – defined as any type of technologically mediated message transmitted to create a false belief in the receiver of the message.
“Given the prevalence of both deception and communication technology in our personal and professional lives, an important set of concerns have emerged about how technology affects digital deception,” says Woodworth. He points out a growing number of individuals are falling prey to deceptive practices and information received through computer mediated contexts such as the Internet
“By learning more about how various factors affect detecting deceit in online communication, our research will certainly have important implications in organizational contexts, both legal and illegal, in the political domain, and in family life as more and more children go online.”
Common threads detected in psychopath texts
Michael Woodworth’s research at UBC Okanagan goes beyond deception. He also studies the personality disorder of psychopathy, looking at what secrets can be gleaned from the language used by psychopaths who have killed.
After interviewing dozens of psychopaths and non-psychopaths convicted of murder, Woodworth and colleagues used electronic linguistics analysis to automatically process the interview transcripts, paying attention to the appearance of certain words, parts of speech (verbs, adjectives, nouns), and semantics – for example, looking at how often certain topics came up.
The results were revealing.
“In the transcripts of psychopathic offenders, we found twice as many terms related to eating, and 58 per cent more references to money,” says Woodworth. “And the psychopaths were significantly more likely to discuss both clothing and drinking while discussing their homicide, compared to non-psychopathic offenders.”
Woodworth has now teamed with noted forensic psychologist and deception researcher Stephen Porter, who joined UBC Okanagan from Dalhousie University last summer, and fellow forensic psychologist Jan Cioe to build a multi-disciplinary forensic science graduate program and research centre at UBC Okanagan.
Bringing together prominent forensic psychologists will benefit both the academic and wider communities, says Woodworth.
“In the back of my mind I’m always thinking ‘how is this going to potentially have some applied value?’ whether it be the community in general, or specifically for law enforcement, or by furthering our knowledge within a certain area,” he says. “All of these applications ultimately assist with both assessment and treatment.”
This research is supported by an $87,055 grant from the Social Sciences and Humanities Research Council of Canada.
Adapted from materials provided by University of British Columbia. Original article written by Raina Ducklow and Bud Mortenson.
Children Bullied At School At High Risk Of Developing Psychotic Symptoms
ScienceDaily (May 5, 2009) — Children who are bullied at school over several years are up to four times more likely to develop psychotic-like symptoms by the time they reach early adolescence.
Researchers at the University of Warwick found that children who suffered physical or emotional bullying were twice as likely to develop psychotic symptoms by early adolescence as children who were not bullied. For children who experienced sustained bullying over a number of years, that risk increased up to fourfold.
The research team, led by Professor Dieter Wolke, Professor of Developmental Psychology, followed 6,437 children from birth to 13 years.
The children took part in annual face-to-face interviews, psychological and physical tests. Parents were also asked to complete questionnaires about their child’s development. When they reached 13 years of age they were interviewed about experiences of psychotic symptoms in the previous six months.
Psychotic symptoms include hallucinations, delusions (such as believing one is being spied on) and bizarre thoughts (such as believing one’s thoughts are being broadcast to others).
Professor Wolke said: “Our research shows that being victimised can have serious effects on altering perception of the world, such as hallucinations, delusions or bizarre thoughts where the person’s insight into why this is happening is reduced.”
“This indicates that adverse social relationships with peers is a potent risk factor for developing psychotic symptoms in adolescence and may increase the risk of developing psychosis in adulthood.”
The researchers used data from the Avon Longitudinal Study of Parents and Children (ALSPAC). Parents have completed regular postal questionnaires about all aspects of their child’s health and development since the children’s births (April 1991 to December 1992).
Since the age of seven and a half, the children have attended annual assessment clinics, where they took part in a range of face-to-face interviews and psychological and physical tests.
Chronic peer victimisation, where bullying had continued over a number of years, was found in 13.7% of children when interviewed at ages 8 and 10. Severe victimisation, where children are both physically and emotionally bullied, was reported by 5.2% of children at age 10.
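The "twice as likely" and "up to four times" figures above are relative risks: the rate of symptoms among bullied children divided by the rate among non-bullied children. A minimal sketch of that arithmetic, with made-up counts chosen for illustration (not the ALSPAC data):

```python
# Illustrative relative-risk calculation; the counts are invented
# to demonstrate the arithmetic, not taken from the study.
def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Risk ratio: incidence among the exposed / incidence among the unexposed."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# e.g. 100 of 400 chronically bullied children report symptoms,
# versus 125 of 2000 children who were not bullied
rr = relative_risk(100, 400, 125, 2000)  # 0.25 / 0.0625 -> 4.0
```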
Professor Wolke added: “All children have occasional conflicts, and teasing and play fighting occur. Children learn from these conflicts how to deal with them. When we talk about bullying victimisation, it is repeated, systematic and an abuse of power with the intent to hurt. Children who become targets have fewer coping skills, show a clear reaction and have few friends who can help them.”
Journal reference:
Schreier et al. Prospective Study of Peer Victimization in Childhood and Psychotic Symptoms in a Nonclinical Population at Age 12 Years. Archives of General Psychiatry, 2009; 66 (5): 527 DOI: 10.1001/archgenpsychiatry.2009.23
Adapted from materials provided by University of Warwick.
Early Word Recognition Is Key To Lifelong Reading Skills Says New Study
ScienceDaily (May 6, 2009) — Children’s early reading experience is critical to the development of their lifelong reading skills, a new study from the University of Leicester has discovered. It found that the age at which we learn words is key to understanding how people read later in life.
The study addresses a 20-year riddle: when researchers investigate reading behaviour in children, they find conflicting patterns. Some have found that children’s reading mimics that of adults, while others have observed a different pattern. For two decades, psychologists have struggled to offer a convincing explanation for why studies of the same topic have produced such different results.
Now research by Dr Tessa Webb in the School of Psychology at the University of Leicester sheds new light on the subject by taking into account the age at which words are learnt.
She said: “Children read differently from adults, but as they grow older, they develop the same reading patterns. When adults read words they learned when they were younger, they recognise them faster and more accurately than those they learned later in life.”
In her research, children from three different school years read aloud common and rarely used words, half of which followed spelling-to-sound rules and half of which did not. Unlike previous studies, Dr Webb’s research also took word learning age into account.
She found that children in their first few years at school read the words differently from adults. However, by age 10, they were mimicking the reading pattern of adults. This suggests that the different pattern of results found in children compared to adults may be due to the fact that word learning age was not considered.
This led her to conclude that word learning age is a key aspect of reading that should not be left out of research, lest the results be unsound.
The results of this research could have implications in tackling reading-related disabilities, such as dyslexia, said Dr Webb.
Adapted from materials provided by University of Leicester.
Sunday, April 12, 2009
Odor Matching: The Scent Of Internet Dating
ScienceDaily (Apr. 12, 2009) — Dating websites will soon be able to match partners on whether each will find the other’s personal body odour pleasant. There is a serious biological basis for this.
If the start-up company Basisnote gets its way, we will soon be able to match more than the looks and interests in a potential partner’s profile against our own preferences: the individual smell of the other party can be recorded in the profile and checked for whether we will find it pleasant – even before the first date.
“If everything fits, you have the same interests, lots to talk about, but you can’t stand their smell, then a love affair doesn’t stand a chance,” explains biologist August Hämmerli. He makes the online smell profile possible with his company Basisnote. The start-up from Bern has worked together with ETH to develop a fast test to determine your own body odour and enter it as a code in a database. If the flirt partner has also entered their smell profile, you can find out within seconds whether you would like their smell.
All of this works by taking a saliva test, which can be carried out easily at home. It works with a chromatographic process, similar to a pregnancy test. The result: a simple digital code, which can be entered into an online profile. All of this takes no longer than twenty minutes. Hämmerli continues: “Obviously, smell is by no means the only factor in choosing a partner. However, our test makes it a measurable component.” The company is developing the test together with Mathias Wegner, head assistant at the Paul Schmidt-Hempel chair at the Institute for Integrative Biology. The test will appear on the market this year in cooperation with an online dating provider.
Immunity check through the nose
This all sounds like another gimmick for online dating platforms. Far from it. According to an explanatory model from evolutionary biology, there is a valid reason why our nose is so important when it comes to choosing a partner. It is not without reason that we have to literally be able to “stand the smell” of our partner if we are to find them likeable, or more. Our nose has sensitive receptors that probe whether the other party shares as few of our genes as possible. The more varied the combined gene pool, the higher the chance of healthy, strong offspring.
It has long been known that mice check potential mating partners by smelling them. That humans do the same subconsciously was first demonstrated in the nineties by biologist Claus Wedekind at the University of Bern. He had female students smell T-shirts that had been worn by male test subjects and indicate which smell they found most pleasant. The women consistently chose the men whose immune systems were most different from their own.
How does this work? Basisnote founder August Hämmerli explains: “The genes of the MHC, the Major Histocompatibility Complex, carry the instructions for important building blocks of the immune system, the MHC proteins.” These bind fragments of foreign proteins, for example following an infection, and pass them on to the body’s own defence cells, which initiate a defence reaction. The more different MHC molecules someone has, the more different pathogens his body can defend against. In humans, there are more than one hundred variations of each of the nine most important MHC genes. The more varied the MHC, the better the immune systems of the offspring will be armed. Hämmerli: “The specific body odour is marked by the MHC combination. It is transmitted in the bodily fluids and transformed into the body’s very own smell on the skin.” The stronger the difference in immune system between the potential partner and yourself, the more pleasant you will find their smell.
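The matching principle Hämmerli describes – the fewer MHC alleles two people share, the better the match – can be sketched as a simple dissimilarity score. This is an illustration of the idea only, not Basisnote’s actual algorithm, and the allele labels below are purely hypothetical placeholders:

```python
# Hypothetical MHC-dissimilarity score: each person is represented by
# their two alleles at each MHC gene locus; the score is the fraction
# of loci at which the two people share no allele at all.
# Illustrative sketch only, not Basisnote's matching code.
def mhc_dissimilarity(alleles_a: dict, alleles_b: dict) -> float:
    """Fraction of shared gene loci at which the profiles have no allele in common."""
    loci = alleles_a.keys() & alleles_b.keys()
    differing = sum(
        1 for locus in loci
        if not set(alleles_a[locus]) & set(alleles_b[locus])
    )
    return differing / len(loci)

person_a = {"gene1": ("a1", "a2"), "gene2": ("b1", "b2")}
person_b = {"gene1": ("a3", "a4"), "gene2": ("b1", "b5")}
score = mhc_dissimilarity(person_a, person_b)  # differ at 1 of 2 loci -> 0.5
```

On the model described in the article, a higher score would predict a more pleasant perceived smell.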
Test instead of a T-shirt
According to Hämmerli, Basisnote is really just applying Wedekind’s T-shirt study to a standardised test system. August Hämmerli is so convinced of the success of his idea that he gave up his position as scientist at ETH to found the company. The Bern-born man coordinates the interface between the interested firms and the research work at the ETH laboratories. Co-founder Dominic Senn is an economist and political scientist. He also worked as a scientist at ETH up to the founding of the company, and is now responsible for the development of the business as CEO. Physicist Manuel Kaegi, who is just finishing his dissertation at the laboratory for safety analysis at ETH, looks after the IT implementation at Basisnote and interfaces with existing online dating platforms.
For two-and-a-half years, the three men have collected development funds and worked intensively on the details of the product. Now all technical issues have been resolved and it only remains to define the most user-friendly application. They are also preparing the first scientific publications on the subject.
The negotiations with online dating platforms are in their final phase. Hämmerli is happy to say that there has been great interest. He is reluctant to reveal which partner search site will soon be featuring smell as a dating component. This will have to wait until the autumn.
Setting up their own partner search site is out of the question. Their plans for the future are along other lines: “There are so many interesting areas. Once all of this is up and running, we want to have a look at the perfume sector,” Hämmerli reveals.
Adapted from materials provided by ETH Zurich.
Friday, April 10, 2009
Dogs And 2-year-olds Share A Limited Ability To Understand Adult Pointing Gestures
ScienceDaily (Apr. 10, 2009) — Dogs and small children who share similar social environments appear to understand human gestures in comparable ways, according to Gabriella Lakatos from Eötvös University in Budapest, Hungary, and her team. Looking at how dogs and young children respond to adult pointing actions, Lakatos shows that 3-year-olds rely on the direction of the index finger to locate a hidden object, whereas 2-year-olds and dogs respond instead to the protruding body part, even if the index finger is pointing in the opposite direction.
It is widely accepted that in the course of domestication, dogs became predisposed to read human communication signals, including pointing, head turning and gazing. Furthermore, the social environment of human infants is often shared by pet dogs in the family, and therefore there are likely to be similarities in the social stimulation of both young children and dogs.
The authors carried out two studies in which they compared the performance of adult dogs and 2- and 3-year-old children - the period of human development during which children and dogs respond in similar ways. They investigated whether dogs and human children are able to generalize from familiar pointing gestures to unfamiliar ones and whether they understand the unfamiliar pointing actions as directional signals.
A total of fifteen dogs, thirteen 2-year-old children and eleven 3-year-old children took part in the two studies. In the first study, the researchers used a combination of finger and elbow pointing gestures to help dogs locate hidden food and children a favorite toy. They found that dogs chose a direction for the reward on the basis of a body part that protruded from the experimenter's silhouette, even when the index finger was pointing in a different direction. Like dogs, 2-year-olds did not understand the significance of the pointing index finger when it did not protrude from the silhouette. (In these cases, the elbow protruded in the opposite direction.) However, 3-year-olds responded successfully to all gestures.
In the second study, the researchers used unfamiliar pointing gestures with a combination of finger, leg and knee pointing. All children and the dogs understood the leg-pointing gestures but only 3-year-olds successfully responded to pointing with the knee.
The authors conclude that "protruding body parts provide the main cue for deducing directionality for 2-year-old children and dogs. The similar performance of these groups can be explained by parallels in their evolutionary history and their socialization in a human environment."
Journal reference:
Lakatos et al. A comparative approach to dogs’ (Canis familiaris) and human infants’ comprehension of various forms of pointing gestures. Animal Cognition, 2009; DOI: 10.1007/s10071-009-0221-4
Adapted from materials provided by Springer Science+Business Media, via AlphaGalileo.