In The News
A new study finds that teens and young adults with Down syndrome, ages 12 to 21, are significantly more likely to be on psychotropic medications than children ages five to 11.
Among children younger than 12, the odds of being on a psychotropic medication increased with age for all classes of medications studied.
For 12- to 18-year-olds, the odds of being on a stimulant significantly decreased with age, while the odds of being on a medication from other classes of drugs remained stable over time.
“Variations in medication use over time in children and teens with Down syndrome suggest that the type and severity of neurobehavioral problems likely change over time, too,” says Julia Anixt, M.D., a developmental pediatrician at Cincinnati Children’s Hospital and a co-author of the study.
The study is published online in the Journal of Developmental & Behavioral Pediatrics.
In the younger age group, the odds of being on a stimulant increased by a factor of 1.37 for each additional year of age from five to 11. Compounded over four years, this means a nine-year-old would be about 3.5 times as likely to be on a stimulant medication as a five-year-old.
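As a quick check on that arithmetic (a sketch of the compounding, not part of the study's own analysis), the per-year odds ratio multiplies across the four years separating a five-year-old from a nine-year-old:

```python
# Compounding a per-year odds ratio across years of age.
# The 1.37 figure and the ages come from the study as reported above.
per_year_odds_ratio = 1.37
years = 9 - 5  # comparing a nine-year-old with a five-year-old

cumulative_odds_ratio = per_year_odds_ratio ** years
print(round(cumulative_odds_ratio, 1))  # about 3.5
```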
These drugs are used as first-line therapy for symptoms of attention-deficit hyperactivity disorder (ADHD).
This increase in use “may reflect increasing impairment in functioning due to ADHD symptoms as children approach 11 years,” the researchers note. “After that age, the use of stimulants declined with each increasing year.”
The researchers found the use of selective serotonin reuptake inhibitors (SSRIs), a medication class commonly used to treat symptoms of anxiety and depression, increases as children and teens age.
Declining behavioral problems (outwardly disruptive behaviors) and increasing emotional problems, such as depression and anxiety, with age are also common in typically developing children and those with intellectual disability.
Use of atypical antipsychotics (AAPs) peaked between ages 11 and 14, a range that previous studies have identified as a peak period for challenging behaviors in children with Down syndrome.
AAPs are approved for the treatment of irritability and aggression in children with autism spectrum disorders but are often prescribed “off-label” to target problem behaviors in children with disruptive behavior disorder and developmental disabilities.
The study also found that rates of AAP use were higher in boys than in girls at all ages.
Researchers reviewed data on 832 children collected between 2010 and 2013. All were patients at Cincinnati Children’s, whose Division of Developmental and Behavioral Pediatrics is home to The Thomas Center, a specialized clinical program for the care of children with Down syndrome.
Researchers believe the review shows that medical and pharmaceutical management for this special cohort can be improved.
“Providers must be more systematic in the screening, diagnosis, and management of mental health conditions in children and teens with Down syndrome,” said Anixt.
“Eventually, the American Academy of Pediatrics health guidelines for children with Down syndrome could be expanded beyond physical health conditions to include treating behavioral and mental health conditions, thus improving the long-term outcomes and quality of life of individuals with Down syndrome.”
A new study suggests that rather than complaining about missing skill sets among younger workers, adult workers should take the lead in modeling appropriate behaviors and training beginning workers in them.
Emerging adults aged 18 to 25 are often criticized for their poor interpersonal skills, sense of entitlement, and casual work ethic. A University of Illinois study suggests that fault-finding adult co-workers could make a big difference in young workers’ leadership development by developing relationships with them.
Researchers suggest co-workers model the behaviors they wish to see, and provide leadership growth opportunities.
“Young adults in our study had learned a lot from mentors who modeled initiative, drive, and persistence; demonstrated how to communicate with confidence and engage in active listening; and displayed reliability, tolerance, respect, and a positive attitude,” said Jill Bowers, Ph.D., a researcher in the University of Illinois’s Department of Human and Community Development.
According to Bowers, most of the literature on leadership development is written from an adult point of view. But in this small qualitative study, young adults describe their leadership growth as students and on the job as they moved from adolescence into young adulthood.
The study shows that role models were profoundly influential during the transition to adulthood, and the article describes a role model-driven framework for leadership development, she said.
“Adults who are complaining about the new generation of ‘slackers’ should build relationships with students and young colleagues and actively model a professional work ethic for them,” Bowers said.
Investigators discovered that when a mentoring relationship was established and role models demonstrated the behaviors they wished to see in young participants, the new workers were receptive to the process.
In fact, the mentees described a process in which they listened to the knowledge their mentors shared, engaged in opportunities to grow as leaders, and learned to believe in their own potential, Bowers said.
Leadership development during adulthood is often influenced by positional leaders, such as Gandhi, Hillary Clinton, or Bill and Melinda Gates.
These cultural leaders, unknown personally to study participants, inspired the young adults’ vision of future activism, but that didn’t happen until a teacher, family member, coach, or co-worker had laid a foundation for thinking about character and leadership, she said.
“For that reason, we’d like to encourage businesses and organizations to offer leadership training, explicitly teaching employees and youth leaders to be good role models and teaching youth and young adults how to develop and maintain relationships with mentors,” she said.
“It’s natural for some people to engage in positive communication and active listening or demonstrate initiative and perseverance. For others, those qualities aren’t as instinctual, and establishing relationships and mentoring young adults is something they could learn if businesses made teaching those traits a priority,” said Bowers.
Mentoring can also help mature workers reflect on their own work ethic and professional skills. It often gives a seasoned employee an opportunity to change behaviors and personal qualities they would not want to see replicated in adolescents or young adults.
“You have to demonstrate the skills you’re trying to teach, not just preach about their importance. If you tell a student or a young co-worker to use good email etiquette, and then don’t follow your own advice in communicating with them, you lose credibility,” she said.
Source: University of Illinois
A new study suggests sleep problems often plague midlife adults who are only marginally satisfied with life.
In a review of nearly 4,000 adults, investigators discovered individuals with higher life satisfaction reported the ability to go to sleep faster than people who were less satisfied with life.
Prior studies have suggested that sleep onset delay among those with low life satisfaction could be the result of worry and anxiety.
Fifty-five percent of the study cohort was female, and participants ranged in age from 17 to 74. A six-item life satisfaction survey was used to code participants as having low, medium, or high levels of satisfaction, and sleep onset latency (SOL), the time it takes to fall asleep, was measured by participants’ subjective estimates of the minutes they needed to nod off.
“These findings support the idea that life satisfaction is interlinked with many measures of sleep and sleep quality, suggesting that improving one of these variables might result in improvement in the other,” said lead author Hayley O’Hara, a recent graduate of Ohio Northern University.
The research abstract was published recently in an online supplement of the journal Sleep and was presented at SLEEP 2015, the 29th annual meeting of the Associated Professional Sleep Societies LLC.
Babies who are able to “resettle” themselves after waking up are more likely to sleep for longer periods of time, according to a new study published in the Journal of Developmental & Behavioral Pediatrics.
“Infants are capable of resettling themselves back to sleep by three months of age,” said Ian St. James-Roberts and colleagues at the University of London. “Both autonomous resettling and prolonged sleeping are involved in ‘sleeping through the night’ at an early age.”
For the study, the researchers used infrared video cameras to record 101 infants overnight. Videos made at five weeks and three months of age were analyzed to determine changes in sleep and waking during this age span, a time when parents hope their baby will start sleeping more at night, while crying less.
The researchers looked for moments when the infants woke up but were able to “resettle autonomously,” or go back to sleep without parental involvement. The video footage was then compared with parental questionnaires on their infants’ sleep behaviors.
The clearest developmental progression between video recordings was an increase in length of sleep from a little over two hours at five weeks to 3.5 hours at three months. Only about 10 percent of infants slept continuously for five hours or more at five weeks, compared to 45 percent at three months.
At both ages, about one-fourth of the infants woke up and resettled themselves without parental help at least once during the night. These babies were able to fall back asleep with little or no crying/fussing.
“Self-resettling at five weeks predicted prolonged sleeping at three months,” the researchers write. Sixty-seven percent of infants who resettled in the first recording slept continuously for at least five hours in the second recording, compared to 38 percent who could not resettle.
Infants were more likely to suck their fingers or hands at three months compared to five weeks. Infants who slept through the night at three months spent more time sucking their fingers or hands, a self-regulatory strategy that may help them initiate or maintain sleep.
Prior research has shown that breast-fed infants wake up at night because they need to eat frequently. However, the new results showed no difference in resettling or sleep times for infants fed breast milk versus formula.
Previous video studies have shown that what parents call “sleeping through the night” is a misnomer; older babies who sleep through the night not only sleep for longer periods, but also are able to resettle themselves after waking.
The new study confirms that some babies develop this resettling ability in the first three months of age.
“Findings indicate the need for studies of how arousal, waking, and resettling develop into sustained sleeping, and of how environmental factors support these endogenous and behavioral processes,” said St. James-Roberts.
“If they fulfill their promise, the findings may eventually help to resolve the puzzle of why so many healthy infants should be diagnosed with sleep problems, as well as helping the families involved.”
Source: Wolters Kluwer Health
A new study quantifies the harm a cerebrovascular accident, or stroke, does to the brain: researchers found that having a stroke reduces cognitive capability by almost eight years.
The losses include memory and thinking speed, as measured on cognitive tests.
University of Michigan investigators discovered that among both black and white patients, having had a stroke meant that their score on a 27-item test of memory and thinking speed had dropped as much as it would have if they had aged 7.9 years overnight.
Investigators reviewed data from more than 4,900 black and white seniors over the age of 65. The results will be published in a forthcoming issue of the journal Stroke.
Researchers used two sources of information for their analysis: detailed surveys and tests of memory and thinking speed over multiple years from participants in a large, national study of older Americans, and Medicare data from the same individuals.
Investigators focused on the 7.5 percent of black study participants, and the 6.7 percent of white participants, who had no recent history of stroke, dementia, or other cognitive issues, but who suffered a documented stroke within 12 years of their first survey and cognitive test in 1998.
By measuring participants’ changes in cognitive test scores over time from 1998 to 2012, the researchers could see that both blacks and whites did significantly worse on the test after their stroke than they had before.
Although the size of the effect was the same among blacks and whites, past research has shown that the rates of cognitive problems in older blacks are generally twice that of non-Hispanic whites.
So the new results mean that stroke doesn’t account for the mysterious differences in memory and cognition that grow along racial lines as people age. The researchers say the findings underscore the importance of stroke prevention.
“As we search for the key drivers of the known disparities in cognitive decline between blacks and whites, we focus here on the role of ‘health shocks’ such as stroke,” says lead author and University of Michigan Medical School assistant professor Deborah Levine, M.D., M.P.H.
“Although we found that stroke does not explain the difference, these results show the amount of cognitive aging that stroke brings on, and therefore the importance of stroke prevention to reduce the risk of cognitive decline.”
Other research on disparities in cognitive decline has focused on racial differences in socioeconomic status, education, and vascular risk factors such as diabetes, high blood pressure, and smoking that can all contribute to stroke risk.
These factors may explain some but not all of the racial differences in cognitive decline.
Levine and her colleagues note that certain factors — such as how many years a person has vascular risk factors, and the quality of his or her education, as well as genetic and biological factors — might play a role in racial differences in long-term cognitive performance.
Despite the uncertainty in reasons for cognitive decline, one thing is clear: Strokes have serious consequences for brain function.
The study shows that, on average, a stroke robs the brain of eight years of cognitive health. People of all racial and ethnic backgrounds can therefore benefit from taking steps to reduce their risk of stroke.
This can be accomplished by controlling blood pressure and cholesterol, stopping or avoiding smoking, controlling blood sugar in diabetes, and being active even in older age.
Source: University of Michigan
A new study suggests that getting a poor night’s sleep does more than make you feel grumpy and eat more the next day.
Researchers discovered that poor sleep habits among people with knee osteoarthritis (OA) appear to increase their sensitivity to pain, amplifying their discomfort.
The study has been published in the journal Arthritis Care & Research.
OA, a degenerative joint disease that causes pain and swelling of the joints in the hands, hips, or knees, affects nearly 27 million Americans 25 years of age and older.
Experts say that roughly one third of older adults have knee OA, a leading cause of pain and disability worldwide. Researchers believe that central sensitization, which is a hypersensitivity to pain, may contribute to the clinical pain amplification in OA.
“Our study is the largest and most comprehensive examination of the relationship between sleep disturbance, catastrophizing and central sensitization in knee OA,” said lead author Claudia Campbell, Ph.D.
The current case-control study included 208 participants who were categorized into four groups: OA patients with insomnia, OA patients with normal sleep habits, healthy controls with insomnia, and healthy controls with normal sleep and no pain syndrome.
Seventy-two percent of the participants were female. Participants completed sleep assessments, psychological and pain evaluations, and sensory testing.
Researchers found that subjects with knee OA and insomnia had the greatest degree of central sensitization compared to the controls.
The team found patients with poor sleep and high catastrophizing scores reported increased levels of central sensitization. In turn, central sensitization was significantly associated with increased clinical pain.
Said Campbell, “While no causal processes may be determined from this study, our data suggest that those with low sleep efficiency and higher catastrophizing have the greatest central sensitization.
Understanding the intricate relationship between sleep, central sensitization, and catastrophizing has important clinical implications for treating those with chronic pain conditions such as knee OA.”
A new study provides some timely suggestions on improving self-control for ethical decision-making.
Researchers discovered that being aware of a temptation before it arises and thinking about the long-term consequences of misbehaving could help more people do the right thing.
The study by University of Chicago Professor Ayelet Fishbach, Ph.D., and Rutgers Professor Oliver J. Sheldon, Ph.D., is the first to test how the two separate factors of identifying an ethical conflict and preemptively exercising self-control interact in shaping ethical decision-making.
The article was recently published in the Personality and Social Psychology Bulletin.
In a series of experiments that included common ethical dilemmas, such as calling in sick to work and negotiating a home sale, the researchers found that two factors together promoted ethical behavior.
They found that participants who identified a potential ethical dilemma as connected to other similar incidents and who also anticipated the temptation to act unethically were more likely to behave honestly than participants who did not.
“Unethical behavior is rampant across various domains ranging from business and politics to education and sports,” said Fishbach.
“Organizations seeking to improve ethical behavior can do so by helping people recognize the cumulative impact of unethical acts and by providing warning cues for upcoming temptation.”
In one experiment, business school students were divided into pairs as brokers for the buyer and seller of a historic New York brownstone.
The dilemma: The seller wanted to preserve the property while the buyer wanted to demolish it and build a hotel. The brokers for the seller were told to only sell to a buyer who would save the brownstone, while the brokers for the buyer were told to conceal the buyer’s plan to develop a hotel.
Before the negotiations began, half of the students were asked to recall a time when they had cheated or bent the rules to get ahead. Only 45 percent of the students who thought about their ethics ahead of time behaved unethically in the negotiations, while 67 percent of the students who weren’t reminded of an ethical temptation in advance lied in order to close the deal.
In another experiment involving workplace scenarios, participants were less likely to say it is OK to steal office supplies, call into work sick when they aren’t really ill, or intentionally work slowly to avoid additional tasks, if they anticipated an ethical dilemma through a writing exercise in advance and if they considered a series of six ethical dilemmas all at once.
In other words, people are more likely to engage in unethical behavior if they believe the act is an isolated incident and if they don’t think about it ahead of time.
The results of the experiments have the potential to help policymakers, educators and employers devise strategies to encourage people to behave ethically.
For example, a manager could control costs by emailing employees before a work trip to warn them against the temptation to inflate expenses.
The notice could be even more effective if the manager reminded employees that the urge to exaggerate expenses is a temptation they will encounter repeatedly in the future.
African-American women are more likely to suffer through infertility issues alone, according to a new University of Michigan study. The findings also show that black women more often feel that infertility undermines their sense of self and gender identity.
The study may be among the first to focus only on black women and infertility, as most research has been conducted on affluent white couples seeking advanced medical help.
“Infertile African-American women are indeed hidden from public view,” said lead author Dr. Rosario Ceballo, a U-M professor of psychology and women’s studies.
For the study, Ceballo and colleagues Erin Graham and Jamie Hart interviewed 50 African-American women, ages 21 to 52, of different socioeconomic backgrounds about infertility and relationships with friends, relatives and doctors. Most of the women were married, and many had college degrees and worked full-time.
All of the participants had met the medical diagnosis for infertility, a condition in which a woman is unable to become pregnant after 12 or more months of regular, unprotected sex. The women spent from one to 19 years trying to conceive.
During the interviews, 32 percent of the women discussed stereotyped beliefs that equated being a woman with motherhood. Some responses included: “Emotionally, I felt that I was not complete, because I had not had a child. I didn’t feel like I was a complete woman,” and “It (having no biological children) would label you as a failure.”
For some women, infertility was infused with religious significance. They believed God intended women to produce children, which further heightened their sense of shame.
Nearly all of the women coped with infertility in silence and isolation, even when a friend or relative knew about it. Participants also believed that infertility was not as emotionally painful for their husbands and partners, who were not interviewed for the study.
Researchers noted that some women, especially those who could not conceive again after having a child, stayed silent because discussing it did not elicit sympathy or empathy.
“Women may also reason that other people can neither change their infertility status nor understand what they were experiencing,” Ceballo said.
Other reasons for silence about infertility may have to do with cultural expectations about strong, self-reliant black women who can deal with problems on their own and with notions about remaining private in African-American communities, she said.
In the interviews, for example, respondents said, “You don’t want people in your business” and “I never said anything to anyone else because in our culture…it was not something that you shared.”
Regarding their interactions with doctors and medical professionals, about 26 percent believed that encounters may have been influenced by gender, race and/or class discrimination. These women spoke about doctors who made assumptions about sexual promiscuity and inability to pay for services or support a child.
One surprising finding was that highly educated women who were well-off financially were just as likely as low-income African-American women to report discrimination in medical settings. Furthermore, the cost of fertility treatment was prohibitively high for most participants.
Overall, when black women were unable to become pregnant, it negatively affected their self-esteem. They saw themselves as abnormal, in part, because they did not see other people like themselves — African-American, female and infertile — in social images, Ceballo said.
Source: University of Michigan
Children with autism spectrum disorder (ASD) are often picky eaters, which can lead parents to worry that they aren’t getting the right amounts of vitamins and minerals. This sometimes leads parents to try nutritional supplements and dietary regimens, such as gluten-free and casein-free (GFCF) diets, without professional supervision.
But a new study published in the Journal of the Academy of Nutrition and Dietetics found that this approach often produces both nutrient insufficiencies and excesses.
For example, researchers found that despite supplementation, children with ASD were deficient in calcium, while some were consuming excessive amounts of vitamin A and other nutrients.
“Many families try a GFCF diet in an attempt to improve symptoms of ASD,” said lead investigator Patricia A. Stewart, Ph.D., R.D., an assistant professor of pediatrics at the University of Rochester Medical Center in New York. “While 19 percent of all Autism Speaks Autism Treatment Network (AS ATN) participants were reported to be on a GFCF diet, 12 percent of the children in the subgroup participating in this study were given a GFCF diet and were significantly more likely to use nutritional supplements — 78 percent vs 53 percent — however, the micronutrient intake of children on or off the diet was remarkably similar.”
The researchers recruited 368 children between the ages of 2 and 11 from five AS ATN sites at Cincinnati Children’s Hospital, University of Arkansas, University of Colorado, University of Pittsburgh and University of Rochester. All had been diagnosed with autistic disorder, Asperger disorder, or pervasive developmental disorder.
Three-day food records were completed for the children by their caregivers. A registered dietitian nutritionist trained the caregivers to record the amount of all foods, beverages and nutritional supplements consumed, including brand names and recipes used for food preparation.
In the case of nutritional supplements, photographs of the labels were taken to ensure that ingredients were accurately recorded, the researchers reported. Registered dietitian nutritionists verified the records and called the parents if clarification was needed.
Examining these detailed eating records, investigators found that the children were consuming amounts of micronutrients similar to those of children without ASD. They also had the same deficits in vitamins D and E, calcium, potassium, and choline as the general population.
Although autistic children are given supplements more often — 56 percent vs. 31-37 percent of the general population — even after supplementation, 40 percent to 55 percent were lacking in calcium and 30 percent to 40 percent were lacking in vitamin D, according to the study’s findings.
Children on the GFCF diet consumed more magnesium and vitamin E, the researchers reported, noting this may be due to the substitution of soy and nut-based products. Children on this diet were more adequately supplemented with vitamin D. Calcium supplementation was equally inadequate in those on and off the diet, the researchers added.
Despite different eating behaviors, autistic children received much of their needed micronutrients from their food. This might be due to the high levels of fortification in the modern food supply, where vitamins and minerals are often added, the researchers theorized.
This fortification may also be responsible for the overconsumption of certain nutrients by children with ASD, researchers noted. For the supplement users in this study, many exceeded the Tolerable Upper Limit for safe intake levels of vitamin A, folic acid, and zinc, according to the study’s findings.
“In clinical practice, each patient needs to be individually assessed for potential nutritional deficiencies or excess,” Stewart said. “Few children with ASD need most of the micronutrients they are commonly given as multivitamins, which often leads to excess intake that may place children at risk for adverse effects. When supplements are used, careful attention should be given to adequacy of vitamin D and calcium intake.”
Source: Elsevier Health Sciences
Women with obesity are at greater risk for health problems during pregnancy including depression, gestational diabetes and high blood pressure compared with healthy weight women, according to a new analysis led by Trinity College Dublin.
The paper, published in the journal Obesity Reviews, recommends that women with obesity should lose weight before they conceive and highlights the current lack of support available to these women.
Maternal obesity is associated with a range of health problems for both mothers and babies during pregnancy, delivery and the postnatal period. Problems can include gestational diabetes, high blood-pressure, pre-eclampsia, depression, higher levels of instrumental and caesarean birth, and surgical site infection.
Maternal obesity is also linked to greater risk of preterm birth, large-for-gestational-age babies, fetal defects, congenital anomalies, and perinatal death. Furthermore, breastfeeding initiation rates are lower and there is greater risk of early breastfeeding cessation in women with obesity compared with healthy weight women.
The findings also showed that maternal obesity is the most significant factor leading to obesity in the women’s children and, coupled with excessive weight gain in pregnancy, also results in long-term obesity for the women themselves.
“Up to 1 in 5 pregnant women in Ireland suffer from obesity, a serious health problem that is not currently being adequately addressed and that can have significant implications for both them and their babies,” said Dr. Cecily Begley, author of the study and Chair of Nursing and Midwifery in the School of Nursing and Midwifery, Trinity.
“However, it is important not to stigmatize women because of their weight. We need to provide pre-conceptual health education, through national subsidized programs, to support and encourage women with a high BMI to lose weight before they conceive. The benefits for them and their babies can be significant.”
For the study, the researchers produced a systematic overview of 22 systematic reviews, which together examined 573 research studies comparing outcomes between pregnant women with obesity and those of healthy weight. The result is an extensive review of the risks associated with maternal obesity in terms of physical and mental health problems in both mother and baby.
“The potential complications of obesity in pregnancy can lead to longer duration of hospital stay and greater costs. Given the high proportion of pregnant women with obesity, it is crucial to invest in weight loss support for these women, to reduce the risks for mothers and babies,” said Professor Michael Turner, clinical lead for the National Clinical Programme in Obstetrics and Gynaecology in Ireland.
Source: Trinity College Dublin
After a concussion, a person can be left with disturbed sleep, memory deficits and other cognitive problems for years, but a new study shows that sleep can still help them overcome memory deficits.
According to researcher Rebecca Spencer, Ph.D., at the University of Massachusetts Amherst, the benefit is equivalent to that seen in individuals without a history of mild traumatic brain injury (TBI), also known as concussion.
Spencer, with graduate student Janna Mantua and undergraduates Keenan Mahan and Owen Henry, found that individuals who had sustained a mild TBI more than a year earlier had greater recall in a word memorization task after they had slept.
“It is interesting to note that despite having atypical or disturbed sleep architecture, people in our study had intact sleep-dependent memory consolidation,” she said. “Supporting opportunities to sleep following a concussion may be an important factor in recovery from cognitive impairments. The changes in sleep architecture we observed are in an optimal direction, that is, more rich, slow wave sleep and less light or Stage 1 sleep, (which) is a shift in the positive direction.”
The researchers did notice differences in sleep in the participants who had a concussion. They spent a significantly greater part of the night in deep, slow-wave sleep, a sleep stage where memories are replayed and consolidated to long-term storage. However, their memory and recall ability was not significantly different from the participants who had not suffered a concussion, the researchers noted.
“Overall, sleep composition is altered following TBI, but such deficits do not yield insufficiencies in sleep-dependent memory consolidation,” the researchers wrote in the study.
For the study, researchers recruited 26 young adults 18 to 22 years old with a history of diagnosed TBI an average of three to four years earlier, and 30 people with no history of brain injury. All slept more than six hours a night, took few naps, drank moderate amounts of coffee and alcohol, and had no neurological disorders apart from the TBI in the concussion group, the researchers reported.
Participants learned a list of word pairs, and their memory for them was assessed 12 hours later. Half in each group learned the word pairs in the morning and were tested in the evening, while the other half learned them in the evening and were tested the next morning, after sleep.
Sleep stages were identified by polysomnography, attaching a set of electrodes to the head for physiological recordings during sleep.
While slow wave sleep was greater in those with a TBI, they also had less non-REM stage 1 sleep, a form of very light sleep seen during the wake-to-sleep transition, according to the study’s findings. This suggests that those with a concussion history can reach deep sleep sooner and get more of it, the researchers said.
For both those with a history of concussion and those without, recall was better following sleep than being awake in the daytime, according to the study’s findings.
“We know this is not just a matter of the time of day we tested them at as they were able to learn equally regardless of whether we taught them the task in the morning or the evening,” Spencer said.
How you react to eye contact with another person is largely connected to your personality traits, according to a new study by researchers at the University of Tartu in Estonia and the University of Tampere in Finland.
“Our findings indicate that people do not only feel different when they are the center of attention but that their brain reactions also differ. For some, eye contact tunes the brain into a mode that increases the likelihood of initiating an interaction with other people. For others, the effect of eye contact may decrease this likelihood,” said Jari Hietanen, Ph.D., of University of Tampere.
Eye contact plays a crucial role in communication and is a powerful social signal. Looking someone in the eye automatically sends a signal to the other person that your attention is focused on him or her. If the other person happens to look back, you engage in eye contact, and a channel for interaction is opened.
Prior research suggests that eye contact triggers patterns of brain activity connected to “approach” motivation, while seeing another person with his or her gaze averted triggers brain activity associated with “avoidance” motivation. This suggests that another person’s attention is something important and desirable. And yet many people find that being the focus of someone’s gaze is uncomfortable, and some may even experience high levels of anxiety.
For the new study, the researchers set out to determine what lies underneath these individual psychological differences. Does personality modulate how a person reacts to eye contact? Can this difference be measured by brain activity?
“In order to test this hypothesis, we conducted an experiment where the participants’ electrical brain activity was recorded while they were looking at another person who was either making eye contact or had her gaze averted to the side. We had assessed the participants’ personality with a personality test in advance,” said researcher Helen Uusberg.
The findings revealed that personality does indeed help determine how one’s brain will react to attention from another person. In participants who scored low in neuroticism, situations of eye contact triggered brain activity linked to ‘approach’ motivation. Neuroticism is the personality dimension related to anxiety and self-consciousness.
However, if the participant scored high on neuroticism, the eye contact triggered more ‘avoidance’ brain activity patterns. The neurotic participants also wanted to look at the other person with a direct gaze for shorter periods of time and experienced more pleasant feelings when they saw a person with an averted gaze.
Source: Academy of Finland
A new study shows that a low-cost, non-profit weight loss program offers the kind of long-term results that often elude dieters.
“We know that people lose weight and then gain it back,” said study author Nia S. Mitchell, M.D., M.P.H., a researcher with the Division of General Internal Medicine at the Anschutz Health and Wellness Center at the University of Colorado. “In this case, we found that people who renewed their annual membership in the program lost a clinically significant amount of weight and kept it off.”
“Clinically significant” weight loss is defined as losing 5 percent or more of one’s body weight, because weight-related medical conditions, such as diabetes, can improve with that level of weight loss, she explained.
Mitchell’s study focused on Take Off Pounds Sensibly (TOPS), a weight loss program led by volunteers and costing $92 a year (a $32 annual fee plus local chapter dues averaging about $5 a month).
The study, which included about 75,000 participants, focused on those who renewed their annual memberships consecutively for up to seven years. Mitchell found that 50 percent of them had clinically significant weight loss in their first year in TOPS, and 62 percent of those who stayed with the program maintained that after seven years.
Unlike many commercial and academic programs, there is a minimal difference between the weight-loss and weight-maintenance phases of the TOPS program, reinforcing weight management behaviors, according to Mitchell.
“Despite decades of obesity research, two issues remain elusive in weight management: Significant, long-term weight-loss maintenance and widely accessible programs,” she said. “To reverse this epidemic we need to find programs that are effective at weight loss and maintenance, low-cost, and easy to implement and disseminate widely.”
According to Mitchell, TOPS appears to provide effective weight loss, weight loss maintenance and affordability, which can be especially important to low-income, minority and rural populations that may not have access to a structured weight loss program.
“As long-term weight loss is difficult to achieve in any clinical circumstance, TOPS may be a viable option to treat those who are overweight or obese,” she said.
The study was published in the American Journal of Preventive Medicine.
Source: University of Colorado Denver
A new study has found that one-third of patients admitted to an intensive care unit (ICU) will develop delirium, which lengthens hospital stays and substantially increases the risk of dying in the hospital.
“Every patient who develops delirium will on average remain in the hospital at least one day longer,” said Robert Stevens, M.D., a specialist in critical care and an associate professor at the Johns Hopkins University School of Medicine.
Worse, he added, is “if you’re admitted to the intensive care unit and you develop brain dysfunction, your risk of not surviving your hospital stay is doubled.”
Delirium is a type of brain dysfunction characterized by a sudden onset, fluctuating symptoms, inattention and confusion.
For the new study, Stevens led an interdisciplinary team of researchers who sifted through 10,000 published reports before selecting 42 studies that met their specific criteria. For instance, they eliminated any studies that included patients with head injuries, strokes or other neurological disorders to obtain a more precise estimate of delirium in ICU patients.
That left the researchers with 16,595 patients, of whom 5,280 — or 32 percent — had confirmed cases of delirium. The researchers then conducted a meta-analysis, which found that delirium was associated with a twofold increase in the risk of dying in the hospital, even after adjusting for severity of illness.
One of the best-known causes of delirium is medication, particularly sedatives, according to the researchers. For instance, benzodiazepines, which are commonly administered to help patients calm down and sleep, may lead to disorientation and confusion.
The goal moving forward should be to reduce or eliminate the use of such potentially harmful medications, particularly among higher risk populations, such as the elderly and individuals with dementia, according to Stevens.
Nighttime interruptions should also be kept to a minimum to ensure that patients get a good night’s rest without sedatives, he said.
Other causes of delirium, however, might be harder to address, he noted.
According to the inflammatory hypothesis, illnesses occurring outside the brain, such as severe pneumonia, can lead to inflammation in the brain. Another theory is that delirium is linked to changes in the flow of blood to the brain, sometimes resulting in strokes that are not recognized, the researchers said.
The new study also found that among patients who develop delirium, the risk of long-term cognitive decline increases by 20 to 30 percent.
“We’re seeing that even though you may have a very severe illness or injury and you’re lucky enough to survive, you’re still not quite out of the woods,” Stevens concluded. “We need to think about the measures we can put into place to decrease these long-term burdens.”
The study was published in the British Medical Journal.
Source: Johns Hopkins Medicine
Powerful people are quick to respond to unfair treatment when they are the ones being victimized, but they are less likely to notice injustice when others are victimized or when they benefit from the situation, according to a new study published by the Society for Personality and Social Psychology.
“Powerful people are only faster to notice unfair situations when they’re the victims,” said lead researcher Takuya Sawaoka, a doctoral student in psychology at Stanford University.
“Our findings also suggest that powerful people are slower to notice unfair situations that victimize other people, and this converges with other research demonstrating that the powerful are less empathetic to the plight of others.”
In four experiments, participants who were primed to think of powerful situations recognized unfair treatment more quickly when it affected them, and were more likely than powerless participants to take action to avoid disadvantageous situations. The findings were similar for men and women, and because most of the participants were white, the results were not driven by race.
In another experiment with 227 participants, the high-power group wrote about a situation in which they had power over someone else while the low-power group wrote about an experience when someone had power over them.
Each participant then played a computer game where their reaction times were measured in deciding the fairness of the distribution of coins between the participant and two computer-generated players. The high-power group responded more quickly than the low-power group when they were the victims of unfairness but not when they benefitted from an unfair distribution of the imaginary wealth.
In another task, 100 participants played a game in which they were either beneficiaries or victims of an unfair distribution of wages by an employer. When participants were treated unfairly, the high-power group switched more quickly to another employer, while the low-power group stayed with the same employer longer even though they had received less pay.
Since the writing exercises designed to make participants feel powerful or powerless only had short-term effects, the differences between powerful and powerless people are probably greater in the real world, where powerless people often are overwhelmed by unfair treatment on a daily basis, Sawaoka said.
The findings help explain the ongoing problem of income inequality and “white privilege” in American society, said Sawaoka.
“Since whites tend to occupy powerful or advantaged positions in society, this fosters a sense of entitlement, and powerful people come to believe that they deserve better outcomes than others,” he said. “Thus, whites may be very quick to notice and respond to perceived injustices, but this entitlement also could make them less likely to notice injustices that victimize minorities.”
“People who are repeatedly victimized by unfairness are going to end up with fewer resources and opportunities,” he said. “Effectively responding to unfair situations (e.g., by seeking out more equitable outcomes) could enable the powerful to maintain their higher social standing.
“In contrast, because powerless people are slower to perceive and respond to unfairness, they may become more vulnerable to exploitation. These processes could end up perpetuating gaps between the powerful and powerless.”
Being bullied in the teen years is strongly linked to depression in young adulthood, according to a new study published in The BMJ.
There is a rapid increase in depression from childhood to adulthood and one contributing factor may be bullying by peers. But the link between bullying at school and adult depression has remained unclear due to research limitations.
This led a team of scientists to conduct one of the largest studies on the association between bullying by peers in teenage years and depression in early adulthood.
The researchers, headed by Lucy Bowes, Ph.D., at the University of Oxford, examined the relationship between bullying at 13 years and depression at 18 years. They did this by analyzing bullying and depression data on 3,898 participants in the Avon Longitudinal Study of Parents and Children (ALSPAC), a UK community based birth cohort.
At 13 years old, the participants completed a self-report questionnaire about bullying. Then at 18 years, they completed an assessment that identified individuals who met internationally agreed criteria for depression.
Of the 683 teenagers who had been bullied frequently (more than once a week) at 13 years, 14.8 percent were experiencing depression at 18 years. Of the 1,446 teens who had experienced occasional bullying (one to three times over six months) at 13 years, 7.1 percent were depressed at 18. Only 5.5 percent of teenagers who had not been bullied were depressed at 18 years.
Around 10.1 percent of frequently bullied teenagers suffered from depression for more than two years, compared with 4.1 percent from the non-bullied group.
Overall, 2,668 participants had reported on bullying and depression as well as other factors that may have caused their depression, such as previous bullying in childhood, mental and behavioral problems, family situations, and stressful life events.
When these factors were taken into account, frequently bullied teenagers still had about double the chances of depression compared with those who did not experience bullying. This connection was the same for both males and females.
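As a rough illustration only (this is a sketch from the article's reported percentages, not the study's adjusted analysis), the unadjusted odds ratio implied by the depression rates above can be computed directly; the roughly twofold figure the researchers cite is what remains after adjusting for confounders:

```python
def odds_ratio(p_exposed, p_unexposed):
    """Unadjusted odds ratio from two outcome proportions."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed

# Depression rates at age 18 reported in the article:
# 14.8% among frequently bullied teens, 5.5% among non-bullied teens.
or_unadjusted = odds_ratio(0.148, 0.055)
print(round(or_unadjusted, 2))  # ~2.98 before adjustment
```

The gap between this unadjusted figure (about 3) and the adjusted estimate (about 2) reflects how much of the raw association is accounted for by the other factors the researchers controlled for.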
The most common type of bullying was name calling; 36 percent experienced this, while 23 percent had belongings taken from them.
If this were a causal relationship then up to 30 percent of depression in young adults could be attributed to bullying in their teen years, explain the authors, adding that bullying could make a significant contribution to the overall burden of depression.
While no definitive conclusions can be drawn about cause and effect, the researchers say that interventions to reduce bullying in schools could reduce depression in later life.
Although insufficient sleep is often associated with increased caloric intake, new research suggests eating less late at night may help mitigate concentration and alertness deficits that accompany sleep deprivation.
Researchers at the Perelman School of Medicine at the University of Pennsylvania presented their study at SLEEP 2015, the 29th annual meeting of the Associated Professional Sleep Societies LLC.
“Adults consume approximately 500 additional calories during late-night hours when they are sleep restricted,” said the study’s senior author David F. Dinges, Ph.D., director of the Unit for Experimental Psychiatry and chief of the division of Sleep and Chronobiology.
“Our research found that refraining from late-night calories helps prevent some of the decline individuals may otherwise experience in neurobehavioral performance during sleep restriction.”
In the study, researchers gave 44 subjects, ages 21 to 50, unlimited access to food and drink during the day, followed by only four hours of sleep each night for three nights. On the fourth night, 20 participants received continued access to food and drinks, while the 24 others were allowed only to consume water from 10:00 p.m. until they went to sleep at 4:00 a.m.
At 2:00 a.m. each night, all subjects completed a variety of tests to measure their working memory, cognitive skills, sleepiness, stress level, and mood.
During the fourth night, subjects who fasted performed better on reaction time and attention lapses than subjects who had eaten during those late-night hours.
Researchers also discovered that subjects who ate showed significantly slower reaction times and more attention lapses on the fourth night of sleep restriction compared to the first three nights. In contrast, study subjects who had fasted did not show this performance decline.
While countless studies associate numerous physical and mental health benefits with a healthy night’s sleep, the Centers for Disease Control and Prevention reports that “insufficient sleep is a public health epidemic” in the United States, including the estimated 50 to 70 million U.S. adults suffering from sleep and wakefulness disorders.
The new study complements research on the links between eating and sleep deprivation. A prior study from the same Penn team found that individuals with late bedtimes and chronic sleep restriction may be more susceptible to weight gain due to the increased consumption of calories during late-night hours.
In a related study, the same team of Goel, Spaeth and Dinges, found that adults who are chronically sleep restricted may need to compensate for decreased morning resting metabolic rate by reducing caloric intake or increasing physical activity to prevent weight gain.
“Short sleep duration is a significant risk factor for weight gain and obesity, particularly in African Americans and men,” says senior author Namni Goel, Ph.D.
“This research suggests that reducing the number of calories consumed can help prevent that weight gain and some of the health issues associated with obesity in Caucasians and particularly in African Americans.”
The NIH reports that 69 percent of U.S. adults are overweight or obese. Being overweight or obese increases your risk of coronary heart disease, high blood pressure, stroke, type II diabetes, cancer, sleep apnea, and other health problems.
Source: University of Pennsylvania
Although the practice of new mothers eating the placenta has become trendy, with celebrities such as Kourtney Kardashian blogging and raving about the benefits of their personal placenta ‘vitamins,’ a new medical review fails to uncover data supporting anecdotal reports of curative benefits.
Researchers from Northwestern Medicine reviewed 10 current published research studies on placentophagy and found no data to support common claims that eating the placenta (raw, cooked, or encapsulated) offers protection against a variety of ills.
The literature provided no support for claims that placenta ingestion relieves postpartum depression, reduces post-delivery pain, provides energy, aids lactation, promotes skin elasticity, enhances maternal bonding, or replenishes iron in the body.
Moreover, the scientists were concerned by the absence of studies examining the risks of ingesting the placenta, a practice known as placentophagy. The placenta acts as a filter, absorbing toxins and pollutants to protect the developing fetus.
The study has been published in the journal Archives of Women’s Mental Health.
“There are a lot of subjective reports from women who perceived benefits, but there hasn’t been any systematic research investigating the benefits or the risk of placenta ingestion,” said corresponding study author Dr. Crystal Clark.
“The studies on mice aren’t translatable into human benefits.”
Clark is assistant professor of psychiatry and behavioral sciences at Northwestern University Feinberg School of Medicine and a psychiatrist specializing in reproduction-related mood disorders at Northwestern’s Asher Center for the Study and Treatment of Depressive Disorders.
Placentophagy poses unknown risks for the women who practice it and, if they are breastfeeding, for their infants.
“Our sense is that women choosing placentophagy, who may otherwise be very careful about what they are putting into their bodies during pregnancy and nursing, are willing to ingest something without evidence of its benefits and, more importantly, of its potential risks to themselves and their nursing infants,” said lead author Cynthia Coyle, a Feinberg faculty member and a psychologist.
“There are no regulations as to how the placenta is stored and prepared, and the dosing is inconsistent,” Coyle said. “Women really don’t know what they are ingesting.”
Research is needed to provide the answers, Coyle said. She also hopes the study sparks conversations between women and their physicians about their post-birth plans, so doctors can inform their patients about the science or lack thereof and support patients in their decision-making process.
Clark became interested in placentophagy after some of her pregnant patients asked if eating their placentas would interfere with their antidepressant medications. She was unfamiliar with the practice and began to ask her other patients about it.
“I was surprised that it was more widespread than I anticipated,” Clark said.
Although almost all non-human placental mammals ingest their placenta after giving birth, the first documented accounts of postpartum women practicing placentophagy were in North America in the 1970s, the study reports. In recent years, advocates and the media have popularized health benefits of the practice, and more women are considering it as an option for postpartum recovery.
“The popularity has spiked in the last few years,” Clark said. “Our sense is that people aren’t making this decision based on science or talking with physicians. Some women are making this based on media reports, blogs, and websites.”
The authors of this paper are currently gathering data on the perceptions, beliefs, and placental practices of health care providers internationally and nationally, as well as patients locally, and whether providers are recommending placentophagy to patients.
A new study finds that while activities such as walking, aerobics/calisthenics, biking, gardening, golfing, running, weight-lifting, and yoga/Pilates are associated with better sleep habits, some activities actually may harm sleep quality.
Researchers at the Perelman School of Medicine at the University of Pennsylvania (Penn) discovered that activities such as household and childcare work are associated with an increased likelihood of poor sleep.
The study will be presented at SLEEP 2015, the annual meeting of the Associated Professional Sleep Societies LLC.
The new study breaks down physical activity — normally associated with healthy sleep — and provides detail into activities that significantly help sleep and those that may cause people to lose sleep.
The new study, led by Michael Grandner, Ph.D., looked at data on sleep and physical activities of 429,110 adults from the 2013 Behavioral Risk Factor Surveillance System. From this data set, the Penn researchers measured whether each of 10 types of activities was associated with a typical amount of sleep, relative to both no activity and to walking.
Survey respondents were asked what type of physical activity they spent the most time doing in the past month, and also asked how much sleep they got in a typical 24-hour period. Since previous studies showed that people who get less than seven hours are at greater risk for poor health and functioning, the study evaluated whether people who reported specific activities were more likely to also report sufficient sleep.
Compared to those who reported that they did not get physical activity in the past month, all types of activity except for household/childcare were associated with a lower likelihood of insufficient sleep.
To assess whether these effects are just a result of any activity, results were compared to those who reported walking as their main source of activity.
Compared to just walking, aerobics/calisthenics, biking, gardening, golf, running, weight-lifting, and yoga/Pilates were each associated with fewer cases of insufficient sleep, and household/childcare activity was associated with higher cases of insufficient sleep.
These results were adjusted for age, sex, education level, and body mass index.
“Although previous research has shown that lack of exercise is associated with poor sleep, the results of this study were surprising,” said Grandner.
“Not only does this study show that those who get exercise simply by walking are more likely to have better sleep habits, but these effects are even stronger for more purposeful activities, such as running and yoga, and even gardening and golf.
“It was also interesting that people who get most of their activity from housework and childcare were more likely to experience insufficient sleep — we know that home and work demands are some of the main reasons people lose sleep.”
“These results are consistent with the growing scientific literature on the role of sleep in human performance,” said Grandner.
“Lab studies show that lack of sleep is associated with poor physical and mental performance, and this study shows us that this is consistent with real-world data as well.
“Since these results are correlational, more studies are needed to help us understand whether certain kinds of physical activity can actually improve or worsen sleep, and how sleep habits help or hurt a person’s ability to engage in specific types of activity.”
Source: University of Pennsylvania
An emerging hot topic in the field of neurology is the use of brain imaging to help experts treat and care for cognitive decline in patients.
A new review suggests imaging studies can be used as a tool to help neurologists, psychiatrists, and other clinicians to measure and manage cognitive declines in patients.
Experts believe the imaging findings can motivate patients to make beneficial lifestyle changes to reduce risk for Alzheimer’s disease.
The concept that cognitive decline can be identified early and prevented by applying quantitative brain imaging techniques is the focus of “Hot Topics in Research: Preventive Neuroradiology in Brain Aging and Cognitive Decline,” a review published online in American Journal of Neuroradiology (AJNR).
In the review, an international team suggests that a framework in which neuroradiologists work as part of a team of clinical neuroscientists (neurologists, psychiatrists, neuropsychologists, etc.) can be an effective strategy for preventing cognitive decline in populations at high risk for dementia.
Researchers believe the application of quantitative neuroradiology will particularly aid individuals with lifestyle, genetic, and other associated risk factors.
“I believe neuroradiology, and especially quantitative MRI technology, will have a huge impact in the future of diagnosis and treatment of Alzheimer’s disease, since there is compelling evidence for the baseline size of hippocampus as a key determinant of risk for future cognitive decline, and since many lifestyle factors can cause atrophy or expansion in the volume of this critical brain structure,” says neurologist Majid Fotuhi, M.D., Ph.D., of Johns Hopkins University.
Such work is already happening at University of California, Los Angeles and other institutions that meld these approaches into novel ways to improve patient care.
“We are working closely with neuroradiologists to redefine how we can reduce risk for Alzheimer’s with quantitative neuroimaging that helps us pinpoint symptom-relevant volume loss in the brain and subsequent targets for tracking our lifestyle-based interventions,” says Dr. David Merrill, a geriatric psychiatrist at University of California, Los Angeles Medical Center.
“Recent advances have improved the ability to characterize imaging markers along the trajectory of Alzheimer’s disease, starting in the pre-clinical phase. These markers, including structural, functional, and molecular imaging, are being used in the AD diagnostic criteria,” says Howard Aizenstein, M.D., Ph.D., a psychiatrist at the University of Pittsburgh.
Fotuhi sees imaging findings as a unique motivator for patients to make positive lifestyle changes.
“Patients seem to enjoy reviewing results of their imaging studies, more so than reading the results of their blood tests or other clinical evaluations. For example, they can see with their own eyes whether there are any strokes or atrophy in their brain. This can have a powerful impact on them and on their determination to make changes in their lifestyle in order to improve their brain health,” he adds.
Experts explain that as many as three million cases of Alzheimer’s dementia worldwide could be prevented with as little as a 10 percent reduction in the burden of preventable lifestyle risk factors.
Lifestyle risk factors that can be altered to potentially prevent cognitive declines are obesity, diet, sleep, hypertension, diabetes, depression, supplementation, smoking, and physical activity.