Tri-County Services

In The News

Psychology, psychiatry and mental health news and research findings, every weekday.

Physical Restraint, Acute Care Drugs Still Part of Treatment in Mental Health Units

Tue, 07/07/2015 - 6:15am

A new Canadian study finds that psychiatric hospital providers continue to use intervention techniques such as physical restraint and confinement.

Although the use of physical restraint has declined in nursing homes, the practice persists in hospitals, where clinicians use restraints to prevent patient falls, forestall disruption of therapy, or control disruptive behavior.

New research suggests the risks of such a practice outweigh the benefits as the intervention can cause harm to both patients and care facilities, according to University of Waterloo researchers.

The study, which appears in a special mental health issue of Healthcare Management Forum, found that almost one in four psychiatric patients in Ontario hospitals are restrained using control interventions. These include chairs that prevent rising, wrist restraints, seclusion rooms, or acute control medications.

“The latest findings show that the use of restraints and medications as control interventions is still an everyday practice in inpatient mental health units,” said Dr. John Hirdes, of the Faculty of Applied Health Sciences at Waterloo and senior author on the paper.

The research reveals that Ontario health-care providers administer acute control medication to almost 20 percent of psychiatric patients in order to manage dangerous situations.

“Control interventions are not ideal because they counter a patient-centered approach to care and can damage therapeutic relationships while further stigmatizing patients,” said Dr. Tina Mah, lead author and vice president of planning, performance management and research at Grand River Hospital.

“There are also organizational implications of control interventions use including increased costs of care, reputational harm and exposure to potential litigation.”

The study suggests that earlier detection of illness or deterioration would help avoid patient crisis and minimize the use of physical restraint, sedation or seclusion. In addition, health-care providers should not use control interventions when a psychiatric emergency is not present.

“Health-care leaders need to pursue more patient-centered approaches to the provision of mental health services. The Mental Health and Addictions Quality Initiative is a positive example of collaboration by hospitals to improve the quality of mental-health services, including in the area of control intervention use,” said Mah.

Source: University of Waterloo

Why Women Live Longer than Men

Tue, 07/07/2015 - 5:30am

A new study suggests that women, across the globe, live longer than men due to differences in heart disease.

According to researchers, the disparity in longevity is a relatively new phenomenon, emerging within the last 150 years.

In the review, researchers discovered that significant differences in life expectancies between the sexes first emerged as infectious disease prevention, improved diets, and other positive health behaviors were adopted by people born during the 1800s and early 1900s.

While death rates fell for both genders, women began to reap the benefits of longevity at a much greater rate.

In the wake of this massive but uneven decrease in mortality, a review of global data points to heart disease as the culprit behind most of the excess deaths documented in adult men, said gerontologist Dr. Eileen Crimmins of the University of Southern California (USC).

“We were surprised at how the divergence in mortality between men and women, which originated as early as 1870, was concentrated in the 50 to 70 age range and faded out sharply after age 80,” Crimmins said.

Research was conducted with Caleb Finch, Ph.D., a USC professor in the neurobiology of aging, and Hiram Beltrán-Sánchez, Ph.D., of the Center for Demography of Health and Aging at the University of Wisconsin-Madison. It examined the lifespans of people born between 1800 and 1935 in 13 developed nations.

When the researchers looked at mortality in adults over the age of 40, they found that in individuals born after 1880, female death rates decreased 70 percent faster than those of males.

Even when the researchers controlled for smoking-related illnesses, cardiovascular disease appeared to still be the cause of the vast majority of excess deaths in adult men over 40 for the same time period. Surprisingly, smoking accounted for only 30 percent of the difference in mortality between the sexes after 1890, Crimmins said.

The uneven impact of cardiovascular illness-related deaths on men, especially during middle and early older age, raises the question of whether men and women face different heart disease risks due to inherent biological risks and/or protective factors at different points in their lives, Finch said.

“Further study could include analysis of diet and exercise activity differences between countries, deeper examination of genetics, and biological vulnerability between sexes at the cell level, and the relationship of these findings to brain health at later ages,” he said.

The study appears in the Proceedings of the National Academy of Sciences and was supported by the National Institute on Aging.

Source: USC/EurekAlert

REM Sleep Can Be Critical to Memory Formation in Young Brains

Mon, 07/06/2015 - 8:30am

Rapid eye movement or REM sleep actively converts waking experiences into lasting memories and abilities in young brains, according to new research.

Washington State University researchers say the finding expands the understanding of children’s sleep needs and calls into question the increasing use of REM-disrupting medications such as stimulants and antidepressants.

The National Institutes of Health-funded study appears in the journal Science Advances.

Professor of medical sciences Dr. Marcos Frank of Washington State University said scientists have known that infant animals spend much of their early life in REM sleep, but little was understood about the actual nuts and bolts of REM’s ability to change or recombine memories.

Providing new insights, Frank and his colleagues documented the effects of sleep on vision development in young animals. The researchers found that brain circuits change in the visual cortex as animals explore the world around them, but that REM sleep is required to make those changes “stick.”

The scientists showed that the changes are locked in by ERK, an enzyme that is activated only during REM sleep.

“REM sleep acts like the chemical developer in old-fashioned photography to make traces of experience more permanent and focused in the brain,” said Frank.

“Experience is fragile,” he said. “These traces tend to vanish without REM sleep and the brain basically forgets what it saw.”

Frank said young brains, including those of human children, go through critical periods of plasticity, or remodeling, when vision, speech, language, motor skills, social skills, and other higher cognitive functions are developed.

The study suggests that during these periods, REM sleep helps growing brains adjust the strength or number of their neuronal connections to match the input they receive from their environment, he said.

The new revelations do have historical antecedents.

Frank said that in the 1960s, surgeons noticed that delayed removal of congenital cataracts in children resulted in severe problems such as double vision and the inability to align the eyes.

“The visual cortex is very sensitive to information it is receiving and there are critical periods for its development,” he said. “If vision is blocked at these stages, then problems result.”

Researchers used a model based on that finding to determine the specific effects of REM sleep on vision development. Animals had a patch placed over one eye and their brain activity was monitored both while awake and during sleep.

While in REM sleep, the animals were awakened intermittently by gentle tapping on their enclosures. Controls were awakened during non-REM sleep.

Analyses showed that normal vision did not develop in animals experiencing a REM sleep deficit.

“Without REM sleep, permanent plastic changes to the visual cortex did not occur and the ERK enzyme did not activate,” said Frank.

Previously, the researchers had determined that ERK works by turning neuronal genes into proteins, which solidify the brain changes.

Frank was surprised to also discover brain activity patterns occurring in REM sleep that were similar to those seen when the animals were awake.

“It’s as if the neurons were dreaming of their waking experience,” he said.

“This is the first time these similar events have been reported to occur in the developing brain during REM sleep,” said Frank. “Up till now, there has not been strong evidence to show that waking experience reappears during REM sleep.”

He said REM sleep may be important for the development of other parts of the brain beyond the visual cortex and its effects may continue throughout a lifetime.

The study “has big implications for our understanding of sleep in children,” said Frank.

“There is a lot of data accumulating that says the amount of sleep a child gets impacts his/her ability to do well in school,” he said. “This study helps explain why this might be, and why we should be cautious about restricting sleep in our children.

“We know there are different times in a child’s development when sleep needs increase — they are very high in babies but also in adolescents when their brains are changing rapidly,” he said.

“Also, it is becoming more common for pediatricians to give compounds that affect brain activity earlier in life, not just Ritalin for attention deficit disorder, but also antidepressants and other drugs,” said Frank.

“The fact is, we have very little pre-clinical research data to tell us what these drugs are doing to developing brains in both the short and long term,” he said.

“Almost all of these compounds can potentially suppress sleep and REM sleep in particular. REM sleep is very fragile — it can be inhibited by drugs very easily,” he said.

Source: Washington State University

Recurrent Depression Linked to Smaller Hippocampus

Mon, 07/06/2015 - 7:45am

A new international study has found that people with recurrent depression have a significantly smaller hippocampus — the part of the brain responsible for forming new memories — than those with a first depressive episode or no depression.

“This large study confirms the need to treat first episodes of depression effectively, particularly in teenagers and young adults, to prevent the brain changes that accompany recurrent depression,” said Dr. Ian Hickie, co-director of the Brain and Mind Research Institute (BMRI).

The research, conducted by University of Sydney scholars at BMRI, is the largest international study to compare brain volumes in people with and without major depression. It highlights the need to identify and treat depression effectively when it first occurs, particularly among teenagers and young adults.

“This is another reason that we need to ensure that young people receive effective treatments for depression, a key goal of our Centre of Research Excellence in Optimising Early Interventions for Young People with Emerging Mood Disorder,” said Hickie.

Using magnetic resonance imaging (MRI) brain scans and clinical data from 1,728 people with major depression and 7,199 healthy individuals, the study combined 15 datasets from Europe, the U.S., and Australia.

The findings show that people with an early age of onset of major depression (occurring before the age of 21 years) have a smaller hippocampus than healthy individuals, consistent with the notion that many of these young people go on to have recurrent disorders. Of all the study participants with major depression, 65 percent had recurrent episodes.

However, people who had a first episode of major depression (34 percent of study subjects with major depression) did not have a smaller hippocampus than healthy individuals, indicating that the changes are due to the adverse effects of depressive illness on the brain.

“These findings shed new light on brain structures and possible mechanisms responsible for depression,” said Associate Professor Dr. Jim Lagopoulos of the BMRI.

“Despite intensive research aimed at identifying brain structures linked to depression in recent decades, our understanding of what causes depression is still rudimentary. One reason for this has been the lack of sufficiently large studies, variability in the disease and treatments provided, and the complex interactions between clinical characteristics and brain structure.”

“Clearly, there’s a need for longitudinal studies that can track changes in hippocampal volume among people with depression over time, to better clarify whether hippocampal abnormalities result from prolonged duration of chronic stress, or represent a vulnerability factor for depression, or both,” Lagopoulos said.

The findings are published in the journal Molecular Psychiatry.

Source: University of Sydney


Statins May Up Aggression in Women, Lower It in Men

Mon, 07/06/2015 - 7:00am

For more than two decades, the drugs known as statins have been used to manage blood cholesterol levels and reduce the risk of heart disease.

However, while the medications have successfully lowered cholesterol, studies have questioned whether statins cause adverse behavioral changes such as irritability or violence. Until now, the findings on statins and behavior have been inconsistent.

In the first randomized trial to look at statin effects on behavior, researchers at the University of California, San Diego School of Medicine report that aggressive behavior typically declined among men placed on statins (compared to placebo), but typically increased among women placed on statins.

The findings appear in the online issue of PLOS ONE.

“Many studies have linked low cholesterol to increased risk of violent actions and death from violence, defined as death from suicide, accident and homicide,” said lead author Beatrice A. Golomb, M.D., Ph.D., professor of medicine.

“There have been reports of some individuals reproducibly developing irritability or aggression when placed on statins. Yet in contrast to pre-statin lipid-lowering approaches, clinical trials and meta-analyses of statin use (in which most study participants were male) had not shown an overall tendency toward increased violent death. We wanted to better understand whether and how statins might affect aggression.”

In the study, researchers randomly assigned more than 1,000 adult men and postmenopausal women to either a statin (simvastatin or pravastatin) or a placebo for six months. Neither researchers nor trial participants knew who was receiving the drug or the placebo.

Behavioral aggression was measured using a weighted tally of actual aggressive acts against others, self or objects in the prior week.

Other measurements taken included testosterone levels and reported sleep problems. Simvastatin is known to affect both measures, said Golomb, and both testosterone and sleep can affect aggression.

The male and female study cohorts were separately randomized, and analyzed separately since it was shown that statin use affected the genders differently.

Researchers discovered that for postmenopausal women, the typical effect was increased aggression. The effect was significant for postmenopausal women older than age 45. The increase in aggression (compared to placebo) appeared stronger in women who began with lower aggression at baseline.

For men, the picture was more complex. Three male participants who took statins (and none on placebo) displayed very large increases in aggression. When these men were included in the analysis, there was no average effect.

When these “outliers” were removed from the analysis, a decline in aggressive behavior for male statin users was significant. It was stronger among younger men who tend to be more aggressive. “But actually the effect was most evident in less aggressive men,” said Golomb.

Researchers discovered that the way statins affect testosterone and sleep contributed to the bidirectional effects. “Changes in testosterone and in sleep problems on simvastatin each significantly predicted changes in aggression. A larger drop in testosterone on simvastatin was linked, on average, to a greater drop in aggression.

“A greater rise in sleep problems on simvastatin was significantly linked to a greater rise in aggression. The sleep finding also helped account for the outliers: The two men with the biggest aggression increases were both on simvastatin, and both had developed ‘much worse’ sleep problems on the statin.”

Researchers admit that the full set of biological explanations linking statins to behavior remains a work-in-progress. One early hypothesis was that lower levels of cholesterol may reduce brain serotonin. (The connection between low brain serotonin activity and violence has been viewed as one of the most consistent findings in biological psychiatry.)

Whole blood serotonin, which can relate inversely to brain serotonin, was not a predictor in this study. However, testosterone and sleep were, for those on simvastatin.

Golomb postulates that other factors, such as oxidative stress and cell energy, may play a role. She noted that the findings help clarify seeming inconsistencies in the scientific literature.

“The data reprise the finding that statins don’t affect all people equally — effects differ in men versus women and younger versus older. Female sex and older age have predicted less favorable effects of statins on a number of other outcomes as well, including survival.”

Bottom line, said Golomb: “Either men or women can experience increased aggression on statins, but in men the typical effect is reduction.”

Source: University of California, San Diego/EurekAlert

Foodies Seem to Eat Less than Others

Mon, 07/06/2015 - 6:15am

New research finds that adventurous eaters weigh less and may be healthier than their less-adventurous counterparts.

Researchers from the Cornell Food and Brand Lab performed a national survey of 502 women and discovered that those who had eaten the widest variety of uncommon foods rated themselves as healthier eaters, more physically active, and more concerned with the healthfulness of their food when compared with non-adventurous eaters.

These adventurous eaters reported consuming a host of exotic items including seitan, beef tongue, kimchi, rabbit, and polenta.

“They also reported being much more likely to have friends over for dinner,” said lead author Lara Latimer, Ph.D., formerly at the Cornell Food and Brand Lab and now at the University of Texas at Austin.

“These findings are important to dieters because they show that promoting adventurous eating may provide a way for people, especially women, to lose or maintain weight without feeling restricted by a strict diet,” said coauthor Brian Wansink, Ph.D., director of the Cornell lab.

He advised, “Instead of sticking with the same boring salad, start by adding something new. It could kick start a more novel, fun and healthy life of food adventure.”

The article appears in the journal Obesity.

Source: Cornell Food and Brand Lab/EurekAlert

Mass Shootings May Be Contagious

Mon, 07/06/2015 - 5:30am

Emerging research suggests mass killings and school shootings in the U.S. appear to be contagious.

In the study, a team of scientists from Arizona State University (ASU) and Northeastern Illinois University examined databases on past high-profile mass killings and school shootings in the U.S.

Investigators then fit a contagion model to the data to determine whether these tragedies inspired similar events shortly afterward.

Study author Dr. Sherry Towers, research professor from ASU Simon A. Levin Mathematical, Computational and Modeling Sciences Center, explained, “The hallmark of contagion is observing patterns of many events that are bunched in time, rather than occurring randomly in time.”

The research team determined that mass killings — events with four or more deaths — and school shootings create a period of contagion that lasts an average of 13 days. Roughly 20 to 30 percent of such tragedies appear to arise from contagion.

Their paper appears in the journal PLOS ONE.

The analysis was inspired by actual events in Towers’ life.

“In January of 2014 I was due to have a meeting with a group of researchers at Purdue University,” she said. “That morning there was a tragic campus shooting and stabbing incident that left one student dead.

“I realized that there had been three other school shootings in the news in the week prior, and I wondered if it was just a statistical fluke, or if somehow through news media those events were sometimes planting unconscious ideation in vulnerable people for a short time after each event.”

The researchers noted that previous studies have shown that suicide in youths can be contagious, where one suicide in a school appears to spark the idea in other vulnerable youths to do the same.

“It occurred to us that mass killings and school shootings that attract attention in the national news media can potentially do the same thing, but at a larger scale,” Towers said. “While we can never determine which particular shootings were inspired by unconscious ideation, this analysis helps us understand aspects of the complex dynamics that can underlie these events.”

On average, mass killings involving firearms occur approximately every two weeks in the U.S., and school shootings occur on average monthly. The team found that the incidence of these tragedies is significantly higher in states with a high prevalence of firearm ownership.

Source: Arizona State University/EurekAlert

1 in 4 Patients Given Painkillers Go On to Longer-Term Prescriptions

Sun, 07/05/2015 - 8:00am

Painkiller addiction and accidental overdoses have become common in the U.S. In an effort to identify who is at most risk, researchers from the Mayo Clinic studied how many patients prescribed an opioid painkiller for the first time progressed to long-term prescriptions.

Their study found that the answer is one in four.

It also found that people with a history of tobacco use and substance abuse were likeliest to use painkillers long-term.

Discovering who is likeliest to end up using the drugs long-term is important due to the widespread problems associated with their misuse, according to lead author W. Michael Hooten, M.D., an anesthesiologist at Mayo Clinic in Rochester.

“Many people will suggest it’s actually a national epidemic,” he said. “More people now are experiencing fatal overdoses related to opioid use than from heroin and cocaine combined.”

Researchers used the National Institutes of Health-funded Rochester Epidemiology Project to get a random sample of 293 patients who received a new prescription in 2009 for an opioid painkiller such as oxycodone, morphine, hydromorphone, oxymorphone, hydrocodone, fentanyl, meperidine, codeine, or methadone.

They found that 21 percent, or 61 people, progressed from short-term use to prescriptions lasting three to four months, and 6 percent, or 19 people, ended up with more than a four-month supply of the drugs.

The identification of nicotine use and substance abuse as top risk factors for long-term use of opioids suggests that doctors should be particularly careful about prescribing painkillers to patients with such histories, according to Hooten.

What’s behind the connection? The neurobiology related to chronic pain, chronic opioid use, and addiction is similar, he explained. For example, nicotine activates a group of receptors, or brain structures, in a way very similar to how opioids and chronic pain may activate them.

While the study identified past or present nicotine use and substance abuse as top risk factors for long-term use of opioids, all patients should proceed with caution when offered opioid painkiller prescriptions, Hooten advised.

“From a patient perspective, it is important to recognize the potential risks associated with these medications,” he said. “I encourage use of alternative methods to manage pain, including non-opioid analgesics or other nonmedication approaches. That reduces or even eliminates the risk of these medications transitioning to another problem that was never intended.”

He added that long-term opioid use may actually make people more sensitive to pain — a condition called opioid-induced hyperalgesia.

If opioids must be used, as is usually the case with surgery or traumatic injuries, reducing the dose and limiting the duration is important, Hooten said.

“The next step in this research is to drill down and find more detailed information about the potential role of dose and quantity of medication prescribed,” he says. “It is possible that higher dose or greater quantities of the drug with each prescription are important predictors of longer-term use.”

The study was published in the Mayo Clinic Proceedings.

Source: Mayo Clinic

Blue Eyes Linked to Greater Risk for Alcoholism

Sun, 07/05/2015 - 7:15am

People with blue eyes may be at greater risk for becoming alcoholics, according to a novel study by genetic researchers at the University of Vermont.

The study is the first to identify a direct link between a person’s eye color and alcohol dependency. The researchers hope to get closer to finding the roots of not only alcoholism but other psychiatric illnesses as well.

“This suggests an intriguing possibility: that eye color can be useful in the clinic for alcohol dependence diagnosis,” says Arvis Sulovari, a doctoral student in cellular, molecular and biological sciences.

Researchers Sulovari and Dawei Li, Ph.D., assistant professor of microbiology and molecular genetics, discovered that primarily European-Americans with light-colored eyes — including those with green, grey, and brown in the center — had a higher incidence of alcohol dependency than people with dark brown eyes. The strongest tendency for alcoholism was found among blue-eyed individuals.

The study outlines the genetic components that determine eye color and shows that they line up along the same chromosome as the genes related to excessive alcohol use.

But, Li says, “we still don’t know the reason” and more research is needed.

Li has studied psychiatric genetics for a decade. During that time, he has collaborated with other researchers to build a clinical and genetic database of more than 10,000 individuals.

Most of them have been African-Americans and European-Americans, diagnosed with at least one psychiatric illness. Many have multiple diagnoses of diseases, including depression, schizophrenia and bipolar disorder, as well as addiction and alcohol or drug dependence.

“These are complex disorders,” he said. “There are many genes, and there are many environmental triggers.”

From that extensive database, the researchers filtered out the alcohol-dependent patients with European ancestry, a total of 1,263 samples. After Sulovari noticed the eye-color connection, they retested their analysis three times, arranging and rearranging the groups to compare age, gender and different ethnic or geographic backgrounds, such as southern and northern parts of the continent.

Li wants to delve deeper into the relationship between cultural background and genetic makeup, continuing his quest to find the underpinnings of mental illness. His greatest challenge, he says, is that all the genes identified in the past 20 years “can only explain a small percentage of the genetics part that has been suggested. A large number is still missing, is still unknown.”

“What has fascinated me the most about this work has been investigating the interface between statistics, informatics, and biology,” said Sulovari. “It’s an incredible opportunity to study genomics in the context of complex human diseases.”

Their findings are published in the American Journal of Medical Genetics: Neuropsychiatric Genetics.

Source: University of Vermont


Brain Profiles May Suggest Risk for Problem Drinking, Sexual Behavior

Sun, 07/05/2015 - 6:30am

Duke University researchers believe they have discovered two distinct brain profiles that appear to be associated with risky sexual activity and problem drinking among young adults.

Researchers say the scans show an imbalance in functions of typically complementary brain regions. They believe the findings may allow clinicians to one day predict how likely young adults are to develop problem drinking or engage in risky sexual behavior in response to stress.

The new research is part of the ongoing Duke Neurogenetics Study (DNS), which began in 2010 to better understand how interactions between the brain, genome and environment shape risky behaviors that can predict mental illnesses including depression, anxiety, and addiction.

“By knowing the biology that predicts risk, we hope to eventually change the biology or at least meet that biology with other forces to stem the risk,” said the senior author of both studies, Ahmad Hariri, Ph.D., professor of psychology and neuroscience at Duke University.

In both studies, the team used non-invasive functional MRI imaging to measure the activity of two brain areas that help shape opposing behaviors crucial for survival: the reward-seeking ventral striatum and the threat-assessing amygdala.

As part of the project, in 2012 researchers evaluated 200 participants and discovered that having both an overactive ventral striatum and an underactive amygdala was associated with problem drinking in response to stress.

The researchers also discovered that the inverse brain pattern — low ventral striatum and high amygdala activity — predicted problem drinking in response to stress both at the time of the scan and three months after.

These results appear in the journal Molecular Psychiatry.

“We now have these two distinct profiles of risk that, in general, reflect imbalance in the function of typically complementary brain areas,” Hariri said.

“If you have high activity in both areas, no problem. If you have low activity in both areas, no problem. It’s when they’re out of whack that individuals may have problems with drinking.”

Interestingly, people with the two different risk profiles may drink for different reasons.

Hariri speculates that those with high ventral striatum activity may be motivated to drink because they are impulsive; combined with a lower danger signal coming from the amygdala, they may be less inclined to rein in their behavior.

In contrast, the participants with low ventral striatum activity usually have lower mood, and an overactive amygdala may make them more sensitive to stress, so they might drink as a coping mechanism.

Balance in the activity of the ventral striatum and the amygdala also predicts sexual behavior, according to the second study, which appears in the Journal of Neuroscience.

In that study, a team led by graduate student Elizabeth Victor asked a subset of DNS participants (70 heterosexual men and women) how many new sexual partners they acquired over an 11-month period.

For men, the same pattern of brain activity linked to problem drinking, high ventral striatum and low amygdala activity, was associated with a greater number of sexual partners compared to those men with more balanced activity of the two brain areas.

But the pattern for more sexually active women was different: They had higher-than-normal activity in both the ventral striatum and the amygdala, indicating both high reward and high threat.

“It’s not really clear why that is,” Hariri said. “One possibility is that this amygdala signal is representing different things in men and women.”

In women, amygdala activity might be driving general awareness, arousal, and responsiveness which, when combined with strong reward-related activity in the ventral striatum, leads to a greater number of partners. In contrast, in men, the amygdala signal could be more focused on detecting danger, Hariri said.

Measuring brain-based predictors of sexual behavior is largely uncharted territory, Victor said. Although a previous study tied higher ventral striatum activity to more sexual partners, no prior studies have accounted for amygdala activity.

Source: Duke University/EurekAlert

New Research Shows How Brain Reconstructs Past Events

Sat, 07/04/2015 - 9:30am

A new study shows that when remembering something from our past, the entire event can be reactivated in the brain, including incidental information, such as what music may have been playing in the background.

“When we recall a previous life event, we have the ability to re-immerse ourselves in the experience,” said lead author Dr. Aidan Horner of the University College London Institute of Cognitive Neuroscience.

“We remember the room we were in, the music that was playing, the person we were talking to, and what they were saying. When we first experience the event, all these distinct aspects are represented in different regions of the brain, yet we are still able to remember them all later on. It is the hippocampus that is critical to this process, associating all these different aspects so that the entire event can be retrieved.”

The researchers showed that associations formed between the different aspects of an event allow one aspect to retrieve all the other aspects, a process known as pattern completion. For example, when remembering who we saw, we often remember other details, such as what they were holding and where they were. This means that the entire event can be re-experienced in full, the researchers say.

Using fMRI, the researchers discovered that different aspects of an imagined event are reflected in activity in different regions of the brain. When asked about one aspect of an event, activity in the hippocampus correlates with reactivation in these regions, including those incidental to the task, and that this reactivation corresponds to the full event coming to mind.

“This work supports a long-standing computational model of how memory might work, in which the hippocampus enables different types of information to be bound together so that they can be imagined as a coherent event when we want to remember what happened,” added senior author Professor Neil Burgess.

“It provides a fundamental insight into our ability to recollect what has happened, and may help to understand how this process can go wrong in conditions such as Alzheimer’s disease or post-traumatic stress disorder.”

The study’s experiment involved 26 volunteers, who were asked to imagine and memorize a series of events involving different locations, famous people, and objects. They were then asked to remember the details of the event based on a single cue.

For example, one trial event involved President Barack Obama in a kitchen with a hammer. Volunteers were then asked to remember details based on a single cue, such as “where was Obama?”, “who was in the kitchen?” or “what object did Obama have?”.

During the questioning, volunteers underwent fMRI scans to measure their brain activity.

The results showed that different parts of the brain showed increased activity when encoding different aspects of each event, and that the hippocampus provides the critical links between them to form a complete memory.

For example, activity increased in one part of the brain when volunteers thought of Obama, another when they thought of the kitchen, and another when they thought of the hammer.

The study showed that when asked ‘where was Obama?’ activity increased in the regions corresponding to Obama and Kitchen. Critically, activity also increased in the region corresponding to the hammer, despite no requirement to retrieve this item. This reactivation correlated with hippocampal activity, suggesting the hippocampus is involved in retrieving the entire event, the researchers explained.

Source: University College London

Treating Sleep Problems May Improve Work Satisfaction

Sat, 07/04/2015 - 8:45am

A new study has found evidence pointing to a two-way connection between job strain and disturbed sleep, suggesting that interventions to treat sleep problems may also improve work satisfaction.

“The results are important because they show that work demands influence stress negatively, and this link has rarely been investigated in longitudinal studies,” said lead author and principal investigator Torbjörn Akerstedt, a professor in the department of clinical neuroscience at the Karolinska Institute in Stockholm, Sweden.

“Sleep problems are abundant in the industrialized world, and we need to know where mitigation may be most effective.”

The findings show that people with higher work demands exhibited later sleep disturbances at the two-year follow-up. Similarly, those with sleep disturbances later showed a higher perception of stress, higher work demands, a lower degree of control, and less social support at work two years later. However, no link was found between disturbed sleep and physical work environment, shift work schedules, or working hours.

The research team, led by Akerstedt and Johanna Garefelt, analyzed data from the 2008 and 2010 waves of the Swedish Longitudinal Occupational Survey of Health.

The study group included 4,827 participants with a mean age of 48 years, including 2,655 females and 2,171 males. Information regarding sex, age, and socioeconomic position was obtained from national register data.

The researchers used the Karolinska Sleep Questionnaire (KSQ) to identify disturbed sleep, which was defined as having difficulties falling asleep, restless sleep, repeated awakenings, or premature awakening. Work demands, control at work, and social support at work were measured using the Swedish version of the Demand-Control-Support Questionnaire.

The researchers believe that their findings align with prior studies showing that disturbed sleep increases stress response and emotional reactivity. The results imply that promoting better sleep may improve working life by reducing perceived job stress and minimizing negative attitudes toward work.

“The effect of sleep problems on stress emphasizes the importance of good sleep for functioning in everyday life,” said Akerstedt.

According to the American Academy of Sleep Medicine, about 30 percent of adults have symptoms of insomnia, and about 10 percent have severe insomnia that leads to problems in the daytime. This may include fatigue, moodiness, anxiety, memory difficulties, headaches, or upset stomach.

The study results are published in the July issue of the journal Sleep.

Source: American Academy of Sleep Medicine

 

The Effect of Hormones on Financial Markets

Sat, 07/04/2015 - 8:00am

New research shows that the hormones testosterone and cortisol may destabilize financial markets by making traders take more risks.

For their study, researchers simulated the trading floor in the lab by having volunteers buy and sell assets among themselves. They measured the volunteers’ natural hormone levels in one experiment and artificially raised them in another.

When given doses of either hormone, the volunteers invested more in risky assets, according to the study’s findings.

According to the researchers, the stressful and competitive environment of financial markets may promote high levels of cortisol and testosterone in traders.

Cortisol is elevated in response to physical or psychological stress, increasing blood sugar and preparing the body for a fight-or-flight response.

Previous studies have shown that men with higher testosterone levels are more likely to be confident and successful in competitive situations.

The researchers of the new study, published in Scientific Reports, suggest their findings should be considered by policymakers looking to develop more stable financial institutions.

“Our view is that hormonal changes can help us understand traders’ behavior, particularly during periods of financial instability,” said Dr. Carlos Cueva from the Department of Economics at the University of Alicante and one of the lead authors of the study.

“Our aim is to understand more about what these hormones do,” added Dr. Ed Roberts of the Department of Medicine at Imperial College London and another of the lead authors of the study.

“Then we can look at the environment in which traders work, and think about whether it’s too stressful or too competitive. These factors could be affecting traders’ hormones and having an impact on their decision-making.”

For their study, the researchers first measured the levels of the two hormones in saliva samples of 142 volunteers, male and female, playing an asset trading game in a group of about 10 people. They found that the volunteers who had higher levels of cortisol were more likely to take risks, and high levels in the group were associated with instability in prices.

In a follow-up experiment, 75 young men were given either cortisol or testosterone before playing the game, once with the hormone and once on a placebo. The study found that both hormones shifted investment towards riskier assets.

Cortisol appeared to directly affect volunteers’ preference for riskier assets, while testosterone seemed to increase optimism about how prices would change in the future, the researchers explained.

“The results suggest that cortisol and testosterone promote risky investment behavior in the short run,” said Roberts. “We only looked at the acute effects of the hormones in the lab. It would be interesting to measure traders’ hormone levels in the real world, and also to see what the longer term effects might be.”

Source: Imperial College London 

Alcohol Sensitizes Brain to Food Aroma

Sat, 07/04/2015 - 7:15am

Alcohol exposure appears to sensitize the brain’s response to food aromas, thereby increasing one’s consumption of food, according to a new study that measured the brain’s role in regulating caloric intake following alcohol consumption among women.

The findings are published in the journal Obesity published by the Obesity Society.

The research adds to the current body of knowledge that alcohol increases food intake, also known as the “aperitif effect,” but shows this increased intake does not rely entirely on the oral ingestion of alcohol and its absorption through the gut.

“The brain, absent contributions from the gut, can play a vital role in regulating food intake. Our study found that alcohol exposure can both increase the brain’s sensitivity to external food cues, like aromas, and result in greater food consumption,” said William J. A. Eiler II, Ph.D., of the Indiana University School of Medicine’s Departments of Medicine and Neurology.

“Many alcoholic beverages already include empty calories, and when you combine those calories with the aperitif effect, it can lead to energy imbalance and possibly weight gain.”

The study involved 35 female participants who were non-vegetarian, non-smoking, and at a healthy weight. To test the direct effects of alcohol on the brain, the researchers bypassed the digestive system by administering alcohol intravenously to each participant at one study visit and a placebo (saline) at another, prior to eating.

Participants were observed and brain responses to food and non-food aromas were measured using blood oxygenation level dependent (BOLD) response via fMRI scans. After imaging, participants were offered a lunch choice between pasta with Italian meat sauce and beef and noodles.

When participants were given intravenous alcohol, they ate more food at lunch, on average, compared to when they were given the placebo. There were individual differences, however, with one-third of participants eating less after alcohol exposure when compared to the placebo exposure.

Also, the area of the brain responsible for certain metabolic processes, the hypothalamus, responded more to food odors, compared to non-food odors, after alcohol infusion vs. saline.

The findings suggest that the hypothalamus may therefore play a role in mediating the impact of alcohol exposure on our sensitivity to food cues, contributing to the aperitif phenomenon.

“This research helps us to further understand the neural pathways involved in the relationship between food consumption and alcohol,” said Martin Binks, Ph.D., FTOS, TOS Secretary Treasurer and Associate Professor of Nutrition Sciences at Texas Tech University.

“Often, the relationship between alcohol on eating is oversimplified; this study unveils a potentially more complex process in need of further study.”

“Today, nearly two-thirds of adults in the U.S. consume alcohol, with wine consumption rising, which reinforces the need to better understand how alcohol can contribute to overeating,” said Binks.

Source: Obesity Society

 

Computer Game Can Help Reduce Unwanted Memories

Sat, 07/04/2015 - 6:30am

Intrusive or unwanted memories are often difficult to erase. Disturbing memories are a core feature of stress- and trauma-related clinical disorders such as post-traumatic stress disorder (PTSD), but unwanted recollections can also follow upsetting events in everyday life.

A new study, published in the journal Psychological Science, finds that playing a visually demanding computer game may reduce the occurrence of these intrusive memories over time.

“This work is the first to our knowledge to show that a ‘simple cognitive blockade’ could reduce intrusive memories of experimental trauma via memory reconsolidation processes,” said senior study author Emily Holmes of the Medical Research Council Cognition and Brain Sciences Unit in the UK.

“This is particularly interesting because intrusive memories are the hallmark symptom of PTSD.”

“Currently, there are recommended treatments for PTSD once it has become established, that is, at least one month after the traumatic event, but we lack preventative treatments that can be given earlier,” says Holmes.

“If this experimental work continues to show promise, it could inform new clinical interventions for consolidated memories that could be given a day or so after trauma to prevent or lessen the intrusive memories over time.”

Most people who have experienced a traumatic event don’t end up developing PTSD, but they often experience repeated intrusive visual memories of certain moments from the event in vivid detail. Someone who has been involved in a road traffic accident, for example, might continue to re-experience the moment of impact, seeing vividly in their mind’s eye the moment a red car crashed into them.

Previous research has shown that people who played the computer game Tetris within four hours of viewing film of traumatic events experienced fewer intrusive memories over the following week. However, the intervention is not practical, as few people would be able to receive such immediate treatment following a traumatic event in the real world.

Therefore, Holmes and colleagues wanted to see whether they might be able to use a similar cognitive procedure to change older, already established memories a day later.

The investigators developed an approach that used emerging research on memory. They built upon the theory of reconsolidation as a way of making established memories malleable and vulnerable to disruption, following the reactivation of that memory.

They hypothesized that playing Tetris — an engaging visuospatial task — after memory reactivation would create a “cognitive blockade” that would interfere with the subsequent reconsolidation of visual intrusive memories. As a result, the frequency of intrusive memories would be reduced over time.

In two experiments, the researchers had participants view films that contained scenes of traumatic content (for example, footage highlighting the dangers of drunk driving) as a way of experimentally inducing intrusive memories.

Participants then returned to the lab 24 hours after watching the film. Using film footage as a form of experimental trauma is a well-established technique for studying reactions, such as intrusive memories, in a controlled setting.

In the first experiment, half of the participants had their memories of the film reactivated by viewing selected stills from the film footage, followed by a 10-minute filler task, and then 12 minutes of playing Tetris; the other participants completed only the filler task and then sat quietly for 12 minutes.

The results showed that the participants who had their memories reactivated and played Tetris experienced significantly fewer intrusive memories in a diary over the next week than the participants who came to the lab and simply sat quietly for the equivalent period of time.

A second experiment with four groups replicated the findings from the first experiment. Importantly, it revealed that neither reactivation nor Tetris was enough to produce these effects on their own; only participants who experienced both components showed fewer intrusive memories over time.

“Our findings suggest that, although people may wish to forget traumatic memories, they may benefit from bringing them back to mind, at least under certain conditions — those which render them less intrusive,” said study co-author Ella James of the University of Oxford.

“We hope to develop this approach further as a potential intervention to reduce intrusive memories experienced after real trauma, but we are keen to emphasize that the research is still in its early stages and careful development is needed,” says Holmes.

“Better treatments are much needed in mental health. We believe the time is ripe to use basic science about mechanisms — such as research on memory reconsolidation — to inform the development of improved and innovative psychological treatment techniques.”

Source: Association for Psychological Science

Simple Changes to Classroom Procedures Help ADHD Kids

Fri, 07/03/2015 - 8:30am

New UK research has found that modifications to classroom procedures can improve academic outcomes for children with attention deficit hyperactivity disorder (ADHD), potentially reducing the need for medication.

Medications are often used for children with ADHD because they are typically restless, act without thinking, and struggle to concentrate, behaviors that cause particular problems for them and for others at school.

In a systematic review led by the University of Exeter Medical School, experts concluded that non-drug interventions in schools may be effective in improving academic outcomes, as measured by performance on standardized tests, for children with ADHD.

The team found 54 studies (39 randomized controlled trials and 15 non-randomized studies) that tested many different ways of supporting these children.

Researchers found that several strategies can help support a child with ADHD. For one, daily report cards, completed by teachers and parents, provide the child with consistent and regular feedback. Another is study and organizational skills training, which can help children achieve better attainment levels, reduce hyperactive behavior, and increase attention.

Remarkably, the research, published in the journal Health Technology Assessment, found so many different types of strategies, and so many different combinations of approaches, that it was impossible to clearly identify what works best.

As a result, the researchers have called for more standardized assessment to make future research outcomes more meaningful.

The systematic review, which involved collaborators at Kings College London and the Hong Kong Institute of Education, looked at all available and relevant research published between 1980 and 2013.

They examined the following different areas that are important to supporting children with ADHD in schools:

  1. The effectiveness and cost-effectiveness of school-based interventions for children with or at risk of ADHD;
  2. The attitudes and experiences of children, teachers, parents and others using ADHD interventions in school settings;
  3. The experience or culture of dealing with ADHD in school among pupils, their parents and teachers.

The review found no studies of cost-effectiveness, an area that needs to be addressed in future research. It did find studies of attitudes and experiences suggesting that differences in beliefs about ADHD can create tensions between teachers, pupils, and parents that may be significant barriers to effective treatment.

In conclusion, the review suggests that educating school staff and the public about ADHD would help break down preconceptions and stigma, and that classroom and school culture, as well as individualized support, may make the support offered to children with ADHD more or less effective.

Professor Tamsin Ford of the University of Exeter Medical School led the study.

She said: “There is strong evidence for the effectiveness of drugs for children with ADHD, but not all children can tolerate them or want to take them. ADHD can be disruptive to affected children as well as the classroom overall, but our study shows that effective psychological and behavioral management may make a significant improvement to children’s ability to cope with school.

“While this is encouraging, it’s not possible to give definitive guidance on what works because of variations between the strategies tested, and the design and analysis of the studies that we found. We now need more rigorous evaluation, with a focus on what works, for whom and in which contexts.

“Gaps in current research present opportunities to develop and test standardized interventions and research tools, and to agree on gold-standard outcome measures to provide answers to both schools and families.”

Source: University of Exeter

Children With Autism Don’t Adjust Sniffing Time for Bad Smells

Fri, 07/03/2015 - 7:45am

When most people come across a pleasant scent, such as a nice perfume or freshly baked cookies, they typically take a good long sniff. While walking next to a dumpster, however, a person would most likely shorten their incoming breaths, minimizing intake of the unpleasant odor.

Now, researchers have discovered that people with autism spectrum disorder (ASD) don’t make this natural adjustment like other people do. In fact, children with autism continue right on sniffing in the same way, no matter how pleasant or awful the scent.

The findings suggest that tests related to smell might serve as useful early indicators of ASD, say the researchers.

“The difference in sniffing pattern between the typically developing children and children with autism was simply overwhelming,” says Noam Sobel of the Weizmann Institute of Science in Israel.

Earlier studies have indicated that people with autism have impairments in “internal action models,” the brain templates we depend on to seamlessly coordinate our five senses with our actions. It wasn’t clear if this deficit would show up in a test of the sniff response, however.

To find out, Sobel, along with Liron Rozenkrantz and their colleagues, presented 18 children with ASD and 18 typically developing children (17 boys and one girl in each group) with pleasant and unpleasant odors and measured their sniff responses. The average age of the participants was seven years old.

While typical children adjusted their sniffing within 305 milliseconds of smelling an odor, the researchers report, children on the autism spectrum showed no such response.

That difference in sniff response time between the two groups was enough to correctly classify children as having or not having an ASD diagnosis 81 percent of the time. Furthermore, the researchers report that more abnormal sniffing was linked to more severe autism symptoms, based on social but not motor impairments.

The study results suggest that a sniff test could be quite useful in the clinic, although the researchers emphasize that their test is in no way ready for that yet.

“We can identify autism and its severity with meaningful accuracy within less than 10 minutes using a test that is completely non-verbal and entails no task to follow,” Sobel says.

“This raises the hope that these findings could form the base for development of a diagnostic tool that can be applied very early on, such as in toddlers only a few months old. Such early diagnosis would allow for more effective intervention.”

The researchers plan to test whether the sniff-response pattern they’ve observed is specific to autism or if it also shows up in people with other neurodevelopmental disorders. They also want to investigate how early in life such a test could be used. But the most immediate question for Sobel is “whether an olfactory impairment is at the heart of the social impairment in autism.”

The findings are published in the Cell Press journal Current Biology.

Source: Current Biology

 

Extracurricular Sports Help Kids Develop Discipline in the Classroom

Fri, 07/03/2015 - 7:00am

A new Canadian study suggests regular, structured extramural sports help kids develop the discipline they need in order to engage effectively in the classroom.

Researchers from the University of Montreal and its affiliated CHU Sainte-Justine children’s hospital led the study.

“We worked with information provided by parents and teachers to compare kindergarteners’ activities with their classroom engagement as they grew up,” said Linda Pagani, Ph.D.

“By the time they reached the fourth grade, kids who played structured sports were identifiably better at following instructions and remaining focused in the classroom. There is something specific to the sporting environment, perhaps the unique sense of belonging to a team, to a special group with a common goal, that appears to help kids understand the importance of respecting the rules and honoring responsibilities.”

Professor Pagani and her colleagues Geneviève Piché and Caroline Fitzpatrick came to their conclusions after reviewing the data on 2,694 children who were born in Quebec between 1997 and 1998. The information was retrieved from the Quebec Longitudinal Study on Child Development, a public data set coordinated by the province’s statistical institute.

“Our goal was to answer two questions: firstly, does participation in extracurricular activities in kindergarten predict fourth grade self-discipline, and secondly, do kindergarten self-discipline characteristics predict fourth-grade participation in sports?” Pagani explained.

Researchers say the predictive characteristics encompass things such as classroom engagement, physical aggression, impulsivity, and emotional distress.

At kindergarten, when most children in the study were six, teachers filled in questionnaires about their students’ behavior and parents were interviewed by phone or in person about their home life.

The exercise was repeated four years later. The researchers then analyzed the data while eliminating pre-existing influences, such as the child’s physical fitness and cognitive abilities, the mother’s education, and how well the family unit functioned (asking families to rate, for example, how well they communicate), which could have influenced the results.

“Children who were involved in sports at kindergarten, or in fact who were involved in any kind of structured activity, were likely to be involved in team sports by age ten. However, involvement in unstructured activities at kindergarten had no bearing on the child’s future.

“Across the board, we found that children who had better behavior in the kindergarten class were more likely to be involved in sport by age ten,” Pagani said.

“Nonetheless, we found that those children who were specifically involved in team sports at kindergarten scored higher in self-regulation by the time they reached fourth grade.”

The researchers believe that sporting activities and attention skills are closely associated and can be addressed simultaneously in school planning. Their findings could help schools and public health authorities better reach children at risk of insufficient exercise as a way of addressing both the obesity and school drop-out crises at the same time.

“Programs to help parents develop their child’s self-regulation skills and the availability of extracurricular sports programs as early as kindergarten could help decrease the risk of kids being left behind,” Pagani said.

“We also hope policy makers consider our findings in order to improve access to parks and playgrounds, where children and their families can engage in sporting activities, to improve access to K-12 enrichment programs that target self-regulation skills, and to improve the promotion of active schools and communities generally speaking.”

Source: University of Montreal/EurekAlert

Docs Prescribe More Antipsychotics for Boys

Fri, 07/03/2015 - 6:15am

A new NIH study looks, for the first time, at antipsychotic prescription patterns in America.

Researchers discovered boys are more likely than girls to receive a prescription for antipsychotic medication regardless of age. Approximately 1.5 percent of boys ages 10-18 received an antipsychotic prescription in 2010, although the percentage falls by nearly half after age 19.

Antipsychotics were prescribed most often for attention deficit hyperactivity disorder (ADHD) among youth ages one to 18. Depression was the most common diagnosis for which young adults ages 19-24 received antipsychotics.

Despite concerns over the rising use of antipsychotic drugs to treat young people, little has been known about trends and usage patterns in the United States before this latest research.

Mark Olfson, M.D., M.P.H., and colleagues Marissa King, Ph.D., and Michael Schoenbaum, Ph.D., report their findings in JAMA Psychiatry.

“No prior study has had the data to look at age patterns in antipsychotic use among children the way we do here,” said co-author Michael Schoenbaum, Ph.D., senior advisor for mental health services, epidemiology and economics at NIMH.

“What’s especially important is the finding that around 1.5 percent of boys aged 10-18 are on antipsychotics, and then this rate abruptly falls by half, as adolescents become young adults.”

“Antipsychotics should be prescribed with care,” says Schoenbaum. “They can adversely affect both physical and neurological function and some of their adverse effects can persist even after the medication is stopped.”

The U.S. Food and Drug Administration (FDA) has approved antipsychotics for children with certain disorders, particularly bipolar disorder, psychosis/schizophrenia, and autism.

However, the research team found that the medication use patterns do not match the illness patterns. The mismatch means that many antipsychotic prescriptions for young people may be for off-label purposes, that is, for uses not approved by FDA.

For example, maladaptive aggression is common in ADHD, and clinical trial data suggest that at least one antipsychotic, risperidone, when used with stimulants, can help reduce aggression in ADHD.

To date, FDA has not approved the use of any antipsychotic for ADHD, making its use for this diagnosis off-label.

In the current study, the combination of peak use of antipsychotics in adolescent boys and the diagnoses associated with prescriptions (often ADHD) suggest that these medications are being used to treat developmentally limited impulsivity and aggression rather than psychosis.

Mark Olfson and colleagues worked with the IMS LifeLink LRx database, which includes 63 percent of outpatient prescriptions filled in the U.S. The team looked at prescription data for 2006-2010 and found antipsychotic use increased with age in both boys and girls.

They found antipsychotic use beginning at 0.11 percent in 2010 for ages one to six years, increasing to 0.80 percent for ages seven to 12 years and increasing again to 1.19 percent for youth ages 13-18 years before dropping substantially to 0.84 percent for ages 19-24.

In children ages one to six, boys were more than twice as likely as girls to receive an antipsychotic prescription (0.16 vs. 0.06 percent in 2010). This pattern held true for boys and girls ages seven to 12 (1.20 vs. 0.44 percent in 2010) before narrowing for the 13-18 age group (1.42 vs. 0.95 percent) and finally becoming more comparable for young men and women ages 19 to 24 (0.88 vs. 0.81 percent in 2010).
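The narrowing sex gap can be tallied directly from the 2010 prevalence figures reported in the study. A minimal sketch (the dictionary layout and labels are illustrative; only the percentages come from the article):

```python
# Antipsychotic prevalence in 2010, percent of each group, as (boys, girls),
# from the IMS LifeLink LRx figures quoted above.
prevalence = {
    "1-6":   (0.16, 0.06),
    "7-12":  (1.20, 0.44),
    "13-18": (1.42, 0.95),
    "19-24": (0.88, 0.81),
}

for ages, (boys, girls) in prevalence.items():
    # Ratio above 1.0 means boys were prescribed antipsychotics more often.
    print(f"ages {ages}: boys/girls ratio = {boys / girls:.1f}")
# The ratio falls from roughly 2.7 in childhood to about 1.1 by ages 19-24.
```

The shrinking ratio across age groups is what the article describes as the gap "narrowing" and then "becoming more comparable."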

Among young people treated with antipsychotics in 2010, the youngest children, ages one to six, were the least likely to receive the prescription from a psychiatrist (57.9 vs. 71.9, 77.9, and 70.4 percent for the other three age groups). This is a source of concern, as practice guidelines caution practitioners on the use of antipsychotic medications for young children in particular.

Among young people receiving antipsychotic prescriptions, fewer than half had any medical visit that included a mental disorder diagnosis. That may be in part due to stigma about mental illness, or because primary care providers are concerned about reimbursement for treatment related to such diagnoses.

“In addition to having a new look at antipsychotic use among youth, one positive finding coming from this study is that around 75 percent of these kids have at least some contact with a psychiatrist,” said NIMH Director Thomas Insel, M.D.

Source: NIMH/NIH

Poor Sleep Negatively Influences Self-Control

Fri, 07/03/2015 - 5:30am

A new study suggests poor sleep habits can have an undesirable effect on self-control. Lapses in self-control, in turn, can pose risks to an individual’s personal and professional life.

Learning how sleep deficits can affect a person’s life is important as today’s 24-hour-a-day global economy often causes people to sleep less or at irregular times. This in turn results in poor sleep and chronic sleep loss.

In the study, titled “Interactions between Sleep Habits and Self-Control,” Clemson psychologists concluded a sleep-deprived individual is at increased risk for succumbing to impulsive desires, inattentiveness, and questionable decision-making.

“Self-control is part of daily decision-making. When presented with conflicting desires and opportunities, self-control allows one to maintain control,” said June Pilcher, Clemson Alumni Distinguished Professor of psychology, one of four authors of the study.

“Our study explored how sleep habits and self-control are interwoven and how sleep habits and self-control may work together to affect a person’s daily functioning.”

“Exercising self-control allows one to make better choices when presented with conflicting desires and opportunities. That has far-reaching implications to a person’s career and personal life,” Pilcher said.

Poor sleep habits, which include inconsistent sleep times and not enough hours of sleep, can also lead to health problems, including weight gain, hypertension, and illness, according to prior research.

“Studies have also found that sleep deprivation decreases self-control but increases hostility in people, which can create problems in the workplace and at home,” Pilcher said.

Moreover, better sleep habits can contribute to a more stable level of daily energy reserves. A ready supply of energy can replenish a person’s ability to make the more difficult choice rather than opt for the easier choice or the easier task.

“Many aspects of our daily lives can be affected by better-managed sleep and self-control capacity,” Pilcher said.

“Improved health and worker performance are two potential benefits, but societal issues such as addictions, excessive gambling, and overspending could also be more controllable when sleep deficiencies aren’t interfering with one’s decision-making.”

Source: Clemson University/EurekAlert