NHS Choices

Common bacteria could help prevent food allergies

NHS Choices - Behind the Headlines - Tue, 26/08/2014 - 13:00

"Bacteria which naturally live inside our digestive system can help prevent allergies and may become a source of treatment," BBC News reports after new research found evidence that Clostridia bacteria help prevent peanut allergies in mice.

The study in question showed that mice lacking normal gut bacteria had increased allergic responses when they were given peanut extracts.

The researchers then tested the effects of recolonising the mice's guts with specific groups of bacteria. They found that giving Clostridia bacteria (a group of bacteria that includes the "superbug" Clostridium difficile) reduced the allergic response.

The researchers hope the findings could one day support the development of new approaches to prevent or treat food allergies using probiotic treatments.

These are promising findings, but the research is at a very early stage. Only mice have so far been studied, with a specific focus on peanut allergy and Clostridia bacteria. Further developments from this animal research are awaited.

 

Where did the story come from?

This study was conducted by researchers from the University of Chicago, Northwestern University, the California Institute of Technology and Argonne National Laboratory in the US, and the University of Bern in Switzerland.

Funding was provided by Food Allergy Research and Education (FARE), US National Institutes of Health Grants, the University of Chicago Digestive Diseases Research Core Center, and a donation from the Bunning family.

It was published in the peer-reviewed journal PNAS.

BBC News gave a balanced account of this research.

 

What kind of research was this?

This was an animal study that aimed to see how alterations in gut bacteria are associated with food allergies.

As the researchers say, life-threatening anaphylactic reactions to food allergens (any substance that generates an allergic response) are an important concern, and the prevalence of food allergies appears to have been rising over a short space of time.

This has caused speculation about whether alterations in our environment could be driving allergic sensitivity to foods. One such theory is the "hygiene hypothesis" (discussed below).

This is the theory that reducing our exposure to infectious microbes during our early years – through overzealous sanitisation, for example – deprives people's immune systems of the "stimulation" of exposure, which could then lead to allergic disease. 

An extension of this theory is that environmental factors – including sanitation, but also increased use of antibiotics and vaccination – have altered the composition of natural gut bacteria, which play a role in regulating our sensitivity to allergens. It has been suggested that infants who have altered natural gut bacteria could be more sensitive to allergens.

This mouse study aimed to examine the role of gut bacteria in sensitivity to food allergens, with a focus on peanut allergy.

 

What did the research involve?

The researchers investigated the role gut bacteria plays in sensitivity to food allergens in different groups of mice. The research team studied mice born and raised in a completely sterile, bacteria-free environment, so they were germ-free.

Another group of mice were treated with a mixture of strong antibiotics from two weeks of age to severely reduce the variety and number of bacteria in their gut.

These groups of mice were then given purified extracts of roasted unsalted peanuts to assess their allergic response.

After looking at the allergic reactions in the germ-free and antibiotic-treated mice, specific groups of bacteria were reintroduced into their gut to see what, if any, effect it had on their allergic response.

The researchers focused on reintroducing Bacteroides and Clostridia groups of bacteria, which are normally present in mice in the wild.

 

What were the basic results?

Faecal samples taken from the antibiotic-treated mice were found to have a significantly reduced number and variety of gut bacteria. These mice also had increased sensitivity to peanut allergens, demonstrating an increased immune system response that produced antibodies specific to these allergens, as well as showing symptoms of allergy.

When the germ-free mice were exposed to peanut allergens, they demonstrated a greater immune response than normal mice and also demonstrated features of an anaphylactic reaction.

The researchers found that adding Bacteroides to the gut of the germ-free mice had no effect on the allergic reaction. However, adding Clostridia bacteria reduced sensitivity to the peanut allergen, making their allergic response similar to normal mice.

This suggests that Clostridia plays a role in protecting against sensitisation to food allergens.

This was further confirmed when Clostridia was used to recolonise the guts of the antibiotic-treated mice and was found to reduce their allergic response.

The researchers then carried out further laboratory experiments looking at the process by which Clostridia could be offering protection. They found the bacteria increase the immune defences of the cells lining the gut.

One specific effect seen was how Clostridia increased the activity of a particular antibody, which reduced the amount of peanut allergen entering the bloodstream by making the gut lining less permeable (so substances are less likely to pass through it).

 

How did the researchers interpret the results?

The researchers concluded that they have identified a "bacterial community" that protects against sensitisation to allergens and have demonstrated the mechanisms by which these bacteria regulate the permeability of the gut lining to food allergens.

They suggest their findings support the development of new approaches for the prevention and treatment of food allergy by using probiotic therapies to modulate the composition of the gut bacteria, and so help induce tolerance to dietary allergens.

 

Conclusion

This research examined how normal populations of gut bacteria influence mouse susceptibility to peanut allergens. The findings suggest the Clostridia group of bacteria may have a particular role in altering the immune defences of the gut lining and preventing some of the food allergen from entering the bloodstream.

The findings inform the theory that our increasingly sterile environments and increased use of antibiotics could lead to a reduction in our normal gut bacteria, which could possibly lead to people developing a sensitivity to allergens.

But these findings are at a very early stage. So far, only mice have been studied, and only their reactions to peanuts. We don't know whether similar results would be seen with tree nuts (peanuts are technically legumes) or other foods that can cause an allergic response.

Also, although this research provides a theory, we do not know whether this theory is correct. We don't know, for example, whether people with a peanut allergy do (or did) have reduced levels of certain gut bacteria populations and whether this contributed to the development of their allergy. We also do not know whether treatments that reintroduce these bacteria could help reduce the allergy.

As the researchers say, the study does open an avenue of further study into the possible development of probiotic treatments, but there is a long way to go. 

Professor Colin Hill, a microbiologist at University College Cork, was quoted by the BBC as saying: "It is a very exciting paper and puts this theory on a much sounder scientific basis."

But he does offer due caution, saying: "We have to be careful not to extrapolate too far from a single study, and we also have to bear in mind that germ-free mice are a long way from humans."

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Gut bugs 'help prevent allergies'. BBC News, August 26 2014

Probiotics may help prevent peanut allergies, animal study shows. Fox News, August 26 2014

Categories: NHS Choices

Breakfast 'not the most important meal of the day'

NHS Choices - Behind the Headlines - Tue, 26/08/2014 - 01:00

“Breakfast might not be the most important meal of the day after all,” the Mail Online reports.

The concept that breakfast is the most important meal of the day is up there in the pantheon of received wisdom with “never swim after eating” or “getting wet will give you a cold”. But is there any hard evidence to back the claim?

A new study in 38 people found that six weeks of regularly eating breakfast had no significant effect on metabolism or eating patterns for the rest of the day compared to total fasting before midday.

It also found no difference between the groups at the end of the study in body mass, fat mass, or indicators of cardiovascular health (such as cholesterol or inflammatory markers).

There are various important limitations to this trial though, including the small sample size and short follow-up time. For example, people who fasted had much more variable blood sugar levels in the afternoon and evening, and we do not know what the longer-term effects of this could be.

Overall, based on this study alone, we would not recommend completely starving your body of all nutrition before 12pm each day, not least because skipping food in the morning may leave you feeling less happy and energetic.

 

Where did the story come from?

The study was carried out by researchers from the University of Bath and published in the peer-reviewed American Journal of Clinical Nutrition. The study has been published on an open-access basis, so is available for free online. The work was funded by a grant from the Biotechnology and Biological Sciences Research Council. The authors declare no conflicts of interest.

In concluding that breakfast is not the most important meal of the day, the Mail does not consider the various limitations of this very small study.

 

What kind of research was this?

This was a randomised controlled trial looking at how breakfast habits were associated with energy balance in the rest of the day in people living their normal daily life.

As the researchers say, there is a popular belief that “breakfast is the most important meal of the day”. But this assumption is grounded only in cross-sectional studies observing that eating breakfast is associated with a reduced risk of weight gain and certain chronic diseases (such as diabetes and cardiovascular disease). This does not prove cause and effect. The researchers also note that such observational studies do not take into account the fact that people who eat breakfast also tend to be more physically active, eat less fat, and be non-smokers and moderate drinkers, opening up the possibility of confounding factors.

So it could be the case that rather than regularly eating breakfast making you healthy, healthy people are more likely to eat breakfast.

The researchers say that though breakfast is said to influence metabolism, studies have lacked measurement tools capable of accurately measuring this during normal daily activities. This study aimed to get a better indication of this by measuring all aspects of energy balance, including the heat generated during physical activity, and in-depth laboratory tests (including blood tests and DEXA scan of bone mineral density).

Ultimately, they wanted to find out whether eating breakfast was a cause of good health or whether it was simply a sign of an already healthy lifestyle.

 

What did the research involve?

The research was given the title the “Bath Breakfast Project”. It included a six-week trial where 38 people (of 137 invited to participate) were randomised to eat a daily breakfast (18) or to extended morning fasting (20).

Adults between the ages of 21 and 60 were eligible for the trial if they were either normal weight (a body mass index of 20 to 25kg/m2) or overweight (25 to 30kg/m2). Each of the two randomised groups was intended to include an even balance of normal weight and overweight participants, and of people who frequently and infrequently ate breakfast. This was done to allow a stratified (representative) analysis based on these two factors.

Before the trial, participants came to the laboratory to have baseline measurements taken. This included blood tests to look at hormones, metabolites and blood fats, assessments of metabolic rate, and body mass and fat mass measurements. A small tissue sample was also taken to look at key genes related to appetite and physical activity. 

The breakfast group were told to eat 3,000kJ (around 720 calories – or around two bacon sandwiches) of energy prior to 11am, with half of this provided within two hours of waking. The breakfasts were self-selected by the participants, though they were said to be provided with detailed examples of the foods that would give the appropriate energy intake. The extended morning fasting group could drink only water before 12pm each day.

During the first and last weeks of the six-week trial, participants kept detailed records of their food and fluid intakes for later analysis of daily energy and macronutrient intake. During these two weeks, they were also fitted with a combined heart rate/accelerometer to accurately record energy expenditure/physical activity habits for the entire duration of each of these seven-day periods. A glucose monitor was also fitted under the skin.

They were told when these devices were fitted: “Your lifestyle choices during this free-living monitoring period are central to this study. We are interested in any natural changes in your diet and/or physical activity habits, which you may or may not make in response to the intervention. This monitoring period has been carefully scheduled to avoid any pre-planned changes in these habits, such as a holiday or diet/exercise plan. You should inform us immediately if unforeseen factors external to the study may influence your lifestyle.”

After the six weeks of the trial, the participants returned to the laboratory for repeat body measurements.

 

What were the basic results?

The study reports data for the 33 people who completed the trial, 16 in the breakfast group and 17 in the fasting group. These people were of average age 36, and 79% of them regularly ate breakfast. These were described to be a “lean” group of people – 21 women with a DEXA fat mass index of 11 kg/m2 or less, and 12 men with a fat mass index of 7.5 kg/m2 or less (DEXA fat mass index is assessed using X-rays to give a very precise measurement of body fat).

The researchers found that, compared to the fasting group, those in the breakfast group generated significantly more heat energy during physical activity before 12pm, and also engaged in more physical activity, in particular more “light” physical activity. Resting metabolic rate was stable between the groups, and there was no subsequent suppression of appetite in the breakfast group (energy intake remained 539 kcal/d greater than the fasting group throughout the day).

There was no difference in waking or sleeping times, and at the end of the study there were no differences between groups in body mass or fat mass, body hormones, cholesterol or inflammatory markers. There was no difference between groups in fasting blood sugar or insulin at six weeks, but during continuous sugar monitoring in the last week of the trial the fasting group demonstrated more variability in their afternoon and evening sugar measures.

 

How did the researchers interpret the results?

The researchers conclude that “Daily breakfast is causally linked to higher physical activity thermogenesis [heat generation] in lean adults, with greater overall dietary energy intake, but no change in resting metabolism. Cardiovascular health indexes were unaffected by either of the treatments, but breakfast maintained more stable afternoon and evening glycemia [glucose control] than did fasting”.

 

Conclusion

This trial aimed to measure the direct effect that eating breakfast or fasting before 12pm has on energy balance and indicators of cardiovascular health in people living their normal daily lives. The trial was a carefully designed study that took extensive body measurements to try to measure the direct effects of breakfast or fasting upon the body. However, there are limitations to bear in mind:

  • This was a small sample size. The target sample size for the trial to have the statistical power to reliably detect differences between the groups was 70 people, with the aim that more than 60 people (30 in each group) would be included in the final analyses. The study actually only had half the intended number of participants – recruiting 38, with just 33 completing the trial. This means we can’t be sure whether the lack of difference between the groups is real, or was caused by the study being underpowered to detect any such differences.
  • The intervention was intended to apply “under free-living conditions”, where all lifestyle choices were allowed to vary naturally. However, it is difficult to gauge how accurately people complied with their allocated interventions. Compliance was said to be confirmed via self-report and verified via continuous glucose monitoring; however, this apparently happened only during the first and sixth weeks of the trial. It is unclear whether compliance was accurately measured during the intervening weeks.
  • The study only measures the effect of a very specific intervention: eating 3,000kJ for breakfast, or eating absolutely nothing except water before 12pm. This total fasting example is quite extreme, and its effects have only been measured over six weeks. We don’t know what the longer-term effects upon health would be. For example, the study did find that people who fasted had much more variable blood glucose control in the afternoon, and we don’t know what the longer-term effects of this pattern would be.
  • The study has also not measured the wider effects upon general feelings of wellbeing, emotions, concentration, lethargy, etc, that fasting may have. Participants in the fasting group were observed to do less physical activity in the morning, and this may have been an indicator of them feeling that they had less energy.
  • Future studies of different breakfast timings, different compositions (e.g. of carbohydrate, protein or fat) or different total calorie contents may be more informative than the comparison studied here of a 3,000kJ breakfast with a total fast before 12pm.

Overall, this study does not settle the debate on whether breakfast is the most important meal of the day, because it may have been underpowered to detect true differences and was quite narrow in its scope. Dr Betts, a senior lecturer in nutrition, metabolism and statistics, told the Mail Online that “It is certainly true that people who regularly eat breakfast tend to be slimmer and healthier, but these individuals also typically follow most other recommendations for a healthy lifestyle, so have more balanced diets and take more physical exercise." 

In normal life situations, breakfast does therefore seem to be linked to health in some way, though direct cause and effect is difficult to establish due to the influence of other health and lifestyle factors. However, this study does not provide many more answers about whether we should eat breakfast, or what type of breakfast we should eat.

However, based on this study alone we would not recommend missing breakfast, not least because it may have a negative impact on your mood; you could spend all morning feeling “hangry”.

If you have slipped into the habit of skipping breakfast, then it is never too late to break the habit.

Read about five breakfast recipes specifically designed for people who hate eating breakfast.

Analysis by Bazian. Edited by NHS Choices.

Links To The Headlines

Breakfast might NOT be the most important meal of the day after all: Scientists find it doesn't speed up the metabolism or aid weight loss. Mail Online, August 25 2014

Links To Science

Betts JA, Richardson JD, Chowdhury EA, et al. The causal role of breakfast in energy balance and health: a randomized controlled trial in lean adults. The American Journal of Clinical Nutrition. Published online June 4 2014


Autistic brain 'overloaded with connections'

NHS Choices - Behind the Headlines - Fri, 22/08/2014 - 12:30

"Scientists discover people with autism have too many brain 'connections'," the Mail Online reports. US research suggests that people with an autistic spectrum disorder have an excessive amount of neural connections inside their brain.

The headline is based on the results of a study that found that at post-mortem, brains of people with autism spectrum disorder (ASD) have more nerve cell structures called “dendritic spines” – which receive signals from other nerve cells – than the brains of people without ASD.

Brain development after birth involves both the formation of new connections and the elimination or "pruning" of other connections. The researchers concluded that people with ASD have a developmental defect in the pruning/elimination of dendritic spines.

Further examination of the brains of people with ASD found that more of the signalling protein mTOR was found to be in its activated state than in brains of people without ASD.

A process called autophagy, where older structures and proteins within cells are removed and broken down, was also impaired.

The researchers performed further experiments showing that mTOR signalling inhibits autophagy, and that without autophagy, pruning of dendritic spines does not occur.

Mice genetically engineered to have increased levels of activated mTOR signalling were found to display autistic-like symptoms. All of these could be reversed with treatment with an inhibitor of mTOR called rapamycin.

Rapamycin is a type of antibiotic, and is currently used in medicine as an immunosuppressant to prevent organ rejection after kidney transplant. However, it has been associated with a range of adverse effects so would be unsuitable for most people with ASD.

It is too soon to say whether this research could lead to any treatment for ASD, and even if it does it is likely to be a long way off.

 

Where did the story come from?

The study was carried out by researchers from Columbia Medical School, the Icahn School of Medicine at Mount Sinai and the University of Rochester. It was funded by the Simons Foundation.

The study was published in the peer-reviewed journal Neuron.

The results of the study were extremely well-reported by the Mail Online.

 

What kind of research was this?

This was a laboratory and animal study that aimed to determine whether a process called autophagy (the removal and degradation of cell structures and proteins) is involved in the remodelling of synapses (nerve connections), and whether this involves signalling through a protein called mTOR.

They also wanted to see whether this process was defective in autism spectrum disorder (ASD).

Laboratory and animal-based research is ideal for answering these sorts of questions. However, it means that any application to human health is probably a long way off.

 

What did the research involve?

The researchers initially examined at post-mortem the brains of people with ASD and people without ASD. They were particularly interested in nerve cell structures called “dendritic spines”, which receive signals from other nerve cells.

The researchers performed experiments with mice genetically engineered to have symptoms of ASD. In these mice models the signalling protein mTOR is dysregulated.

The researchers also performed further experiments to study the effects of mTOR dysregulation and blockage of autophagy.

 

What were the basic results?

From examining the brains of people with ASD and comparing them with the brains of people without ASD the researchers found that the density of dendritic spines was significantly higher in ASD.

Brain development after birth involves both the formation of new nerve connections and the pruning/elimination of others. The formation of new nerve connections exceeds pruning during childhood, but synapses are then eliminated during adolescence as connections are selected and matured.

When the researchers compared the brains of children (aged between two and nine) and adolescents (aged between 13 and 20) they found that spine density was slightly higher in children with ASD compared to controls, but was markedly higher in adolescents with ASD compared to controls.

From childhood through adolescence, dendritic spines decreased by approximately 45% in control subjects, but by only approximately 16% in those with ASD. The researchers concluded that people with ASD have a developmental defect in spine pruning/elimination.

The researchers found there were higher levels of the activated version of the signalling protein mTOR in adolescent ASD brains than brains without ASD. They also found ASD brains were not performing as much autophagy as brains without ASD.

The researchers then performed experiments using mice models of ASD that had dysregulated mTOR. They found the mice had spine pruning defects. These pruning defects could be improved by treating the mice with a chemical called rapamycin which inhibits mTOR. The nerve cells of the mice models of ASD also performed less autophagy, and this was also corrected by treating the mice with rapamycin. Rapamycin also improved social behaviour of the mice on behavioural tests.

 

How did the researchers interpret the results?

The researchers conclude that their “findings suggest mTOR-regulated autophagy is required for developmental spine pruning, and activation of neuronal autophagy corrects synaptic pathology and social behaviour deficits in ASD models with hyperactivated mTOR".

 

Conclusion

This study has found that brains of people with ASD have more nerve cell structures called “dendritic spines”, which receive signals from other nerve cells, than the brains of people without ASD. More of the signalling protein mTOR was found to be in its activated state and a process called autophagy, which the cell uses to remove and degrade cell structures and proteins, was impaired in brains from people with ASD.

Genetically engineered mice with hyperactivated mTOR display autistic-like symptoms, have more dendritic spine pruning defects and impaired autophagy. All of these could be reversed with treatment with an inhibitor of mTOR called rapamycin.

Rapamycin is a type of antibiotic, and is currently used in medicine as an immunosuppressant to prevent organ rejection after kidney transplantation.

However, it has been associated with a range of adverse effects. As the Mail points out, this research is in its very early stages. It mainly helps our understanding of the brain changes that may be involved in this condition.

It is too soon to say whether it could lead to any treatment for autism spectrum disorders, and even if it does it is likely to be a long way off.

Analysis by Bazian. Edited by NHS Choices.

Links To The Headlines

Scientists discover people with autism have too many brain 'connections'. Mail Online, August 22 2014

Links To Science

Tang G, Gudsnuk K, Kuo S, et al. Loss of mTOR-Dependent Macroautophagy Causes Autistic-like Synaptic Pruning Deficits. Neuron. Published online August 21 2014


Dual vaccine approach could help eradicate polio

NHS Choices - Behind the Headlines - Fri, 22/08/2014 - 12:00

Double vaccines "could hasten the end of polio", BBC News reports. Researchers in India found that using a combination of the oral and injected vaccines provided enhanced protection against the disease.

Polio is a viral infection that can cause paralysis and death. Thanks to initiatives such as the NHS Childhood Vaccination Schedule, it is now largely a disease of the past, endemic in only three countries: Afghanistan, Nigeria and Pakistan. It is hoped that the disease could be entirely eradicated in the same way as smallpox.

There are two types of polio vaccine: the oral polio vaccine, which contains weakened strains of polio, and a vaccine known as the Salk inactivated poliovirus vaccine (IPV), which contains chemically inactivated poliovirus and is given by injection.

A new study, performed in India, found that giving a booster injection of the Salk IPV to children who had already been given the oral vaccine boosted gut immunity. This was demonstrated by the fact that fewer children had virus in their faeces after they received a challenge dose (an additional dose) of oral vaccine.

On the basis of this study’s results, the World Health Organization (WHO) is recommending that at least one dose of Salk inactivated poliovirus vaccine is added to routine vaccination schedules, instead of the all-oral vaccination schedule that many countries have.

Hopefully, the ambition of eradicating polio will be achieved in the coming years.

 

Where did the story come from?

The study was carried out by researchers from the WHO, the US Centers for Disease Control and Prevention, Imperial College London, the Enterovirus Research Centre in India and Panacea Biotech Ltd. Funding was provided by the Rotary International Polio Plus Program.

The study was published in the peer-reviewed journal Science. This article is open-access, so is free to download and read.

The results of the study were well reported by BBC News. Additional insight into the challenges of vaccinating children in conflict-ridden areas, such as Taliban-dominated areas of Afghanistan, was also provided.

 

What kind of research was this?

This was a randomised controlled trial. The researchers wanted to see if giving children a booster injection with the Salk inactivated poliovirus vaccine (IPV) could boost “mucosal” immunity, which includes immunity in the gut. This is because poliovirus can replicate in the guts of people who have been vaccinated but who don’t have strong mucosal immunity, and can therefore continue to be spread in faeces.

 

What did the research involve?

To do this, they randomised 954 children in India (in three age groups: infants aged 6 to 11 months, children aged 5 and children aged 10) who had already been vaccinated with the oral polio vaccine to booster injections with:

  • the Salk IPV
  • another dose of the oral polio vaccine
  • no vaccine

Four weeks later, children received a challenge dose of the oral polio vaccine, and the researchers measured the amount of poliovirus that was in their faeces after 3, 7 and 14 days. The researchers were interested in two types of poliovirus: poliovirus type 1 and poliovirus type 3. They wanted to see if the booster injection with the Salk IPV reduced the number of children with either of these two polioviruses in the faeces.

 

What were the basic results?

Infants aged between 6 and 11 months
  • Booster injections with the Salk IPV significantly reduced the proportion of infants with type 3 poliovirus in their faeces compared to no vaccine, but did not significantly alter the proportion of infants with type 1 poliovirus in their faeces.
  • Another dose of the oral polio vaccine did not significantly alter the proportion of infants excreting poliovirus, compared to no vaccine.
Children aged 5
  • Booster injections with the Salk IPV significantly reduced the proportion of children aged 5 with type 1 or type 3 poliovirus in their faeces, compared to no vaccine.
  • Another dose of the oral polio vaccine did not significantly alter the proportion of children excreting poliovirus, compared to no vaccine.
Children aged 10
  • Booster injections with the Salk IPV significantly reduced the proportion of children aged 10 with type 1 or type 3 poliovirus in their faeces, compared to no vaccine.
  • Another dose of the oral polio vaccine also significantly reduced the number of children aged 10 with type 1 or type 3 poliovirus in their faeces, compared to no vaccine.
Overall

When all the age groups were considered together, booster injections with the Salk IPV significantly reduced the proportion of children with type 1 or type 3 poliovirus in their faeces compared to no vaccine, while another dose of the oral polio vaccine had no significant effect.

 

How did the researchers interpret the results?

The researchers conclude that their study "provides strong evidence that IPV boosts intestinal immunity among children with a history of multiple [oral poliovirus vaccine] doses more effectively than an additional [oral poliovirus vaccine] dose".

They go on to report that "as a result, the WHO is no longer recommending an all-[oral poliovirus vaccine] schedule; rather, it recommends that all-[oral poliovirus vaccine] using countries introduce [at least] one dose of the IPV into routine vaccination schedules".

 

Conclusion

This randomised controlled trial has found that a booster vaccination with the Salk inactivated poliovirus vaccine (IPV) can boost gut immunity against polioviruses in infants and children who have already received multiple doses of the oral vaccine.

It appears that receiving both vaccines is key, as the researchers report that the ability of the Salk IPV alone to induce gut immunity is limited. They say that studies in countries that do not use the oral vaccine show that more than 90% of children given the IPV excrete challenge poliovirus. However, the researchers also say the oral vaccine has been reported to give incomplete intestinal immunity that deteriorates over time.

Polio is transmitted by the faecal-oral route, either by exposure to faecally contaminated food or water, or by person-to-person contact. These findings are important, as in many of the parts of the world where polio is a problem, the standards of sanitation are poor. This means the potential for children to contract the disease by coming into contact with infected faeces passed by someone with weakened intestinal immunity is high.

The researchers also note one limitation to their study: it was performed in one district of India, and therefore extrapolation or generalisation of these findings must be done with caution. Despite this, on the basis of the results of this study the WHO is recommending that at least one dose of Salk IPV is added to routine vaccination schedules instead of the all-oral vaccination schedule that many countries have.

The UK vaccination schedule will remain unchanged, as all children are already given the IPV as part of the routine vaccination schedule.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Double vaccines 'could hasten the end of polio'. BBC News, August 22 2014

Polio double vaccine gives better protection, study finds. The Guardian, August 22 2014

Links To Science

Jafari H, Deshpande JM, Sutter RW, et al. Efficacy of inactivated poliovirus vaccine in India. Science. Published online August 22 2014

Categories: NHS Choices

Botox may be useful in treating stomach cancers

NHS Choices - Behind the Headlines - Thu, 21/08/2014 - 13:30

"Botox may have cancer fighting role," BBC News reports after research involving mice found using Botox to block nerve signals to the stomach may help slow the growth of stomach cancers. Botox, short for botulinum toxin, is a powerful neurotoxin that can block nerve signals.

The researchers studied genetically modified mice designed to develop stomach cancer as they grew older.

They found that mice treated with Botox injections had improved survival rates, because the cancer spread at a reduced rate or was prevented from developing in the first place.

Cutting the nerve supply to the stomach during an operation called a vagotomy had a similar effect.

In mice that had already developed stomach cancer, Botox injections reduced cancer growth and improved survival rates when combined with chemotherapy.

Further studies of human stomach cancer samples confirmed the finding that nerves play a role in tumour growth.

An early-phase human trial is now underway in Norway to determine the safety of such a procedure and to work out how many people would need to be treated in further trials to see whether the treatment is effective.

 

Where did the story come from?

The study was carried out by researchers from the Norwegian University of Science and Technology in Trondheim, Columbia University College of Physicians and Surgeons in New York, and universities and institutes of technology in Boston, Germany and Japan.

It was funded by the Research Council of Norway, the Norwegian University of Science and Technology, St Olav's University Hospital, the Central Norway Regional Health Authority, the US National Institutes of Health, the Clyde Wu Family Foundation, the Mitsukoshi Health and Welfare Foundation, the Japan Society for the Promotion of Science Postdoctoral Fellowships for Research Abroad, the Uehara Memorial Foundation, the European Union Seventh Framework Programme, the Max Eder Program of the Deutsche Krebshilfe and the German Research Foundation.

The study was published in the peer-reviewed medical journal Science Translational Medicine.

The study was reported accurately by the UK media, which made clear that this potential treatment is not yet available and that it will take years to assess its potential.

 

What kind of research was this?

This research was a collection of experiments on mice and studies of human tissue samples. Previous research had shown that cutting the main nerve to the stomach (vagus) in a procedure called a vagotomy reduces the thickness of the stomach wall and decreases cell division.

Another research study found people who had a vagotomy had a 50% reduced risk of developing stomach cancer 10 to 20 years later. The researchers wanted to see if targeting the nerve would reduce stomach cancer growth.

 

What did the research involve?

Genetically modified mice designed to develop stomach cancer by 12 months of age were studied to see if there was a link between the density of nerves and stomach cancer.

One of four different types of operation was then performed on the vagus nerve of 107 genetically modified mice at the age of six months to see if this made a difference to the development of stomach cancer. The operation was one of:

  • a sham operation
  • pyloroplasty (PP) – surgery to widen the valve at the bottom of the stomach so the stomach can empty food more easily
  • bilateral vagotomy with pyloroplasty (VTPP) – cutting both sections of the vagus nerve and widening the valve
  • anterior unilateral vagotomy (UVT) – cutting just the front section of the vagus nerve

The researchers then performed a Botox procedure on another set of mice by injecting the anterior vagus nerve (front section) when they were six months old to see if this reduced the development of stomach cancer.

To see if cutting or injecting the nerve had any effect after stomach cancer had developed, the researchers performed UVT on mice aged 8, 10 or 12 months and compared their survival rate with mice who had not had the intervention.

They then injected Botox into the stomach cancer of mice aged 12 months and looked at the subsequent cancer growth. They also compared survival rates for chemotherapy with saline injection, chemotherapy with Botox and chemotherapy with UVT.

The researchers then examined human stomach samples from 137 people who had undergone an operation for stomach cancer, to look at how active the nerves were in the sections of cancer compared with normal tissue.

They also compared tissue samples of 37 people who had already had an operation for stomach cancer, but then developed stomach cancer in the base portion of the stomach. The vagus nerve had been cut in 13 of these people.

 

What were the basic results?

The genetically modified mice mostly developed stomach cancer in the section of the stomach that had the highest density of nerves.

Cutting the vagus nerve supply reduced the incidence of tumours developing. The percentage of mice that had tumours six months after the operation was:

  • 78% after the sham surgery
  • 86% after PP
  • 17% after VTPP
  • 14% in the front section of the stomach (where the nerve had been cut) and 76% in the back section (where the vagus nerve was still intact) after UVT

Six months after the Botox injection into the anterior vagus nerve, the mice still developed stomach cancer. However, the size of the tumour and number of dividing cancer cells in the front section of the stomach was less than half that of the back section.

In mice that had already developed stomach cancer, the normal survival rate was 53% by 18 months, but this was increased by the UVT to:

  • 71% if the UVT was performed at 8 months
  • 64% if the UVT was performed at 10 months
  • 67% if the UVT was performed at 12 months

Botox injection into the stomach tumours of mice reduced the growth by roughly half. Botox and chemotherapy improved mouse survival compared with chemotherapy on its own, as did UVT and chemotherapy.

In the human samples, there was evidence of increased nerve activity in the cancer sections of tissue compared with the normal tissues. This was higher in more advanced tumours.

All 24 people who had not had the vagus nerve cut developed stomach cancer in the base, as well as the remaining front and back sections of the stomach. Only one of the 13 people who had had the vagus nerve cut developed cancer in the front or back section of the stomach, suggesting that the nerve needed to be intact for cancer to develop.

 

How did the researchers interpret the results?

The researchers say that their "finding that nerves play an important role in cancer initiation and progression highlights a component of the tumour microenvironment contributing to the cancer stem cell niche.

"The data strongly supports the notion that denervation and cholinergic antagonism, in combination with other therapies, could represent a viable approach for the treatment of gastric cancer and possibly other solid malignancies."

 

Conclusion

These laboratory experiments show that nerves have a role in the development and advancement of stomach cancer. The early experiments in mice found that stopping the nervous supply by either cutting the vagus nerve or injecting it with Botox improved survival rates and reduced cancer growth.

The Botox injections were not performed on any humans in this study. However, an early-phase clinical trial in humans with inoperable stomach cancer began in Norway in January 2013, with the results expected in 2016.

This will determine the safety of such a procedure and work out the number of people who would need to be treated in a larger controlled trial to see whether the treatment is effective.

You can reduce your risk of stomach cancer by quitting smoking if you smoke and moderating your consumption of salt and smoked meats, such as pastrami.

Stomach cancer has also been linked to a chronic infection by H. pylori bacteria, a common cause of stomach ulcers.

If you find yourself having persistent bouts of indigestion or stomach pain, you should contact your GP for advice. The symptoms could be caused by an H. pylori infection, which is relatively straightforward to treat.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Botox may have cancer fighting role. BBC News, August 21 2014

Botox could be used as new treatment for stomach cancer as scientists discover anti-wrinkle treatment slows tumour growth. Daily Mail, August 21 2014

Botox 'could be used to treat stomach cancer'. Daily Mirror, August 21 2014

Botox could halt stomach cancer. The Daily Telegraph, August 21 2014

Links To Science

Zhao C, Hayakawa Y, Kodama Y, et al. Denervation suppresses gastric tumorigenesis. Science Translational Medicine. Published online August 20 2014

Categories: NHS Choices

'Fat and 30' link to dementia is inconclusive

NHS Choices - Behind the Headlines - Thu, 21/08/2014 - 12:00

“People as young as 30 who are obese may be at greater risk [of dementia],” The Independent reports.

This UK study examined a fixed 14-year period (1998 to 2011) and looked at whether NHS hospital records documenting obesity in adults above the age of 30 were associated with subsequent hospital or mortality records documenting dementia in the remaining years of the study.

Overall there was actually no significant association between obesity and dementia in later life.

When the researchers broke the data down into 10-year age bands (30s, 40s, 50s and 60s), they found that people in these age groups had an increased risk of dementia. However, it must be remembered that the researchers were not looking at lifetime dementia diagnoses, but only at diagnoses in the remaining years of the study. Very few people in the younger age groups would have developed dementia over the following few years.

For example, the study found a more than trebled risk of dementia for people with obesity in their 30s, but this was based on only 19 people who developed dementia during the remaining years of the study. Calculations based on small numbers may be less reliable and should be given less "weight".

As expected, the greatest number of subsequent dementia diagnoses occurred in people who were 70 or above when obesity was assessed, yet obesity did not increase dementia risk in these people.

Aside from any dementia link, overweight and obesity are well established as being associated with a variety of chronic diseases, and a healthy weight should be the aim.

 

Where did the story come from?

The study was carried out by two researchers from the University of Oxford and was funded by the English National Institute for Health Research.

The study was published in the peer-reviewed Postgraduate Medical Journal.

The UK media failed to report the various limitations of this research. These include the lack of a significant association with dementia overall for the total cohort.

And while significant associations for people between the ages of 30 and 60 were found, these are based on only very small numbers who developed dementia during the study so may be less reliable.

The links between obesity and vascular dementia specifically do seem to be more apparent, but this is to be expected.

It is also not clear where the 50% increased risk for people in middle age, reported in the media, comes from.

 

What kind of research was this?

This was a retrospective cohort study that aimed to examine how obesity in middle age may be associated with the risk of subsequent dementia.

The researchers say the worldwide prevalence of dementia in 2010 was around 35.6 million cases, a figure estimated to double to 65.7 million by 2030.

Meanwhile we are in the midst of an obesity epidemic, with the World Health Organization reporting that in 2008 just over a third of all adults were overweight (BMI over 25kg/m²) while 10% of men and 14% of women were obese (BMI over 30kg/m²).

As the researchers say, with the rapidly increasing burden of dementia, it is important to identify the modifiable risk factors associated with it. The researchers say there is growing evidence that mid-life obesity is associated with "dementia" overall.

Dementia is just the general term for problems with memory and thinking, which has different causes. Alzheimer’s disease is the most common cause of dementia, which is associated with characteristic symptoms and changes in the brain (the formation of protein plaques and tangles). The causes of Alzheimer’s are not fully understood, with increasing age and genetic factors being the most well established. Overweight and obesity are not currently established as risk factors for Alzheimer’s disease.

Meanwhile, vascular dementia – the second most common cause – has the same risk factors as cardiovascular disease, so there would be a plausible link between obesity and this type of dementia.

This study simply examined a fixed 14-year period (1998 to 2011) and looked at whether hospital records documenting obesity in adults of different ages were associated with subsequent documentation of dementia in the remaining years of the study.

 

What did the research involve?

This study used Hospital Episode Statistics (HES) data, which includes data for all hospital admissions including day cases in NHS hospitals in England between April 1998 and December 2011. They also linked with the Office for National Statistics (ONS) to identify deaths up to December 2011.

The researchers identified a cohort of people with obesity by looking for the first admission or day care visit where obesity was recorded as a diagnosis (according to the International Classification of Diseases [ICD] codes). They identified a comparison control cohort without obesity who had received day care or hospital admission for various medical or surgical conditions, or injuries. They only included adults in the obesity and comparison groups who were aged 30 or older and who did not have an admission for dementia at the same time as, or before, the date of the admission when obesity was recorded.

For the obesity and comparison groups they searched the HES and ONS databases for all subsequent hospital care or deaths from dementia (according to ICD codes). The researchers say they subdivided admissions into those specifically documented to be due to Alzheimer’s disease or vascular dementia, and separately examined men and women.

They grouped obesity and comparison groups into 10-year age bands at the time obesity was first recorded, then compared their risk of dementia in the subsequent years. Adjustment was made for sex, time period of the study, region of residence and deprivation score.  

 

What were the basic results?

There were 451,232 adults in the obesity cohort, 43% of whom were male (number in the comparison cohort not specifically reported).

Overall, compared with controls, for the total cohort of all adults aged 30 or above, there was no statistically significant association between a hospital record of obesity and a subsequent record of dementia in the remaining years of the study (relative risk [RR] 0.98, 95% confidence interval [CI] 0.95 to 1.01).

However, when the cohort was split into 10-year age bands, there was an increased risk of subsequent dementia for people with obesity recorded in the following age brackets:

  • 30 to 39 (RR 3.48, 95% CI 2.05 to 5.61)
  • 40 to 49 (RR 1.74, 95% CI 1.33 to 2.24)
  • 50 to 59 (RR 1.48, 95% CI 1.28 to 1.69)
  • 60 to 69 (RR 1.39, 95% CI 1.31 to 1.48)
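To illustrate what a relative risk and its 95% confidence interval represent, here is a minimal sketch using the standard log-RR (Katz) method. The counts are made up for illustration only; they are not the study's data, whose RRs were adjusted for sex, time period, region and deprivation, so raw-count arithmetic would not reproduce them.

```python
import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B, with a 95% CI derived
    from the standard error of log(RR) (the Katz log method)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts: 19 dementia cases among 10,000 people with
# obesity vs 55 among 100,000 controls (illustrative numbers only).
rr, lo, hi = relative_risk_ci(19, 10_000, 55, 100_000)
```

A wide interval such as this one (driven by the small event count in the exposed group) is exactly why results based on few cases deserve less "weight".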

There was no significant association between obesity and dementia for people with obesity between the ages 70 and 79, and an apparent decrease in risk of dementia for people above the age of 80 with obesity. 

When they looked at specific types of dementia, there was no clear link between obesity and Alzheimer’s disease. For the full cohort of adults aged 30 or over, obesity actually seemed to decrease the risk of subsequently developing Alzheimer’s disease (RR 0.63, 95% CI 0.59 to 0.67). By age group, there was an apparent increased risk for those with obesity aged 30 to 39 (RR 5.37, 95% CI 1.65 to 13.7), no association for those aged 40 to 59, and a decreased risk of Alzheimer’s for those with obesity above the age of 60.

Obesity seemed to have a clearer link with the risk of vascular dementia. The full cohort of adults aged 30 or over recorded as having obesity had a 14% increased risk of vascular dementia in the subsequent years of the study (RR 1.14, 95% CI 1.08 to 1.19). There were also significantly increased risks for all age groups up to the age of 69. For the 70 to 79 age group there was no association, and for adults over the age of 80, obesity again seemed to decrease the risk.

 

How did the researchers interpret the results?

The researchers conclude that: “Obesity is associated with a risk of dementia in a way that appears to vary with age. Investigation of the mechanisms mediating this association might give insights into the biology of both conditions.”

 

Conclusion

As the researchers say: “The dataset spans 14 years and is therefore just a snapshot of people's lifetime experience of obesity.” The study looked at a fixed 14-year period (1998 to 2011) and at whether hospital records documenting obesity in adults of different ages were associated with subsequent documentation of dementia in the remaining years of the study.

Therefore, not only is the study looking at a snapshot of obesity in a 14-year period, it is also looking at just a snapshot of time in which people could develop dementia in the remaining years of the study. For those in the cohort who were in their 70s or 80s when their obesity was recorded, you may expect that the study could have a better chance of capturing whether those people were ever going to develop dementia in their lifetime. However, for most of the people in the cohort who were between the ages of 30 and 60, their likelihood of developing dementia in the remaining few years of the study is low.

Therefore, this study cannot reliably show whether or not obesity in mid-life is associated with developing dementia, as the follow-up timeframe will not have been long enough for most people. 

The main result of this study was that for all adults in the cohort there was no association between a hospital record of obesity and risk of any type of dementia in the subsequent years of the study.

Though the research did then find increased risks for 10-year age bands in the 30s, 40s, 50s and 60s, many of these analyses are based on only small numbers of people who developed dementia in the remaining years of the study.

For example, the more than trebled risk of dementia for people with obesity in their 30s was based on only 19 people who developed dementia during the remaining years of the study. An analysis based on such a small number of people has a much higher chance of error.

The 39% increased risk for people with obesity in their 60s was more reliable as this included 1,037 people from this age band who subsequently developed dementia.

But then the pattern is less clear, as for people with obesity in their 70s, of whom the largest number developed dementia (2,215), there was no association between obesity and dementia.

Meanwhile people who were obese in their 80s seemed to have decreased risk of then developing dementia.

Overall this makes a confusing picture from which to obtain any clear understanding of how obesity is associated with dementia. And it seems possible that various confounding hereditary, health and lifestyle factors may be having an influence.

Looking at Alzheimer’s disease specifically, there was no clear link with adult obesity, so the study doesn’t provide evidence of obesity as a modifiable risk factor for the most common type of dementia. The only increased risk was for people with obesity in their 30s, but as only five people in this group developed Alzheimer’s in the remaining study years, this risk association is far from reliable. In fact, for people over the age of 60, obesity apparently seems to be protective against Alzheimer’s for some reason, though again it is highly possible this could be due to confounding from other factors.

As said, vascular dementia – the second most common type – has the same risk factors as cardiovascular disease, so there is a plausible link between obesity and this type of dementia. This study supports that link, finding that for the overall cohort of all adults above the age of 30, obesity was associated with a 14% increased risk of vascular dementia.

Another point to bear in mind is that, although this study benefits from a large, reliable dataset of HES and ONS data, in which obesity and dementia are recorded using valid diagnostic codes, it is of course only looking at hospital presentations of both conditions.

It is therefore unable to capture the large number of people with both of these conditions who may not have accessed hospital care.

Overall, this study contributes to the literature examining how the obesity epidemic may be associated with the growing prevalence of dementia worldwide. However, it provides little in the way of conclusive answers.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Further evidence that obesity in middle age increases dementia risk. The Independent, August 21 2014

How being fat in your 30s could triple the risk of dementia: Person's age at which they are classified as obese found to be key in chance of developing condition. Daily Mail, August 21 2014

Slim to reduce the risk of dementia, middle-aged told. The Times, August 21 2014

Dementia risk TRIPLES if you get fat in your thirties. Daily Express, August 21 2014

Links To Science

Wotton CJ, Goldacre MJ. Age at obesity and association with subsequent dementia: record linkage study. Postgraduate Medical Journal. Published online August 20 2014

Categories: NHS Choices

Could failure to breastfeed cause depression?

NHS Choices - Behind the Headlines - Wed, 20/08/2014 - 12:30

Mothers who plan, but are unable, to breastfeed their babies are more likely to suffer from postnatal depression, report BBC News and The Independent.

A study of 14,000 women in England found that those who planned to breastfeed but had not managed to were two-and-a-half times more likely to develop postnatal depression, compared to women who had no intention of breastfeeding.

Around 1 in 10 women develop postnatal depression, which is not the same as the “baby blues”, but a serious illness that can affect a mother’s ability to bond with her baby. It can also affect the baby’s longer-term development.

It can develop within the first six weeks of giving birth, but is often not apparent until around six months. It’s important to get professional help if you think you may be suffering from this illness.

The study had several limitations. For example, both antenatal and postnatal depression were self-reported rather than clinically diagnosed, which may make the results less reliable.

Due to the nature of the study’s design, it cannot prove that not breastfeeding raises the risk of postnatal depression. However, it highlights the need to support new mothers who want to breastfeed but are unable to do so.

 

Where did the story come from?

The study was carried out by researchers from the University of Seville, University of Cambridge, University of Essex and University of London. It was funded by the UK’s Economic and Social Research Council. The study was published in the peer-reviewed Maternal and Child Health Journal.

The Mail Online’s claim that “choosing not to” breastfeed doubles the risk of postnatal depression was misleading and oversimplified the study’s results.

The media did not point out that the majority of results were compared to women who did not want to breastfeed (and, subsequently, didn’t). For example, the doubled risk of postnatal depression for women who wanted to breastfeed but couldn’t was compared to women who did not want to breastfeed and didn’t. Most of the associations reported by the media were only significant at eight weeks after birth, and not significant beyond that.

As the authors point out, their results on the association between maternal depression and breastfeeding were very mixed. The link between not breastfeeding and postnatal depression seems to depend on whether or not a woman planned to breastfeed in the first place, as well as her mental health during pregnancy.

 

What kind of research was this?

Researchers used data from a longitudinal survey of about 14,000 children born in the early 1990s, conducted by the University of Bristol, which looked at child health and development.

The authors point out that about 3% of women experience postpartum depression (PPD) within 14 weeks of giving birth. Overall, as many as 19% of women have a depressive episode during pregnancy or in the three months after birth. However, they say the effects of breastfeeding on the risk of PPD are not well understood.

The researchers aimed to examine how breastfeeding affects a mother’s mental health and, in particular, if the relationship between breastfeeding and maternal mental health is mediated by whether or not the mother intended to breastfeed.

The relationship between breastfeeding and the risk of PPD, they say, may be driven by biological factors, such as the difference in hormone levels between breast- and formula-feeding mothers. However, it may also be affected by feelings of success or failure over breastfeeding.

As this was a cohort study, it can only show an association; it cannot prove that not breastfeeding causes PPD.

 

What did the research involve?

The researchers used a sample of just over 14,000 women, who were recruited into the survey by doctors, when they first reported their pregnancy. Data for the study was collected by questionnaires administered to both parents at four points during pregnancy, and at several stages following birth.

Researchers used a validated measure of depression called the Edinburgh Postnatal Depression Scale (EPDS), which is designed to screen for PPD. This was conducted when women were 18 and 32 weeks pregnant. They conducted it again at 8 weeks, and 8, 18 and 33 months after the birth. 

The EPDS consists of 10 questions, each with four possible answers, to describe the severity of depressive symptoms. Total scores range from 0 to 30. Following guidelines, the researchers used a score of more than 14 to indicate depression during the antenatal period and more than 12 to indicate depression after birth.
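As a rough sketch of how the scoring described above would be applied (the function name and the example answers are illustrative, not taken from the study):

```python
def epds_depression_flag(item_scores, postnatal=True):
    """Sum 10 EPDS item scores (each 0-3, total 0-30) and apply the
    study's cut-offs: score > 12 after birth, > 14 during pregnancy."""
    if len(item_scores) != 10 or any(s not in (0, 1, 2, 3) for s in item_scores):
        raise ValueError("EPDS has 10 items, each scored 0-3")
    total = sum(item_scores)
    threshold = 12 if postnatal else 14
    return total, total > threshold

# A hypothetical set of answers totalling 14: flagged postnatally
# (14 > 12) but not antenatally (14 is not > 14).
total, flagged = epds_depression_flag([2, 1, 3, 0, 2, 2, 1, 1, 2, 0])
```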

Mothers were asked during pregnancy how they intended to feed their babies for the first four weeks. Following their child’s birth, they were asked at several points how they were actually feeding, and the ages at which infant formula and solid foods were introduced.

Researchers included in their analysis how long mothers had breastfed for and how long they had breastfed exclusively.

They identified four groups of women: 

  • mothers who had not planned to breastfeed, and who did not breastfeed (reference group)
  • mothers who had not planned to breastfeed, but who did actually breastfeed
  • mothers who had planned to breastfeed, but who did not actually breastfeed
  • mothers who had planned to breastfeed, and who did actually breastfeed

Using statistical methods, they presented several models of the relationship between breastfeeding and depression, controlling for different factors such as the child’s sex, parents' education and information on the pregnancy and birth. The most reliable model takes account of as many factors as possible, including the mother’s physical and mental health, whether she was depressed in pregnancy, the quality of her personal relationships and the experience of stressful life events. 

After conducting this analysis for the whole sample, they split the sample into mothers who were and who were not depressed during pregnancy; for each group, they examined the differences in outcomes between women who had planned to breastfeed, and women who had not.

 

What were the basic results?

Researchers found that 7% of women suffered depression at 18 weeks of pregnancy and 8% at 32 weeks. 9-12% of new mothers suffered from PPD. 

Breastfeeding was initiated by 80% of mothers and 74% breastfed for one week or more. By four weeks, 56% of mothers were breastfeeding at all and 43% were breastfeeding exclusively.

Researchers found that, for the sample as a whole, there was little evidence of a relationship between breastfeeding and the risk of PPD. After adjusting for all of the factors, women who breastfed exclusively for four weeks or more were 19% less likely to have PPD eight weeks after giving birth (odds ratio [OR] 0.81, 95% confidence interval [CI] 0.68 to 0.97). The association was not significant at 8, 21 or 33 months.

However, they then calculated the results according to whether mothers had been depressed during pregnancy, and whether they had planned to breastfeed their babies. 

In mothers without any depressive symptoms during pregnancy, they found that the lowest risk of PPD by 8 weeks was among women who had planned to breastfeed and did so. For example, compared to women who did not plan to breastfeed and didn’t, women who exclusively breastfed for 2 weeks or more were 42% less likely to develop PPD by 8 weeks (OR 0.58, 95% CI 0.35 to 0.96).

The highest risk was found among women who had planned to breastfeed, but had not initiated breastfeeding. They were two-and-a-half times more likely to develop PPD by 8 weeks compared to women who did not plan to breastfeed and didn’t (OR 2.55, 95% CI 1.34 to 4.84).

For women who had shown signs of depression during pregnancy, there was no difference in risk of PPD for women who had planned to breastfeed but couldn’t. The only statistically significant result was for those women who had not planned to breastfeed, but did exclusively for four weeks. Their risk of PPD was reduced by 58% compared to women who had not planned to breastfeed and didn’t (OR 0.42, 95% CI 0.20 to 0.90).

There was no significant difference in risk of PPD between any of the planned or not planned breastfeeding groups at 8, 21 or 33 months.
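The percentage figures quoted in these results follow directly from the reported odds ratios. As a rough illustration (a simplification, since an odds ratio only approximates relative risk when the outcome is fairly rare, as PPD is here):

```python
# Sketch of the arithmetic behind the quoted percentages: an odds
# ratio (OR) below 1 is reported as a (1 - OR) reduction in odds,
# and an OR above 1 as an (OR - 1) increase.

def percent_change(odds_ratio):
    """Express an odds ratio as a percentage reduction or increase."""
    if odds_ratio < 1:
        return f"{round((1 - odds_ratio) * 100)}% less likely"
    return f"{round((odds_ratio - 1) * 100)}% more likely"

print(percent_change(0.81))  # 19% less likely
print(percent_change(0.58))  # 42% less likely
print(percent_change(0.42))  # 58% less likely
print(percent_change(2.55))  # 155% more likely, i.e. about two-and-a-half times
```

This is why an OR of 2.55 is described in the text as "two-and-a-half times more likely".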

 

How did the researchers interpret the results?

The authors say the effect of breastfeeding on the risk of maternal depression depends on whether the mother intended to breastfeed during pregnancy, and on her mental health.

“Our results underline the importance of providing expert breastfeeding support to women who want to breastfeed, but also of providing compassionate support for women who had intended to breastfeed, but who find themselves unable to,” they argue.

 

Conclusion

This is a useful study but, as the authors point out, it does have some limitations. Both antenatal and postnatal depression were self-reported rather than clinically diagnosed, which may make the results less reliable.

The fact that the study consisted of parents who had voluntarily entered it may also have led to bias. It’s worth noting that 95% of the women were white, so the results may not be generalisable to mothers from ethnic minorities.

Finally, although the researchers controlled for many possible confounders, there is the possibility that some unmeasured factor may have influenced results, such as a mother’s personality or IQ.

Many mothers who wish to breastfeed may find it difficult to do so for a range of reasons, but professional support can help. Postnatal depression is serious, but treatment is available.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Failing to breastfeed may double risk of depression in mothers: study. The Daily Telegraph, August 19 2014

Breastfeeding 'cuts depression risk', according to study. BBC News, August 20 2014

Mothers who breastfeed are 50% less likely to suffer postnatal depression. The Independent, August 20 2014

Links To Science

Borra C, Iacovou M, Sevilla A. New Evidence on Breastfeeding and Postpartum Depression: The Importance of Understanding Women’s Intentions. Maternal and Child Health Journal. Published online August 2014

Categories: NHS Choices

Common antibiotic linked to 'tiny' rise in heart deaths

NHS Choices - Behind the Headlines - Wed, 20/08/2014 - 11:15

An antibiotic given to millions of people in the UK to treat chest infections has been linked to an increased risk of heart death, report The Daily Telegraph and The Independent.

A Danish study of three antibiotics found the risk of death from any heart condition while taking the antibiotic clarithromycin is slightly higher than with penicillin V.

Clarithromycin is used for respiratory infections, and 2.2 million doses were prescribed in England in 2013. However, it is not recommended for people with abnormal heart rhythms.

Researchers compared the number of people who had a heart-related death after being put on a course of either clarithromycin, roxithromycin (not used in the UK) or penicillin.

The study, published online in the British Medical Journal, found there were an extra 37 cardiac deaths per 1 million courses of clarithromycin compared with penicillin.

But the risk was still very low. As this was a cohort study, it cannot prove that any of these deaths were as a result of taking clarithromycin, as it did not account for all of the other factors that could have influenced the results.

In particular, major risk factors for heart conditions such as smoking and obesity were not included in the analyses. When all factors the researchers did record were accounted for, there was no longer any statistically significant difference between clarithromycin and penicillin.

This study should not cause unnecessary concern – although there appears to be an increase in risk, the absolute risk remains tiny, at around 0.01%.

 

Where did the story come from?

The study was carried out by researchers from the Statens Serum Institut in Copenhagen. They report that the study received no funding.

It was published in the peer-reviewed British Medical Journal (BMJ). It is available to read on the BMJ website.

The media reported the story reasonably accurately, but on the whole failed to point out quite how low the risk of cardiac death is on these antibiotics.

There were good quotes from UK experts about the fact that all drugs have some side effects and should therefore only be taken if they are really needed – this is particularly important for antibiotics given the increase in antibiotic resistance.

 

What kind of research was this?

This was a cohort study. It aimed to see if there was an increased risk of cardiac death while taking clarithromycin or roxithromycin compared with penicillin V.

Penicillin V is an antibiotic used for treating bacterial infections of the ear, throat, chest, skin and soft tissues.

Clarithromycin is an antibiotic used to treat bacterial chest infections, throat or sinus infections, skin and soft tissue infections, and Helicobacter pylori associated with peptic ulcers. It is not recommended for people with abnormal heart rhythms.

Roxithromycin is a similar type of antibiotic, but it is not used in the UK. All three are also used as prophylactic medication to prevent infections for people who are immunocompromised.

As this was a cohort study, it cannot prove that clarithromycin caused any cardiac deaths. This is because it does not take into account confounding factors that may have influenced the results. A randomised controlled trial would be required to prove causation.

 

What did the research involve?

The researchers compared the number of people who had a cardiac death during or in the 30 days after an outpatient course of either clarithromycin or roxithromycin, compared with penicillin V.

The nationwide Danish National Prescription Registry was used to identify all adults aged 40 to 74 who collected prescriptions for each antibiotic between 1997 and 2011.

Each time a person had a prescription for one of the drugs, it was included in the analysis as long as they were not in hospital and had not been prescribed an antibiotic in the previous 30 days. This means some people would have been included more than once if they had more than one antibiotic prescription.

The researchers collected data on cardiac deaths from the Danish Register of Causes of Death and looked at whether there was an association between taking either clarithromycin or roxithromycin compared with penicillin V, and having a cardiac death.

They looked at whether people had a cardiac death during the following two periods:

  • the seven days of likely antibiotic use from the start date of the prescription
  • eight to 37 days after the start date of the prescription

The researchers excluded people with serious disease (including cancer, neurological diseases or liver disease) and those deemed to be at high risk of death from non-cardiac causes.

They adjusted their analyses for a number of confounders, including sex, age, place of birth, time period, season, medical history, prescription drug use in the previous year, and use of healthcare in the previous six months.

 

What were the basic results?

There were 285 cardiac deaths during the first seven days after antibiotic prescription from a total of more than 5 million antibiotic prescriptions that met the study inclusion criteria. Of these, there were:

  • 18 deaths during 160,297 courses of clarithromycin (0.01%), incidence rate of cardiac death 5.3 per 1,000 person years
  • 235 deaths during 4,355,309 courses of penicillin V (0.005%), incidence rate of cardiac death 2.5 per 1,000 person years
  • 32 deaths during 588,988 courses of roxithromycin (0.005%), incidence rate of cardiac death 2.5 per 1,000 person years
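The absolute risks behind these counts can be checked directly from the figures quoted above (a sketch; note that the "extra 37 deaths per million courses" quoted later is the researchers' adjusted estimate, not this crude calculation):

```python
# Sketch: crude risk of cardiac death per antibiotic course, computed
# from the death counts and course numbers quoted above. This shows
# where the 0.01% and 0.005% figures in the article come from.

def risk_percent(deaths, courses):
    """Crude risk of cardiac death per course, as a percentage."""
    return 100 * deaths / courses

clarithromycin = risk_percent(18, 160_297)     # ~0.011%
penicillin_v   = risk_percent(235, 4_355_309)  # ~0.005%
roxithromycin  = risk_percent(32, 588_988)     # ~0.005%

print(f"clarithromycin {clarithromycin:.3f}% vs penicillin V {penicillin_v:.3f}%")
```

Even the higher clarithromycin figure amounts to roughly one cardiac death per 10,000 courses, which is why the article stresses that the individual risk is minimal.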

After taking into account sex, age, cardiac risk score and the use of other drugs that are metabolised in a similar way, clarithromycin was associated with a 76% higher risk of cardiac death than penicillin V (adjusted rate ratio 1.76, 95% confidence interval [CI] 1.08 to 2.85).

The researchers say this would be equivalent to 37 extra cardiac deaths per 1 million treatment courses associated with clarithromycin compared with penicillin V (95% CI 4 to 90). Roxithromycin was not associated with an increased risk.

The risk was higher in women on clarithromycin (adjusted rate ratio 2.83 [95% CI 1.50 to 5.36]) than in men (adjusted rate ratio 1.09 [95% CI 0.51 to 2.35]), although the difference was not statistically significant.

When the researchers performed an additional analysis, matching people who had taken clarithromycin with people who had taken penicillin according to sex, age, place of birth, time period, season, medical history, prescription drug use in the previous year and use of healthcare in the previous six months, the increased risk of cardiac death with clarithromycin compared with penicillin was no longer statistically significant (rate ratio 1.63, 95% CI 0.87 to 3.03).

Between 8 and 37 days after antibiotic prescription, when it was assumed that people had finished taking antibiotics, there were 364 cardiac deaths. Of these, there were:

  • 14 deaths after clarithromycin, incidence rate 1.3 per 1,000 patient years
  • 308 deaths after penicillin V, incidence rate 1.0 per 1,000 patient years
  • 42 deaths after roxithromycin, incidence rate 1.0 per 1,000 patient years

Neither clarithromycin nor roxithromycin was associated with an increased risk of cardiac death compared with penicillin after the presumed seven-day course.

 

How did the researchers interpret the results?

The researchers concluded this study "found a significantly increased risk of cardiac death associated with current use of clarithromycin, but not roxithromycin".

However, they also acknowledged that, "Before these results are used to guide clinical decision making, confirmation in independent populations is an urgent priority given the widespread use of macrolide antibiotics".

Clarithromycin and roxithromycin both belong to the macrolide class of antibiotics.

 

Conclusion

The conclusion that the risk of cardiac death during the use of clarithromycin is 76% higher than that for penicillin V was based on a small number of cardiac deaths. Cardiac death occurred during just 0.01% of clarithromycin prescriptions, compared with 0.005% of penicillin V prescriptions.

A death rate just a bit higher than a very small death rate is still very small. This means that from an individual point of view, the risk of cardiac death from taking either antibiotic is minimal.

This study does not prove clarithromycin caused any cardiac deaths. It only showed a very small increased risk of cardiac death in the seven days after the prescription was collected in a select group of people. This did not include:

  • antibiotic use in hospitals
  • people with serious illnesses
  • long-term prophylactic use (to prevent infections), such as for those who are immunocompromised
  • people who did not improve and required an alternative antibiotic

The study also has several other limitations, including:

  • major risk factors for cardiac death, such as smoking and obesity, were not taken into account
  • the reason for taking each antibiotic was not known – clarithromycin is used for more types of infections than penicillin V, which may have influenced the results
  • clarithromycin is commonly used for people who are allergic to penicillin, but this factor was not assessed in the study
  • it was assumed that people who collected their prescriptions took the medication as prescribed for seven days

Also, in the researchers' additional analysis, which matched people who had taken clarithromycin with people who had taken penicillin according to sex, age, place of birth, time period, season, medical history, prescription drug use in the previous year and use of healthcare in the previous six months, the increased risk of cardiac death with clarithromycin was no longer statistically significant.

It is already known that clarithromycin can affect the rhythm of the heart, and it is not recommended for people who have irregular heart rhythms. However, the study did not specifically look at cardiac deaths caused by an abnormal rhythm; instead, it grouped together all causes of death related to heart problems. This further limits the ability to establish how clarithromycin might be increasing this very small risk.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Antibiotic 'linked to heart deaths'. Daily Mail, August 20 2014

Common antibiotic linked to sudden heart deaths. The Daily Telegraph, August 20 2014

Common antibiotic linked to increased risk of heart disease. The Independent, August 20 2014

Links To Science

Svanström H, Pasternak B, Hviid A. Use of clarithromycin and roxithromycin and risk of cardiac death: cohort study. British Medical Journal. Published online August 2014

Categories: NHS Choices

Are good neighbours really life-savers?

NHS Choices - Behind the Headlines - Tue, 19/08/2014 - 13:00

“Having good neighbours can help cut heart attack risk,” reports The Independent.

The paper reports on a nationally representative US study of more than 5,000 adults aged over 50.

People were asked about how they rated their neighbourhood social cohesion, then followed up for four years to see if they had a heart attack.

Social cohesion refers to how “neighbourly” people feel, and relates to feelings of security, connection to the area and trust of inhabitants. In this study, social cohesion was assessed by asking people how much they agreed with simple statements such as “people in this area are friendly” and “people in this area can be trusted”.

The study found that higher social cohesion was associated with a reduced risk of heart attack.

However, the association became non-significant (could have been the result of chance) once the researchers adjusted for all factors known to be associated with heart attack risk, such as smoking history, exercise and body mass index (BMI).

This makes it more difficult to draw any meaningful interpretation from these results. It's likely that any link between the risk of a heart attack and perceived social cohesion is being influenced by a varied mix of other factors.

While building social connections can bring mental health benefits, relying on your neighbours to cut your risk of a heart attack is probably unwise.

 

Where did the story come from?

The study was carried out by researchers from the University of Michigan. Sources of funding were not reported. 

The study was published in the peer-reviewed Journal of Epidemiology & Community Health.

This story was covered by The Independent, the Mail Online and The Daily Telegraph.

None of the coverage pointed out that the association between social cohesion and heart attack was no longer significant once all covariates were adjusted for.

However, the Telegraph did make the point that it is too early to make any definitive conclusions.

 

What kind of research was this?

This was a cohort study that investigated whether higher perceived neighbourhood social cohesion was associated with lower incidence of heart attack (myocardial infarction).

Cohort studies cannot show that higher social cohesion caused the reduction in heart attacks, as there could be many other factors responsible for any association seen.

 

What did the research involve?

The researchers analysed 5,276 people without a history of heart disease who were taking part in the Health and Retirement Study – a nationally representative study of American adults over the age of 50.

People were asked at the beginning of the study about how they rated their neighbourhood social cohesion. Social cohesion was measured by the participants’ agreement with the following statements:

  • “I really feel part of this area”
  • “If you were in trouble, there are lots of people in this area who would help you”
  • “Most people in this area can be trusted”
  • “Most people in this area are friendly”

There was then a follow-up period of four years to see if those studied had a heart attack, which was self-reported or reported by a proxy if the participant had died.

The researchers looked to see if people with higher perceived social neighbourhood cohesion had a reduced risk of heart attack.

 

What were the basic results?

During the four-year study, 148 people (2.81%) had a heart attack.

Each standard deviation (a measure of variation from the average) increase in perceived neighbourhood social cohesion was associated with a 22% reduced odds of heart attack after adjusting for age, gender, race, marital status, education and total wealth (odds ratio [OR] 0.78, 95% confidence interval [CI] 0.63 to 0.94).

However, the association was no longer statistically significant if all potential confounders were adjusted for (age, gender, race/ethnicity, marital status, education level, total wealth, smoking, exercise, alcohol frequency, high blood pressure, diabetes, BMI, depression, anxiety, cynical hostility, optimism, positive affect, social participation and social integration) (OR 0.82, 95% CI 0.66 to 1.02).

The researchers also divided perceived neighbourhood social cohesion into four categories: low, low-moderate, moderate-high and high. When age, gender, race, marital status, education and total wealth were adjusted for, people with high perceived neighbourhood social cohesion were at reduced risk of heart attack compared to people with low social cohesion. Again, this association was no longer significant if all confounders were adjusted for.

 

How did the researchers interpret the results?

The researchers concluded that “higher perceived neighbourhood social cohesion may have a protective effect against myocardial infarction”.

 

Conclusion

This US cohort study found that higher social cohesion was associated with a reduced risk of heart attack. However, the association became non-significant once the researchers adjusted for all behavioural (such as smoking or exercise), biological (such as BMI) and psychosocial (such as depression) factors that could act as potential confounders.

It is difficult to draw any meaningful interpretation from these results. Perceived social cohesion in this study was only measured by asking people how much they agreed with four simple statements about whether they liked living in the area, whether people in the area were friendly and if they could be trusted. This tells us little about the sociodemographic structure of the area, or the individuals’ interpersonal relationships with others.

Also, despite the large initial sample size, there were relatively few heart attacks over the four years. Heart attack cases were identified by self-report, or by proxy report if the participant had died, rather than by a review of medical records, which may also have led to errors.

There are a variety of biological, hereditary and lifestyle factors that are well known to be associated with a greater risk of cardiovascular disease, and various psychological factors (such as stress) have also been suggested.

As the results of this study suggest, it is likely that any link between risk of heart attack and perceived social cohesion is being influenced by a varied mix of other factors.

If you want to try and reduce your risk of a heart attack, maintaining a healthy weight through diet and exercise, avoiding smoking and limiting alcohol intake are a great start. 

Simply relying on your neighbours to cut your risk of heart attack is probably unwise.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Having good neighbours can help cut heart attack risk, study shows. The Independent, August 18 2014

Friendly neighbours could lower the risk of heart attack, study finds. The Daily Telegraph, August 18 2014

Good neighbours can keep your heart healthy: Chance of a heart attack found to be a fifth lower if you live in a friendly area. Daily Mail, August 19 2014

Links To Science

Kim ES, Hawes AM, Smith J. Perceived neighbourhood social cohesion and myocardial infarction. Journal of Epidemiology and Community Health. Published online August 18 2014

Categories: NHS Choices

Targeted brain stimulation 'could aid stroke recovery'

NHS Choices - Behind the Headlines - Tue, 19/08/2014 - 01:00

"Stimulating the part of the brain which controls movement may improve recovery after a stroke," BBC News reports after researchers used lasers to stimulate a particular region of the brain with promising results in mice.

The researchers were looking at a sub-type of stroke known as ischaemic stroke, where a blood clot blocks the supply of blood to part of the brain.

With prompt treatment an ischaemic stroke is survivable, but even a temporary block to the blood supply can cause brain damage, which can impact on multiple functions such as movement, cognition and speech. Attempting to recover these functions is now an important aspect of post-stroke treatment.

The researchers used a technique called optogenetics in this study. Optogenetics combines genetics and light: genetic techniques are used to make certain brain cells sensitive to the effects of light, which is then produced by a laser and delivered through an optical fibre.

The researchers used light to stimulate an area of the brain (the primary motor cortex) in mice which had stroke-related brain damage. After stimulation, the mice's performance improved in behaviour tests assessing sensation and movement.

But to use this technique in humans, brain cells would have to be made sensitive to light, possibly by introducing a gene coding for a light-sensitive channel into nerve cells using gene therapy techniques. It is unclear whether this would be feasible based on current technology and techniques.

 

Where did the story come from?

The study was carried out by researchers from Stanford University School of Medicine in the US.

It was funded by the US National Institutes of Health, a National Institute of Neurological Disorders and Stroke grant, Russell and Elizabeth Siegelman, and Bernard and Ronni Lacroute.

The study was published in the peer-reviewed journal PNAS.

The research was well reported by BBC News.

 

What kind of research was this?

This animal study aimed to determine whether stimulating nerve cells in certain undamaged parts of the brain could help recovery in a mouse model of stroke.

Animal research such as this is a useful first step in investigating whether treatments could potentially be developed for testing in humans.

 

What did the research involve?

The researchers used a mouse that had been genetically engineered so the nerve cells in the part of the brain responsible for movement (the primary motor cortex) produced an ion channel sensitive to light. When light is shone on the nerve cells expressing this ion channel, the ion channel opens and the nerve cell is activated.

The researchers used healthy mice, as well as mice with brain damage caused by stopping blood flow in one of the arteries that supplies blood to the brain. This mimics the damage that occurs during an ischaemic stroke. The damage occurred in a different part of the brain from the primary motor cortex (the area that was stimulated). 

The researchers looked at whether stimulating the nerve cells in the primary motor cortex using light from a laser could promote recovery in a mouse model of stroke. This combination of light and genetics is called optogenetics.

 

What were the basic results?

Light stimulation of the nerve cells in the undamaged primary motor cortex significantly improved brain blood flow, as well as blood flow in response to brain activity in "stroke mice". It also increased the expression of neurotrophins, a family of proteins that promotes the survival, development and function of nerve cells, and other growth factors.

Stimulation of the nerve cells in the primary motor cortex also promoted functional recovery in the "stroke mice". "Stroke mice" who received stimulation showed faster weight gain and performed significantly better in a sensory-motor behaviour test (the rotating beam test).

Interestingly, stimulations in normal "non-stroke mice" did not alter motor behaviour or expression of neurotrophins.

 

How did the researchers interpret the results?

The researchers concluded that, "These results demonstrate that selective stimulation of neurons can enhance multiple plasticity-associated [the brain's ability to change] mechanisms and promote recovery."

 

Conclusion

This mouse model of stroke has found that stimulating nerve cells in the part of the brain responsible for movement (the primary motor cortex) can lead to better blood flow and the expression of proteins that could promote recovery, as well as leading to functional recovery after stroke.

But it remains to be determined whether a similar technique could be used in people who have had a stroke.

The mice were genetically modified so nerve cells in the primary motor cortex produced an ion channel that could be activated by light. The nerve cells were then activated using a laser.

To use this technique in humans, a gene coding for a light-sensitive channel would have to be introduced into nerve cells, possibly using gene therapy techniques.

Gene therapy in people is very much in its infancy, so it is unclear whether this would be achievable, let alone safe. The last thing you would want to do with a brain recovering from stroke-related damage is to make that damage worse.

Overall, this interesting technique shows promise, but much more research needs to be done before there will be any practical applications in the treatment of stroke patients.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Brain stimulation 'helps in stroke'. BBC News, August 19 2014

Links To Science

Cheng MY, Wang EH, Woodson WJ, et al. Optogenetic neuronal stimulation promotes functional recovery after stroke. PNAS. Published online August 18 2014

Categories: NHS Choices

Bone marrow drug could treat alopecia

NHS Choices - Behind the Headlines - Mon, 18/08/2014 - 01:00

“Alopecia sufferers given new treatment hope with repurposed drug,” The Guardian reports.

Alopecia is a type of autoimmune condition where the body’s own immune cells start to attack the hair follicles for an unknown reason, leading to hair loss.

This new research actually involved two phases, one involving mice and one involving humans.

The researchers identified the specific type of immune cell (CD8+NKG2D+ T cells) that is involved in this autoimmune process, and identified the signalling pathways that stimulate the activity of these cells.

The researchers then demonstrated that using molecular treatments to block these signalling pathways was effective in preventing and reversing the disease process in mice genetically engineered to develop alopecia.

These findings in mice were followed by promising results in three people with moderate to severe alopecia. These people were treated with ruxolitinib, which is currently licensed in the UK to treat certain bone marrow disorders. All three patients demonstrated “near-complete hair regrowth” after three to five months of treatment.

This promising research is in very early stages. Ruxolitinib has been tested in only three people with alopecia, which is far too small a number to make any solid conclusions about the effectiveness or the safety of this treatment in people with alopecia.

The safety and efficacy would need to be tested in many further studies involving larger numbers of people, and it would also need to be tested against other currently used treatments for alopecia, such as steroids.

 

Where did the story come from?

The study was carried out by researchers from Columbia University in New York. The study received various sources of financial support including US Public Health Service National Institutes of Health, the Columbia University Skin Disease Research Center, the Locks of Love Foundation and the Alopecia Areata Initiative.

The study was published in the peer-reviewed scientific journal Nature Medicine.

The media reported this study with varying degrees of accuracy. The Mail in particular is premature, as the current research is many steps away from establishing whether there could be a new “standard treatment for the condition”.

Also, references to a “baldness pill” are potentially misleading as they could lead people to think that this treatment, or similar, would be effective against the most common type of baldness, male pattern baldness.

 

What kind of research was this?

This was a laboratory and mouse study that aimed to examine the cellular processes that cause alopecia and to try and investigate a treatment to reverse the process.

Alopecia is a condition where body hair falls out, ranging from the loss of a patch of hair on the head to the loss of all body hair. It is understood to be a type of autoimmune condition where the body’s own immune cells start to attack the hair follicles. The causes are not completely understood, although links with stress and genetics have been suggested. Unfortunately, although various treatments may be tried (most commonly corticosteroids), there is currently no cure for alopecia.

The autoimmune process is thought to be driven by T lymphocyte cells (a type of white blood cell). Previous laboratory studies in mouse and human models have shown that transfer of T cells can cause the disease. However, effective treatments are said to be limited by a lack of understanding of the key T cell inflammatory pathways in alopecia.

The researchers had previously identified a particular subset of T cells (CD8+NKG2D+ T cells) surrounding hair follicles in alopecia, as well as identifying certain signalling molecules that seem to stimulate them. In this study, the researchers aimed to further investigate the role of these specific T cells using a group of mice genetically engineered to spontaneously develop alopecia, and also human skin samples.

 

What did the research involve?

First of all, the researchers examined skin biopsies from genetically engineered mice that had developed alopecia to confirm that these specific CD8+NKG2D+ T cells were infiltrating the hair follicles. They confirmed an increase in the number of these specific T cells and in the total number of cells, and also noticed increased growth of lymph nodes in the skin. The type of T cell infiltrating the skin was the same as the type infiltrating the lymph nodes. They then examined the genetic profile of these T cells from the lymph nodes.

They then looked into the role of these specific T cells in disease development by transferring either these specific T cells, or total lymph node cells, into genetically engineered mice that had not yet developed alopecia.

This was in order to confirm that the CD8+NKG2D+ T cells were the dominant cell type involved in the development of the disease and were sufficient to cause the disease.

The researchers then examined the gene activity in skin samples from the genetically engineered mice, and from humans with alopecia.

They identified several genes that were overexpressed around the areas of alopecia, as well as several signalling molecules that are drivers of this abnormal T cell activity, including interleukins 2 and 15, and interferon gamma. 

The researchers therefore then wanted to see whether using drug treatments that could block these signalling molecules would prevent disease development.

To do this they grafted skin from mice that had developed alopecia on to the backs of mice that had not yet developed the condition. They then tested the effectiveness of drug treatments that can block the signalling molecules to see if they could prevent or reverse the disease.

Finally, they followed their results in mice with tests in three people with alopecia.

 

What were the basic results?

When currently healthy mice were grafted with the skin of mice that had developed alopecia, 95-100% of them developed alopecia within 6 to 10 weeks. Giving antibodies to neutralise interferon gamma at the time of grafting prevented alopecia development. Giving antibodies to block interleukins 2 and 15 had a similar effect.

However, although these treatments could prevent disease development when given at the time of grafting, none of them could reverse the process if given after alopecia had developed.

They then investigated whether they could block other signalling molecules involved in the pathway downstream of interferon gamma (called JAK proteins). Ruxolitinib (currently licensed in the UK to treat certain bone marrow disorders) is a molecule that blocks the JAK1/2 proteins. Tofacitinib is another molecular treatment (not currently licensed for any condition in the UK) that blocks another JAK protein (JAK3). When these two treatments were given at the same time as the alopecia skin samples were grafted on to the healthy mice, the mice no longer developed alopecia.

The researchers then tested whether giving tofacitinib seven weeks after grafting could reverse alopecia. Treatment did result in “substantial hair regrowth” all over the body and reduced numbers of T cells, which persisted for a few months after stopping treatment. They also tested whether these two JAK inhibitor treatments were effective when topically applied (rubbed into the skin on the back) instead of given by mouth, and found that they were, with hair regrowth occurring within 12 weeks.

The human tests involved three people with moderate to severe alopecia who were given 20mg of ruxolitinib by mouth twice daily.

All three people demonstrated “near-complete hair regrowth” within three to five months of treatment.

No information on whether these people developed side effects was provided in the study.

 

How did the researchers interpret the results?

The researchers conclude that their results demonstrate that CD8+NKG2D+ T cells are the dominant cell type involved in the disease process of alopecia. They say that “the clinical response of a small number of patients with alopecia to treatment with the JAK1/2 inhibitor ruxolitinib suggests future clinical evaluation of this compound or other JAK protein inhibitors currently in clinical development is warranted”.

 

Conclusion

This is valuable laboratory research that identifies the specific type of immune cell (CD8+NKG2D+ T cells) that is involved in the disease process of alopecia. It further identifies several signalling molecules that are drivers of this T cell activity.

The researchers then demonstrate that giving two molecular treatments to block the signalling molecules – ruxolitinib (currently licensed in the UK to treat certain bone marrow disorders) and tofacitinib (not currently licensed for any condition in the UK) – were effective in preventing and reversing the disease process in mice with alopecia.

These findings in mice were followed by promising results in three people with moderate to severe alopecia who were treated with ruxolitinib. All three patients demonstrated “near-complete hair regrowth” after three to five months of ruxolitinib treatment.

These are promising results in the search for potential treatments for this devastating autoimmune condition, which currently has no cure.

However, it is important to realise that this research is in the very early stages. So far ruxolitinib treatment has been tested in only three people with alopecia, which is far too small a number to draw any solid conclusions about the effectiveness or safety of this treatment in people with alopecia. This drug is currently not licensed for use in this condition. It would need to go through many further clinical trial stages in larger numbers of people with alopecia. It would also need to be tested for safety and efficacy against other currently used treatments for alopecia, such as steroids.

Overall there is some way to go before we could know whether ruxolitinib holds real promise as a treatment for alopecia.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Alopecia sufferers given new treatment hope with repurposed drug. The Guardian, August 17 2014

Pill that can cure baldness in five months: Twice-a-day tablet that allows alopecia sufferers’ hair to grow back set to become standard treatment for condition. Daily Mail, August 18 2014

Baldness pill to cure alopecia. Metro, August 18 2014

Links To Science

Xing L, Dai Z, Jabbari A, et al. Alopecia areata is driven by cytotoxic T lymphocytes and is reversed by JAK inhibition. Nature Medicine. Published online August 17 2014

Categories: NHS Choices

Depression 'common' in early Parkinson’s

NHS Choices - Behind the Headlines - Mon, 18/08/2014 - 01:00

“Depression more common in early Parkinson’s,” BBC News reports, as a new study investigates the impact this degenerative condition can have on mental health.

Parkinson’s disease is a neurological condition caused by a lack of the chemical dopamine in the brain. Alongside the characteristic movement symptoms such as involuntary shaking, mental health symptoms including depression, anxiety and dementia are relatively common in people with Parkinson’s.

However, it is unclear whether these symptoms are directly caused by the disease process of Parkinson’s or whether there are other factors (for example, psychosocial) that may be involved in both.

This study compared people with newly diagnosed Parkinson’s disease and healthy controls over two years to see if symptoms developed and changed.

The researchers found that depression, fatigue, apathy and anxiety were more common at the time of diagnosis in people with Parkinson’s disease than healthy controls. Apathy and psychosis also increased over the two years in people with Parkinson’s.

This study demonstrates how a variety of mental health problems can be common in early Parkinson’s disease, something patients need to be aware of.

But we do not know whether these symptoms developed as a direct result of the disease process, whether they were present long before, or whether they even arose from the “shock” of diagnosis.

Read more advice about living with a long-term condition.

 

Where did the story come from?

The study was carried out by researchers from University Hospital Donostia, San Sebastián, Spain; Perelman School of Medicine at the University of Pennsylvania; and the Department of Veterans Affairs at Philadelphia VA Medical Center, US.

Funding was provided by the Michael J. Fox Foundation for Parkinson’s Research and the following funding partners: Avid Radiopharmaceuticals, Abbott, Biogen Idec, Covance, Bristol-Myers Squibb, Meso Scale Discovery, Piramal, Eli Lilly and Co, F. Hoffman-La Roche Ltd, GE Healthcare, Genentech, GlaxoSmithKline, Merck and Co, Pfizer Inc, and UCB Pharma SA.

The study was published in the peer-reviewed medical journal Neurology.

BBC News’s reporting of the study was accurate and included some useful quotes from independent experts.

 

What kind of research was this?

This was a prospective cohort study that aimed to look at the course of mental health and cognition symptoms over two years in people with newly diagnosed Parkinson’s disease.

Parkinson’s is a neurological condition caused by a lack of the chemical dopamine in the brain, which affects the nerve cells. This causes characteristic symptoms including tremor, rigidity and slow movements. Mental health symptoms including dementia, depression, anxiety, and sometimes psychosis (such as hallucinations and delusions), have also long been associated with Parkinson’s.

However, as the researchers say, it is unclear to what extent these “neuropsychiatric symptoms” are caused by the general degeneration of the nerve cells that occurs in Parkinson’s, or whether they could be caused by other psychosocial factors. Another possibility is that they could arise as side effects of the drugs often used to treat Parkinson’s.

So looking at a newly diagnosed, untreated population of people with Parkinson’s and following them through the first two years of their condition should help to see how these mental health symptoms develop and progress. 

 

What did the research involve?

This research was called the Parkinson’s Progression Markers Initiative (PPMI) study, which was an international study conducted in 16 US and five European sites. The study enrolled 423 people with newly diagnosed Parkinson’s disease, who met diagnostic criteria for the condition, had not yet received any treatment and were currently free from dementia. As a comparison group they enrolled 196 healthy controls without the condition.

A subset of people with Parkinson’s and healthy controls were assessed at baseline and at 12-month and 24-month follow-up. Only the people with Parkinson’s were also assessed at six months.

The assessments at baseline and each follow-up point included:

  • depression on the Geriatric Depression Scale (GDS)
  • cognitive ability on the Montreal Cognitive Assessment (MoCA)
  • impulsive behaviour (compulsive or repetitive behaviours due to poor control, such as gambling, sexual, eating, excessive wandering) on the Questionnaire for Impulsive-Compulsive Disorders in Parkinson’s Disease
  • excessive daytime sleepiness on the Epworth Sleepiness Scale and other sleep disorders on the REM sleep behaviour disorder screening questionnaire
  • movement disorders and other aspects of disease severity on the Movement Disorders Society Unified Parkinson’s Disease Rating Scale
  • anxiety on the State-Trait Anxiety Inventory
  • sense of smell on the University of Pennsylvania Smell Identification Test

People with Parkinson’s could start treatment with dopamine replacement therapy (often levodopa) at any time after diagnosis. Dopamine replacement therapy is designed to help improve symptoms, though side effects can be wide ranging.

They were considered to have received treatment if they had been prescribed it for at least one year, and were still prescribed the treatment at the end of the study (the two-year follow-up). Treatment had been started by 9.6% of patients with Parkinson’s disease at six months, by 58.8% at 12 months and 81.1% at 24 months.

Comparisons were made between the Parkinson’s and control groups.

 

What were the basic results?

Overall, people with Parkinson’s had significantly more symptoms of depression, anxiety, fatigue and apathy at all time points compared to controls, and symptoms of apathy and psychosis increased over time in the people with Parkinson’s.

Depression

At enrolment 13.9% of people with Parkinson’s disease and 6.6% of healthy controls screened positive for depression on the GDS.

There was a non-significant increase to 18.7% of people with Parkinson’s disease having depression at 24 months, compared with a decrease to 2.4% in the healthy control group. The proportion of people with Parkinson’s disease taking an antidepressant increased from 16% at baseline to 25% at 24 months.

Cognition

The average MoCA score of people with Parkinson’s disease decreased significantly from 27.1 at baseline to 26.2 at month 24. A score below 26 indicates mild cognitive impairment. Using this cutoff, 21.5% of people with Parkinson’s disease were cognitively impaired at baseline, 34.2% at 12 months, and 35.5% at 24 months. Mean scores in the healthy control group also decreased over time, from 28.5 at baseline to 27.7 at 24 months.

Other neuropsychiatric symptoms

The proportion of people with Parkinson’s disease with positive scores on the Movement Disorders Society Unified Parkinson’s Disease Rating Scale for fatigue and apathy at baseline was 50% and 16.7%, respectively, increasing to 61.5% and 30.2% at 24 months. These proportions were significantly higher than in the healthy control group at all time points. Similarly, anxiety symptoms were significantly higher in the Parkinson’s disease group than the healthy control group at all time points, though anxiety scores did not increase over time in the Parkinson’s disease group. The prevalence of psychosis symptoms in the Parkinson’s disease group increased from only 3.0% of people at baseline, to 5.3% at 12 months and 10% at 24 months.

The proportion of people with Parkinson’s disease with impulsive behaviour symptoms was 21% at baseline and did not significantly increase during follow-up; nor was there a significant difference between the Parkinson’s disease group and healthy controls at any time point. There was a trend for daytime sleepiness symptoms to increase in people with Parkinson’s disease, but again no significant difference was seen compared with healthy controls.

Relation to treatment

At 24 months, 81% of people with Parkinson’s disease had started dopamine replacement therapy, and 43.7% had been taking it for at least one year. This group reported significantly more new problems with impulse control and excessive daytime sleepiness compared to baseline.

 

How did the researchers interpret the results?

The researchers conclude that multiple neuropsychiatric problems are more common in newly diagnosed, untreated people with Parkinson’s compared with the general healthy population. These problems tend to remain relatively stable in early disease, while cognition slightly deteriorates. Starting dopamine replacement treatment is associated with increasing frequency of several other neuropsychiatric problems.

 

Conclusion

This cohort study benefits from its prospective design, following a group of people newly diagnosed with Parkinson’s disease across the course of two years compared to a group of healthy controls. It also benefits from being an international, multicentre study including a fairly large sample size, and from conducting regular symptom assessments using a series of validated tools.

However, there was quite a high loss to follow-up. Of the 423 people with Parkinson’s assessed at the start of the study, 62% were available at 12-month follow-up, and only 23% at 24 months. This is an important limitation that may affect the reliability of the results.

The study demonstrates that, already at the time of diagnosis, people with Parkinson’s seemed to have more symptoms of depression, anxiety, fatigue and apathy than the healthy controls. The proportion of people with Parkinson’s who had fatigue and apathy increased over the two years. The proportion with symptoms of psychosis, though low, also increased throughout the study.

Cognitive ability deteriorated significantly over the two years of the study in people with Parkinson’s disease.

The use of dopamine replacement treatment was associated with the development of new symptoms of impulse control and excessive daytime sleepiness. However, these results were based on a small sample.

Therefore, the study provides us with an indication that certain mental health symptoms of depression, anxiety, fatigue and apathy may already be present at the time that Parkinson’s is first diagnosed.

This suggests that these symptoms are not likely to be caused by Parkinson’s treatment, as the people hadn’t started treatment yet, but it can’t really tell us much more about how they developed.

It seems possible that they may be caused by the general nerve degeneration process that happens in the development of Parkinson’s. However, we don’t know whether these symptoms may have been present long before the person developed Parkinson’s (such as whether the person had a lifetime history of depression and anxiety problems). Therefore, we don’t know overall whether they are caused by the Parkinson’s disease process.

It could be the case that there are other genetic, health, psychosocial or lifestyle factors involved in the relationship that may put a person at risk of both these mental health conditions and Parkinson’s.

This study is a valuable contribution to the research into Parkinson’s disease and its associated mental health symptoms. But unfortunately it provides no solid answer on the direct cause of the development of all of these symptoms.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Depression 'more common' in early Parkinson's. BBC News, August 16 2014

Links To Science

de la Riva P, Smith K, Xie SX, et al. Course of psychiatric symptoms and global cognition in early Parkinson disease. Neurology. Published online August 15 2014

Categories: NHS Choices

Caution urged over CT scan radiation doses

NHS Choices - Behind the Headlines - Fri, 15/08/2014 - 11:24

BBC News reports on a sharp rise in the number of CT scans being performed, exposing people to the potential health risks of radiation.

However, as The Daily Telegraph says, it is not possible to calculate the cancer risk due to exposure to CT scans because there is a lack of data.

These media stories follow the publication of a report by the Committee on Medical Aspects of Radiation in the Environment (COMARE). COMARE has reviewed trends in the use of CT scans in the UK. The review weighs up the risk-benefit balance of using CT scans, and considers ways to obtain the best quality scan image while minimising the necessary radiation dose.

The COMARE report sets out good practice guidance, encouraging doctors to take a more “proactive approach” to protecting patients and reducing radiation doses.

The committee recommendations cover equipment and procedures already in place, but also note there are dose reduction features available on some of the newer CT scanning machines that should be considered when new equipment is purchased.

 

What is COMARE and why is it looking at CT scans?

The Committee on Medical Aspects of Radiation in the Environment (COMARE) is an independent expert advisory committee, set up in 1985 to assess the available evidence and advise the government on the health effects of any form of natural or man-made radiation.

This is the committee’s 16th major published report. It follows a request from the Department of Health to assess the data available on radiation exposure during CT scans. The report looked at whether radiation exposures were justified (whether the benefits of the CT scans outweighed the risks). It also looked at ways to optimise the benefits of CT scans while minimising the risk to patients.

 

What does COMARE’s report say about the use of CT scans?

We are exposed to many sources of radiation, with the majority of radiation exposure coming from natural, environmental sources. Figures from the US show that on average, each year in the 1980s, only 15% of radiation exposure came from medical sources (0.54 millisievert [mSv] per person per year) – the rest from natural sources. By 2006, radiation exposure coming from medical sources each year had leapt to almost 50% of total annual radiation exposure (2.98mSv).

By contrast, the UK’s less medically intensive culture means that only 15% of our radiation exposure comes from medical sources. However, it has still increased from 0.33mSv per person per year in 1997, to 0.4mSv in 2008.

CT scans account for much of this exposure. In the 1980s CT scans were only contributing around a quarter of the medical radiation dose in the UK, but this had increased to around two-thirds by 2008. The number of CT scans performed by the NHS in England each year increased from just over 1 million in 1996/97, to almost 5 million by 2012/13, with no sign of reaching a plateau.

The report says there has been wider use of CT scans among younger people and children, whose tissues may have greater sensitivity to radiation. They also, of course, have a longer lifespan ahead of them in which potential harmful effects may be observed.

 

How do the risks and benefits of CT scans compare?

A CT scan is a special type of X-ray that produces very accurate cross-section views of the inside of the body.

The COMARE report highlights how CT scans can:

  • improve diagnosis and staging of cancers
  • reduce need for unnecessary “exploratory surgery” or other invasive examinations
  • demonstrate response to treatments
  • help the treatment of certain conditions, such as guiding biopsies and treatments for stroke or heart disease

However, it says that 70% of indications for CT scans recommended by guidance relate to benign (non-harmful) or potentially benign conditions. It says that CT scans are increasingly being used as a standard investigation, replacing other conventional ways of detecting health problems.

There are potential risks related to radiation. Radiation can cause immediate direct damage to body tissues (such as radiation burns and hair loss), although usually only at higher doses. More problematically, radiation is also a recognised carcinogen. It could potentially contribute to the future development of cancers in the person being scanned, or have genetic effects in any future children.

Overall, there is uncertainty about the level of risk from radiation from CT scans. The risk to anyone is influenced by many factors, including age and size, the part of the body being scanned, number of scans given and radiation dose, and the radiosensitivity and genetic susceptibility of the individual.

Studies to date examining radiation risk are often population-based studies that have not accounted for important factors such as the age or medical prognosis of that person, making it difficult to attribute radiation as the direct cause of any outcomes.

UK law means that medical radiation exposures for patients must be:

  • “Justified” – exposure producing sufficient benefit to the exposed individual to outweigh the potential risk of exposing to radiation
  • “Optimised” – procedures and techniques should be in place to keep radiation exposures as low as reasonably practical

 

What does the COMARE report recommend?

COMARE recommends encouraging a more proactive approach to protecting the patient and reducing radiation dose as part of its good practice advice.

It wants:

  • the UK to be actively involved in further research into the risks of radiation
  • Public Health England to undertake more frequent UK dose surveys to provide data to support regular updating of national diagnostic reference levels, including those specifically regarding children. COMARE advises the Department of Health to require healthcare providers to submit dose data for individual patients
  • hospitals to consider CT scanners that have a full range of dose reduction features when buying new equipment
  • the Department of Health to fund, if necessary, independent evaluation of CT scanners
  • the Royal College of Radiologists to work to ensure that CT scans are optimised, taking into account both image quality and dose. This would mean requests for CT scans needing to include a clear statement regarding the clinical question to be answered by the scan
  • the Royal College of Radiologists and other appropriate organisations to review and produce referral guidelines that give greater emphasis on alternative imaging techniques that use less or no ionising radiation

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Sharp rise in CT scans on children and adults. BBC News, August 14 2014

Cancers caused by CT scan cannot be calculated due to lack of data: Comare report. The Daily Telegraph, August 15 2014

Cancer fears prompt call to cut hospitals' CT scan radiation levels. The Guardian, August 15 2014

Categories: NHS Choices

Macmillan finds cancer survival 'postcode lottery'

NHS Choices - Behind the Headlines - Fri, 15/08/2014 - 11:24

“Cancer postcode lottery ‘costs 6,000 lives a year’,” reports The Times.

This, and similar headlines, are based on cancer survival figures compiled by Macmillan Cancer Support. The cancer charity’s report suggests that the proportion of people who die within a year of a cancer diagnosis is two-thirds higher in poor-performing areas, compared with high-performing areas.

These are shocking statistics, but it’s important to bear in mind that one-year cancer survival rates don’t give us the whole picture about the state of cancer care in England.

In a press release, Macmillan reports that around 6,000 more people could survive for at least 12 months after their cancer diagnosis if average survival across the whole of England matched the top 10% of local healthcare regions.

It identified areas such as Telford, Medway and Dagenham as having among the lowest cancer survival rates. Leafy Surrey, Dorset and Richmond had among the best cancer survival rates, according to the charity.

Macmillan suggests that the differences in survival could be explained by differences in waiting times for urgent referrals and the start of treatment, which are meant to meet a set standard across the country. The charity calls for this “looming crisis” in cancer care to be addressed.

 

What does the Macmillan cancer survival report say?

Macmillan used data from the Office for National Statistics (ONS) and the London School of Hygiene and Tropical Medicine to estimate one-year survival for all types of cancer combined, for all adults (aged 15 to 99), in 2011.

The average one-year survival for the whole of England was 68%. This means that roughly two-thirds of all people in England diagnosed with cancer survived for 12 months after they were diagnosed, and a third had died within 12 months. In the 10% of regions with the best one-year survival rates in the UK, one-year survival was higher, at 71%.
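To see how a headline figure like “6,000 more people could survive” can arise from a survival gap of this kind, the arithmetic is simply the difference in survival rates applied to the number of people diagnosed. The sketch below is illustrative only, not Macmillan’s actual calculation; the 68% and 71% figures come from the report above, while the 200,000 annual diagnoses is a hypothetical round number chosen purely for illustration.

```python
# Hedged sketch: how a "6,000 extra survivors"-style estimate can be derived.
# Survival figures (68%, 71%) are from the report; the 200,000 annual
# diagnoses figure is hypothetical, for illustration only.

def extra_one_year_survivors(diagnoses, avg_survival, top_survival):
    """Extra people alive at one year if average survival matched the top decile."""
    return diagnoses * (top_survival - avg_survival)

extra = extra_one_year_survivors(200_000, 0.68, 0.71)
print(round(extra))  # with these illustrative inputs: 6000
```

With a three-percentage-point gap, the result scales directly with the (here hypothetical) number of diagnoses, which is why such estimates depend heavily on the diagnosis count used.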

 

What is the reason for the differences in one-year cancer survival?

Links To The Headlines

Cancer patients: best and worst places to live for survival revealed. The Daily Telegraph, August 15 2014

Postcode lottery 'is killing 6,000 cancer patients every year': Proportion who die within year of diagnosis is two-thirds higher in worst-performing areas than the best. Mail Online, August 15 2014

Cancer treatment in England: 'Inexcusable postcode lottery' causes 6,000 'needless' deaths. Daily Express, August 15 2014

Categories: NHS Choices

High-salt diet linked to 1.6 million heart deaths

NHS Choices - Behind the Headlines - Thu, 14/08/2014 - 11:20

"Salty diet 'causes 1.6 million deaths worldwide each year'," reports The Daily Telegraph. It goes on to quote a researcher saying this is "nearly 1 in 10 of all deaths from cardiovascular causes worldwide".

This scary-sounding headline has a grain of truth in it, but the science it's based on doesn't prove that salt is causing these deaths. In fact, the news is based on a modelling study.

To estimate the effect of current sodium intake on cardiovascular mortality worldwide, researchers used available data on:

  • sodium consumption
  • the dose-response effects of sodium consumption on blood pressure
  • the association between blood pressure and cardiovascular mortality
  • data on cause-specific deaths

Globally, 1.65 million deaths from cardiovascular causes in 2010 were attributed to people consuming more than 2g of sodium per day. That's roughly 5g of salt a day. Currently, UK advice is for adults to eat no more than 6g of salt a day.

But this study could not prove that sodium restriction reduces cardiovascular mortality. This means the findings are generally in keeping with current salt recommendations that adults consume no more than 6g of salt a day.

 

Where did the story come from?

The study was carried out by researchers from Tufts University, the Harvard School of Public Health, Brigham and Women's Hospital, Harvard Medical School, and the University of Washington in the US, and the Cambridge Institute of Public Health and Imperial College London in the UK.

It was funded by the Bill and Melinda Gates Foundation.

The study was published in the peer-reviewed New England Journal of Medicine. This article was open access, which means it's free to view online.

The media coverage is generally representative of this research, but it's worth bearing in mind that the study's results are estimates only. Also, the link between sodium and death has only been indirectly assessed by examining sodium's effect upon blood pressure, and then blood pressure's effect upon cardiovascular death.

 

What kind of research was this?

This was a modelling study that aimed to estimate the effects of sodium intake on cardiovascular deaths around the world.

This modelling study can estimate how many deaths from cardiovascular disease can be attributed to a sodium intake over 2g.

However, it does not prove that sodium consumption of more than 2g a day caused any of these deaths, or that sodium restriction reduces cardiovascular mortality.

 

What did the research involve?

The researchers modelled the effects of sodium intake on cardiovascular mortality around the world, estimating the fraction and number of deaths attributable to sodium intake above a reference level of 2g of sodium a day.

To do this, the researchers needed estimates of global sodium consumption, the effect of sodium intake on blood pressure, and the effect of blood pressure on cardiovascular deaths.

Estimating global sodium consumption

The researchers systematically identified previously conducted national and subnational surveys of individual-level sodium consumption. These surveys were based on measurements of sodium in urine, estimates of sodium intake in the diet, or both. The researchers quantified consumption according to age, sex and country.

Assessing the effect of sodium intake on blood pressure

The researchers carried out a meta-analysis of all randomised controlled trials identified in two prior systematic reviews that had evaluated the effect of reduced sodium on blood pressure. They looked at the effects according to age, race and the presence or absence of hypertension.

Assessing the effects of blood pressure levels on deaths caused by cardiovascular disease

The effect of blood pressure levels on deaths as a result of cardiovascular disease was assessed by combining results from two large international projects (including 99 cohorts, comprising a total of 1.38 million participants, among whom there were 65,000 cardiovascular events) that pooled individual level data. The researchers looked at the effects according to age.

The number of people who die from cardiovascular disease was estimated from the Global Burden of Disease Study 2010.

 

What were the basic results?

The researchers estimated the average level of sodium consumption worldwide was 3.95g a day, with regional averages ranging from 2.18g to 5.51g a day. From their meta-analysis of randomised controlled trials, they found reducing sodium intake reduced blood pressure.

Each reduction of 2.30g of sodium a day was associated with a reduction of 3.82mmHg in blood pressure, although the effects depended on population characteristics such as age and race.

They also found lower blood pressure was associated with a reduced risk of cardiovascular death.

The researchers calculated that nearly 1 in every 10 deaths from cardiovascular causes (1.65 million deaths a year, 9.5% of all cardiovascular deaths) can be attributed to a sodium intake of more than 2g a day.
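These two figures imply a total number of cardiovascular deaths worldwide, which can be checked directly from the article's own numbers:

```python
# Sanity check using only the figures stated in the article: 1.65 million
# attributable deaths are said to be 9.5% of all cardiovascular deaths,
# which implies the total.
attributable_deaths = 1.65e6
fraction = 0.095
total_cv_deaths = attributable_deaths / fraction
print(f"{total_cv_deaths / 1e6:.1f} million")  # → 17.4 million
```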

Four of every five deaths (84.3%) occurred in low- and middle-income countries, and two of every five deaths (40.4%) were premature (before 70 years of age).

The rate of death from cardiovascular causes associated with sodium intake above the reference level was highest in Georgia and lowest in Kenya.

 

How did the researchers interpret the results?

The researchers concluded that: "In this modelling study, 1.65 million deaths from cardiovascular causes that occurred in 2010 were attributed to sodium consumption above a reference level of 2.0g per day."

 

Conclusion

Links To The Headlines

Salty diet 'causes 1.6 million deaths worldwide each year'. The Daily Telegraph, August 13 2014

Links To Science

Mozaffarian D, et al. Global Sodium Consumption and Death from Cardiovascular Causes. New England Journal of Medicine. Published August 14 2014

Categories: NHS Choices

Is UK obesity fuelling an increase in 10 cancers?

NHS Choices - Behind the Headlines - Thu, 14/08/2014 - 11:00

“Being overweight and obese puts people at greater risk of developing 10 of the most common cancers,” reports BBC News.

The news is based on research using information in UK GP records for more than 5 million people, to see whether body mass index (BMI) was associated with 22 types of common cancers.

The researchers found that increasing BMI was associated with increased risk of several types of cancer. Some of these associations weren’t linear, meaning that there wasn’t always a steady increase in cancer risk with increased BMI. Additionally, some of the links seemed to be dependent on individual patient characteristics, such as gender and menopausal status.

The researchers estimated that 41% of uterine and 10% or more of gallbladder, kidney, liver and colon cancers could be attributable to excess weight.

However, increasing BMI was also found to decrease the risk of some types of cancer (such as prostate and premenopausal breast cancer).

The researchers suggest that BMI affects cancer risk through a number of different processes. However, the study was not able to demonstrate that being overweight or obese directly increases or decreases the risk of these cancers, nor is it able to show the biological reasons for any of the associations found.

It is also not able to account for all possible factors that contribute to cancer risk, such as genetics and lifestyle factors.

Nevertheless, maintaining a healthy weight has proven benefits beyond any reduction in cancer risk. As always, the best way to do this is by eating a balanced diet and exercising regularly.

 

Where did the story come from?

The study was carried out by researchers from the London School of Hygiene and Tropical Medicine, and the Farr Institute of Health Informatics Research. The study was funded by the National Institute for Health Research, the Wellcome Trust and the Medical Research Council.

The study was published in the peer-reviewed medical journal The Lancet. This article is open-access and can be accessed for free on the journal’s website.

The story was widely covered by the media.

 

What kind of research was this?

This was a cohort study that aimed to investigate the link between BMI and the most common site-specific cancers after adjusting for potential confounders.

As this is a cohort study, it cannot prove that obesity causes cancer, as there may be a wide variety of other factors (such as hereditary, sociodemographic and lifestyle factors) that could explain the associations seen.

 

What did the research involve?

The researchers studied primary care (GP) records from 5.24 million people, using data collected between 1987 and 2012.

They calculated BMI from recorded weight and height, both of which are recorded by GPs when patients are registered, during patient care, or because the GP thinks it’s relevant to the patients’ health.

The researchers then looked to see if people had a cancer diagnosis in their records, in particular:

  • female breast cancer
  • prostate cancer
  • mouth, oesophageal, stomach, colon and rectum cancers
  • lung cancer
  • non-Hodgkin lymphoma
  • leukaemia and multiple myeloma (blood cancers)
  • ovary, uterus (womb) and cervix cancers
  • pancreas, brain and central nervous system cancers
  • liver and gallbladder cancer
  • kidney and bladder cancer
  • thyroid cancer
  • malignant melanoma

The researchers looked to see whether BMI was linked with increased risk of cancer. They estimated the average effect of a 5kg/m² increase in BMI on cancer risk.

They controlled for age, smoking status, alcohol use, previous diabetes diagnosis, socioeconomic status, time period and gender in their analyses.

 

What were the basic results?

People were followed for 7.5 years on average, and during the study, 166,995 people (3.2%) developed one of the cancers of interest.

The researchers found that a 5kg/m² increase in BMI was associated with an increased risk of the following types of cancer:

  • uterus (hazard ratio (HR) 1.62, 99% confidence interval (CI) 1.56 to 1.69)
  • gallbladder (HR 1.31, 99% CI 1.12 to 1.52)
  • kidney (HR 1.25, 99% CI 1.17 to 1.33)
  • cervix (HR 1.10, 99% CI 1.03 to 1.17)
  • leukaemia (HR 1.09, 99% CI 1.05 to 1.13)
  • liver (HR 1.19, 99% CI 1.12 to 1.27)
  • colon (HR 1.10, 99% CI 1.07 to 1.13)
  • ovarian (HR 1.09, 99% CI 1.04 to 1.14)
  • postmenopausal breast cancers (HR 1.05, 99% CI 1.03 to 1.07)

There was a borderline statistically significant increase in the risk of thyroid cancer (HR 1.09, 99% CI 1.00 to 1.19), pancreatic cancer (HR 1.05, 95% CI 1.00 to 1.10) and cancer of the rectum (HR 1.04, 95% CI 1.00 to 1.08).
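The hazard ratios above are expressed per 5kg/m² of BMI. Under a log-linear assumption, a per-5-unit hazard ratio can be rescaled to other BMI differences by exponentiation; this is illustrative only, since the study itself notes that not all of the associations were linear.

```python
# Rescale a per-5 kg/m² hazard ratio to an arbitrary BMI difference, under a
# simplifying log-linear assumption. The study notes some associations were
# non-linear, so treat this as illustrative, not as the study's method.
def rescale_hr(hr_per_5, bmi_difference):
    """Hazard ratio implied for a given BMI difference (kg/m²)."""
    return hr_per_5 ** (bmi_difference / 5.0)

# Uterine cancer, HR 1.62 per 5 kg/m² (from the study): a 10 kg/m² higher BMI
# implies roughly 1.62 squared.
print(round(rescale_hr(1.62, 10), 2))  # → 2.62
```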

The researchers noted that not all the associations were linear, and that the associations between BMI and both colon and liver cancer were more marked in men than in women. Increases in ovarian cancer risk with BMI were larger in premenopausal than postmenopausal women, and there were differences by menopausal status for breast cancer.

The researchers estimated that 41% of uterine and 10% or more of gallbladder, kidney, liver and colon cancers could be attributable to excess weight.

A 5kg/m² increase in BMI was associated with a reduced risk of the following types of cancer:

  • premenopausal breast cancer risk (HR 0.89, 99% CI 0.86 to 0.92)
  • oral cavity (HR 0.81, 99% CI 0.74 to 0.89)
  • lung (HR 0.82, 99% CI 0.81 to 0.84)

There was a borderline statistically significant reduction in the risk of prostate cancer (HR 0.98, 99% CI 0.95 to 1.00).

The researchers noted that when the analysis was restricted to people who had never smoked, a 5kg/m² increase in BMI did not reduce the risk of oral cavity or lung cancer. They suggest that this inverse association seen when all people were considered was due to residual confounding.

Overall, the researchers estimated that a 1kg/m² population-wide increase in BMI would result in 3,790 additional people in the UK each year developing uterus, gallbladder, kidney, cervix, thyroid, leukaemia, liver, colon, ovarian or postmenopausal breast cancer.

 

How did the researchers interpret the results?

The researchers concluded that, “BMI is associated with cancer risk, with substantial population-level effects. The heterogeneity in the effects suggests that different mechanisms are associated with different cancer sites and different patient subgroups.”

 

Conclusion

This large UK cohort study of more than 5 million people has found that, although there was variation in the effect of BMI on different cancers, a higher BMI was associated with increased risk of several cancers.

Overall, the researchers estimated that a 1kg/m² population-wide increase in BMI would result in 3,790 additional people in the UK each year developing uterus, gallbladder, kidney, cervix, thyroid, leukaemia, liver, colon, ovarian or postmenopausal breast cancer.

However, not all of the identified links were completely clear, with some showing a clearer linear association between increasing BMI and increasing cancer risk than others. Also, strangely, increased BMI was found to decrease the risk of some types of cancer, such as lung cancer. Such associations may be explained by other factors: for example, smokers – who are obviously at a much higher risk of lung cancer – tend to have a lower BMI than non-smokers.

However, this study is unable to demonstrate that being overweight or obese directly increases or decreases the risk of these cancers. The researchers suggest that BMI affects cancer risk through a number of different processes. The study is also not able to account for all possible factors that may be entangled in the links (such as various hereditary, sociodemographic and lifestyle factors).

Nevertheless, it is well established that maintaining a healthy weight has many health benefits, including reducing the risk of many common chronic diseases. The best way to do this is by eating a balanced diet and exercising regularly.

 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Obesity epidemic fuelling 12,000 cancers a year. The Daily Telegraph, August 14 2014

Obesity is blamed for 12,000 cancer cases every year: Being overweight can increase chance of developing some forms of the disease by 60%. Mail Online, August 14 2014

Being overweight or obese 'linked to 10 common cancers'. BBC News, August 14 2014

Obesity increases risk of 10 common cancers, study finds. The Independent, August 14 2014

12,000 cancer cases a year are linked to obesity. Daily Express, August 13 2014

Links To Science

Bhaskaran K, et al. Body-mass index and risk of 22 specific cancers: a population-based cohort study of 5.24 million UK adults. The Lancet. Published August 14 2014

Categories: NHS Choices

Anti-obesity drugs 'may still work in middle-age'

NHS Choices - Behind the Headlines - Wed, 13/08/2014 - 11:27

“Drug to halt the dreaded spread of middle age,” reports The Daily Telegraph, with similar headlines on the Daily Express and Daily Mail websites.

However, these claims are rather premature, given that the research they’re based on involves anti-obesity drugs that aren’t licensed for use in the UK. Also, the study in question involved mice, not people.

Researchers compared middle-aged, obese mice to healthy young mice. They found that existing, but unlicensed, anti-obesity medications (lorcaserin, d-fenfluramine and sibutramine) reduced food intake to a similar extent in both groups of mice.

Our brains change as we get older or more obese, leading to a “rewiring” of the parts involved in energy balance. It was thought that anti-obesity medications that work on this part of the brain might not work in older, fatter people because of the rewiring. But this study suggests that despite the rewiring, the brain machinery needed for these drugs to work still functions – at least in mice.

This research is likely to help in the development of future weight loss drugs. But for now, consuming fewer calories and burning more off with regular brisk walking is a better defence against middle-aged spread than holding out for a miracle weight loss pill.

Where did the story come from?

The study was carried out by researchers from the University of Cambridge and the University of Aberdeen in collaboration with researchers from the University of Michigan Medical School in the US and the Consejo Nacional de Investigaciones Científicas y Técnicas in Argentina. It was funded by Diabetes UK, the Wellcome Trust, the National Institutes of Health, and the MRC Centre for Study of Obesity and Related Disorders.

The study was published in the peer-reviewed journal Endocrinology. The article is open access, meaning it can be accessed and read free of charge.

The media reporting of the story was generally accurate, but the headline claims that there could be a pill to stop middle-aged spread aren’t quite right. Two of the anti-obesity treatments (d-fenfluramine and sibutramine) tested in this study have been withdrawn from clinical use due to off-target effects. The other drug, lorcaserin (brand name Belviq), was approved by the US FDA in 2012, but it is not approved in Europe and appears unlikely to be approved.

What kind of research was this?

This was an animal study.

The researchers report that both obesity and ageing are associated with rewiring of the main brain pathway involved in energy homeostasis. This leads to reduced activity of a group of brain cells called the pro-opiomelanocortin (POMC) neurons, which are found in the hypothalamus. The POMC neurons make hormones that are important in regulating appetite and body weight.

A number of anti-obesity drugs (lorcaserin, d-fenfluramine and sibutramine) work by increasing the activity of the neurotransmitter serotonin, increasing the activity of the POMC neurons.

The researchers were concerned that the anti-obesity drugs may not work in older, obese people due to reduced activity of these neurons. They performed a number of experiments in mice to determine whether the drugs work in older, obese mice.

Animal studies are ideal for this type of basic research, but trials on humans are required before an assessment of the benefits and risks of anti-obesity medications can be made.

What did the research involve?

The researchers initially confirmed that the anti-obesity drugs work by increasing the activity of the POMC neurons. They did this by comparing the food intake of normal mice with that of mice genetically engineered to lack POMC neurons that had been given anti-obesity drugs.

The researchers then tested whether the anti-obesity drugs reduced the appetite of older, obese mice that had POMC neurons. They tested the effect of lorcaserin, d-fenfluramine and sibutramine on normal young adult mice (three to five months old) and middle-aged mice (12 to 14 months old, the equivalent of a 40-year-old human, according to the authors). The middle-aged mice were heavier and fatter than the young adult mice.

What were the basic results?

Food intake was significantly reduced (described as an anorectic effect) in normal mice after they were given anti-obesity drugs. However, food intake was not significantly changed in the genetically engineered mice that did not have the POMC neurons.

The researchers found that young adult mice and middle-aged mice reduced food intake to a similar extent after being given the anti-obesity drugs.

The researchers went on to do further brain studies. These found there is similar gene expression in young and middle-aged mice, and that the serotonin signalling machinery in POMC neurons still functions as well in middle-aged mice as in young mice.

How did the researchers interpret the results?

The researchers conclude that serotonin-based obesity medications require POMC neurons to have an effect on appetite. While this pathway is remodelled with ageing, the anatomical machinery is preserved and the appetite-suppressing effects are maintained in older mice. They say these findings are of clinical significance for the global ageing obese population.

 

Conclusion

This animal study has found that anti-obesity medications that increase serotonin signalling reduce food intake in “middle-aged, obese” mice to a similar extent as in young mice.

There had been concern, as both obesity and ageing are associated with rewiring of the main brain pathway involved in energy homeostasis. This “rewiring” leads to reduced activity of POMC neurons, which are found in the hypothalamus and make hormones that are important in regulating appetite and body weight.

A number of anti-obesity drugs (lorcaserin, d-fenfluramine and sibutramine) work by increasing the activity of the neurotransmitter serotonin, which in turn increases the activity of the POMC neurons. It was therefore thought that the rewiring might stop these medications from working.

From the results of this study, it seems that although POMC neurons may become less active as animals become older and fatter, they can be stimulated to become active by certain drugs – at least in mice.

However, claims there could be a pill to stop middle-aged spread are not strictly true – as we’ve seen, this study simply found that drugs continued to work in older subjects. Further, two of the anti-obesity treatments tested in this study have been withdrawn from clinical use due to off-target effects (d-fenfluramine and sibutramine). The other drug, lorcaserin, was approved by the US FDA in 2012, but it is not approved in Europe and appears unlikely to be approved here.

For now, exercise and eating healthily are the best defence against middle-aged spread.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

A drug could soon destroy middle-age spread: Pill could stop cells that control appetite becoming lazy with age, causing us to overeat. Mail Online, August 13 2014

Drug to halt the dreaded spread of middle age. The Daily Telegraph, August 13 2014

Scientists discover secret to losing weight in middle-age. Daily Express, August 13 2014

Links To Science

Burke LK, et al. 5-HT obesity medication efficacy via POMC activation is maintained during aging. Endocrinology. Published July 22 2014

Categories: NHS Choices

Salt injections: not a cure for cancer

NHS Choices - Behind the Headlines - Wed, 13/08/2014 - 11:27

“Salt injection ‘kills cancer cells’ by causing them to self-destruct,” reports the Mail Online.

Despite this headline, there is no new treatment for cancer using salt. The Mail Online reports on an early phase of experiments in laboratories that have worked out how increasing the amount of sodium chloride (salt) within a cell causes it to die.

The researchers did not inject cancer with salt, although they did create a way of getting salt inside cells (but not with a needle and syringe, as you may imagine from the headlines). In fact, they made two new molecules that bind to chloride and take it into cells. This increase in chloride also causes sodium to move into the cell, leading to an increase in sodium chloride.

Scientists already knew that increasing the level of salt within a cell would cause the cell to die, but wanted to know why.

The researchers found that increasing the salt level within normal and cancer cells in the laboratory caused cell death through one of the natural mechanisms, called the “caspase-dependent pathway”. This is a different pathway for cell death than the ones currently triggered by cancer drugs. The researchers hope this knowledge can be used to develop new drugs to treat cancer.

 

Where did the story come from?

The study was carried out by researchers from South Korea, the US, UK and Saudi Arabia. It was funded by the National Creative Research Initiative programme in South Korea, the US Department of Energy, the Engineering and Physical Sciences Research Council and a European Union Marie Curie Career Integration grant.

The study was published in the peer-reviewed journal Nature Chemistry.

Although most of the Mail Online’s coverage of this study was accurate, the headlines implied that cancer can be killed by injecting cells with salt. This is not the case. Researchers have found out how cells (both healthy cells and cancerous cells) die when there are increased levels of salt inside them. It is important to note that they have only done this in cells in a laboratory, not in any humans or other living creatures.

 

What kind of research was this?

This was a series of laboratory experiments designed to test compounds that the researchers designed as chloride transporters. They also wanted to better understand how cell death occurs when there is increased sodium chloride within the cell. Understanding the mechanism means that future research can look at ways of targeting it in cancer cells, but avoiding their healthy counterparts.

 

What did the research involve?

A number of molecular experiments, using cell membranes, were carried out to test compounds that the researchers designed as chloride transporters. After this, they worked out the underlying mechanisms behind cell death by increasing the salt level in cancer cells.

The researchers studied the effect the compounds had on the amount of sodium that then entered the cells through sodium channels, and whether it affected other positive ions, such as potassium and calcium.

The researchers then studied normal human cells from the prostate and lung, as well as rat kidney cells and human cancer cells from the lung, pancreas, colon and cervix, in the lab. These studies aimed to determine how increasing the amount of sodium chloride (salt) within the cells caused them to die.

Further experiments involved reducing the amount of sodium or chloride outside the cells to see what effect this would have on the ability of the cell to increase the level of salt. The drug amiloride (used to treat high blood pressure and heart failure) was used to test the effect of blocking the sodium channels.

 

What were the basic results?

The researchers made two new molecules, which attach to chloride and increase the amount that enters cells. The increased amount of chloride in the cells caused more sodium to enter. This excess sodium chloride triggered cell death through the “caspase-dependent pathway” (a different pathway to the ones usually induced by cancer drugs). Cell death occurred in all types of cells used – both healthy and cancerous cells.

The molecules were found to have no effect on the levels of potassium or calcium in the cells.

Cell death from this pathway did not occur when the concentration of sodium or chloride outside of the cells was low. Nor did it occur when cells were soaked in amiloride, which prevents increased sodium from entering the cells. These experiments indicated that increased levels of both chloride and sodium (in other words, salt) were required inside the cell to trigger cell death from the caspase-dependent pathway.

 

How did the researchers interpret the results?

The researchers conclude that, “synthetic transporters can be used to induce an influx of Cl- [chloride] as well as Na+ [sodium], and that this leads to an increased level of reactive oxygen species (ROS), the release of cytochrome c from the mitochondria and induction of apoptotic cell death via the caspase-dependent pathway”. They go on to say that “ion transporters, therefore, represent an attractive approach for regulating cellular processes that are normally controlled tightly by homeostasis”.

 

Conclusion

This is an early phase in the development of new drugs to combat cancer, and it should be stressed that these experiments did not involve humans or injecting cancer with salt. There is no new treatment for cancer using salt.

This research has, however, shed light on how increasing the salt level in cells can trigger the activation of one of the cell’s pathways for causing cell death.

Two different molecules were developed that transported chloride. The increased amount of chloride within the cells caused more sodium to enter. This caused cell death in a variety of cell types in the lab, including both cancerous and healthy cells.

Understanding these underlying mechanisms will help pave the way for new drug developments. However, new drugs based on this science are a long way off, largely because there needs to be a way to use the technology to target only cancer cells, and not damage healthy ones. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Salt injection 'kills cancer cells' by causing them to self destruct...and it could pave the way for new drugs to prevent the disease. Mail Online, August 12 2014

Links To Science

Ko S-Y, et al. Synthetic ion transporters can induce apoptosis by facilitating chloride anion transport into cells. Nature Chemistry. Published August 11 2014

Categories: NHS Choices

Toothbrushing advice 'conflicting'

NHS Choices - Behind the Headlines - Tue, 12/08/2014 - 11:15

"Teeth-brushing advice unacceptably inconsistent," reports The Guardian, while the Mail Online states that a "simple, gentle scrub is best".

These headlines relate to a small literature review that found diversity in the methods of manual toothbrushing recommended by dental associations, toothpaste and toothbrush companies, dental textbooks, and experts in 10 countries. The study authors concluded that this inconsistency "should be of serious concern to the dental profession".

The diversity of advice across the countries was thought to be because of a lack of good evidence about which toothbrushing technique is most effective, which further research could address.

The study has several limitations, but these are unlikely to change its overall message. Despite its small and imperfect nature, this literature review highlights a fundamental issue in dentistry – that the toothbrushing techniques currently recommended are probably not strongly evidence based.

This research may spur dental and other related organisations to provide evidence-based guidance on oral hygiene – and to let the public know which brushing technique works best for kids and grown-ups.

 

Where did the story come from?

The study was carried out by researchers from the Department of Epidemiology and Public Health at University College London (UCL), and was published in the peer-reviewed British Dental Journal.

No funding source was reported.

Generally, the media reported the story accurately, with the Mail Online including an instructional video of a man brushing his teeth using a circular motion. However, given the research's conclusions, there is no guarantee this is the most effective technique.

 

What kind of research was this?

The researchers say dentists, dental associations and government bodies all recommend regular daily toothbrushing because it is so important for preventing periodontal disease and caries.

However, there appears to be no consensus among professional bodies on the best method of toothbrushing for the general population, or for people of different ages or with particular dental conditions.

This study aimed to investigate this by conducting a literature review assessing methods of toothbrushing recommended for both adults and children.

 

What did the research involve?

The research involved examining online material on methods of toothbrushing from:

  • dental associations
  • toothpaste and toothbrush companies
  • associated organisations providing professional advice
  • dental textbooks

The consistency of recommendations from different sources was compared narratively.

The researchers mainly used simple Google and Google Scholar search strategies to identify relevant material, and focused their search on 10 countries they deemed to have the highest dental research and recommendation outputs: Australia, Brazil, Canada, Denmark, Finland, Japan, Norway, Sweden, the United Kingdom and the United States. Google Translate was used to translate non-English websites.

A score sheet was used to record relevant information, and the techniques were categorised based on the angle of the toothbrush bristles and the movement of the toothbrush head.

Supplementary information on toothbrushing frequency, duration and powered toothbrushing recommendation was collected.

Pictures and videos that were sourced were reviewed independently by three dentists, and a consensus view was recorded on the techniques they showed.

 

What were the basic results?

Of 66 sources located, 58 had one or more items of codeable data, while eight sources did not have any useable data. It was not possible to discern a brushing technique from 19 of the sources.

The main finding was evidence of vast diversity between recommendations on toothbrushing techniques, how often people should brush their teeth, and for how long.

Links To The Headlines

Teeth-brushing advice unacceptably inconsistent, study finds. The Guardian, August 8 2014

Revealed, the perfect way to brush your teeth: Forget fancy circular motions – a 'simple, gentle scrub is best'. Mail Online, August 8 2014

Links To Science

Wainwright J and Sheiham A. An analysis of methods of toothbrushing recommended by dental associations, toothpaste and toothbrush companies and in dental texts. British Dental Journal. Published August 8 2014

Categories: NHS Choices

Growth of newborn babies' brains tracked

NHS Choices - Behind the Headlines - Tue, 12/08/2014 - 11:15

"Scans chart how quickly babies' brains grow," reports BBC News Online.

The headline follows a fascinating study that shows newborn babies' brains are about a third the size of an adult's at birth, and rapidly grow to just over half the size of an adult's within three months.

The study involved 87 healthy babies who were given an MRI brain scan within the first week of life. Most then had a second scan after a month, and some had a third scan aged around three months. The researchers measured the size of the different major structures of the brain and calculated the growth rate.

The speed of growth was greatest just after birth, increasing by 1% per day, gradually tailing off to 0.4% per day by 90 days. The baby boys' brains were slightly larger than the baby girls' brains just after birth (347cm³ compared to 335cm³) and had grown slightly faster by 90 days (66% of adult size) compared with female brains (63%).
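As a rough back-of-envelope check (a simplification, not the study's actual growth modelling), the daily percentages can be treated as fractions of newborn brain size, with the rate falling linearly over the 90 days:

```python
# Back-of-envelope check (a simplification, not the study's method): if the
# daily growth rate falls linearly from 1% to 0.4% of newborn size over 90
# days, total growth is roughly the average rate times the number of days.
days = 90
start_rate, end_rate = 0.010, 0.004
total_growth = days * (start_rate + end_rate) / 2
print(f"{total_growth:.0%}")  # → 63%, i.e. growth of nearly two-thirds
```

This is consistent with the "grown by nearly two-thirds" figure reported in the study's results.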

Studies such as this can help our understanding of brain development, which could help unearth abnormal processes and certain developmental conditions. Being able to monitor brain development over time with an investigation that does not appear to have any side effects is also welcome. But this small study can't be used on its own as a reference for what's normal.

 

Where did the story come from?

The study was carried out by researchers from the University of California, the University of Hawaii and the Norwegian University of Science and Technology.

It was funded by the National Institutes of Neurological Disorders and Stroke, the National Institute on Drug Abuse, and the National Institute on Minority Health and Health Disparities.

A clear conflict of interest was reported by one of the study authors, who is a founder and equity holder in CorTechs Labs – a company selling software that analyses brain volumes from MRI scans and compares these volumes to norms.

The study was published in the peer-reviewed medical journal, JAMA Neurology.

The BBC reported the study accurately.

 

What kind of research was this?

This was an observational study aiming to plot the brain development of healthy babies using repeated MRI scans.

The researchers say there are usually problems obtaining a useable MRI image for newborn infants because it is hard to get the baby to stay still, head sizes change rapidly during the first few months, and the shape of the head may have been affected by birth.

To add to the difficulties, all of the neurones are already present but squeezed into a third of the size of an adult brain, making images more difficult to interpret.

The researchers wanted to chart normal development in babies who were not distressed by illness and therefore able to sleep during the scan.

This information could provide a benchmark that could help work out how and when all sorts of disorders start to occur, and therefore potentially lead to new treatments. 

 

What did the research involve?

The researchers gave 87 babies (39 boys and 48 girls) an MRI scan about a week after birth. The scan was conducted while they were asleep, so no sedation was required.

A repeat scan was performed on 57 babies after one month, and 49 of them had a third scan two months later.

The researchers measured the size of the different major structures of the brain and calculated the growth rate.

Data was collected regarding the ethnicity of the child and the mother's medical history and use of medication during the pregnancy.

Babies were excluded from the study if:

  • they had any known neurological disorder, brain abnormality or chromosomal anomaly
  • they had any newborn illness requiring more than one week in intensive care
  • they had overt perinatal TORCH infections (toxoplasmosis, other, rubella, cytomegalovirus or herpes simplex) at birth
  • there were any other contraindications for MRI studies
  • the mother tested positive for HIV infection 
  • the mother had smoked tobacco cigarettes or had more than three alcoholic drinks a month during the pregnancy

 

What were the basic results?

The baby boys' brains were slightly larger than the baby girls' brains just after birth (347cm3 compared to 335cm3).

The longer the gestational age, the bigger each section of the brain, apart from the pallidum (an area that may be important in reward and motivation) and the third ventricle (a cerebrospinal fluid-filled area involved in communication between different areas of the brain).

By 90 days, the brains had grown by nearly two-thirds, with male brains growing slightly faster (66%) compared with female brains (63%).

The highest area of growth was the cerebellum at the back of the brain, which controls movement, co-ordination and balance. This had grown by 113% in males and 105% in females.

The slowest area was the hippocampus, which is known to be involved in memory formation – on average, this grew by 47%.

On average, the brains grew from 33.5% of the average size of an adult's brain to 54.9% by 90 days.

The speed of growth was greatest just after birth, at 1% per day, gradually reducing to 0.4% per day by 90 days.
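The reported figures hang together: growing from 33.5% to 54.9% of adult volume is an increase of roughly 64% ("nearly two-thirds"), and implies an average compound growth rate of about 0.55% per day, which sits between the 1% and 0.4% daily rates reported at the start and end of the period. A minimal sketch of that arithmetic, using the figures quoted above:

```python
# Rough check of the growth figures reported in the study summary.
newborn_pct = 33.5   # brain volume at about 1 week, as % of average adult volume
day90_pct = 54.9     # brain volume at 90 days, as % of average adult volume
days = 90

# Overall growth over the period ("nearly two-thirds").
overall_growth = day90_pct / newborn_pct - 1
print(f"Overall growth: {overall_growth:.1%}")  # ~63.9%

# Average compound daily growth rate implied by those two figures.
avg_daily = (day90_pct / newborn_pct) ** (1 / days) - 1
print(f"Average daily growth: {avg_daily:.2%}")  # ~0.55%, between 1% and 0.4%
```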

Most of the babies were of mixed race (54%), followed by native Hawaiian/Pacific islander (22%), Asian (13%), white non-Hispanic (8%) and black (1%).

 

How did the researchers interpret the results?

The researchers reported that they have "accurately mapped out early postnatal whole-brain growth trajectories for male and female infants", which they believe is the first time this has been done.

They say if the study is repeated on a larger and more diverse group of babies, this information could provide a reference point to measure brain growth in babies who have had a brain injury, and for monitoring the effects of any treatments.

 

Conclusion

This study has mapped out the growth rate of the major structures of the brain in 87 apparently healthy neonates from within a week of birth up to 90 days.

A study of this nature can help our understanding of the growth and development of the brain and our ability to monitor brain development over time. The fact that the investigation had no apparent side effects is also welcome.

However, as the authors point out, the relatively small size of the study means the results cannot be used as a reference for normal development. Larger and more ethnically diverse studies would be required.

The goal of establishing data for a reference for normal development ties in with the commercial conflict of interest mentioned earlier, as one of the authors founded a company which sells software that analyses brain volumes from MRI scans and compares these volumes to norms.

Currently, brain growth is estimated using a measuring tape to chart the baby's head circumference over time.

The circumference is compared against established norms, with deviation from the norm a potential indication of problems in development and warranting further investigation.

The MRI technique offers a potentially more accurate way of measuring growth or confirming abnormality of growth.

Assessing every child's brain development through an MRI scan is not practical and is probably not the intended endpoint. So the real use of this developing knowledge and technology appears to be providing some evidence to help establish a reference for what is normal and what is abnormal. This could allow abnormalities to be detected earlier than we can currently.

However, there would still need to be a decision made about which babies should be scanned. This would most likely be those at a higher risk of developmental problems, possibly because of a family history or a traumatic birth or pregnancy.

This study highlights the importance of the first few months of life for brain development, which can be supported, where possible, by breastfeeding.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

'Scans chart how quickly babies' brains grow'. BBC News, August 12 2014

Links To Science

Holland D, et al. Structural Growth Trajectories and Rates of Change in the First 3 Months of Infant Brain Development. JAMA Neurology. Published August 11 2014
