NHS Choices - Behind the Headlines


Does moderate drinking reduce heart failure risk?

Tue, 20/10/2015 - 12:10

"Seven alcoholic drinks a week can help to prevent heart disease," the Daily Mirror reports. A US study suggests alcohol consumption up to this level may have a protective effect against heart failure.

This large US study followed more than 14,000 adults aged 45 and older for 24 years. It found those who drank up to 12 UK units (7 standard US "drinks") per week at the start of the study had a lower risk of developing heart failure than those who never drank alcohol.

The average alcohol consumption in this lower risk group was about 5 UK units a week (around 2.5 pints of low-strength, 3.6% ABV lager a week).

At this level of consumption, men were 20% less likely to develop heart failure than people who never drank, while for women the reduction was 16%.

The study benefits from its large size and the fact data was collected over a long period of time.

But studying the impact of alcohol on outcomes is fraught with difficulty. These difficulties include people not all having the same idea of what a "drink" or "unit" is.

People may also intentionally misreport their alcohol intake. We also cannot be certain alcohol intake alone is giving rise to the reduction in risk seen.

Steps you can take to help reduce your risk of heart failure – and other types of heart disease – include eating a healthy diet, achieving and maintaining a healthy weight, and quitting smoking (if you smoke).

 

Where did the story come from?

The study was carried out by researchers from Brigham and Women's Hospital in Boston, and other research centres in the US, the UK and Portugal.

It was published in the peer-reviewed European Heart Journal.

The UK media generally did not translate the measure of "drinks" used in this study into UK units, which people might have found easier to understand.

The standard US "drink" in this study contained 14g of alcohol, and a UK unit is 8g of alcohol. So the group with the reduced risk actually drank up to 12 units a week.
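The arithmetic behind this conversion is simple. As a rough sketch, using the definitions given above (14g of alcohol per US "drink", 8g per UK unit):

```python
# Grams of pure alcohol per measure, as defined in the study and UK guidance
GRAMS_PER_US_DRINK = 14
GRAMS_PER_UK_UNIT = 8

def us_drinks_to_uk_units(drinks):
    """Convert a number of standard US drinks to UK alcohol units."""
    return drinks * GRAMS_PER_US_DRINK / GRAMS_PER_UK_UNIT

# 7 US drinks a week is 12.25 UK units, which the article rounds to 12
print(us_drinks_to_uk_units(7))
```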

The reporting also makes it seem as though 12 units – what is referred to in the papers as "a glass a day" – is the optimal level, but the study cannot tell us this.

While consumption in this lower risk group was "up to" 12 units per week, the average consumption was about 5 units per week. This is about 3.5 small glasses (125ml of 12% alcohol by volume) of wine a week, not a "glass a day".

And the poor old Daily Express got itself into a right muddle. At the time of writing, its website is actually running two versions of the story. 

One story claims moderate alcohol consumption was linked to reduced heart failure risk, which is accurate. 

The other story claims moderate alcohol consumption protects against heart attacks, which is not accurate, as a heart attack is an entirely different condition to heart failure.

 

What kind of research was this?

This was a large prospective cohort study looking at the relationship between alcohol consumption and the risk of heart failure.

Heavy alcohol consumption is known to increase the risk of heart failure, but the researchers say the effects of moderate alcohol consumption are not clear.

This type of study is the best way to look at the link between alcohol consumption and health outcomes, as it would not be feasible (or arguably ethical) to randomise people to consume different amounts of alcohol over a long period of time.

As with all observational studies, other factors (confounders) may be having an effect on the outcome, and it is difficult to be certain their impact has been entirely removed.

Studying the effects of alcohol intake is notoriously difficult for a range of reasons. Not least is what can be termed the "Del Boy effect": in one episode of the comedy Only Fools and Horses, the lead character tells his GP he is a teetotal fitness fanatic when in fact the opposite is true – people often misrepresent how healthy they are when talking to their doctor.

 

What did the research involve?

The researchers recruited adults (average age 54 years) who did not have heart failure in 1987 to 1989, and followed them up over about 24 years.

Researchers assessed the participants' alcohol consumption at the start of and during the study, and identified any participants who developed heart failure.

They then compared the likelihood of developing heart failure among people with different levels of alcohol intake.

Participants came from four communities in the US, and were aged 45 to 64 years old at the start of the study. The current analyses only included black or white participants. People with evidence of heart failure at the start of the study were excluded.

The participants had annual telephone calls with researchers, and in-person visits every three years.

At each interview, participants were asked if they currently drank alcohol and, if not, whether they had done so in the past. Those who drank were asked how often they usually drank wine, beer, or spirits (hard liquor).

It was not clear exactly how participants were asked to quantify their drinking, but the researchers used the information collected to determine how many standard drinks each person consumed a week.

A drink in this study was considered to be 14g of alcohol. In the UK, 1 unit is 8g of pure alcohol, so this drink would be 1.75 units in UK terms.

People developing heart failure were identified by looking at hospital records and national death records. This identified those recorded as being hospitalised for, or dying from, heart failure.

For their analyses, the researchers grouped people according to their alcohol consumption at the start of the study, and looked at whether their risk of heart failure differed across the groups.

They repeated their analyses using people's average alcohol consumption over the first nine years of the study.

The researchers took into account potential confounders at the start of the study, including:

  • age
  • health conditions, including high blood pressure, diabetes, coronary artery disease, stroke and heart attack
  • cholesterol levels
  • body mass index (BMI)
  • smoking
  • physical activity level
  • educational level (as an indication of socioeconomic status)

 

What were the basic results?

Among the participants:

  • 42% never drank alcohol
  • 19% were former alcohol drinkers who had stopped
  • 25% reported drinking up to 7 drinks (up to 12.25 UK units) per week (average consumption in this group was about 3 drinks per week, or 5.25 UK units)
  • 8% reported drinking 7 to 14 drinks (12.25 to 24.5 UK units) per week
  • 3% reported drinking 14 to 21 drinks (24.5 to 36.75 UK units) per week
  • 3% reported drinking 21 drinks or more (36.75 UK units or more) per week

People in the various alcohol consumption categories differed from each other in a variety of ways. For example, heavier drinkers tended to be younger and have lower BMIs, but be more likely to smoke.

Overall, about 17% of participants were hospitalised for, or died from, heart failure during the 24 years of the study.

Men who drank up to 7 drinks per week at the start of the study were 20% less likely to develop heart failure than those who never drank alcohol (hazard ratio [HR] 0.80, 95% confidence interval [CI] 0.68 to 0.94).

Women who drank up to 7 drinks per week at the start of the study were 16% less likely to develop heart failure than those who never drank alcohol (HR 0.84, 95% CI 0.71 to 1.00).

But at the upper level of the confidence interval (1.00), there would be no actual difference in risk reduction.
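This is the standard way of reading a confidence interval: a hazard ratio is only conventionally treated as statistically significant when its 95% CI excludes 1 (no difference in risk). A minimal sketch of that check:

```python
def ci_excludes_no_effect(lower, upper, null_value=1.0):
    """Return True if a 95% confidence interval excludes the null value
    of 1, i.e. the hazard ratio is conventionally statistically significant."""
    return not (lower <= null_value <= upper)

print(ci_excludes_no_effect(0.68, 0.94))  # men: significant reduction
print(ci_excludes_no_effect(0.71, 1.00))  # women: borderline, no difference possible
```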

People who drank 7 drinks a week or more did not differ significantly in their risk of heart failure compared with those who never drank alcohol.

Those who drank the most (21 or more drinks per week for men, 14 or more for women) were more likely to die from any cause during the study.

 

How did the researchers interpret the results?

The researchers concluded that, "Alcohol consumption of up to 7 drinks [about 12 UK units] per week at early middle age is associated with lower risk for future HF [heart failure], with a similar but less definite association in women than in men."

 

Conclusion

This study suggests drinking up to about 12 UK units a week is associated with a lower risk of heart failure in men compared with never drinking alcohol.

There was a similar result for women, but the results were not as robust and did not rule out the possibility of there being no difference.

The study benefits from its large size (more than 14,000 people) and the fact it collected its data prospectively over a long period of time.

However, studying the impact of alcohol on outcomes is fraught with difficulty. These difficulties include people not being entirely sure what a "drink" or a "unit" is, and reporting their intakes incorrectly as a result.

In addition, people may intentionally misreport their alcohol intake – for example, if they are concerned about what the researchers will think about their intake.

Also, people who do not drink may abstain for reasons linked to their health, so may have a greater risk of being unhealthy in the first place.

Another limitation is that while the researchers did try to take a number of confounders into account, unmeasured factors, such as diet, could still be having an effect.

In addition, these confounders were only assessed at the start of the study, and people's habits may have changed over the study period (such as taking up smoking). 

The study only identified people who were hospitalised for, or died from, heart failure. This misses people who had not yet been hospitalised or died from the condition.

The results also may not apply to younger people, and the researchers could not look at specific patterns of drinking, such as binge drinking.

Although no level of alcohol intake was associated with an increased risk of heart failure in this study, the authors note few people drank very heavily in their sample. Excessive alcohol consumption is known to lead to heart damage.

The study also did not look at the incidence of other alcohol-related illnesses, such as liver disease. Deaths from liver disease in the UK have increased 400% since 1970, due in part to increased alcohol consumption, as we discussed in November 2014.

The NHS recommends that:

  • men should not regularly drink more than 3-4 units of alcohol a day
  • women should not regularly drink more than 2-3 units a day
  • if you've had a heavy drinking session, avoid alcohol for 48 hours

Here, "regularly" means drinking this amount every day or most days of the week.

The amount of alcohol consumed in the study group with the reduced risk was within the UK's recommended maximum consumption limits.

But it is generally not recommended that people take up drinking alcohol just for any potential heart benefits. If you do drink alcohol, you should stick within the recommended limits.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Seven alcoholic drinks a week can help to prevent heart disease, new research reveals. Daily Mirror, January 20 2015

A drink a day 'cuts heart disease risk by a fifth' researchers claim...so don't worry about having a dry January. Mail Online, January 19 2015

A drink a night 'is better for your heart than none at all'. The Independent, January 19 2015

Glass of wine a day could protect the heart. The Daily Telegraph, January 20 2015

Daily drink 'cuts risk' of middle-age heart failure. The Times, January 20 2015

Drinking half a pint of beer a day could fight heart failure. Daily Express, January 20 2015

Links To Science

Gonçalves A, Claggett B, Jhund PS, et al. Alcohol consumption and risk of heart failure: the Atherosclerosis Risk in Communities Study. European Heart Journal. Published online January 20 2015

Categories: NHS Choices

Heart failure drug digoxin linked to premature death

Tue, 05/05/2015 - 13:30

"A heart drug taken by 250,000 Britons can actually hasten death," the Mail Online warns today. An analysis of previous research on digoxin, used to treat heart failure and heart rhythm abnormalities, suggests that it can raise the risk of premature death.

The analysis pooled the results of 19 different studies investigating whether digoxin – used in the treatment of heart failure and atrial fibrillation – increases the risk of death from any cause.

Overall, the review found that people taking digoxin had a 21% higher risk of death from any cause compared to people not taking the drug.

The risk increase was slightly higher for people taking digoxin for atrial fibrillation (29%) than for heart failure (14%).

Though an effective drug, digoxin has long been known to have potentially serious adverse effects and always needs to be used with care. However, in this analysis, it is difficult to know how much of the higher risk of death is due solely to digoxin, and how much is due to health differences between the people who were and were not taking the drug. People who were prescribed digoxin may have had more severe health problems and these may have increased their mortality risk.

If you are taking digoxin and you have any concerns, or any new or worsening symptoms, do not stop taking your medication, but contact your health professional as soon as possible.

 

Where did the story come from?

The study was carried out by researchers from Goethe University in Germany. No sources of funding are reported, though one of the authors declares receiving consulting fees from various pharmaceutical companies.

The study was published in the peer-reviewed medical journal European Heart Journal.

Unsurprisingly, the UK media was vigorous in highlighting the potential dangers of digoxin. However, they did take the responsible step of advising readers not to stop taking digoxin without first consulting their GP. 

The Express’ headline of "Popular heart pill raises death risk by a third" was a little misleading. This figure actually refers to the risk in people with atrial fibrillation (29%). The overall figure, for atrial fibrillation and congestive heart failure combined, was slightly lower, at just over a fifth (21%).

What kind of research was this?

This was a systematic review that searched for all relevant studies looking at the link between digoxin use and mortality risk. The researchers then pooled the results in a meta-analysis.

Digoxin is a heart drug that increases the strength of each heartbeat. It also controls the rate that electrical impulses signalling the heart muscle to contract are transmitted through the heart chambers. For this reason, it can be used in the control of fast and irregular heartbeats such as atrial fibrillation, and is also sometimes used in the treatment of heart failure.

However, digoxin has side effects. It takes a long time for the drug to be broken down by the body, so it can sometimes have toxic effects, particularly at high blood concentrations. Side effects often centre upon heart function, so it can sometimes be difficult to distinguish between what are direct side effects of the drug and what is due to the worsening clinical condition.

Various studies are said to have caused uncertainty over the side effects of digoxin, with some suggesting it could increase mortality risk. The researchers therefore aimed to carry out a systematic review to pool the evidence on the safety of the drug, particularly looking at mortality effects. 

 

What did the research involve?

The researchers searched two literature databases (Medline and Cochrane) up to November 2014 to identify English language publications looking at the effect of digoxin on all-cause mortality (death from any cause) in people taking the drug for heart failure or atrial fibrillation.

There were 19 studies included: nine of people with atrial fibrillation, seven of people with heart failure, and three of a combination of the two. These studies included a total of 235,047 people with atrial fibrillation and 91,379 with heart failure. Study duration ranged from less than one year to 4.7 years (average 2.5 years). Only one of the studies was a randomised controlled trial; the rest were observational studies. All studies were assessed to be of high quality.

The pooled analysis took into account differences between the studies' results arising from their different designs (heterogeneity).

 

What were the basic results?

In a pooled analysis of all 19 studies, people taking digoxin had a 21% increased risk of all-cause mortality compared with people not taking this drug (hazard ratio (HR) 1.21, 95% confidence interval (CI) 1.07 to 1.38). When separately analysed by condition, people with atrial fibrillation had a slightly higher risk of all-cause mortality (HR 1.29, 95% CI 1.21 to 1.39) compared to people taking the drug for heart failure (HR 1.14, 95% CI 1.06 to 1.22).

 

How did the researchers interpret the results?

The researchers conclude: "The present systematic review and meta-analysis of all available data sources suggests that digoxin use is associated with an increased mortality risk, particularly among patients suffering from AF."

 

Conclusion

This is a valuable systematic review that has searched the global literature to investigate the link between digoxin use and death from any cause in people with atrial fibrillation or heart failure.

Overall, it found that people taking the drug had increased risk of death from any cause. People who were taking the drug for atrial fibrillation had a slightly higher risk than those taking it for heart failure.

These are important findings in terms of trying to quantify the size of the increased risk. However, there are points to consider:

  • The researchers report how the individual studies had adjusted their results for potential confounders that could be influencing the results. However, the factors adjusted for are likely to have differed between studies, and we do not know how completely they will have taken into account all of the differences in characteristics between people who were and weren’t taking digoxin. This means it's still not clear how much of the increase in mortality risk is due directly to digoxin, and how much could be due to the health differences between the people studied.
  • As the researchers also noted, the studies provided limited information on how mortality risk was associated with a particular therapeutic dose of digoxin, or with blood concentration levels. As such, it is difficult to know of a particular "toxic dose" when it comes to increased overall mortality risk.
  • This study has also only focused on all-cause mortality. It has not investigated the underlying reasons for death. Therefore, the review cannot inform us on the reasons why digoxin could be increasing mortality risk (for example, by causing adverse effects on the way the heart functions).

Digoxin is already recognised by the medical profession to be a drug with potential serious adverse effects, and one that needs careful monitoring. This review again highlights the delicate balance there may be between its beneficial therapeutic effects upon conditions such as atrial fibrillation and heart failure, and its possible risks.

It is reported that the Medicines & Healthcare Products Regulatory Agency (the government body that regulates medicines and medical devices in the UK) is now looking at the evidence provided by this new analysis.

People taking digoxin should discuss with the doctor in charge of their care if they have any concerns, or any new or worsening symptoms. These could include lethargy or fatigue, feeling lightheaded or dizzy, or sickness.

However, it is important not to suddenly stop taking digoxin without having an alternative treatment plan, as there could be serious risk from the untreated heart problem.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Irregular heartbeat drug taken by 250,000 Britons 'can raise risk of death'. Mail Online, May 5 2015

Popular heart pill raises death risk by a third. Daily Express, May 5 2015

Links To Science

Vamos M, Erath JW,  Hohnloser SH. Digoxin-associated mortality: a systematic review and meta-analysis of the literature. European Heart Journal. Published online May 4 2015


New test could improve diagnosis of ovarian cancer

Tue, 05/05/2015 - 13:10

"New ovarian cancer test twice as effective as existing methods," The Guardian reports, after new research proved relatively successful in diagnosing ovarian cancer.

This study hasn't identified a new blood test for cancer as such – instead, it is a refinement of existing diagnostic methods. The blood test looks at levels of the protein CA125, long recognised as a marker for ovarian cancer.

But this marker is not very reliable – some women with ovarian cancer don't have raised levels, and levels can also be raised in non-cancerous conditions.

This study has developed a new algorithm called the risk of ovarian cancer algorithm (ROCA), which categorises cancer risk according to CA125 levels measured every year.

Around 50,000 women aged 50 or over were screened using ROCA:

  • women at normal risk carried on with annual screening
  • women at intermediate risk had CA125 repeated at 12 weeks
  • women at elevated risk had CA125 repeated at six weeks and a transvaginal ultrasound scan

High-risk women would then be referred for further assessment and surgery as needed. The algorithm accurately detected 86% of women with ovarian cancer, and ruled out almost 100% of women who were cancer-free.

The study suggests the new algorithm could be a valuable way of assessing risk of ovarian cancer, a cancer with notoriously non-specific symptoms. A reliable method of early diagnosis could save some women's lives.

But this is the key thing the research team has yet to examine – whether screening using this method actually does save lives. Results for this are expected in the autumn.

Screening isn't a "magic bullet", and there has to be careful assessment of the risks of misdiagnosis and any cost implications. 

Where did the story come from?

The study was carried out by researchers from University College London, among various other hospitals and academic institutions in the UK.

It was funded by the Medical Research Council, Cancer Research UK, the Department of Health and the Eve Appeal.

Two study authors are co-inventors of the risk of ovarian cancer algorithm (ROCA), which is patented and licensed to Abcodia Ltd. Two other study authors also declare financial interests through Abcodia Ltd.

One of the authors declared a consultancy arrangement with Becton Dickinson in the field of tumour markers. The remaining authors declare no conflicts of interest.

The study has not yet been published online.

The media generally reported the findings accurately, though some reports give the impression a new test has been developed. This isn't technically a new test – it's a new way of interpreting the results.

The media also failed to mention that it isn't clear yet whether this could be introduced as a screening test, as there are many issues to consider.

Professor Usha Menon of University College London told the BBC News website that, "It's good, but the truth lies in whether we've picked up the cancer early enough to save lives", adding that they don't know this yet. 

What kind of research was this?

This was a randomised controlled trial (RCT) looking at whether annual blood tests for a biomarker of ovarian cancer might be a useful cancer screening tool. The biomarker being examined is called CA125. It has long been recognised that levels of this marker can be raised in ovarian cancer.

However, it's not a specific marker for ovarian cancer as it can also be raised by other conditions, such as infection or inflammation. Also, some women with ovarian cancer don't have raised CA125, so it isn't very good at picking up ovarian cancer on its own.

The research team devised a new way of looking at changes in levels of CA125 over time using an algorithm. 

This publication reports on women in the UK Collaborative Trial of Ovarian Cancer Screening (UKCTOCS) who were allocated to the multimodal screening arm of the trial. These women had their CA125 levels measured every year and interpreted using the risk of ovarian cancer algorithm (ROCA).

Other arms of the trial not reported here included a group who received screening by ultrasound scan (around 50,000 women) and a control group who received no screening (around 100,000 women).  

What did the research involve?

A total of 46,237 women aged 50 years or older were involved in the multimodal screening arm of the trial. Each year, their CA125 levels were measured. Based on these levels, the algorithm categorised their risk of ovarian cancer (ROC) as:

  • normal – return to annual screening
  • intermediate – repeat CA125 in 12 weeks (repeat level I screen)
  • elevated – repeat CA125 and transvaginal ultrasound in six weeks (level II screen) with earlier screens arranged where clinically suspicious

A transvaginal ultrasound scan (TVS) uses high-frequency sound waves to create an image of the ovaries. This image can show the size and texture of the ovaries, plus any cysts or other swellings present.

The way the ROC risk categories were set meant about 15% of all screened women would be in the intermediate ROC category and 2% would be in the elevated ROC category.

For the minority of women in the elevated category, follow-up actions after the level II screen six weeks later would be as follows:

  • TVS normal and normal/intermediate ROC – return to annual screening
  • TVS normal and elevated ROC – repeat level II screen in six weeks
  • TVS unsatisfactory, regardless of ROC – repeat level II screen in six weeks
  • TVS abnormal – clinical referral

Women with a high ROC risk were recommended to be referred for further investigation and surgery as required.

Participants were followed up using national cancer and death registries. 

What were the basic results?

Overall, there were 296,911 annual screens carried out over an average of three years of follow-up. In this study arm, 640 women underwent surgery, 133 of whom were found to have ovarian cancer.

The researchers calculated that multimodal screening had an 85.8% sensitivity for ovarian cancer. This is the proportion of women with ovarian cancer who were correctly identified as being at risk by the ROCA algorithm.

The specificity was even better, at 99.8% – the proportion of women without ovarian cancer who would be correctly identified as not being at risk by the algorithm. For each case of ovarian cancer identified, 4.8 operations were performed.
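Sensitivity and specificity are simple proportions of the screening counts. A minimal sketch of how they are calculated (the counts below are illustrative, not taken from the trial):

```python
def sensitivity(true_pos, false_neg):
    """Proportion of people with the disease who are correctly flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of disease-free people who are correctly cleared."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts only: 86 of 100 cancers flagged,
# and 9,980 of 10,000 cancer-free women correctly cleared
print(sensitivity(86, 14))    # sensitivity of 0.86
print(specificity(9980, 20))  # specificity of 0.998
```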

However, the researchers also found that basing risk on a fixed CA125 cut-off level was much less accurate, and only would have identified about half of the women with ovarian cancer. 

How did the researchers interpret the results?

The researchers concluded that, "Screening using ROCA doubled the number of screen-detected ovarian cancers compared to a fixed [CA125] cut-off."

They also said that, "In the context of cancer screening, reliance on predefined single threshold rules may result in biomarkers of value being discarded." This implies that CA125 is a valuable biomarker when used in the right way. 

Conclusion

This is a valuable study that has reported the results for around 50,000 women aged 50 or over who were allocated to one arm of a larger trial. They had their ovarian cancer risk assessed annually using the risk of ovarian cancer algorithm (ROCA).

When CA125 levels were used to categorise cancer risk alongside this algorithm, the algorithm was able to accurately identify 86% of women with ovarian cancer. Encouragingly, it ruled out almost 100% of women who were cancer-free. This means these women would not undergo unnecessary further investigation and surgery.

The study suggests the new algorithm could be a valuable way of assessing ovarian cancer risk. This cancer has notoriously non-specific symptoms, and is often only detected when it is at an advanced stage.

But before any new screening test is introduced there has to be careful assessment of its risks and benefits. These include comparing it with other methods of detecting ovarian cancer based on an assessment of symptoms, clinical examination and investigation findings.

This study did not compare outcomes with the large number of women in the other two screening arms of the trial – those in the control group and those being screened by transvaginal ultrasound. Other issues also need to be considered, including resource implications.

This research doesn't yet say whether the screening saved any lives by detecting the cancer earlier so it could be treated more effectively.

On this point, Professor Usha Menon from University College London told the BBC News website: "There is no screening at the moment, so we are awaiting the results [on whether lives have been saved] before the NHS can decide. Many people would have to be screened, so it really needs to translate to lives saved."

BBC News reports that the results on whether screening saves lives are expected in the autumn. We will provide an update once these become available.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

New ovarian cancer test twice as effective as existing methods. The Guardian, May 4 2015

Ovarian cancer trial boosts hope of screening programme. The Daily Telegraph, May 4 2015

Blood test 'boost' in ovarian cancer fight. BBC News, May 5 2015

Ovarian cancer blood tests breakthrough: Huge success of new testing method could lead to national screening in Britain. The Independent, May 5 2015

New Test May Double Detection Of Ovarian Cancer. Sky News, May 5 2015

New ovarian cancer screening test 'more accurate'. ITV News, May 5 2015

New test hope for detecting ovarian cancer. The Times, May 5 2015

Links To Science

Menon U, Ryan A, Kalsi J, et al. A risk algorithm using serial biomarker measurements doubles the number of screen-detected cancers compared to a single threshold rule in the United Kingdom Collaborative Trial of Ovarian Cancer Screening (UKCTOCS). Journal of Clinical Oncology. Study not yet available online


Replacing sugary drinks with water may reduce diabetes risk

Fri, 01/05/2015 - 13:30

"Swapping orange squash for a cup of tea cuts diabetes risk," The Daily Telegraph reports.

This widely-reported news is based on a major UK study, involving around 25,000 adults, which looked at the association between drink choices and the risk of type 2 diabetes. It found that those who consumed more of their calories through sugary drinks, and those who drank more soft drinks or sweetened milk drinks, were more likely to develop type 2 diabetes.

The study has a number of strengths, including its large size and use of multiple approaches to identify people who developed diabetes. But its main limitation is that other factors may be contributing to the effect seen, even though the researchers did try to reduce this as much as possible.

Based on their data, the researchers estimated that swapping water or unsweetened tea or coffee for soft drinks or sweetened milks could potentially reduce the number of new diabetes cases by up to 25%.

We know being overweight or obese is a major risk factor for type 2 diabetes, and maintaining a healthy weight will help reduce your diabetes risk.

Some sugary drinks contain a surprisingly high amount of calories – for example, a 330ml can of coke contains 139 calories, which would take around an hour of dog walking to burn off.

Reducing your calorie intake by swapping sugar-sweetened drinks for unsweetened drinks, such as tap water, could be one way to help achieve this goal. 
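As a rough back-of-envelope check on the figures above, this sketch works through the calorie arithmetic. The conversion factors for body fat and walking are common rules of thumb assumed for illustration; they are not from the article.

```python
# Rough back-of-envelope: calories saved by swapping one 330ml can
# of sugary drink (about 139 kcal, per the article) for water daily.
KCAL_PER_CAN = 139          # from the article
KCAL_PER_KG_FAT = 7700      # common rule-of-thumb conversion (assumption)
KCAL_PER_MIN_WALKING = 2.3  # rough estimate for moderate walking (assumption)

annual_saving = KCAL_PER_CAN * 365                  # kcal saved over a year
fat_equivalent_kg = annual_saving / KCAL_PER_KG_FAT
walk_minutes_per_can = KCAL_PER_CAN / KCAL_PER_MIN_WALKING

print(f"Annual saving: {annual_saving} kcal (~{fat_equivalent_kg:.1f} kg of body fat)")
print(f"Walking time to burn one can: ~{walk_minutes_per_can:.0f} minutes")
```

Under these assumptions, one can a day works out at roughly 50,000 kcal a year, and the walking time per can comes out at around an hour, consistent with the article's estimate.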

Where did the story come from?

The study was carried out by researchers from the University of Cambridge, and was funded by The Medical Research Council UK and Cancer Research UK.

It was published in the peer-reviewed medical journal Diabetologia.

The UK media's coverage of the study was accurate.

What kind of research was this?

This was an ongoing prospective cohort study called the European Prospective Investigation into Cancer and Nutrition (EPIC)-Norfolk study.

The current analysis looked at whether the amount of sugar-sweetened beverages (SSBs), artificially sweetened beverages (ASBs) and fruit juice a person drank was linked to their risk of developing type 2 diabetes. The researchers also wanted to estimate what impact swapping these sweet beverages for non-sweetened alternatives would have.

A previous meta-analysis (statistical pooling) of prospective studies found that higher SSB consumption was associated with greater diabetes risk, while findings for ASBs and fruit juice have varied between studies.

However, the researchers note that these studies have largely relied on food frequency questionnaires, which do not collect very detailed information on drinks. They wanted to use food diaries (where people are asked to record their food consumption on a daily basis) in their study to better assess drinks intake.

This is the best way to assess this question, given that it would be unethical to randomly assign people to drink a lot of sugary drinks over a long period of time.

The main limitation of this type of study is that healthy (and unhealthy) behaviours and environments tend to cluster together, so picking apart their effects is difficult. 

What did the research involve?

The researchers recruited adults in the UK who did not have diabetes and had them record their food and drink consumption over a week.

They then followed them up over almost 11 years to see who developed type 2 diabetes, and analysed whether people who drank more sweet beverages were at increased risk.

Using their results, they then calculated what impact it would have if people swapped these sweet beverages for non-sweetened drinks, such as water.

The 25,639 participants in the study were recruited in the 1990s, when they were aged 40 to 79 years old. They filled out a food diary for a week, and the researchers used these to determine how much of the following they drank:

  • soft drinks – squashes and juice-based drinks sweetened with sugar
  • sweetened tea or coffee
  • sweetened milk beverages – such as milkshakes, flavoured milks, and hot chocolate
  • artificially sweetened beverages (ASBs) – such as diet sodas
  • fruit juice

The first three categories were classed as SSBs. The participants also provided other information on their lifestyles. During the study, they had health checks and filled in follow-up health and lifestyle questionnaires.

The researchers followed participants up until 2006, and identified anyone who developed type 2 diabetes through the health checks, questionnaires, and medical records. If a person reported they had diabetes but this could not be confirmed with medical records, they were not counted as having the condition.

The analyses included the 24,653 participants who did not have diabetes or a family history of diabetes and had reported all the information the researchers needed. The researchers looked at whether the number of servings of the individual drinks consumed was linked to the risk of developing type 2 diabetes during the study.

These analyses took into account a range of factors that could influence the results (potential confounders), such as:

  • age
  • gender
  • socioeconomic status
  • physical activity
  • smoking
  • intake of other sweet beverages
  • total calorie intake
  • body mass index (BMI)
  • waist circumference

The researchers used standard methods to estimate what impact it would have if people stopped consuming SSBs, based on their findings. They also calculated the potential impact of substituting water or ASBs for SSBs. 

What were the basic results?

During the study, 847 participants (3.4%) developed type 2 diabetes.

After adjustment for all of the potential confounders, including total energy intake and BMI:

  • each additional serving of soft drinks was associated with a 14% increase in the risk of developing diabetes (hazard ratio [HR] 1.14, 95% confidence interval [CI] 1.01 to 1.32)
  • each additional serving of sweetened milk drinks was associated with a 27% increase in the risk of developing diabetes (HR 1.27, 95% CI 1.09 to 1.48)
  • sugar-sweetened tea and coffee, ASBs, fruit juice and water were not associated with type 2 diabetes risk
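As a quick sanity check on how those figures are read: a hazard ratio (HR) above 1 means higher risk, the percentage increase quoted in the text is simply (HR − 1) × 100, and the result is conventionally called statistically significant when the 95% confidence interval excludes 1. A minimal sketch:

```python
# How the hazard ratios above translate into the quoted percentage
# risk increases, and why the confidence interval matters.
def describe_hr(hr, ci_low, ci_high):
    pct = (hr - 1) * 100                     # e.g. HR 1.14 -> +14% risk
    significant = ci_low > 1 or ci_high < 1  # CI excluding 1 => significant
    return pct, significant

pct, sig = describe_hr(1.14, 1.01, 1.32)  # soft drinks
print(f"+{pct:.0f}% risk per extra serving, significant: {sig}")

pct, sig = describe_hr(1.27, 1.09, 1.48)  # sweetened milk drinks
print(f"+{pct:.0f}% risk per extra serving, significant: {sig}")
```

By the same logic, the drinks the study reports as "not associated" with diabetes risk would have had confidence intervals spanning 1.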

Overall, consuming more sweet drinks (measured as what percentage of a person's calorie intake came from these drinks) was associated with increased type 2 diabetes risk.   

Substituting one serving a day of water or unsweetened tea or coffee for soft drinks or sweetened milk drinks was estimated to have the potential to reduce the number of new cases of type 2 diabetes by 14-25%. Substituting ASBs for SSBs was not estimated to have a significant effect.

If people who drank sweet beverages reduced their intake of these drinks so they accounted for less than 2% of their total calorie intake, this was estimated to have the potential to prevent 15% of new diabetes cases.

How did the researchers interpret the results?

The researchers concluded that, "The consumption of soft drinks, sweetened milk beverages and energy from total sweet beverages was associated with higher type 2 diabetes risk independently of adiposity [BMI and waist circumference]".

They suggest that, "Water or unsweetened tea/coffee appear to be suitable alternatives to SSBs for diabetes prevention", and feel their findings are of public health importance. 

Conclusion

This cohort study has found an association between sugar-sweetened drink consumption and the risk of type 2 diabetes. It estimated that replacing these beverages with water or unsweetened tea or coffee could have the potential to reduce the number of new diabetes cases by up to 25%.

The study has a number of strengths, including its large size and prospective collection of data. It also used multiple approaches to identify people who developed diabetes, which should help to make sure that most, if not all, cases were identified.

People also used a food diary to record food and drink intake, which is reported to provide more detailed information than the questionnaire-based methods used in many previous studies.

As with all studies of this type, the main limitation is that it is difficult to single out the impact of one factor and be sure that no others are contributing to the link seen.

For example, people who drank more sweetened tea or coffee and sweetened milk beverages tended to have less healthy diets overall.

The researchers did take into account a range of factors, such as diet and physical activity in their analyses, to reduce this as much as they could, but it could still be having some effect.

Another limitation is that the researchers only assessed drink intake once, at the start of the study, and this may have changed over time.

The figures for the percentage of cases of type 2 diabetes that could be prevented are estimates. They are based on the assumption that the risk factor (sugar-sweetened drinks in this case) is directly causing the entire link seen, which may not be the case.

This method can overestimate the impact of individual factors. However, these types of estimates are used to help public health policy makers decide which disease risk factors are most important for them to target.
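Estimates of this kind are often based on Levin's formula for the population attributable fraction (PAF): the share of cases that would not occur if the exposure were removed, assuming the association is entirely causal. The study's exact method isn't detailed here, and the numbers below are illustrative only, not the study's figures:

```python
# Levin's formula for the population attributable fraction (PAF).
# Assumes the relative risk reflects a fully causal association,
# which is why such estimates can overstate a single factor's impact.
def attributable_fraction(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical illustration: if 30% of people were exposed and the
# exposure carried a relative risk of 1.5:
paf = attributable_fraction(0.30, 1.5)
print(f"PAF: {paf:.1%}")  # about 13% of cases attributable to the exposure
```

Note that the formula returns 0 when either no one is exposed or the relative risk is 1, which matches the intuition that there is then nothing to attribute.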

Overall, we know that being overweight or obese is a major risk factor for type 2 diabetes. Maintaining a healthy weight will help to reduce this risk.

Reducing your calorie intake by swapping sugar-sweetened drinks for unsweetened drinks could work towards this goal. And given that UK tap water is cheap, calorie-free and safe to drink, it would seem the obvious choice for a sugar swap.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Choosing water instead of sugary drinks could cut diabetes 2 risk by a quarter. The Guardian, April 30 2015

Swapping orange squash for a cup of tea cuts diabetes risk. The Daily Telegraph, April 30 2015

Cutting out one fizzy drink a day slashes diabetes risk by 25%: Replacing sugary beverages with unsweetened tea or coffee could combat epidemic. Mail Online, May 1 2015

A glass of water a day 'can cut diabetes risk by a quarter'. Daily Express, May 1 2015

Links To Science

O'Connor L, Imamura F, Lentjes MAH, et al. Prospective associations and population impact of sweet beverage intake and type 2 diabetes, and effects of substitutions with alternative beverages. Diabetologia. Published online April 30 2015

Categories: NHS Choices

Texting may help relieve pain during minor surgery

Fri, 01/05/2015 - 11:45

"Need pain relief for surgery? Try a text," the Daily Mail reports. The advice was prompted by a small study that found people who used a mobile phone during minor surgery were less likely to require additional pain medication.

During surgery, participants in this study were allocated to texting a close friend or family member, texting a research assistant they did not know, playing Angry Birds, or receiving usual care.

Researchers found patients who used a mobile phone to text someone were less likely to need additional painkillers during surgery. Interestingly, people who texted a research assistant tended to need slightly less painkiller medication than those who texted somebody they knew.

The researchers speculate this could be because the conversations with the research assistant were not about their surgery, so this may have helped take their mind off the experience.

While the study was well designed, it was relatively small, so it may not be representative of all people having this type of surgery and may have been unable to detect small effects.

Larger studies assessing a wider variety of pain-related outcomes, such as the patient's own rating of their pain, are needed to confirm the findings.

Distraction techniques and social support can be useful self-help methods for coping with pain.

Where did the story come from?

The study was carried out by researchers from Cornell University in the US and McGill University and LaSalle Hospital in Canada, and was funded by Cornell University. It was published in the peer-reviewed journal, Pain Medicine.

The Daily Mail covered the story reasonably, but did not highlight any of the study's limitations. The Daily Telegraph's headline, "Angry Birds could reduce pain during surgery, study finds", is misleading: there was no statistically significant difference between the "Angry Birds" group and the "no special activities" group in terms of their need for the painkiller.

It would be a shame if the paper included the game's name just to create an eye-catching headline, rather than trying to report the study accurately.

What kind of research was this?

This was a randomised controlled trial (RCT) looking at whether text messaging or playing a mobile phone game during minor surgical procedures could reduce patients' need for a strong painkiller.

Having social support has been reported to have a number of benefits, including reducing a person's pain sensation and making them able to bear pain for longer (in childbirth, for instance).

Distraction techniques, such as listening to music or using virtual reality simulations, have also been reported to help reduce anxiety and people's need for anaesthesia.

The researchers were interested in whether social support (in the form of text messaging) would have a greater effect than just being distracted (in the form of a game).

They also tested whether there was a difference between texting a friend or family member, who might be anxious about the person's surgery, and texting a stranger. An RCT is the best way to compare the effects of different interventions.  

What did the research involve?

The researchers recruited 98 adults scheduled to undergo minor surgery with regional, rather than general, anaesthesia. They allocated them at random to perform one of four things just before and during their surgery:

  • texting a close friend or family member
  • texting a research assistant they did not know, about neutral topics such as their hobbies and interests
  • playing Angry Birds on a phone
  • no special activities (usual care)

The participants had the normal pre-surgery procedures, including being given their anaesthesia and initial dose of painkillers.

All but one of the anaesthetists (the doctors who give anaesthetic during surgery) were unaware of the aim of the study or what it was measuring. They knew whether the patient had a phone with them, but were not told what the patient had been asked to do with it.

The anaesthetists asked patients if they were in pain after the first surgical incision was made, then again within the first 5-10 minutes of surgery and throughout the procedure. If the patient reported pain, the anaesthetist could give them the painkiller fentanyl or sedation as they judged appropriate.

The researchers then compared the groups to see if they differed in terms of how much fentanyl was needed during the surgery.

What were the basic results?

The patients in the four groups did not differ in their anxiety levels before surgery, or the type of surgery or how long they were in the operating room. Only about a quarter of the patients (27.6%) needed extra fentanyl during the surgery.

The researchers found that:

  • patients who texted a close friend or family member during their surgery needed less fentanyl than those who did not do any of the activities
  • patients who texted the research assistant needed less fentanyl than those playing the game and those who did not do any of the activities
  • patients in the two texting groups did not differ significantly in their need for fentanyl
  • patients in the game group and those who did not do any of the activities did not differ significantly in the amount of fentanyl they needed

The researchers also analysed the odds of needing additional fentanyl during surgery. They report that those doing nothing were four times more likely to need more fentanyl than those texting friends or family, and six times more likely than those texting the research assistant.
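The "times more likely" figures above are odds ratios, which compare the odds (not the plain probability) of an event between groups. A minimal sketch of the conversion, using hypothetical numbers rather than the study's actual group-level data:

```python
# Odds vs probability, and how an odds ratio is formed.
def odds(p):
    """Convert a probability to odds, e.g. p=0.5 -> odds of 1 (evens)."""
    return p / (1 - p)

def odds_ratio(p_group, p_reference):
    """Ratio of the odds in one group to the odds in a reference group."""
    return odds(p_group) / odds(p_reference)

# Hypothetical illustration (not the study's figures): 40% of a
# no-activity group vs 10% of a texting group needing extra fentanyl.
print(f"Odds ratio: {odds_ratio(0.40, 0.10):.1f}")
```

One caveat worth remembering when reading such headlines: when the outcome is common, an odds ratio is larger than the corresponding ratio of probabilities, so "six times the odds" does not mean "six times as likely" in the everyday sense.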

Looking at the text conversations, those who texted the research assistants tended to be more positive, while the texts with a friend or family member tended to use more biological terms, so seemed to focus on the surgery itself. 

How did the researchers interpret the results?

The researchers concluded that their study "provides the first evidence of the analgesic-sparing benefits of social support from text messaging in a surgical setting". 

Conclusion

This relatively small study suggests text message conversations during minor surgery may reduce the need for painkillers, and are more effective than playing the game Angry Birds.

The study was an RCT, the best design for comparing different interventions, which should ensure the groups were well balanced. This means any differences in the patients' outcomes should be the result of the interventions.

But this study does have some limitations:

  • It was relatively small and may not be representative of all people having this type of surgery. The authors suggest the study's small size may also be why they did not find an effect for the Angry Birds intervention.
  • The anaesthetists could not be completely blinded to which group patients were in, as they knew whether the person had a phone with them. They may also have been able to guess what a person was doing (texting or playing a game) based on their hand movements or expression. This might influence their perception of the participants' pain.
  • Being engaged with the phone might have affected how often anaesthetists asked the participants about their pain. The researchers say they tried to make sure this wasn't the case, but acknowledge this was down to the discretion of the anaesthetists.
  • It only assessed one outcome. Ideally, the patients' own assessment of their pain, and their satisfaction with the procedure, would also be assessed.

There is interest in developing non-drug-related methods to reduce people's pain and discomfort during surgery or other procedures.

The researchers suggest that texting could be a good approach, as it is simple and doesn't need specialised equipment or input from healthcare staff. However, whether this would be considered acceptable from an infection control perspective is unclear.

Overall, this study suggests that using a mobile phone during surgery has some effect, but larger studies assessing a wider variety of outcomes are needed.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Need pain relief for surgery? Try a text: Simple act of writing a message on a mobile phone could significantly reduce patients' discomfort. Daily Mail, May 1 2015

Angry Birds could reduce pain during surgery, study finds. The Daily Telegraph, April 30 2015

Links To Science

Guillory JE, Hancock JT, Woodruff C, Keilman J. Text Messaging Reduces Analgesic Requirements During Surgery. Pain Medicine. Published online December 19 2015

Categories: NHS Choices

UK life expectancy expected to rise to late 80s by 2030

Thu, 30/04/2015 - 12:50

"Life expectancy is rising faster than thought, with 90 expected to become the norm in some affluent areas of the country by 2030," The Guardian reports. The same predictions led the Daily Mail to warn of a "life expectancy timebomb".

A new modelling study looking at trends in life expectancy estimated that male babies born in 2030 could live to an average of 85.7 years, with females living an average of 87.6 years.

The study also flagged up the potential effects of health and socioeconomic inequalities on life expectancy. For example, it estimated life expectancy in the affluent London borough of Kensington and Chelsea would be five to six years higher than in the working-class area of Tower Hamlets.

It remains to be seen if the increase in life expectancy would be a blessing or a burden. Elderly people contribute to society in many meaningful ways, such as helping out with childcare or volunteering for charity work. But they may also have complex health needs that could require significant resources to treat.

Assuming the model is accurate, the study produces some interesting results about trends in life expectancy and inequalities, and how they may change over time.  

Where did the story come from?

The study was carried out by researchers from the department of epidemiology and biostatistics at the School of Public Health and MRC-PHE Centre for Environment and Health, the UK Small Area Health Statistics Unit, Imperial College London, Northumbria University, and GlaxoSmithKline. It was funded by the UK Medical Research Council and Public Health England.

The study was published in the peer-reviewed medical journal, The Lancet. It has been made available on an open-access basis, so it is free to read online.

Most of the media reported the results of the research well, although they did not question the accuracy of the predictions much. Different outlets focused on different aspects of the research.

The Daily Telegraph and the Mail focused on the headline figure that the study predicted higher life expectancies than official estimates. In its headline, the Telegraph claimed people would live "up to four years longer" than official estimates, although the study shows a difference of 2.4 years for men and one year for women.

BBC News highlighted the narrowing gap between men and women's life expectancies, while The Guardian and The Independent were more concerned with the widening gap between rich and poor.  

What kind of research was this?

This modelling study analysed death rates and population data for 375 districts of England and Wales. Researchers used the data to construct mathematical models to predict life expectancy from 1981 to 2030 for each of the districts, looking at men and women separately.

The study aimed to give reliable district-level information about life expectancy to help with future planning for health, social service and pension needs. The figures are all averages for the districts and cannot be used to predict individual lifespans. 

What did the research involve?

Researchers looked at records of deaths in England and Wales between 1981 and 2012 by local authority district. They combined this with population data to develop five statistical models that could predict future death rates and life expectancy.

The researchers tested the models to see which best predicted actual death rates during the last 10 years of the data, then used the best-performing model to predict future life expectancy at the local and national level.

The data in the study came from the Office for National Statistics. The models incorporated features of death rates in relation to people's age, trends of death rates in people who were born within or close to the same five-year period, changes to death rates over time, and by local area.

The test of the five models found one model, which gave greater importance to trends in those born within adjacent time periods, worked better than the others, with forecast errors of 0.01 years for men and women.

This model was best able to predict death rates for 2002-12 using the first 21 years of the data. The researchers therefore chose this model to predict life expectancy from 2012-30.
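The model-selection step described above is a form of backtesting: fit candidate models on early data, score their forecasts against held-out later years, and keep the best performer. The following is a toy sketch of that idea with a simple linear trend and made-up life expectancy figures; it is not the study's Bayesian spatiotemporal model or its data:

```python
# Toy backtest: fit on the first 21 years (1981-2001), score the
# forecast against the held-out years (2002-2012). Figures are invented.
import numpy as np

years = np.arange(1981, 2013)
rng = np.random.default_rng(0)
life_exp = 73.0 + 0.2 * (years - 1981) + rng.normal(0, 0.1, len(years))

train_mask = years <= 2001            # first 21 years, as in the study
test_mask = years > 2001              # held-out recent years

coeffs = np.polyfit(years[train_mask], life_exp[train_mask], deg=1)
forecast = np.polyval(coeffs, years[test_mask])

# Mean absolute forecast error over the held-out period.
error = np.mean(np.abs(forecast - life_exp[test_mask]))
print(f"Mean absolute forecast error: {error:.2f} years")
```

In the study, each of the five candidate models would be scored this way, and the one with the smallest held-out forecast error would then be refitted and projected forward to 2030.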

While the geographical areas of the districts remained the same over the study, people living in these areas obviously change. The researchers looked at trends for each district, including birth rates and migration, so they could factor this in.

They looked at how relative levels of deprivation for each district affected the mortality rates and life expectancy. Taking account of all this data, they then predicted how life expectancy at birth could change from babies born in 2012 to babies born in 2030.

Rates for men and women were calculated separately, as life expectancy differs by gender. As far as we can tell from the paper, the analysis was done using reasonable assumptions about population trends.  

What were the basic results?

The study found life expectancy in England and Wales is expected to continue to rise from the 2012 average of 79.5 years for men and 83.3 for women, to 85.7 (95% credible interval 84.2 to 87.4) for men and 87.6 (95% credible interval 86.7 to 88.9) for women by 2030.

This is higher than predictions from the Office for National Statistics. However, this increase is expected to come at the cost of growing inequality between districts.

Improvements in life expectancy from 1981-2012 varied a great deal between districts. In 1981, men in districts with the best life expectancies could expect to live 5.2 years longer than those in the areas with the lowest life expectancies (4.5 for women).

By 2012, this had increased to a difference of 6.1 years for men and 5.6 years for women. The study says this trend is expected to accelerate, so that by 2030 the difference in life expectancy between the best and worst districts could reach 8.3 years for both men and women.

Most of the districts with the lowest life expectancies now and in 2030 were in south Wales and the northeast and northwest of England. The areas with the highest life expectancy were mostly in the south of England and London. However, London districts varied from the highest to the lowest life expectancy levels.

The gap between men and women's life expectancy is expected to shrink further. It has already shrunk from 6 years in 1981 to 3.8 years in 2012, and by 2030 it could be only 1.9 years. In some areas, there may be no difference between men and women's life expectancy at all.  

How did the researchers interpret the results?

The researchers say their results are a more accurate prediction of how life expectancy will increase than official figures, and are the first to look consistently at changes in life expectancy at the district level over a long period of time.

They say the increase is likely to be the result of better survival in people over the age of 65. They say men's life expectancy will rise faster than women's, partly because of the effect of social trends such as smoking among middle-aged and older women.

The researchers claim the data will allow local authorities to plan better for the future, especially as much health and social care is now the responsibility of local areas. However, they also say the figures provide a warning that inequality in England and Wales will continue to rise.  

Conclusion

This analysis of population data provides some fascinating information about how life expectancy has changed over the past 30 years, and how it may change in the future.

It found life expectancy for men and women will continue to rise. However, it also found the existing trends of the difference in life expectancy between different districts will continue to rise, which is of concern.

Although the data shows more deprived areas have seen less of an improvement in life expectancy, the study cannot inform us what factors are responsible for the differences in life expectancy.

There is one big limitation of any study that predicts life expectancy in the future: the figures are always based on trends from death rates in the past, and assume that past trends will continue into the future.

These types of studies cannot account for unexpected events or major social changes that could have a huge effect on life expectancy. For example, they can't build into their models the potential for unlikely events such as a big natural disaster, changes within the healthcare system, or even a major health breakthrough, such as a cure for heart disease or cancer.

It's worth remembering, too, that life expectancy figures represent the life expectancy of a baby born in that particular year. So the life expectancy figures for 2012 don't represent life expectancy for adults alive in 2012, but for babies born that year. This means the figures for 2030 don't yet apply: they are only predictions for babies born in the future.

The study can't be used by individuals to predict how long they may live, but it does provide useful data to plan for pensions and health and social provisions in the future.

If you are keen to live to 2030 and beyond, your best bet is to take steps to reduce your risk of the five leading causes of premature death.

Read more about the top five causes of premature death.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Life expectancy increases but gap widens between rich and poor. The Guardian, April 30 2015

Men near equality with women in life expectancy. The Times, April 30 2015

Men 'catching up' on life expectancy. BBC News, April 30 2015

Britain is facing a life expectancy timebomb: By 2030, the average man will live to 85... and women will reach 87. Daily Mail, April 30 2015

Richest one per cent will live over eight years longer on average than those living in poorest parts of UK by 2030, say experts. The Independent, April 30 2015

Britons to live up to four years longer than official estimates by 2030. The Daily Telegraph, April 30 2015

Why we are all living longer: Life expectancy set to soar for Britons. Daily Express, April 30 2015

Links To Science

Bennett JE, Li G, Foreman K, et al. The future of life expectancy and life expectancy inequalities in England and Wales: Bayesian spatiotemporal forecasting. The Lancet. Published online April 29 2015

Categories: NHS Choices

New prostate cancer treatment has promising results in mice

Thu, 30/04/2015 - 12:00

"Prostate cancer resistant to conventional treatment could be all but wiped out by a therapy that boosts the immune system," the Daily Mail reports. The therapy, as yet only used in mice, enabled chemotherapy to destroy cancer cells in mice with previously treatment-resistant prostate cancer.

Abnormal body cells will usually be recognised by the immune system and destroyed. However, the fact the cancers develop and progress, and can be resistant to treatment, shows that something is preventing these cells from being destroyed.

Previous research has suggested that immune cells called B cells (which make antibodies) may have a role in making prostate tumours resistant to chemotherapy. This mouse study investigated this further by looking at different ways to suppress these B cells, using immune therapy or genetic techniques. It found that once these B cells were blocked or removed, a chemotherapy drug (oxaliplatin) was then able to attack and destroy mouse prostate tumours.

The researchers have dubbed this approach "chemoimmunotherapy", as it combines chemotherapy with immunotherapy (having an effect on the immune cells).

It is too soon to know whether "chemoimmunotherapy" could be the answer for progressive and treatment-resistant cancer in humans – prostate or any other type of cancer.

However, this study could aid further understanding of how the immune system tackles cancer, potentially leading to new treatment approaches.


Where did the story come from?

The study was carried out by researchers from the University of California, Institute of Immunology in Berlin, Medical University of Vienna and the University of Veterinary Medicine in Vienna. There is no information about external funding. 

The study was published as a letter in the peer-reviewed scientific journal Nature (letters are short reports of new research that are of potential interest to other researchers).

Media coverage was fair, but over-optimistic about the results being applicable to humans. It is an exaggeration of this very early stage study to suggest that advanced prostate cancer could be all but "wiped out", as both the Daily Telegraph and the Daily Mail did.

To its credit, the Mail’s headline made it clear that the experiment was in mice. The Telegraph also mentioned this, below its headline.


What kind of research was this?

This was a laboratory experiment using mice, exploring how the body’s immune system deals with cancer.

Abnormal body cells will usually be recognised by the immune system and destroyed. However, the fact the cancers develop and progress, and can be resistant to treatment, shows that something is preventing these cells from being destroyed. The possible reasons are poorly understood.

Previous research suggested that immune system cells, called B cells (which make antibodies), may be involved in making prostate cancer cells progress and become resistant to chemotherapy. As the researchers point out, although early prostate cancer responds well to chemotherapy, this is not the case with advanced or established tumours.

The researchers aimed to look at whether by disabling or blocking the B cells in mice, chemotherapy may be more successful in activating the immune system to fight the cancer. This would be an approach of combined chemotherapy and immunotherapy – known as chemoimmunotherapy.


What did the research involve?

The research used mouse models of metastatic prostate cancer that were resistant to the chemotherapy drug oxaliplatin, which is used in the treatment of aggressive prostate cancer in humans.

The researchers looked at different ways of suppressing the development or activity of the B cells that are thought to block the activity of chemotherapy drugs. They blocked or removed the B cells using immune-modulating drugs or genetic engineering techniques. Treated and untreated mice were then given oxaliplatin for a three-week period to look at the effects.

The researchers also investigated which are the crucial B cells that require elimination, including looking at human prostate cancer samples.

 

What were the basic results?

Researchers found that when the B cells were blocked or removed, the prostate tumours in the mice were successfully treated with oxaliplatin.

The researchers were able to identify the exact type of B cells that were responsible for blocking treatment, and these cells were also found in human prostate cancer samples resistant to chemotherapy.

 

How did the researchers interpret the results?

The researchers say in an accompanying press release that their findings call for clinical testing of "this novel therapeutic approach."

They also point out that in addition to prostate cancer, similar immunosuppressive B cells can be detected in other human cancers. They say this indicates that B cell-mediated immunosuppression might be the reason several other cancers do not respond to treatment, raising hopes that the combination of chemotherapy and immunotherapy could have broader applications for other cancers.

 

Conclusion

This study has built on the findings of previous research that has suggested B immune cells could have a role in making prostate tumours resistant to chemotherapy. This mouse study further investigated this by looking at different ways to suppress these B cells, using immune therapy or genetic techniques. It found that once these B cells were blocked or removed, chemotherapy was able to attack and destroy aggressive prostate cancer cells in mice.

The potential for a new chemoimmunotherapy treatment approach for cancer is promising. However, the study is still at a very early stage. While mouse studies can give an indication of how cellular processes work and how a treatment may work in humans, they are only indications, as there are inherent differences between the species. It is often the case that diseases in genetically engineered mice differ in key ways from the same disease in humans, so we can’t say whether the results of this study would be the same for humans.

It is too soon to know whether suppressing the B immune cell response could be the answer for progressive and treatment-resistant cancer in humans – prostate or any other type of cancer. It is also not known whether a safe and effective new immunotherapy treatment for cancer could be developed on the back of these results. Other immunosuppressants can cause a wide range of side effects, so the benefits of this treatment approach could be outweighed by the risks.

However, this study could further understanding of how the immune system tackles cancer, potentially leading to new treatment approaches.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

New drug 'helps healthy cells kill prostate cancer': Test carried out in combination with chemotherapy drugs achieve almost complete remission in mice. Daily Mail, April 30 2015

Prostate cancer could be 'wiped out' by new treatment. The Daily Telegraph, April 29 2015

Links To Science

Shalapour S, Font-Burgada J, Di Caro G, et al. Immunosuppressive plasma cells impede T-cell-dependent immunogenic chemotherapy. Nature. Published online April 29 2015

Categories: NHS Choices

Diet swap study highlights bowel effects of western-style diet

Wed, 29/04/2015 - 13:30

"Diet swap experiment reveals junk food's harm to guts," BBC News reports.

20 American volunteers were asked to eat an African-style diet (high fibre and low fat) while 20 Africans were asked to eat a typical American-style diet (low fibre and high fat). The western diet seemed to contain more red and processed meat.

The researchers found that after just two weeks, both diets led to biological changes in the guts of both groups, such as changes in the microbes present and levels of inflammation.

The African-style diet led to changes that were suggested to possibly contribute to reduced bowel cancer (also known as colon cancer) risk in the long term, while the opposite was true of the western-style diet.

However, it was a very short-term study, which only looked at biological changes in the gut, and the authors say they can’t be certain that these led to the changes in bowel cancer risk.

That said, there is the striking figure that Americans are around 13 times more likely to develop bowel cancer than Africans, with similar rates existing in most western countries. There is also evidence that when non-western populations adopt a more westernised diet, there is a corresponding rise in bowel cancer cases.

The Department of Health advises people who eat more than 90 grams (g) of red and processed meat (cooked weight) a day to cut down to 70g, to help reduce their bowel cancer risk.

 

Where did the story come from?

The study was carried out by researchers from the University of Pittsburgh and other research centres in the US, Europe and South Africa. It was funded by the US National Institutes of Health, the UK National Institute for Health Research, the Academy of Medical Sciences, the Netherlands Organization (de Vos) for Scientific Research, the European Research Council and the Academy of Finland. The study was published in the peer-reviewed journal Nature Communications.

The news headlines generally focus on the effects of these diets on cancer risk – not making it clear that this study was not looking at cancer directly. Instead, it was looking at a range of indicators – biomarkers – that may provide an indication of how healthy a person’s digestive system is.

The BBC bucks this trend, with a more representative headline "Diet swap experiment reveals junk food's harm to gut", although the study was not specifically looking at junk food.

Some sources took a positive interpretation of the results, such as The Independent, which told us that "Adopting high-fibre diet could dramatically cut risk of bowel cancer". Others took a more negative approach, such as the Daily Express, whose headline was "Western diets can raise your cancer risk after just two weeks". While the study did find bowel changes after two weeks, we don’t know if these changes directly raise cancer risk or whether they remained after people changed back to their normal diet.

 

What kind of research was this?

This was an experimental study looking at the effects of two different diets – those of African-Americans and rural Africans – on the gut. Rural South Africans have much lower rates of bowel cancer than African-Americans – with fewer than 5 people per 100,000 affected, as opposed to 65 per 100,000 African-Americans.

Dietary differences are likely to be responsible for this difference, and the researchers wanted to see what the effect of the typical diets of these groups had on the gut. They did this by getting these two groups to effectively switch diets for two weeks and seeing what happened.

This study is appropriate for looking at short-term effects of diet on the gut, which might be related to cancer risk if the diet was maintained in the long term.

However, a long-term study would be unethical, as you would be exposing some people to a diet that you know, or at least strongly suspect, is unhealthy.

 

What did the research involve?

The researchers recruited 20 healthy African-Americans aged 50 to 65 years old, living in the US, and an age and sex-matched group of 20 South Africans living in a rural area. They were first assessed over a two-week period, where they ate their normal diet at home. They then switched to the "opposite" diet – either a western-style diet or a rural African-style diet provided by the researchers. The researchers then assessed what effect this had on their gut.

The rural African-style diet increased average fibre intake among the African-Americans from 14g to 55g per day, and reduced their fat from 35% to 16% of their total calorie intake. The western-style diet reduced fibre intake among the rural Africans from 66g to 12g per day, and increased their fat intake from 16% to 52% of their total calorie intake.

During this part of the study, the participants lived in research facilities and had their meals prepared for them. The meals were also designed to be appealing to participants. While there was some "junk food" in the western-style diet used in the study (hamburger, fries and hot dogs), there were also some healthier meals, such as chilli, rice and stuffed bell peppers. The rural African-style diet also included some foods that would not be traditionally served in Africa – such as vegetarian corn dogs and hushpuppies (a fried or baked ball of cornmeal batter). From the sample menus reported in the study, the western-style menus appeared to include more red and processed meat than the African-style meals – with the latter including more fish.

The investigations the researchers performed included collecting faecal samples to test them for bacteria and chemical by-products of digestion, and carrying out colonoscopies (where a small tube containing a light and a camera is inserted via the rectum to observe the bowel wall).

 

What were the basic results?

In their normal diet, the African-Americans ate two to three times more protein and fat than the rural Africans. In contrast, fibre intake was higher in the rural Africans’ diets. The cells in the walls of the African-Americans’ colons were dividing more than those in the rural Africans.

The researchers found that switching African-Americans to the high-fibre, low-fat diet led to an increase in the fermentation of sugars in their gut. This indicated a change in the microbes in the gut that are responsible for this process, and this was supported by testing which microbes were present.

There was also a reduction in the production of certain bile acids in the rural African diet. Some animal studies have suggested that these bile acids can promote cells to become cancerous, and human studies were also reported to have found that higher levels are linked with increased colon cancer risk. There was also a reduction in signs of inflammation of the walls of the colon, and the cells in the colon wall stopped dividing as quickly. Again, these changes could potentially predict lower cancer risk.

The opposite changes were observed in the rural Africans when they switched to a western-style diet.

 

How did the researchers interpret the results?

The researchers concluded that "in individuals from high-risk and from low-risk cancer populations, changes in the food content of fibre and fat had remarkable effects on their [bowel bacteria and metabolites] within two weeks, and, critically, that these changes were associated with significant changes in inflammation and proliferation [of the bowel lining]". They say these changes may not lead to changes in bowel cancer risk, but state that other studies suggest there could be links.

 

Conclusion

This study aimed to investigate various biological changes to the gut that occur when switching from a western-style low-fibre, high-fat diet to an African-style high-fibre, low-fat diet, and vice versa. These changes may partly explain why African-Americans living in the US have over 10 times the bowel cancer rate of rural Africans.

The differences seen may not solely have been due to the differences in fibre and fat. The western-style diet also appeared to contain more red and processed meat, which have also been linked to increased bowel cancer risk. It is also worth bearing in mind that this study only took place over two weeks, and the longer-term effects of these diets on the colon were not studied. The authors themselves acknowledge that they can’t be sure the changes they saw would directly lead to changes in cancer risk. However, other research suggests they might be if they were present in the long term.

The other limitations are that the study was relatively small and only included healthy middle-aged and older adults of African origin, so may not apply to the wider population. 

Overall, the results do not contradict current advice that consuming a high-fibre diet can reduce your bowel cancer risk. Meanwhile, obesity and a diet high in red and processed meat have been shown to increase bowel cancer risk.

Read more about bowel cancer prevention.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Diet swap experiment reveals junk food's harm to gut. BBC News, April 28 2015

Eating a western diet for just TWO WEEKS raises colon cancer risks: US-African diet swap reveals damaging impact of junk food. Mail Online, April 28 2015

Adopting high fibre diet could dramatically cut risk of bowel cancer, says study. The Independent, April 29 2015

Bowel cancer risk may be reduced by rural African diet, study finds. The Guardian, April 28 2015

Why Western diets can raise your cancer risk after just two weeks. Daily Express, April 28 2015

Links To Science

O’Keefe SJD, Li JV, Lahti L, et al. Fat, fibre and cancer risk in African Americans and rural Africans. Nature Communications. Published online April 28 2015

Categories: NHS Choices

Bullying may have worse long-term effects than child abuse

Wed, 29/04/2015 - 12:10

"Bullied children are five times more at risk of anxiety than those maltreated," reports the Daily Mail. A study looking at both UK and US children found an association between childhood bullying and anxiety, depression and self-harm in adulthood.

People bullied by their peers in childhood were found to be more likely to have mental health problems in young adulthood than those who were ill-treated by adults, including their parents.

But the headlines are misleading – this figure only reflects the results of the US study. The results from the UK part of the study, which included more than three times the number of children, were not nearly as dramatic.

There are also some problems with the way this study was designed. It relied on children and parents self-reporting their experiences, which may make the results less reliable. For obvious reasons, parents in particular may have played down their ill-treatment of their children.

Still, the authors' conclusion that schools, health services and other agencies should co-ordinate their response to bullying seems a valid suggestion.

If you are concerned that your child is being bullied, it's essential that you or your child, or both of you, talk to their school. You could ask to see their anti-bullying policy, which every school has to have by law. This will allow you to see how the school plans to prevent and tackle bullying.  

Where did the story come from?

The study was carried out by researchers from the University of Warwick in the UK and Duke University Medical Centre in the US.

It was funded by The Wellcome Trust, the Medical Research Council, and the Economic and Social Research Council in the UK, and the National Institute of Mental Health, the National Institute on Drug Abuse, NARSAD (Early Career Award), and the William T Grant Foundation in the US.

It was published in the peer-reviewed medical journal, The Lancet Psychiatry on an open-access basis, so it is free to read online or download as a PDF.

The study was widely covered by the media. However, the Mail's assertion that bullied children are five times more at risk of anxiety than those maltreated by adults is misleading.

This figure was also used in other news sources and in an accompanying press release, but it only reflects the results of a US study. The figures from the UK, which involved more than three times the number of children, were not as striking.

What kind of research was this?

This was a cohort study exploring the long-term mental health effects of bullying in childhood compared with a child's ill-treatment by adults.

The researchers say ill-treatment by adults in childhood, such as neglect, cruelty and sexual abuse, is a matter of intense public concern. It has been shown to increase the risk of mental ill health, substance abuse and suicide attempts.

Verbal and physical abuse (bullying) by other children is also a global issue, with one in three children across 38 countries reporting being bullied. It can also have similar adverse effects in adulthood.

The researchers aimed to find out whether mental ill health is a result of both ill-treatment and bullying, or whether bullying has an independent effect. 

What did the research involve?

The research was based on two large ongoing cohort studies of families. One involved 4,026 children from the UK and the other had 1,420 children from the US.

The UK study aims to look at the health and development of children during childhood and beyond. The participants were pregnant women with an expected delivery date between April 1991 and December 1992.

From the first term of pregnancy, parents in the study completed postal questionnaires about themselves and their child's health and development.

Mothers provided information on maltreatment between the ages of 8 weeks and 8.6 years, while the children reported on bullying at ages 8, 10 and 13. "Maltreatment" was assessed as physical, emotional or sexual abuse, or "severe maladaptive parenting".

Children attended annual assessment clinics, including face-to-face interviews and psychological and physical tests, from the age of seven onwards.

The US study is based on a sample of three groups of children aged 9, 11 and 13 years who were recruited in 1993. The parents and children were repeatedly interviewed and asked about bullying and maltreatment.

This included any physical or sexual abuse, or harsh parental discipline. The children were screened for behavioural problems and mental disorders up until young adulthood.

The researchers controlled the results for factors thought to increase the risk of child abuse and bullying, including the sex of the child, family hardship and the mother's mental health. They assessed for these factors during pregnancy for the UK cohort, and at annual parent and child interviews for the US cohort. 

What were the basic results?

The researchers found that:

  • In the US cohort, children who were bullied were nearly five times more likely to suffer anxiety than children who were maltreated (US cohort odds ratio [OR] 4.9; 95% confidence interval [CI] 2.0 to 12.0).
  • In the UK group, compared with children who were maltreated, children who were bullied were more likely to have depression (OR 1.7, 1.1-2.7) and self-harm (OR 1.7, 1.1-2.6).
  • In the US cohort, children who were maltreated but not bullied were four times more likely to have depression in young adulthood compared with children who were not maltreated or bullied (OR 4.1, 95% CI 1.5-11.7).
  • In the UK cohort, those who were maltreated but not bullied were not at an increased risk for any mental health problem compared with children who were not maltreated or bullied.
  • In both cohorts, those who were both maltreated and bullied were at an increased risk for overall mental health problems, anxiety and depression compared with children who were not maltreated or bullied. In the UK cohort, they were also at risk of self-harm.
  • In both cohorts, children who were bullied by peers but not ill-treated by adults were more likely to have mental health problems than children who were maltreated but not bullied (UK cohort OR 1.6, 95% CI 1.1-2.2; US cohort OR 3.8, 95% CI 1.8-7.9).
How did the researchers interpret the results?

The researchers say that being bullied by peers in childhood had generally worse long-term adverse effects on young adults' mental health than ill-treatment by adults.

The findings have important implications for public health planning and service development for dealing with peer bullying, they argue. 

Conclusion

The two sets of results from differing cohort groups make the findings of this study quite confusing. For example, the abstract and press release highlight the near five-fold increased risk of anxiety (OR 4.9) when children had been bullied only, compared with children ill-treated by adults. But this figure only comes from the US cohort.

The confidence interval for this figure is very wide, suggesting it may not be reliable. In the UK cohort, the increased risk for anxiety among those who were bullied was small, but this was not included in the abstract or the press release.

The study relied on both adults and children self-reporting bullying or maltreatment by adults, which may undermine its reliability. Adults especially may be less inclined to report ill-treatment by themselves or a partner, although the authors tried to design the study in a way to guard against this. Also, as the authors point out, the study makes no distinction between abuse by adults and harsh parenting.

In the UK cohort, not all children completed the mental health assessment at 18 years. Those with more family problems were more likely to drop out, which could also make the results less reliable. There may also have been some selection bias of those people who agreed to participate in the study in the first place.

The study also failed to take cyberbullying into account, although the authors say previous studies have shown an overlap between "traditional" forms of bullying and cyberbullying.

Across both cohorts, about 40% of children who were ever maltreated were also bullied. As the authors point out, it is possible that being ill-treated may make children more susceptible to being bullied, or that both types of abuse have common risk factors.

Read more advice about bullying, including spotting the signs and what you can do to help.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Childhood bullying has worse long-term negative impact on kids than maltreatment by adults. Daily Mirror, April 28 2015

Children bullied by peers 'at greater mental health risk'. The Guardian, April 28 2015

Being bullied is 'worse than child neglect or abuse': Youngsters tormented at school are much more likely to suffer anxiety, depression or self-harm. Daily Mail, April 29 2015

'Being bullied worse than neglect and abuse,' study says. The Daily Telegraph, April 27 2015

Links To Science

Lereya ST, Copeland WE, Costello EJ, Wolke D. Adult mental health consequences of peer bullying and maltreatment in childhood: two cohorts in two countries. The Lancet Psychiatry. Published online April 28 2015

Categories: NHS Choices

Brain’s 'hunger hub' could be switched off

Tue, 28/04/2015 - 12:30

"Have scientists found a way to banish hunger pangs?" the Mail Online asks. The question is prompted by research in mice looking at the "biological pathways" that regulate appetite and hunger.

While it may feel like the sensation of hunger is triggered by the stomach, it is actually the brain that causes the sensation - specifically, a region of the brain called the hypothalamus.

Experiments found that the brains of hungry mice produce a chemical that targets certain nerve cells. These then stimulate more nerve cells, which promote appetite. The signals to the appetite-stimulating nerve cells can be blocked by the chemical POMC. 

The researchers suggest that this process could be a target for weight loss treatment, possibly in the form of an appetite suppressant. 

However, it is too early to confirm whether this could be a possibility. Biological pathways may be similar in humans, but we don't know if they are exactly the same. Even if further research confirms this pathway as being a key regulator of food intake in humans, no treatment targeting this area has been developed. The researchers used invasive techniques, such as surgery or injections, to manipulate the pathways in mice, rather than drug treatment.

The study furthers the understanding of brain pathways involved in appetite regulation, but the findings have no current implications.

 

Where did the story come from?

The study was carried out by researchers from the University of Edinburgh, Harvard Medical School and other US institutions. It received funding from various sources, including the University of Edinburgh Chancellor’s Fellowship and the US National Institutes of Health.

The study was published in the peer-reviewed scientific journal Nature Neuroscience.

The Mail Online may have jumped the gun in calling this research a "breakthrough that could help dieters lose more weight". We are a long way from knowing whether a safe and effective treatment could be developed on the back of this research, and even further from knowing whether such a treatment could make dieters "less grumpy".  

The Daily Telegraph’s coverage is more restrained and includes some interesting, if over-optimistic, quotes from the researchers themselves. 

 

What kind of research was this?

This was a laboratory study looking at how appetite is regulated by brain cells in the arcuate nucleus (ARC) of the hypothalamus. The hypothalamus regulates hormone production, keeping our body processes in balance; these include temperature, sleep and appetite.

The researchers say there are two sets of brain cells in the ARC that work to regulate appetite. Some signal that the body is full, others that the body is hungry and needs to eat. ARC agouti-related peptide (AgRP) increases food intake, while pro-opiomelanocortin (POMC) decreases food intake. It is thought that both control appetite by influencing downstream nerve cells – melanocortin-4 receptor (MC4R)-expressing nerves. Previous studies have shown that MC4R nerves have an effect on feelings of fullness and promoting weight loss. These nerves are located in a different part of the hypothalamus – the paraventricular nucleus of the hypothalamus (PVH).

This study used various mice genetically engineered to have functioning or non-functioning versions of these nerves, to further explore the nerve pathways that control appetite in the hypothalamus.

 

What did the research involve?

The research team used a large number of laboratory experiments in mice to explore in detail the brain pathways involved in appetite and feeding behaviour.

They did a lot of different experiments, which included manipulating brain circuitry through genetic engineering and surgery to measure the effect on energy expenditure, eating habits and other appetite-related behaviour. For example, one of the experiments involved switching off different brain cells by exposing the mice to blue laser light, via an optical fibre implanted into their brains. This allowed them to see what role these brain cells were playing. Other experiments involved manipulating cell function via injections. They also analysed the brains of mice after they’d died.

All the experiments aimed to build a clearer picture of the specific roles of AgRP, POMC and MC4R brain cell signalling in appetite and feeding behaviour. 

 

What were the basic results?

The researchers found that not having enough energy activated the AgRP cells of the ARC, and this switched off the MC4R nerve cells of the PVH, driving hunger, appetite and food intake.

MC4R had this effect by activating the lateral parabrachial nucleus (LPBN) pathway. Activating this brain circuit promoted appetite.

Meanwhile, fullness stimulates the POMC cells of the ARC and "switches on" the MC4R cells.

In short, switching MC4R nerve cells off heightened hunger, while turning them on made the mice feel full.

 

How did the researchers interpret the results?

The researchers say the effects of MC4R nerve cells on LPBN nerve cells support this as a brain circuit for suppressing appetite, and highlight it as "a promising target for antiobesity drug development".

 

Conclusion

This study in mice explored the nerve cell pathways in the hypothalamus that regulate appetite.

It found that hunger drives cells producing a chemical called AgRP to increase food intake. They do this by acting upon MC4R nerve cells in another region of the hypothalamus, which in turn stimulate another nerve cell pathway (LPBN) to stimulate appetite. Meanwhile, another group of nerve cells producing a chemical called POMC block this MC4R pathway when we are full.

The researchers suggest that this MC4R and LPBN pathway could be a target for weight loss treatment. However, it is too early to say if this is a possibility. This laboratory research has only studied mice, and though biological pathways may be similar in humans, we do not know if they are exactly the same. Even if further study identifies the same pathway used in humans, there is currently no treatment to target it. There would be many stages in drug development to go through before it is known whether a treatment could be developed, and then whether it could be safe and effective.

There are techniques you can use to resist the temptation of abandoning your diet goals for the day, such as recognising the triggers, like stress or tiredness, that cause you to overeat. Once you do this, try to find new methods, other than eating, to cope with the triggers.

Read more about "Diet Danger Zones" and how to avoid them.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Have scientists found a way to banish hunger pangs? Breakthrough could help dieters lose more weight - and make them less grumpy. Mail Online, April 27 2015

Have scientists found a way to 'switch off' hunger? The Daily Telegraph, April 27 2015

Links To Science

Garfield AS, Li C, Madara JC, et al. A neural basis for melanocortin-4 receptor–regulated appetite. Nature Neuroscience. Published online April 27 2015

Categories: NHS Choices

No evidence organic milk in pregnancy lowers a baby's IQ

Tue, 28/04/2015 - 12:00

"Pregnant women who switch to 'healthier' organic milk may be putting the brain development of their unborn babies at risk," The Guardian reports after researchers found organic milk had lower levels of iodine than standard milk.

Iodine is needed for the healthy function of the thyroid gland. Thyroid hormones are needed for the development of the brain and spinal cord in an unborn baby. This means a sufficient iodine intake during pregnancy is important, as it is throughout life.

As a result of different farming systems, milk produced from grass-fed organic cows during the summer is known to contain less iodine than standard milk.

This study compared samples of milk taken from supermarkets during the winter. It found that in the winter, organic milk still contains around a third less iodine than standard milk. This is regardless of fat content. But the iodine content in a normal 346ml glass of organic milk was still enough to provide the recommended daily iodine intake.

Despite the headlines, the researchers did not actually look at the effects of milk consumption on any measure of child health, including intelligence. The study also did not consider the iodine content of other dairy products or non-dairy sources, such as eggs, fish and certain grains.

This study therefore does not provide any evidence to suggest drinking organic milk during pregnancy could have a negative impact on a child's IQ.

But it is worth being aware that organic milk is likely to contain less iodine than standard milk, so you may need to balance your intake through other sources. 

Where did the story come from?

The study was carried out by researchers from the University of Reading and was published in the peer-reviewed journal, Food Chemistry. 

It was funded by the University of Reading, and the authors declare no conflicts of interest.

The media headlines give the impression the study found evidence organic milk can harm babies' development. This is not the case.

While it is true iodine is needed for the development of a healthy brain and nervous system, this study only compared the iodine content of a sample of different milks. It didn't look at any health outcomes for babies whose mothers drank organic or non-organic milk during pregnancy. 

What kind of research was this?

This was a cross-sectional study that aimed to compare the iodine content of organic and standard milk produced during the winter; whole, semi-skimmed and skimmed milk; and pasteurised and ultra-high-temperature (UHT) treated milk.

The researchers explain how iodine is a key component of the hormones released by the thyroid gland. These hormones are important for the development of the foetal brain and spinal cord. This makes iodine intake in pregnancy important.

The recommended intake for adults in the UK is 140mcg (0.14mg) a day, with no recommended increase during pregnancy or breastfeeding. 

Previous studies have observed an increase in iodine deficiency in the UK, particularly among teenage girls, as we reported in 2011. Milk and dairy foods are the main source of iodine intake in this country.

One study found organic milk produced during the summer has lower iodine concentration than standard milk. But there has been little research comparing organic milk produced in the winter with standard milk, or looking at the effect of the fat content in milk or the processing method. This is what this study aimed to investigate. 

What did the research involve?

The researchers carried out two studies to investigate this. In the first study, they purchased 22 samples of organic and standard milk (whole, semi-skimmed and skimmed) from two supermarkets in Reading in late January 2014.

In the second study, they purchased 60 samples of milk from four supermarkets in Reading over three consecutive weeks starting from the beginning of February.

They bought five different types of milk product:

  • standard semi-skimmed
  • organic semi-skimmed
  • branded organic semi-skimmed
  • UHT semi-skimmed
  • Channel Island standard whole milk

Milk samples were then analysed in the laboratory for fat, protein, lactose and iodine concentrations. 

What were the basic results?

In the first study, the researchers found standard or organic production systems made no difference to the fat, protein or lactose content of milk (whole, semi-skimmed or skimmed).

However, organic milk had a significantly lower iodine concentration than standard milk – about a third lower. Standard milk contained 595mcg of iodine per litre, compared with 404mcg per litre in organic milk.

The second study similarly found organic milk had a significantly lower iodine concentration than standard milk. Again, this was about a third lower, with standard milk containing 474mcg of iodine per litre versus 306mcg per litre in organic milk. Branded organic milk tended to have a lower iodine content than own-brand organic milk.

UHT milk also had a significantly lower iodine content than standard milk and was no different from organic milk. The iodine content in the standard Channel Island whole milk was no different from the other standard milks.  

How did the researchers interpret the results?

The researchers say their results "indicate that replacement of [standard] milk by organic or UHT milk will increase the risk of suboptimal iodine status, especially for pregnant or [breastfeeding] women".  

Conclusion

Previous studies have shown that organic milk produced during the summer has lower iodine content than standard milk. This is said to have been the first study comparing the milks in the winter. It also found iodine concentration is lower in organic milk.

During the winter, cows housed indoors receive more iodine supplement through their feed concentrate than grazing cows in the summer. Winter milk is therefore known to contain more iodine than summer milk.

It might have been expected that there would be less of a difference between organic and standard milk during the winter. But organic systems are known to rely more heavily on foraged feed than standard systems, which is why the iodine content of organic milk is still expected to be lower.

However, before leaping to the conclusion that everyone should avoid organic milk – particularly pregnant and breastfeeding women – there are some points to bear in mind.

  • The study only compared samples from a small number of supermarkets from two months in the winter of 2014. Though these are likely to give a good indication, the iodine content of milk may vary across the country and in different years.
  • Although there was almost 200mcg (0.2mg) less iodine per litre in organic milk compared with standard milk, this may not mean a person who drinks this milk has an insufficient iodine intake. The amount of iodine in organic milk was still sufficient to provide the daily recommended intake of iodine in a standard glass of 346ml.
  • The study also has not taken into account other dietary sources of iodine beyond milk. It did not compare the iodine content of other organic and standard dairy products, such as cheese and yoghurt. Nor did it look at non-dairy sources, such as eggs, fish and grains. Pregnant women have to be careful about eating some non-milk sources of iodine, such as soft cheeses, undercooked eggs and seafood, and are advised to limit their intake of certain fish, such as tuna.
  • Iodine is needed to help the development of the foetal brain and nervous system. But this study did not look at health outcomes for the foetus or infant. The study did not compare the outcomes for a group of pregnant women who drank organic milk throughout pregnancy against outcomes for women who drank non-organic milk. News reports that organic milk may harm an unborn baby or affect IQ are therefore not supported by the results of this study.
  • Excess iodine intake could have an impact on the way the thyroid works. It should be possible to get all the iodine you need through a balanced diet without the need for supplements, even during pregnancy. The current advice is you should take no more than 500mcg (0.5mg) of iodine supplements a day.
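The figures above can be sanity-checked with simple arithmetic. This sketch uses the concentrations reported in the article (the 404mcg/l organic figure comes from the first study, the 474/306mcg/l pair from the second):

```python
# Rough check of the iodine figures quoted in the article.
# Values are taken from the article's text, not from the study itself.

UK_RDI_MCG = 140       # UK recommended daily iodine intake for adults (mcg)
GLASS_LITRES = 0.346   # the "standard glass" size used in the article

# Second study: winter concentrations (mcg of iodine per litre)
standard_milk = 474
organic_milk = 306

# Organic milk contains roughly a third less iodine than standard milk
reduction = (standard_milk - organic_milk) / standard_milk
print(f"Organic milk contains {reduction:.0%} less iodine")  # ~35%

# At the first study's organic concentration (404 mcg/l), one 346ml
# glass still covers the recommended daily intake almost exactly
per_glass = 404 * GLASS_LITRES
print(f"Iodine per 346ml glass: {per_glass:.0f} mcg (RDI is {UK_RDI_MCG} mcg)")
```

Note that at the second study's lower organic concentration (306mcg/l) a single glass would fall short of the 140mcg daily intake, which is why other dietary sources matter.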

The possible benefits and drawbacks of organic versus non-organic farming methods have often been debated. There is no firm evidence that organic foods offer any health benefits.

The choice about whether or not to go organic is often prompted by ethical concerns about animals and the environment. Pregnant and breastfeeding women still have this option – there is no evidence that drinking organic milk could harm an unborn baby.

If you drink organic milk, it is likely to contain less iodine than standard milk, so you may need to balance your intake through other sources. Good food sources of iodine include fish and shellfish.

Pregnant women should never eat raw shellfish and should also avoid eating shark, swordfish and marlin because of their high mercury content.

Read more about foods pregnant women should avoid

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Organic and UHT milk could put unborn babies at risk, says study. The Guardian, April 28 2015

Can drinking organic milk in pregnancy cut baby's IQ? Normal milk contains a third more iodine vital to brain growth. Daily Mail, April 28 2015

Organic milk 'may harm babies brain development'. ITV News, April 28 2015

Organic milk can put babies' health at risk. The Times, April 28 2015

Links To Science

Payling LM, Juniper DT, Drake C, et al. Effect of milk type and processing on iodine concentration of organic and conventional winter milk at retail: Implications for nutrition. Food Chemistry. Published online January 24 2015

Categories: NHS Choices

Having a spine similar to a chimp could lead to back pain

Mon, 27/04/2015 - 13:00

"People with lower back problems are more likely to have a spine similar in shape to the chimpanzee," BBC News reports. Research suggests that humans with similar shaped vertebrae to chimps are more vulnerable to developing a slipped disc.

Back pain is a common problem that affects most people at some point in their life. One of its leading causes is what is known as a slipped disc – when one of the discs that sit between the bones of the spine (the vertebrae) is damaged and presses on the nerves.

But our knuckle-walking ape cousins don’t suffer nearly as much. One explanation is that our back problems are due to the extra stress placed on our backs from standing upright.

Scientists studying the vertebrae of chimpanzees, medieval humans and orangutans found humans with disc-related back problems had spines more similar in shape to chimpanzees.

Back problems in this study were defined as the presence of a lesion called a Schmorl's node; they are most often seen in people who have a slipped disc and can be a general sign of degeneration in the spine, though their significance is not completely understood. The participants, however, were long dead, so we don’t actually know if they had back pain.

The researchers think this knowledge could be used to identify people who are more likely to have back problems, based on the shape of their spines. This is plausible, but not yet a reality.

Where did the story come from?

The study was carried out by researchers from universities in Canada, Scotland, Germany and Iceland. It was funded by the Social Sciences and Humanities Research Council, Canada Research Chairs Program, Canada Foundation for Innovation, British Columbia Knowledge Development Fund, MITACS, and Simon Fraser University.

The study was published in the peer-reviewed science journal BMC Evolutionary Biology. This is an open-access journal, so the study is free to read online.

Generally, the UK media reported the story accurately, avoiding the common pitfall of saying, or implying, that humans have evolved from chimps. This is not the case. We both have a common ancestor, so are cousins, albeit cousins who shared a grandparent 5-10 million years ago.

Many articles suggested that the finding may help identify people at a higher risk of back pain, such as athletes. However, any implications from this study are not completely clear, and we don’t yet know how useful this knowledge would be in practice.

 

What kind of research was this?

This was an evolutionary study looking at the spines of human and non-human primates to see how differences might relate to back problems.

Back pain is a common problem that affects most people at some point in their life. However, our ape cousins don’t suffer nearly as much. One explanation is that our back problems are due to the extra stress placed on our backs from standing upright. Non-human apes don’t walk upright nearly as much as humans.

Our ape ancestors' vertebral shape would not have been adapted for walking upright. Because of this, the research team predicted that people whose vertebrae were at the more ancestral end of the range of shape variation can be expected to suffer disproportionately more from load-related spinal disease.

 

What did the research involve?

The last thoracic (upper back) and first lumbar (lower back) vertebrae from 71 humans, 36 chimpanzees and 15 orangutans were scanned using computers and compared in detail for subtle differences in their shape and position of bony landmarks.

The human vertebrae were from skeletons dug up from the medieval and post-medieval period, while chimpanzee and orangutan vertebrae were a mix of wild and zoo animals from US Natural History museums.

Of the human vertebrae, about half had Schmorl’s nodes, and half did not. The spine is made up of stacks of bone (vertebrae) and discs (cartilage), making the spine both strong and moveable. The nodes are small bulges of the cartilage disc into the adjacent bony vertebrae.

They are most often seen in people who have a slipped disc and may be a general sign of degeneration and inflammation in the spine.

However, the nodes' significance in slipped discs and back pain is not completely understood. For example, some people who have them have pain, while others do not. For the purposes of this research, vertebrae with Schmorl’s nodes were referred to as “diseased” and those without as “healthy”. None of the non-human ape vertebrae were classed as diseased.

The researchers fed all this information into a statistical model to predict spine health for human and non-human apes.

 

What were the basic results?

The predictive model showed there were differences between the vertebrae of healthy humans, chimpanzees and orangutans. Crucially, it found no difference between diseased human vertebrae and chimpanzee vertebrae.

This suggested that humans with Schmorl’s nodes are closer in shape to chimpanzee vertebrae than healthy human vertebrae.

 

How did the researchers interpret the results?

The research team concluded: "The results support the hypothesis that intervertebral disc herniation [a "slipped disc"] preferentially affects individuals with vertebrae that are towards the ancestral end of the range of shape variation within H. sapiens [modern humans] and therefore are less well adapted for bipedalism [walking upright on two legs]. This finding not only has clinical implications, but also illustrates the benefits of bringing the tools of evolutionary biology to bear on problems in medicine and public health."

 

Conclusion

This evolutionary research used a small sample of vertebrae from humans, chimpanzees and orangutans to show that people with a disc bulge had spines more similar in shape to chimpanzees than healthy humans. The research team took this as a sign that people with vertebrae shaped more like a chimpanzee's may be more likely to have disc-related back problems because they are less well adapted, evolutionarily speaking, to walking upright.

The main limitation of the study is the use of Schmorl’s nodes to label spines as "diseased" vs. "healthy", and to assume the presence of the nodes was a sign of back pain. Obviously, the skeletons could not be asked whether they experienced back pain. The significance of Schmorl’s nodes is still not completely understood. Not everyone with them has back pain, so the results are less widely applicable than they may appear.

The study also used a relatively small number of vertebrae to reach its conclusions. The reliability of the findings would be improved if they were replicated using more vertebrae.

The implications of the study were summed up by lead scientist Dr Kimberly Plomp, in The Daily Telegraph, who said: "The findings have potential implications for clinical research, as they indicate why some individuals are more prone to back problems … This may help in preventative care by identifying individuals, such as athletes, who may be at risk of developing the condition."

This may be possible, but at this stage in the research, we can’t draw any firm conclusions.

The study isn’t applicable to all back pain, only pain related to specific disc bulges. The findings are not relevant to the large number of people with general mechanical back pain that has no specific identified cause, or to people whose back pain has other disease or injury causes.

For advice on how to prevent and treat back pain, visit the NHS Choices Back Pain Guide.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Lower back pain linked to chimpanzee spine shape. BBC News, April 27 2015

Back pain sufferers may have 'vertebrae like apes'. The Daily Telegraph, April 27 2015

Back pain 'linked to chimpanzee ancestors'. ITV News, April 27 2015

Links To Science

Plomp KA, Viðarsdóttir US, Weston DA, et al. The ancestral shape hypothesis: an evolutionary explanation for the occurrence of intervertebral disc herniation in humans. BMC Evolutionary Biology. Published online April 27 2015

Categories: NHS Choices

Parents 'may pass anxiety on to their children'

Mon, 27/04/2015 - 13:00

The Mail Online has given stressed-out parents one more thing to worry about, saying: "Anxiety is 'catching' and can be passed on to children", adding that, "Attitudes of over-anxious parents can severely affect children's behaviour".

The study that prompted these headlines used an interesting "children of twins" study design intended to filter out the influence of genetics, which is known to have an effect on anxiety.

To do this, researchers studied patterns of anxiety in families of identical twins, who are genetically identical, and in families of non-identical twins.

They found there was some link between anxiety and neuroticism (a tendency to have negative thought patterns) in parents and their adolescent children.

There was no evidence that genetics was playing a significant role, but modest evidence that non-genetic factors were. This suggested that anxiety, far from being hardwired into DNA, might be passed on in other ways, such as through learned or mimicked behaviour.

In the Mail Online, journal editor Dr Robert Freedman said: "Parents who are anxious can now be counselled and educated on ways to minimise the impact of their anxiety on the child's development."

This suggestion seems a touch premature – as noted by the researchers, there is a chicken and egg situation here that has not been resolved. Do children worry because they sense their parents are worried, or do parents worry because they see their children are worried about something? 

Family life is not always easy, but one way to boost your physical and mental health is to make the time to do activities as a family

Where did the story come from?

The study was carried out by researchers from universities based in London, Sweden and the US. It was funded by the Leverhulme Trust, the US National Institute of Mental Health, and the National Institute for Health Research.

The study was published in The American Journal of Psychiatry, a peer-reviewed medical journal. It has been made available online on an open-access basis, so it is free to read or download as a PDF.

Generally, the Mail Online reported the story accurately, but hardly mentioned the study's limitations. The quote from journal editor Dr Robert Freedman saying that, "Parents who are anxious can now be counselled and educated on ways to minimise the impact of their anxiety on the child's development", seems a little premature, based on the relatively weak associations found in this research.  

What kind of research was this?

This twin study investigated the relative role of genetic factors (nature) and non-genetic factors (nurture) in the transmission of anxiety from parent to child.

Non-genetic factors might be, for example, the children observing their parents' anxious behaviours and mimicking them, or the parenting style of anxious parents.

The researchers say it is well recognised that anxiety can run in families, but the underlying processes are poorly understood. This study wanted to find out whether genetics or environment was more important in the transmission of anxiety, by observing identical twins.

This type of study is commonly used for this type of question. It does not aim to pinpoint exact genes or non-genetic factors that play a role in a trait. 

What did the research involve?

The team gathered self-reported anxiety ratings from parents and their adolescent children. They compared the results between identical twin families and non-identical twin families to see to what extent non-genetic factors were driving anxiety transmission, in contrast to genetics. 

Data came from the Twin and Offspring Study of Sweden, which has information on 387 identical (monozygotic) twin families and 489 non-identical (dizygotic) twin families. A twin family comprised a twin pair where both twins were parents, each twin's spouse, and one of each of their adolescent children.

In families where the twins were identical, the cousins would share, on average, 50% of the same DNA with their (blood) aunt or uncle. In families where the twins were not identical, the cousins would share less of their DNA, on average, with their aunt or uncle.

If cousins whose parents are identical twins are more similar to their aunt or uncle for a trait than cousins whose parents are non-identical twins, this suggests that genes are playing a role.

Only same-sex twin pairs were used. Twin offspring were selected so that cousins were the same sex as one another and did not differ in age by more than four years, making them as similar as possible. The average age of the twin offspring was 15.7 years.

This type of study design, known as a "children of twins" study, is intended to dampen down the potential influence that family genetics could have on the outcomes being investigated.

Anxious parental personality was self-reported using a 20-item personality scale. Parents rated phrases such as, "I often feel uncertain when I meet people I don't know very well", and, "Sometimes my heart beats hard or irregularly for no particular reason".

Each item was ranked between 0 (not at all true) and 3 (very true), covering social and physical signs of anxiety, as well as general worry. There was a similar self-reported scale to measure neuroticism.

Offspring anxiety symptoms – social, physical and general worry – were measured in a similar way, using questions from a Child Behaviour Checklist.

Both parents and offspring rated their anxiety and neuroticism over the last six months. The researchers used computer modelling of the relationships between individuals and their traits to estimate the contribution of genetic and non-genetic factors. 

What were the basic results?

Analysis of the data suggested genetic factors were largely not driving the transmission of anxiety or neuroticism from parent to adolescent. Ratings of anxiety and neuroticism within and between twin families were only very weakly linked.

However, there was "modest evidence" that non-genetic transmission of both anxiety and neuroticism was happening. Although still a relatively weak relationship, it was statistically significant, unlike the genetic finding. 

How did the researchers interpret the results?

The research team said their results supported the theory that direct, environmentally mediated transmission of anxiety from parents to their adolescent offspring was the main driver, and not genetics.   

Conclusion

This study tentatively shows that environmental factors, as opposed to genetics, play a more important role in the transmission of anxiety from parents to their adolescent children.

However, it used self-reported anxiety ratings over a six-month period, so this tells us very little about any potential longer-term effects of anxiety transmission while growing up.

The correlations in the main results were quite weak. This means that not every adolescent with an anxious parent will "catch" or "take on" their parents' anxiety. This suggests that it's a more complex issue.

The results showed non-genetic (environmental) factors were more important than genetic, but precisely what these environmental factors were is not something this study can tell us.

The study used a clever and unique sample of twins and their families to drill down into the age-old debate about the influence of nature versus nurture. However, it doesn't prove that environmental factors are the main driver overall.

That notwithstanding, the authors suggest two main contrasting explanations for the results:

  • parental anxiety causes their children to be more anxious – this could happen through different learning and mirroring behaviours known to occur when children and adolescents grow and develop; for example, an adolescent witnessing repeated examples of parental anxiety may learn that the world is an unsafe place that should be feared
  • anxiety in the offspring influences the parenting they receive – the flipside is that a teenager showing anxious behaviour may cause their parents to worry; the research team add that this might in turn worsen the anxiety in the teenager, creating a self-reinforcing cycle

This twin study doesn't bring us any closer to knowing which explanation might be true, or to what extent this can be impacted by changes in behaviour.

Despite these limitations, the hypothesis that children are sensitive to their parents' attitudes and mood seems plausible. So, learning more about how to manage your stress and feelings of anxiety could be good for both you and your children.

For more information and advice, visit the NHS Choices Moodzone.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Anxiety is 'catching' and can be passed on to children, scientists warn over-protective parents. Mail Online, April 24 2015

Links To Science

Eley TC, McAdams TA, Rijsdijk FV, et al. The Intergenerational Transmission of Anxiety: A Children-of-Twins Study. The American Journal of Psychiatry. Published online April 2015

Categories: NHS Choices

Gene editing technique could prevent inherited diseases

Fri, 24/04/2015 - 13:45

"Researchers in the US have raised hopes for a simple genetic therapy that could prevent devastating diseases being passed on from mothers to their children," The Guardian reports.

The diseases in question are known as mitochondrial diseases, caused by mutations in mitochondrial DNA: a small section of DNA that is passed directly from mother to child.

Some children born with mitochondrial diseases can develop symptoms including muscle weakness, intestinal disorders and heart disease – and have reduced life expectancy.

One option to treat this, as we have discussed several times, is so-called "three-parent IVF", where unhealthy mitochondria are replaced by donor mitochondria.

This new technique from the US may offer an alternative approach.

The researchers developed a way to target and break down mutated mitochondrial DNA. They found they could successfully use this technique in mouse eggs. Once fertilised, these eggs could go on to produce healthy and fertile mice, with little of the targeted mitochondrial DNA in their cells. The technique also seemed to work on hybrid mouse-human cells carrying human mitochondrial DNA mutations in the lab.

This new technique is of interest because if it were effective and safe in humans, it could offer a way to prevent mitochondrial diseases without the need for the donor egg. Many questions remain for future studies to investigate before this technique could be considered for testing in humans.

 

Where did the story come from?

The study was carried out by researchers from the Salk Institute for Biological Studies and other research centres in the US, Japan, Spain and China.

The researchers were funded by the Leona M. and Harry B. Helmsley Charitable Trust, the US National Institutes of Health, National Basic Research Program of China, Chinese Academy of Sciences, National Natural Science Foundation of China, the JDM Fund, the Muscular Dystrophy Association, United Mitochondrial Disease Foundation, the Florida Department of Health and the G. Harold and Leila Y. Mathers Charitable Foundation.

The study was published in the peer-reviewed scientific journal Cell on an open-access basis, so the study is free to read online.

Both The Guardian and The Independent covered this research reasonably. One quote from a study author suggests that: "the technique is simple enough to be easily implemented by IVF clinics around the world", but it is important to realise that much more research is needed to make sure the technique is effective and safe before it could be tested in humans.

 

What kind of research was this?

This was laboratory and animal research aiming to develop a new way of preventing transmission of mutations in the mitochondrial DNA. This research is appropriate for the early development of new techniques, which may eventually be used to treat human disease.

While most of our DNA is found in a compartment of our cells called the nucleus, there is some DNA within the cell’s many mitochondria. These are the energy producing "powerhouses" of the cells. Mutations in this DNA can cause a range of serious diseases affecting the organs that need a lot of energy – such as the brain and muscles.

We inherit our mitochondria from our mothers. Researchers have developed techniques to avoid passing these mutations on, involving transferring the DNA from the mother’s nucleus into a donor egg. Manipulation of human embryos is tightly controlled in the UK, and after much debate, the government recently agreed to make it legal to perform these "three-parent IVF" techniques to prevent mitochondrial diseases.

One concern with these techniques is that the child inherits mitochondrial DNA from a third person (the egg donor). The current research aimed to develop a different technique to avoid passing on mitochondrial mutations that does not involve a donor egg. It is specifically aimed at women who have a mixture of mitochondria in their cells – some carrying a disease-causing mutation and some not.

 

What did the research involve?

The researchers developed a technique to reduce the amount of mutation-carrying mitochondrial DNA. This involved injecting into the cells genetic instructions for making a protein that is sent to the mitochondria, where it cuts the mitochondrial DNA in a specific place. They first tested this technique on mouse egg cells that carried a mixture of two types of mitochondrial DNA, one of which could be cut by the protein (the "target" mitochondrial DNA) and one which could not. They then checked to see if it could reduce the amount of "target" mitochondrial DNA.

They then tested it on fertilised "mixed mitochondrial DNA" mouse egg cells to see if it had the same effect and whether it affected development of the embryo. They also implanted treated embryos into host mother mice to see if the offspring were born healthy and how much of the target mitochondrial DNA they carried.

Finally, they modified their technique slightly so they could use it against human mitochondrial DNA carrying disease-causing mutations. After testing this adapted technique in mice, they tested it on cells in the lab containing human mitochondria with mutations that caused one of two different mitochondrial diseases:

  • Leber’s hereditary optic neuropathy and dystonia (LHOND)
  • neurogenic muscle weakness, ataxia and retinitis pigmentosa (NARP)

These are both rare conditions in humans that cause symptoms affecting the muscles, physical movement and vision.

These hybrid cells were created by fusing mouse egg cells and human cells carrying the mitochondrial mutations.

 

What were the basic results?

The researchers found that their technique reduced the amount of the target type of mitochondrial DNA in the "mixed mitochondrial DNA" mouse egg cells. Their technique performed similarly in fertilised embryos from these eggs. These embryos appeared to develop normally in the lab when examined under a microscope. The technique did not appear to affect the DNA in the mice’s nuclei.

When the treated embryos were implanted into host mothers, the offspring born also had much less of the target type of mitochondrial DNA throughout their bodies. They appeared to be healthy and develop normally in the tests performed, and could themselves produce healthy offspring. These offspring had such low levels of the target type of mitochondrial DNA that it was barely detectable.

The researchers were able to adapt their technique to target human mitochondrial mutations. It reduced the amount of mitochondrial DNA containing the LHOND or NARP mutations in hybrid egg cells in the lab.

 

How did the researchers interpret the results?

The researchers concluded that their "approaches represent a potential therapeutic avenue for preventing the transgenerational transmission of human mitochondrial diseases caused by mutations in [mitochondrial DNA]".

 

Conclusion

This early research has developed a new technique to reduce the amount of mutation-carrying DNA within mitochondria. The hope is that this technique might be used in the eggs of women carrying disease-causing mitochondrial mutations.

The government has recently given the go-ahead for a technique that prevents a woman who carries such a disease from passing it on to her child – making the UK the first country to do so.

This technique has raised some ethical and safety concerns, as it places the woman’s chromosomes into a donor egg with healthy mitochondria. This means that once this egg is fertilised it contains DNA from three people – the DNA in the nucleus comes from the mother and father, and the mitochondrial DNA comes from the egg donor.

This new technique is of interest because if it were effective and safe in humans, it could offer a way to prevent mitochondrial diseases without the need for the donor egg. This technique shows promise, but is still in its early stages. It has thus far only been tested in mice, and in human-mouse hybrid egg cells carrying mutated human mitochondria in the lab.

It is also specifically aimed at women who have a mixture of normal and mutated mitochondrial DNA, as it relies on the normal mitochondrial DNA being there to "take over" once the mutated DNA has been reduced. It would not work in women who have only mutated mitochondria, and there may be a certain level of normal mitochondrial DNA that needs to be present for the technique to work.

All of these issues are likely to be investigated in future studies.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Hopes raised for new genetic therapy to prevent inherited diseases. The Guardian, April 23 2015

Scientists develop technique that could stop a genetic disease being passed on to future generations. The Independent, April 23 2015

Links To Science

Reddy P, Ocampo A, Suzuki K, et al. Selective Elimination of Mitochondrial Mutations in the Germline by Genome Editing. Cell. Published online April 23 2015

Categories: NHS Choices

Air pollution linked to silent strokes

Fri, 24/04/2015 - 13:00

"Adults who live in towns and cities suffer ageing of the brain and increased risk of dementia and [silent] strokes because of air pollution," The Daily Telegraph reports.

A "silent stroke" (technically known as a covert brain infarct) are small areas of damage caused by lack of oxygen to the brain tissue, but are not severe enough to cause obvious symptoms. They may be a sign of blood vessel disease, which increases the risk of one type of dementia (vascular dementia).

This headline is based on a study which took brain scans of more than 900 older adults and assessed their exposure to air pollution. It found that higher levels of small particles in the air around where an individual lived were associated with a greater likelihood of them having signs of a "silent stroke" on a brain scan.

There was some evidence of association between the particles and slightly smaller brain volume, but this link did not remain once people’s health conditions were taken into account.

Limitations of the study include that the researchers could only estimate people's air pollution exposure based on the average air quality where they lived in a single year, rather than their lifetime exposure. It should also be noted that the news has suggested a link to dementia, but the study did not actually assess this.

The findings need to be investigated in future studies before firm conclusions can be drawn.

If you are concerned about air pollution, then the Department for Environment, Food & Rural Affairs (DEFRA) provides alerts when pollution is known to be high or very high in a particular region.

Where did the story come from?

The study was carried out by researchers from Beth Israel Deaconess Medical Center and other centres in the US. It was funded by the US National Institutes of Health and the United States Environmental Protection Agency.

The study was published in the peer-reviewed medical journal Stroke.

The Daily Telegraph headline suggests that air pollution could increase a person’s risk of dementia, but this is not what the study assessed, and none of the participants had dementia, a stroke or mini-stroke (also known as a transient ischaemic attack).

They also suggest that it is living in towns and cities that increases risk, but this was not what the study assessed. It compared people with different levels of particulate matter in the air where they lived, not whether they lived in towns and cities, and in their main analyses they did not include people living in rural areas far from major roads.

The Mail Online similarly overstates findings, by stating that "living near congested roads with high levels of air pollution can cause ‘silent strokes’". While an association was found, a direct cause and effect relationship remains unproven.

 

What kind of research was this?

This was a cross-sectional analysis assessing whether there was a link between air pollutant exposure and changes in the brain linked to ageing.

The authors report that long-term exposure to air pollution is associated with, for example, increased risk of stroke and cognitive impairment. However, its effects on the structure of the brain are not known. If air pollution is linked to structural brain changes, these could, in turn, contribute to the risk of stroke and cognitive problems.

This type of study can show links between two factors, but cannot prove that one caused the other. As the study was cross-sectional, it cannot establish the sequence of events and whether exposure to air pollution came before any differences or changes in brain structure. As an observational study, there may also be factors other than air pollution exposure that could be causing the differences seen. The researchers did take steps to try to reduce the impact of other factors, but they may still be having an effect.

 

What did the research involve?

The researchers took brain scans of 943 adults aged 60 and over. They also estimated their exposure to air pollution, based on where they lived. They then analysed whether those with more exposure to air pollution were more likely to have smaller brain volume or signs of damage.

Participants in this study were taking part in an ongoing longitudinal study in the New England region of the US. Only those who had not had a stroke or mini-stroke and did not have dementia were selected to take part.

The type of effects on the brain that the researchers were looking for were referred to as "subclinical". This means that they did not cause the people to have symptoms and therefore would not normally be detected.

They looked at total volume of the brain and also the volume of the specific parts of the brain using a magnetic resonance imaging (MRI) brain scan. The brain shrinks gradually with age, so the researchers were interested in whether pollution might have a similar effect. The MRI also identified whether the brain showed signs of a "silent stroke" – that is, parts of the brain tissue that had been damaged by having the blood supply interrupted.

These "covert brain infarcts" were not severe enough to cause symptoms, in the form of a stroke or mini-stroke. However, this damage suggests that the person may have some degree of blood vessel (vascular) disease. They are often seen in the brain scans of people who have vascular dementia.

The researchers used satellite data measuring the level of small particles (PM2.5) in the air over New England to assess average daily air pollution exposure at each participant's current home address in 2001. They also assessed how close each home was to roads of different sizes. The researchers only looked at those living in urban and suburban areas in their main analyses.

They then looked at whether there were any links between estimated particulate matter exposure and distance from roads and brain findings.

They first took into account confounding factors that could affect results, including:

  • age
  • gender
  • smoking
  • alcohol intake
  • education

They then carried out a second analysis, taking into account a number of additional factors, such as:

  • diabetes
  • obesity
  • high blood pressure

 

What were the basic results?

Average (median) daily exposure to small particles in the air was about 11 micrograms per cubic metre of air, and participants lived an average of 173 metres from a major road. The participants were, on average, 68 years old when they had their brain scan, and 14% showed signs of a "silent stroke" on the scans.

The researchers found that greater estimated exposure to air pollution was associated with a slightly smaller total brain volume. Each two microgram per cubic metre increase in particulate matter was associated with a 0.32% lower brain volume. However, once this analysis was adjusted for conditions such as diabetes, this difference was no longer statistically significant.

Greater estimated exposure to air pollution was also associated with a higher likelihood of having signs of "silent stroke" damage to the brain tissue. Each two microgram per cubic metre increase in particulate matter was associated with 37% higher odds of this silent damage (odds ratio (OR) 1.37, 95% confidence interval (CI) 1.02 to 1.85).

They did not find differences in association across areas with different average income brackets. Distance from a major road was not linked to total brain volume or a "silent stroke" after adjustment for confounders.

 

How did the researchers interpret the results?

The researchers concluded that their findings "suggest that air pollution is associated with insidious effects on structural brain aging, even in dementia and stroke-free persons".

 

Conclusion

This cross-sectional study has suggested a link between exposure to small particles in the air (one form of pollution) and the presence of "silent stroke" in older adults – small areas of damage to the brain tissue that are not severe enough to cause obvious symptoms.

There are a number of limitations to be aware of when assessing the results of this study:

  • While there was an association between particulate matter in the air and total brain volume, this was no longer statistically significant after taking into account whether people have conditions such as high blood pressure, which can also affect their risk of stroke.
  • While the researchers did try to take into account factors such as smoking, alcohol intake and diabetes, which could be having an effect on risk, this may not remove their effect totally. There may also be various other unmeasured factors that could account for the association seen. This makes it difficult to be sure whether any link seen is directly due to the pollution itself.
  • The researchers could only estimate people's air pollution exposure based on the average air quality where they lived in a single year. This may not provide a good estimate of a person's lifetime exposure.
  • While the news extrapolated these findings to suggest a link between air pollution and people’s risk of dementia, this is not what the study assessed. While areas of "silent stroke" can often be seen in people who have vascular dementia, none of the study participants had dementia, or a stroke or mini-stroke.

Overall, this study finds some evidence of a link between one measure of air pollution and "silent stroke", but the limitations mean that this finding needs to be confirmed in other studies.

It is also not possible to say whether the link exists because air pollution is directly affecting the brain.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Air pollution could increase risk of dementia. The Daily Telegraph, April 23 2015

Living near busy roads 'can raise dementia risk': Exposure to sooty particles alters structure of the brain. Mail Online, April 24 2015

Links To Science

Wilker EH, Preis SR, Beiser AS, et al. Long-Term Exposure to Fine Particulate Matter, Residential Proximity to Major Roads and Measures of Brain Structure. Stroke. Published online April 23 2015

Categories: NHS Choices

New asthma treatment within five years, researchers hope

Thu, 23/04/2015 - 12:50

"Asthma cure could be in reach," The Independent reports. Researchers have discovered that protein molecules called calcium-sensing receptors play a pivotal role in asthma. Drugs known to block these proteins already exist.

In asthma, the immune system mistakes harmless substances, such as pollen, for a threat. White blood cells and inflammatory proteins then collect in the airways. The inflammation causes the airways to constrict, leading to the breathing difficulties associated with asthma. This study found these proteins stimulate calcium-sensing receptors, which leads to further inflammation of the airways.

The research used mouse models of asthma and human airway tissue taken from asthmatic and non-asthmatic people. The researchers found increased numbers of these calcium-sensing receptors compared with healthy lung tissue. They concluded that this is one of the reasons for the exaggerated inflammatory response that occurs in asthma.

Calcilytic drugs, which were developed to treat osteoporosis, are known to block the actions of the receptors. They reduced inflammation of the airways when used in mice.

However, it is not clear that calcilytics could be a "cure" for asthma, as the initial inflammatory response by the immune system would still occur.

Though calcilytic pills are safe as a treatment for osteoporosis, it is not known whether the dose required to be effective in reducing the inflammation found in asthma would be safe.

The researchers plan to develop a version of the drug that can be inhaled to maximise its effectiveness and minimise side effects. They expect human trials to commence in a couple of years.

Where did the story come from?

The study was carried out by researchers from Cardiff University, the Open University, the Mayo Clinic, and the University of California, San Francisco School of Medicine in the US, and the University of Manchester and King's College London in the UK.

It was funded by Asthma UK, the Cardiff Partnership Fund, Marie Curie Initial Training Network, the Biotechnology and Biological Sciences Research Council, and the US National Institutes of Health.

Four of the authors report they are co-inventors of a patent for the use of calcium-sensing receptor antagonists for the treatment of inflammatory lung diseases.

The study was published in the peer-reviewed journal Science Translational Medicine.

The media reported the story accurately, although headlines saying that an asthma "cure" is five years away are a little premature. No clinical studies in people have been conducted yet, and there is no guarantee they will work. However, the "five-year cure" claim came from the researchers themselves.  

What kind of research was this?

This was a set of laboratory experiments involving mouse models of asthma and samples of human lung tissue. The researchers aimed to better understand the inflammation that causes narrowing of the airways in asthma.

The inflammation is an exaggerated response to various triggers, such as pollen, infections and pollutants, but sometimes no cause is identified.

Recent research found that this inflammation results in the build-up of two proteins: eosinophilic cationic protein (ECP) and major basic protein. These proteins carry multiple positive electrical charges.

The researchers wanted to test the theory that the inflammation is driven by these proteins activating another type of protein molecule called calcium-sensing receptors (CaSR) on the surface of the smooth muscle cells that line the airways.  

What did the research involve?

The researchers conducted a variety of laboratory experiments, which involved looking at human lung tissue samples taken from people with asthma and comparing them with healthy lung tissue. They then performed several studies comparing mice with a type of asthma with healthy controls.

The researchers first compared the number of CaSRs in the lung tissue of people with asthma, compared with healthy lung tissue. They then measured how the CaSRs reacted to positively charged proteins and various chemicals involved in inflammatory response, such as histamine.

They repeated the experiments using a type of drug called a calcilytic, which blocks CaSRs. Calcilytic drugs were developed as a treatment for osteoporosis, as they increase the level of parathyroid hormone by targeting CaSRs. This helps to increase the level of calcium in the blood. 

What were the basic results?

The experiments indicated that people with asthma have more CaSRs, and that these receptors are required for the inflammation. Calcilytic drugs blocked the receptors.

There were three times as many CaSRs in biopsies of smooth muscle taken from the airways of people with asthma as in those from people without asthma. The same was true for biopsies from mice with a form of asthma, compared with healthy controls.

In the laboratory setting, positively charged proteins and chemicals such as histamine activated the CaSRs, causing an inflammatory response. These receptors could be blocked by the calcilytic drugs.

Mice without CaSRs in their smooth muscle cells did not have an inflammatory response to the positively charged proteins. Healthy control mice did have an inflammatory response. Calcilytic drugs were able to reduce the effect of these proteins and other inflammatory stimulants tested. 

How did the researchers interpret the results?

The researchers concluded that there are more CaSRs in the lungs of people with asthma, and this contributes to the inflammation that causes narrowing of the airways.

They say that calcilytic drugs could reduce the number of CaSRs and reduce their responsiveness. This could both "prevent as well as relieve AHR [airways hyper-responsiveness]", which is found in asthma.

The researchers do not yet know if their findings would be true for all types of asthma. 

Conclusion

This piece of research has found that CaSRs play a role in the inflammatory response seen in asthma. The early results of laboratory experiments indicate that drugs called calcilytics can dampen this inflammatory response in asthmatic human lung tissue and in mice with asthma.

Though the media described this as a "cure" for asthma, the study has not proved this. It showed that there were more CaSRs in lung samples from people with asthma than in healthy lung tissue.

The researchers also showed that calcilytics can block the receptors. What is not known is how long this effect would last, and whether it would stop the lungs producing excessive numbers of receptors.

It remains unclear why people with asthma in this study had an increased number of receptors, and if this is true for everyone with asthma.

The researchers predict that if calcilytics prove to be effective in clinical trials, it will take around five years for them to become available as a treatment for asthma.

This is because, although this drug has been deemed a safe treatment for osteoporosis, the researchers intend to develop the drug so it can be used as an inhaler. This would deliver it straight to the lungs to maximise the effectiveness and minimise side effects.

Drug development will involve further animal trials to work out what dose would be required to achieve clinically meaningful results, and will also test its safety. If these trials are successful, the research will progress to human trials.

This is an exciting piece of research that may provide a new treatment for asthma, but it is still early days, so there are no guarantees.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Asthma cure could be in reach as scientists make 'incredibly exciting' breakthrough. The Independent, April 22 2015

Asthma could be cured within five years after drug breakthrough. The Daily Telegraph, April 22 2015

Major asthma breakthrough as scientists discover root cause of the condition - and say a new treatment is less than 5 years away. Mail Online, April 23 2015

Cardiff University scientists discover asthma's root cause. BBC News, April 22 2015

Scientists discover root cause of asthma and believe bone drug could be cure. Daily Express, April 23 2015

Links To Science

Yarova PL, Stewart AL, Sathish V, et al. Calcium-sensing receptor antagonists abrogate airway hyperresponsiveness and inflammation in allergic asthma. Science Translational Medicine. Published online April 22 2015

Categories: NHS Choices

A magnet for mosquitoes? Blame your genes

Thu, 23/04/2015 - 11:30

"Mosquitoes 'lured by body odour genes','' BBC News reports. Researchers tested a series of non-identical and identical twins, and found identical twins had similar levels of attractiveness to mosquitoes.

Researchers have long known that some people are more attractive to mosquitoes than others, and some think this is to do with body odour.

Body odour is, in part, inherited through our genes, so the researchers running this study wanted to find out whether twins with identical genes shared a similar level of attractiveness to mosquitoes.

They exposed the hands of sets of identical and non-identical twins to mosquitoes to see which twin the mosquitoes preferred.

The results showed identical twins were likely to have about the same level of attractiveness to mosquitoes, while non-identical twins' results differed more. This strongly suggests there is a genetic component, in the same way there is for height and IQ.

This could explain why one half of a couple is plagued by mosquitoes on holiday, while the other will be blissfully free of any bites. The research could eventually help scientists develop better insect repellents.  

Where did the story come from?

The study was carried out by researchers from the London School of Hygiene and Tropical Medicine, the University of Florida, the University of Nottingham and Rothamsted Research. It was funded by the Sir Halley Stewart Trust.

The study was published in the peer-reviewed medical journal PLOS One, which is an open-access journal, meaning the study can be read for free online.

Generally, the media reported the study accurately, but did not question the reliability of results from the fairly small sample size (a total of 74 participants).

The Daily Telegraph suggested that using insect repellent made no difference to people with a genetic predisposition to being bitten, but the study did not look at insect repellent, so we don't know if that is true. 

What kind of research was this?

This was a laboratory-based twin study, which compared the relative attractiveness to mosquitoes of pairs of twins.

The researchers wanted to know whether identical twins, who share the same genes, were more likely than non-identical twins, whose genes differ, to have the same level of attractiveness to mosquitoes.

Twin studies are useful ways to show how likely a particular trait is to be inherited. However, they can't tell us any more than that – for example, which gene is involved, or how genetics affects the trait. 

What did the research involve?

Researchers took 18 pairs of identical twins and 19 pairs of non-identical twins. They tested them for attractiveness to mosquitoes by releasing the insects into a Y-shaped tube with two sections.

Each twin put a hand into the top of one section, and the researchers counted the number of mosquitoes that flew up each side of the tube. They then looked at whether the results were closer for identical twins than for non-identical twins.

The researchers did a series of experiments, testing the twins individually against clean air, and also pairing them against each other. They tried to avoid bias in the study by using randomisation to decide which side of the tube was used by which twin, and which twin was tested first.

All the twins were post-menopausal women. They had also been asked to avoid strong-smelling food such as garlic or chilli, to avoid alcohol, and to have washed their hands with odour-free soap before the experiment.

The researchers also checked the twins' temperatures to see whether body temperature had any effect on the results. They used Aedes aegypti mosquitoes, the species that transmits dengue fever.

They analysed the data in two sets – firstly, which twin was more attractive to mosquitoes when tested against clean air, and then which was more attractive when tested against the other twin.

As well as seeing which tube the mosquitoes flew into (used to measure relative attraction), the researchers also counted how many flew at least 30 centimetres up the Y-shaped tube (used to measure flight activity).

The researchers used an average of 10 measurements for each twin to estimate the proportion of attractiveness that was down to heritable factors. 

What were the basic results?

The study found identical twins were much more likely to share the same level of attractiveness to mosquitoes than non-identical twins.

The study gives an estimate that 62% (standard error 12.4%) of relative attraction (the chances of the mosquitoes choosing that person's tube) was down to heritable factors, along with 67% (standard error 35.4%) of flight activity (the chance of the mosquitoes flying 30 centimetres up the tube).

The researchers say this would put attractiveness to mosquitoes at a level similar to height and IQ in terms of how much of it is inherited.

How did the researchers interpret the results?

The researchers say their results "demonstrate an underlying genetic component detectable by mosquitoes through olfaction". In other words, the study showed genetic differences account for at least some of the relative attractiveness of people to mosquitoes, and the difference is smelt by the insects.

They go on to suggest some people may have developed a body odour that is less attractive to mosquitoes, which could then have been handed down through natural selection of favourable genes, as it would protect against diseases such as dengue fever and malaria.

However, the researchers warn that the relatively small sample size and the nature of the experiment means they can't be precise about their conclusions. The standard error rates on their estimates of heritability are quite high, showing the level of uncertainty. 

Conclusion

This research suggests the genes you inherit from your parents may determine your chances of being bitten by mosquitoes. However, the small size of the study limits how confident we can be in the results.

The researchers suggest differences in body odour determine how attractive a person is to mosquitoes. We know body odour is partly down to inherited genetic factors, so it would make sense that inherited body odour can make you more or less attractive to mosquitoes.

However, the study doesn't tell us whether the mosquitoes were attracted to people because of their body odour, or for some other reason that wasn't researched.

A lot more research needs to be done into which inherited components of body odour are linked to attractiveness to mosquitoes before scientists can use this information to produce better mosquito repellents.

At this stage, we don't know whether people who get bitten less often have less of a mosquito-attractive chemical in their body odour, or more of a mosquito-repellent chemical.

If you get bitten by mosquitoes more than other people, and one or both of your parents does too, this research suggests you might have inherited the susceptibility to being bitten.

Unfortunately, at this stage, there's not much you can do about it, except for wearing insect repellent. Wearing light, loose-fitting trousers rather than shorts, and wearing shirts with long sleeves may also help. This is particularly important during the early evening and at night, when mosquitoes prefer to feed.

If you are travelling to an area where mosquitoes are known to carry malaria, it's vital to get medical advice about which type of antimalarial medication you should take. You may need to start taking the medication before you leave the country, so it's important to plan ahead.

Read more about antimalarial medication.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Mosquitoes 'lured by body odour genes'. BBC News, April 23 2015

Do YOU always get bitten by mosquitoes? Blame your parents: Being attractive to bugs is genetic, scientists say. Mail Online, April 23 2015

Chance of being bitten by mosquito is written in genes. The Daily Telegraph, April 22 2015

Some people are BORN to be bitten by mosquitoes - genes can make us more attractive to the bugs. Daily Mirror, April 22 2015

Genes and body odour determine chance of mosquito bites, scientists find. Daily Express, April 23 2015

Mosquito Bite? It May Be Your Parents' Fault. Sky News, April 22 2015

Links To Science

Fernández-Grandon GM, Gezan SA, Armour JAL, et al. Heritability of Attractiveness to Mosquitoes. PLOS One. Published online April 22 2015

Categories: NHS Choices

Athlete’s foot cream could also treat multiple sclerosis

Wed, 22/04/2015 - 12:10

"Two common drugs – one used for treating athlete's foot and another for alleviating eczema – may be useful therapies for multiple sclerosis," BBC News reports. The drugs have shown promise in lab and animal studies.

Multiple sclerosis (MS) is a neurological condition caused by damage to myelin. Myelin is a fatty substance that acts as a protective layer around individual nerve fibres.

In this study, researchers screened a number of drugs used for other conditions in the lab to see if any could prompt the production of mature cells that help replace damaged myelin.

One of the chemicals they identified as promising in their screen was miconazole, which is the active ingredient in some types of antifungal creams used to treat athlete’s foot. They found that it increased the number of mature myelin-producing cells in the brains of baby mice. It also helped repair damaged myelin in a mouse model of MS, and this made the mice’s symptoms less severe.

Clobetasol, a steroid cream used to treat psoriasis and eczema, also showed promise.

This is an early-stage study, and researchers hope they can eventually go on to test the drugs, or similar chemicals, in people with MS. Researchers will need to establish how safe this drug is if taken orally, and what effect it has in humans with the condition.

 

Where did the story come from?

The study was carried out by researchers from the Case Western Reserve University School of Medicine, and other research centres in the US. The study was funded by the US National Institutes of Health, New York Stem Cell Foundation, Myelin Repair Foundation, Mt. Sinai Health Care Foundation, the Case Comprehensive Cancer Center, the CWRU Council to Advance Human Health and philanthropic support from individual families. The authors declared that they did not have competing financial interests.

The study was published as a letter in the peer-reviewed scientific journal Nature.

BBC News gives a good, balanced report of this study, noting the early stage of the findings, and warning of the potential risks of people self-medicating.

The Daily Telegraph reports the study reasonably well, but refers to the drug as a possible "cure", when it is too early to talk about the drug in these terms.

 

What kind of research was this?

This was laboratory and animal research that aimed to identify known human drugs that can prompt immature oligodendrocytes (called progenitor cells) to mature. Mature oligodendrocytes are the cells that "insulate" nerves with myelin. This myelin sheath helps nerves to send messages, and damage to the myelin sheath causes conditions such as multiple sclerosis (MS). One way to repair this damage might be to prompt the body to make more oligodendrocytes.

This type of screening of large amounts of chemicals at once is a quick way to find promising chemicals. These drugs need to be shown to be effective and safe in animal models before they can be used in humans. If a drug is already licensed for another condition in humans this can make progress to human trials quicker if it is going to be given at a similar dose and in the same way. However, if dose or how the drug is given are likely to differ for the new condition, safety would still need to be established in animals first.

 

What did the research involve?

The researchers tested more than 700 drugs on mouse oligodendrocyte progenitor cells in the lab. These immature cells are derived from stem cells, and the researchers singled out the drugs that caused them to develop into mature oligodendrocytes. They then tested their effects in brain tissue and in mice, including mouse models of MS, as well as on human oligodendrocyte progenitor cells in the lab.

In human MS, the immune system mistakenly attacks the body’s own myelin. The researchers used two different mouse models of the disease. In one "immune driven" model the mice’s immune system is actively attacking the myelin, and this mimics the relapsing remitting form of MS. In the second model the immune system is not as active, and there is a more chronic, progressive loss of myelin.

 

What were the basic results?

The research identified 22 drugs that prompted the oligodendrocyte progenitor cells in the lab to mature. They then picked the two drugs that were the best at getting the precursor cells to mature in brain tissue from young mice in the lab. These drugs were miconazole, which is currently used in antifungal creams, and clobetasol, which is a steroid used in creams (topical corticosteroid) for skin conditions such as psoriasis and eczema. They also found that both drugs prompted human oligodendrocyte progenitor cells to mature in the lab. Of the two drugs, miconazole had the greater effect.

They found that giving the drugs to baby mice increased the number of myelin-producing cells in their brains. They also helped repair damaged myelin in the spinal cords of mice treated with a myelin-damaging chemical.

In the "immune driven" mouse model of MS, injections of clobetasol – but not miconazole – dampened down the immune response and reduced the severity of the mice’s symptoms. Steroids are known to affect the immune system, so the researchers had expected this. In the chronic mouse MS model, which has hind-leg paralysis, both clobetasol and miconazole injections helped to re-myelinate damaged nerves in the spinal cord and improved the mice’s movements.

Most existing MS drugs act by affecting the immune system, but miconazole did not appear to do this, so the researchers felt it showed more promise as a new way to treat the disease. To confirm their results, they had another lab replicate the miconazole findings in the chronic MS mouse model.

 

How did the researchers interpret the results?

The researchers concluded that their screening system allowed them to rapidly identify drugs that have potential for re-myelination. This allowed them to identify two existing human drugs – miconazole and clobetasol – which increase re-myelination of nerves and "significantly reduce disease severity in mouse models of MS". They say that this "raises the exciting possibility that these drugs, or modified derivatives, could advance into clinical trials for the currently untreatable chronic progressive phase of MS".

 

Conclusion

This laboratory and mouse study has identified two drugs currently used for skin conditions – miconazole and clobetasol – that showed promise for treatment of conditions caused by myelin damage, such as MS.

If a drug is already licensed for another condition in humans, this can make progress to human trials quicker if it is going to be given at a similar dose and in the same way. However, as the researchers point out, these two drugs are licensed for use on the skin – not to be taken orally or injected into the system. This means more work will be needed to ensure the drugs are safe enough to be used in this way in humans. The drugs' chemical structures may need to be modified to make them work efficiently and reduce side effects.

Existing MS treatments act by dampening down the immune system, which attacks the myelin, so drugs that act in a different way, by repairing the myelin damage, could bring additional benefit. As yet, research into these drugs for MS is at an early stage, but many people will await with interest to see whether this early promise translates into better treatments.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Athlete's foot drug may be MS therapy. BBC News, April 20 2015

Creams used to treat athlete's foot and eczema 'could REVERSE multiple sclerosis'. Mail Online, April 21 2015

Common athlete's foot cream 'could reverse multiple sclerosis'. The Daily Telegraph, April 20 2015

Links To Science

Najm FJ, Madhavan M, Zaremba A, et al. Drug-based modulation of endogenous stem cells promotes functional remyelination in vivo. Nature. Published online April 20 2015

Categories: NHS Choices

Coffee could make breast cancer drug tamoxifen more effective

Wed, 22/04/2015 - 12:00

"A cancer-killing cocktail of the hormone drug tamoxifen and two coffees every day was found to reduce the risk of [breast cancer] tumours returning," the Mail Online reports. The same study also found evidence that caffeine slowed the cancer's growth.

The study looked at coffee consumption among 1,090 women with breast cancer, about half of whom were treated with tamoxifen.

Tamoxifen is a hormonal treatment used to treat cases of breast cancer known to be associated with the hormone oestrogen (known as oestrogen-dependent breast cancer).

The study found that women who reported drinking two to five cups of coffee a day had smaller primary tumours and a lower proportion of oestrogen-dependent tumours than those who drank one cup of coffee or less.

Women with oestrogen-dependent breast cancer who were treated with tamoxifen, and who drank at least two to five cups of coffee daily, had half the risk of the cancer recurring compared with those who drank less.

Researchers also carried out a laboratory study on the effect of two substances found in coffee – caffeine and caffeic acid – on breast cancer cells. They found that the substances suppressed the growth of breast cancer cells.

Although the results of this study are interesting, it cannot prove that coffee has an effect on breast cancer, as other factors, called confounders, could have influenced the results. 

There is no harm in women being treated with tamoxifen for breast cancer drinking coffee in moderation. However, drinking excessive amounts can cause irritability, insomnia and indigestion.

 

Where did the story come from?

The study was carried out by researchers from Lund University and Skane University Hospital in Sweden, and the University of Bristol in the UK. It was funded by various grants from Swedish organisations, including the Swedish Cancer Society and the Swedish Research Council.

The study was published in the peer-reviewed journal Clinical Cancer Research.

The Mail Online’s reporting of the results implies it was proven that coffee prevents the return of oestrogen-dependent breast cancer or enhances the action of tamoxifen. This is not the case, though the initial results are encouraging.

The Mail Online also did not include any comments on the study from independent experts. As a result, there is a risk that millions of women taking tamoxifen will start worrying about how much coffee they should drink.

There are no official UK guidelines on caffeine consumption, but regularly drinking more than 400 milligrams (mg) of caffeine a day (around four cups of brewed coffee or two "energy drinks") could cause side effects.

 

What kind of research was this?

This was a cohort study of 1,090 women with primary invasive breast cancer, living in Sweden. It is a follow-up to a study published in 2013 by the same researchers, which used a smaller number of women from the same cohort. The researchers say their previous study found an association between moderate to high coffee consumption and improved survival rates among women with breast cancer who were treated with tamoxifen. The aim of this study was to investigate the association between coffee consumption, cancer characteristics and survival rates in a larger cohort of women with breast cancer.

Some breast cancer tumours rely on oestrogen to grow. These are called oestrogen-receptor (ER) positive cancers (the convention in classifying these types of cancer is to use the American spelling of oestrogen, which is estrogen; hence the ER).

Tamoxifen is the main hormonal therapy drug given for these types of breast cancer, as it blocks oestrogen from reaching the cancer cells. This reduces or stops the cells from growing.

The researchers also performed studies in the laboratory using human breast cancer cells to look at possible mechanisms by which two substances in coffee – caffeine and caffeic acid – may affect breast cancer growth.

 

What did the research involve?

For the cohort study, 1,090 women aged 24 to 99 who had been diagnosed with primary invasive breast cancer between 2002 and 2012 were recruited. Before undergoing surgery, the women’s body measurements and breast volume were taken and they were given an extensive questionnaire on their reproductive history, medication use and lifestyle factors, including smoking, alcohol and coffee consumption.

Coffee consumption was categorised into low (one cup or less a day), moderate (two to four cups a day) or high (five or more cups a day).

The researchers obtained information from pathology reports and medical records about tumour size and grade, whether it had spread to any lymph nodes, and if the tumour was hormone-receptor positive.

The women were followed until either their first breast cancer recurrence, their last disease-free follow-up or their death, whichever came first, before January 2013. Follow-up information on whether the breast cancer came back or whether the women had died was obtained from various official records. The results were analysed using standard statistical methods and adjusted for other factors, such as tumour size.

In their laboratory study, human breast cancer cells were exposed for 48 hours to caffeine or caffeic acid, with or without tamoxifen. The researchers used breast cancer cells that were ER positive, ER negative or cells that were resistant to tamoxifen. A minimum of three independent repeats were performed for each experiment.

 

What were the basic results?

The main results of the cohort study were:

  • women who reported a moderate to high coffee intake had smaller invasive primary tumours compared to those with low coffee consumption
  • moderate to high coffee intake was also associated with a lower proportion of ER positive tumours compared to patients with low consumption
  • moderate to high coffee consumption was associated with a 49% lower risk for breast cancer recurrence in women with ER positive tumours being treated with tamoxifen (adjusted hazard ratio 0.51; 95% confidence interval 0.26-0.97)

In the laboratory, caffeine and caffeic acid suppressed the growth of both ER positive and ER negative cancer cells. Caffeine and caffeic acid also had other effects on breast cancer cells, which led to slower cell growth and enhanced cell death.

 

How did the researchers interpret the results?

The researchers say their findings demonstrate the various anticancer properties of caffeine and caffeic acid against both ER positive and ER negative breast cancers. In particular, they suggest that coffee may sensitise tumour cells to tamoxifen and therefore reduce breast cancer growth. It is possible, they say, that the substances in coffee switch off signalling pathways that cancer cells need to grow.

 

Conclusion

This study is interesting, but has several limitations. Its first finding was that women who reported higher coffee consumption had smaller breast tumours, and that their cancers were less likely to be ER positive. However, it seems the women only reported their coffee consumption once, after diagnosis, and it is unclear from the write-up whether the questionnaire referred to their past or present coffee-drinking habits. The women may also have under- or overestimated their coffee consumption, especially if they were asked to recall it over a long period. The accuracy of the reported consumption is further undermined by the fact that the study did not define a standard size for a "cup" of coffee.

The second finding was that among women with ER positive cancer being treated with tamoxifen, higher coffee consumption was associated with better results. This sounds promising, especially when taken with the results of the laboratory study, but it is always possible that confounders might have affected the results.

Tamoxifen is normally only used in women with ER positive cancer who have not yet gone through the menopause. It is therefore unclear whether a similar effect would be seen in post-menopausal women, who require a different type of hormonal treatment, such as aromatase inhibitors.

Further research is required on the possible association between coffee consumption and breast cancer risk, as it could lead to new treatments.

However, it should be noted that over-consuming coffee can have negative side effects. Regularly drinking more than five cups of coffee a day can cause insomnia, irritability, an upset stomach and palpitations.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Coffee 'can cut risk of breast cancer tumours returning': Two cups a day found to reduce chance by half. Mail Online, April 21 2015

 

Links To Science

Rosendahl AH, Perks CM, Zeng L, et al. Caffeine and Caffeic Acid Inhibit Growth and Modify Estrogen Receptor and Insulin-like Growth Factor I Receptor Levels in Human Breast Cancer. Clinical Cancer Research. Published online April 15 2015

Categories: NHS Choices

Mindfulness 'as good as drugs for preventing depression relapse'

Tue, 21/04/2015 - 14:00

"Mindfulness-based cognitive therapy may be as good as pills at stopping people relapsing after recovering from major bouts of depression," The Guardian reports.

Researchers wanted to see if a type of therapy known as mindfulness-based cognitive therapy (MBCT) could be an effective alternative treatment to antidepressants for people with major depression at high risk of relapse.

MBCT combines the problem-solving approach of cognitive behavioural therapy (CBT) with mindfulness techniques. These are designed to fix your awareness on the "here and now" instead of having unhelpful thoughts about the past and the future.

In a two-year clinical trial, people already taking antidepressants were either assigned to an MBCT programme, with a view to reducing or stopping their medication, or asked to continue antidepressants alone. With support from their GP and therapist, around 70% of the mindfulness group were able to stop taking antidepressants.

The trial suggests MBCT might help some people with major recurrent depression reduce or cut out their medication. However, between four and five people out of every 10 in the trial relapsed within two years, regardless of their treatment. Depending on your perspective, the treatments were equally good or equally bad.

Research does suggest that mindfulness can benefit all of us, not just people with a history of severe depression. Read more about mindfulness for mental wellbeing.

Where did the story come from?

The study was led by researchers from Oxford University and was funded by the National Institute for Health Research.

Two authors, including the first author, are co-directors of the Mindfulness Network Community Interest Company and teach nationally and internationally on mindfulness-based cognitive therapy. The other authors declare no competing interests.

The study was published in the peer-reviewed medical journal The Lancet on an open-access basis, so it is free to access online.

The media generally reported the story accurately and overall took a positive spin on the results, with some exceptions. The Daily Telegraph, for example, added some balance by saying that, "Some experts warned that the trial was not large enough to come to a definitive conclusion and had not included a placebo group".

However, few sources mentioned the potential conflicts of interest. Some did not recognise that MBCT, like antidepressants, is already a recommended treatment option in national guidelines on depression for England and Wales for the prevention of relapse.

The Mail Online's headline, "Meditation is as effective as drugs for treating depression", is also quite careless, as this may give the impression that this is the kind of meditation that may be practised in a yoga class, for example, when it was actually a structured programme of mindfulness-based cognitive therapy.  

What kind of research was this?

This was a single-blind randomised controlled trial (RCT) comparing mindfulness-based cognitive therapy with antidepressant treatment to prevent the relapse or recurrence of depression.

People with depression often have relapses, and an increasing number of past episodes or ongoing health or life problems can increase the risk of further relapses. People who have had three or more depression episodes are reported to have relapse rates as high as 80% over two years.

For people at high risk of relapse, taking antidepressants for at least two years is the current recommended treatment. However, psychological therapies, including mindfulness-based cognitive therapy (MBCT), are also a recommended option.

It may be given alongside antidepressant treatment, as an alternative for people who cannot, or do not want to, take antidepressants for this long, or for people who have not responded to antidepressants.

MBCT is a psychosocial intervention specifically designed to teach people with recurrent depression the skills to stay well in the long term. It uses a combination of problem-solving techniques, as well as teaching people how to focus on their immediate environment instead of dwelling on the past or worrying about their future.

MBCT, say the study team, has been shown to reduce the risk of relapse or recurrence compared with usual care, but has not yet been compared with maintenance antidepressant treatment in an RCT.

The aim of the study was to see whether MBCT, with support to taper or discontinue antidepressant treatment, was better than continuing antidepressants for the prevention of depressive relapse or recurrence over 24 months.

Randomised controlled trials are an appropriate and effective way of testing how well different treatments work, such as MBCT compared with antidepressants. 

What did the research involve?

The study analysed 424 adults from urban and rural areas in the UK. All had a diagnosis of recurrent major depressive disorder (currently in remission), had three or more previous major depressive episodes, and were taking maintenance antidepressants to prevent further relapses.

The recruits were randomly assigned to receive an eight-week MBCT class or continue on maintenance antidepressants (212 in each group). Recurrence of depression was assessed over the following two-year period.

Both groups took antidepressants to begin with. The MBCT intervention was added on top, and included efforts to lessen the use of antidepressants, in consultation with their GP, if they felt they didn't need them or needed less of them.

MBCT is intended to enable people to learn to become more aware of their bodily sensations, thoughts and feelings associated with depressive relapse or recurrence, and to relate constructively to these experiences.

Participants learn mindfulness practices and cognitive-behavioural skills both in sessions and through homework assignments. Therapists provide support to patients in learning to respond adaptively to thoughts, feelings and experiences that might otherwise have triggered a relapse.

The programme involved eight 2¼-hour group sessions, normally over consecutive weeks, with four refresher sessions offered roughly every three months for the following year.

Patients in the maintenance antidepressant group received support from their GPs to maintain a therapeutic level of antidepressant medication in line with prescribing guidelines for the two-year follow-up period.

The main success measure was the time to relapse or recurrence of depression, with patients followed up at five separate intervals over two years. Secondary measures of success were the number of depression-free days, residual depressive symptoms, psychiatric and medical comorbidity, quality of life, and cost effectiveness. 

What were the basic results?

Most people completed the two-year trial (86%). In the MBCT group, 13% did not lower their antidepressant dose, 17% did, and 71% stopped completely.

Time to relapse or recurrence of depression over 24 months did not differ between people in the MBCT group or those taking maintenance antidepressants alone (hazard ratio [HR] 0.89, 95% confidence interval [CI] 0.67 to 1.18). A total of 94 (44%) of 212 patients in the MBCT group relapsed, compared with 100 (47%) of 212 in the maintenance antidepressants group.

Nor did the number of serious adverse events differ. Five adverse events were reported, including two deaths in each of the MBCT and maintenance antidepressants groups. No adverse events were attributable to the interventions or the trial.

MBCT was no better than antidepressants for the number of depression-free days, residual depressive symptoms, psychiatric and medical comorbidity, and quality of life.

The cost effectiveness analysis showed MBCT is not more cost effective than maintenance antidepressants alone. 

How did the researchers interpret the results?

The researchers said that, "We found no evidence that MBCT [combined with support to reduce antidepressant treatment] is superior to maintenance antidepressant treatment for the prevention of depressive relapse in individuals at risk for depressive relapse or recurrence.

"Both treatments were associated with enduring positive outcomes in terms of relapse or recurrence, residual depressive symptoms, and quality of life." 

Conclusion

This trial showed that mindfulness-based cognitive therapy enabled many people at high risk of a relapse of depression to discontinue their medication, with similar relapse rates over a two-year period.

The number of depression-free days, residual depressive symptoms, psychiatric and medical comorbidity, and quality of life ratings were also similar. This suggests the mindfulness programme in the trial may help those who can't, or do not want to, use antidepressant drugs over the long term.

These results are consistent with current national guidelines for the prevention of depression relapse in England and Wales.

These recommend that people with depression who are considered to be at significant risk of relapse – including those who have relapsed despite antidepressant treatment, or who are unable or choose not to continue antidepressant treatment – or who have residual symptoms should be offered one of the following psychological interventions:

  • individual cognitive behaviour therapy (CBT) – for people who have relapsed despite antidepressant medication, and for people with a significant history of depression and residual symptoms despite treatment
  • mindfulness-based cognitive therapy – for people who are currently well but have experienced three or more previous episodes of depression

The results remind us that treatments to prevent depression relapse in this high-risk group don't have a high success rate. Between four and five people out of every 10 in the trial relapsed, regardless of their treatment.

Depending on your perspective, the treatments were equally good or equally bad. This highlights that people at high risk of relapse need to receive tailored care and regular follow-up so they can find the best treatment approach for them.

But this study has a number of limitations. As the researchers say, the people in the trial were all willing to try a psychological treatment and try reducing their antidepressant dose. This may mean the results are not generalisable to all people at high risk of depression relapse.

The people in the study had also already tried antidepressants for relapse prevention. They are not the same as people considering relapse prevention for the first time, who would be discussing which option to try first to prevent further episodes.

There was also no control comparison for MBCT – that is, a control intervention in which people received the same regular group sessions, but without the specific components of the MBCT intervention.

This means the study is less able to provide solid proof that the mindfulness intervention itself is as good as antidepressants for most people with major depression, rather than it just being the regular attention and follow-up that has an effect.

Simply talking to a person could have a significant placebo effect that may improve mood. Larger and longer studies are needed to know this for sure.

This mindfulness intervention was designed specifically to prevent relapses of major depression in those considered to be high risk.

It was not designed or tested to prevent depression in the first place, or to prevent relapse in lower-risk groups (such as those with only one previous episode of depression), and it was not tested here as an initial treatment for depression.

If you are concerned you are depressed, it is usually recommended that the first person you talk to about your concerns is your GP.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Mindfulness as effective as pills for treating recurrent depression – study. The Guardian, April 21 2015

Mindfulness can help prevent relapses of depression as well as anti-depressants, study claims. The Independent, April 21 2015

Mindfulness 'as good as anti-depressants for tackling depression'. The Daily Telegraph, April 21 2015

Depression: 'Mindfulness-based therapy shows promise'. BBC News, April 21 2015

Mindfulness therapy can treat depression as effectively as pills, doctors claim. Daily Mirror, April 21 2015

Meditation is 'as effective as drugs for treating depression': Mindfulness could be offered as an alternative to antidepressants, study claims. Mail Online, April 21 2015

Links To Science

Kuyken W, Hayes R, Barrett B, et al. Effectiveness and cost-effectiveness of mindfulness-based cognitive therapy compared with maintenance antidepressant treatment in the prevention of depressive relapse or recurrence (PREVENT): a randomised controlled trial. The Lancet. Published online April 20 2015

Categories: NHS Choices
