Long-term mobile phone use and brain cancer

NHS Choices - Behind the Headlines - Thu, 13/11/2014 - 08:28

"Do mobile and cordless phones raise the risk of brain cancer?" asks the Mail Online.

There are now more mobile phones than people in the UK, so you would expect the commonsense answer to be a resounding "no". But, as we never get tired of saying, it's a bit more complicated than that.

The Mail Online reports on the latest study looking for evidence of a link between mobile and cordless phone calls and brain tumours. This large Swedish study found more than 25 years' use of mobile phones trebled the (very small) risk of glioma, the most common type of brain tumour.

The study matched healthy volunteers with people who had been diagnosed with a glioma, and asked them to estimate the amount of time they had ever spent using mobile and cordless phones. This ranged from less than one year to around 25 years.

The researchers found:

  • any mobile phone use increased the risk of glioma by a third
  • using 2G phones for 15 to 20 years doubled the risk
  • 3G (smartphone) use for 5 to 10 years gave four times the risk (the research was carried out before the launch of 4G phones)

However, some of these results were based on very small numbers and so may not be reliable. And this type of study cannot prove that mobile phones cause brain tumours.

The analysis did not take into account other factors, such as exposure to chemicals or occupational hazards, despite this information being collected. And even if it had, it could not have accounted for every possible confounder.

It is also rather unlikely that the estimates for the extent of mobile phone usage are accurate. So, it remains unclear whether there are long-term cancer risks associated with mobile phone use.

 

Where did the story come from?

The study was carried out by researchers from the University Hospital in Örebro, Sweden and was funded by Cancer- och Allergifonden, Cancerhjälpen, the Pandora-Foundation for Independent Research, and the Berlin and Kone Foundation, Helsinki, Finland.

It was published in the peer-reviewed medical journal Pathophysiology, and appears to be available on an open-access basis.

The Mail Online has reported the story reasonably accurately, and put the findings into context, citing a previous large study looking at the risk of mobile phone use and brain cancer.

 

What kind of research was this?

This was a case-control study that aimed to see if there was an association between mobile phone use and the development of a type of brain tumour called glioma.

In this study, cases (people who have a glioma) were matched with controls (people of the same age without brain tumours). The researchers then looked at a variety of factors that each group had been exposed to.

This is a type of epidemiological study, which can identify potential risk factors for developing a brain tumour. However, this kind of study cannot prove that any of these factors directly caused the brain tumour.

 

What did the research involve?

The researchers contacted all adults aged 20 to 80 who were newly diagnosed with a brain tumour in central Sweden from 1997-2003, and all cases throughout Sweden aged 18 to 75 from 2007-2009.

They recruited 1,498 (89%) people – 879 men and 619 women. The majority (1,380) had a glioma. The researchers matched each case by age and gender with controls drawn at random from the Swedish Population Registry, obtaining a control group of 3,530 people.
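
As an aside for readers interested in the mechanics of a matched case-control design, here is a minimal Python sketch of how age- and sex-matched controls might be drawn at random from a population register. It is purely illustrative – the Person record and the sampling routine are assumptions, not the researchers' actual procedure.

    import random
    from dataclasses import dataclass

    @dataclass
    class Person:                      # hypothetical registry record
        birth_year: int
        sex: str
        has_brain_tumour: bool = False

    def draw_matched_controls(case, registry, k=2):
        """Randomly draw k controls with the same sex and birth year as
        the case, excluding anyone with a brain tumour."""
        pool = [p for p in registry
                if p.sex == case.sex
                and p.birth_year == case.birth_year
                and not p.has_brain_tumour]
        return random.sample(pool, k)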

A questionnaire was sent to all cases and controls to determine their exposure to mobile phones and cordless desktop phones. As mobile phones have changed during this timescale, the type of mobile phone exposure was recorded, including:

  • first generation – output power 1 watt (W), 900 MHz
  • second generation (2G) – pulsed output power of tens of milliwatts (mW), 900 or 1800 MHz
  • third generation (3G) – output power tens of mW, amplitude modulated

The questions asked about:

  • preferred ear for using a mobile or cordless phone
  • number of years of exposure and average daily use
  • overall working history
  • exposure to different chemicals
  • smoking habits
  • X-ray exposure to the head and neck
  • hereditary traits for cancer

If any of the answers were unclear, a follow-up telephone interview was conducted by someone who was not told whether the person was a case or a control.

The researchers performed statistical analyses to take socioeconomic status into account.

 

What were the basic results?

Any mobile phone use increased the risk of glioma by a third (odds ratio [OR] 1.3, 95% confidence interval [CI] 1.1 to 1.6).

More than 25 years' use of mobile phones trebled the risk of glioma (OR 3.0, 95% CI 1.7 to 5.2). This was based on 29 cases and 33 controls.
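
For readers curious how figures like these are derived, the sketch below computes an odds ratio and its approximate 95% CI from a 2×2 table. The exposed counts are the 29 cases and 33 controls quoted above; the unexposed counts are invented for illustration (they happen to reproduce the published figures, but the study's real counts may differ).

    import math

    # 2x2 table for the exposure ">25 years of mobile phone use"
    exposed_cases, exposed_controls = 29, 33        # quoted in the text
    unexposed_cases, unexposed_controls = 100, 340  # invented for illustration

    # Odds ratio: odds of exposure among cases / odds among controls
    odds_ratio = (exposed_cases * unexposed_controls) / \
                 (exposed_controls * unexposed_cases)

    # Approximate 95% CI on the log scale (Woolf's method)
    se = math.sqrt(1/exposed_cases + 1/exposed_controls
                   + 1/unexposed_cases + 1/unexposed_controls)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se)

    # Prints "OR = 3.0, 95% CI 1.7 to 5.2"; a CI excluding 1 is
    # statistically significant at the 5% level
    print(f"OR = {odds_ratio:.1f}, 95% CI {lower:.1f} to {upper:.1f}")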

For the longest possible periods of use of newer mobile phones:

  • 2G phone use for 15 to 20 years doubled the risk of glioma (OR 2.1, 95% CI 1.5 to 3.0)
  • 3G phone use for 5 to 10 years gave four times the risk of glioma (OR 4.1, 95% CI 1.3 to 12) – this was based on 12 cases and 14 controls

Use of cordless phones also increased the risk (OR 1.4, 95% CI 1.1 to 1.7), with the greatest risk seen in people who had used cordless phones for 15 to 20 years (OR 1.7, 95% CI 1.1 to 2.5). This was based on 50 cases and 109 controls.

The odds of glioma increased significantly for every 100 hours of use and for every year of use.

First using a mobile or cordless phone before the age of 20 increased the odds of glioma more than first use at older ages.

 

How did the researchers interpret the results?

The authors report that this study gives further support to their previous research, in which they concluded that gliomas "are caused by RF-EMF [radiofrequency electromagnetic field] emissions from wireless phones, and thus regarded as carcinogenic, under Group 1 according to the IARC [International Agency for Research on Cancer] classification, indicating that current guidelines for exposure should be urgently revised".

 

Conclusion

This case-control study found mobile phone use is associated with an increased risk of the commonest type of brain tumour, glioma. But this type of study cannot prove that mobile phone use caused the brain tumours, as it cannot account for confounding factors.

Indeed, despite the researchers collecting data on variables such as exposure to chemicals and occupation, this information was not taken into account in the statistical analyses.

A further limitation of the study was that the extent of mobile phone use was estimated retrospectively up to a 25-year time period.

It is highly unlikely these estimates would be accurate, both because of the limits of memory recall and because patterns of mobile phone usage have changed substantially over the years.

There is also the possibility of recall bias: people who have received a brain cancer diagnosis may overestimate their past mobile phone usage.

Additionally, many of the calculations were based on very small numbers, which reduces the reliability of the findings.

This study does not prove that mobile phones cause brain cancer, and the long-term effects of mobile phone use remain unclear.

What is clear is that brain tumours are relatively uncommon. While this is a good thing, it means that "proving" what, if any, environmental factors cause them is likely to require a great deal of long-term research effort. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Do mobile and cordless phones raise the risk of brain cancer? Study finds risk is three times higher after 25 years of use. Mail Online, November 12 2014

Links To Science

Hardell L, Carlberg M. Cell and cordless phone risk for glioma - Analysis of pooled case-control studies in Sweden, 1997-2003 and 2007-2009. Pathophysiology. Published October 28 2014


Watching 'Dad's Army' won't stop you going blind

NHS Choices - Behind the Headlines - Wed, 12/11/2014 - 10:20

"Fancy an episode of Dad's Army? How watching TV and films can save your eyesight," is the curious headline in the Daily Express.

Its headline is a rather abstract interpretation of research testing the potential for new computer eye-tracking software to help diagnose chronic glaucoma.

In glaucoma, pressure in the eyeball rises, damaging the optic nerve and threatening sight. Chronic glaucoma develops gradually, and loss of peripheral vision is usually the first sign.

The software being studied was designed to detect differences in eye movements between people with healthy eyes and those with glaucoma.

This study included just 44 older people with chronic glaucoma and 32 people of a similar age with healthy vision.

The computer software produced "scan paths", mapping eye movements while people watched three different film and TV clips, which indicated areas of visual loss.

As the news reports noted, one of the clips was from the ever-popular BBC sitcom "Dad's Army", although what was in the TV clips was irrelevant to the study or the patients' eyesight.

The computer software had fairly good accuracy for detecting glaucoma – about three-quarters of the people with glaucoma were correctly identified as having the condition using this test.

But we can only draw very limited further conclusions currently. We don't know whether the software will be affordable and become widely available, or whether it would offer any improvements on current methods used to detect chronic glaucoma.

 

Where did the story come from?

The study was carried out by researchers from City University London and was funded by a project grant awarded by Fight for Sight.

It was published in the peer-reviewed medical journal, Frontiers in Aging Neuroscience.

The media headlines give a misleading interpretation of this study. It is not possible for you to tell whether you have chronic glaucoma simply by watching an episode of "Dad's Army".

The historical sitcom just happened to be one of the TV clips that researchers showed people while tracking their eye movements using specialised computer software.

Even then, the software was not completely accurate at distinguishing which people did and did not have glaucoma. And we don't know that this test is an improvement on standard diagnostic tests.

 

What kind of research was this?

This was a diagnostic study where a control sample of elderly people with healthy vision, and another sample of people with glaucoma, received standard visual examinations. They also watched film and TV clips while a computer tracked their eye movements.

Researchers wanted to see whether they could differentiate between people with and without glaucoma by examining eye movements while someone watches a film.

Glaucoma is a condition where there is raised pressure in the eyeball. This can damage the optic nerve that carries visual information from the retina to the brain. The eye pressure increases because there is a blockage to the channels that drain aqueous fluid from the eye.

The patients in this study had chronic glaucoma, where the pressure in the eye gradually rises, causing a gradual loss of peripheral vision. Chronic glaucoma is more common with increasing age and can often run in families.

Current checks for chronic glaucoma include testing someone's peripheral visual fields, using a machine to measure the pressure in the eyeball, and looking at the back of the eye (retina) to check that the area where the optic nerve attaches to the eye looks healthy. Treatments can involve eye drops and laser surgery.

Chronic glaucoma is different from acute glaucoma, where the pressure in the eye suddenly rises very rapidly. Acute glaucoma is a medical emergency and needs immediate treatment to save the sight in the eye.

The researchers wanted to provide evidence that people with a diagnosis of chronic glaucoma can be distinguished from a group of age-matched healthy people by only using their visual scan paths while they watch a film or TV programme.

 

What did the research involve?

The researchers recruited 44 adults aged 63 to 77 with chronic glaucoma from Moorfields Eye Hospital in London. They deliberately recruited a sample of people who had variable degrees of visual field loss.

A comparison group of 32 adults (aged 64 to 75 years) with healthy vision were recruited from an eye clinic where they had received standard eye examinations. Both people with glaucoma and controls had no other significant health problems.

All participants had their visual fields tested using the optimal test designed to identify the early visual field loss associated with early glaucoma, the Glaucoma Hemifield Test (GHT), using a Humphrey Field Analyser (HFA).

The GHT was "outside normal limits" for all people with glaucoma and "within normal limits" for the controls.

The HFA mean deviation is the overall measure of the severity of the clinical field defect, and people with glaucoma were classed as having early disease if their mean deviation was better than -6dB in both eyes, and advanced disease if worse than -12dB.

The researchers outlined how people in the latter category would normally have symptoms and would most likely fail the visual field component for fitness to drive.
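
Read as a classification rule, those thresholds look something like the Python sketch below. This is only an illustration: the text does not spell out how eyes falling between the two cut-offs, or eyes that disagree, were handled.

    def classify_glaucoma_severity(md_left, md_right):
        """Classify severity from Humphrey Field Analyser mean deviation
        (in dB; 0 is normal, more negative means a worse field defect)."""
        if md_left > -6 and md_right > -6:
            return "early"        # better than -6 dB in both eyes
        if md_left < -12 or md_right < -12:
            return "advanced"     # worse than -12 dB
        return "unclassified"     # between the quoted cut-offs

    print(classify_glaucoma_severity(-4.5, -5.0))    # early
    print(classify_glaucoma_severity(-14.0, -13.2))  # advanced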

Best corrected visual acuity was also tested for all participants. There was little difference between people with glaucoma and healthy controls.

The main experiment involved participants viewing three separate TV and film clips taken from the 1970s TV comedy "Dad's Army", the 2006 film "The History Boys", and the 2010 Vancouver Winter Olympics men's ski cross event.

While they watched, the movements of the eye were tracked using special optical software. The software builds a scan path, illustrating the person's quick eye movements (called saccades) and fixations while they are watching. This scan path can indicate areas of vision loss.

 

What were the basic results?

Scan paths were built from each of the three film clips for both the people with glaucoma and the controls – a total of 205 scan paths.

Using a statistical measure known as the area under the ROC curve, the researchers found the accuracy of scan paths for detecting chronic glaucoma was 0.85 (95% confidence interval [CI] 0.82 to 0.87) – with 1 indicating a perfectly accurate test, and 0.5 a useless diagnostic test with results no better than chance.

The result of 0.85 suggests scan paths obtained from this computer programme were a good – but not completely accurate – method of distinguishing between people with and without glaucoma.

The technique had a sensitivity of 76% (95% CI 58 to 86%), indicating roughly three-quarters of people with glaucoma would be accurately detected by using this test.

At this detection rate, the specificity was 90%, meaning 9 out of 10 people without glaucoma would accurately test as being free from the condition.
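
To make these measures concrete, here is the confusion-matrix arithmetic with counts reconstructed from the percentages; the counts are rounded approximations, not figures reported by the study.

    # Roughly 76% sensitivity among the 44 glaucoma patients and 90%
    # specificity among the 32 controls imply counts close to these:
    true_pos, false_neg = 33, 11   # glaucoma correctly / wrongly classed
    true_neg, false_pos = 29, 3    # controls correctly / wrongly classed

    sensitivity = true_pos / (true_pos + false_neg)  # ~0.75
    specificity = true_neg / (true_neg + false_pos)  # ~0.91

    # The area under the ROC curve (0.85 here) can be read as the
    # probability that a randomly chosen glaucoma patient receives a more
    # "glaucoma-like" scan path score than a randomly chosen control.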

 

How did the researchers interpret the results?

The researchers concluded that, "Huge data from scan paths of eye movements recorded whilst people freely watch TV-type films can be processed into maps that contain a signature of vision loss.

"In this proof of principle study we have demonstrated that a group of patients with age-related neurodegenerative eye disease can be reasonably well separated from a group of healthy peers by considering these eye movement signatures alone."

 

Conclusion

This research demonstrates that a particular software application has fairly good accuracy for distinguishing between people with and without chronic glaucoma.

The scan paths that the software built, mapping eye movements while watching TV or film clips, were able to accurately pick up about three-quarters of those with glaucoma. Meanwhile, 9 out of 10 people without the condition accurately tested as being free from glaucoma.

The researchers appropriately call this a proof of concept study, in that they have demonstrated that the technique can reasonably separate people with and without chronic glaucoma.

But we can only draw limited further conclusions at this time. This study only tested a fairly small sample of people, and we don't know whether the same accuracy results would be obtained if a separate, bigger sample were tested.

We also don't know whether this test could offer any improvements on current methods for detecting chronic glaucoma. For example, it is not known whether the test could detect peripheral field defects any earlier than current standard visual field tests (combined with pressure testing), and so ultimately lead to the earlier detection and treatment of chronic glaucoma.

Of course, the ultimate aim of earlier detection is to improve outcomes for people in terms of preserving their vision. However, the current stage of research can offer no indication of whether this treatment could help "save your eyesight", as the Express headline suggests. As yet, no study has examined the longer-term outcomes of people with chronic glaucoma detected solely using this test.

Overall, these results suggest this software could have potential as a diagnostic technique to detect visual field loss in chronic glaucoma. However, it remains to be seen whether this test will ever be widely used in diagnostic practice, or how it would supplement or replace current standard tests.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Fancy an episode of Dad's Army? How watching TV and films can save your eyesight. Daily Express, November 12 2014

How watching Dad's Army could help detect the leading cause of blindness. Daily Mail, November 11 2014

Links To Science

Crabb DP, Smith ND, Zhu H. What's on TV? Detecting age-related neurodegenerative eye disease using eye movement scanpaths. Frontiers in Aging Neuroscience. Published November 11 2014


Health workers 'neglect hygiene late in their shifts'

NHS Choices - Behind the Headlines - Wed, 12/11/2014 - 10:20

“Visit hospital in the morning to be sure of a doctor with clean hands,” reports The Daily Telegraph.

The Telegraph cites a US study which found healthcare workers often fail to wash their hands and are more likely to wash their hands as advised at the beginning of their shift (not necessarily the morning) than at the end.

Researchers used electronic ID tags for healthcare workers with detectors placed on soap dispensers and hand gels in patients’ rooms to collect data on when the workers washed their hands.

They found that, at most, workers washed their hands on 42.6% of the occasions that they should have. This figure reduced to 34.8% of occasions by the end of a 12-hour shift. Workers were also more likely to wash their hands after longer time off between shifts.

Despite national and local instructions on hand hygiene and infection control, it’s clear from this study that healthcare workers can forget or not bother. And it seems that the more tired – or less rested – a worker is, the more likely they are to forget or overlook handwashing rules.

If you’re in hospital or you’re visiting an inpatient, you also have a responsibility to wash your hands before entering areas where patients are and wherever prompted, as well as when leaving. And don’t be afraid to ask health professionals if they’ve washed their hands too.

 

Where did the story come from?

The study was carried out by researchers from the University of Pennsylvania and the University of North Carolina at Chapel Hill and was funded by the Wharton Dean’s Research Fund and the Wharton Risk Management and Decision Processes Center.

The study was published in the peer-reviewed Journal of Applied Psychology.

The media reported the story fairly accurately, although both the Mail Online and the Telegraph made the error of suggesting that professionals in hospitals have cleaner hands in the morning. In fact, the research found that workers were more likely to clean their hands at the start of a shift. As hospitals are open round the clock with shifts overlapping, this could be at many different times of the day. If you’re in doubt, ask your health professional if they’ve washed their hands – they shouldn’t mind you asking.

The Telegraph’s headline focused just on doctors, when doctors actually made up only 4% of the health workers studied. The Mail Online illustrated its story with a photo of a surgeon washing his hands in an operating theatre environment, but the study did not involve any preparation around surgery.

 

What kind of research was this?

This was an observational study that measured how often health workers complied with the expectation that they should wash their hands both on entering and exiting a patient’s room. As the study was conducted using only radiofrequency equipment, it is not able to provide any explanation for why hands were not washed on each of these occasions.

 

What did the research involve?

The researchers obtained data from a company that uses “radiofrequency devices” to monitor whether health workers use hand hygiene measures on entering and exiting patients’ rooms. Radiofrequency devices use wireless technology to detect and record devices that have electronic tags implanted in them.

A total of 56 units from 35 US hospitals were fitted with these devices to measure hand hygiene opportunities between 2010 and 2013. In this case, the radiofrequency devices comprised a “communication unit” attached to soap dispensers and hand sanitisers in patients’ rooms, and radiofrequency badges worn by hospital workers to track their movement and use of the dispensers. From this data, the researchers were able to calculate the number of opportunities that should have led to hand hygiene, such as every time a health worker entered and exited a patient’s room, and the number of occasions when hands actually were washed (together giving the “compliance” rate).

They collected data from 4,157 health workers, which included:

  • nurses (65%)
  • patient care technicians (12%)
  • therapists (7%)
  • doctors (4%)
  • clinical directors, infection prevention specialists, and others (12%)

The researchers inferred shift patterns and time off work from the data, treating a gap of at least seven hours between a worker’s last room exit and next room entry as time off between shifts. They excluded any shifts of more than 12 hours.
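
A minimal sketch of how compliance might be computed from such an event log is shown below. The Event record and the grouping logic are assumptions based on the description above, not the researchers’ actual code.

    from collections import namedtuple
    from datetime import timedelta

    # One record per room entry or exit, noting whether a dispenser was used
    Event = namedtuple("Event", "time used_dispenser")

    SHIFT_GAP = timedelta(hours=7)    # a gap this long counts as time off
    MAX_SHIFT = timedelta(hours=12)   # longer "shifts" were excluded

    def split_into_shifts(events):
        """Group one worker's time-ordered events into shifts, starting a
        new shift wherever two events are at least seven hours apart."""
        shifts, current = [], [events[0]]
        for prev, ev in zip(events, events[1:]):
            if ev.time - prev.time >= SHIFT_GAP:
                shifts.append(current)
                current = []
            current.append(ev)
        shifts.append(current)
        return [s for s in shifts if s[-1].time - s[0].time <= MAX_SHIFT]

    def compliance(events):
        """Each entry and exit is one hand hygiene opportunity; compliance
        is the share of opportunities where a dispenser was used."""
        return sum(e.used_dispenser for e in events) / len(events)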

 

What were the basic results?

The main results were:

  • hand washing “compliance” reduced from 42.6% of opportunities during the first hour of a shift to 34.8% in the last hour of a 12-hour shift
  • increasingly frequent interactions with patients and more time spent in patients’ rooms reduced hand washing compliance
  • hand washing frequency improved after more days off between shifts
  • taking an additional half day off (12 hours) was associated with a 1.3% increase in hand washing compliance
  • the more hours worked in the previous week, the faster the hand washing compliance reduced during a shift

Using the results of a Swiss study, which found that a 1% increase in hand washing reduced the number of infections by 3.9 per 1,000 admitted patients, the researchers calculated that (the arithmetic is sketched below):

  • the decrease in hand washing compliance would cause 7,500 unnecessary infections per year across the 35 hospitals they studied
  • this would equate to 0.6 million infections across all US hospitals per year
  • if 5.82% of hospital-acquired infections are fatal, this would mean there would be 35,000 unnecessary deaths per year in the US
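
The arithmetic behind these figures can be checked in a few lines; the annual admissions total below is a placeholder, as the article does not report it, and the researchers’ exact method may have differed.

    # From the Swiss study: each 1% rise in hand washing compliance means
    # 3.9 fewer infections per 1,000 admitted patients
    infections_per_1000_per_pct = 3.9
    compliance_drop_pct = 42.6 - 34.8            # ~7.8 points over a shift

    extra_per_1000 = infections_per_1000_per_pct * compliance_drop_pct  # ~30

    annual_admissions = 250_000                  # placeholder, not reported
    extra_infections = extra_per_1000 * annual_admissions / 1000  # ~7,600

    # The US-wide death estimate follows directly from the quoted figures:
    us_deaths = 600_000 * 0.0582                 # ~35,000 deaths per year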

 

How did the researchers interpret the results?

The researchers concluded that their findings suggest the immediate and continuous demands of the work environment result in a gradual reduction in compliance with professional standards over the course of a day. They want future research to look at what changes might improve compliance with hand washing.

 

Conclusion

This study found that compliance with the expectation for health workers to use soap or hand sanitiser both on entering and exiting patients’ rooms was, at best, only 42.6%. It also found that this reduced over the course of a shift to just 34.8%.

This is even more surprising given that the health workers knew they were being monitored and were wearing the badges.

Reasons for this seemingly low overall compliance rate include the possibility that on some occasions there was no direct patient contact (such as just talking to the patient). However, this does not account for why the compliance rate changed over the course of a shift, and is not a valid excuse under the protocol for washing hands. The researchers suggest that the reduced compliance, especially over the course of a shift, is due to depleted “mental reserves”. However, the study did not look at how any interventions might improve compliance.

This study highlights our natural human fallibility – especially when tired. Even with protocols, guidelines and diktats, we tend to forget or neglect to do very important things. In this case, health professionals forgot vital hand hygiene when dealing with patients.

However, it’s worth bearing in mind that this study was carried out in the US, where hospital set-ups are likely to be different (for instance, the patients are described as having their own rooms, which is less common in NHS hospitals).

In the UK, NICE recommends that all healthcare workers should always clean their hands thoroughly, immediately before and immediately after coming into contact with a patient or carrying out care, and even after wearing gloves.

When visiting someone in hospital, you should also be vigilant about washing your hands before and after entering a patient’s room and in other areas of the hospital.

If you are concerned about the hand hygiene of doctors, nurses or anyone else who comes into contact with you or the patient you are visiting, ask them whether they have cleaned their hands.

 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Hospital patients ‘more likely to see doctors and nurses with clean hands if they have appointment in the morning’. Mail Online, 11 November 2014

Visit hospital in the morning to be sure of a doctor with clean hands. The Daily Telegraph, 11 November 2014

Links To Science

Dai H, et al. The Impact of Time at Work and Time Off From Work on Rule Compliance: The Case of Hand Hygiene in Health Care. Journal of Applied Psychology. 3 November 2014


Genes tweaked to 'starve' prostate cancer cells

NHS Choices - Behind the Headlines - Tue, 11/11/2014 - 12:10

"Prostate cancer could be 'halted' by injections," reports The Independent.

While this headline rather simplifies the research findings, the research it's based on demonstrates an interesting way to stop prostate cancer – for mice, at least.

In the research, mice with a "model" of prostate cancer were treated with chemicals that inhibit a protein called SRPK1. As a result, the growth of their cancer was reduced.

The researchers performed additional experiments, which showed SRPK1 is involved in the control of how new blood vessels grow. Blood vessels are required to bring oxygen and nutrients and remove waste, which is thought to be key for tumour growth.

Without SRPK1, blood vessel formation was reduced – suggesting that this could "starve" the tumours (as the Mail puts it) of food and oxygen.

Interestingly, this study suggests SRPK1 inhibitors may be used to treat prostate cancer, and potentially some other forms of cancer.

Because this has only been tested on mice, we have no way of knowing if the treatment will be safe for men with prostate cancer, let alone more effective than other treatments currently available.

 

Where did the story come from?

The study was carried out by researchers from the University of Bristol, North Bristol NHS Trust, the University of the West of England and the University of Nottingham.

It was funded by Prostate Cancer UK, the Biotechnology and Biological Sciences Research Council, the Medical Research Council, Cancer Research UK and the Richard Bright VEGF Research Trust.

The study was published in the peer-reviewed medical journal Oncogene.

The media reporting was generally accurate, although the headlines might be construed as misleading, as they seem to imply that a new prostate cancer treatment is available.

But these headlines are based solely on the results of experiments on cells grown in the laboratory and on mice. We'll have to wait and see if the proposed therapy is safe and effective enough to treat men with prostate cancer.

 

What kind of research was this?

This was a laboratory and animal study that built on the results of previous research.

Blood supply is required to bring oxygen and nutrients, and remove waste from tissues – including tumour tissue. The growth of new blood vessels (known as angiogenesis) is believed to be essential for tumour growth.

Angiogenesis is caused by the protein vascular endothelial growth factor (VEGF). However, a different form of the protein can be made in a process called splicing, which does the reverse and actually inhibits the growth of new blood vessels.

The researchers recently found the choice of "form" of VEGF is controlled by another protein called SRSF1.

When SRSF1 is modified by another protein called SRPK1, it favours the formation of the form of VEGF that promotes the growth of new blood vessels. When it is not modified, it favours the formation of the form of VEGF that inhibits the growth of new blood vessels.

In this study, the researchers wanted to see whether:

  • levels of SRPK1 or SRSF1 are increased in prostate cancer cells
  • modification of the VEGF form produced is able to reduce tumour growth in animals

They wanted to see if there was potential for SRPK1 inhibitors to be used as a prostate cancer treatment.

 

What did the research involve?

The researchers first looked at the levels of SRPK1 and SRSF1 in 17 human prostate cancer samples.

They then performed a series of experiments on prostate cancer cells grown in the laboratory.

Following this, they looked at the growth of the prostate cancer cells injected into mice.

 

What were the basic results?

The researchers found levels of both SRPK1 and SRSF1 were higher in malignant (cancerous) areas compared with benign (non-cancerous) areas in the 17 human prostate cancer samples they examined.

When the researchers modified prostate cancer cells so they didn't make SRPK1, they found cells then made more of the form of VEGF that inhibits the growth of new blood vessels. However, the cells were still able to grow, divide and move as normal.

The researchers then injected the prostate cancer cells (either unmodified, or modified so they didn't make SRPK1) into mice. The researchers found the modified prostate cancer cells grew slower, formed smaller tumours, and had fewer blood vessels. 

The researchers then did experiments with chemical inhibitors of SRPK1. They found the chemical inhibitors had similar effects to modifying the cells so they didn't make SRPK1.

When mice with a model of prostate cancer were treated with injections of chemical inhibitors of SRPK1, the growth of tumours was inhibited.

 

How did the researchers interpret the results?

The researchers say the results suggest that, "modulation of SRPK1 and subsequent inhibition of tumour angiogenesis by regulation of VEGF splicing can alter prostate tumour growth, and supports further studies for the use of SRPK1 inhibition as a potential anti-angiogenic therapy in prostate cancer".

 

Conclusion

In this study, researchers have found treating a mouse model of prostate cancer with chemicals that inhibit a protein called SRPK1 reduced cancer growth.

The researchers performed additional experiments, which showed SRPK1 is involved in the control of angiogenesis (the growth of new blood vessels). Blood vessels are required to bring oxygen and nutrients, and remove waste from tissues.

The formation of new blood vessels is thought to be key for tumour growth, and without SRPK1, blood vessel formation was reduced.

This study has suggested SRPK1 inhibitors could be used to treat both prostate and other forms of cancer. However, this study only trialled the potential therapy in mice.

Further studies are needed to show that the treatment is safe for men with prostate cancer. If that is confirmed, more work will be needed to show it is effective in treating men with prostate cancer. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Prostate cancer could be 'halted' by injections. The Independent, November 10 2014

Prostate cancer breakthrough as scientists STARVE tumours of their blood supply - stopping them growing and spreading. Mail Online, November 10 2014

Genetic discovery offers new hope for prostate patients. The Times, November 11 2014

Prostate cancer could be 'switched off' with injection. The Daily Telegraph, November 10 2014

Links To Science

Mavrou A, et al. Serine–arginine protein kinase 1 (SRPK1) inhibition as a potential novel targeted therapeutic strategy in prostate cancer. Oncogene. Published November 10 2014


Claims cannabis 'rewires the brain' misleading

NHS Choices - Behind the Headlines - Tue, 11/11/2014 - 07:26

"Cannabis use 'shrinks and rewires' the brain," reports The Daily Telegraph, with much of the media reporting similar "brain rewiring" headlines.

The headlines are based on a study that compared the brain structure and connections of cannabis users with those of non-users.

The researchers identified several differences between cannabis users and non-users in a region of the brain called the orbitofrontal cortex.

This is part of the reward network, and is enriched with cannabinoid 1 receptors. These bind THC, the active ingredient in cannabis.

Some of the differences seen by the researchers were associated with how long people had used cannabis or the age they started using the drug.

However, although brain differences were found, it is not clear they were caused by cannabis use. It is possible that brain differences mean it is more likely that certain people use cannabis.

 

Where did the story come from?

The study was carried out by researchers from the University of Texas, The Mind Research Network and the University of New Mexico.

It was funded by the US National Institute on Drug Abuse.

The study was published in the peer-reviewed medical journal PNAS. This article was open access, so is free to read online.

The media generally reported the results of this study along the lines of The Guardian's headline: "Smoking cannabis every day 'shrinks brain but increases its connectivity'." But these headlines are misleading.

This study did find differences between the brains of cannabis users and non-users, but because it was only a snapshot in time, we can't tell if the brain differences were caused by cannabis.

It is possible that brain differences mean it is more likely that certain people use cannabis. These could be pre-existing differences in the parts of the brain associated with feelings of reward, and people with this brain structure are more likely to try or persist in using cannabis.

 

What kind of research was this?

This was a cross-sectional study that compared the brain structure and connections of people who used cannabis with the brain structure of non-users to see if there were any differences.

Although this type of study can identify differences in brain structure and connections between cannabis users and non-users, it cannot show that the differences were caused by cannabis use: people with different brain structures may be more likely to use cannabis, for example.

 

What did the research involve?

The researchers used magnetic resonance imaging (MRI) scans to look at the brains of 48 cannabis users, who were using cannabis at least four times a week over the previous six months, and 62 non-users.

The cannabis users varied in age, and the non-users were chosen because they were the same sex and age as the users.

The researchers also used the marijuana problem survey to assess the negative psychological (such as feeling bad about marijuana use), social (such as family problems), occupational (such as missing work), and legal consequences of marijuana use over the previous 90 days.

 

What were the basic results?

The researchers identified several differences between the brains of cannabis users and non-users.

These differences were in a region of the brain called the orbitofrontal cortex. This is part of the reward network of the brain, and is enriched with cannabinoid 1 receptors that bind THC (the "active" ingredient in cannabis).

The researchers found the orbitofrontal cortex was smaller in cannabis users, but there was more connectivity.

Some of the brain differences were correlated with behaviour related to cannabis. Some brain differences varied with duration of use, and some of the differences were associated with the age a person had started using cannabis.

 

How did the researchers interpret the results?

The researchers say that their findings "suggest that chronic marijuana use is associated with complex neuroadaptive processes, and that onset and duration of use have unique effects on these processes".

 

Conclusion

Links To The Headlines

Smoking cannabis every day ‘shrinks brain but increases its connectivity’. The Guardian, 10 November 2014

Cannabis use 'shrinks and rewires' the brain. The Daily Telegraph, 10 November 2014

Smoking cannabis every day 'warps your brain and shrinks grey matter', scientists warn. Daily Mail, 10 November 2014

Links To Science

Filbey FM, et al. Long-term effects of marijuana use on the brain. PNAS. Published 10 November 2014


Are pollution and attention problems related?

NHS Choices - Behind the Headlines - Mon, 10/11/2014 - 11:47

“Could ADHD be triggered by mothers being exposed to air pollution while pregnant?” asks the Mail Online.

Pregnant women have enough to worry about, without going round in a gas mask or moving to the country. Fortunately, the study that this news relates to doesn’t prove a connection between exposure to pollution while pregnant and attention deficit hyperactivity disorder (ADHD).

In fact, the study looked at just 250 African-American and Dominican children in three suburbs of New York. It looked at whether symptoms of ADHD (rather than diagnoses) at the age of nine were associated with the mothers’ exposure during pregnancy to environmental pollution derived from traffic fumes and domestic heaters. Exposure to the pollutants – polycyclic aromatic hydrocarbons (PAHs) – was measured by levels of PAH-modified DNA in maternal and cord blood samples taken at birth.

The researchers found there to be an association between PAH levels in maternal blood and ADHD symptoms. The mothers with high PAH levels had increased odds of being categorised as having “moderately to markedly atypical” scores on “inattentive” and “total symptom” scales.

However, there is no evidence that the association between symptoms and PAHs in the mothers’ blood was caused by environmental pollution. The researchers found no association between maternal blood PAH levels and air-measured PAH levels, or estimates of dietary PAH intake.

This relatively small study of a specific population sample demonstrates an association, but does not provide conclusive evidence of a link between exposure to pollution during pregnancy and a child’s chances of developing ADHD.

 

Where did the story come from?

The study was carried out by researchers from Columbia University in New York, and was funded by the National Institute for Environmental Health Sciences and the US Environmental Protection Agency. The study was published in the open access, peer-reviewed medical journal PLOS One.

The media appears to have taken these results at face value, but not considered the various limitations of this small study, which make the results far from conclusive.

 

What kind of research was this?

This was a US cohort study that investigated whether there is an association between childhood symptoms of ADHD and maternal exposure to PAHs during pregnancy.

PAHs are toxic air pollutants released during incomplete combustion of fossil fuels. They are produced by traffic and residential heating, among other sources. As the researchers say, urban minority populations often have much higher exposure to air pollution than other populations.

This is a health concern because foetuses and developing children are potentially susceptible to PAHs and other pollutants. Previous laboratory studies have suggested a range of neurodevelopmental and behavioural effects from PAH exposure. Results from this cohort of mothers have already shown that exposure to PAH before birth is associated with developmental delay at three years old, reduced IQ at five, and symptoms of anxiety or depression and attention problems at six to seven years old.

As ADHD is the most common behavioural disorder in childhood, the researchers wanted to see whether prenatal PAH exposure was also associated with ADHD symptoms at nine years of age.

However, a cohort study such as this can only demonstrate an association – it can’t prove cause and effect, as the relationship may be influenced by other factors.

 

What did the research involve?

This cohort study recruited a sample of African-American and Dominican women from antenatal clinics in three suburbs of New York City between 1998 and 2006. The women were all aged 18 to 35, non-smokers and did not use any other drug substances.

The researchers measured PAH exposure by levels of PAH-modified DNA in maternal and umbilical blood samples taken after delivery. They also measured air PAH levels during pregnancy, and questioned the women about their exposure to passive smoke and dietary PAH consumption (through grilled, fried or smoked meat).

Child ADHD behaviour problems were assessed when the children were nine years old using two validated parent-reported rating scales: 

  • the CBCL: a screening instrument assessing various child functioning problems
  • the CPRS-Revised: a focused assessment of ADHD

The CBCL and CPRS-revised scales also assessed child anxiety and depression symptoms.

The researchers analysed the association between PAH metabolites and ADHD symptoms, adjusting for other measured health and environmental factors, such as child age, sex, mother’s educational level and her own ADHD symptoms. The researchers also measured levels of PAH breakdown products detected in the child’s urine samples when aged three and five years, so that they could adjust for PAH exposure after birth.

The final sample included 250 children with complete data.
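
The adjusted analysis described above might look something like the sketch below, using logistic regression in Python’s statsmodels library. The file name and column names are hypothetical stand-ins for the variables named in the text, and the study’s actual modelling may have differed.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per child; columns are illustrative, not the study's
    df = pd.read_csv("cohort.csv")

    # Logistic regression of an atypical-ADHD-score indicator on maternal
    # PAH-DNA level, adjusted for the covariates named in the text
    model = smf.logit(
        "atypical_score ~ maternal_pah + child_age + child_sex"
        " + maternal_education + maternal_adhd + postnatal_pah",
        data=df,
    ).fit()

    print(model.summary())  # exponentiated coefficients give odds ratios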

 

What were the basic results?

The researchers found that all CPRS subscale scores were significantly associated with levels of PAH-modified DNA in maternal blood.

The researchers then analysed the information to see whether there was an association with “moderately to markedly atypical” scores. Compared with children whose mothers’ blood was categorised as having low PAH levels, children of mothers with high levels had increased odds of being categorised as having “moderately to markedly atypical” scores on the “inattentive” and “total” DSM-IV subscales of the CPRS, but not on the hyperactive-impulsive subscale.

There was some association between maternal blood PAH and the ADHD problems on the CBCL checklist scores, but this did not reach statistical significance.

Levels of PAH DNA in umbilical cord blood were available for fewer participants. There were no significant associations between PAH cord blood levels and CPRS or CBCL scores.

 

How did the researchers interpret the results?

The researchers conclude that their results, “suggest that exposure to PAHs encountered in New York City air may play a role in childhood ADHD behaviour problems”.

 

Conclusion

Overall, this relatively small cohort study demonstrates an association between exposure to pollution (in the form of PAHs) before birth and the development of ADHD symptoms, but it does not provide conclusive evidence of cause and effect.

There are a number of limitations to consider. These include the fact that the study includes a relatively small sample of 250 children, with all of them from two specific ethnic groups (African-American and Dominican), and from three suburbs of New York City. The findings may not be generalisable to other populations.

While the researchers used validated assessment scales, the study examined ADHD symptoms rather than actual diagnoses of ADHD.

Importantly, the only association identified by the researchers was between ADHD symptoms and levels of PAH DNA in maternal blood at the time of birth. There was no association between maternal blood PAH levels and environmentally-measured PAH levels or dietary PAH intake. Therefore, the source of this exposure is not known, and it cannot be reliably assumed to be due to environmental causes. Levels of PAH-modified DNA not only reflect exposure, but also an individual’s uptake, detoxification and DNA repair rates.  

Finally, there remains the possibility that if there is an association between maternal levels of PAH and child ADHD symptoms, it could be influenced by a variety of unmeasured health, lifestyle and socioeconomic factors.

While the findings are undoubtedly worthy of further research, there does not appear to be firm evidence from this study to support the media conclusion that exposure to environmental pollutants during pregnancy could lead to the development of ADHD.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Could ADHD be triggered by mothers being exposed to air pollution while pregnant? Mail Online, November 7 2014

Links To Science

Perera FP, et al. Early-Life Exposure to Polycyclic Aromatic Hydrocarbons and ADHD Behavior Problems. PLOS One. November 5 2014


Anxiety affects children in different ways

NHS Choices - Behind the Headlines - Mon, 10/11/2014 - 11:47

"Teenage anxiety: Tailored treatment needed," BBC News reports, saying a "one-size-fits-all approach to treating teenagers with anxiety problems may be putting their futures at risk."

The news is based on research that looked at the diagnoses of a group of children and a group of adolescents – it did not look at how they were treated or how effective any treatment was.

But this research highlighted potential problems with assuming that "children" – defined as being aged 5 to 18 years – are affected by anxiety in the same way.

This study looked at different diagnoses among 100 children (aged six to 12 years) and 100 adolescents (aged 13 to 18 years) with anxiety problems referred to a specialist mental health service in England.

The findings showed that, despite children and adolescents often being considered as one group, their specific diagnoses – and therefore treatment needs – can differ.

In this sample, children more often had separation anxiety disorder, while adolescents were marginally (but not significantly) more likely to have generalised anxiety disorder and social anxiety disorder. Adolescents were also more likely than children to have mood disorder and have problems with school attendance.

However, as this study looked at a single consecutive sample of children and adolescents, it may not be representative of all young people with anxiety disorders: different results may be obtained from a different sample.

And this study does not provide evidence that children or adolescents are being incorrectly diagnosed or are receiving inadequate treatment.

 

Where did the story come from?

The study was carried out by researchers from the University of Reading and was supported by a Medical Research Council Clinical Research Training Fellowship awarded to one of the authors.

It was published on an open access basis in the Journal of Affective Disorders, a peer-reviewed medical journal.

The BBC News coverage is generally representative of this research.

 

What kind of research was this?

This was a case series reporting the diagnoses of 100 children (aged six to 12 years) and 100 adolescents (aged 13 to 18 years) who were consecutively referred to a specialist UK mental health service for anxiety problems.

The researchers report how little is known about the clinical characteristics of children and adolescents who are routinely referred for anxiety disorders.

And, when considered in studies, children and adolescents with anxiety disorders are often treated as one very similar (homogenous) group with an age range of five to 18 years, although they may differ in meaningful ways.

The researchers wanted to examine a series of cases of anxiety disorders to see whether there are key characteristics that distinguish children from adolescents referred for these conditions.

They expected that adolescents would have a higher anxiety severity, more social anxiety, disturbed school attendance and more frequent co-existing mood disorders.

 

What did the research involve?

The children and adolescents were consecutive referrals from general practice and secondary care to the care services at the Berkshire Healthcare NHS Foundation Trust Child and Adolescent Mental Health Service (CAMHS) Anxiety and Depression Pathway based at the University of Reading. CAMHS accepts referrals of children and adolescents with anxiety disorders from across the UK.

The child and adolescent assessments were conducted at one point in time, and involved separate diagnostic assessments or questionnaires with the child and their "primary caregiver" (usually a parent).

Child and adolescent diagnoses of anxiety disorders were determined using a structured interview called the Anxiety Disorders Interview Schedule for DSM IV – Child and Parent Version (ADIS-C/P). This assesses anxiety and other mood and behaviour disorders according to standard diagnostic criteria.

If the child or adolescent met diagnostic criteria, a clinician severity rating (CSR) was given from 0 (absent or none) to 8 (very severely disturbing or disabling), where 4 would be the score indicating a diagnosis.

The Spence Children's Anxiety Scale (SCAS-C/P) assesses symptoms reported by parents and the children themselves. These symptoms relate to six domains of anxiety, each rated on a scale from 0 (never) to 3 (always):

  • panic attacks or agoraphobia
  • separation anxiety
  • physical injury fears
  • social phobia
  • generalised anxiety
  • obsessive-compulsive symptoms

Other assessments included the Short Mood and Feelings Questionnaire (SMFQ-C/P) to assess self-reported depression, and the Strengths and Difficulties Questionnaire (SDQ-P) to assess parent-reported behavioural disturbance.

Caregivers' own psychological symptoms were assessed using the short version of the Depression Anxiety Stress Scales (DASS).

 

What were the basic results?

The majority of children and adolescents (84%) met a primary (main) diagnosis of anxiety disorder on the ADIS. Ten per cent of the children and 7% of adolescents did not meet any diagnostic criteria.

Six per cent of children and 9% of adolescents had non-anxiety primary diagnoses, including oppositional defiant disorder, attention deficit hyperactivity disorder (ADHD), and depression.

The results were based on the 84 children and 84 adolescents who met the criteria for a main diagnosis of anxiety disorder.

Children were significantly more likely than adolescents to have a diagnosis of separation anxiety disorder (affecting 44% of children versus 18% of adolescents).
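
As an illustration of how a difference in proportions like this is tested for significance, here is a sketch using counts reconstructed from the percentages; the counts are rounded approximations, not the paper's actual table.

    from scipy.stats import fisher_exact

    # Separation anxiety disorder: [with, without] in each group of 84
    children = [37, 47]      # 37/84 ~ 44%
    adolescents = [15, 69]   # 15/84 ~ 18%

    odds_ratio, p_value = fisher_exact([children, adolescents])
    print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")  # p < 0.05 here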

Social anxiety disorder and generalised anxiety disorder were slightly more common in adolescents (affecting 52% and 55%, respectively) than children (affecting 45% and 49%, respectively), but the difference between children and adolescents was not statistically significant.

Although most children and adolescents had moderate severity anxiety, adolescents tended to have more severe diagnoses than children. The mean CSR score for anxiety was 5.33 for adolescents and 4.93 for children.

Mood disorders were also significantly more common in adolescents than children (affecting 24% of the total adolescent sample and 6% of children). School refusal was also significantly more frequent in adolescents (18%) than children (7%).

 

How did the researchers interpret the results?

The researchers conclude that, "The finding that children and adolescents with anxiety disorders have distinct clinical characteristics has clear implications for treatment.

"Simply adapting treatments designed for children to make the materials more 'adolescent friendly' is unlikely to sufficiently meet the needs of adolescents."

 

Conclusion

This is a useful exploratory study, which should give a good indication of the range of diagnoses among children and adolescents referred for anxiety disorders to specialist mental health services in England.

Children and adolescents, particularly in research, can often be placed into one homogenous group, and this study shows specific diagnoses can differ significantly between the groups. For example, this study showed that children more often had separation anxiety disorder.

And adolescents were marginally (but not significantly) more likely to have generalised anxiety disorder and social anxiety disorder. Adolescents were also more likely than children to have mood disorder and to have problems with school attendance.

The researchers warn they have considered childhood and adolescence as two distinct developmental periods, with age 13 being the turning point.

In reality, as they say, differences between diagnoses and treatment needs would be unlikely to occur in the same way in every growing child. They suggest that further studies focus on narrower age bands.

As the researchers also acknowledge, the people in this study were from a predominantly white British ethnic background and from relatively high socioeconomic backgrounds.

The study also did not include those with autism spectrum disorders, obsessive compulsive disorders, or post-traumatic stress disorder.

This study is likely to give a good indication of the proportion of children and adolescents with different anxiety diagnoses referred to this specialist mental health service, but we cannot be certain it is entirely representative of young people with anxiety disorders. Different results may be obtained from a different sample.

As the researchers say, their results highlight that children and adolescents with anxiety disorders are likely to have different treatment needs.

But this case series does not show that children and adolescents are being incorrectly diagnosed or are receiving inadequate treatment.

The present study focused solely on diagnosis, and not treatment. As the research did not look at treatments, it should not be assumed that children and adolescents are not receiving the appropriate treatment targeted at their diagnosis.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Teenage anxiety: Tailored treatment needed. BBC News, November 8 2014

Links To Science

Waite P, Creswell C. Children and adolescents referred for treatment of anxiety disorders: Differences in clinical characteristics. Journal of Affective Disorders. Published June 25 2014


Stem cells could repair Parkinson's damage

NHS Choices - Behind the Headlines - Fri, 07/11/2014 - 11:30

"Stem cells can be used to heal the damage in the brain caused by Parkinson's disease," BBC News reports following the results of new Swedish research in rats.

This study saw researchers transplant stem cells into rats' brains. These cells then developed into dopamine-producing brain cells.

Parkinson's disease is a neurological condition associated with the loss of dopamine-producing brain cells. This leads to the symptoms characteristic of the condition, such as tremor, stiff, rigid muscles, and slow movements.

Parkinson's is currently treated with medication that attempts to compensate for the loss of these cells, but it cannot replace them.

This new research demonstrated it may be possible to use stem cell-derived dopamine nerve cells to treat the condition, giving long-term functional results.

Up to six months after the cells were grafted into the brains of the rats, brain scans and functional tests showed the transplanted cells had proliferated and matured, reinnervated the brain tissue, and were producing dopamine.  

The next step would be to try to follow on from this research with clinical trials in humans.

 

Where did the story come from?

The study was carried out by researchers from Lund University in Sweden and other research institutions in France.

The research and individual authors received various sources of financial support, including the European Community's 7th Framework Programme.

The study was published in the peer-reviewed journal Cell Stem Cell on an open-access basis, so it is free to read online.

Both BBC News and ITV News gave a good representation of the research.

 

What kind of research was this?

In this laboratory study, researchers aimed to produce dopamine neurones (nerve cells) from human embryonic stem cells and graft these into a rat model of Parkinson's disease. They wanted to see if this had the potential to be used as a treatment for the disease.

Parkinson's is a neurological disease with an unknown cause, which sees a loss of the nerve cells in the brain that produce the chemical dopamine.

Loss of dopamine causes the three classic Parkinson's symptoms of tremor, stiff, rigid muscles and slow movements, as well as a range of other effects, including dementia and depression. There is no cure, and current drugs aim to try to control symptoms by treating this dopamine imbalance.

Human embryonic stem cells have the potential to develop into any type of cell in the body. Using these stem cells to replace dopamine nerve cells seems a promising area for research, and this study is the first step in investigating whether this type of treatment could one day be possible.

 

What did the research involve?

The researchers developed dopamine nerve cells from human embryonic stem cells (hESC) in the laboratory.

They then needed to see whether these cells would survive and function in the long term when grafted into brain tissue.

They transplanted these hESC-derived dopamine neurones into a rat model of Parkinson's disease, where the rats' brains were injected with a toxin to stop dopamine production.

The researchers followed the rats for six months after the cells were transplanted into their brains, carrying out various brain scans and tissue examinations to see how the cells had developed and were functioning.

They then carried out a behavioural test in the rats to see whether the transplanted cells had caused a recovery of their motor function (movement).

 

What were the basic results?

One to five months after the hESC-derived dopamine neurones had been grafted into the brains of the rats, MRI scans showed the transplanted cells had increased in volume, indicating that they were proliferating and maturing.

Further imaging was carried out using PET scans to detect a radiolabeled chemical marker that targets dopamine receptors.

Before grafting, the brains of the Parkinson's rats demonstrated a high level of binding of this chemical to the dopamine receptors, indicating that dopamine was lacking and that this marker was taking dopamine's place in the receptors.

Five months after grafting, binding of this chemical was reduced to normal levels, which indicated there was an active release of dopamine from the transplanted cells and dopamine was therefore now binding to these receptors.  

Examination of the rats' brain tissue confirmed these imaging findings, showing that the tissue was rich in dopamine neurones and that the transplanted cells had reinnervated the brain tissue.

The behavioural test also gave positive results, indicating that the transplanted hESC-derived dopamine neurones led to functional motor recovery in the rats.

 

How did the researchers interpret the results?

The researchers concluded they have "performed a comprehensive preclinical validation of hESC-derived [dopamine] neurons that fully supports their functional efficacy and capacity for long-distance, target-specific reinnervation, predictive of their therapeutic potential".

 

Conclusion

This is promising early-stage research that demonstrates it is possible to manufacture dopamine-producing nerve cells from human embryonic stem cells in the laboratory.

The cells were then transplanted into a rat model of Parkinson's disease (the rats were given a toxin that destroyed their dopamine-producing cells).

Up to six months after the cell transplant, brain scans and functional tests showed that the transplanted cells had proliferated and matured, reinnervated the brain tissue, and were producing dopamine.  

The next step is to follow on from this research with the first clinical trials in humans. The researchers say they hope they will be ready for the first clinical trial in about three years' time.

But there are several technical obstacles that need to be overcome first. Although the results indicate the transplanted cells were functioning well in the rat model at five months, as the researchers say, it is important to verify that these functional effects are robust and stable over significantly longer time periods.

Also, the rat brain is much smaller than the human brain. It would therefore need to be demonstrated that the transplanted cells have the capacity to grow nerve fibres that can reinnervate distances relevant to the size of the human brain.

This research holds promise for a future stem cell treatment that could restore the dopamine-producing nerve cells lost in people with Parkinson's disease. The next stages in this research are awaited eagerly.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Parkinson's stem cell 'breakthrough'. BBC News, November 7 2014

Major breakthrough puts scientists on path to first ever stem cell transplantations in people with Parkinson's disease. ITV News, November 7 2014

Links To Science

Grealish S, Diguet E, Kirkeby A, et al. Human ESC-Derived Dopamine Neurons Show Similar Preclinical Efficacy and Potency to Fetal Neurons when Grafted in a Rat Model of Parkinson's Disease. Cell Stem Cell. Published online November 6 2014

Categories: NHS Choices

Norovirus returns: advice is to stay away from GP

NHS Choices - Behind the Headlines - Fri, 07/11/2014 - 10:29

After Halloween and Bonfire Night, we have the return of another, much less welcome, winter tradition: the norovirus. Or, as The Times reports, “’Tis the season for winter vomiting bug”.

The body responsible for public health in this country, Public Health England, has issued a bulletin reminding everyone experiencing symptoms of norovirus to stay at home and to telephone 111 for advice, if necessary. Symptoms of norovirus typically include forceful vomiting and watery diarrhoea.  

It's vital that people don’t visit GP surgeries, hospitals, schools and care homes if they think they may be infected. For the vast majority of people, norovirus is self-limiting (it gets better by itself), but vulnerable groups, such as the elderly and people with a pre-existing illness or weakened immune system, could be at risk of complications if exposed to the virus.

 

What is norovirus?

Norovirus is the most common stomach bug in the UK, and has a peak season during the winter months (roughly around October/November to March/April). It causes vomiting and diarrhoea, with symptoms lasting around one to two days, though the severity and duration of symptoms can vary between individuals.

Norovirus is a highly contagious virus, meaning that only a few viral particles are required to cause infection. It is passed on by virus particles picked up on the hands being transferred to the mouth (e.g. through touching contaminated surfaces or eating contaminated food). It can also be spread by inhaling small airborne particles of the virus (e.g. if someone nearby has profuse vomiting).

As it is so contagious, it’s frequently the cause of outbreaks in places where people are gathered together, such as schools, childcare centres, care homes and hospital wards.

It is a self-limiting virus, meaning that symptoms will clear up by themselves and no specific treatment is effective, or needed. However, as with any diarrhoea and vomiting bug, dehydration is the main risk, particularly for vulnerable people, such as the young or elderly. Therefore, regular fluids are very important.

 

What are the main ways to prevent norovirus spreading?

As norovirus is highly contagious, the key ways of stopping the virus spreading centre on:

  • effective hand washing
  • isolation or exclusion of the infected individual (e.g. from school or work)
  • effective cleaning and disinfection of environmental surfaces (e.g. bathrooms and toilets)

Effective hand washing is a key measure. Hands should be washed at all appropriate times, such as before eating and preparing food, and after using the toilet or helping others (e.g. changing nappies).

Effective hand washing includes:

  • wetting hands under running water
  • applying soap and rubbing it in thoroughly over all hand surfaces – washing should last about as long as it takes to sing the “Happy Birthday” song
  • rinsing
  • drying hands thoroughly, ideally with disposable paper towels (reusable towels or flannels should not be shared between people)
  • using the paper towel to close the tap, so as not to re-contaminate hands

Even if gloves are worn (e.g. when cleaning up spillages of vomit), hands still need to be washed after removing the gloves.

If hand sanitiser/alcohol gel is used instead of hand washing, sufficient gel (e.g. the size of a 10p coin) needs to be applied to all hand surfaces and rubbed in for about 30 seconds. If hands are visibly soiled, they need to be washed with soap and water.

People with diarrhoea and vomiting should be sent home from work or school (or isolated if in a care home, for example) and should not return until 48 hours after symptoms have resolved.

This includes not visiting public places where people are gathered, such as GP surgeries, hospital wards or care homes. Public Health England has highlighted the risk in hospitals.

 

What does Public Health England say?

Public Health England says there were 18 outbreaks of norovirus in hospitals across England in November 2013, 17 of which led to ward closures. Data for July 2013 to June 2014 indicated that there were 610 reported hospital outbreaks over this one-year period, 94% of which led to ward closures. As Public Health England says, closures are necessary to stop the contagious virus spreading, but are also highly disruptive.

John Harris, an expert in norovirus at Public Health England, advises: “October usually marks the start of the norovirus season and the bulk of cases will occur between now and April next year.

“No two norovirus seasons are the same, and there is no way of predicting how busy a season will be. What we do know is that many people will be affected across the country and they will probably feel very unwell for a couple of days, but will get better.

“For patients already ill in hospital, this virus could cause further health complications, making it vital to prevent introducing the virus into the hospital environment.

“We strongly urge anyone affected to stay at home and to telephone NHS 111 for advice.”

 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

If you've got winter vomiting bug, stay at home: Health officials warn patients NOT to visit their GP or hospital as norovirus spreads through wards. Mail Online, November 6 2014

Norovirus: What are the symptoms of the winter vomiting bug and how do you avoid it? The Independent, November 6 2014

‘Tis the season for winter vomiting bug. The Times, November 7 2014

NHS crisis warning over killer winter bug. Daily Express, November 7 2014

Categories: NHS Choices

Fruit chemical may prevent organ damage

NHS Choices - Behind the Headlines - Thu, 06/11/2014 - 12:00

"Could fruit help heart attack patients? Injection of chemical helps reduce damage to vital organs and boosts survival," reports the Daily Mail – "at least in rodents," it should have added.

When tissues are suddenly deprived of oxygen-rich blood (ischaemia), which can occur during a heart attack or stroke, they can suffer significant damage. Further damage can occur once blood supply is restored. Until now, scientists did not know the exact cause of this damage.

Through a set of animal experiments, researchers may have now identified the cause. It could be the result of an increase in a chemical called succinate. Succinate appears to interact with the returning oxygen molecules, creating harmful molecules (reactive oxygen species) that can damage individual cells.

The researchers were able to reduce the amount of succinate produced during periods of mouse heart ischaemia and brain ischaemia by injecting a chemical called dimethyl malonate, which is found in some fruits. This in turn reduced the amount of tissue damage that occurred when the blood supply was returned to the heart and brain.

Although the potential uses are wide ranging, including the use of dimethyl malonate as a potential preventative treatment during heart attacks, stroke or surgery, it will need to be shown to be both effective and safe through human trials.

 

Where did the story come from?

The study was carried out by researchers from the University of Cambridge, St Thomas' Hospital, University College London, the University of Glasgow, and the University of Rochester Medical Centre, New York.

It was funded by the Medical Research Council, the Canadian Institutes of Health Research, the Gates Cambridge Trust, and the British Heart Foundation.

The study was published in the peer-reviewed medical journal Nature.

The study was accurately reported by the Daily Mail, although the headline was misleading – dimethyl malonate has not yet been used to improve survival in humans. It has only been used in experiments involving mice and rats.

Also, although dimethyl malonate is found in some fruit, it was the purified chemical that was used – the mice and rats were not treated with pieces of fruit.

 

What kind of research was this?

This was an animal study looking at the mechanism behind the injury that occurs to tissues when blood supply is returned after a period of ischaemia (no blood supply).

It was previously believed that tissue injury in these cases, particularly seen after a heart attack, was a non-specific response to the cells regaining oxygen.

The researchers wanted to test the hypothesis that a specific metabolic process causes the injury. And, if so, they wanted to see whether they could develop a drug to limit the process and thereby prevent the injury.

 

What did the research involve?

The researchers looked at chemicals produced in mouse kidneys, livers and hearts, and rat brains, after the animals had suffered ischaemia and had then been reperfused (had their blood and oxygen supply returned).

After identifying one chemical, called succinate, which was increased in all of the tissues studied, the researchers performed a variety of experiments on mouse hearts to investigate the metabolic pathways responsible for the increased level and tissue damage.

They then tested a chemical, dimethyl malonate, to see whether it could prevent the accumulation of succinate in mouse hearts during ischaemia, and in rat brains during ischaemia induced to mimic a stroke.

 

What were the basic results?

The chemical succinate was increased in all of the animal tissues by 3 to 19 times normal levels, and the level of succinate increased with longer periods of ischaemia. It returned to normal levels by five minutes after reperfusion.

Infusing mice with the chemical dimethyl malonate, which can act as an inhibitor of one of the enzymes that can make succinate, significantly reduced the succinate accumulation in the ischaemic heart.

It also stopped the accumulation of succinate in rats' brains during ischaemia (similar to a stroke), and reduced the amount of tissue damage and neurological disabilities.

Succinate is a chemical present in what is known as the citric acid cycle. This cycle is the series of chemical reactions used by all aerobic (oxygen using) organisms to produce energy from fats, carbohydrates and proteins. Interestingly, none of the other chemicals in this pathway were increased during the ischaemia.

Dimethyl malonate is a naturally occurring substance and has been detected in a number of fruits, such as pineapples, bananas and blackberries. It is also widely used in pharmaceuticals, agrochemicals, vitamins, fragrances and dyes.

 

How did the researchers interpret the results?

The researchers concluded they have shown how the chemical succinate accumulates during ischaemia, and that this drives the tissue injury seen when blood supply is returned in a range of rat and mouse tissues.

They found they can reduce the amount of accumulation and damage by using an infusion (injection of a solution) of dimethyl malonate. This research will now pave the way for human trials.

 

Conclusion

This exciting set of experiments has identified the metabolic driver of tissue injury seen when blood supply is returned after a period of ischaemia. The researchers have also shown this process can be limited by using an injection of dimethyl malonate in mice and rats.

It is likely the same increased metabolic processes occur in humans, so there are wide implications for the future, including the potential use of dimethyl malonate injections to prevent tissue damage during surgery.

At present it is unclear how this could be used practically during a heart attack or stroke, and this will be one of many issues that will be explored when human trials are initiated, along with the safety of this treatment.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Could FRUIT help heart attack patients? Injection of chemical helps reduce damage to vital organs and boosts survival. Daily Mail, November 5 2014

Links To Science

Chouchani ET, Pell VR, Gaude E, et al. Ischaemic accumulation of succinate controls reperfusion injury through mitochondrial ROS. Nature. Published online November 5 2014

Categories: NHS Choices

Does having a hobby help you live longer?

NHS Choices - Behind the Headlines - Thu, 06/11/2014 - 11:00

"Having a hobby can add YEARS to your life," The Daily Express reports. The headline is prompted by an international study that looked at ageing and happiness.

The study found older people who reported the greatest sense of purpose in life survived longer than those who reported having little sense of purpose, suggesting that having a meaning in life might play a role in protecting people's health.

But this study cannot prove having a hobby or other purpose in life increases the chances of surviving longer.

As the authors point out, there are many other factors involved that might have an effect on survival, including ill health and material income.

Other studies show a two-way connection between health and wellbeing. Being affected by common illnesses such as arthritis or heart disease, for example, can make it difficult to maintain a zest for life.

That said, it's obviously sensible for people to stay active as they grow older and to maintain their social activities and relationships. Having something to live for, whether it's as noble as eradicating world poverty or a little more down to earth, such as maintaining an attractive garden, could help you live longer.

Research shows people who regularly give up their time to help others, stay active, learn new things and connect with others tend to have higher reported feelings of wellbeing.

 

Where did the story come from?

The study was carried out by researchers from University College London, as well as Princeton University and Stony Brook University, which are both in the US.

Funding came from a variety of sources, including the US National Institute on Aging and several UK government departments.

The study was published in the peer-reviewed medical journal The Lancet.

Press coverage reported the study's findings uncritically and took some liberties in extrapolating the results. It would be simplistic to say – as the Express did – that "having a hobby can add YEARS to your life", as many other confounding factors are likely to be involved.

The Daily Telegraph's claim that, "pensioners with sense of purpose live two years longer than cynics" is overstating the study's findings. Cynicism was not even mentioned in the study.

BBC News took a slightly different take on the study, focusing on the global variations in happiness and how this changes over the course of a lifetime.

 

What kind of research was this?

This study is part of a Lancet series on ageing, which drew on various sources to look at the relationship between wellbeing, health and ageing.

It did not present any new evidence, but analysed findings from existing sources, such as an ongoing international poll on wellbeing and an English study of ageing.

According to the researchers, there are three different aspects to wellbeing:

  • evaluative wellbeing – or life satisfaction
  • hedonic wellbeing – feelings of happiness, sadness, anger, stress and pain
  • eudemonic wellbeing – sense of purpose and meaning in life

The researchers say subjective wellbeing is becoming a focus of intense debate in public policy and economics, with improvement in wellbeing a key aspiration.

Research suggests subjective wellbeing might even protect health, reduce the risk of chronic illness and promote longevity. Their paper summarises the present evidence linking subjective wellbeing with health in an ageing population.

 

What did the research involve?

The researchers searched online databases for relevant evidence, and included all articles published in English between January 2000 and March 2012.

For their analysis of the link between wellbeing and age in different parts of the world, they mostly drew on large-scale international surveys such as the Gallup World Poll, an ongoing survey taking place in more than 160 countries.

To look at the association between wellbeing and survival, they carried out a new analysis of an existing study, the English Longitudinal Study of Ageing (ELSA), relating eudemonic wellbeing to mortality. 

In this analysis, 9,040 people with an average age of 64.9 years were followed for an average of 8.5 years, with 1,542 deaths analysed. Eudemonic wellbeing was assessed by questionnaire on issues such as sense of control, purpose in life and self-realisation. The cohort was divided into quartiles of wellbeing and analysed for the relationship between wellbeing and survival.

 

What were the basic results?

The researchers' analysis of the ELSA found eudemonic wellbeing is associated with increased survival:

  • 29.3% of people in the lowest quartile of wellbeing died during the follow-up period of 8.5 years, compared with 9.3% of those in the highest quartile
  • after adjustment for factors such as education, health and income, the highest quartile had a 30% lower risk of dying within the study period
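As a rough illustration of how much this adjustment matters – and assuming the 30% figure corresponds to a hazard ratio of about 0.70 for the highest wellbeing quartile relative to the lowest – the unadjusted and adjusted comparisons can be put side by side:

$$\frac{29.3\%}{9.3\%} \approx 3.2 \;\text{(unadjusted)} \qquad\qquad \frac{1}{1 - 0.30} \approx 1.4 \;\text{(after adjustment)}$$

On these illustrative figures, much of the raw three-fold gap in death rates is accounted for by factors such as education, health and income – one reason to be cautious about reading the association as causal.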

They also reported on other data, which shows:

  • a U-shaped relation between life satisfaction (evaluative wellbeing) and age in high-income English-speaking countries, with the lowest levels of wellbeing in those aged 45-54, after which levels start to rise
  • this pattern is not universal – for example, respondents from the former Soviet Union, Eastern Europe and Latin America show a large progressive reduction in wellbeing with age, while wellbeing in Sub-Saharan Africa shows little change with age

They also found studies that showed how the relation between physical health and subjective wellbeing is "bidirectional".

Older people with common illnesses of ageing, such as coronary heart disease, arthritis and chronic lung disease, show both increased levels of depressed mood and impaired wellbeing.

 

How did the researchers interpret the results?

The researchers concluded that the wellbeing of elderly people is an important objective for both economic and health policy.

"Even though the results do not unequivocally show that eudemonic wellbeing is causally linked with mortality, the findings do raise intriguing possibilities about positive wellbeing being implicated in reduced risk to health," the authors conclude.

They also suggested that the U-shaped curve in wellbeing in high-income English-speaking countries – with life satisfaction at its lowest in the 45-54 age group – may be because this is the period when people are working and earning the most, at the expense of their wellbeing.

The findings about wellbeing in former Soviet Union and Eastern European countries are attributed to the recent transitions and resulting political and economic instability in these countries. Similar, if less extreme, instabilities can be seen in the Caribbean and Latin America.

The flatlining of happiness in Sub-Saharan Africa, while not explicitly discussed by the researchers, is possibly a result of the high levels of poverty, and corresponding lack of opportunities to build a better life as a person grows older.

 

Conclusion

This is an interesting paper on the important issue of wellbeing and its potential effect on health and survival. However, as the authors point out, it does not prove that wellbeing protects health and increases the chance of living longer.

The association they found could be a result of both measured and unmeasured confounders, such as ill health. Wellbeing could be a marker for underlying biological processes that are responsible for the effect on survival.

There are likely to be bidirectional effects at work. Some people with poor health become unhappy, while others who are unhappy become physically unwell.

That said, it's sensible for people to stay active as they grow older, and maintain their social activities and relationships. Eating well, exercising regularly and maintaining a healthy lifestyle are also advised.

Read more about how to be happier.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

How you can add years to your life: Major lifestyle changes can combat killer diseases. Daily Express, November 6 2014

Pensioners with sense of purpose live two years longer than cynics. The Daily Telegraph, November 6 2014

Happiness 'dips in midlife in the affluent West'. BBC News, November 6 2014

Here's Where People Are Happiest Growing Old. Time, November 5 2014

Links To Science

Steptoe A, Deaton A, Stone AA. Subjective wellbeing, health, and ageing. The Lancet. Published online November 6 2014

Categories: NHS Choices

Kellie Maloney: 'My 60-year secret'

NHS Choices - Live Well - Thu, 06/11/2014 - 00:00
Kellie Maloney: 'My 60-year secret'

Retired boxing promoter Kellie Maloney, formerly Frank, talks about her dreams, diaries and secret dress-ups, and why it took her 60 years to come out.

Las Vegas, November 1999. Lennox Lewis has just beaten Evander Holyfield to become Britain's first undisputed heavyweight champion of the world in more than 100 years.

After the ringside celebrations among 6,000 delirious British fans, Lewis' promoter Frank Maloney retires to his hotel room, where the darkness engulfs him.

He should be feeling on top of the world. He's just masterminded the greatest success story of British boxing in a century. It should be the crowning glory of his career.

But Frank has a secret. A secret so huge he has never been able to whisper a word of it to anyone for fear of losing everything.

It will be another 15 years before Frank reveals his lifelong secret to the world: that he is a woman trapped in a man's body.

By the time of the announcement in August 2014, Frank had been living as Kellie for a year and was preparing to have gender reassignment surgery.

One of the main reasons Kellie didn't come out earlier is because of her father, Tom, a former railway worker, who had been so proud of all his son had achieved after leaving school aged 15.

"I had his respect," she says. "He put me on a pedestal. I could never hurt him." Looking back, Kellie now believes her father's death from cancer aged 87 may have been a tipping point.

'Living with the burden any longer would have killed me'

Kellie Maloney

"I couldn't tell him, even on his death bed," she says. But his passing in 2009 meant Kellie no longer had to live up to her father's expectations. "I couldn't disappoint him anymore," she says.

Kellie had planned to complete her transformation in private, but was forced into going public after a national newspaper threatened to out her.

Buoyed by the public's positive reaction, Kellie has embraced her new-found status as Britain's most famous trans person to help raise awareness about transgender issues.

She followed a string of newspaper and TV interviews with a three-week appearance on Channel 5's Celebrity Big Brother.

'I always knew'

At 61, and after nearly 60 years of corrosive silence, she says: "Living with the burden any longer would have killed me. I always knew … from the age of three or four. I didn't know what it was, but I didn't associate with what I saw in the mirror."

But growing up in 1950s Peckham, south London, with working class Irish Catholic parents, being different was not an option.

"I just wanted to be a normal boy," she says. "In those days, you'd get called names just for having a speech impediment or having ginger hair."

So Kellie tried "extra hard to be accepted as one of the boys", dating pretty girls and getting into sports such as athletics, football and boxing.

But as hard as she tried to be a Jack the lad and fit in, deep down she still wanted to be a girl, to dress and act like them.

"I have a female brain," she says. "I knew I was different from the minute I could compare myself to other children. I wasn't in the right body. I was jealous of girls.

"If I saw a girl and she looked really nice, I'd wonder how I would look if I was wearing that. Then I'd try to distract myself and stop thinking about it."

The only place Kellie could be herself was in her dreams. "In every dream I've ever had I'm a girl," she says. "At first I used to think I was dreaming of someone else."

The childhood dreams never went away, and became more and more vivid as Kellie got older. "I was living as the real me in my dreams," she says.

She has always kept a diary, even at the height of her power and influence in the boxing world, where the suppressed Kellie could come out for some air.

"Everything you need to know about Kellie is on those pages," she says. "The diary helped me get things off my chest. It was like therapy."

Secret dress-ups

With nobody to share her secret with, Kellie found strength from reading about other people's experiences in transgender publications and websites.

She remembers reading the stories of Christine Jorgensen and April Ashley, who were among the first transsexuals to have gender reassignment surgery in the 1950s. "It just didn't seem a realistic possibility to me," she says.

To satisfy her desire to be Kellie, if only for a while, she attended private dressing up sessions in shops off high streets and down back roads in Dublin and Manchester.

"You could get a makeover for about £300 and stay there for three to four hours," she says. "For a few hours I was able to be the real me."

'Boxing was a great distraction'

Then it was back to the macho world of boxing, where Kellie had built a reputation as a hard-nosed Cockney geezer famous for wearing Union Flag suits in the ring.

"Boxing was a great way to distract myself from my thoughts," she says. "I was totally absorbed. I buried myself in my work, 24/7."

On fight nights, Kellie would sometimes leave her team in the locker room and enter the thronging arena to allow herself one little indulgence.

"I would try to imagine what it would be like to be one of those glamorous women in the audience," she says. "But I'd head back to the locker room and focus on the boxing."

Away from the boxing ring, Kellie tried to conform to what society expected. She says she genuinely found love, married twice, and has three daughters.

'I'm a woman, and my transition won't be complete until my body matches my mind'

Kellie Maloney

"I thought being in love and marrying would beat what was going on inside of my head, but the inner fight in me made it harder to keep going as the years went by," she says.

Marriage break-up

Kellie's secret was slowly tearing her apart. Struggling with depression, she began drinking heavily and tried to shut herself away from the world and her family.

"My life was spiralling out of control," she says. "I was very unhappy. My temper was getting worse."

She turned to counselling and says the help she received over the phone and in face-to-face sessions over the last 12 years "has helped me enormously to come to terms with myself".

She recalls having an angry exchange with one counsellor. "All I wanted was for him to tell me that I wasn't transgender, but he said I needed to accept myself if I want to live a normal life."

Kellie is still in touch with some of the organisations that helped her, including transgender support group TG Pals, and is keen to support their work.

Her father's death, the suicide of boxer Darren Sutherland, and her own health – she had a heart attack watching a boxing match – all weighed heavily on her decision to reveal all to her wife in 2009.

"Tracey was the first person I had told outside counsellors," she says. "She swore she would never tell anyone. She would've taken my secret to the grave to protect our girls."

The couple tried in vain to rekindle their relationship. Following the breakdown of their marriage, Kellie tried to take her own life on Christmas Day in 2012 – her second attempt in recent years.

"It got to a point where I just couldn't cope any longer," she says. "I couldn't go on, and I had to start living the life of the person I should've been born as."

She revealed all to the rest of her family before the story went public. After the shock and tears, they have mostly come out in support of her, although her daughters are unlikely to stop calling her "Dad".

"I can't say enough about my 81-year-old mother," she says. "She told me that she had always known I was different from my brothers and that at last she could see why.

"I think it stems from her that the family has accepted it, because she made the point of telling them. Their support has given me the confidence to go out in the world and be myself."

Gender surgery and identity

Over the past two years, Kellie has received hormone therapy, hair removal electrolysis, voice coaching and specialist counselling.

The final phase of her transformation will involve "the realignment of my male genitalia to become female genitalia", as well as having breast implants and facial surgery – all done privately.

She says her transition was about gender identity and not about sexual orientation.

Why the name Kellie? "It's always been Kellie," she says. "It's short and sweet. I'm dyslexic and Kellie is easy to pronounce and spell."

"Until I have the operation I feel like half a person," she says. "I'm a woman and my transition won't be complete until my body matches my mind," she says.

"I don't want to be labelled," she says. "I'm not a transsexual woman. I'm a human being. All that is being done is to correct a mistake at birth."

And where's Frank? "Frank is still a part of me, but the roles have been reversed," she says. "Kellie used to be a small part of Frank, but now Frank is a small part of Kellie."

Categories: NHS Choices

Smoking 'increases risk of chronic back pain'

NHS Choices - Behind the Headlines - Wed, 05/11/2014 - 12:00

"Smokers are three times more likely to suffer from back pain," the Mail Online reports. The headline was prompted by the results of a recent study, which involved observing 68 people with sub-acute back pain (back pain lasting for 4 to 12 weeks with no back pain in the previous year) over one year.

The participants completed repeated questionnaires about their level of back pain and had four functional MRI brain scans over the course of the year.

Smokers were three times more likely to develop chronic back pain. They were also more likely to have increased activity in the brain pathways implicated in addiction (between the nucleus accumbens and the medial prefrontal cortex).

The researchers speculate this increased activity may also increase the risk of chronic pain developing. This increase in activity reduced in a small number of people who stopped smoking.

As this was an observational study, it cannot prove that the increased brain pathway activity or smoking caused the back pain to become chronic, but it does indicate they may be linked in some way.

Even if you don't suffer from back pain, there's no excuse not to try to quit smoking. It can cause lung cancer and heart disease, and increase your risk of a stroke – all of which can be fatal.

 

Where did the story come from?

The study was carried out by researchers from the Feinberg School of Medicine in the US, and was funded by the US National Institutes of Health.

It was published in the peer-reviewed medical journal, Human Brain Mapping.

The study was generally reported accurately by the Mail Online, although it didn't emphasise that the findings were only based on 68 people.

Similarly, the study was about how smoking influenced the risk of people moving from experiencing sub-acute back pain to chronic back pain, but this subtlety seemed to be lost.

Based on the headlines, readers may get the wrong impression that the study was about developing back pain full stop.

Also, the Mail's claim that "quitting can ease symptoms" – while well meaning – is unsupported by the evidence of this study.

 

What kind of research was this?

This was a longitudinal study looking at the potential relationship between developing chronic back pain and smoking tobacco.

Previous research suggested the brain pathways involved in addiction are also related to those implicated in the development of chronic pain.

The researchers aimed to test the theory that people with new-onset back pain would be more likely to develop chronic back pain if they were smokers.

As this was a type of observational study, it cannot prove smoking causes a transition to chronic back pain, but it can show potential links that can be tested in more rigorous studies in the future.

It is often difficult to tease out the precise relationship between smoking and chronic back pain. Smokers tend to be unhealthy in other ways, such as not taking very much exercise, so this could also have a confounding effect.

 

What did the research involve?

The year-long study involved participants completing well-validated questionnaires about:

  • pain (McGill short form)
  • depression (Beck's Depression Inventory)
  • positive or negative feelings and emotions (Positive and Negative Affect Schedule, PANAS)
  • demographic information, including smoking status

After an initial visit, participants were assessed on four more occasions during the year using further questionnaires. They also had their brains scanned using functional MRI scans, which can – at least to a certain extent – measure brain activity.

Three groups of people were included in the research. The first and largest group consisted of 160 people with sub-acute back pain, defined as back pain lasting for 4 to 12 weeks with no back pain in the previous year. Of these, 123 were recruited to the study and 68 people completed follow-up after one year.

The second group included 32 people with chronic back pain for more than five years, of whom 24 completed the study. The third group of 33 people was considered to be the control group. These people had no back pain, and 19 completed the study.

For all groups, the researchers analysed whether smoking was linked to their back pain.

 

What were the basic results?

Of the 68 people with sub-acute back pain, 31 were considered to be recovering according to a pain decrease of at least 20% after one year (six of these were smokers and 25 were non-smokers). The other 37 had persistent pain (16 smokers and 21 non-smokers).

Those with persistent pain were three times more likely to be smokers than those who recovered (odds ratio [OR] 3.17, 95% confidence interval [CI] 1.05 to 9.57), despite having similar levels of initial back pain.
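For readers wondering where the 3.17 comes from: assuming the unadjusted odds ratio was calculated directly from the counts above (16 smokers and 21 non-smokers with persistent pain, against 6 smokers and 25 non-smokers who recovered), the arithmetic is:

$$\text{OR} = \frac{16/21}{6/25} = \frac{16 \times 25}{21 \times 6} = \frac{400}{126} \approx 3.17$$

The width of the confidence interval (1.05 to 9.57) reflects how small these numbers are – another reason to treat the three-fold figure with caution.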

They were also more likely to have increased activity in the brain pathways implicated in addiction (between the nucleus accumbens and the medial prefrontal cortex).

In nine participants with sub-acute back pain or chronic back pain, this brain pathway activity reduced after they quit smoking, but it is unclear what effect this had on their back pain.

Smoking also did not appear to relieve pain, as smokers did not have reduced back pain intensity either at baseline or after one year compared with non-smokers, and back pain did not increase when people stopped smoking.

At baseline, people with sub-acute back pain and chronic back pain were more likely to be smokers than controls. And the pain was also likely to have a negative impact on their mood, according to higher scores on the Beck Depression Inventory and negative PANAS scores.

 

How did the researchers interpret the results?

The researchers concluded that, "Smoking increases risk of transitioning to CBP [chronic back pain], an effect mediated by corticostriatal circuitry involved in addictive behaviour and motivated learning."

 

Conclusion

This longitudinal study found sub-acute back pain was three times more likely to progress to persistent back pain in smokers.

The researchers presented functional MRI findings, which indicated brain pathways that could be involved in this process. But further research will be required to fully understand the mechanisms at play.

The study did not find that smoking provided any pain relief, and indeed the pain intensity did not increase for those people who stopped smoking.

The study sample was quite small, meaning the results may not be applicable to larger and more diverse groups of people. As such, the results are not conclusive and should not be taken at face value.

The general advice for the early management of lower back pain is:

  • to continue normal activities as far as possible
  • to stay physically active and exercise within your capabilities
  • if medication is required, start with paracetamol, then consider other options, such as non-steroidal anti-inflammatory drugs (NSAIDs) like ibuprofen, with appropriate stomach protection

While this research is not conclusive, there are many health benefits associated with stopping smoking that have a large and robust evidence base, such as a reduced risk of lung cancer and heart disease.

Read more advice about effective methods known to help many smokers quit.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Smokers are three times more likely to suffer from back pain - but quitting can ease symptoms. Mail Online, November 4 2014

Links To Science

Petre B, Torbey S, Griffith JW, et al. Smoking increases risk of pain chronification through shared corticostriatal circuitry. Human Brain Mapping. Published online October 12 2014

Categories: NHS Choices

'Elite controllers' may provide clues for HIV cure

NHS Choices - Behind the Headlines - Wed, 05/11/2014 - 11:50

“Scientists have uncovered the genetic mechanism which appeared to have led two HIV-infected men to experience a 'spontaneous cure’,” the Mail Online reports.

The men are what is known as “elite controllers”: people thought to have high levels of immunity against the virus, as they do not develop any symptoms of HIV, despite going untreated.

Both men had no trace of HIV in blood tests that are normally used to detect the virus, but did have the virus in their DNA. This study found that there had been a mutation in the virus, meaning it was unable to replicate. This mutation may have been caused by an increase in an enzyme called APOBEC, which is usually inhibited by HIV infection.

Current treatment of HIV involves taking anti-viral medication to keep the spread of the virus (viral load) minimised, so it doesn’t cause any symptoms. This treatment regime has been remarkably successful – one of the great achievements of modern medicine. However, the major drawback is that the person has to take drugs every day. This research may offer the possibility of modifying the HIV virus to make it harmless – thus achieving, to all intents and purposes, a complete cure.

The researchers now wish to determine if this same mutation is present in other people who appear to have immunity to HIV infection.

  

Where did the story come from?

The study was carried out by researchers from Aix-Marseille University, hospitals in Marseille, the University of Paris Est, and the Vaccine Research Institute in Créteil. The study was reported to be funded internally and the authors declare no conflicts of interest.

The study was published in the peer-reviewed medical journal Clinical Microbiology and Infection.

The Mail Online’s reporting of the study was accurate and provided useful insights from the study authors.

 

What kind of research was this?

This was a case study of two people who were HIV positive, but with no symptoms. These people are known as “elite controllers” as they seem to have an innate immunity against infection.

It's difficult to estimate exactly how common elite controllers are as, by their very nature, they remain free of symptoms, so they often go undiagnosed. They usually only come to light if a sexual partner or fellow drug user contracts HIV, so they are offered testing. The current best guess is that fewer than 1 in 100 people have this immunity.

The researchers aimed to study the immune response of these two elite controllers to HIV infection, to understand why they have been asymptomatic.

 

What did the research involve?

The researchers investigated the DNA of two men who had been diagnosed with HIV in 1985 and 2011, but who had no HIV-related disease and no detectable HIV in their blood on routine tests.

They performed laboratory tests to investigate how the HIV had been incorporated into the host DNA without replicating. They also looked at the individuals’ immune responses to the HIV.

 

What were the basic results?

Viruses isolated from both men were inactive, meaning they could not spread in the men’s bodies and cause illness. This was due to a change in the genetic code of the virus, which effectively stopped it from replicating. For the geneticists out there, this involved many transformations of tryptophan codons into stop codons.

This transformation causes problems for the virus, as it can’t make the proteins it needs correctly. A specific enzyme called APOBEC makes this change, and this enzyme is usually inhibited by HIV infection.

The researchers speculated that, in these individuals, APOBEC might have been stimulated when they were first infected.

 

How did the researchers interpret the results?

The researchers concluded that their “findings, which warrant further confirmation, are a first step in understanding the resistance to retroviruses. They may allow us to figure out the endogenisation [internal development] of retroviruses and detect resistant patients, as well as to initiate strategies of imitations of these patients to cure or prevent AIDS”. In other words, they suggest a new strategy for HIV treatment that does not require full eradication of the virus; instead, the virus would remain incorporated in the DNA but be inactivated. This is a new way of thinking.

 

Conclusion

This interesting research has found a likely reason for the apparent immunity of two men to HIV infection: changes in the genetic code of the virus – tryptophan codons becoming stop codons – that stop it from replicating. The researchers now wish to reproduce their findings by looking at samples from other people who appear to be resistant to HIV infection. The next step in the quest for a cure would be to determine how to replicate this genetic switch in people without natural resistance.

The results of the study have no immediate treatment implications, but do increase understanding of the virus and the disease, so could aid development of future treatments.

While current anti-viral treatment is effective, it usually requires a person to take drugs for the rest of their life, which can sometimes cause unpleasant side effects.

Therefore, a drug that could achieve a complete cure by stopping the HIV virus replicating would have a significant positive impact for people living with HIV.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Scientists uncover the secret of the 'elite controllers' who can spontaneously 'cure' HIV - and say it could lead to new treatments. Mail Online, November 4 2014

Links To Science

Colson P, Ravaux I, Tamalet C, et al. HIV infection en route to endogenization: two cases. Clinical Microbiology and Infection. Published online November 4 2014

Categories: NHS Choices

Short height 'linked to dementia death risk'

NHS Choices - Behind the Headlines - Tue, 04/11/2014 - 11:50

"Short men more likely to die from dementia," The Daily Telegraph reports, though the results of the study it reports on are not as clear cut as the headline suggests.

Researchers combined the results of 18 surveys, which included more than 180,000 people. They aimed to see whether reported height was associated with deaths from dementia over 10 years of follow-up.

They found decreasing height was associated with higher rates of death from dementia. Each standard deviation decrease in height was associated with a 24% increase in risk of dementia death for men, and a 13% increase for women. This was after adjustment for factors such as age and smoking.

However, there are important limitations to consider. Despite the large cohort size, only 0.6% of the cohort died from dementia. These are small numbers on which to base any analysis.

Also, despite the trend, none of the smaller height categories were associated with a significantly increased risk of dementia death.

So, for both men and women, the smallest people in the study did not have a significantly increased risk of dementia death when compared with the tallest.

This means the association seen between height and dementia death isn't entirely convincing.

 

Where did the story come from?

The study was carried out by researchers from the University of Edinburgh, University College London, and the University of Sydney.

The Health Survey for England is part of a programme of surveys commissioned by the UK NHS Health and Social Care Information Centre.

Other surveys have been carried out since 1994 by the Joint Health Surveys Unit of the National Centre for Social Research, and the Department of Epidemiology and Public Health at University College London.

A number of other funding sources are also acknowledged. No conflicts of interest were reported.

The study was published in the peer-reviewed British Journal of Psychiatry.

The UK media took the reported results at face value without considering the limitations of this research. That said, all news sources that reported on the study took pains to emphasise that shortness in itself is very unlikely to cause dementia.

 

What kind of research was this?

This was a meta-analysis of data collected from participants as part of English and Scottish health surveys. It aimed to investigate the association between height and death as a result of dementia.

A meta-analysis aims to summarise the evidence on a particular question from multiple related studies.

The researchers say height is a marker of early-life illness, adversity, nutrition and psychosocial stress, and that these characteristics influence brain development, which may then affect dementia risk.

As this study is based on observational data, it cannot prove cause and effect. Its limitations include the inability to adjust for all possible confounders that may be influencing the results.

Also, deaths as a result of dementia were identified through the use of death certificates, which have not always clarified the type of dementia or whether it was directly involved in a person's cause of death. As it is primarily a disease of ageing, many people die with dementia rather than of dementia.

 

What did the research involve?

The researchers performed a meta-analysis of 181,800 participants from the Health Survey for England for the years 1994 to 2008, and the Scottish Health Survey for 1995, 1998 and 2003.

As part of the health surveys, participants were visited by a trained interviewer, who measured their height and weight. Participants were also asked about their:

  • occupation
  • age on leaving full-time education
  • ethnic group
  • smoking status
  • whether they suffered from a longstanding illness

They were subsequently visited by a nurse, who measured their blood pressure and took a blood sample to measure their cholesterol levels.

Each participant was linked to the UK NHS death register. Researchers reviewed death certificates to look for International Classification of Diseases (ICD) codes related to dementia.

In their analyses, they considered any mention of dementia on the death certificate (it may not always have been the direct cause of death). 

The researchers looked at the association between height and death from dementia, controlling for age, gender and the other factors they had information about.

 

What were the basic results?

Increasing height was generally associated with a more favourable risk factor profile in both men and women.

Taller study members were younger, from higher socioeconomic backgrounds, had slightly lower body mass index, a lower prevalence of longstanding illness, and lower blood pressure and serum cholesterol levels. Taller men were also less likely to smoke, but the reverse was true of women.

During an average follow-up of 9.8 years, there were 17,533 deaths, of which 1,093 (0.6% of the cohort) were dementia related (426 men and 667 women).

Overall, there was a 24% increased risk of dementia death per standard deviation decrease in height in men (corresponding to 7.3cm; hazard ratio [HR] 1.24, 95% confidence interval [CI] 1.11 to 1.39) and a 13% increased risk of dementia death in women (corresponding to 6.8cm; HR 1.13, 95% CI 1.03 to 1.24).

These results show the association was stronger in men than women. This overall trend for increasing dementia risk with each standard deviation decrease in height was significant for both men and women.
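To see what a “per standard deviation” hazard ratio implies across larger height differences – assuming the usual log-linear proportional hazards model, in which per-unit hazard ratios multiply – a man two standard deviations (about 14.6cm) shorter would have:

$$\text{HR} = 1.24^{2} \approx 1.54$$

that is, roughly a 54% higher risk of dementia death. This is an illustrative extrapolation from the reported per-SD figure, not a number reported in the paper.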

However, when comparing the tallest height category with each of the three smaller height categories, none were associated with a significantly increased risk of dementia compared with the tallest – in other words, for both men and women, the smallest people in the study did not have an increased risk of dementia when compared with the tallest.

 

How did the researchers interpret the results?

The researchers concluded that, "early-life circumstances, indexed by adult height, may influence later dementia risk."

 

Conclusion

This study has combined the results of 18 health surveys for England and Scotland involving more than 180,000 people.

They found, overall, each standard deviation decrease in height was associated with an increased risk of death from dementia, with the trend slightly stronger for men than for women.

But this study has important limitations to consider. Despite the large cohort size, only 0.6% of the cohort (426 men and 667 women) died from dementia, as identified by documentation on their death certificate. These are small numbers on which to base analyses, particularly when further subdividing by gender and height category.

Though there was an overall trend for increasing risk with each standard deviation decrease in height, none of the smaller height categories were associated with a significantly increased risk of dementia death for either men or women, when compared with the tallest. Therefore, the association between height and dementia death isn't as clear cut as the media reporting implied.

The researchers considered dementia deaths to be any mention of dementia on the death certificate. We don't know from this study what the specific type of dementia was (Alzheimer's or vascular dementia, for example).

We also don't know that this was necessarily the direct cause of death. It could be the case that the person with dementia died from other causes. It is also possible the results are being influenced by confounding.

As the researchers consider, it is unlikely that height itself is a risk factor for dementia. It is more likely that decreased height could be a marker of other exposures, such as socioeconomic circumstances, nutrition, stress and illness during childhood.

This study did adjust for various factors, such as age, smoking, BMI, socioeconomic status and long-term illness, but the researchers would not have been able to take into account all the factors that could be influencing the relationship.

Overall, people with a shorter stature should not be too concerned by this study. The causes of dementia – in particular Alzheimer's, the most common type – are not clearly established.

Improving your cardiovascular health (keeping the flow of blood to your brain and heart well regulated) is probably the most effective step you can take to reduce your dementia risk.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Short men more likely to die from dementia, Edinburgh University finds. The Daily Telegraph, November 3 2014

Men shorter than 5ft 6ins are '50% more likely to die from dementia than those over 5ft 10ins'. Daily Mail, November 3 2014

Shortness can increase the risk of dementia. Daily Express, November 4 2014

Links To Science

Russ TC, Kivimäki M, Starr JM, et al. Height in relation to dementia death: individual participant meta-analysis of 18 UK prospective cohort studies. British Journal of Psychiatry. Published online November 3 2014

Categories: NHS Choices

Shift work 'ages the brain', study suggests

NHS Choices - Behind the Headlines - Tue, 04/11/2014 - 11:25

“Shift work dulls your brain,” BBC News reports. In a French study, researchers assessed 3,232 adults using a variety of cognitive tests, comparing people who reported never having done shift work for more than 50 days per year with those who had. They then analysed the results by the number of years of rotating shift work performed and by how long ago the shift work had stopped.

They estimated that working shifts for 10 years or more "aged" the brain by 6.5 years. They also estimated that it takes at least five years of non-shift working to reverse the effects, though this was not based on individuals' recovery of cognitive abilities. It was based on a snapshot comparing people who had stopped shift work more than five years before with people who were currently doing shift work or had never done shift work.

The study did not prove shift work causes cognitive decline, as it did not take into account people’s baseline cognitive ability.

It is also unknown whether the small, observed differences in cognitive performance scores would have had any meaningful difference in terms of daily life and functioning.

So if you are reading this on a break during your night shift, you should not be overly concerned.

 

Where did the story come from?

The study was carried out by researchers from the University of Toulouse, Swansea University, Stockholm University, the Université Paris Descartes and the University of Monaco. It was funded by several French national organisations and the UK Institute of Occupational Safety and Health.

The study was published in the peer-reviewed medical journal Occupational and Environmental Medicine.

The UK media reported the findings accurately. However, the reports did not make clear that, although the participants were assessed on three occasions, the analysis of recovery was based on only one time point. This means the study does not prove that an individual will recover their cognitive abilities after stopping shift work. Media reports also did not make it clear that the differences seen could have been due to natural abilities, rather than shift work.

 

What kind of research was this?

This was a cohort study that aimed to assess the impact of shift work on mental ability. As it was a cohort study, it is useful to look for associations; however, it cannot prove causality as it does not take all other factors into account.

 

What did the research involve?

In 1996, 3,232 adults aged 32, 42, 52 or 62 years old were randomly recruited from French registries of salaried or retired workers. They completed questionnaires, had a clinical examination and performed a variety of well-validated cognitive tests, such as being asked to read 16 words three times and then immediately reciting the list from memory.

The results of these tests were pooled to provide scores for global cognitive performance, memory and processing speed on a scale of 0 to 100, with higher scores indicating better performance. Participants were invited to have similar tests five and 10 years later. A total of 1,197 people attended on all three occasions.

The participants were also asked if their work involved any of the following types of shift work for more than 50 days per year, with responses categorised as either "current", "past" or "never":

  • rotating shift work (for example, alternating morning, afternoon and night shifts)
  • schedules that did not allow them to go to bed before midnight
  • work requiring them to get up before 5am
  • work preventing them sleeping during the night (night work)

The researchers also calculated the amount of exposure to rotating shift work and analysed whether longer duration of this type of shift work had any effect on the cognitive test scores. They grouped the participants according to:

  • never worked rotating shifts
  • 10 years or less
  • more than 10 years

Finally, they analysed whether the scores differed between people who were currently doing rotating shift work, those who had stopped less than five years before, those who had stopped more than five years before, and people who had never done shift work.

They performed statistical analyses to take into account the following confounders:

  • age
  • gender
  • socioeconomic position
  • sleep problems
  • perceived stress
  • alcohol consumption
  • tobacco consumption

 

What were the basic results?

At baseline, the 1,635 people who reported never having done shift work for more than 50 days per year had higher average global cognitive performance scores compared to 1,484 people who had experienced shift work (56.0 compared to 53.3). This difference remained the same at each time point in the study. They also had slightly better memory scores (50.8 versus 48.5) and speed processing scores (78.5 versus 76.5).

The global cognitive performance scores were highest in the group aged 32 (59.6) and lowest in the group aged 62 (47.7).

People with more than 10 years of exposure to rotating shift work had poorer cognitive scores compared to those who had never worked rotating shifts. They compared the figures with the difference seen by age group at baseline and concluded that more than 10 years of rotating shift work was equivalent to 6.5 years of age-related decline. A similar difference was seen for the memory score, but not the speed processing score.
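
The "years of age-related decline" equivalence comes from comparing the score gap between exposure groups with the score gap seen between age groups. A rough back-of-the-envelope sketch of the idea, assuming a linear decline with age (the paper's actual modelling is more sophisticated):

```python
# Baseline global cognition scores by age group, from the study
score_at_32, score_at_62 = 59.6, 47.7

# Approximate age-related decline per year, assuming it is linear
decline_per_year = (score_at_32 - score_at_62) / (62 - 32)   # ~0.4 points/year

# Baseline score gap between never-shift-workers and shift workers
score_gap = 56.0 - 53.3                                      # 2.7 points

# Express the gap as an equivalent number of years of ageing
print(f"~{score_gap / decline_per_year:.1f} years of age-related decline")  # ~6.8
```

Using the baseline figures quoted above, this crude calculation lands close to the 6.5 years the researchers report for more than 10 years of rotating shift work.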

There were no significant differences in cognitive scores for people with 10 years or less exposure to rotating shift work compared to those who never had worked rotating shifts.

People who were currently working rotating shifts had an equivalent of 5.8 years of age-related decline, and people who had left within the past five years had an equivalent of 6.9 years of age-related decline compared to those who never had worked rotating shifts.

In contrast, those who had left rotating shifts more than five years before had no difference in cognitive tests compared to those who had never worked rotating shifts.

 

How did the researchers interpret the results?

The researchers concluded that, “exposure to shift work was associated with a chronic impairment of cognition; the association was highly significant for exposures to rotating shift work exceeding 10 years (with the exception of the speed scores among non-executive participants) and the recovery of cognitive functioning after having ceased any form of shift work took at least five years (with the exception of speed scores)”. They also say, “the current findings highlight the importance of maintaining a medical surveillance of shift workers, especially of those who have remained in shift work for 10 years or more”.

 

Conclusion

The researchers conclude that, “shift work was associated with impaired cognition”, but as this was found at the start of the study, it cannot prove that shift work was the cause. It is possible that people who performed shift work differed in baseline cognitive ability from those who didn’t, which may be related to various other factors (such as educational attainment). To prove cause and effect, the study would need to assess cognitive ability in individuals before any exposure to shift work.

Further limitations of this study include that, in each analysis, the control group considered never to have been exposed to shift work may actually have had up to 50 days of shift work per year. A more rigorous criterion for the control group, such as no days of shift work per year, may have been more useful.

It's not possible to draw firm conclusions about the cause of the association seen, as such a wide range of shift work patterns was grouped together. The type of shift work undertaken is also not known (for example, whether in a professional or more manual occupation).

The conclusion that cognitive function recovers five years after stopping rotating shifts is also not proven by this study. The researchers performed this section of the analysis using the information obtained at baseline only. They did not compare the cognition of individuals during periods of rotating shift work with their cognition five years after stopping. They compared people who had stopped with people who were still doing rotating shifts. Therefore, this analysis does not take into account their natural cognitive abilities.

Finally, it is not known whether the small differences in cognitive functioning, memory and processing scores observed between shift workers and day workers would have made any meaningful difference to a person's daily life and functioning.

Overall, this study demonstrates an association between shift work and poorer cognitive function scores, but it did not prove that shift work was the cause.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Shift work dulls your brain – report. BBC News, November 4 2014

Shift work could be affecting your mental ability, scientists claim. The Independent, November 4 2014

Long-term shift work ages the brain: More than a decade 'knocks six years off memory and thinking skills'. Mail Online, November 4 2014

Long-Term Shift Work Ages Brain, Study Finds. Sky News, November 4 2014

Links To Science

Marquié J, Tucker P, Folkard S, et al. Chronic effects of shift work on cognition: findings from the VISAT longitudinal study. Occupational & Environmental Medicine. Published online November 3 2014

Categories: NHS Choices

Weight loss surgery cuts diabetes risk in very obese

NHS Choices - Behind the Headlines - Mon, 03/11/2014 - 11:55

“Weight loss surgery can dramatically reduce the odds of developing type 2 diabetes,” BBC News reports.

The underlying research identified a group of 2,167 obese adults without diabetes, the majority of whom were severely obese, with a body mass index (BMI) of 40 or above.

This group had undergone weight loss surgery, so researchers compared them with a comparison group matched for age, sex and BMI, who did not have surgery. They looked at the development of type 2 diabetes in both groups.

Using the maximum follow-up period in the study (seven years), they found that the “surgery group” had an 80% reduced risk of developing diabetes compared with the “no surgery” group.

These findings are mainly applicable to those with a very high BMI (over 40). Results at lower BMIs (30 to 35) were still positive, but did not reach statistical significance.

It's important to stress that weight loss surgery is no magic bullet and is associated with both short- and long-term risks and complications, such as unsightly excess skin.

Regardless, the results are consistent with current English guidelines, which recommend offering weight loss surgery to people with a BMI of 40 or more if a number of additional conditions are fulfilled. People with a BMI of 35 to 40 can also be offered weight loss surgery if they have other medical conditions that are compounded by obesity. 

Read more about who is eligible for weight loss surgery on the NHS.

 

Where did the story come from?

The study was carried out by researchers from London-based university and hospital departments, and was funded by the UK National Institute for Health Research.

The study was published in the peer-reviewed medical journal The Lancet – Diabetes & Endocrinology. The study has been made available on an open access basis, so is free to read online.

Both the BBC and the Daily Express reported the study accurately.

 

What kind of research was this?

This was a (matched) cohort study in a large group of obese individuals, assessing the effect of weight loss surgery (also called bariatric surgery) on the risk of developing type 2 diabetes.

Cohort studies have the ability to give an indication of cause and effect, but not direct proof. Common limitations of such a study design include high dropout rates, and the possibility of confounding – that there are other differences between the people with the different exposures that are influencing the outcomes.

That said, due to the size of the reduction in relative risk in the surgery group, it would be surprising if surgery did not have at least some influence on the study's outcomes.

 

What did the research involve?

The research team recruited two groups of closely matched obese adults: one group underwent weight loss surgery and one group didn't. They then analysed whether the surgery influenced if they went on to develop type 2 diabetes over the following seven years.

The study recruited adults (aged 20 to 100 years) identified from a UK-wide database of family practices, who were obese (BMI of 30kg/m² or above) and did not have diabetes.

They enrolled 2,167 patients who had undergone weight loss surgery between Jan 1 2002 and April 30 2014 and matched them according to BMI, age, sex, index year and a blood glucose measure for diabetes (HbA1c) with 2,167 controls who had not had surgery. Weight loss surgical procedures included:

  • laparoscopic gastric banding (n=1,053)
  • gastric bypass (n=795)
  • sleeve gastrectomy (n=317)

In two people, procedures were undefined.

The main outcome the team were interested in was a new clinical diagnosis of diabetes, which was extracted from electronic health records.

 

What were the basic results?

The group reported that they found a reduction in diabetes risk in both men and women due to surgery, across age groups, and after different types of surgical procedures.

The average BMI for both groups was 43 – well above the minimum threshold level for obesity (30). People who had bariatric surgery were more likely to have high blood pressure or cholesterol, and to be treated with medications for these conditions.

Maximum follow-up was seven years after surgery; however, most people were followed up for less time. The average (median) follow-up was 2.8 years (interquartile range: 1.3 to 4.5 years).

By the end of the maximum seven-year follow-up period, 4.3% (95% confidence interval (CI) 2.9 to 6.5) of the weight loss surgery group had developed diabetes, compared with 16.2% (13.3 to 19.6) in the matched control group. This analysis took into account the time between surgery and diabetes, so gives different figures from the above.

This meant that the number of newly diagnosed diabetes cases (incidence) was significantly lower in the weight loss group compared with the controls, giving a hazard ratio of 0.20 (95% CI 0.13 to 0.30). This analysis was adjusted for confounders, including comorbid cardiovascular disease and depression, smoking, high blood pressure, and cholesterol and their associated treatments. This means that the surgery reduced the relative risk of developing diabetes by 80% compared to not having surgery.
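
The 80% figure is simply the adjusted hazard ratio re-expressed as a relative risk reduction, as this minimal sketch shows (our own illustration, not the paper's analysis code):

```python
hazard_ratio = 0.20            # adjusted HR for diabetes after surgery
ci_lower, ci_upper = 0.13, 0.30

# Relative risk reduction is one minus the hazard ratio
print(f"Relative risk reduction: {1 - hazard_ratio:.0%}")             # 80%
print(f"95% CI: {1 - ci_upper:.0%} to {1 - ci_lower:.0%} reduction")  # 70% to 87%
```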

 

How did the researchers interpret the results?

Their interpretation was that, “bariatric surgery [weight loss surgery] is associated with reduced incidence of clinical diabetes in obese participants without diabetes at baseline for up to seven years after the procedure".

 

Conclusion

This research suggests that weight loss surgery may reduce the risk of developing diabetes in people who are morbidly obese (with an average BMI of 43) compared with no surgery. The beneficial effect appeared to increase over time and at the maximum follow-up period assessed in the study (seven years), the relative risk of developing diabetes had reduced by 80%.

There was variation in the risk reduction depending on age, BMI and the type of procedure, but all were beneficial.

The study had many strengths, but also some key limitations.

The obese participants were sampled from a database that indicated whether or not they had surgery. The comparison group was only matched for age, sex and BMI, so it is likely that there were other differences between these people that influenced their selection for surgery – for example, personal choice, an inadequate trial of non-surgical measures, or being unsuitable for anaesthesia and surgery.

Despite the results being adjusted for various medical confounders that could have an influence, these other unknown and unmeasured differences may have meant the groups had different diabetes risk to begin with.

This could make it harder to be certain how much of the difference in diabetes risk is specifically down to the effect of surgery, and how much is due to other influences.  

It is also important to recognise that the results do not apply to all people who are categorised as obese. The average BMI of recruits was high overall, at 43, meaning the results may be less applicable to people with BMIs at the lower end of the obesity scale. Further evidence of this came from a sub-analysis by BMI category. They found significant risk reductions in BMI groups 35 to 39.9, and 40 and above. At BMI levels 30 to 34.9, there was still a 60% or so reduction in risk reported, but this failed to meet statistical significance, meaning it may be a chance finding.

However, in any case, most people with BMIs below 35 are not currently eligible for bariatric surgery on the NHS, in line with UK guidance.

A further factor to bear in mind is that the control group were not offered any intervention at all, such as an intensive weight loss programme. Hence, the results tell us how much surgery is better than doing nothing, rather than if it is better than specific non-surgical alternatives, such as the NHS Choices diet and exercise plan.

The results are consistent with current English guidelines, which recommend offering weight loss surgery to people with a BMI of 40 or more if a number of additional conditions are fulfilled. People with a BMI of 35 to 40 can also be offered weight loss surgery if they have other medical conditions. For full details, see Weight Loss Surgery – who can use it?

As with any surgery, weight loss surgery has risks. The balance of risks and potential benefits would need to be discussed between doctor and patient on a case-by-case basis. Information from studies like this may inform the conversation.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Weight loss surgery reduces diabetes risk. BBC News, November 3 2014

New research says surgery could be key to beating diabetes. Daily Express, November 3 2014

Links To Science

Booth H, Khan O, Prevost T, et al. Incidence of type 2 diabetes after bariatric surgery: population-based matched cohort study. The Lancet – Diabetes & Endocrinology. Published online November 3 2014

Categories: NHS Choices

Sadness 'lasts longer' than other emotions

NHS Choices - Behind the Headlines - Mon, 03/11/2014 - 11:30

"Sadness lasts 240 times longer than other emotions, study claims," is the somewhat sobering news on the Mail Online.

Researchers surveyed 233 young adults from a Belgian high school with an average age of 17, and found emotions vary widely in duration.

Of the 27 emotions studied, sadness lasted the longest, whereas shame, surprise, fear, disgust, boredom, feeling touched, irritation and relief were the shortest-duration emotions. 

Emotions that lasted longer were associated with more important event triggers, as well as more reflection about the feelings and the consequences of the event that prompted the emotion.

While the study is intriguing, it has a number of limitations to consider. Chiefly, the sample size (233) was small for a cross-sectional study and recruited a relatively homogenous (similar) group of students, who were aged around 17 years.

Young students who are coming out of the emotional turmoil that is puberty, as well as facing exam stress, may be more likely to report feeling sad for longer periods than other groups. This means it is uncertain whether similar findings would be seen in other populations.

While the results give us a tentative estimate of the duration of different emotions in a group of young adults, this can't be generalised to other age and demographic groups at this stage.

 

Where did the story come from?

The study was carried out by researchers from the faculty of psychology and educational sciences at the University of Leuven in Belgium.

It was funded by the University of Leuven Research Fund, the Interuniversity Attraction Poles Programme, which is financed by the Belgian government, and a postdoctoral research fellowship from the Fund for Scientific Research, Flanders.

The study was published in the peer-reviewed medical journal, Motivation and Emotion. This is an open-access study, meaning anyone can read it free online.

Generally, the Mail Online reported the study results accurately, although it tended to take the findings at face value, without discussing any of the limitations inherent in the research.

However, the Mail did include a useful infographic showing the duration of all the different emotions tested, with sadness being noticeably higher.

 

What kind of research was this?

This was a cross-sectional study investigating which emotions last longest and why.

The researchers wanted to describe any differences in the duration of different emotions and attempt to explain what might be behind these differences.

From a health perspective, the researchers suggested this might be useful because the duration of emotional disturbances is a symptom of some mental health conditions, such as depression.

The researchers specifically looked at emotions, which they defined as distinct from moods because emotions start in response to an external or internal event.

For example, you may wake up in a grumpy mood, whereas receiving an unexpected tax bill stimulates emotions such as anxiety and anger.

 

What did the research involve?

The research team asked a small group of young adults to recall the duration of past emotions, their triggers and coping strategies.

The team recruited 233 high school students (112 women, 118 men and three who did not report their gender), with an average age of 17 years. Participation in the study was a compulsory part of their high school course.

Using a long questionnaire, participants were asked to recollect emotional episodes, report their duration, and answer questions regarding their appraisal of the emotion-eliciting event, as well as any strategies they used to regulate the emotion.

Each questionnaire had nine emotions to prompt recall from a larger set of 27.

These included admiration, anger, anxiety, feeling touched, boredom, compassion, contentment, desperation, disappointment, disgust, enthusiasm, fear, gratitude, guilt, hatred, hope, humiliation, irritation, jealousy, joy, pride, relaxation, relief, sadness, shame, stress and surprise.

Each questionnaire covered a different set of nine emotions. The questionnaires were then randomly distributed to participants.

Participants were asked to rate the emotion-eliciting event using a number of appraisal dimensions. One of the main ones asked participants to indicate to what extent the event that elicited the emotion was important to them (importance).

They were also asked to report on a number of coping strategies, including to what extent they "kept on thinking about their feelings and the consequences of the event that elicited the emotion (rumination)".

To see whether the findings depended on the way emotion duration was defined, half of the participants were told that an emotion ends as soon as it is no longer felt for the first time, whereas the other half were told that an emotion ends only once one has fully recovered from the event. All participants had the difference between emotions and moods explained to them.

 

What were the basic results?

Out of 27 emotions assessed, sadness lasted the longest, whereas shame, surprise, fear, disgust, boredom, feeling touched, irritation and relief were the shortest-lived emotions.

One appraisal dimension and one regulation strategy accounted for almost half of the variability in duration between emotions.

Compared with short emotions, persistent emotions were elicited by events of high importance and were associated with high levels of rumination (reflection or musing on an event).
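
To make "accounted for almost half of the variability" concrete: in regression terms, this is the R² statistic – the share of the variance in emotion duration explained by the predictors. A minimal sketch using made-up toy numbers (the data and variable names are ours, not the study's):

```python
import numpy as np

# Toy data (invented for illustration): importance and rumination ratings
# for ten recalled emotional episodes, and the duration of each emotion
importance = np.array([1, 2, 2, 3, 3, 4, 4, 5, 5, 5], dtype=float)
rumination = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5], dtype=float)
duration = np.array([0.5, 0.8, 1.0, 1.4, 1.6, 2.1, 2.4, 2.6, 3.2, 3.0])

# Ordinary least squares with both predictors (plus an intercept)
X = np.column_stack([np.ones_like(importance), importance, rumination])
beta, *_ = np.linalg.lstsq(X, duration, rcond=None)

# R^2: the share of the variability in duration the predictors account for.
# It is near 1 for this tidy toy data; the study's figure was close to 0.5.
residuals = duration - X @ beta
r_squared = 1 - residuals.var() / duration.var()
print(f"Variance accounted for: {r_squared:.0%}")
```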

The study group reported these broad findings held true across the two different emotion duration definitions, as well as when taking into account how recent and intense the emotion being recalled was.

 

How did the researchers interpret the results?

The research team summed up that their "present study revealed that meaningful differences in duration between emotions exist and that these differences can be partially explained by differences in one appraisal dimension (event importance) and one regulation strategy (rumination)".

 

Conclusion

This small cross-sectional survey of young adults suggests emotions vary widely in duration. Of the 27 emotions the researchers looked at, sadness lasted the longest by far.

Emotions that lasted longer were associated with more important event triggers, as well as more rumination about the feelings and the consequences of the event that elicited the emotion.

The study is intriguing, but has a number of limitations to consider. The sample size, for example, was small for a cross-sectional study at just 233.

It also recruited a relatively homogenous group of students aged around 17 years, so emotional duration may be very different for other age groups and groups from other educational backgrounds.

The accuracy of recalling emotions may be a further source of error, as some emotions may be far easier to recall than others: consider recalling instances of hatred, compared with hope.

This was partly addressed by the researchers by adjusting for the intensity of the emotion, but may not have completely eliminated a potential recall bias.

The results are also perhaps only what would be expected. For example, it makes logical sense that sadness is likely to be a more persistent emotion.

Sadness is likely to be influenced by a particular situation or trigger and, if there is no immediate resolution to this situation, continuing to reflect on it or being troubled by it is likely to result in a longer-lasting emotional effect.

Meanwhile, emotions such as surprise or disgust are likely to be the result of more transient events that would not have longer-term effects on the person, so they would be expected to be much shorter-term emotions.

Overall, the results give us some indication of the emotional duration of a group of young adults, but limited wider implications can be drawn from this research.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Feeling sad? It could take up to FIVE DAYS to shift your mood: Sadness lasts 240 times longer than other emotions, study claims. Mail Online, October 31 2014

Links To Science

Verduyn P, Lavrijsen S. Which emotions last longest and why: The role of event importance and rumination. Motivation and Emotion. Published online October 31 2014

Categories: NHS Choices

Brain differences linked to chronic fatigue syndrome

NHS Choices - Behind the Headlines - Fri, 31/10/2014 - 11:00

"Scientists find three differences in the brain [of people with chronic fatigue syndrome]," the Mail Online reports.

Chronic fatigue syndrome (CFS) affects around a quarter of a million people in the UK and causes persistent symptoms, such as fatigue, that can have a significant adverse impact on people's quality of life. The cause of CFS is unknown and the condition continues to be researched. 

The study behind this headline used a specialised type of MRI scan to examine whether there were any differences in the brain volume and structure of 15 people with CFS, compared with 14 people without.

The researchers found the volume of white matter (brain cell nerve fibres) was lower in the group with CFS. There were also some differences on the right side of the brain in the nerve fibres that connect the temporal to the frontal brain regions.

These are interesting developments in furthering our understanding of CFS. However, the study only involved a very small sample of 15 people, and we don't know how representative they are of all people with the condition.

The design of the study is able to demonstrate brain features that may be associated with the condition, but it cannot show cause and effect. We also don't know the order in which events happened.

It's not known whether these differences could have led to the development of CFS (and if so, whether they were always present, or whether some other unknown factors caused them to occur), or whether these are new changes that have occurred since the people developed CFS.

The next step would be to try to understand how these differences are associated with the condition's development.

 

Where did the story come from?

The study was carried out by researchers from Stanford University School of Medicine in California.

Financial support was provided by the Division of Infectious Disease Chronic Fatigue Syndrome Fund, and one of the authors received support from GE Healthcare.

The study was published in the peer-reviewed medical journal, Radiology.

The Mail Online's headline, "Is this proof chronic fatigue DOES exist?", casts doubt upon whether CFS actually exists. It's known CFS affects many thousands of people, with often severely debilitating consequences, so its existence is not in doubt.

However, the causes of CFS remain poorly understood. This study has tried to further understanding of the condition by examining brain features that may be associated with it. This study provides a starting point, but not the whole picture.

 

What kind of research was this?

This was a cross-sectional study that took brain scans of 15 people with CFS and a comparison group of age and sex-matched people without CFS. It aimed to research differences in the brain structure.

As the researchers explain, CFS is a debilitating condition characterised by six or more months of persistent or relapsing fatigue without any associated medical or mental health disorder.

The researchers consider that brain imaging may help inform diagnosis and prognosis, though conventional scan findings to date have been inconsistent and of limited help in any further understanding of the condition.

This study used a special MRI technique called diffusion tensor imaging (DTI). DTI measures the diffusion (movement or spread) of water through the brain tissues, which provides 3D images of the size, shape and microscopic structure of tissues.

 

What did the research involve?

The researchers scanned the brains of 15 people with CFS and compared them with 14 age- and gender-matched people without CFS. They were looking for any brain volume and structure differences between the two groups that may be linked to the condition.

People with CFS had to meet two inclusion criteria:

  • a clinical diagnosis of CFS made up of fatigue for six months or longer, with at least four other symptoms from: impaired memory or concentration, sore throat, tender lymph nodes, headaches, muscle pain, joint pain, unrefreshing sleep and malaise after exertion
  • ongoing memory or concentration problems causing severe enough impairment that a doctor thought brain imaging was necessary to confirm no other disease process was occurring

The group with CFS had an average age of 46 years. Eight of the 15 (53%) were female, and the average duration of their CFS symptoms was 12 years.

The age- and sex-matched comparison group were people without CFS, depression or substance use in the past year. Of 28 recruited, 14 chose to participate.

All participants completed a 20-item Multidimensional Fatigue Inventory (MFI-20), which assesses general, physical and mental fatigue, reduced motivation and activity. It is said to be a well-validated tool for assessing CFS, with higher MFI-20 scores indicating increased severity.

They also assessed whether each person was right- or left-handed or ambidextrous, as this is linked to differences in structure and volume in some brain areas.

The main analysis compared differences in brain volume and structure between the two groups using MRI (DTI) brain scans. This took into account variations in age, handedness and total brain volume.

 

What were the basic results?

The researchers found, on average, people with CFS had a lower total volume of white matter (nerve cell fibres) in their brain than people without.

They took a measure known as fractional anisotropy (FA), which gives a value between zero and one indicating how directional the diffusion of water is. A value of zero means diffusion is the same in all directions, while values approaching one mean diffusion is restricted to a single direction.
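
For the technically minded, FA is conventionally calculated from the three eigenvalues of the diffusion tensor measured in each voxel. A minimal sketch of the standard formula (our own illustration, not the study's code):

```python
import math

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    """Standard fractional anisotropy of a diffusion tensor with eigenvalues
    l1, l2, l3: 0 = diffusion equal in all directions; near 1 = strongly directional."""
    mean = (l1 + l2 + l3) / 3
    numerator = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    denominator = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * numerator / denominator)

print(fractional_anisotropy(1.0, 1.0, 1.0))  # 0.0 – perfectly isotropic
print(fractional_anisotropy(1.7, 0.2, 0.2))  # ~0.87 – highly directional, as in a white matter tract
```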

They found significant differences in the FA of people with and without CFS in one particular region of the brain on the right side, called the right arcuate fasciculus. This is a nerve fibre tract that links the temporal region on the right side of the brain with the frontal region.

Most right-handed people with CFS had a maximum FA in the right arcuate fasciculus above 0.6, while those without CFS had an FA value below 0.6. They noticed that in people with CFS, FA of the right arcuate fasciculus tended to increase with disease severity.

The researchers also observed that people with CFS had areas of thickening in parts of the grey matter connected by the right arcuate fasciculus.

 

How did the researchers interpret the results?

The researchers concluded there is a loss of white matter in people with CFS. They also suggest the fractional anisotropy of the right arcuate fasciculus might be a biological indicator for CFS.

 

Conclusion

This study used a specialised type of MRI to examine whether there were any differences in the brain volume and structure of 15 people with CFS, compared with 14 people without.

They found the volume of white matter (nerve fibres) appeared to be lower, on average, in the people with CFS. There were also differences in the magnitude of water diffusion (a measure known as fractional anisotropy) in one particular white matter tract on the right side of the brain, which connects the temporal with the frontal brain regions.

These are interesting developments in furthering our understanding of CFS. But there are points to bear in mind when considering the meaning of these findings.

It must be remembered this research only used a very small sample of 15 people with CFS from the US, who may not be representative of the many thousands of people affected by this condition in the UK or elsewhere.

For example, these were people who had severe and persisting memory or concentration problems, such that their doctor thought brain imaging was required to make sure no other disease process was going on. The differences seen between these 15 people with CFS and 14 without may not be identical to differences that may be seen in a different sample.

Also, as this is a cross-sectional study, it cannot prove cause and effect: it can't tell us the order in which events happened. For example, it can't tell us whether these are structural features that occurred before people developed CFS, which may have been involved in the development of the condition, or whether these are changes that happened after the people developed CFS.

Further imaging studies in larger samples of people with this condition may reveal whether these results are consistent observations in the brain structure of people with CFS. The next step would then be to try to understand how these differences are associated with the condition's development.

These findings have no immediate treatment or preventative implications for CFS.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Is this proof chronic fatigue DOES exist? Scientists find three differences in the brain that suggest condition may not just be 'in the mind'. Mail Online, October 30 2014

Chronic fatigue syndrome is real, researchers say. CNN, October 30 2014

Links To Science

Zeineh MM, Kang J, Atlas SW, et al. Right Arcuate Fasciculus Abnormality in Chronic Fatigue Syndrome. Radiology. Published online October 30 2014

Categories: NHS Choices

Genes may play a role in Ebola survival chances

NHS Choices - Behind the Headlines - Fri, 31/10/2014 - 11:00

"Genetic factors could play an important role in whether people survive the Ebola virus," BBC News reports. Researchers found around one in five mice remained unaffected by the infection.

Researchers investigated how mice with different genetic make-ups responded to Ebola infection. The research involved eight strains of mice said to represent the majority of genetic variation seen across major mouse species. The mice were infected with Ebola and their disease response was examined.

The researchers found mice with different genetic profiles show variable disease response, ranging from complete resistance to infection with full recovery, to the disease being fatal.

Resistant mice and those that died from the disease tended to differ in the activity of certain genes, which was associated with differences in their immune and inflammatory responses.

But the findings do not necessarily mean a similar pattern will be seen in humans, who have quite different genetics to mice.

Environmental factors such as access to good healthcare and hygiene standards (which, sadly, are of a low standard in West Africa), as well as the age, health and fitness of the person, are also likely to play a significant role in how infection with Ebola affects any individual.

Nevertheless, learning more about the genetic and immune responses to the Ebola virus could help contribute to the future creation of an effective anti-viral treatment.

Experts believe Ebola is highly unlikely to spread within the UK. To understand why, read Why Ebola risk is low for people in the UK.

 

Where did the story come from?

The study was carried out by researchers from the University of Washington and other research institutions in the US.

It was funded by grants from the US National Institute of Allergy and Infectious Diseases, the National Institutes of Health, and the Intramural Research Program of the National Institute of Allergy and Infectious Diseases, National Institutes of Health. 

The study was published in the peer-reviewed scientific journal Science Express on an open access basis, so it is free to read online.

The UK media's stories generally provide an accurate summary of the research, with most stating early on that the study was in mice.

However, the Mail Online's headline, "Will Ebola kill you? It depends on your genes," is overly conclusive and does not take account of the uncertainty of the research or its unproven applicability to people.

 

What kind of research was this?

This was an animal study investigating how mice with a different genetic make-up responded to Ebola infection in different ways.

The researchers explain how most animal studies examining the disease development of Ebola, or looking at the effectiveness of vaccines or treatments, have had to use primates or small mammals.

This is because when mice have been infected with Ebola in the laboratory, they don't demonstrate the same haemorrhagic syndrome (for example, complete dysfunction of the clotting system in the body) that occurs in humans.

This study specifically examined the role of host genetics in determining the severity of disease caused by Ebola infection.

 

What did the research involve?

This study involved infecting genetically diverse mice with different strains of Ebola to see if their genetics influenced the symptoms they developed, and whether they ultimately lived or died.

The study used mice from what is called the Collaborative Cross (CC) resource, a genetically diverse group of inbred mice obtained from the cross of eight mouse strains – five said to be classic laboratory strains, and three wild-type (found in nature) strains.

The eight "founder" mice strains are said to represent 90% of the common genetic variation seen across three major mouse species.

The researchers infected the eight CC founder strains with two strains of Ebola virus – a mouse strain and the wild-type strain, which doesn't normally cause haemorrhagic syndrome in mice.

They carried out a detailed analysis of the disease symptoms and the disease response in the mice.

 

What were the basic results?

When infected with the mouse strain of Ebola virus, the researchers observed different disease responses across the mice, ranging from complete resistance to infection to fatal disease. Some of the fatal cases developed disease changes consistent with haemorrhagic syndrome, while others did not.

The researchers performed more detailed analysis on two of the mouse lines – those resistant to disease and those that developed Ebola haemorrhagic fever.

Mice from both of these lines lost about 15% of their body weight in the five days following infection. The susceptible mice died on day five or six, while resistant mice fully recovered two weeks after infection.

Those that died demonstrated disease features consistent with Ebola haemorrhagic fever, including internal bleeding, prolonged blood coagulation times, spleen enlargement and liver discolouration. The resistant mice had no disease changes or alteration in their liver.

On further study, the researchers found differences in the inflammatory and immune response of mice susceptible or resistant to infection. This difference in response seemed to be mediated by differences in gene expression.

In particular, expression of the Tek gene in the liver was lower in the susceptible mice, and this was associated with onset of haemorrhagic disease. 

When infected with the wild-type Ebola strain, however, neither the susceptible nor resistant mice developed clinical disease. The animals had very low levels of the virus in their liver and spleen – up to 1,000 times lower than their levels when infected with the mouse strain.

At five days after infection, there was no longer any virus detectable, indicating that the wild-type Ebola virus is not able to replicate in mice.

 

How did the researchers interpret the results?

The researchers concluded that their results indicate genetic background determines susceptibility to Ebola hemorrhagic fever.

 

Conclusion

This research across mouse strains demonstrates that mice with different genetic profiles show variable disease response after infection with the Ebola virus. Responses ranged from complete resistance to infection with full recovery, to fatal disease, with or without changes consistent with Ebola haemorrhagic fever.

When comparing the mice that were resistant with those that developed fatal Ebola haemorrhagic syndrome, the researchers found differences in the activity of certain genes, which were associated with differences in the immune and inflammatory response.

However, these results in mice should not be extrapolated too far at this stage. The finding that different genetic strains of mice respond to Ebola infection in different ways does not mean the case will be exactly the same in people, who have quite different genetics to mice.

Genes may play a more or less important role in Ebola symptoms and survival in people, but at this stage we simply don't know.

Similarly, the different infection responses were seen only when mice were infected with the mouse strain of Ebola. The wild Ebola strain was not able to replicate in mice, further demonstrating the dissimilarities to human disease.

As BBC News reports, Andrew Easton, Professor of Virology at the University of Warwick, said the study "provided valuable information, but the data could not be directly applied to humans because they have a much larger variety of genetic combinations than mice".

Even if in humans (as in mice) our genetics play a role in how we respond to Ebola infection, it is unlikely to provide the whole answer. Factors such as the environment we live in – such as healthcare and hygiene standards – and our own underlying age, health and fitness are likely to play a large role in how we respond to Ebola infection.

Nevertheless, this study contributes to the wider understanding of Ebola, and may help direct further research examining the causes and effects of this devastating disease, as well as effective treatments at some point in the future.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Ebola virus: Genes 'play significant role in survival'. BBC News, October 31 2014

Ebola outbreak: Ebola may not be a deadly disease for everyone, scientists find. The Daily Telegraph, October 31 2014

Will Ebola kill you? It depends on your genes: Scientists discover DNA could determine if victims live or die. Mail Online, October 31 2014

Links To Science

Rasmussen AL, Okumura A, Ferris MT, et al. Host genetic diversity enables Ebola hemorrhagic fever pathogenesis and resistance. Science. Published online October 30 2014

Categories: NHS Choices
