  • Each year cancer kills 8m people worldwide and costs billions
  • 40% of cancer deaths could be prevented by early detection
  • Nearly half of all cancer sufferers are diagnosed late when the tumors have already spread
  • Victims and doctors often miss early warning signs of cancer
  • Traditional tissue biopsies used to diagnose cancer are invasive, slow, costly, and often yield insufficient tissue
  • New blood tests are being devised that simultaneously detect cancer early and inform where the cancer is in the body
  • Such tests - liquid biopsies - are positioned to end the late diagnosis of cancer
  • But before liquid biopsies become common practice they need to overcome a number of significant challenges
  
World’s first blood tests that detect and locate cancer
 
Just as there is a global race among immunotherapists to enhance cancer treatment, so there is a parallel race among bioengineers to speed up and improve the detection of cancer. Such races are important because nearly half of all cancer sufferers are diagnosed late, when their tumors have already metastasized: 30% to 40% of cancer deaths could be prevented by early detection and treatment.
 
Here we describe advances in blood tests - “liquid biopsies” - which can simultaneously detect cancer early and identify its tissue of origin. We also describe the growing commercialization of the technology, and some significant hurdles it still has to overcome.
 
A costly killer disease

Each year cancer kills more than 8m people worldwide, 0.6m in the US and nearly 0.17m in the UK. Survival rates for pancreatic, liver, lung, ovarian, stomach, uterine and oesophageal cancers are particularly low. A large proportion of people do not know they have cancer, and many primary care doctors fail to detect its early warning signs. According to the Journal of Clinical Oncology, a staggering 44% of some types of cancers are misdiagnosed. A significant proportion of people discover that they have cancer only after presenting with a different condition at A&E. Each year, the total cost of cancer to the UK’s exchequer is nearly £20bn. In the US, national spending on cancer is expected to reach US$156bn by 2020. And as populations age, some cancer prevalence rates increase, despite substantial endeavours to reduce the burden of the disease.
  
The UK: an indicative case

The UK is indicative of what is happening elsewhere in the developed world with regard to cancer diagnosis and treatment. Epidemiological trends suggest that although progress is being made to fight the disease, much work is still required. Death rates for a number of individual cancer types have declined, but rates for a few cancers have increased.

Recently, the UK’s Department of Health invested £450m to improve diagnosis, including giving primary care doctors better access to tests such as CT and MRI scans. But each year there are still some 0.17m cancer deaths in the UK, and 1 in 4 British cancer patients are unlikely to live longer than 6 months after diagnosis because they and their doctors have missed early signs of the disease. For example, in the UK only 23% of lung cancer cases are diagnosed early, as are 32% of non-Hodgkin lymphoma cases and 44% of ovarian cancer cases.

Not only does late detection increase morbidity and mortality, it significantly increases treatment costs. According to the UK’s NHS National Intelligence Network, a case of ovarian cancer detected early costs an average of £5,000 to treat, whereas one detected late at stage three or four costs £15,000. Similarly, a case of colon cancer detected early typically costs £3,000 to treat, while one not identified until a later stage costs some £13,000.

 
Traditional tissue biopsies

Currently, oncologists look to pathologists for assistance in tumor diagnosis. Indeed, oncologists cannot proceed with therapy without a tissue diagnosis, nor are they able to discuss prognosis with the patient. After detecting a tumor through a physical examination or imaging, doctors use traditional tissue biopsies to gather information on the attributes of a patient’s cancer.
 
These pinpoint a cancer’s mutations and malignancy, but solid tissue biopsies are not always straightforward. While some cancers are easily accessed, others are hidden deep inside the body or buried in critical organs. Beyond the physical challenge, sampling such tumors can be dangerous for patients, and even when samples are obtained, they do not always reflect current tumor dynamics. Further, traditional solid tissue biopsies are costly and time consuming to perform; they can yield insufficient tissue to obtain a good understanding of the tumor, and they can be hampered by a patient’s comorbidities and lack of compliance.

 
Two significant studies
 
Although solid tumor tissue is still the gold standard source for clinical molecular analyses, cancer-derived material circulating in the bloodstream has become an appealing alternative showing potential to overcome some of the challenges of solid tissue biopsies.

Findings of two significant studies of liquid biopsies published in 2017 promise a more effective and patient-friendly method for diagnosing cancer: one in the journal Genome Biology, and the other in the journal Nature Genetics. Both research teams are on the cusp of developing the world’s first simple blood test that can both detect early stage cancer and identify where in the body the cancer is located.

The Genome Biology study
 
The study, reported in Genome Biology, describes findings of a blood test, referred to as the CancerLocator, which has been developed by Jasmine Zhou, Professor of Biological and Computer Sciences, and her team at the University of California, Los Angeles (UCLA). The CancerLocator detected early stage cancer in 80% of breast, lung and liver cases.
 
Zhou and her colleagues devised a computer program that uses genetic data to detect circulating tumor DNA (ctDNA) in blood samples. Once identified, the ctDNA is compared to a database of genetic information from hundreds of people to identify where the tumor is located. Zhou’s team discovered that tumors arising in different parts of the body have different signatures, which a computer can spot. “The technology is in its infancy and requires further validation, but the potential benefits to patients are huge ... Non-invasive diagnosis of cancer is important, as it allows the early detection of cancer, and the earlier the cancer is caught, the higher chance a patient has of beating the disease,” says Zhou.
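The signature-matching idea described above can be sketched in a few lines of code. The example below is a hypothetical illustration only - the tissue names, marker values and similarity measure are invented assumptions, and it does not reproduce the CancerLocator’s actual statistical model - but it shows how a ctDNA profile might be compared with reference signatures to suggest a tissue of origin.

```python
# Hypothetical sketch: match a ctDNA marker profile to reference tissue
# signatures. All names and numbers are invented for illustration; this is
# not the CancerLocator algorithm.

import math

# Invented reference "signatures": average marker values per tissue of origin.
REFERENCE_SIGNATURES = {
    "breast": [0.82, 0.10, 0.55, 0.31],
    "lung":   [0.15, 0.74, 0.40, 0.62],
    "liver":  [0.48, 0.22, 0.91, 0.12],
}

def cosine_similarity(a, b):
    """Compare two marker profiles; values near 1 mean very similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def likely_tissue_of_origin(ctdna_profile):
    """Return the reference tissue whose signature best matches the sample."""
    scores = {tissue: cosine_similarity(ctdna_profile, signature)
              for tissue, signature in REFERENCE_SIGNATURES.items()}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    sample = [0.50, 0.20, 0.88, 0.15]     # made-up patient profile
    tissue, scores = likely_tissue_of_origin(sample)
    print(tissue, scores)                 # expected to report "liver"
```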
 
The Nature Genetics study

Researchers led by Kun Zhang, Professor of Bioengineering at the University of California, San Diego (UCSD), are responsible for the study published in the journal Nature Genetics. Zhang developed a test that examined ctDNA in blood from cancer patients and, like Zhou, discovered that not only could it detect cancer early, but it could also locate where the tumor is growing in the body. When a tumor starts to take over a part of the body, it competes with normal cells for nutrients and space, killing them off in the process. As normal cells die, they release their DNA into the bloodstream; and that DNA can identify the affected tissue.
 
“There are many technical differences in how each approach works ... The work by the UCLA group is a computer program that uses data published previously by other groups, and has reduced the cancer detection error from roughly 60% to 26.5%. In contrast, we developed a new theoretical framework, generated our own data from over 100 patients and healthy people, and our accuracy of locating cancer in an organ is around 90%,” says Zhang, but he adds, “Major medical challenges don’t get solved by one team working alone”.
 
The confluence of advances in computing and biology

The research endeavors of Professors Zhou and Zhang have been made possible by the confluence of advances in computing and molecular biology. Over the past 20 years, there has been a paradigm shift in biology, a substantial increase in computing power, huge advances in artificial intelligence (AI), and a dramatic fall in the cost of data storage. It took 13 years, US$3bn, and help from 7 governments to produce the first map of the human genome, which was completed in 2003. Soon it will be possible to sequence an entire genome in less than an hour for US$100.
 
The end of traditional in vitro diagnostics

Liquid biopsies are a sequencing-based technology used to detect microscopic fragments of DNA in just a few drops of blood, and hold out the potential to diagnose cancers before the onset of symptoms. Roger Kornberg, Professor of Structural Biology at Stanford University, and 2006 Nobel Laureate for Chemistry for his work in understanding how DNA is converted into RNA, “which gives a voice to genetic information that, on its own, is silent,” describes how advances in molecular science are fueling the replacement of traditional in vitro diagnostics with virtually instantaneous, point-of-care diagnostics without resort to complex processes or elaborate and expensive infrastructure. Liquid biopsies, such as those developed by Zhou and Zhang, have the potential to provide clinicians with a rapid and cheap means to detect cancer early, thereby enabling immediate treatment closely tailored to each patient’s disease state.

 
 
FDA approval of liquid biopsy
 
In 2016, the US Food and Drug Administration (FDA) granted Swiss pharmaceutical and biotech firm Roche approval for a liquid biopsy, which can detect gene mutations in the most common type of lung cancer, and thereby predict whether certain types of drugs can help treat it. 

The clinical implementation of such tests is not widespread, and there has been no regulatory approval of liquid biopsies for diagnosing cancer generally. Notwithstanding, ctDNA is now being extensively studied, as it is a non-invasive “real-time” biomarker that can provide diagnostic and prognostic information before and during treatment, and at progression.
 

cfDNA and ctDNA

Cell-free DNA (cfDNA) is a broad term for DNA that is freely circulating in the bloodstream but does not necessarily originate from a tumor. Circulating tumor DNA (ctDNA) is fragmented DNA derived directly from a tumor or from circulating tumor cells (CTCs).
 
Commercialization of the liquid biopsy race
 
Bill Gates, Jeff Bezos and leading venture capitalists have poured hundreds of millions of dollars into the goal of developing liquid biopsies. The US market alone is projected at US$29bn, according to a 2015 report from investment bank Piper Jaffray. Currently, there are about 40 companies in the US analyzing blood for fragments of DNA shed by dying cancer cells. Notwithstanding, only a few companies have successfully marketed liquid biopsies, and these are limited to identifying the best treatments for certain cancers, and to updating treatments as the cancer mutates. So far, no one has been successful in diagnosing incipient cancer from a vial of blood drawn from a patient who looks and feels perfectly healthy.
 
Some US companies in the liquid biopsy race

At the 2016 meeting of the American Society of Clinical Oncology (ASCO), a Silicon Valley start-up, Guardant Health, which has raised some US$200m, presented findings from a large study involving over 15,000 participants, which demonstrated the accuracy of its liquid biopsy test, Guardant360, for patients with advanced solid tumors. In the 398 patients with matching tissue samples, the patterns of genomic changes reported by the Guardant360 test in cfDNA matched those found in tissue between 94% and 100% of the time.

The 70-gene test is the first comprehensive, non-invasive genomic cancer-sequencing test to market, and according to the company, about 2,000 physicians worldwide have used it. Guardant expects to continue to develop its technology and maintain a commercial lead in the cfDNA liquid biopsy space. The next step for Guardant is to go beyond sequencing that matches patients to targeted oncology drugs, to the early detection of cancer itself.
 
Also in 2016, Gates and Bezos teamed up with San Diego’s Illumina, which makes most of the DNA sequencing machines used to match cancer patients to appropriate treatments, to launch another liquid biopsy start-up called Grail. In 2017, Grail raised US$900m to help it develop blood-based diagnostics to enable routine, early detection of cancer. The company aims to refine and validate its liquid biopsy technology by running a number of large-scale clinical studies in which it expects to sequence hundreds of thousands of patients. Another California-based biotech start-up, Freenome, raised US$65m to validate its liquid biopsy technology for the early detection of cancer.
 
Takeaways

Despite the findings of the two 2017 studies reported in the journals Genome Biology and Nature Genetics, FDA approval of Roche’s liquid biopsy, a massive increase in investment, and significant commercial biotech activity, there is a gap between reality and aspirations for liquid biopsies. To provide doctors with a reliable, point-of-care means to detect cancer early, liquid biopsies will have to overcome several significant challenges. The major one is assay sensitivity and specificity for analysis of ctDNA and cfDNA. To compete with the gold standard solid tissue biopsy, and to ensure that patients receive early diagnosis and appropriate treatment, a successful liquid biopsy assay will have to demonstrate a high positive predictive value. Concomitantly, good sensitivity and excellent specificity will be required to yield acceptable rates of false positives and false negatives. Notwithstanding, the race among bioengineers to develop a non-invasive “real-time” liquid biopsy to detect cancer early is gaining momentum.
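To see why sensitivity and specificity alone are not enough, the short sketch below shows how they combine with disease prevalence to determine a test’s positive predictive value (PPV). The figures are invented assumptions for illustration, not data from any of the studies discussed above.

```python
# Illustrative arithmetic only: how sensitivity, specificity and disease
# prevalence combine to give a screening test's positive and negative
# predictive values. All numbers below are assumptions, not study data.

def predictive_values(sensitivity, specificity, prevalence):
    """Return (PPV, NPV) for a test applied to a population with the given prevalence."""
    tp = sensitivity * prevalence              # true positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

if __name__ == "__main__":
    # An assumed test with 90% sensitivity and 99% specificity, screening a
    # population in which 1% of people actually have the cancer.
    ppv, npv = predictive_values(sensitivity=0.90, specificity=0.99, prevalence=0.01)
    print(f"PPV: {ppv:.1%}, NPV: {npv:.1%}")   # roughly PPV 48%, NPV 99.9%
```

Under these assumed numbers, nearly half of positive results would be false alarms, which is why a screening assay for a relatively rare disease needs excellent specificity to keep false positives at an acceptable rate.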
 

 
  • Competition is intensifying among scientists to develop and use gene editing and immunotherapy to defeat intractable diseases
  • Chinese scientists were the first to inject people with cells modified by the CRISPR–Cas9 gene-editing technique
  • Several studies have extracted a patient’s own immune cells, modified them using gene-editing techniques, and re-infused them into the patient to seek and destroy cancer cells
  • A new prêt à l'emploi (“ready-to-use”) gene editing treatment disables the gene that causes donor immune cells to attack their host
  • The technique harvests immune cells from a donor, modifies and multiplies them so that they may be used quickly, easily and cheaply on different patients
  • Commercial, technical, regulatory and ethical barriers to gene editing differ across geographies

Gene editing battles

Gene editing and immunotherapy are developing apace. They have been innovative and effective in the fight against melanoma, lung cancer, lymphomas and some leukaemias, and promise much more. Somatic gene therapy changes, fixes and replaces genes at the tissue or cellular level to treat a patient, and the changes are not passed on to the patient’s offspring. Germline gene therapy inserts genes into reproductive cells and embryos to correct genetic defects that could be passed on to future generations. Although there are still many unanswered clinical, commercial and ethical questions surrounding gene therapy, its future is assured, and it will be shaped by unexpected new market entrants and by competition between Chinese and Western scientists, which is gaining momentum.
  
14 February 2017

On the 14th February 2017, an influential US science advisory group formed by the National Academy of Sciences and the National Academy of Medicine gave support to the modification of human embryos to prevent “serious diseases and disabilities” in cases where there are no other “reasonable alternatives”. This brings the once unthinkable - heritable changes to the human genome - one step closer. The Report, however, insisted that before humanity intervenes in its own evolution, there should be a wide-ranging public debate, since the technology is associated with a number of unresolved ethical challenges. The French oppose gene editing, the Dutch and the Swedes support it, and a recent Nature editorial suggested that the EU is “habitually paralysed whenever genetic modification is discussed”. In the meantime, clinical studies involving gene editing are advancing apace in China, while the rest of the world appears to be embroiled in intellectual property and ethical debates, and playing catch-up.
 
15 February 2017

On the 15th February 2017, after a long, high-profile, heated and costly intellectual property action, judges at the US Patent and Trademark Office ruled in favor of Professor Feng Zhang and the Broad Institute of MIT and Harvard, upholding patents issued to them for the gene-editing technology CRISPR-Cas9: a cheap, easy-to-use, all-purpose gene-editing tool with huge therapeutic and commercial potential.
 
The proceedings were brought by the University of California, Berkeley, which claimed that the CRISPR technology had been invented by Professor Jennifer Doudna of the University and Professor Emmanuelle Charpentier, now at the Max Planck Institute for Infection Biology in Berlin, and described in a paper they published in the journal Science in 2012. Berkeley argued that after the 2012 publication, an “obvious” development of the technology was to edit eukaryotic cells, which Berkeley claimed is all that Zhang did, and that his patents are therefore without merit.

The Broad Institute countered, arguing that Zhang made a significant inventive leap in applying CRISPR knowledge to edit complex organisms such as human cells, that there was no overlap with the University of California’s research outcomes, and that the patents were therefore deserved. The judges agreed, ruling that the 10 CRISPR-Cas9 patents awarded to Zhang and the Broad Institute are sufficiently different from those applied for by Berkeley to stand.
 
The scientific community

Interestingly, before the 15th February 2017 ruling, the scientific community had appeared to side with Berkeley. In 2015 Doudna and Charpentier were awarded the prestigious Breakthrough Prize in life sciences (US$3m) and the Gruber Genetics Prize (US$0.5m). In 2017 they were awarded the Japan Prize of US$0.45m for “extending the boundaries of life sciences”. Doudna and Charpentier have each founded companies to commercially exploit their discovery: respectively Intellia Therapeutics and CRISPR Therapeutics.
 
16 February 2017

A day after the patent ruling, Doudna said: “The Broad Institute is happy that their patent didn’t get thrown out, but we are pleased that our patent based on earlier work can now proceed to be issued”. According to Doudna, her patents are applicable to all cells, whereas Zhang’s patents are much more narrowly indicated. “They (Zhang and the Broad Institute) will have patents on green tennis balls. We will get patents on all tennis balls,” says Doudna.
 
Gene biology

Gene therapy has evolved from the science of genetics: the understanding of how heredity works. Life begins in a cell, the basic building block of all multicellular organisms, which are made up of trillions of cells, each performing a specific function. Pairs of chromosomes, each comprising a single molecule of DNA, reside in a cell’s nucleus. These contain the blueprint of life: genes, which determine inherited characteristics. Each gene consists of sequences of DNA organised into segments of the chromosome. These carry the hereditary information that determines an organism’s growth and characteristics, and genes produce proteins that are responsible for most of the body’s chemical functions and biological reactions.

Roger Kornberg, an American structural biologist who won the 2006 Nobel Prize in Chemistry "for his studies of the molecular basis of eukaryotic transcription", describes the Impact of human genome determination on pharmaceuticals:
 
 
China’s first
 
While American scientists were fighting over intellectual property associated with CRISPR-Cas9, and American national scientific and medical academies were making lukewarm pronouncements about gene editing, Chinese scientists had edited the genomes of human embryos in attempts to modify the gene responsible for β-thalassemia and a gene associated with resistance to HIV, and are planning further clinical studies. In October 2016, Nature reported that a team of scientists, led by oncologist Lu You at Chengdu’s Sichuan University in China, established a world first by using CRISPR-Cas9 technology to genetically modify the immune cells of a patient with aggressive lung cancer and re-infuse them into the patient, with the expectation that the edited cells would seek, attack and destroy the cancer. Lu is recruiting more lung cancer patients to treat in this way, and he is planning further clinical studies that use similar ex vivo CRISPR-Cas9 approaches to treat bladder, kidney and prostate cancers.
 
The Parker Institute for Cancer Immunotherapy
 
Conscious of the Chinese scientists’ achievements, Carl June, Professor of Pathology and Laboratory Medicine at the University of Pennsylvania and director of the new Parker Institute for Cancer Immunotherapy, believes America has the scientific infrastructure and support to accelerate gene editing and immunotherapies. Gene editing was first used therapeutically in humans at the University of Pennsylvania in 2014, when scientists modified the CCR5 gene (a co-receptor for HIV entry) on T-cells, which were injected into patients with AIDS to tackle HIV replication. Twelve patients with chronic HIV infection received autologous cells carrying a modified CCR5 gene, and HIV DNA levels decreased in most patients.
 
Medical science and the music industry

The Parker Institute was founded in 2016 with a US$250m donation from Sean Parker, founder of Napster, an online music site, and former chairman of Facebook. This represents the largest single contribution ever made to the field of immunotherapy. The Institute unites 6 American medical schools and cancer centres with the aim of accelerating cures for cancer through immunotherapy approaches. 

Parker, who is 37, believes that medical research could learn from the music industry, which has been transformed by music sharing services such as Spotify. According to Parker, more scientists sharing intellectual property might transform immunotherapy research. He also suggests that T-cells, which have had significant success as a treatment for leukaemia, are similar to computers because they can be re-programmed to become more effective at fighting certain cancers. The studies proposed by June and colleagues focus on removing T-cells from a patient’s blood, modifying them in a laboratory to express chimeric antigen receptors that will attack cancer cells, and then re-infusing them into the patient to destroy cancer. This approach, however, is expensive, and in very young children it is not always possible to extract enough immune cells for the technique to work.

 
Prêt à l'emploi therapy

Waseem Qasim, Professor of Cell & Gene Therapy at University College London and Consultant in Paediatric Immunology at Great Ormond Street Hospital, has overcome some of the challenges raised by June. In 2015 Qasim and his team successfully used a prêt à l'emploi gene editing technique on a very young leukaemia patient. The technique, developed by the Paris-based pharmaceutical company Cellectis, disables the gene that causes donor immune cells to attack their host. This was a world first: treating leukaemia with genetically engineered immune cells from another person. Today, the young leukaemia patient is in remission. A second child, treated similarly by Qasim in December 2015, also shows no signs of the leukaemia returning. The cases were reported in 2017 in the journal Science Translational Medicine.
 
Universal cells to treat anyone cost effectively

The principal attraction of the prêt à l'emploi gene editing technique is that it can be used to create batches of cells to treat anyone. Blood is collected from a donor and turned into “hundreds” of doses that can be stored frozen. At a later point, the modified cells can be taken out of storage and infused into different patients, becoming exemplars of a new generation of “living drugs” that seek and destroy specific cancer cells. The cost of manufacturing a batch of prêt à l'emploi cells is estimated to be about US$4,000, compared to some US$50,000 using the more conventional method of altering a patient’s cells and returning them to the same patient. Qasim’s clinical successes raise the possibility of relatively cheap cellular therapy using supplies of universal cells that could be dripped into patients’ veins at a moment’s notice.
 
Takeaways
 
CRISPR-Cas9 provides a relatively cheap and easy-to-use means to get an all-purpose gene-editing technology into clinics throughout the world. Clinical studies using the technology have shown a lot of promise especially in blood cancers. These studies are accelerating, and prêt à l'emploi gene editing techniques as an immunotherapy suggest a new and efficacious therapeutic pathway. Notwithstanding the clinical successes, there remain significant clinical, commercial and ethical challenges, but expect these to be approached differently in different parts of the world. And expect these differences to impact on the outcome of the scientific race, which is gaining momentum.
 
 
 
  • The convergence of MedTech and pharma can generate innovative combination devices that promise significant therapeutic and commercial benefits
  • Combination devices such as advanced drug delivery systems offer more precise, predictable and personalized healthcare
  • The global market for advanced drug delivery systems is US$196bn and growing
  • Biosensors play a role in convergence and innovative drug delivery systems
  • Roger Kornberg, Professor of Medicine at Stanford University and 2006 Nobel Prize winner for Chemistry, describes the technological advances that are shaping new medical therapies

    

The convergence of MedTech and pharma and the role of biosensors

MedTech and pharma companies are converging.
What role do biosensors play in such a convergence?
 
Traditionally, MedTech and big pharma have progressed along parallel paths. More recently, however, their paths have begun to converge in an attempt to gain a competitive edge in a radically changing healthcare landscape. Convergence leverages MedTech’s technical expertise and pharma’s medical and biological agents to develop combination devices. These are expected to significantly improve diagnosis, monitoring and treatment of 21st century chronic lifetime diseases, and thereby make a substantial contribution to an evolving healthcare ecosystem that demands enhanced patient outcomes, and effective cost-containment.
 

Conventional diagnostics & drug delivery

Conventional in vitro diagnostics for common diseases are costly, time-consuming, and require centralized laboratories, experienced personnel and bulky equipment. Standard processes include the collection and transportation of biological samples from the point of care to a centralized laboratory for processing by experienced personnel. After the results become available, which usually takes days, the laboratory notifies doctors, who in turn contact patients and modify their treatments as required. Conventional modes of treatment have mainly consisted of simple, fast-acting pharmaceuticals dispensed orally or as injectables. Such limited means of drug delivery slow the progress of drug development, since most drugs are formulated to accommodate the conventional oral or injection delivery routes. Concerns about the quantity and duration of a drug’s presence, and its potential toxic effect on proximal non-diseased tissue, drive interest in alternative drug delivery systems and fuel the convergence of MedTech and pharma.



The end of in vitro diagnostics

Roger Kornberg, Professor of Medicine at Stanford University, reflects on the limitations of conventional in vitro diagnostics, and describes how technological advances facilitate rapid point-of-care diagnostics, which are easier and cheaper:

 
 
Converging interest
 
Illustrative of the MedTech-pharma convergence is Verily’s (formerly Google Life Sciences) partnership with Novartis to develop smart contact lenses to correct presbyopia (age-related farsightedness) and to monitor diabetes by measuring glucose in tears. Otsuka’s partnership with Proteus Digital Health is another example. This venture expects to develop an ingestible drug adherence device. Proteus already has an FDA-approved sensor, which measures medication adherence. Otsuka is embedding Proteus’s sensor, which is the size of a grain of sand, into its medication for severe mental illnesses in order to enhance drug adherence, which is a serious problem: some 50% of prescribed medication in the US is not taken as directed, resulting in unnecessary escalation of conditions and therapies, higher costs to health systems, and a serious challenge for clinical studies.

Drivers of change

The principal drivers of MedTech-pharma convergence include scientific and technological advances, ageing populations, increased chronic lifestyle diseases, emerging-market expansion, and developments in therapies. All have played a role in changing healthcare demands and delivery landscapes. Responding to these changes, both MedTech and pharma have continued to emphasize growth, while attempting to enhance value for payers and patients. This has resulted in cost cutting, and a sharper focus on high-performing therapeutics. It has also fuelled MedTech-pharma convergence and the consequent development of combination devices. According to Deloitte’s 2016 Global Life Science Outlook, combination devices “will likely continue to rapidly increase in number and application”.

MedTech’s changing business model
 
Over the past two decades, MedTech has been challenged by tighter regulatory scrutiny, and continued pressure on healthcare budgets, but advantaged by technological progress, which it has embraced to create new business models. This has been rewarded by positive healthcare investment trends. Over a similar period, pharma has been challenged by the expiry of its patents, advances in molecular science, and changing demographics, but buoyed by increased healthcare spending trends, although the forces that increase health costs are being tempered by a demand for value.

As pharma has been increasingly challenged, so interest has increased in the potential of MedTech to address some of the more pressing healthcare demands in a radically changing healthcare ecosystem. Unlike pharma, MedTech has leveraged social, mobile, and cloud technologies to develop new business models and innovative devices for earlier diagnoses, faster and less invasive interventions, enhanced patient monitoring, and improved management of lifetime chronic conditions.
 
Such innovations are contributing to cheaper, faster, and more efficient patient care, and shifting MedTech’s strategic focus away from curative care, such as joint replacements, to improving the quality of life for patients with chronic long-term conditions. This re-focusing of its strategy has strengthened MedTech commercially, and is rapidly changing the way in which healthcare is delivered, the way health professionals treat patients, and the way patients experience healthcare.
 
Josh Shachar, founder of several successful US technology companies and author of a number of patents, describes the new healthcare ecosystem and some of the commercial opportunities it offers, which are predicated on the convergence of MedTech and pharma:
 
 
The decline of big pharma’s traditional business model
 
Pharma’s one-size-fits-all traditional business model, which has fuelled its commercial success over the past century, is based on broad population averages. This is now in decline as patents expire on major drugs and product pipelines diminish. For example, over the past 30 years the expiry of pharma’s patents has cost the industry some US$240bn.

Advances in genetics and molecular biology, which followed the complete sequencing of the human genome in 2003, revolutionized medicine and shifted its focus from inefficient one-size-fits-all drugs to personalized therapies that match patients to drugs via diagnostic tests and biomarkers in order to improve outcomes and reduce side effects. Already 40% of drugs in development are personalized medicines, and this is projected to increase to nearly 70% over the next five years.

Today, analysts transform individuals’ DNA information into practical data, which drives drug discovery and diagnostics, and tailors medicines to treat individual diseases. This personalized medicine aims to target the right therapy to the right patient at the right time, in order to improve outcomes and reduce costs, and is transforming how healthcare is delivered and diseases managed. 

 
Personalized medicine

Personalized medicine has significantly dented pharma’s one-size-fits-all strategies. In general, pharma has been slow to respond to external shocks, and slow to renew its internal processes of discovery and development. As a result, the majority of new pharma drugs offer only marginal benefits. Today, pharma finds itself trapped in a downward commercial spiral: its revenues have plummeted, it has shed thousands of jobs, it has a dearth of one-size-fits-all drugs, and replacement drugs are difficult to find - and when they are found, they are often too expensive.

Illustrative of the advances in molecular science that helped to destroy pharma’s traditional commercial strategy is the work of Kornberg. Here he describes an aspect of his work related to how biological information encoded in the genome is accessed to direct all human activity and the construction of organisms - the work for which Kornberg received the 2006 Nobel Prize in Chemistry, and which created the foundations of personalized medicine:

 

  
Advanced drug delivery systems
 
Over the past 20 years, as pharma has struggled commercially and MedTech has shifted its business model, drug delivery systems have advanced significantly. Evolving sensor technologies have played a role in facilitating some of these advances, and are positioned to play an increasingly important role in the future of advanced drug delivery. According to BCC Research, the global market for advanced drug delivery systems, which increase bioavailability, reduce side effects, and improve patient compliance, increased from US$134bn in 2008 to some US$196bn in 2014.
 
The growth drivers for innovative drug delivery systems include recent advances in biological drugs such as proteins and nucleic acids, which have broadened the scope of therapeutic targets for a number of diseases. There are, however, challenges.

 

Proteins are important structural and functional biomolecules that are a major part of every cell in your body. There are two nucleic acids: DNA and RNA. DNA stores and transfers genetic information, while RNA delivers information from DNA to protein-builders in the cells.


For instance, RNA is inherently unstable, and potentially immunogenic, and therefore requires innovative, targeted delivery systems. Such systems have benefitted significantly from progress in biomedical engineering and sensor technologies, which have enhanced the value of discoveries of bioactive molecules and gene therapies, and contributed to a number of new, advanced and innovative combination drug delivery systems, which promise to be more efficacious than conventional ones. 
 
Biosensors
 
The use of biosensors in drug delivery systems is not new. The insulin pump is one example. Introduced in its present form some 30 years ago, the insulin pump is a near-physiologic, programmable method of insulin delivery that is flexible and lifestyle-friendly.

Biosensors are analytical tools that convert biological responses into electrical signals. In healthcare, they provide analyses of chemical or physiological processes and transmit that physiologic data to an observer or to a monitoring device. Historically, data outputs generated from these devices were either analog in nature or aggregated in a fashion that was not conducive to secondary analysis. The latest biosensors are wearable and provide vital sign monitoring of patients, athletes, premature infants, children, psychiatric patients, people who need long-term care, the elderly, and people in remote regions.
 
Increased accuracy and speed
 
The success of biosensors is associated with their ability to achieve very high levels of precision in measuring disease-specific biomarkers in both in vitro and in vivo environments. They use a biological element - such as enzymes, antibodies, receptors, tissues or microorganisms - capable of recognizing or signalling real-time biochemical changes in different inflammatory diseases and tumors. A transducer then converts the biochemical signal into a quantifiable signal that can be transmitted, detected and analysed, giving biosensors the potential, among other things, to support rapid, accurate diagnosis and disease management.
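As a rough illustration of the read-out step described above, the sketch below converts a raw electrical reading into an analyte concentration using a simple linear calibration. Real biosensors use device-specific calibration models; the glucose and current figures here are invented assumptions.

```python
# Hypothetical sketch: turn a biosensor's raw electrical signal into a
# concentration via a linear calibration. All numbers are invented.

def calibrate(raw_readings, known_concentrations):
    """Fit signal = slope * concentration + intercept by least squares."""
    n = len(raw_readings)
    mean_x = sum(known_concentrations) / n
    mean_y = sum(raw_readings) / n
    numerator = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(known_concentrations, raw_readings))
    denominator = sum((x - mean_x) ** 2 for x in known_concentrations)
    slope = numerator / denominator
    intercept = mean_y - slope * mean_x
    return slope, intercept

def signal_to_concentration(signal, slope, intercept):
    """Invert the calibration to report a concentration for a new reading."""
    return (signal - intercept) / slope

if __name__ == "__main__":
    # Assumed calibration data: sensor current (nA) at known glucose levels (mmol/L).
    concentrations = [2.0, 4.0, 6.0, 8.0]
    currents = [10.1, 19.8, 30.2, 39.9]
    slope, intercept = calibrate(currents, concentrations)
    print(round(signal_to_concentration(25.0, slope, intercept), 2))  # ~5.0 mmol/L
```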
 
Recent technological advances have led to the development of biosensors capable of detecting target molecules in very low quantities, and these are considered to have enhanced capacity for increased accuracy and speed of diagnosis, prognosis and disease management. Biosensors are robust, inexpensive and easy to use, and, more importantly, they do not require any sample preparation since they are able to detect almost any biomarker - protein, nucleic acid, small molecule, etc. - within a pool of other biomolecular substances. Recently, researchers have developed various innovative strategies to miniaturize biosensors so that they can be used as an active, integral part of tissue engineering systems and implanted in vivo.

 
Market for biosensors
 
Over the past decade, the market in biosensors and bioinformatics has grown, driven by advances in artificial intelligence (AI), increased computing power, enhanced network connectivity, miniaturization, and large data storage capacity.

Today, biosensors represent a rapidly expanding field estimated to be growing at 60% per year, albeit from a low start. In addition to providing a critical analytical component for new drug delivery systems, biosensors are used for environmental and food analysis, and production monitoring. The estimated annual world analytical market is about US$12bn, of which 30% is in healthcare. There is a vast market expansion potential for biosensors because less than 0.1% of the analytical market is currently using them.

A significant impetus for this growth comes from the healthcare industry, where there is increasing demand for inexpensive and reliable sensors across many aspects of both primary and secondary healthcare. It is reasonable to assume that a major biosensor market will be where an immediate assay is required, and in the near term patients will use biosensors to monitor and manage treatable lifetime conditions, such as diabetes, cancer, and heart disease.

The integration of biosensors with drug delivery
 
The integration of biosensors with drug delivery systems supports improved disease management and better patient compliance, since all information with respect to a person’s medical condition may be monitored and maintained continuously. It also increases the potential for implantable pharmacies, which can operate as closed-loop systems that facilitate continuous diagnosis, treatment and prognosis without vast data processing and specialist intervention. A number of diseases require continuous monitoring for effective management. For example, frequent measurement of blood flow changes could improve the ability of healthcare providers to diagnose and treat patients with vascular conditions, such as those associated with diabetes and high blood pressure. Further, physicochemical changes in the body can indicate the progression of a disease before it manifests itself, and early detection of illness and its progression can increase the efficacy of therapeutics.
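The closed-loop principle can be illustrated with a deliberately simplified sketch: read a sensor value, compare it with a target range, and adjust delivery accordingly. This is a conceptual toy, not a clinical dosing algorithm; the thresholds, readings and dose steps are invented assumptions.

```python
# Toy closed-loop controller: adjust a delivery rate according to whether the
# latest sensor reading sits inside an assumed target range. Illustration only.

def closed_loop_step(sensor_value, target_low, target_high, current_dose, step=0.5):
    """Return an adjusted dose based on whether the reading is in range."""
    if sensor_value > target_high:
        return current_dose + step                 # reading too high: increase delivery
    if sensor_value < target_low:
        return max(0.0, current_dose - step)       # reading too low: back off
    return current_dose                            # in range: no change

if __name__ == "__main__":
    dose = 2.0
    for reading in [9.5, 8.2, 6.9, 4.1, 5.5]:      # hypothetical sensor readings
        dose = closed_loop_step(reading, target_low=4.5, target_high=7.5,
                                current_dose=dose)
        print(f"reading {reading} -> dose {dose}")
```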
 
Takeaways

Combination devices, which are triggered by the convergence of MedTech and pharma, offer substantial therapeutic and commercial opportunities. There is significant potential for biosensors in this convergence. The importance of biosensors is associated with their operational simplicity, high sensitivity, ability to perform multiplex analysis, and capability to be integrated into different functions using the same chip. However, there remain non-trivial challenges in reconciling the demands of performance and yield with simplicity and affordability.
 
 
 
  • Chinese scientists lead the world in editing genomes of human embryos in order to develop new therapies for intractable diseases
  • US and UK regulators have given permission to edit the genes of human embryos
  • CRISPR-Cas9 has become the most common gene editing platform, and acts like a pair of molecular scissors
  • CRISPR technology has the potential to revolutionize medicine, but critics say it could create a two-tiered society with elite citizens, and an underclass and have called for a worldwide moratorium on gene editing
  • Roger Kornberg, Professor of Medicine at Stanford University and 2006 Nobel Prize winner for Chemistry, explains the science that underpins gene-editing technology
  
Gene editing positioned to revolutionise medicine
 
It is a world first for China.
 
In 2015, a group of Chinese scientists edited the genomes of human embryos in an attempt to modify the gene responsible for β-thalassemia, a potentially fatal blood disorder. Researchers, led by Junjiu Huang from Sun Yat-sen University in Guangzhou, published their findings in the journal Protein & Cell.
 
In April 2016, another team of Chinese scientists reported a second experiment, which used the same gene editing procedure to alter a gene associated with resistance to the HIV virus. The research, led by Yong Fan, from Guangzhou Medical University, was published in the Journal of Assisted Reproduction and Genetics. At least two other groups in China are pursuing gene-editing research in human embryos, and thousands of scientists throughout the world are increasingly using a gene-editing technique called CRISPR-Cas9.
 
 

CRISPR-Cas9

Almost all cells in any living organism contain DNA, a molecule that is passed on from one generation to the next. The genome is the entire sequence of DNA of an organism. Gene editing is the deliberate alteration of a selected DNA sequence in a living cell. CRISPR-Cas9 is a cheap and powerful technology that makes it possible to precisely “cut and paste” DNA, and it has become the most common tool for creating genetically modified organisms. Using CRISPR-Cas9, scientists can target specific sections of DNA, delete them, and if necessary, insert new genetic sequences. In its most basic form, CRISPR-Cas9 consists of a small piece of RNA, a genetic molecule closely related to DNA, and an enzyme protein called Cas9. The CRISPR component is the programmable molecular machinery that aligns the gene-editing tool at exactly the correct position on the DNA molecule. Then Cas9, a bacterial enzyme, cuts through the strands of DNA like a pair of molecular scissors. Gene editing differs from gene therapy, which is the introduction of normal genes into cells in place of missing or defective ones in order to correct genetic disorders.
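As a rough illustration of how a CRISPR-Cas9 target is specified, the toy sketch below scans one strand of a DNA string for a 20-letter guide sequence immediately followed by an “NGG” PAM motif, the signal Cas9 needs before it cuts. The sequences are invented, and real guide design involves the reverse strand, off-target checks and many other criteria, so this is an illustrative assumption rather than a working design tool.

```python
# Toy sketch: find positions where an invented 20-nt guide sequence is followed
# by an NGG PAM, the simplified condition for Cas9 to cut. Illustration only.

def find_target_sites(dna, guide):
    """Return sites where the guide matches and is followed by an NGG PAM."""
    sites = []
    glen = len(guide)
    for i in range(len(dna) - glen - 2):
        protospacer = dna[i:i + glen]
        pam = dna[i + glen:i + glen + 3]
        if protospacer == guide and pam[1:] == "GG":   # PAM = any base + "GG"
            # Cas9 cuts roughly 3 bases upstream of the PAM
            sites.append({"position": i, "cut_site": i + glen - 3, "pam": pam})
    return sites

if __name__ == "__main__":
    guide = "GATTACAGGCTTACGGATCC"            # invented 20-nt guide
    dna = "AAAC" + guide + "TGG" + "TTTT"     # guide followed by a TGG PAM
    print(find_target_sites(dna, guide))      # one site, with its predicted cut point
```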
 
Ground-breaking discovery 

The ground-breaking discovery of how CRISPR-Cas9 could be used in genome editing was first described by Jennifer Doudna, Professor of Chemistry and Cell Biology at the University of California, Berkeley, and Emmanuelle Charpentier, a geneticist and microbiologist now at the Max Planck Institute for Infection Biology in Berlin, and published in the journal Science in 2012.

In 2011 Feng Zhang, a bioengineer at the Broad Institute of MIT and Harvard, learned about CRISPR and began working on adapting it for use in human cells. His findings were published in 2013, and demonstrated how CRISPR-Cas9 can be used to edit the human genome in living cells.

Subsequently, there has been an on-going battle between the scientists and their respective institutions over the discovery of CRISPR’s use in human cells, and over who is entitled to the technology’s patents.
 
Gene editing research gathers pace worldwide: a few western examples

In 2016 a US federal biosafety and ethics panel licensed scientists at the University of Pennsylvania’s new Parker Institute for Cancer Immunotherapy to undertake the first human study to endow T-cells with the ability to attack specific cancers. Patients in the study will become the first people in the world to be treated with T-cells that have been genetically modified.

T-cells are designed to fight disease, but puzzlingly they are almost useless at fighting cancer. Carl June, the Parker Institute’s director, and his team of researchers will alter three genes in the T-cells of 18 cancer patients, essentially transforming the cells into super fighters. The patients will then be re-infused with the cancer-fighting T-cells to see if they will seek and destroy cancerous tumors.

Also in 2016, the UK’s Human Fertilisation and Embryology Authority (HFEA), which regulates fertility clinics and research, granted permission to a team of scientists led by Kathy Niakan at the Francis Crick Institute in London to edit the genes of human IVF embryos in order to investigate the causes of miscarriage. Out of every 100 fertilized eggs, fewer than 50 reach the early blastocyst stage, 25 implant into the womb, and only 13 develop beyond three months.
 
Fredrik Lanner, a developmental biologist at the Karolinska Institute in Stockholm, is also using gene editing in an endeavour to discover new ways to treat infertility and prevent miscarriages. Lanner is the first researcher to modify the DNA of healthy human embryos in order to learn more about how genes regulate early embryonic development. Lanner, like other scientists using gene-editing techniques on human embryos, is meticulous in not allowing them to result in a live birth. He only studies the modified embryos for the first seven days of their growth, and never lets them develop past 14 days. “The potential benefits could be enormous”, he says.
 
Gene editing cures in a single treatment

Doctors at IVF clinics can already test embryos for genetic diseases, and pick the healthiest ones to implant into women. An advantage of gene editing is that potentially it could be used to correct genetic faults in embryos instead of picking those that happen to be healthy. This is why the two Chinese research papers represent a significant turning point. The gene editing technology they used has the potential to revolutionize the whole fight against devastating diseases, and to do many other things besides. The main benefit of gene editing therapy is that it provides potential cures for intractable diseases with a single treatment, rather than multiple treatments with possible side-effects.
 

The promise of gene editing for fatal and debilitating diseases
 
Among other things, gene editing holds out promise for people with fatal or debilitating inherited diseases. There are over 4,000 known inherited single-gene conditions, affecting about 1% of births worldwide. They include: cystic fibrosis, which affects about 70,000 people worldwide, 30,000 in the US and about 10,000 in the UK; Tay-Sachs disease, which results in spasticity and death in childhood; the inherited BRCA1 and BRCA2 genes, which predispose women to a significantly greater chance of developing breast or ovarian cancer; sickle-cell anaemia, in which inheriting the sickle cell gene from both parents causes the red blood cells to spontaneously “sickle” during a stress crisis; heart disease, of which many types are passed on genetically; haemophilia, a bleeding disorder caused by the absence of a genetic clotting factor; and Huntington’s disease, a genetic condition which slowly kills victims by affecting cognitive function and neurological status. Further, genomics plays a significant role in mortality from chronic conditions such as cancer, diabetes and heart disease.
 
A world first

Huang and his colleagues set out to see if they could replace a gene in a single-cell fertilized human embryo; in principle, all cells produced as the embryo develops would then carry the replaced gene. The embryos used by Huang were obtained from fertility clinics, but had an extra set of chromosomes, which prevented them from resulting in a live birth, though they did undergo the first stages of development. The technique used by Huang’s team involved injecting embryos with the enzyme complex CRISPR-Cas9, which, as described above, acts like a pair of molecular scissors that can be designed to find and remove a specific strand of DNA inside a cell, and then replace it with a new piece of genetic material.
 
The science underpinning gene editing

In the two videos below Roger Kornberg, Professor of Medicine at Stanford University and 2006 Nobel Prize winner for Chemistry for his work on “transcription”, the process by which DNA is converted into RNA, explains the science that underpins gene-editing technology:
 
How biological information, encoded in the genome, is accessed for all human activity

 
 
Impact of human genome determination on pharmaceuticals
 
An immature technology
 
Huang’s team injected 86 embryos and then waited 48 hours - enough time for the CRISPR-Cas9 system, and the molecules that replace the missing DNA, to act, and for the embryos to grow to about eight cells each. Of the 71 embryos that survived, 54 were genetically tested. Only 28 were successfully spliced, and only a fraction of those contained the replacement genetic material.
 
Therapy to cure HIV
 
Fan, the Chinese scientist who used CRISPR in an endeavor to discover a therapy for HIV/AIDS, collected 213 fertilized human eggs donated by 87 patients, which, like the embryos used by Huang, were unsuitable for implantation as part of in vitro fertility therapy. Fan used CRISPR-Cas9 to introduce into some of the embryos a mutation that cripples an immune-cell gene called CCR5. Some humans who naturally carry this mutation are resistant to HIV, because the mutation alters the CCR5 protein in a way that prevents the virus from entering the T-cells it tries to infect. Fan’s analysis showed that only 4 of the 26 human embryos targeted were successfully modified.
 
Deleting and altering genes not targeted
 
In 2012, soon after scientists reported that CRISPR could edit DNA, experts raised concerns about “off-target effects”, where CRISPR inadvertently deletes or alters genes not targeted by the scientists. This can happen because one molecule in CRISPR acts like a bloodhound, sniffing around the genome until it finds a match to its own specific sequence. Unfortunately, the human genome contains billions of DNA bases, among which there can be many near-matches, raising the possibility that the procedure cuts at more than one site.
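The scale of the off-target problem can be illustrated with a simple scan for near-matches: sites that differ from the guide sequence by only a few bases are candidate places where Cas9 might cut unintentionally. The sequences and mismatch threshold below are invented assumptions, and real off-target prediction tools also weigh mismatch positions and the PAM.

```python
# Toy off-target scan: report every window of the DNA that differs from the
# guide by no more than a chosen number of bases. Sequences are invented.

def count_mismatches(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(1 for x, y in zip(a, b) if x != y)

def potential_off_targets(dna, guide, max_mismatches=3):
    """Return (position, mismatches) for every near-match to the guide."""
    hits = []
    glen = len(guide)
    for i in range(len(dna) - glen + 1):
        mismatches = count_mismatches(dna[i:i + glen], guide)
        if mismatches <= max_mismatches:
            hits.append((i, mismatches))
    return hits

if __name__ == "__main__":
    guide = "GATTACAGGC"
    dna = "TTGATTACAGGCAAGATAACAGGCTT"   # contains an exact match and a near-match
    for position, mismatches in potential_off_targets(dna, guide, max_mismatches=2):
        print(f"position {position}: {mismatches} mismatch(es)")
```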
 
Huang is considering ways to decrease the number of “off-target” mutations: tweaking the enzymes to guide them more precisely to a desired spot; introducing the enzymes in a different format in order to regulate their lifespans, allowing them to be shut down before mutations accumulate; and varying the concentrations of the introduced enzymes and repair molecules. He is also considering using other gene-editing techniques, such as TALENs.

 
The slippery slope to eugenics

Despite the potential therapeutic benefits from gene editing, critics suggest that genetic changes to embryos, known as germline modifications, are the start of a “slippery slope” that could eventually lead to the creation of a two-tiered society, with elite citizens, genetically engineered to be smarter, healthier and to live longer, and an underclass of biologically run-of-the-mill humans.
 
Some people believe that the work of Huang, Fan and others crosses a significant ethical line: because germline modifications are heritable, they could have an unpredictable effect on future generations. Few people would argue against using CRISPR to treat terminal cancer patients, but what about treating chronic diseases or disabilities? If cystic fibrosis can be corrected with CRISPR, should obesity, which is associated with many life-threatening conditions, be corrected too? Who decides where the line is drawn?
 
Some 40 countries have banned CRISPR in human embryos. Two prominent journals, Nature and Science, rejected Huang’s 2015 research paper on ethical grounds, and subsequently Nature published a note calling for a global moratorium on the genetic modification of human embryos, suggesting that there are “grave concerns” about the ethics and safety of the technology.
 
A 2016 report from the Nuffield Council on Bioethics suggests that, because of the steep rise in genetic technology and the general availability of cheap, simple-to-use gene-editing kits, which make it relatively straightforward for enthusiasts outside laboratories to perform experiments, there need to be internationally agreed ethical codes before the technology develops further.
 
Recently, the novelist Kazuo Ishiguro, among others, joined the debate, arguing that social changes unleashed by gene editing technologies could undermine core human values. “We’re coming close to the point where we can, objectively in some sense, create people who are superior to others,” says Ishiguro.
 
Takeaways

CRISPR has been described as the “Model T of genetics”. Just as the Model T was the first motor vehicle to be successfully mass-produced, making driving cheap and accessible to the masses, so CRISPR has made the complex process of altering any piece of DNA in any species easy, cheap, reliable and accessible to scientists throughout the world. Although CRISPR still faces some technical challenges, and notwithstanding that it has ignited significant protests on ethical grounds, there is now a global race to push the boundaries of its capabilities well beyond their present limits.
 
 
  • Influenza, or flu, outbreaks are recurrent and every year pose  a significant risk to global health
  • Influenza affects millions: each year 3m to 5m cases of severe disease and 500,000 deaths
  • Pandemics occur about three times a century
  • The 1918 flu pandemic killed 21m people; total deaths in World War I were 17m
  • Effective treatment of patients with respiratory illness depends on accurate and timely diagnosis
  • Early diagnosis of influenza can reduce the inappropriate use of antibiotics and provide the option of using antiviral therapy
  • Rapid Influenza Diagnostic Tests (RIDTs) are useful in determining whether outbreaks of respiratory disease might be due to influenza
  • RIDTs vary in their sensitivity, specificity, complexity, and time to produce results
  • There is a pressing need for faster, cheaper, and easier-to-use flu tests with higher levels of sensitivity and specificity than those currently available
  • The large, fast-growing, global and under-served RIDT market drives a host of initiatives
  • Various development challenges pose significant threats
 
 
The critical importance of new rapid influenza diagnostic tests
 
What challenges face developers of cheap, easy-to-use, rapid and accurate diagnostic tests for influenza, or flu, that improve on the tests currently available?

 
Influenza

Influenza is a highly contagious respiratory illness caused by a virus, and occurs in distinct outbreaks of varying extent every year. Its epidemiologic pattern reflects the changing nature of the antigenic properties of influenza viruses. The viruses’ subsequent spread depends upon multiple factors, including transmissibility and the susceptibility of the population. Influenza A viruses, in particular, have a remarkable ability to undergo periodic changes in the antigenic characteristics of their envelope glycoproteins: the hemagglutinin and the neuraminidase. Anyone can get influenza. It is usually spread by the coughs and sneezes of an infected person. You can also catch flu by touching an infected person (e.g. shaking hands). Adults are contagious one to two days before getting symptoms and up to seven days after becoming ill, which means that you can spread the influenza virus before you even know you are infected. Influenza presents as a sudden onset of high fever, myalgia, headache and severe malaise, cough (usually dry), sore throat, and runny nose. There are several treatment options, which aim to ease symptoms until the infection clears and to prevent complications. Most healthy people recover within one to two weeks without requiring any medical treatment. However, influenza can cause severe illness or death, especially in people at high risk such as the very young, the elderly, and people suffering from medical conditions such as lung diseases, diabetes, cancer, kidney or heart problems.
  
Costly killer

Influenza is a cruel, costly killer with a history of pandemics. It causes millions of upper respiratory tract infections every year as it spreads around the world in seasonal epidemics, and poses on-going risks to health. The most vulnerable are the young, the old and those with chronic medical conditions such as heart disease, respiratory problems and diabetes. Each year, on average, 5% to 20% of populations in wealthy countries get influenza. In the US it causes more than 200,000 hospitalizations and 36,000 deaths annually, and each year costs the American economy between US$71bn and US$167bn.
 
History of pandemics

The 1918-19 “Spanish Flu” pandemic caused 21m deaths, and was one of three 20th century influenza pandemics. At least four pandemics occurred in the 19th century, and the first pandemic of the 21st century was the 2009 “Swine Flu”. That pandemic proved less deadly than originally feared, but it still resulted in 18,449 laboratory-confirmed deaths, and once deaths from complications precipitated by the Swine Flu are counted the actual toll is significantly higher. Mindful of the potential for accelerated spread of a pandemic subtype of the influenza virus, the World Health Organization (WHO) and national governments continuously monitor influenza viruses. Assessment of pathogenicity and virulence is key to taking appropriate healthcare actions in the event of an outbreak.

However, without widespread access to improved diagnostic tests, each year millions will not receive timely anti-viral medication, tens of thousands of influenza sufferers will develop complications, and thousands will die unnecessarily, as the growing interconnections and complexity of the world present an increasing challenge to influenza prevention and control.
 

The influenza viruses

Influenza is a single-stranded, helically shaped, RNA virus of the orthomyxovirus family. The influenza viruses that cause seasonal epidemics in humans fall into two types: A and B. Influenza A has two subtypes that are important for humans, A(H3N2) and A(H1N1); the former is currently associated with most deaths. Influenza viruses are defined by two protein components, known as antigens, on the surface of the virus: haemagglutinin (H) and neuraminidase (N).

Influenza viruses circulate in all parts of the world and mutate continuously at a low level, referred to as "genetic drift", which allows influenza to evolve and escape the pressures of population immunity. This means that each individual remains susceptible to infection with new strains of the virus. "Genetic shift" occurs when a strain of influenza A virus completely replaces one or more of its gene segments with the homologous segments from another influenza A strain, a process known as reassortment. If the new segments come from an animal influenza virus to which humans have had no exposure and no immunity, a pandemic may ensue.
 
Gold standard diagnosis rarely used

The gold standard method for the detection of influenza viruses, viral culture, is rarely performed: patients with suspected influenza are most likely to be seen by a primary care doctor with limited resources, whereas the gold standard test requires sophisticated laboratory infrastructure and takes at least 48 hours. Even the faster Reverse Transcription-Polymerase Chain Reaction (RT-PCR) test, a molecular assay that amplifies and detects viral RNA, has a turnaround time of four to six hours. It is also expensive, and therefore not commonly used.

The slowness and expense of traditional influenza tests led to the development of an array of commercially available Rapid Influenza Diagnostic Tests (RIDTs), which screen for influenza viruses and provide results within as little as 15 minutes after sample collection and processing. Such tests are largely immunoassays that identify the presence of influenza A and B viral nucleoprotein antigens in respiratory specimens and display the results in a qualitative way (positive or negative). About 10 such tests have FDA approval and are available in the US, and about 20 have been deemed suitable for the European market; usage of all of them is growing. However, RIDTs vary in their sensitivity, specificity, complexity, and in the time needed to produce results.
  
Tests rule in Influenza but do not rule it out
 
According to the Centers for Disease Control and Prevention (CDC), the commercially available RIDTs in America have sensitivities ranging from 50% to 70%, which means that in up to 50% of influenza cases test results will still be negative. One study showed that tests for the H1N1 virus, the subtype of influenza A that caused most cases of Swine Flu in 2009 and is associated with the 1918 Spanish Flu pandemic, have sensitivities ranging from 32% to 50% depending on the brand of test. A 2012 meta-analysis of the accuracy of RIDTs reported an average sensitivity for detecting influenza in adults of only 54%; sensitivity in children is somewhat higher because they tend to shed a greater quantity of virus. Thus some 30% to 50% of flu samples that would register positive by the gold standard viral culture test may give a false negative when using a RIDT, and some RIDTs may indicate a false positive when a person is not infected with influenza. Currently available RIDTs therefore allow influenza to be ruled in but not ruled out. More sensitive tests are needed.
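To see why a negative RIDT result cannot rule influenza out, it helps to convert sensitivity and specificity into predictive values. The short Python sketch below is purely illustrative: the 54% sensitivity is the adult meta-analysis figure quoted above, while the 98% specificity and 30% in-season prevalence are assumptions chosen for the example, not figures from the studies cited.

```python
# Illustrative sketch: translating RIDT sensitivity and specificity into the
# chance that a positive or negative result is correct. The inputs are
# assumptions chosen to match the ranges quoted in the text, not study data.

def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (positive predictive value, negative predictive value)."""
    tp = sensitivity * prevalence                # true positives
    fn = (1 - sensitivity) * prevalence          # false negatives (missed flu)
    fp = (1 - specificity) * (1 - prevalence)    # false positives
    tn = specificity * (1 - prevalence)          # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Assumed inputs: 54% sensitivity (meta-analysis figure quoted above),
# 98% specificity and 30% prevalence of flu among tested patients in season.
ppv, npv = predictive_values(0.54, 0.98, 0.30)
print(f"PPV: {ppv:.0%}  NPV: {npv:.0%}")
# High PPV: a positive result reliably rules influenza in.
# NPV well below 100%: a negative result cannot rule it out.
```

Under these assumptions the negative predictive value comes out at roughly 83%, i.e. around one in six negative results would be a missed infection, which is precisely why the tests rule influenza in but not out.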
 
New flu tests

A number of innovative nano-scale molecular diagnostic influenza tests are in development, and are expected to deliver more accurate results than existing antigen-based tests. The new tests use a platform comprising an extremely thin layer of material that detects the presence of influenza proteins in saliva or blood. The platform is attached to an electronic chip, which turns it into a sensor: the essential part of the measuring device, because it converts the input signal into a quantity suitable for measurement and interpretation. The presence of influenza proteins in saliva or blood triggers an electrical signal in the chip, which is then communicated to a mobile phone.
 
Here Roger Kornberg, Professor of Medicine at Stanford University and 2006 Nobel Laureate in Chemistry, describes how advances in molecular science are enabling the replacement of traditional in vitro diagnostics with rapid, virtually instantaneous point-of-care diagnostics that do not require complex processes or elaborate infrastructure. Antiviral drugs for influenza are available in some countries and may reduce severe complications and deaths, but ideally they need to be administered early in the disease, within 48 hours of the onset of symptoms. An almost instantaneous point-of-care test would enable better access to appropriate treatment, particularly in primary care:

 
 
Challenges

Notwithstanding all the recent scientific advances, new and innovative influenza detection tests will need to overcome significant challenges to outperform current RIDTs. In addition to the usual challenges associated with sensitivity and specificity, developers have to be aware of recent changes in immunochromatographic antigen detection testing for influenza viruses, and of the rapid development of commercially available nucleic acid amplification tests. There are also the usual development challenges associated with miniaturization, fabrication, scaling, marketing, and regulation. Effective from 13 February 2017, the FDA reclassified antigen-based rapid influenza detection tests from class I to class II devices. Class II devices are considered higher risk than class I, and require greater regulatory controls to provide reasonable assurance of a device’s safety and effectiveness. The reclassification was provoked by the potential for the devices to fail to detect newer versions of the influenza virus. For instance, a novel variant of influenza A, H7N9, has emerged in Asia, and H5N1 is also re-emergent.
 
Another challenge, especially for start-ups with limited resources, is the fluctuating nature of the influenza virus itself. A bad year for patients, when influenza causes millions of people to become ill, is a good year for manufacturers of RIDTs. Conversely, a good year for patients, when influenza affects a lower percentage of the population, is a bad year for manufacturers who suffer from unsold inventory, and reduced revenues. Thus, the vagaries of the flu virus not only have the potential to kill millions of people, they also pose a significant threat to start-ups dedicated to developing RIDTs.
 
Takeaways

Despite all the challenges, there is a significant commercial opportunity for improved tests in the current under-served global RIDT market. Each year in the US more than 1bn visits are made to primary care doctors, and in the UK the NHS deals with over 1m patients every 36 hours. The global in vitro diagnostics market was valued at US$60bn in 2016, and is expected to grow at a CAGR of 5.5% to reach US$79bn by 2021. Over the same period, the global point-of-care diagnostics sub-market is expected to grow at a CAGR of 10% to reach US$37bn by 2021. Large corporates, small start-ups, and university research laboratories have spotted the opportunity, and have started developing new and innovative RIDTs. Given that influenza causes widespread morbidity and mortality each year, it should be a matter of priority to support all efforts to develop swift and reliable RIDTs. A significant step forward would be a RIDT with sufficient sensitivity and usability that the test could be administered and a result given within a 10-minute primary care consultation.
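The market projections above are simple compound-growth arithmetic. The Python sketch below is a back-of-envelope check only: it reproduces the quoted 2021 figure from the 2016 base and the stated CAGR, and the implied point-of-care base is an inference, not a number given in the text.

```python
# Back-of-envelope check of the market projections quoted above (illustrative only).

def project(value_bn: float, cagr: float, years: int) -> float:
    """Compound a starting value forward at a constant annual growth rate."""
    return value_bn * (1 + cagr) ** years

# US$60bn in 2016 growing at 5.5% a year for five years:
print(f"IVD market, 2021: ~US${project(60, 0.055, 5):.0f}bn")        # ~US$78bn, consistent with US$79bn

# The point-of-care figure of US$37bn by 2021 at 10% CAGR implies a 2016 base of
# roughly US$37bn / 1.10**5 (an inference, not a figure given in the text):
print(f"Implied point-of-care base, 2016: ~US${37 / 1.10**5:.0f}bn")  # ~US$23bn
```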
 
 
  • People are using A&E departments as convenient drop-in clinics for minor ailments because they cannot get GP appointments
  • In January 2017 the British Red Cross said A&E departments were facing a "humanitarian crisis" as they struggled to keep up with a rush of patients over the winter
  • UK’s Prime Minister suggests that all GP surgeries should open from 8am to 8pm, 7 days a week 
  • Primary care in England is in crisis, fuelled by a large and increasing demand and a shrinking supply of GPs
  • 75% of GPs over the age of 55 across 540 general practices are nearing retirement, and newly trained GPs are seeking employment abroad
  • By 2020 there could be a shortfall of 10,000 GPs in England
  • Curing the primary care crisis would relieve pressure on A&E departments
  • A simple, cheap and easy-to-use online dashboard could help relieve the primary healthcare crisis
 
A smarter approach to the UK’s GP crisis
 
Could the vast and escalating primary care crisis in England be helped with a new and innovative online dashboard, which automatically sends short videos contributed by clinicians to patients’ mobiles to address their FAQs?
 
Dr Seth Rankin, an experienced GP, thinks it can. Click on the photo below to access a short video, which demonstrates how the dashboard works.

 
 
 

UK’s Secretary of State predicted the healthcare crisis
 
The UK’s Secretary of Health has frequently stressed the urgent need for more innovation in healthcare. In 2015 he said: “If we do not find better, smarter ways to help our growing elderly population remain healthy and independent, our hospitals will be overwhelmed – which is why we need effective, strong and expanding general practice more than ever before in the history of the NHS.”
 
An easy and effective way to improve GP services

“Most patients don’t remember half of what is said in short GP consultations. This is why videos are so important. Unlike doctors and pamphlets, videos never get tired, never wear out, and are available 24/7, 365 days a year. Unlike the Internet, the dashboard provides premium, reliable healthcare information, which can easily be consumed by patients and shared among family, friends and carers. The video content can be viewed many times, from anywhere, and at any time. The dashboard is fully automated [see figure below], relieves GPs of a lot of unnecessary work, and, importantly, reports on how patients use the different videos,” says Rankin, CEO of the London Doctors Clinic, formerly managing partner of the Wandsworth Medical Centre, and co-chair of Wandsworth CCG’s Diabetes Group.
 
A fully automated dashboard to improve efficiency and increase the quality of care
 
 
Reducing unnecessary A&E visits

“The dashboard uses videos of local healthcare professionals because both patients and doctors want to improve their connectivity. The dashboard is embedded with about 120 short, 60 to 80 second, talking-head videos, which address patients’ frequently asked questions. Research suggests that the average attention span of people watching videos on mobiles is between 60 and 80 seconds. The dashboard has been specifically designed to help increase patients’ knowledge of their condition, propel them towards self-management, slow the onset of complications, lower the number of unnecessary visits to A&E, reduce face-time with GPs, and enhance the quality of care,” says Rankin.
 
Essential behavioral techniques

The efficacy of healthcare education is enhanced by embedded behavioral techniques, which nudge people to change their diets and lifestyles, improve self-monitoring of their condition, and increase adherence to medications.  The HealthPad dashboard benefits from such behavioral techniques.
 
Part of comprehensive communications system

The dashboard has been developed by health professionals with significant patient input, and aims to get effective educational content to the largest number of people at the lowest price possible; and without requiring effort from health professionals to mediate or facilitate the flow of the knowledge. To achieve this the dashboard is not a “lock-in” system, but designed to be easily and cheaply re-engineered to integrate with various other communications systems, see diagram below. The only thing that the dashboard requires is a connection to the Internet. 
 

 
GP surgeries at saturation point

A 2016 study published in The Lancet suggests that between 2007 and 2014 the workload in NHS general practice in England had increased by 16%, and that it is now reaching saturation point. According to Professor Richard Hobbs of Oxford University and lead author of the study, "For many years, doctors and nurses have reported increasing workloads, but for the first time, we are able to provide objective data that this is indeed the case ... As currently delivered, the system [general practice in England] seems to be approaching saturation point ... Current trends in population growth, low levels of recruitment and the demands of an ageing population with more complex needs will mean consultation rates will continue to rise.”
 
More than 1m patients visit GPs every day

A 2014 Deloitte’s report commissioned by the Royal College of General Practitioners (RCGP) suggests that the GP crisis in England is the result of chronic under-funding and under-investment when the demand for GP services is increasing as the population is ageing, and there is a higher prevalence of long-term conditions and multi-morbidities.
 
Each day in England, more than 1m patients visit their GPs. Some GPs routinely see between 40 and 60 patients daily. Over the past 5 years, the number of GP consultations has risen by 60m a year, and now stands at about 370m a year. Over the same period, the number of GPs has grown by only 4.1%.
 
Stress levels among GPs are high and increasing

Deloitte’s findings are confirmed by a 2016 comparative study undertaken by the prestigious Washington DC-based Commonwealth Fund, which concluded that increasing workloads, bureaucracy and the shortest time with patients among the countries surveyed have led to 59% of NHS GPs finding their work either “extremely” or “very” stressful: significantly higher stress levels than in any other western nation. GP stress levels are likely to increase.
 
In a speech made in June 2015, the UK’s Secretary of Health said, “Within 5 years we will be looking after a million more over-70s. The number of people with three or more long term conditions is set to increase by 50% to nearly three million by 2018. By 2020, nearly 100,000 more people will need to be cared for at home.” Dr. Maureen Baker, the former chair of the Royal College of General Practitioners (RCGP) has warned that, “Rising patient demand, excessive bureaucracy, fewer resources, and a chronic shortage of GPs are resulting in worn-out doctors, some of whom are so fatigued that they can no longer guarantee to provide safe care to patients.” And Dr  Helen Stokes-Lampard, the new head of the RCGP, warns that patients are being put at risk because they often have to wait for a month before they can see a GP.

 
Newly trained GPs are seeking employment abroad

The number of trainee GPs is dwindling and young GPs are moving abroad. According to data from the General Medical Council (GMC), between 2008 and 2014 an average of 2,852 certificates were issued annually to enable British doctors to work abroad. There is now a dangerous situation in which there are hundreds of vacancies for GP trainees. Meanwhile, a 2015 British Medical Association (BMA) poll of 15,560 GPs found that 34% of respondents plan to retire in the next five years because of high stress levels, unmanageable workloads, and too little time with patients.
 
5,000 more GPs by 2020

In 2016 the government announced a rescue package that will see an extra £2.4bn a year ploughed into primary care services by 2020. This is expected to pay for 5,000 more GPs and extra staff to boost practices. When the Secretary of Health trailed this in 2015, doctors’ leaders did not view it as a viable solution. Dr Chaand Nagpaul, chair of the BMA’s GP committee, warned that, “delivering 5,000 extra GPs in five years, when training a GP takes 10 years, was a practical impossibility and would never be achieved.” In 2016, Pulse, a publication for GPs, suggested that the Health Secretary understands that he cannot deliver on his election promise of 5,000 new doctors by 2020, and is negotiating with Apollo Hospitals, an Indian hospital chain, to bring 400 Indian GPs to England.
 
Pharmacists in GP surgeries
 
In July 2015 the NHS launched a £15m pilot scheme, supported by the RCGP and the Royal Pharmaceutical Society (RPS), to fund, recruit and employ clinical pharmacists in GP surgeries to provide patients with additional support for managing medications and better access to health checks.
 
Dr Maureen Baker said, “GPs are struggling to cope with unprecedented workloads and patients in some parts of the country are having to wait weeks for a GP appointment, yet we have a ‘hidden army’ of highly trained pharmacists who could provide a solution”. Ash Soni, former president of the RPS, suggested that it makes sense for pharmacists to help relieve the pressure on GPs, and said, “Around 18m GP consultations every year are for minor ailments. Research has shown that minor ailment services provided by pharmacists can provide the same treatment results for patients, but at lower cost than at a GP surgery.”
 
Progressive and helpful move
 
The efficacy of an enhanced role for pharmacists in primary care has already been established in the US, where retail giants such as CVS, Walgreens and Rite Aid provide convenient walk-in clinics staffed by pharmacists and nurse practitioners. Over time, Americans have grown to trust and value their relationships with pharmacists, which has significantly increased adherence to medications and freed up doctors' time for more complex cases. Non-adherence is costly, and can lead to increased visits to A&E, unnecessary complications, and sometimes death. According to a New England Healthcare Institute report, Thinking Beyond the Pillbox, failure to take medication correctly costs the US healthcare system US$300bn annually, and results in 125,000 deaths every year.
 
Takeaway

People with complex conditions deserve to be seen by a GP who is not stressed and who can devote the time and attention they need. “Videos could play a similar role to practice-based pharmacists. Both deal with simple day-to-day patient questions, and relieve pressure on GPs, which allows them to focus their skills where they are most needed,” says Rankin.
 
 
  • Psoriasis is a serious chronic inflammatory disorder of the immune system
  • It affects more than 100m people worldwide: 1.2m in Britain, 7.5m in the US
  • The condition runs in families
  • Symptoms include red patches of skin covered with silvery scales that itch and burn
  • 30 to 40% of people with psoriasis may experience psoriatic arthritis, which may lead to chronic pain, disability and sometimes mutilating joint disease
  • New drugs are changing the prospects for people with psoriasis and psoriatic arthritis
  • Dr. Sonya Abraham, Imperial College London, and British Psoriatic Arthritis Consortium describes some of the causes of psoriasis, and prospects for future therapies
 
At war with my skin and joints
 
The novelists John Updike and Vladimir Nabokov, among others, suffered from psoriasis, which significantly influenced and shaped their respective lives.
 
Psoriasis affects more than 100m people worldwide: 1.2m in Britain, and 7.5m in the US. 30 to 40% of these can experience psoriatic arthritis. In 2014, the WHO recognized psoriasis as a serious chronic non-communicable disease, and suggested that people with the condition suffer needlessly because of, “incorrect or delayed diagnosis, inadequate treatment options and insufficient access to care, and because of social stigmatization”.


Therapies for psoriasis include a range of topical and systemic medications as well as phototherapy. Many of the systemic therapies can also reduce the pain and disability from arthritis and other manifestations of the condition.
 
Health professionals have been somewhat constrained by the limited therapies specifically for psoriasis, but this is beginning to change. New treatments are improving the prospects for people with psoriasis, and psoriatic arthritis. “The outlook is good for the millions of people with the conditions”, says Dr. Sonya Abraham of Imperial College London, and a member of the British Psoriatic Arthritis Consortium (Brit-PACT):
 
 

Psoriasis

Psoriasis is a serious chronic inflammatory disorder triggered by a fault in the immune system that causes the overproduction of skin cells. It runs in families, and has an unpredictable course of symptoms. It mainly presents in adults, usually before the age of 35.

Psoriasis mostly affects the skin and joints, and usually occurs on the scalp, knees, elbows, hands and feet. It may also affect the fingernails, the toenails, the soft tissues of the genitals and the inside of the mouth. The condition is characterized as ‘mild’, ‘moderate’, or ‘severe’ according to the amount of body surface area (BSA) affected and the severity of redness, thickness, and scaling of the skin. Approximately 80% of those affected have mild to moderate disease, while 20% have moderate to severe psoriasis affecting more than 5% of the body surface area. The most common form of psoriasis, affecting about 80 to 90% of psoriasis patients, is ‘plaque psoriasis’, which is characterized by patches of raised, reddish skin covered with silvery-white scale. Other forms of psoriasis include inverse, erythrodermic, pustular, guttate and nail disease.

 
Psoriatic arthritis and associated conditions

Below Sonya Abraham describes some of the causes of psoriatic arthritis and the effects that the condition may have on various organs of the body. Up to 40% of people with psoriasis experience joint inflammation that produces symptoms of arthritis. Psoriatic arthritis can lead to chronic pain and change in physical appearance. Patients suffering from psoriatic arthritis have reduced physical fitness, compared to psoriasis patients without arthritis. Typically, psoriatic arthritis occurs in conjunction with longstanding skin lesions, but it can occur in the absence of psoriasis.

Psoriasis and psoriatic arthritis may be associated with other diseases and conditions. The incidence of Crohn’s disease and ulcerative colitis, two types of inflammatory bowel disease, is 3.8 to 7.5 times greater in psoriasis patients than in the general population. Patients with psoriasis also have an increased incidence of lymphoma, heart disease, obesity, type-2 diabetes and metabolic syndrome. Depression and suicide, smoking, and alcohol consumption are also more common in psoriasis patients.
 
 
Causes of psoriatic arthritis

What does psoriatic arthritis do to the body?

Living with psoriasis

Updike was affected by psoriasis throughout his life, and his writings provide a vivid insight into the significant physical and psychological challenges that sufferers face. In his memoir Self-Consciousness he devotes a chapter to the condition, and in 1985 he wrote a personal history of his psoriasis for The New Yorker entitled “At War with My Skin”.
 
Updike says that he became a writer because of his psoriasis: “writers do not have to be presentable”. He married young because he found a person “who forgave” his skin, and moved to a small town in Massachusetts near a beach where he could swim and sunbathe to relieve his symptoms. During the cold New England winters Updike moved to the Caribbean, where he could continue to swim and sunbathe. The stress of leaving his wife in 1974 aggravated his condition, and his salt water and sun therapy stopped working. Consequently, he enrolled in what was then an experimental light therapy, which, together with some systemic medication, cleared his skin.
 
Nabokov was deeply disturbed by his psoriasis, which he tried to conceal, except to people close to him.  In 1937, after suffering a bad attack, he wrote to his wife describing his agony, "I continue with the radiation treatments every day, and am pretty much cured. You know, now I can tell you frankly, the indescribable torments I endured before these treatments, drove me to the border of suicide; a border I was not authorised to cross because I had you in my luggage”.
 
Treatment options for psoriasis

Updike and Nabokov’s descriptions provide insights into the devastating impact that psoriasis can have on quality of life. Until recently there have been few drugs specifically targeted at psoriasis, but this is beginning to change, and the outlook for people with psoriasis and psoriatic arthritis looks promising. Here we describe some of the new medications that have recently come to market. But before doing so, we briefly describe current therapies.
  
Mild to moderate psoriasis
 
Mild to moderate psoriasis is managed with topical therapies, which are not very effective. These include coal tar, emollients, salicylic acid, topical retinoids and corticosteroids, and forms of vitamin D, which can sometimes be used together with other medications.
 
People with moderate to severe psoriasis may be treated with traditional systemic medications, phototherapy or biologic agents. In cases of more extensive psoriasis, topical agents may be used in combination with phototherapy, or with traditional systemic or biologic medications. Phototherapy includes narrowband and broadband ultraviolet B (UVB), and psoralen (a furocoumarin) plus UVA (PUVA); both have to be used sparingly because light therapies may increase the risk of skin cancer.
 
Psoriatic arthritis therapies
 
In the video below, Sonya Abraham describes some of the conventional therapies for psoriatic arthritis. Medical treatment regimens for the condition include the use of non-steroidal anti-inflammatory drugs (NSAIDs) and disease-modifying anti-rheumatic drugs (DMARDs). Conventional therapy usually consists of NSAIDs and local corticosteroid injections, with DMARDs being reserved for NSAID-resistant cases. However, because 40% of patients may develop erosive and deforming arthritis, the early use of more aggressive treatment with DMARDs may be warranted.
 
DMARDs include methotrexate, sulfasalazine, cyclosporine, and leflunomide, as well as biologic agents such as monoclonal antibodies targeting tumour necrosis factor (TNF)-alpha, interleukin-12/23 (IL-12/23), IL-17, or IL-23.
 
In September 2013, the US Food and Drug Administration (FDA) approved ustekinumab, an IL-12/23 inhibitor, for the treatment of active psoriatic arthritis in adults who have not responded adequately to previous treatment with non-biologic DMARDs. The drug was already approved in Europe and the US for treatment of moderate to severe psoriasis plaques in adults.
 
 

Monoclonal antibodies
 
A monoclonal antibody is an antibody produced by a single clone of cells, and is therefore a single, pure type of antibody. Monoclonal antibodies can be made in large quantities in a laboratory, are a cornerstone of immunology (the branch of biomedical science that studies all aspects of the immune system), and are increasingly being introduced as therapeutic agents. The anti-tumour necrosis factor (anti-TNF) biologics include the monoclonal antibodies adalimumab, certolizumab, golimumab and infliximab, and the fusion protein etanercept. All have FDA and EU approvals.

 
New drugs
 
Secukinumab
 
Secukinumab is an immunosuppressant that reduces inflammation by blocking a natural protein in the body, interleukin-17A, that can cause inflammation and swelling. Marketed by Novartis as Cosentyx®, it is the first drug of its class (an IL-17A inhibitor) approved to treat psoriatic arthritis, and could help those who suffer the worst effects of the condition. The therapy is self-administered by a monthly injection, and targets the parts of the immune system that make proteins called interleukins, which are believed to be released in faulty amounts. Up to 84% of psoriatic arthritis patients treated with Cosentyx® had no radiographic progression in their joints at two years. Clinical studies found that 80% of patients saw a 75% improvement after using the drug for 12 weeks, 70% saw a 90% improvement by week 16, and 40% found their symptoms disappeared completely.
 
Mark Tomlinson, from Novartis, the drug manufacturer, said: “In those without psoriasis, the immune system is like an orchestra; each section perfectly balanced and working harmoniously together. When a person has psoriasis, it is like one violinist in the orchestra playing out of tune. It dominates the sound and rhythm. IL-17A is like a maverick violinist”.
 
Apremilast

 
Apremilast is a recently licensed oral drug for psoriasis and psoriatic arthritis, which inhibits the enzyme phosphodiesterase-4; this in turn helps regulate pro- and anti-inflammatory cytokines and proteins such as TNF and IL-17. In randomised controlled studies, apremilast has shown improvement in psoriasis skin and arthritis disease-activity assessments.
 
Ixekizumab

 
Another new anti-IL-17 drug, ixekizumab, a monoclonal antibody, has been approved for treating adult patients with moderate to severe plaque psoriasis (covering 10% or more of the body) who are candidates for phototherapy or for medications that are absorbed into the blood stream (systemic therapy). Ixekizumab has been shown to clear symptoms in 80% of people. Research published in the New England Journal of Medicine in 2016 suggests that ixekizumab neutralises the inflammatory effects of an interleukin, a protein in the skin that carries signals to cells.
 
To test the drug's efficacy over time, three studies enrolled 3,736 adult patients at more than 100 study sites across 21 countries. Researchers assessed whether the drug reduced the severity of the symptoms of psoriasis compared with a placebo, and evaluated its safety by monitoring side effects. By 12 weeks, 76 to 82% of people in the study had their condition classified as 'clear' or 'minimal', compared with 3.2% of patients in the placebo group. By 60 weeks, 69 to 78% showed that their improved condition had been maintained. Kenneth Gordon, professor of dermatology at Northwestern University and first author of the study, said: “Based on these findings, we expect that 80% of patients will have an extremely high response rate to ixekizumab, and about 40% will be completely cleared of psoriasis.”
 
Takeaway 

None of these new drugs represent a magic bullet, but they do appear to provide significant relief for a substantial percentage of sufferers of psoriasis and psoriatic arthritis.
 
 
 
  • Obesity is one of the most serious global public health challenges of the 21st century and a major cause of type-2 diabetes (T2DM), a life-threatening illness, which costs billions
  • 60% of adults in the UK are either overweight or obese, 74% in the US
  • Low calorie diets and exercise are difficult to sustain and therefore tend to fail as treatment options 
  • Conventional treatments for T2DM have failed to dent the vast and escalating burden of the condition, so interest is increasing in alternative treatment options
  • Bariatric (stomach reduction) surgery is a therapy for obesity, which has been shown to “cure” T2DM
  • In 2016, 45 international health organizations called for bariatric surgery as a treatment for T2DM
  • Is bariatric surgery the biggest step forward in T2DM treatment in 100 years?
 

Weight loss surgery to treat T2DM


It is five minutes to midnight for healthcare systems struggling in vain to reduce the vast and escalating burden of type-2 diabetes (T2DM). Doing more of the same is no longer an option. Given the lack of alternatives, experts are calling for an increase in bariatric surgery because it has been shown to “cure” T2DM.
 
Bariatric surgery not only reduces weight, it also improves glycemic control by a combination of enforced caloric restriction, enhanced insulin sensitivity, and increased insulin secretion with a consequent reduction in the symptoms of T2DM.
 
In the video below Kenneth D’Cruz, Senior Consultant Gastroenterological Surgeon at Narayana Health, India, describes bariatric surgery, which refers to a range of procedures including gastric bypass, gastric sleeve, gastric band, and gastric balloon. Such procedures are performed to limit the amount of food that an individual can consume, and are mainly used to treat those with a body mass index (BMI) above 40, and in some cases those with a BMI between 30 and 40 who have additional health problems such as T2DM.
 
 
Epidemiology of obesity

Overweight and obesity are principal risk factors of T2DM. In the UK, the number of people classified as obese has doubled over the past 20 years and continues to rise. According to data from the 2014 Health Survey for England, 24% of adults in England are obese and a further 36% are overweight. In 2015, there were 440,288 admissions to England's hospitals for which obesity was the main reason or a secondary factor.
 
Data from the National Child Measurement Programme (NCMP) suggest 10% of children in the UK are obese by the time they start primary school, and 25% are by the time they finish. 6% of people in the UK are living with diabetes, of whom 90% have T2DM. Over the past decade the incidence rate of T2DM has increased by 65%.
 
The situation is similar in the US, where 36% of adults are obese, and 6.3% have extreme obesity. Almost 74% of adults are considered either overweight or obese. Over the past 30 years, childhood obesity has more than doubled, and it has quadrupled in adolescents. The percentage of children who were obese increased from 7% in 1980 to nearly 18% in 2012. 9.3% of people in the US are living with diabetes.
 
The World Health Organization warns that obesity is, “one of the most serious global public health challenges of the 21st century”.
 
Causes of obesity

There are many complex behavioural and societal factors that combine to contribute to the causes of obesity. At its simplest, the body needs a certain amount of energy (calories) from food to keep up basic life functions. When people consume more calories than they burn, their energy balance tips toward weight gain, excess weight, and obesity. In the videos below Mohammed Hankir, Department of Medicine, University of Leipzig, Germany, describes what causes obesity, and the relationship between obesity and T2DM:
 
What are the causes of obesity?
 
What is the relationship between obesity and type-2 diabetes?
 
The cost of diabesity

Obesity costs the UK £47bn every year. The medical care costs alone for obesity in the US are estimated to be more than US$147bn. Diabetes treatment and indirect medical costs run to £10.3bn in the UK and US$176bn in the US, representing significant increases over the past five years. The medical costs for an individual with diabetes are typically 2.5 times higher than for someone without the disease. As prevalence of obesity increases these costs will rapidly rise.
 
T2DM prevention and treatment

NHS England, Public Health England and Diabetes UK’s National Diabetes Prevention Programme is based upon diet and exercise-induced weight loss, which sometimes remedies insulin resistance. For obese people, dietary and lifestyle therapies have limited short-term and almost non-existent long-term success records. According to Professor John Wilding, Head of the Department of Obesity and Endocrinology at the University of Liverpool, UK, the problem with low calorie diets “is that most people will lose weight, but most people will also regain much of that weight that has been lost.” The UK’s National Institute for Health and Care Excellence (NICE) does not support the routine use of low calorie diets.
 
Once an overweight or obese person has T2DM the stakes change. Given the limited success of conventional medical therapies, bariatric surgery has become an increasingly popular treatment in the war against obesity, and latterly also for T2DM. The 2014 UK National Bariatric Surgery Registry reported that there is good evidence from randomised controlled studies that surgery is superior to medical therapy in improving diabetes control and metabolic syndrome. Surgery lowers the number of hypoglycaemic medications needed, with some people no longer needing insulin; it sends many people living with T2DM into remission; and it markedly lowers the incidence of T2DM compared with matched patients who do not have surgery.
 
NICE guidelines for bariatric surgery as a therapy for diabesity

Concerned about the rising prevalence of diabesity (obesity and diabetes) and the limited success of conventional strategies, in 2011, the International Diabetes Federation endorsed bariatric surgery as a T2DM treatment for obese people. The Federation’s endorsement is a validation of research and medical experience showing that surgery to reduce food intake can alter the biochemistry of the entire body. It also marked the beginning of a major new assault on diabetes.

In 2014, NICE introduced guidelines for bariatric surgery as a treatment option for obese adults, and suggested that it would greatly help T2DM. Current NICE guidelines state that bariatric surgery should be offered to anyone who is morbidly obese (a BMI of 40 or over), to those with a BMI over 35 if they have another condition, such as T2DM, and to those with a BMI of at least 30 with a recent diagnosis of diabetes.
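As a rough illustration of how the thresholds summarised above might be applied, the Python sketch below computes BMI (weight in kilograms divided by height in metres squared) and checks the three criteria. It is a simplified paraphrase for illustration, not NICE's own decision algorithm, and the example patient is hypothetical.

```python
# Illustrative sketch (a simplified paraphrase, not NICE's own algorithm) of the
# bariatric surgery eligibility thresholds summarised in the text above.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def eligible_for_bariatric_surgery(bmi_value: float,
                                   has_comorbidity: bool,
                                   recent_t2dm_diagnosis: bool) -> bool:
    if bmi_value >= 40:                              # morbidly obese
        return True
    if bmi_value >= 35 and has_comorbidity:          # another condition, such as T2DM
        return True
    if bmi_value >= 30 and recent_t2dm_diagnosis:    # recent diagnosis of diabetes
        return True
    return False

# Hypothetical example: 1.70 m tall, 110 kg -> BMI ~38; eligible only with a
# qualifying comorbidity such as T2DM.
b = bmi(110, 1.70)
print(round(b, 1), eligible_for_bariatric_surgery(b, has_comorbidity=True, recent_t2dm_diagnosis=False))
```

In practice the full NICE guidance attaches further conditions, so the sketch should be read only as a summary of the thresholds quoted in this article.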
 
In the UK only about 6,500 people each year have bariatric surgery. This is significantly lower than other European countries, which perform on average about 50,000 stomach reduction surgeries each year. Under the NICE guidelines, up to 2m people would be eligible for free bariatric surgery on the NHS, which would cost the taxpayer £12bn.

 
Biggest breakthrough in diabetes care since the introduction of insulin
 
In 2016 a review written by a group of researchers led by David Cummings, an endocrinologist at the University of Washington set out guidelines for bariatric surgery as a treatment option for diabetes. Francesco Rubino, one of the experts behind the guidelines and professor of metabolic and bariatric surgery at King's College London, said: “This is the closest that we have ever been to a cure for diabetes. It is the most powerful treatment to date.” Other doctors who drew up the guidelines said such changes could amount to the most significant breakthrough in diabetes care since the introduction of insulin in the 1920s.
 
The modern Roux-en-Y gastric bypass

The ‘gold standard’ bariatric surgical procedure is the Roux-en-Y Gastric Bypass, which is the most commonly performed bariatric procedure worldwide, named after a 19th century Swiss surgeon César Roux, who first performed the surgery to reroute the small intestine. The modern version of the procedure involves reducing the stomach to a little pouch, to curb eating and appetite, and then connecting that pouch to a lower section of the intestine. By using less of the intestine, fewer nutrients are absorbed, and the patient loses weight.
 
Until recently it has been poorly understood why, after bariatric surgery, a significant proportion of patients with T2DM leave hospital either needing no insulin, or lower doses, before ever losing any weight. Re-plumbing the GI-tract appears to reprogram the body’s hormones and resets its metabolism.

 
Advances in bariatric surgery

Thirty years ago there was little interest in bariatric surgery, which was risky, and not widely practiced. It involved a large, bloody incision, the prising apart of the heavy, fatty abdominal walls with metal arms, which then had to be held in place while the surgeon carried out procedures deep in the gut. Patient recovery times were long, and the risk of complications high.

By the first decade of the 21st century, when obesity became an epidemic in advanced economies, the relationship between bariatric surgery and T2DM was given more attention. The medical device industry developed new surgical tools to facilitate bloodless, minimally invasive procedures for obese people, but researchers were still struggling to understand why bariatric surgery “cured” diabetes.

 
Understanding why bariatric surgery cures diabetes

One of the scientists to discover why bariatric surgery cures T2DM is Blandine Laferrère, an endocrinologist at the New York Obesity Nutrition Research Center at St. Luke’s. The gut hormone ghrelin signals to our brain that we are hungry and should start eating, while receptors in our GI tract signal to our brain that we are full and should stop eating. In obese people this signalling malfunctions, leaving them perpetually hungry. According to Laferrère, “It just happened that the surgeons did this type of surgery for weight loss, and that turned out to have a spectacular effect on the remission of T2DM.”

Laferrère’s further research was influenced by Werner Creutzfeldt, a German doctor who published work on gut hormones that increase the stimulation of insulin secretion, an effect he called the “incretin effect”. According to Laferrère, it is bariatric surgery itself, rather than the ensuing weight loss, that stimulates the incretin effect, which boosts the production of insulin while lowering the symptoms of diabetes. She concluded that the surgery triggered a hormone network that diet-induced weight loss could not.
 
Takeaways

Scientists claim that bariatric surgery is the biggest step forward in diabetes treatment in 100 years, and suggest we are no longer talking about the treatment of obesity, but treatment of diabetes.
 
 
 
  • Tobacco is a legacy recreational drug that causes cancers, and kills over 6m people each year
  • No new food, drink, recreational or over the counter drug with a similar adverse health profile would ever be approved in the modern world
  • Smoking causes 150 extra mutations in every lung cell
  • New research demonstrates that smoking causes cancers in organs not exposed to smoke such as the bladder, kidney and pancreas
  • Smoking triggers cell mutations that can cause cancer years after quitting
  • Anti-smoking campaigns have decreased the prevalence of smoking, but incidence rates have increased because of population growth
  • Identifying all the cancer genes will eventually improve treatments
 
 
Smoking is playing Russian roulette with your life
 
Tobacco is the only legal drug that kills millions when used exactly as intended by manufacturers. New research into the root causes of cancer demonstrates how tobacco smoke mutates DNA, gives rise to at least 17 types of cancer, and, surprisingly, causes cancers in organs not directly exposed to tobacco smoke.
 

Cell mutation and the body’s natural resistance
 
A mutation occurs when a DNA gene is damaged or changed in such a way as to alter the genetic message carried by that gene. The more mutations a cell acquires, the more likely it is to turn cancerous.
 
Decreased prevalence, but increased incidence of smoking

Globally, smoking prevalence - the percentage of the population that smokes regularly - has decreased, but the number of cigarette smokers worldwide has increased due to population growth. Today, over 1bn people worldwide smoke tobacco, which each year causes nearly 6m early deaths, many different cancers, pain, misery and grief; not to mention the huge costs to healthcare systems and the loss of productivity.  If current trends continue tobacco use will cause more than 8m deaths annually by 2030. On average, smokers die 10 years earlier than nonsmokers.
 

Cancer and the body’s natural resistance

Cancer is a condition in which cells in a specific part of the body mutate and reproduce uncontrollably. There are over 200 different types of cancer. Cancerous cells can invade and destroy surrounding healthy tissue and organs. Cancer sometimes begins in one part of the body before spreading to other areas, a process known as metastasis. The body has a capacity to naturally resist cancer through tumor suppressor genes, which restrain inappropriate mutations and stimulate cell death to keep our cells in proper balance. New therapies that boost the body’s own immune system to fight cancer are believed to be a game-changer in cancer treatment.

Cancer and the causes of cancer

Whitfield Growdon, a surgical oncologist at Harvard University Medical School and the Massachusetts General Hospital in Boston, describes cancer and the causes of cancer:
 
What is cancer?



What causes cancer?
 
Epidemiology of smoking

Today, it is widely accepted that tobacco use is the single most important preventable health risk in the developed world, and an important cause of premature death worldwide. The research of the British epidemiologists Richard Doll and Tony Bradford Hill, more than anyone else's, established the link between tobacco use and lung cancer. Following reports of several case-control studies in the early 1950s, Doll and Hill published the findings of a larger study in the British Medical Journal in 1954, which suggested that smoking was "a cause, and an important cause" of lung cancer. This was followed by the publication of further research findings in 1956. Doll and Hill’s later study confirmed their earlier case-control findings: that there is a higher mortality rate among smokers than among non-smokers, and a clear dose-response relationship between the quantity of tobacco used and the death rate from lung cancer. The data also indicated a significant progressive reduction in mortality rates with the length of time following the cessation of smoking.
 
US Surgeon General Report of smoking and lung cancer

The research of Doll and Hill, along with other cohort studies published in the 1950s, formed the basis for the game-changing 1964 report of the US Surgeon General, which concluded that, "Cigarette smoking is causally related to lung cancer in men; the magnitude of the effect of cigarette smoking far outweighs all other factors". This led to groundbreaking research on tobacco use, and investments by governments and nonprofit organizations to reduce tobacco prevalence and cigarette consumption, which in some developed countries has been successful. In 2003, the Framework Convention on Tobacco Control was adopted by the World Health Organization, and has since been ratified by 180 countries.  
 
The best and the worst countries for smoking related lung cancer
 
Between 1980 and 2012 age-standardized smoking prevalence decreased by 42% for women and 25% for men worldwide. Canada, Iceland, Mexico, and Norway have reduced smoking by more than half in both men and women since 1980. The greatest health risks for both men and women are likely to occur in countries where smoking is pervasive and where smokers consume a large quantity of cigarettes. These countries include China, Ireland, Italy, Japan, Kuwait, South Korea, the Philippines, Uruguay, Switzerland, and several countries in Eastern Europe. The number of cigarettes smoked worldwide has grown to more than 6 trillion. In 75 countries, smokers consume an average of more than 20 cigarettes a day.
 
Smoking-related deaths in the UK and US

19% (10m) of adults in the UK and 17% (40m) of adults in the US are current cigarette smokers, figures that have more than halved since the mid-1970s. Results from a 50-year study show that half to two thirds of all lifelong cigarette smokers will eventually be killed by their habit. Death is usually due to lung cancer, chronic obstructive lung disease or coronary heart disease, and many who suffer from these diseases experience years of ill health and subsequent loss of productivity. Every year, around 96,000 people in the UK and 480,000 people in the US die from diseases caused by smoking, equating to roughly 260 and 1,300 smoking-related deaths every day in the UK and US respectively.
 
Costs

In addition to death and sickness, tobacco use imposes a significant economic burden on society. This includes the direct medical costs of treating tobacco-induced illnesses, and indirect costs including lost productivity, fire damage, and environmental harm from cigarette litter and destructive farming practices. Cigarette sales contribute significant tax revenues to national coffers, and the industry employs tens of thousands of people who also pay taxes. Notwithstanding this, the total burden caused by tobacco products outweighs any economic benefit from their manufacture and sale.
 
Direct link between the number of cigarettes smoked and cancers

Scientists from the Wellcome Trust Sanger Institute near Cambridge, UK, the Los Alamos National Laboratory in New Mexico, and others have discovered a direct link between the number of cigarettes smoked and the number of mutations in the tumor DNA, and that smoking also causes cancers in organs not exposed to tobacco smoke.

Research published in the journal Science in 2016 analyzed more than 5,000 cancer tumors from smokers and nonsmokers, and concluded that smoking even a few cigarettes a day erodes the genetic material of most of the cells in your body, putting you at significantly greater risk of cancer. "Before now, we had a large body of epidemiological evidence linking smoking with cancer, but now we can actually observe and quantify the molecular changes in the DNA due to cigarette smoking," says Ludmil Alexandrov, a theoretical biologist at Los Alamos National Laboratory and an author of the study.
 
The discovery means that people who smoke a pack of cigarettes a day for a year develop, on average, 150 extra mutations in every lung cell, nearly 100 more mutations than usual in each cell of the voice box, 39 extra mutations in the pharynx, 23 in the mouth, 18 in the bladder, and 6 in every cell of the liver.
 
Smoking causes cancers not exposed to smoke
 
Scientists were surprised to find that tobacco smoke caused mutations in tissues that are not directly exposed to smoke. While more than 70 of the 7,000 chemicals in tobacco smoke have long been known to raise the risk of at least 17 forms of cancer, the precise molecular mechanisms through which these chemicals mutate DNA and give rise to tumours in different tissues have never been altogether clear, until now. The study showed that some chemicals from tobacco smoke damage DNA directly, while others find their way to different organs and tissues and ramp up the natural speed at which mutations build up, often by disrupting the way cells function. The more mutations a cell acquires, the more likely it is to turn cancerous.
 
Why some smokers get cancer and others do not

“It won’t happen to me ... My grandfather started smoking when he was 11, smoked 20 a day, and lived ’til he was 90.” We have all heard this before. But we now know why some smokers get cancer and others do not: it is because of the way mutations arise. When a person smokes, the chemicals they inhale create mutations at random points in the genome. Many of these changes will be harmless, but others will not be so benign. The more smoke a person is exposed to, the greater the chance that the accumulating mutations will hit specific spots in the DNA that turn cells cancerous. Even decades after people stop smoking, former smokers remain at a long-term increased risk of developing cancers. “You can really think of it as playing Russian roulette,” says Alexandrov.
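A toy calculation helps make the "Russian roulette" analogy concrete. The Python sketch below assumes that mutations land uniformly at random across roughly 3 billion genomic positions and that some 10,000 of those positions could, if hit, push a cell towards cancer; both numbers are illustrative assumptions rather than figures from the study, and the only quoted input is the ~150 extra mutations per lung cell per pack-year.

```python
# Toy model (illustrative only, not the study's method): if each smoking-induced
# mutation lands at a random genomic position, the chance that at least one hits
# a cancer-relevant site grows with the total number of mutations accumulated.

GENOME_POSITIONS = 3_000_000_000   # rough size of the human genome (assumption)
DRIVER_POSITIONS = 10_000          # assumed number of positions that could turn a cell cancerous

def p_at_least_one_hit(n_mutations: int) -> float:
    """Probability that at least one of n random mutations lands on a driver position."""
    p_miss_once = 1 - DRIVER_POSITIONS / GENOME_POSITIONS
    return 1 - p_miss_once ** n_mutations

for pack_years in (1, 10, 30):
    n = 150 * pack_years           # ~150 extra mutations per lung cell per pack-year (quoted above)
    print(f"{pack_years:>2} pack-years: {p_at_least_one_hit(n):.3%} chance per cell")

# The per-cell probabilities are tiny, but multiplied across billions of lung cells
# and decades of exposure they translate into a substantial, and highly variable,
# lifetime risk - which is why outcomes differ so much between individual smokers.
```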
 
Takeaways

Until now, it has not been fully understood how smoking increases the risk of developing cancer in parts of the body that do not come into direct contact with smoke.
 
Sir Mark Walport, director of the Wellcome Trust, says that the findings from the research described above: “will feed into knowledge, methods and practice in patient care.” Dr Peter Campbell, from the Wellcome Trust Sanger Institute says: “The knowledge we extract over the next few years will have major implications for treatment. By identifying all the cancer genes we will be able to develop new drugs that target the specific mutated genes, and work out which patients will benefit from these novel treatments.”
 
 
 

The Mexican Connection
A Special Report 

 
  • People are eating themselves to death and our healthcare systems and governments are failing to stop it
  • Obesity and type-2 diabetes (diabesity) kills thousands unnecessarily, and threatens the stability of healthcare systems around the world
  • In the UK there is mounting frustration with the diabetes establishment’s failure to make inroads into the prevention and management of diabesity
  • Mexico is re-engineering the way primary care delivers its services in order to prevent and reduce the burden of diabesity
  • There are lessons from Mexico for healthcare systems challenged by the diabesity epidemic
 

Breaking the cycle of ineffective diabesity services
 
People are eating themselves to death, and our healthcare systems are failing to stop it. Nowhere is this more evident than in Mexico, where 70% of the population is overweight and 33% obese; both are risk factors for type-2 diabetes (T2DM), which kills 70,000 Mexicans each year.
 
The situation is not that different in the UK, which has the highest levels of obesity in Western Europe: 64% of adults in the UK are either overweight or obese, and the incidence rates of diabetes have more than trebled over the past 30 years. Each year, in the UK diabetes kills 22,000 people unnecessarily, and leads to 7,000 avoidable lower limb amputations.
 
The two countries differ, however, in their respective responses to the epidemic of obesity and diabetes (diabesity), which is the subject of this Commentary. While the UK’s diabetes establishment appears to be locked into a cycle of ineffectiveness, the Fundación Carlos Slim (FCS) is re-engineering the way Mexico’s primary healthcare system delivers its services in order to prevent and reduce the vast and escalating burden of diabesity. The FCS’s endeavours hold important lessons for the UK, and indeed for other countries battling a similar epidemic.
Diabesity a global challenge
Diabesity is no longer a disease of rich countries; it is increasing everywhere. An estimated 422m adults were living with diabetes in 2014, compared to 108m in 1980. The global prevalence (age-standardized) of diabetes has nearly doubled since 1980, rising from 4.7% to 8.5% in the adult population. This reflects an increase in associated risk factors such as being overweight or obese. Uncontrolled diabesity has devastating consequences for health and wellbeing, and it also impacts harshly on the finances of individuals and their families, and the economies of nations.


Mounting frustration with the UK’s diabetes establishment

Although there is consensus about what needs to be done to prevent obesity and T2DM and to enhance their management, and although each year NHS England spends £10.3bn on diabetes care and £4bn to treat obesity, the prevalence rates of both conditions continue to rise, and the UK’s diabetes establishment seems unable to do anything about it.
 
This ineffectiveness has caused mounting frustration with the diabetes establishment on the part of the UK government’s National Audit Office (NAO) and the Public Accounts Committee (PAC). Numerous official inquiries into adult diabetes services have found no evidence that T2DM prevention and care are effectively managed; this failure leads to higher costs for the NHS, as well as inadequate support for at-risk people and those with the condition.
 
Damning official inquiries into adult diabetes services
A 2015 NAO report into adult diabetes services found, “that performance in delivering key care processes and achieving treatment standards [recommended by the National Institute for Health and Care Excellence (NICE)], which help to minimise the risk of diabetes patients developing complications in the future, is no longer improving . . . . There are significant variations across England in delivering key care processes, achieving treatment standards and improving outcomes for diabetes patients, (and)  . . . There are still 22,000 people estimated to be dying each year from diabetes-related causes that could potentially be avoided”. 
The 9 basic processes for diabetes care
The nine NICE recommended basic processes of diabetes care are: (i) blood glucose level measurement (HbA1c), (ii) blood pressure measurement, (iii) cholesterol level measurement, (iv) retinal screening, (v) foot and leg checks, (vi) kidney function testing (urine),  (vii) kidney function testing (blood), (viii) weight check, and (ix) smoking status check.
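To make the audit logic behind these checks concrete, here is a minimal sketch, under assumed names and structure (not drawn from NICE or NHS systems), of how a clinic might track which of the nine annual care processes a patient has received.

```python
# A minimal, hypothetical sketch of how a clinic audit system might track which
# of the nine NICE-recommended annual care processes a patient has received.
# The names and structure are illustrative assumptions, not NICE's or the NHS's.

NICE_CARE_PROCESSES = [
    "HbA1c (blood glucose) measurement",
    "Blood pressure measurement",
    "Cholesterol level measurement",
    "Retinal screening",
    "Foot and leg check",
    "Kidney function test (urine)",
    "Kidney function test (blood)",
    "Weight check",
    "Smoking status check",
]

def outstanding_processes(completed: set[str]) -> list[str]:
    """Return the annual care processes a patient has not yet received."""
    return [p for p in NICE_CARE_PROCESSES if p not in completed]

# Example: a patient who has so far only had glucose and blood pressure checks.
done = {"HbA1c (blood glucose) measurement", "Blood pressure measurement"}
print(f"{len(outstanding_processes(done))} of 9 annual checks still outstanding")
```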
No strong national leadership and depressingly poor progress
When the Public Accounts Committee (PAC) reported on adult diabetes services in 2012, it found that "progress in delivering the (NICE) recommended standards of care and in achieving treatment targets has been depressingly poor. There is no strong national leadership, no effective accountability arrangements for commissioners (local healthcare providers), and no appropriate performance incentives for providers." Four years later, a 2016 PAC inquiry into adult diabetes services reported that nothing of significance had changed. The Committee was concerned “that performance in delivering key care processes and achieving treatment standards is no longer improving”, and it challenged “the Department of Health, the NHS and Public Health England on their lack of progress in improving patient care and support”.
 
The UK’s cycle of ineffective diabesity services
The NAO and the PAC inquiries appear to have identified a cycle of ineffectiveness among the UK’s diabetes establishment, which manifests itself in a familiar scenario. Here is a stereotypical picture.
 
Each year, after the publication of the latest prevalence data for obesity and diabetes, Diabetes UK, a leading charity, “calls on the government to do more”, and the National Clinical Director for Obesity and Diabetes at NHS England makes a defensive statement, usually emphasising the positive aspects of diabetes services. NHS England continues to spend £14.3bn each year on the treatment of diabesity. There continues to be little improvement in the 20,000-plus unnecessary annual diabetes-related deaths and 7,000 avoidable amputations. Diabesity services continue to be inflexible and process-driven rather than outcomes-driven. Nothing of substance changes, prevalence rates and eye-watering costs continue to rise, and no one is accountable.
 
This cycle of ineffectiveness reflects a dearth of national leadership among the diabetes establishment.
 
The Fundación Carlos Slim (FCS) appears to have successfully broken a similar cycle of ineffectiveness in the prevention and treatment of diabesity in Mexico. The Fundación used the weaknesses in Mexico’s primary healthcare system as an opportunity to re-engineer the prevention and treatment of diabesity with an innovative program called Casalud. The name is derived from two Spanish words, “casa” (house) and “salud” (health): ‘Homehealth’.
 
In 2008, when the FCS launched the Casalud program, the primary care services of the UK and Mexico were similar in their inflexibility and in their emphasis on treatment processes and service delivery rather than value-based healthcare. This emphasis results in weak primary care systems, which contribute to the increased prevalence of diabesity.
 
We will draw lessons from the Casalud program, but before doing so let us consider the grounds for a comparison between the healthcare systems of the UK and Mexico.
 


UK and Mexico compared

In both countries the prevalence of obesity and T2DM is high and increasing, and both governments’ healthcare systems are struggling to cope effectively with the vast and growing burden of diabesity. Mexico’s Seguro Popular, which is roughly equivalent to NHS England, serves about 57m people, of whom some 34m (60%) are Mexico’s poorest non-salaried workers, employed in the informal sector. Mexico’s population is younger than the UK’s: the median age of Mexico’s 129m citizens is 29 years, whereas in the UK, which has a population of 65m, the median age is 40 years.
 
Both the UK and Mexico struggle with structural challenges associated with the supply and competence levels of health professionals. These manifest themselves in significant local variations in the effectiveness of diabesity prevention and treatment, and in lengthy waiting times for GP consultations.
 
Annual foot checks in the UK and Mexico
In England, for instance, coverage of the standard recommended annual foot checks for people with diabetes varies as much as fourfold depending on where you live. Each year 415,000 people with T2DM (13.3%) do not receive foot checks, which increases their risk of amputation and fuels the 7,000 avoidable lower limb amputations carried out each year. Similarly, in Mexico 60% of people with diabetes fail to have their feet examined during primary care consultations, and between 86,000 and 134,000 diabetes-related amputations occur each year.
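As a rough sanity check on the English figures, and assuming the 415,000 missed checks and the 13.3% rate refer to the same population, the implied number of people with T2DM and the implied coverage rate can be backed out as follows.

```python
# Back-of-envelope check of the English foot-check figures quoted above,
# assuming 415,000 missed checks and the 13.3% rate share the same denominator.
missed_checks = 415_000
missed_rate = 0.133

implied_t2dm_population = missed_checks / missed_rate   # roughly 3.1m people with T2DM
implied_coverage = 1 - missed_rate                      # roughly 86.7% receiving checks

print(f"Implied T2DM population in England: {implied_t2dm_population / 1e6:.1f}m")
print(f"Implied share receiving annual foot checks: {implied_coverage:.0%}")
```

The implied coverage of roughly 87% is broadly consistent with the 85% figure NHS England quotes below.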
 
Responding to the recent English findings, Professor Jonathan Valabhji, the National Clinical Director for Obesity and Diabetes at NHS England, said: “It is very important as many people as possible receive their foot checks at the right time – currently each year 85% of people with diabetes receive these foot checks.”
 

Leadership to break the cycle of ineffective healthcare services
In contrast to the UK’s diabetes establishment, the Casalud program provides strong, well-coordinated national leadership, and effective accountability and performance incentives for local healthcare providers. It does not, however, deliver direct healthcare services; these are provided by the state. Instead, Casalud concentrates on fostering the implementation and use of innovative technology, which it has designed to enhance patient-centred primary care, extend healthcare into communities and homes, encourage self-management, engage in prevention programs, and enhance the competence and capacity of healthcare professionals within Seguro Popular.
 
For the Casalud program to stand a chance of being supported by the Mexican government, and implemented nationally, the FCS understood that it was essential to collect convincing performance data in its pilot program. From its inception therefore, the Casalud program developed and agreed with the relevant healthcare agencies a suite of performance measures, data collection protocols and reporting systems. This helped the Fundación to secure the backing of key national and regional healthcare agencies.
 
The FCS chose a social franchising model for the Casalud program, which uses commercial best practice to achieve socially beneficial ends rather than profit. This makes the program significantly different from the endeavours of some UK public and non-profit bureaucracies that provide diabesity services.
Some common aspects of bureaucracies
Here we briefly describe some common aspects of bureaucracies, which suggest that, over time, bureaucratic organizations may become ineffective diabesity service providers. Bureaucracies are machine-like organizations characterised by hierarchical authority, a detailed division of labour, and a set of rules and standard procedures that staff are obliged to follow. Rules provide a means for achieving organisational goals, but following the rules sometimes displaces the actual objective of the organisation, and organisational objectives become secondary. This is encouraged by the fact that people in bureaucracies tend to be judged on their observance of rules rather than on results. For example, in an organisation committed, say, to diabetes services, performance may be judged on whether expenditure has been incurred according to rules and regulations. Expenditure thus becomes the criterion of performance measurement, rather than the results achieved through that expenditure. Bureaucracies almost completely avoid public discussion of their techniques, although there may be some discussion of their policies. This secrecy is believed to be necessary to prevent “valuable information” from leaking out and going to competitors. “Trained incapacity” is a term sometimes applied to bureaucracies to describe training and skills that were successful in the past but are unsuccessful under present, changed conditions. Such inflexibility, in an evolving environment like healthcare, results in ineffectiveness.
mHealth platform embedded with bespoke tools
The Casalud program avoided the bureaucratic traps that result in ineffectiveness by developing a flexible mHealth platform (the use of mobile phones and other wireless technology in medical care) with an embedded suite of proprietary software, which connects patients to health providers, nudges people to self-manage their own health, and makes them integral members of local care teams. The platform is used for mobile screening, providing patients with their own individual healthcare dashboards, online healthcare education, supply chain monitoring, standardizing electronic patient records, and big data strategies. It also acts as an entry point for patients, and supports health professionals in identifying at-risk people, making early diagnoses, quickly beginning diabesity management, and structuring follow-up with patients over time.
 


The Casalud program’s successful pilot

In 2009, the FCS began a 3-year pilot of its Casalud program in 7 Mexican states, which resulted in improved patient knowledge about diabesity, enhanced self-management among people with the condition, increased clinician knowledge of diabesity prevention and management, and improved clinical decision-making.
 
The FCS used performance data from its pilot to secure a partnership with the Mexican Ministry of Health to extend the Casalud program to 120 primary care clinics serving 1.3m people across 20 Mexican states - 4 to 10 clinics in each state. The performance data also helped to get the Casalud program adopted as an integral component of the National Strategy for the Prevention and Control of Pre-obesity, Obesity and Diabetes. So, within three years the Casalud program went from a relatively small charity-backed start-up to a significant component of a nationally supported healthcare system.
 
It is reasonable to assume that this was partly due to the leadership provided by the FCS, and partly due to setting, collecting and reporting appropriate performance indicators. The FCS acted similarly to a lead institution in a commercial endeavour, and successfully recruited key contributing partners who were prepared to share the costs of the program’s national rollout. The FCS covers the cost of all the software development, and the training of healthcare professionals for the Casalud program. All the software is owned by the FCS, and licensed free-of-charge to the Mexican government. The federal government covers the cost of all computer hardware used in participating clinics, and local state governments cover the cost of Casalud’s operations, which include such things as laboratory tests and medications.
 


The 5 components of the Casalud program

To better understand the Casalud program and its contribution to enhanced diabesity services we review its five components: (i) proactive prevention and detection of diabesity, (ii) evidence-based management of diabesity, (iii) supply chain improvements, (iv) capacity-building of healthcare professionals, and (v) patient engagement and empowerment. Each component has an on-going monitoring system associated with it, which informs the FCS on the status of the program’s implementation.
 
1. Proactive prevention and detection of diabesity
Previous attempts in Mexico at community-based screening for diabesity have failed. The FCS nevertheless insisted that a national screening strategy was important for reducing the burden of diabesity, but understood that its case would need to be supported by appropriate performance data, which would require systematic collection and reporting. To help achieve this, the FCS developed two online risk assessment tools, which capture, assess and report data on people’s risk factors for diabesity.
 
One of these tools is used in clinics; the other, which is portable, is used in homes and communities. Both screen and categorise people as (i) healthy, (ii) at risk of diabesity, or (iii) already diagnosed as obese or with T2DM. Screening allows local healthcare professionals to suggest personalised lifestyle changes to individuals, either to help them reduce their risk of diabesity or to improve their management of the condition. Each participating clinic has a screening goal. Screening data are collated and reported weekly on a public system, which incentivizes the clinics in their screening endeavours.
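The Commentary does not describe the tools’ actual scoring rules, but a minimal sketch of the three-way categorisation might look like the following; the fields and thresholds (standard clinical cut-offs for overweight, obesity and raised fasting glucose) are illustrative assumptions, not Casalud’s.

```python
# Hypothetical sketch of the three-way screening categorisation described above.
# Fields and thresholds are illustrative assumptions, not Casalud's actual rules.
from dataclasses import dataclass

@dataclass
class Screening:
    bmi: float                     # body mass index, kg/m^2
    fasting_glucose_mg_dl: float   # fasting plasma glucose, mg/dL
    diagnosed_t2dm: bool = False   # prior clinical diagnosis

def categorise(s: Screening) -> str:
    if s.diagnosed_t2dm or s.fasting_glucose_mg_dl >= 126 or s.bmi >= 30:
        return "diagnosed (obese or T2DM)"
    if s.fasting_glucose_mg_dl >= 100 or s.bmi >= 25:
        return "at risk of diabesity"
    return "healthy"

print(categorise(Screening(bmi=27.5, fasting_glucose_mg_dl=105)))  # at risk of diabesity
```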
 
Having a portable device means that populations that previously did not have access to healthcare are included in the screening. While this increased the number of reported people with diabesity, over time it lowered healthcare costs because early detection reduced the use of urgent care facilities. This proactive component of the Casalud program, and the performance data it generates, won the support of federal healthcare officials, who saw the advantages of using technology to integrate communities, families and patients into a continuum of care. The tools also extended care to people and communities that previously had little access to healthcare, and encouraged patients to use technology to manage their own health, which health authorities appreciated.
 
2. Evidence-based diabesity management
The second component of the Casalud program is an evidence-based diabesity management system, supported by further software developed by the FCS. This includes agreed international best-practice protocols for diabesity prevention and management, a digital portfolio for health professionals, and electronic monitoring of patients to improve the accuracy and reliability of performance measurements and patient data. Such data are used to improve the quality of clinical decision-making.

Examples of the data collected and reported are the percentages of people with T2DM and their corresponding laboratory test results. Casalud’s study found that out of 961,733 patients with T2DM, only 20% had an HbA1c (blood glucose) measurement. Further, only 40.7% of patients with an HbA1c measurement had their HbA1c levels under control (below 7%).  All data are made available at the national, state and clinic levels, and are thereby expected to empower healthcare providers to base their health policy decisions on the areas of most need.
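To see what these percentages imply in absolute terms, the figures above can simply be restated (this is arithmetic on the reported numbers, not additional data):

```python
# Restating the reported HbA1c figures in absolute terms.
patients_with_t2dm = 961_733
share_tested = 0.20          # 20% had an HbA1c measurement
share_controlled = 0.407     # 40.7% of those tested were below 7%

tested = patients_with_t2dm * share_tested
controlled = tested * share_controlled

print(f"Patients with an HbA1c measurement: ~{tested:,.0f}")      # ~192,347
print(f"Patients with HbA1c under control:  ~{controlled:,.0f}")  # ~78,285
print(f"Share of all T2DM patients known to be controlled: {controlled / patients_with_t2dm:.1%}")  # ~8.1%
```

In other words, only around 8% of the T2DM patients in the study were known to have their blood glucose under control.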
 
3. Supply chain improvement
Mexico, like other emerging countries, suffers from an inconsistent supply of medicines and laboratory tests, which is a significant obstacle to optimal disease prevention and management. Drug supply decisions in Mexico are centralized and made at state or federal level; this differs from the UK and other developed countries.
 
This component of the Casalud program uses a proprietary online information system that standardizes metrics for stock management at the clinic level to improve the supply of medicines and laboratory tests. The software is made available on mobile phones to make it easy for health professionals to ensure that stock levels are adequate for clinics to provide a quality service. In addition, Casalud uses these data to raise awareness with federal and state healthcare officials of inefficiencies in supply chains, which could fuel complications and increase healthcare costs. Prior to Casalud there was no accurate and systematic way to assess and report on the supply of medicines and laboratory tests.
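As a sketch of the kind of standardized stock metric such a system might report, the following uses an assumed weeks-of-cover rule and invented field names; it is illustrative only, not Casalud’s actual design.

```python
# Hypothetical sketch of a standardized clinic stock metric and shortage flag.
# Fields, the weeks-of-cover rule and the threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class StockRecord:
    clinic_id: str
    item: str                 # e.g. "metformin 850mg" or "HbA1c test kit"
    units_on_hand: int
    avg_weekly_demand: float

    def weeks_of_cover(self) -> float:
        if self.avg_weekly_demand == 0:
            return float("inf")
        return self.units_on_hand / self.avg_weekly_demand

def flag_shortages(records: list[StockRecord], min_weeks: float = 4.0) -> list[StockRecord]:
    """Return the stock lines a clinic should escalate because cover is too thin."""
    return [r for r in records if r.weeks_of_cover() < min_weeks]

records = [StockRecord("clinic-07", "metformin 850mg", units_on_hand=120, avg_weekly_demand=60)]
print([r.item for r in flag_shortages(records)])  # ['metformin 850mg'] - only 2 weeks of cover
```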
 
4. Capacity building for healthcare professionals
Casalud’s fourth component is an interactive platform to develop the capacity of healthcare professionals through online education, which leads to diplomas conferred by national and foreign universities. The FCS partnered with Harvard University’s Joslin Diabetes Center and Mexico’s National Institute of Medical Sciences and Nutrition to develop courses that certify competence in key areas of diabesity prevention, diagnosis and management. One course is designed to update doctors’ knowledge of diabesity; the other is a practical course, developed by faculty of the Joslin Diabetes Center, in which health professionals solve real-life cases to test their knowledge in practical settings.
 
Certificates act as non-monetary incentives for health professionals and promote competition between clinics and health professionals. This helps to increase participation in the program, improve the quality of care, encourage openness and transparency, and increase collaboration between clinics.
 
Software developed by the FCS assists local clinics to capture data on the characteristics of the participating healthcare professionals, their baseline knowledge, and improvements after each course. These data are aggregated to choose a clinic of excellence for each state, and a national clinic of excellence; both of which are publicly recognised awards, and help with Casalud’s national rollout strategy.
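A minimal sketch of this kind of aggregation is shown below, with invented scores and a simple average-improvement rule standing in for whatever criteria the FCS actually applies.

```python
# Hypothetical aggregation of course results to pick a "clinic of excellence"
# per state. Scores and the average-improvement rule are illustrative assumptions.
from collections import defaultdict
from statistics import mean

# (state, clinic, baseline score, post-course score) - invented example data
results = [
    ("Puebla", "clinic-01", 55, 78),
    ("Puebla", "clinic-02", 60, 72),
    ("Jalisco", "clinic-09", 48, 80),
]

gains_by_clinic = defaultdict(list)
for state, clinic, baseline, post in results:
    gains_by_clinic[(state, clinic)].append(post - baseline)

best_per_state = {}
for (state, clinic), gains in gains_by_clinic.items():
    avg_gain = mean(gains)
    if state not in best_per_state or avg_gain > best_per_state[state][1]:
        best_per_state[state] = (clinic, avg_gain)

print(best_per_state)  # {'Puebla': ('clinic-01', 23), 'Jalisco': ('clinic-09', 32)}
```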
 
Further, performance data are contributed to the National Strategy for Improving Skills and Capacity of Healthcare Personnel, which obliges all Mexican healthcare institutions to engage in formal online training that is personalized, linked to a continuing education program, validated by academic institutions, and independently monitored. Casalud’s capacity-building component fulfils all of these criteria.
 
5. Patient engagement and empowerment
With the help of the Joslin Diabetes Center, the Mayo Clinic, and Mexico’s National Nutrition Institute, this component provides two mobile applications, which assess patients’ engagement, knowledge of diabesity, and confidence and skills, in order to help them understand their health, begin to self-monitor their condition, interpret their own results, and implement beneficial lifestyle changes. A specific app for people with T2DM allows them to schedule medicine and appointment reminders, input glucose and weight measurements, and receive immediate personalized feedback and educational messages from health professionals.
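The app’s actual feedback rules are not described; purely as an illustration, a simple reading-to-message rule using commonly cited self-monitoring ranges might look like this (the thresholds and messages are assumptions, not the Casalud app’s logic).

```python
# Hypothetical sketch of turning a self-monitored glucose reading into a simple
# educational message. Thresholds reflect commonly cited self-monitoring ranges,
# not the Casalud app's actual rules; they are illustrative assumptions only.
def feedback_for_glucose(reading_mg_dl: float, fasting: bool = True) -> str:
    if reading_mg_dl < 70:
        return "Low reading: treat for hypoglycaemia and contact your clinic if it persists."
    if fasting and reading_mg_dl <= 130:
        return "Within the usual fasting target range - keep up your current routine."
    if not fasting and reading_mg_dl <= 180:
        return "Within the usual post-meal target range - keep up your current routine."
    return "Above target: note recent meals and medication, and raise it at your next check-up."

print(feedback_for_glucose(145, fasting=True))  # "Above target: ..."
```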

However, the FCS changed its approach following evidence from the program’s pilot, which suggested that due to the characteristics of the patient population – elderly, rural, and with limited access to and familiarity with technology – mobile technology alone would not lead to a high percentage of patient engagement. So, Casalud implemented a suite of in-person interactions and activities, which are thought to be more appropriate for the specific patient population.

Such a change may not be necessary in the UK and other developed countries. In the UK, for instance, smartphone ownership is growing in all age groups, and fastest among 55-64 year olds, among whom it jumped from 39% in 2014 to 50% in 2015. While those aged over 55 are more likely to own a laptop, the gap is closing. Among younger age groups, 90% of those aged 16-24 now own a smartphone.
 


Takeaways

Although the Casalud program has encountered challenges associated with Mexico’s patchy technological infrastructure, the entrenched attitudes of some health professionals, and the fragmentation and lack of uniformity of its primary healthcare system, the program has been successful, not least because of its flexibility and speed in adjusting to prevailing conditions. In 2015, a Brookings Institution research paper concluded that “Casalud has made significant strides in transforming care delivery in Mexico”.

Casalud’s development and implementation continue. It is an innovative program, which employs appropriate technology and evidence-based knowledge to re-engineer Mexico’s public sector primary healthcare system, encouraging patient self-management in order to reduce the country’s vast and increasing diabesity burden.
 
Casalud provided leadership and seed money to secure financial support from, and create consensus between, the federal and state governments, and to obtain local support from clinics, healthcare professionals and patients. The program is ongoing and warrants consideration from the UK’s diabetes establishment, and those of other countries wrestling with the burden of diabesity.
 