Ernest Hemingway, the novelist, used to say that he "drank to make other people more interesting".

Today, binge drinking is a silent epidemic.

Often unrecognized, it is a serious issue among young British and American women.

In the US, nearly 14 million women binge drink about three times a month, and each year nearly 1,400 American college students die from binge drinking.

Professor Dame Sally Davies, the UK's Chief Medical Officer, has highlighted the rising tide of UK deaths from alcohol-related liver disease: "We really have young people who are binge drinking and it is damaging their livers." Liver disease costs the UK NHS £1 billion a year.

A hidden problem

In addition to causing liver disease, binge drinking increases the risk of breast cancer, heart disease, sexually transmitted diseases and unintended pregnancy.

Researchers at University College London recently reported that alcohol consumption could be much higher than previously thought, with more than three-quarters of people in England drinking in excess of the recommended daily limit.

Since the beginning of 2010, more than 2,400 more girls than boys have been seen by hospitals because of alcohol, suggesting that alcohol abuse has a much greater immediate effect on women than on men.

Nor is binge drinking confined to the young women of "ladette" culture. UK Department of Health figures show that in 2010 there were 110,128 alcohol-related hospital admissions among women aged between 35 and 54. A shift towards drinking at home has contributed to the rise in women's drinking.

In February 2013 the debate over a minimum price for alcohol was reopened by a report commissioned by the Alcohol Health Alliance, a coalition of 70 health organisations, and published by the University of Stirling. It recommends a minimum price of 50p per unit of alcohol to end the "avoidable epidemic" of binge-drinking deaths.

Dr Paul Southern, a consultant hepatologist at Bradford Royal Infirmary in the UK, said that people in their 20s are dying from liver disease caused by binge drinking and that children as young as 12 are falling prey to the "pocket-money alcohol business".

According to Dr Southern there is "only one single effective deterrent (for binge drinking) and that is taxation." While recognising the problem of binge drinking, the UK government has not yet delivered a solution.

A cultural change

While supporting the call to increase the price of a unit of alcohol sold in supermarkets, Professor Dame Sally also suggests that "we need a cultural change".

Mobile apps are now available for predicting alcohol abuse, using research-based questionnaires to help patients determine whether they are at risk, while other, more light-hearted apps allow users to see the effect of alcohol abuse on their future appearance.
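As a purely illustrative sketch of how such questionnaire-based apps typically work, the example below uses the widely published AUDIT-C screening questions and their commonly cited risk thresholds; the function names and example values are hypothetical and do not reproduce the logic of any particular app.

```python
# Minimal sketch of a questionnaire-based alcohol risk check, modelled on the
# published AUDIT-C screening tool: three questions, each scored 0-4, total 0-12.
# Thresholds follow common AUDIT-C guidance (>= 3 for women, >= 4 for men).

AUDIT_C_QUESTIONS = [
    "How often do you have a drink containing alcohol? (0 = never ... 4 = 4+ times a week)",
    "How many drinks do you have on a typical day when drinking? (0 = 1-2 ... 4 = 10+)",
    "How often do you have six or more drinks on one occasion? (0 = never ... 4 = daily)",
]

def audit_c_score(answers: list[int]) -> int:
    """Sum the three AUDIT-C answers, each expected to be an integer from 0 to 4."""
    if len(answers) != 3 or any(a not in range(5) for a in answers):
        raise ValueError("AUDIT-C expects exactly three answers scored 0-4")
    return sum(answers)

def at_risk(answers: list[int], sex: str) -> bool:
    """Return True if the total score meets the commonly used hazardous-drinking threshold."""
    threshold = 3 if sex.lower() == "female" else 4
    return audit_c_score(answers) >= threshold

if __name__ == "__main__":
    # Example: drinking 2-4 times a month (2), 3-4 drinks per occasion (1) and
    # bingeing less than monthly (1) gives a score of 4, above the female threshold of 3.
    print(at_risk([2, 1, 1], sex="female"))  # True
```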

These are innovative ideas to make people think twice, but with little research evidence available, several doctors have come out against such aids, saying they would not recommend them without empirical evidence of their effectiveness.

In such settings, is scientific medicine holding back opportunities for mHealth?


Can patient aids, comprising online video libraries of trusted health information, enhance shared decision-making between patients and their doctors, lower costs and increase the quality of healthcare? American payors think they can.


Both ends of the stethoscope

We know very little about the hidden dynamics of doctor-patient relationships. We do know, however, that doctors have a moral and legal obligation to inform patients about their medical conditions and explain treatment options, but only patients have the right to decide on their treatment. So, how do patients decide between competing treatment options?

For example, how do women diagnosed with breast cancer choose between a mastectomy and a lumpectomy? How do mothers choose between Gardasil and Cervarix for their daughters?

Peter Ubel, a professor at Duke University and author of Critical Decisions, provides some insights into the elusive world of private medical consultations between doctors and their patients. According to Ubel, medical consultations are fraught with unresolved communication issues because doctors' "moral obligations to inform patients outstrip their abilities to communicate".

In the US there is mounting concern that doctors are aggressively pushing for more costly invasive procedures, even though they may not be any better or safer than slower and simpler ones. Ubel describes how hidden dynamics in doctor-patient relationships and the dearth of premium, trusted and independent patient aids prevent patients from making optimal medical decisions. This, he says, increases costs and lowers the quality of care.


Spurious online health information

Doctor-patient relationships are further complicated by the ease with which patients can access spurious and misleading online health information. It is true that they also have access to accredited online medical information, such as that provided by WebMD. The difficulty, however, is for patients and their carers to distinguish legitimate online medical information from spurious material.

This is confirmed by research published in 2010 by the US National Institutes of Health, which reported that over 75% of all people who search online for health information encounter difficulties in understanding what they find and, as a consequence, become frustrated and confused.

In December 2012, such difficulties resulted in a UK mother, Sally Roberts, denying her seven-year-old son Neon radiotherapy to treat his brain tumour. Information she found on the internet convinced her that radiotherapy would do her son more harm than good. The UK hospital treating Neon disagreed and took legal action, and a High Court judge ruled that Neon should receive radiotherapy.

 

The increasing importance of video in healthcare

US payors are becoming increasingly confident that online video libraries of premium, trusted medical information, which help patients reach more informed decisions about their health, are important in shifting the emphasis away from clinicians and towards patients and their needs, wishes and preferences.

Large US hospital groups are producing trusted and reliable consumer aids that they are using to create, develop and manage specific online patient communities. One example is the Cleveland Clinic, which employs online videos to share health tips and clinical research with patients.

Why video?

One reason video has become so popular among patients is that it delivers a human touch to health information that written words on a screen do not. So it is not surprising that video is the preferred format for patients to receive health information, which they increasingly access on smartphones.


American initiatives

The main push for patient aids to inform shared decision-making comes from the US government and health payors, driven by their efforts to control escalating healthcare costs while improving the quality of care.

For the past six years the state of Massachusetts has produced videos to help terminally ill patients and their carers better understand end-of-life decisions. Washington State, among others, provides patients with video aids to support shared decision-making. And three patient-aid projects sponsored by the Center for Medicare & Medicaid Innovation are expected to yield savings of more than US$130 million within three years, while enhancing the quality of healthcare.

According to James Weinstein, CEO and President of the Dartmouth-Hitchcock Health System, which comprises 16 medical centres that treat millions of patients, "Patients want to have good information about their health care decisions, which is independent of any bias."

Jack Daniel, Executive Vice-President of Med-Expert International, a California-based company that produces patient aids for people on Medicare and Medicaid, said: "When a person calls us we can say here's what the world's best medical minds are saying about your condition."


Takeaways

In 2010 business leaders participating in the prestigious Salzburg Global Seminar concluded: "Informing and involving patients in decisions about their medical care is the greatest untapped resource in healthcare." Shared decision-making, they said, "is ethically right and practical, since it lowers costs and reduces unwarranted practice variations".

Over the past 30 years patients have become better educated and better informed about their healthcare options. Everything suggests that this is just the beginning. Over the next decade, healthcare systems will be increasingly challenged by aging populations, the escalating incidence of chronic diseases and fiscal constraints, and consumers and communications will assume a more pivotal role. This will accelerate the need for premium, trusted online health information that patients can access at speed, anytime, anywhere and anyhow.

Until patient aids become commonplace, we will not change the way we communicate inside hospitals and doctors' surgeries. Health costs will continue to rise, the provision of healthcare will continue to be stretched and the quality of care will continue to be challenged.


“If I’d known I was going to live this long I would have taken better care of myself.” Memorable words from Eubie Blake, the American jazz composer, lyricist and pianist who died in 1983 at the age of 96. Today, people do take better care of themselves. Examples include rock legends Mick Jagger and Paul McCartney, the bad boys of the 1960s who became the good boys of the 1990s. Now, at 69 and 70 respectively, they continue to work, support worthy causes and enjoy a good quality of life.

Over the past 50 years, the number of people over 65 in the developed world has tripled and is projected to triple again by 2050. The UK's Office for National Statistics forecasts that a third of babies born in 2012 will live to 100. “Age is uninteresting,” said Groucho Marx. “All you have to do is to live long enough.” Age, however, has become interesting, as it is an unavoidable part of the human condition and a significant challenge for nations where millions will be retiring with a third of their lives still ahead of them. They will no longer be productive, but they will be in need of healthcare. Healthcare systems have been slow to adjust to the new realities of aging populations and the financial costs of treating the elderly.

One way for nations to manage retirement and aging was suggested by Euripides in the fifth century BC. “I hate men,” he said, “who would prolong their lives by foods and charms of magic art, preventing nature’s course to keep off death. They ought, when they no longer serve the land, to quit this life and clear the way for youth.” Euripides' sentiment resonates today. In advanced industrial economies there is a relatively low tolerance of elderly people. This is manifest in the number of offences against vulnerable elderly patients involving neglect and physical violence. In his 2013 report into the UK's Mid-Staffordshire NHS Foundation Trust, where hundreds of patients had died as a result of inadequate care, Robert Francis said that between 2005 and 2009 patients were subject to “appalling and unnecessary suffering”. In June 2012, at a conference at London's Royal Society of Medicine, Professor Patrick Pullicino claimed that each year UK National Health Service doctors prematurely end the lives of about 130,000 elderly hospital patients because they are difficult to manage, and to free up beds for younger patients.

According to a UN Report presented at the World Assembly on Aging in 2002, population aging is an unprecedented global phenomenon. The 21st century will witness more rapid aging than did the 20th century and countries that started the process later will have less time to adjust. There will be no return to the young populations of previous generations and aging populations will have profound implications for healthcare.

Moralists argue that healthcare is a human right and that all people should be treated similarly unless there are sound moral reasons not to do so. But who pays? Daniel Callahan, a contemporary philosopher widely recognized for his innovative studies in biomedical ethics, has an answer. Invoking Euripides, he argues that age should be a limiting factor in decisions to allocate certain kinds of health services to the elderly. The demographic shift, says Callahan, increases competition for scarce healthcare resources, and therefore healthcare should be rationed. Life-extending care for the over-70s should be replaced with less expensive pain-relieving treatment. Opponents of rationing suggest that wealthy governments should reduce their defense spending, increase their commitment to healthcare and enact reforms to cut costs and improve the efficiency of healthcare systems.

Callahan, however, has little faith in political leaders to deliver cost-cutting strategies and argues that calls to cut healthcare waste and inefficiency have been made for decades with no effect. This is certainly the case in the UK, where successive governments have failed to reconcile the escalating costs of healthcare with maintaining and improving the quality of care for the elderly. According to Callahan, “Our whole health care system is based on a witch’s brew of sacrosanct doctor-patient autonomy, a fear of threats to innovation, corporate and (sometimes) physician profit-making, and a belief that, because life is of infinite value, it is morally obnoxious to put a price tag on it.”

Some age-related incurable diseases that affect mostly older people in wealthy countries have contributed to the ghettoizing of age. One such disease is Parkinson's, a progressive degenerative neurological movement disorder, which affects between six and 10 million people worldwide. In the US, the combined direct and indirect costs of Parkinson's disease are estimated to be nearly US$25 billion per year. Medication for an individual person with Parkinson's costs on average US$2,500 a year, and therapeutic surgery, such as deep brain stimulation, can cost up to US$100,000 per patient.

However, not all age-related diseases are like Parkinson's. Indeed, it is not altogether true that old age corresponds to debilitating diseases and hikes in healthcare costs. Healthy years among the elderly are increasing, and the spike in health costs tends to come in the last two years of life, regardless of whether a person is 99 or nine. Rather than viewing the elderly as a burden and assessing them by their chronological age, it might be more appropriate to view them as assets and assess them by their number of healthy years. Healthy years are not necessarily years without illness, but years in which people manage whatever medical conditions they might have. A good example is Dame Maggie Smith, the English film, stage and television actress, who at the age of 78 recently won a Golden Globe Award for her role as the Dowager Countess of Grantham in the television series Downton Abbey.


Longevity is one of the greatest successes of 20th century medical science and nutrition, but its challenges include the dearth of health workers with geriatric skills, the prevention of physical disabilities and the extension of healthy years. Recent studies suggest that healthy aging is possible and that chronic non-communicable illnesses such as heart disease, diabetes and dementia may be delayed or prevented by certain lifestyle choices. Nevertheless, there are currently millions of elderly people who have not taken good care of themselves and who require specialist geriatric care.

In the US there is a monetary disincentive for doctors to specialise in geriatrics, since geriatricians earn significantly less per year than more mainstream specialists. Further, only 11 of the 145 US medical schools have fully fledged geriatric departments. In 2010 the US federal budget allocated $11 million to fund geriatric education. Interestingly, a substantial amount of geriatric care in wealthy countries today is undertaken by health professionals trained in poorer countries. This raises ethical questions about rich countries encouraging the immigration of health workers from countries that lack them, and about the responsibilities of migrant health professionals to their countries of origin. Although geriatricians in the UK are well compensated, the British Geriatrics Society reports that the number of geriatricians is not keeping pace with the needs of geriatric care.

According to the OECD, between 10% and 20% of the population in developed economies requires long-term care, which costs between 1% and 2% of GDP, and these costs are projected to increase. The costs of long-term care are skewed because a significant proportion of elderly care is carried out by informal, unpaid carers who are often family members. For example, in the UK there are 1.5 million official carers and about 5 million unpaid carers. In the developing world the situation is more extreme, and some 60% of people over the age of 60 live with their children or grandchildren. While familial care may yield significant benefits, it is not a long-term solution, because as developing economies become more westernized their family structures become more nuclear and less able to provide the support and care that they do now.

According to the first noble truth of Buddhism, life is painful and involves suffering. For a significant proportion of elderly people this is certainly the case, but it need not be. On an individual level, living longer must be welcome; more generally, however, the greying of populations is perceived in terms of increased costs and pressure on overstretched healthcare systems, rather than as a freeing-up of valuable resources that may contribute to society. Although elderly people tend to have long-term medical conditions, these are increasingly managed successfully to allow a good quality of life. Old age is not a disease. Elderly people are a valuable resource of intellectual capital and knowhow, which nations cannot afford to waste. Unlocking this reservoir of grey knowledge is important for the future wealth of nations. Let us hope nations have something better to offer their elderly than to call on them to do as Captain Oates did on 16 March 1912. On his return from the South Pole, Oates, convinced that his ill health compromised his comrades, walked from his tent into a blizzard saying, "I am just going outside and may be some time." He was never seen again.

Whose age is it anyway?


You can’t see “it”. You can’t touch “it”. “It” tends to creep up on you unnoticed. Every year “it” kills tens of millions and costs billions. “It” destroys households, communities and even nations. "It" has been described as "the biggest threat to the 21st century.”

“It” is chronic non-communicable diseases (NCDs): cancers, cardiovascular diseases, respiratory conditions and type 2 diabetes, four of the biggest killers, which have emerged as one of the greatest social and economic development challenges of this century.

In December 2012, Lord Crisp, representing the All-Party Parliamentary Group on Global Health, introduced a debate on NCDs in the Attlee Suite of the Houses of Parliament in London. Drawing on his experience as a former CEO of the UK's National Health Service, he suggested that the global NCD burden may only be successfully addressed by changing the way healthcare is delivered. Other speakers emphasised the complexity of global health issues.

From a global health perspective, NCDs now account for more deaths every year than AIDS, tuberculosis, malaria and all other causes combined, resulting in roughly two out of three deaths worldwide. Mental illness, which has significant health, social and economic implications, is also considered by some to be an NCD, but it rarely leads directly to death.

A 2011 report produced by the World Economic Forum and the Harvard School of Public Health, argues that, “Over the next 20 years, non communicable diseases will cost more than US$30 trillion, representing 48% of global GDP in 2010 and pushing millions of people below the poverty line. Mental health conditions alone will account for the loss of an additional US$16.1 trillion over this time span, with dramatic impact on productivity and quality of life.”

NCDs are often viewed as diseases of affluence, since their prevalence is highest in wealthier countries and they are associated with poor diets and sedentary lifestyles. The economic impact of NCDs in rich nations is compounded by the aging and shrinking of their populations, and extends beyond the costs to health services: the diseases affect economies, households and individuals by reducing labour productivity, increasing medical treatment costs and eroding savings. Over time, developed economies have accumulated the knowledge and expertise to treat and manage NCDs. In developing nations, however, NCDs are a relatively recent phenomenon; they are growing exponentially and each year kill millions at dramatically young ages. This is because developing economies lack the knowledge and expertise to treat and manage the diseases, and their policy makers show little interest in the prevention and control of NCDs.


This knowledge gap between developed and developing economies exacerbates the global NCD burden. Narrowing it entails capturing and organising relevant healthcare knowledge from wealthy nations, transferring it to developing countries and distributing it to where it is needed most. Doing so will help significantly to reduce and manage the global NCD burden, but this will only be achieved by the widespread use of cost-effective healthcare technology.

What is the most ubiquitous healthcare innovation? The mobile telephone and the smartphone, which combines telephony and computing. Although operationally relevant, such devices remain underdeveloped as healthcare tools. In today's world, the implementation of any global healthcare strategy should not be contemplated without leveraging telephony and computing technologies. As the reduction and management of NCDs is increasingly about scarce information and connectivity, these technologies and mHealth should become increasingly important.

Despite its underdeveloped status, over the past decade mHealth programmes, which use mobile telephones to distribute health information, have increased significantly in developing economies, especially in Africa. They point to scalable, cost-effective strategies to help reduce the NCD knowledge gap and address the growing global burden of NCDs.

Recently, the FDA has approved a number of mobile phone-based medical imaging and data monitoring devices. One is a $99 electrocardiogram that allows remote patients to monitor their heart rhythms at any time, from anywhere. The mobile app gives immediate feedback, and data can be simultaneously relayed to a cardiologist anywhere in the world for a specialist opinion.

Microsoft is taking advantage of mobile telephony's broad reach in Africa to develop an integrated healthcare information service, which serves both health workers and the general public. The system uses mobile phones to allow health workers to capture, store, process, transmit and access health information. Importantly, Microsoft has demonstrated that this lowers costs and enhances efficiency by eliminating redundancy and reducing the amount of time devoted to health information input. The public can also turn to the system for information: individuals pose frequently asked questions about health issues via SMS messages and receive replies straight to their mobile phones. Despite a high proportion of the users being poor, migrant, illiterate rural workers, Microsoft is convinced that its African mHealth service has the potential to become a valuable tool and is increasing its scope.
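As a purely illustrative sketch of how an SMS question-and-answer service of this kind might be structured, the example below matches incoming messages against a small keyword table and replies to the sender; the keywords, reply text and send_sms stub are hypothetical and do not describe Microsoft's implementation.

```python
import re

# Hypothetical sketch of a keyword-based SMS health FAQ responder.
# A real deployment would sit behind an SMS gateway run by a mobile operator;
# here the gateway call is stubbed out so the script can run standalone.

FAQ_REPLIES = {
    "malaria": "Malaria warning signs: fever, chills, headache. Sleep under a treated net and visit a clinic if fever lasts more than a day.",
    "hiv": "Free, confidential HIV testing is available at district health centres. Reply CLINIC for the nearest location.",
    "clinic": "Reply with your district name to receive the address of the nearest health centre.",
}

DEFAULT_REPLY = "Sorry, we did not recognise your question. Reply HELP for a list of topics."

def send_sms(phone_number: str, text: str) -> None:
    """Stand-in for an SMS gateway call; here it simply prints the outgoing message."""
    print(f"To {phone_number}: {text}")

def handle_incoming_sms(phone_number: str, message: str) -> None:
    """Reply to the sender using the first recognised keyword in the incoming message."""
    words = re.findall(r"[a-z]+", message.lower())  # normalise case and strip punctuation
    reply = next((FAQ_REPLIES[w] for w in words if w in FAQ_REPLIES), DEFAULT_REPLY)
    send_sms(phone_number, reply)

if __name__ == "__main__":
    handle_incoming_sms("+254700000000", "What are the signs of MALARIA?")
```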

A study, published in the American Journal of Managed Care, concluded that mHealth can improve the management of diabetes and other NCDs while reducing visits to clinics. It argues that personalised healthcare is an under-represented feature in the management of NCDs and suggests that social media concepts developed by Facebook might be used in the self-management of NCDs and merit more consideration.


The International Telecommunication Union estimates that mobile subscriptions worldwide reached 6.5 billion by the end of 2012 and are projected to reach eight billion by the end of 2016. Cheaper handsets, ever-decreasing data charges, improvements in phone web browsers and increased 3G coverage have fundamentally changed the way we use our phones, with the result that smartphones are increasingly being used as healthcare devices.

According to Strategy Analytics, the number of smartphone users globally rose above one billion in Q3 2012, and the current paths of mobile technology and social networking are inextricably linked. Some 650 million people globally now use their mobiles for email and social networking. Although smartphone users make up only 13% of the world's mobile users, they generate two-thirds of the world's mobile traffic. Over the next five years this data traffic is expected to increase by 700% on average per user. By 2015 the number of smartphone users is expected to reach 1.4 billion, about 30% of total mobile subscribers worldwide.

Today, Australia, the UK, Sweden, Norway, Saudi Arabia and the UAE each have more than 50% of their populations on smartphones. The US, New Zealand, Denmark, Ireland, the Netherlands, Spain and Switzerland have greater than 40% smartphone penetration. All these countries have an escalating burden of NCDs. Mobile phone penetration across Africa is around 70%, but smartphone penetration is only 10% to 15%. Nigeria is the leading African country for smartphones, with a penetration of 41%, followed by South Africa with 31% and Kenya with 7%. However, the costs of smartphones are falling, and telecom companies such as Huawei and ZTE are aggressively driving smartphone sales among the developing world's rapidly modernizing consumers, looking to lift smartphone penetration in Africa closer to the 70% level. In February 2011 Huawei partnered with Safaricom to offer the Android-based Ideos smartphone to the Kenyan market for US$80. Huawei is now attempting to build on the Ideos' momentum in Nigeria.

These trends suggest that there are significant opportunities to reduce and manage NCDs by healthcare programmes piggybacking on existing global and local mobile networks. Narrowing the global NCD knowledge gap requires targeting risk factors and promoting healthier lifestyles. This means focused prevention efforts while mitigating the impact of NCDs on economies, health systems, households and individuals. Such a strategy must involve individuals, households and communities because the causal risk factors are deeply embedded in the social and cultural framework of communities. This will require a significant change in the way healthcare is implemented: a move away from diagnosis and treatment towards prevention and the promotion of wellbeing.

To reduce premature morbidity and mortality caused by NCDs, governments will need to invest in mHealth strategies to improve patient awareness of their own health and encourage them to manage their own wellbeing. Over time, this should free up resources that can then be focused on the patients most in need, while relieving the economic burden of NCDs on society as a whole and eventually leading to increased productivity.

Lord Crisp is right to suggest that the global NCD burden will only be successfully addressed by changing the way healthcare is delivered.
The "complexity" of global health issues, cited by speakers at the London NCD debate, is more a function of the forces protecting the status quo than of the issues themselves. Mobile networks are ubiquitous. mHealth is operationally relevant. Governments are slow to address the NCD burden effectively. Is the missing part recruiting the help of Mark Zuckerberg?


We have always been and always will be married to our own health. In the future, however, we will be taking greater responsibility for it. The British government is encouraging more people to use modern technology to increase control over their health. Under the new UK NHS Mandate launched on 14th November 2012 by Health Secretary Jeremy Hunt, patients will be encouraged to give feedback on the quality of their care, so others can then choose between hospitals. By 2015, modern communications technology is expected to play a substantially bigger role in the UK’s healthcare system and a significant proportion of patients will be booking GP appointments online and ordering repeat prescriptions over the internet. Launching The Mandate, Hunt said: “Never in its long history has the NHS faced such rapid change in our healthcare needs, from caring for an older population, to managing the cost of better treatments, to seizing the opportunities of new technology.”

The UK Mandate marks the beginning of a redefinition of health and healthcare away from a focus on disease towards a focus on patients and individual wellbeing. A significant driver of this shift is the rising cost of care. In January 2012, Standard & Poor's published a report suggesting that the creditworthiness of leading developed countries would be in jeopardy if they did not stem the escalating cost of healthcare. Highlighting the US, Germany, the UK and France, it said: “We project that healthcare costs for a typical advanced economy will stand at 11.1 per cent of GDP by 2050, up from 6.3 per cent of GDP in 2010.”

In the US, which is richer than most other countries and where people are willing to spend more on healthcare, the situation is particularly acute. In 2000 US healthcare spending was 13.8% of GDP; by 2010 it had increased to 19.8%. Over the past 40 years, healthcare costs in the US have risen significantly faster than the overall economy or personal incomes, a trend that cannot continue forever. Americans pay hospitals and doctors more than patients do in most other rich countries. US insurance incentives entice doctors and patients to use expensive medical services more than is often warranted. Americans rely more on costly specialists, who tend to overuse advanced imaging technologies and resort to expensive surgical or medical procedures far more than doctors do in other countries. This suggests that wealth, as well as aging, is a significant driver of health costs.

As advanced industrial economies become wealthier, their healthcare spending converges with that of the US. According to Stuart Fletcher, the CEO of BUPA, British patients are increasingly bypassing health insurance companies and paying private hospitals and specialists directly, and specialist fees are continuing to rise. Rising fees and the relative lack of competition and transparency among private hospitals slow the rate at which people take up private health insurance. In the medium term, says Fletcher, this will create an “affordability crunch”. BUPA's half-year profits for Europe and North America fell by 22%. BUPA has responded to the changing market conditions by offering patients more power and greater choice, and is trialling a new scheme in which it acts as a broker for patients, helping them negotiate the best price and quality of treatment for 70 common conditions.

This is not only an issue for health insurers. In most countries healthcare expenditure is rising twice as fast as economic growth. This suggests that unless something is done to change the situation, healthcare systems are economically unsustainable, which will inevitably lead to healthcare programmes being either reduced or cut. Healthcare costs are set to escalate further because of the worldwide pandemic of chronic non-communicable diseases: cancer, diabetes, and heart and respiratory conditions. Margaret Chan, the Director-General of the World Health Organization, views chronic non-communicable diseases as “the biggest threat to the 21st century”. Today, 60% of all deaths are due to these diseases: twice the number due to communicable diseases. However, this is not only about mortality; it is also about morbidity and dependency, and about the economic impact of treatment costs and lost productivity, which has been estimated at nearly US$50 trillion over the next 20 years.

Richard Saltman, Professor of Public Health at Emory University in the US, said that a key theme, as governments seek to curtail healthcare costs, would be “rethinking the balance between collective and individual responsibility.” This raises the prospect in many countries of people being expected to take greater charge of their own health, which is what the new UK NHS Mandate is nudging towards.

Patient-focused healthcare means a shift away from reactive medicine based on diagnosis and treatment. Reactive healthcare systems primarily treat patients after the onset of disease, incur significant costs and usually do not restore patients to full health. To be sustainable, healthcare systems will have to go further than the new UK NHS Mandate. They will have to be redefined, away from the diagnosis and treatment of illness and towards the promotion of wellbeing. This gives greater importance to mental health and to complementary and traditional medicine, and shifts the emphasis away from hospitals and clinics towards less traditional settings, including the home, where individuals will be better positioned to take control of their own health.

 


This year the world was gripped by the question of who would get the keys to the White House. One thing we all learnt from the 2012 presidential election is that America is a deeply divided society, and this is nowhere more evident than in the nation's capital.

Washington DC, the capital city of the richest country on Earth, has an HIV infection rate of 3.2%, the highest of any large city in America, placing it well above many African cities renowned for their high prevalence of HIV/AIDS. How can this be so in the world's wealthiest nation, with a plentiful supply of antiretroviral drugs, efficient systems to administer them and effective, popular ways of interrupting the spread of the disease?

In North America alone, there are 1.4 million existing cases of HIV/AIDS, and in the US the disease is the sixth-leading cause of death among 25- to 44-year-olds. Over the past decade, the US has been stuck at about 50,000 new HIV infections each year, while in the rest of the world the rate of new infections has slowed. Washington's high rate of HIV infection is a story of two Americas, brought into sharp relief during the presidential election: one of affluence and another of neglect, poverty and unresolved social issues.

In July 2012 a premier gathering of some 30,000 people, comprising those working in the field of HIV as well as policy makers, people living with HIV and other individuals committed to ending the pandemic, converged on Washington DC to participate in the 19th International AIDS Conference. It was the first time the conference could be held in the US, thanks to bipartisan action by Presidents Obama and George W. Bush and the Congress to lift the ban on people living with HIV entering the country.

Participants celebrated the fact that the global AIDS pandemic is coming under control and that over the past five years the rate of new annual HIV infections has dropped significantly, a fact seized on by Secretary of State Hillary Clinton in her opening remarks to the conference: “The ability to prevent and treat the disease has advanced beyond what many might have reasonably hoped 22 years ago.”

Since the AIDS pandemic started in the early 1980s, more than 60 million people have been infected with HIV and nearly 30 million have died of HIV-related causes. HIV is one of the world's leading infectious killers, claiming more than 25 million lives over the past 30 years. In 2011, there were approximately 34 million people living with HIV. HIV/AIDS affects economies, health systems, households and individuals by reducing labour productivity, increasing medical treatment costs and eroding savings.

HIV/AIDS is most threatening to people between the ages of 18 and 44, and therefore affects economies and households by killing off young adults. It significantly weakens nations and slows their economic growth by reducing the taxable population and the resources available for public expenditure, such as education and health services. At the household level, HIV/AIDS increases the cost of medical care while reducing a family's ability to earn income or undertake productive work. The loss of adults in a family has dramatic implications for family wellbeing, and the growing prevalence of HIV/AIDS among women has significant repercussions for future generations.

For many years, there were no effective treatments for AIDS, but things are very different today as sufferers can use a number of drugs to treat their infection. Although there is no cure for HIV infection, antiretroviral therapy (ART) can suppress HIV by controlling the replication of the virus within a person's body and allow an individual's immune system to strengthen and regain the power to fight off infections. With ART, people with HIV can live healthy and productive lives.

The 2012 Washington International AIDS Conference closed with the message that, short of a vaccine and cure, getting treatment to more of the world's 34 million sufferers is critical to curbing the epidemic. Nobel Laureate Francoise Barre-Sinoussi, co-discoverer of the AIDS virus, said it is "unacceptable" that scientifically proven treatment and prevention tools are not reaching the people who need them most. However, in recent years there have been significant successes in this regard. By the end of 2011 more than 8 million people living with HIV in low- and middle-income countries were receiving ART. This is a 20-fold increase in the number of people receiving ART in developing countries between 2003 and 2011, and a 20% increase in just one year: from 6.6 million in 2010 to more than 8 million in 2011.


In the US and other rich countries many HIV patients are taking a combination of antiretroviral drugs, a regimen known as highly active antiretroviral therapy (HAART). When successful, combination therapy can reduce the level of HIV in the bloodstream to very low levels and sometimes enable the body's immune cells to rebound to normal levels.

In May 2003, when antiretroviral therapies were not generally available, especially in developing countries, the US Congress approved President George W. Bush's request for a five-year, $15 billion programme that launched the US Global AIDS initiative and the President's Emergency Plan for AIDS Relief (PEPFAR). Although President Bush advocated HIV/AIDS as a health and human rights issue, it is reasonable to assume his motivation was also influenced by the pandemic's negative impact on economic development.

Fast forward to December 2012, when Secretary Clinton commemorated World AIDS Day by unveiling the PEPFAR Blueprint: Creating an AIDS-free Generation, which provides an actionable strategy to reduce and control the AIDS epidemic within the next four to five years. PEPFAR spends nearly US$7 billion a year in more than 35 countries. It is supported by state-of-the-art technology, scalable global distribution systems and influential organisations such as the Bill & Melinda Gates Foundation and the Clinton Foundation.

Researchers are working to develop new therapies known as fusion and entry inhibitors that can prevent HIV from attaching to and infecting human immune cells. Efforts are also underway to identify new targets for anti-HIV medications and to discover ways of restoring the ability of damaged immune systems to defend against HIV and the many illnesses that affect HIV-infected individuals. Ultimately, advances in rebuilding the immune system in HIV patients will benefit people with a number of serious illnesses, including Alzheimer's disease, cancer, multiple sclerosis and immune deficiencies associated with aging and premature birth.

The management of HIV/AIDS is challenged by the fact that in many high-prevalence countries the number of people becoming infected with HIV each year exceeds the number starting antiretroviral therapy, which perpetuates the growth of the epidemic. For AIDS to be controlled, this phenomenon needs to be reversed. A 2011 study showed that antiretroviral therapy reduces an infected person's chances of transmitting the virus through sexual intercourse by 96%. When HIV-positive pregnant women take antiretroviral drugs, fewer than 5% of their babies become infected. Circumcision reduces a man's chances of acquiring HIV sexually by about 60%. Secretary Hillary Clinton's Blueprint to reduce and manage the global HIV/AIDS epidemic is simple: control HIV through a concerted effort that starts more infected people on antiretroviral therapy, ensures that every HIV-positive pregnant woman is treated and extends circumcision to men in high-prevalence countries. Within four to five years, this strategy is expected to produce a tipping point that would allow the disease to start burning itself out.

Despite continued intensive research, we are still a long way from achieving a safe, effective and affordable AIDS vaccine. Until such a time, using condoms is by far the most cost-effective and scalable means of preventing the transmission of HIV. The primary means of transmitting HIV infection is unprotected sex, which encompasses oral, anal and vaginal sex. Since the surest form of transmission is blood-to-blood, the risk is greatly increased by trauma to the oral cavity. People with bleeding gums, ulcers, genital sores or STDs have an increased risk of transmission through oral contact.

Washington's high incidence of HIV infection is a story of sex and the city. Today, the majority of the world's poorest people live in urban areas, which are incubators of disease, and Washington DC is no exception. Worldwide, there are some 600 cities with more than one million inhabitants. In cities throughout the world there are entrenched and unresolved social issues: under-privilege, lack of education, low self-esteem, drug abuse and alcoholism, too much unprotected sex and too many citizens who have no idea of their sexual partners' HIV status. In the US this will manifest itself every week throughout 2013, when about 1,000 Americans, with a high concentration in Washington DC, will acquire HIV infection, and some will eventually die from it.


Africa is sick. Ninety per cent of the world's cholera cases occur in Africa. Meningococcal meningitis is epidemic in most African countries. Yellow fever is endemic in 23 African countries. Africa has more than 28 million HIV/AIDS cases, and 75% of the world's AIDS population lives in sub-Saharan Africa. Of the one million annual malaria deaths, 90% occur in the same region. Measles is common throughout Africa and results in high levels of morbidity and mortality. Lassa fever accounts for about 0.4 million deaths each year, and avian influenza is endemic in many African countries.

This is not the whole story. In addition to being plagued with infectious diseases, Africa has a neglected epidemic of chronic non-communicable diseases (NCDs). Over the next decade the continent is projected to experience the largest increase in mortality rates from cardiovascular disease, cancer, respiratory disease and diabetes.

Although international health agencies and national governments are beginning to recognize and confront the significant global burden of NCDs, awareness in Africa is still relatively low; political leaders there have not shown much interest in NCDs, and this is reflected in the allocation of health budgets. This neglect compounds Africa's healthcare and development challenges, since the projected rise in NCDs throughout the continent is expected to occur on a compressed timeline compared to high-income countries, and Africa has restricted capacity to respond to the magnitude of its disease burden.

International organisations have flagged the magnitude and the urgency of the challenge. Healthcare advice from numerous non-governmental agencies in the developed world on ways to deal with Africa's escalating disease burden is forthcoming. This has been especially the case over the past decade, when humanitarian aid budgets have peaked. Agency recommendations have been high on overall strategy and low on cost-effective and scalable means of delivering that strategy.

 

Most advice includes epidemiological surveillance, primary programmes that target healthy populations and secondary preventative programmes aimed at reducing complications in affected populations. All agencies agree that human resources are crucial to viable African health systems. Hitherto, human resources have been a neglected component of African healthcare. A common implementation strategy recommended and implemented by several non-governmental agencies is to organise health workers from the developed world to spend time in African countries teaching the teachers. To assist such programmes, some agencies recommend that African governments build more roads to enable health workers to gain better access to rural areas where healthcare provision is poor or non-existent. Education is crucially important, but the key question is: how do you educate enough people to make a difference?

Africa has a population of over one billion, about 15% of the global total but only 2% of global GDP, and its population is projected to double by 2050. Africa is exposed to multiple health risks combined with inadequate preventative healthcare and education. Projected trends in Africa's disease burden and the consequent rates of morbidity and mortality highlight the inadequacy of some popular traditional responses to Africa's healthcare challenge. In addition to the enormity of its disease burden, Africa, which has weak health systems, also has significant long-standing structural, logistical, human and organisational barriers to the implementation of well-intended traditional healthcare programmes, many of which focus on teaching the teachers.

So, despite well-intended traditional interventions, Africa's disease burden continues to grow, and its overall effect is likely to decrease productivity, lower competitiveness, increase fiscal pressure, expand poverty and create greater inequity in most African countries. More scalable and effective solutions are required. These should build on Africa's strengths, which are her established and fast-growing telecommunications networks and her relative absence of legacy healthcare systems. Current trends in disease prevalence and treatment costs will force African countries to make deliberate and innovative choices in order to address their disease burdens in sustainable and effective ways. Such choices are more likely to employ modern technology than to build more roads. In Africa, mobile penetration exceeds infrastructure development, including paved roads and access to electricity and the internet. According to the World Health Organization's (WHO) Global Observatory for eHealth, some 40 African countries are using mobile health services.

 

Africa is the fastest-growing mobile telephone market in the world and the biggest after Asia. Over the past five years the number of subscribers on the continent has grown by some 20% each year. By the end of 2012 Africa is projected to have 735 million mobile subscribers. The nature of Africa's mobile market is also changing. Today, the smartphone penetration rate in Africa is estimated to be about 18%, almost one in five, and is projected to reach 40% by 2015. While patchy, mobile penetration rates in sub-Saharan Africa, where the disease burden is greatest, are not low, and the rate of smartphone penetration there is estimated to be about 20%.

In 2007 Safaricom, a leading mobile phone network in Kenya, launched M-Pesa, a mobile phone-based payment and money transfer service for people too poor to have a bank account. M-Pesa spread quickly and has become the most successful mobile phone-based financial service in the developing world. Today there are some 17 million registered M-Pesa accounts in Kenya. It is only a small step to offer a mobile health information service to all M-Pesa account holders.

Africa’s new highways to carry healthcare information are virtual rather than physical. They already exist, they are extensive and, over the course of the next five years, are projected to rapidly expand and improve. With such an infrastructure one teacher can educate millions of people, which is significantly more cost effective and sustainable than traditional healthcare programmes.

Further, Africa will not be able to diagnose and treat its way out of its disease burden. Increasingly, healthcare programmes will need to emphasise prevention, alongside efforts to strengthen health systems to provide early diagnosis and targeted, cost-effective and scalable treatments that are fiscally sustainable and suited to each country's epidemiological profile. Such solutions will need to fit complex, overstretched and under-resourced health systems, address the enormity of the escalating disease burden and bring about the desired changes in specific African countries' health systems. This cannot be achieved simply by repeating traditional healthcare programmes delivered by non-governmental agencies from developed countries.

According to the International Telecommunication Union there are some 5 billion wireless subscribers in the world today, and over 70% of them reside in low- and middle-income countries. In 2011, Africa held its first mobile health summit in South Africa, firmly putting mobile telephony at the centre of improving healthcare in poor countries. A 2011 WHO global survey of the use of mobile telephony in healthcare (mHealth) reported that commercial wireless signals cover over 85% of the world's population. Eighty-three per cent of the 122 countries surveyed in the report used mobile phones for free emergency calls, text messaging and pill reminders.

Modern technologies have the scalability to provide the basis for Africa to develop country-congruent health policies that are locally applicable. Technological systems such as mobile telephony, the internet and biometric identification, when appropriately implemented, have the capacity to empower individuals and encourage them to take care of their own health. Further, such technologies have the capacity to improve targeting, reduce fraud and increase access to healthcare. Technology-based healthcare strategies offer Africa an opportunity to leapfrog its ineffective traditional healthcare systems and begin to manage the enormity of its disease burden and, in turn, may benefit the whole world by demonstrating the advantages of patient-centred healthcare.


Is it possible for doctors to provide care without being perceived as taking sides during conflicts? This question is posed more and more as attacks on health workers in war zones increase.

In January 2012, Khalil Rashid Dale, a health worker travelling in a clearly marked International Committee of the Red Cross (ICRC) vehicle to Quetta, the capital of Baluchistan province in Pakistan, was abducted by unknown armed men. Some four months later his beheaded body was found in an orchard. Also in January, two Médecins Sans Frontières (MSF) health workers were killed in Mogadishu, Somalia. The consequences of such attacks are disproportionate in their impact. The Somalia killings led to MSF closing two 120-bed medical facilities in Mogadishu, which served a population of some 200,000 and which, over the previous year, had treated close to 12,000 malnourished children and provided measles vaccinations and treatment to another 68,000 patients.

In 2011 Robin Coupland, a former trauma surgeon and now a medical adviser with the ICRC, co-authored Health Care in Danger, a study which describes how and why health workers get caught in the crossfire and what the consequences are when they do. The study was used to launch an ICRC campaign to raise awareness of the problem and make a difference to health workers on the ground.

For some people, however, it is impossible for doctors to provide care without being perceived as taking sides during conflicts. Some argue that as the quantum of humanitarian aid has increased over the past decade, humanitarian aid agencies have been compelled to rely on sub-contracting in actual conflict areas. This, it is suggested, provides a breeding ground for aid corruption that finances nefarious elites and further destabilizes conflict areas, implying that the healthcare activities of humanitarian organisations in war-torn regions have become increasingly politicised. Even agencies that make considerable efforts to disassociate themselves from political actors and project an image of neutrality have not been immune from attack.

Do warring factions perceive health workers as supporting the enemy and therefore see them as legitimate targets? Or are health workers targeted because they represent an opportunity to amplify messages to a global audience? It is likely that both are true, but the removal of vital healthcare from war zones as a result of these attacks can have devastating consequences for society.
