
Future Treatments of Pain

When something causes less pain than expected, it can even feel pleasant, a new study reveals. These findings may one day play a key role in treating pain and substance abuse. If you accidentally stub your toe against a doorframe, you are probably going to find it very painful. As a purely intellectual experiment, imagine purposefully kicking a doorframe hard enough to potentially break your toe. When it turns out your toe has been battered but not broken, the pain may be interpreted more as a relief. It is not hard to understand that pain can be interpreted as less severe when an individual is aware that it could have been much more painful. Less expected, however, is the discovery that pain may be experienced as pleasant if something worse has been avoided!

When working as a research fellow at Oxford University, Dr Leknes became curious about what can be called the "it could have been worse" phenomenon. How is the experience of pain affected by a feeling of relief from realizing that it was not as bad as expected? Dr Leknes recruited 16 healthy subjects who prepared themselves for a painful experience. They were repeatedly exposed to heat of varying intensity applied to their arm for four seconds. The experiments were carried out in two different contexts: in the first, the heat was either not painful or only moderately painful – about the same as firmly holding a coffee cup that is slightly too hot. In the second, the heat was either moderately or intensely painful. In this context, moderate pain was the lesser of two evils.

The research subjects reported how they interpreted the pain. In addition, while they were exposed to the stimuli their brain activity was measured by functional magnetic resonance imaging. The intense heat triggered negative feelings among all participants, whereas the non-painful heat produced positive reactions. What intrigued the researchers was the subjects’ response to moderate pain. In the experiments where moderate pain was the worst alternative, the pain felt was unpleasant. In the instances where it was the best alternative, the subjects experienced the moderate pain as positive – even comforting. The likely explanation is that the subjects were prepared for the worst and thus felt relieved when they realized the pain was not going to be as bad as they had feared. In other words, a sense of relief can be powerful enough to turn such an obviously negative experience as pain into a sensation that is comforting or even enjoyable.

When a sense of relief turned pain into pleasure, the researchers found activity in the middle of the frontal lobes of the brain, areas normally associated with comfort and the relative value of a specific experience (Image A). At the same time, they observed a change of activity in the region of the brain stem that regulates pain, for example when morphine is administered in medical treatment (Image B).

The functional magnetic resonance imaging examination revealed that the brain changed how it processed moderate pain according to the context and what the alternative was. When the pain was comforting, there was more activity in the areas of the brain associated with pleasure and pain relief and less activity in the areas associated with pain.

Could this shape the future of pain treatment?

The study illustrates that one and the same stimulus is interpreted very differently by different individuals and that the experience is connected to expectation and context. Some individuals like the burning sensation of eating chili peppers, for example, while others enjoy sadomasochistic sex. Moreover, envisioning an even worse alternative than what is actually experienced may help a person interpret involuntary pain as something agreeable. Nevertheless, pain is generally a highly unpleasant experience, and current pain-alleviation treatments are inadequate for many people. That is why it is so important to find out how, and to what degree, the brain can control pain on its own.

Would it always be advisable then for a doctor to inform a patient that a procedure or treatment is going to be very painful? In some situations this may be a good approach, but not always. Doctors observe that their patients react very differently to the information they are given; certain patients are likely to experience a genuine sense of relief if they prepare for the worst only to find it not so bad after all, whereas others prefer to avoid worrying beforehand and want to know as little as possible about what they will be undergoing.

Substance abuse – from pleasure to relief: Relief is also likely a vital factor in substance abuse. Over time, the effect of alcohol and drugs will change from triggering feelings of pleasure to primarily alleviating the discomfort of addiction. The brain’s regulatory processes change, causing substance abusers to experience a shift; at some point, they use alcohol and drugs in order to achieve a neutral state and avoid feeling awful.

By studying relief in order to understand how this process works, scientists have come up with new ideas for treating substance dependence more effectively. From pain research, scientists know that the relief mechanisms in the brain become disrupted in patients with chronic pain. This may be something shared by patients suffering from chronic pain and those with alcohol and drug addiction alike.


Choosing Wisely


An expanded group of medical societies has released an update to last year's Choosing Wisely list of common but often unnecessary tests and procedures, a host of cardiovascular no-goes among them. Announced Thursday, the new list includes almost 90 tests and procedures, with 17 medical groups as signatories, including, for the first time, the American Society of Echocardiography, the Society of Cardiovascular Computed Tomography, and The Society of Thoracic Surgeons, among others.

As reported by heartwire, this year's list takes particular note of a peculiar kind of patient: the one without symptoms. No fewer than 12 of the guidelines issued as part of the American Board of Internal Medicine Foundation Choosing Wisely campaign caution physicians that asymptomatic patients probably don't need a given treatment. A few examples pertinent for cardiologists, along with the society that recommended each: Don't screen for carotid artery stenosis in asymptomatic adult patients (American Academy of Family Physicians). Don't perform a routine pre-discharge echocardiogram after cardiac valve-replacement surgery (The Society of Thoracic Surgeons). Avoid stress echocardiograms in asymptomatic patients who meet low-risk scoring criteria for coronary disease (American Society of Echocardiography). Don't require preoperative stress testing before noncardiac thoracic surgery in patients with no cardiac history and good functional status (American Society of Echocardiography). Don't use coronary artery calcium scoring for patients with known coronary artery disease, including stents and bypass grafts (Society of Cardiovascular Computed Tomography). Don't repeat echocardiograms in stable, asymptomatic patients with a murmur or click where a previous exam revealed no significant pathology (American Society of Echocardiography).

American Board of Internal Medicine president Dr Christine Cassel says such rules of thumb seek to change the mindset, in physicians and patients alike, that more is better, a mindset that leads to wasteful spending and sometimes puts the patient at risk. The watchdog organization Consumer Reports is working with other consumer-oriented groups, such as AARP, the Leapfrog Group, and the National Partnership for Women & Families, as well as Wikipedia, to spread the Choosing Wisely guidelines to patients. This public outreach seeks to educate Americans that not every test and procedure is appropriate for a particular condition. Sometimes patients request a treatment they don't need, and it takes much longer to dissuade a patient from asking for a test than to actually order it.

CT Angiography Sees Coming Heart Attack Risk From Afar

Coronary computed tomography angiography is an indispensable tool for determining the risk of heart attacks and other adverse cardiac events in patients with suspected coronary artery disease, but no treatable risk factors, such as high cholesterol or high blood pressure, according to a new study published online in the journal Radiology. Coronary computed tomography angiography should be considered as an appropriate first-line test for patients with atypical chest pain and suspected but not confirmed coronary artery disease.

Heart disease is the leading cause of death in the U.S., according to the Centers for Disease Control and Prevention. Treatment often involves addressing modifiable cardiovascular risk factors such as elevated cholesterol, high blood pressure, diabetes and smoking. However, some risk factors, like family history, are not modifiable, and no risk models exist to help internists identify those symptomatic patients without cardiac risk factors who are at an increased risk of death and myocardial infarction.

This scenario, in which patients are symptomatic but have no cardiac risk factors, comes up often in clinical practice, and internists lack a good tool to stratify these patients into risk groups. Coronary computed tomography angiography is a noninvasive test that has shown high accuracy for the diagnosis or exclusion of coronary artery disease. However, referral of patients with suspected coronary artery disease is often based on clinical risk-factor scoring, and less is known about the prognostic value of coronary computed tomography angiography in individuals with no medically modifiable risk factors.

In the first study of its kind, Dr. Leipsic and colleagues correlated coronary computed tomography angiography findings with the risk of major adverse cardiac events in patients with suspected coronary artery disease but no medically modifiable risk factors. They drew the data from the Coronary CT Angiography Evaluation For Clinical Outcomes: An International Multicenter (CONFIRM) registry.

After an average follow-up of 2.3 years, 104 patients had experienced a major adverse cardiovascular event. The researchers identified a high prevalence of coronary artery disease in the study group, despite the absence of modifiable risk factors. More than one-quarter of the patients had non-obstructive disease or disease related to the buildup of plaque in the arteries, and another 12 percent had obstructive disease with a greater than 50 percent narrowing in a coronary artery.

The scientists found that patients with narrowing of the coronary arteries on CT had a much higher risk of an adverse cardiac event, even those without a family history of heart disease. Both symptomatic and asymptomatic patients with obstructive disease faced an increased risk of a major cardiac event. In contrast, the absence of coronary artery disease on coronary computed tomography angiography was associated with a very low risk of a major event.

These findings highlight the need to refine the evaluation of individuals who may be missed by traditional methods of coronary artery disease assessment. If a patient shows up with vague symptoms and no medically modifiable risk factors, internists often dismiss them or order a treadmill test, which won't identify atherosclerosis and has only modest sensitivity for detecting obstructive disease. Coronary computed tomography angiography could help address this problem by diagnosing or ruling out coronary artery disease and identifying those who may benefit from more intensive therapy.

The researchers continue to study the CONFIRM registry data, aiming to learn more about the relationship between plaque and heart attacks and about the long-term outlook for patients with coronary artery disease. Scientists are now collecting data to determine the prognostic value of coronary computed tomography angiography after five years or more of follow-up, which will be very important for the field.

CONFIRM: The Coronary CT Angiography Evaluation For Clinical Outcomes International Multicenter registry.



The Bench-to-Bedside Program funds research teams seeking to translate basic scientific findings into therapeutic interventions for patients and to increase understanding of important disease processes. The program accomplishes this mission by addressing barriers, such as the traditional silos between basic and clinical researchers in biomedical research, that can hinder progress toward finding new therapeutics for patients in need. Bench-to-Bedside teams involve basic and clinical researchers, often from different National Institutes of Health Institutes and Centers. In 2006, the program's charge was expanded to unite the efforts of intramural and extramural National Institutes of Health researchers. Intramural science refers to research conducted on the National Institutes of Health campus by federal employees, while extramural research is funded by the National Institutes of Health but conducted by investigators and institutions outside it.

The Bench-to-Bedside program exemplifies the benefits of intramural-extramural collaboration: the extramural community gains access to the Clinical Center's unique resources, and the intramural community can pursue innovative research with extramural investigators. Projects, funded by various National Institutes of Health offices and institutes, have represented several research categories, including AIDS, rare diseases, behavioral and social sciences, minority health and health disparities, women's health, rare-disease drug development, pharmacogenomics, and general research.

Through the end of the 2012 program cycle, about 700 principal and associate investigators have collaborated on 209 funded projects, with approximately $48M distributed in total Bench-to-Bedside funding. The introduction of extramural collaborations in 2006 has resulted in partnerships at 74 institutions, 27 of which are Clinical and Translational Science Award sites.

Modern medicine keeps uncovering new ways to investigate autoimmunity, generating boundless amounts of genetic, cellular and imaging data. Although the precision with which this information can define the etiology and mechanisms of a particular autoimmune disease is encouraging, much work lies ahead before all the knowledge acquired can be translated into the clinic. In ‘Bedside to Bench’, Calliope A. Dendrou, John I. Bell and Lars Fugger discuss the promise and limitations of genome-wide and next-generation genetic studies for understanding the mechanisms driving autoimmune disorders, and the role of experimental medicine in the new era of integrative clinical practice and personalized medicine.

In ‘Bench to Bedside’, Lawrence Steinman argues for the concept of a ‘hub and spoke’ pattern of T cell activation and organ targeting in multiple sclerosis, inflammatory bowel disease and type 1 diabetes. This paradigm suggests new ways to develop drugs that keep autoreactive T cells in the organ where activation occurs, precluding them from reaching the target organ and causing disease.


New Drugs and Greater Awareness Needed for a Growing Population with Arterial Stiffness and Hypertension

Image Credits: Paul Paradis. Flowers are the sweetest things God ever made and forgot to put a soul into.


An aging population grappling with rising rates of hypertension and other cardiometabolic risk factors should prompt an overhaul of how hypertension is diagnosed and monitored, and should spur development of drugs with entirely new mechanisms of action. Speaking here at the 2013 International Conference on Prehypertension and Cardiometabolic Syndrome, meeting cochair Dr Reuven Zimlichman of Tel Aviv University, Israel, argued that the definitions of hypertension, as well as the risk-factor tables used to guide treatment, are no longer fitting for a growing number of patients.

In recent decades the population has grown older, and the elderly now make up the biggest group of hypertensive patients. This group will continue to grow in the coming years as life expectancy increases. The result is burgeoning numbers of patients with isolated systolic hypertension, whose disease is typically poorly diagnosed and underserved by current therapies. Whereas classic, diastolic hypertension is caused by humoral changes and excessive vasoconstrictive factors, isolated systolic hypertension is caused by arterial stiffening. So we have a growing group of patients with hypertension caused by a different mechanism, arterial stiffness, yet we treat these patients with conventional medications that simply lower blood pressure.

Most antihypertensives today work by producing vasodilation or decreasing blood volume, and so are ineffective in patients with isolated systolic hypertension. In the future, we will have to look for entirely different medications that aim to improve, or at least stabilize, arterial elasticity; that is, drugs targeting the factors that determine arterial stiffness, such as collagen and fibroblasts. No group of antihypertensive medications today has these as its aim.

The definitions of essential and secondary hypertension have changed very little over the past few decades, typically only being tweaked up or down in relation to other cardiovascular risk factors. Diastolic hypertension has been the primary goal of treatment. Treatment goals have not adequately accounted for patient age, even though arterial stiffening plays a larger role in older patients, and they have typically relied too heavily on threshold cutoffs rather than on the linear progression of risk factors and their impact on organ damage.

This means that a patient with a systolic blood pressure of 138 mm Hg will be classified as normal risk, while a patient at 142 mm Hg will be classified as hypertensive, despite a difference of just 4 mm Hg between the two. We know that if we repeat the measurement we will probably not get the same result. What we have to understand is that different patients need different approaches. A patient with hypertension who also has additional risk factors, such as diabetes or dyslipidemia, probably needs to be treated earlier, at lower blood-pressure levels, and we do not always take this into account.
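The arbitrariness of such a hard cutoff is easy to sketch in code (an illustrative toy only; the 140 mm Hg threshold is the conventional one, and the function name is invented for this sketch):

```python
def threshold_label(systolic_mm_hg: float) -> str:
    """Dichotomous classification at the conventional 140 mm Hg cutoff."""
    return "hypertensive" if systolic_mm_hg >= 140 else "normal"

# Two patients separated by only 4 mm Hg, well within visit-to-visit
# measurement variability, land in different categories.
print(threshold_label(138))  # normal
print(threshold_label(142))  # hypertensive
```

A risk algorithm built on the linear progression described above would instead return a continuous score, possibly adjusted for age and comorbidities, rather than a binary label.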

Existing risk tables incorporate the presence or absence of cardiometabolic risk factors but fail to take into account whether elevated lipids, blood pressure, or glucose have already had a pathologic effect. For example, we sometimes see a patient with diabetes, treated or not, compliant or not, who has had the disease for 20 or 30 years with no evidence of end-organ damage at all. In the opposite group, a patient with diabetes may develop renal failure, microalbuminuria, proteinuria, or severe vascular disease within two or three years. Those two patients might have similar levels of hyperglycemia, but they clearly respond in different ways. As such, grading a patient based on a threshold cut point for any given risk factor doesn't tell the internist the whole story, because physicians must also consider the individual patient's response to each risk factor. These considerations are partially incorporated into present guidelines, but not sufficiently.

Existing databases could be used to develop algorithms that take this progression of disease into account and better guide hypertension management. New ambulatory blood-pressure-monitoring devices also measure arterial elasticity; these will unquestionably improve our ability to assess both the status of the arteries and how they change over time with treatment. If we treat a patient and see no improvement in arterial elasticity, or the patient worsens, something is wrong: either the patient is not taking the medication, the choice of medication is inappropriate, or the dose is insufficient.

Evidence is mounting that arterial stiffening is a key predictor of future organ damage, yet a large proportion of cardiologists are not exposed to the information published in recent years on evaluating arterial properties and predicting events. This field has produced very important new findings, with real implications for treatment.

As researchers learn more about isolated systolic hypertension and the role of arterial elasticity, larger, more philosophical questions arise. We have nomograms of arterial stiffness by age group in healthy people, and we usually compare those values with the values measured in patients with problems such as metabolic syndrome, hypertension, or diabetes. But is stiffening of the arteries a physiological process, part of normal aging, or is it truly pathological?

In the future, drugs may be able to slow, stop, or reverse arterial stiffening, but this prospect should give some pause for thought. Imagine somebody who is 80 years old with, for that age group, so-called ‘normal’ stiffness. Will we change the patient’s fate if we improve the status of the arteries at a younger age, even in a ‘normally’ aging subject? Will it prolong life or prevent events? We don’t know.


Amazing Skill of our Brain

Brilliant Blue


Neurobiologists at the Research Institute of Molecular Pathology in Vienna examined how the brain is able to cluster external stimuli into stable categories, and found the answer in the distinct dynamics of neuronal circuits. The journal Neuron published the results in its current issue. How do we manage to recognize a friend’s face regardless of the light conditions, the person’s hairstyle or make-up? Why do we always hear the same words, whether they are spoken by a man or a woman, in a loud or a soft voice? It is due to the amazing skill of our brain in turning a wealth of sensory information into a limited number of defined categories and objects. This ability to produce constancy in a changing world feels natural and effortless to a human, but it is extremely difficult to train a computer to perform the task.

At the Research Institute of Molecular Pathology in Vienna, neurobiologist Simon Rumpel and his post-doc Brice Bathellier have shown that certain properties of neuronal networks in the brain are responsible for the formation of categories. In experiments with mice, they produced a range of sounds and monitored the activity of nerve-cell clusters in the auditory cortex. They found that groups of 50 to 100 neurons displayed only a limited number of different activity patterns in response to the different sounds.

The researchers then selected two basic sounds that produced different response patterns and created linear mixtures of them. When the mixture ratio was varied continuously, the result was not a gradual change in the activity patterns of the nerve cells but rather a sudden transition. Such dynamics are characteristic of the artificial attractor networks that computer scientists have proposed as a solution to the categorization problem. The findings on neuronal activity patterns were corroborated by behavioral experiments with mice. The animals were trained to distinguish between two sounds; they were then exposed to a third sound and their reaction was tracked. Whether the response to the third tone was more like the reaction to the first sound or to the second was used as an indicator of perceptual similarity. By looking at the activity patterns in the auditory cortex, the researchers were able to predict the reaction of the mice.
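The attractor behavior invoked here can be caricatured with a textbook Hopfield network (a minimal sketch, not the authors' model; the patterns, network size and update rule are standard textbook choices invented for illustration): two stored patterns act as discrete stable states, and as an input is swept from one pattern toward the other, the settled network state jumps abruptly instead of morphing continuously.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # model neurons, on the order of the recorded clusters

# Two stored activity patterns, standing in for the two basic sounds
a = rng.choice([-1.0, 1.0], size=n)
b = rng.choice([-1.0, 1.0], size=n)

# Hebbian weight matrix storing both patterns (classic Hopfield network)
W = (np.outer(a, a) + np.outer(b, b)) / n
np.fill_diagonal(W, 0)

def settle(stimulus, steps=20):
    """Run the attractor dynamics from an initial stimulus until it settles."""
    s = np.where(stimulus >= 0, 1.0, -1.0)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

# Present linear mixtures of the two patterns, as in the experiment.
# The settled state snaps to one stored pattern or the other; the
# response changes abruptly, not gradually, as the mix crosses 0.5.
for w in (0.0, 0.25, 0.45, 0.55, 0.75, 1.0):
    out = settle((1 - w) * a + w * b)
    print(f"mix {w:.2f}: overlap with a = {out @ a / n:+.2f}, "
          f"with b = {out @ b / n:+.2f}")
```

Sweeping the mixing weight past the midpoint flips the settled state from one stored pattern to the other in a single step, mirroring the sudden transition observed in the cortical activity patterns.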

The findings, published in the current issue of the journal Neuron, demonstrate that discrete network states provide a medium for category formation in brain circuits. The researchers suggest that a hierarchical structure of discrete representations might be essential for elaborate cognitive functions such as language processing.


Schizophrenia and Cardiovascular Disease

A new study published in the American Journal of Human Genetics expands and deepens the biological and genetic links between cardiovascular disease and schizophrenia. Cardiovascular disease is the leading cause of premature death among schizophrenia patients, who die from heart and blood vessel disorders at double the rate of persons without the mental disorder. “These results have important clinical implications, adding to our growing awareness that cardiovascular disease is under-recognized and under-treated in mentally ill individuals,” says study first author Ole Andreassen, MD, PhD, an adjunct professor at the University of California, San Diego School of Medicine and professor of psychiatry at the University of Oslo. “Its presence in schizophrenia is not solely due to lifestyle or medication side effects. Clinicians must recognize that individuals with schizophrenia are at risk for cardiovascular disease independent of these factors.”

An international team of researchers, led by principal investigator Anders M. Dale, PhD, professor of radiology, neurosciences, psychiatry and cognitive science at the University of California, San Diego School of Medicine, used a novel statistical model to magnify the analytical power of genome-wide association studies. These are studies in which differing bits of DNA sequence, called single nucleotide polymorphisms, are compared across individuals and groups to find common genetic variants that might be linked to a trait or disease. The researchers boosted the power of genome-wide association studies by adding information based on genetic pleiotropy, the concept that at least some genes influence multiple traits or phenotypes.

This approach is different in that it uses all available genetic information for multiple traits and diseases, not just single nucleotide polymorphisms below a given statistical threshold. This significantly increases the power to discover new genes by leveraging the combined evidence across multiple genome-wide association studies of pleiotropic traits and diseases. The scientists confirmed nine single nucleotide polymorphisms linked to schizophrenia in prior studies, but also identified 16 new loci, some of which are also associated with cardiovascular disease. Among these shared risk factors are triglyceride and lipoprotein levels, waist-hip ratio, systolic blood pressure and body mass index.
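The flavor of this pleiotropy-informed boost can be shown with a toy simulation (purely illustrative; the SNP counts and p-value distributions are invented here, and the study's actual method is a far more sophisticated statistical model). Restricting attention to SNPs that are also associated with a second trait enriches the pool for true signals, so a fixed p-value threshold corresponds to a much lower empirical false discovery rate:

```python
import numpy as np

rng = np.random.default_rng(1)
n_snps, n_shared = 100_000, 200

# Simulated GWAS p-values: most SNPs are null for both traits, but a
# small set of pleiotropic SNPs is truly associated with both.
p_scz = rng.uniform(size=n_snps)                 # schizophrenia p-values
p_cvd = rng.uniform(size=n_snps)                 # cardiovascular-trait p-values
p_scz[:n_shared] = rng.uniform(0.0, 1e-3, n_shared)
p_cvd[:n_shared] = rng.uniform(0.0, 1e-3, n_shared)

def empirical_fdr(p, alpha):
    """Expected false positives under the null divided by observed positives."""
    return alpha * len(p) / max((p <= alpha).sum(), 1)

alpha = 1e-4
fdr_all = empirical_fdr(p_scz, alpha)
fdr_conditioned = empirical_fdr(p_scz[p_cvd <= 0.01], alpha)
print(f"estimated FDR at p <= {alpha} over all SNPs: {fdr_all:.3f}")
print(f"restricted to SNPs with CVD-trait p <= 0.01: {fdr_conditioned:.3f}")
```

In the conditioned subset, a given schizophrenia p-value is far more likely to mark a real locus, which is the intuition behind using pleiotropy to recover associations that fall short of genome-wide significance on their own.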

The findings suggest that shared biological and genetic mechanisms can help explain why schizophrenia patients have a greater risk of cardiovascular disease, says study co-author Rahul S. Desikan, MD, PhD, research fellow and radiology resident at the University of California, San Diego School of Medicine.

In addition to schizophrenia, this new analysis method can be used to examine the genetic overlap between a number of diseases and traits. Examining overlap in common variants can shed light on disease mechanisms and help identify potential therapeutic targets for common diseases.

