Thursday, March 21, 2013
A single-take video of the artist Messe Kopp walking through Jerusalem: he is actually walking backwards, but the video is played in reverse, so it appears that he is walking forward while the whole world moves in reverse:
Now watch the video "the wrong way round", or perhaps not; this is how the video was actually taken:
Saturday, March 9, 2013
The psychiatrist held the door open for me, and my first thought as I entered the room was “Where is the couch?” Instead of the expected leather couch, I saw a patient lying on a flat operating table, surrounded by monitors, devices, electrodes, and a team of physicians and nurses. The psychiatrist had asked me if I wanted to join him during an “ECT” for a patient with severe depression. It was the first day of my psychiatry rotation at the VA (Veterans Affairs Medical Center) in San Diego, and as a German medical student I was not yet used to the acronymophilia of American physicians. I nodded without admitting that I had no clue what “ECT” stood for, hoping that it would become apparent once I sat down with the psychiatrist and the depressed patient.
I had big expectations for this clinical rotation. German medical schools allow students to perform their clinical rotations during their final year at academic medical centers overseas, and I had been fortunate enough to arrange for a psychiatry rotation in San Diego. The University of California, San Diego (UCSD) and the VA in San Diego were known for their excellent psychiatry program, and there was the added bonus of living in San Diego. Prior to this rotation in 1995, most of my exposure to psychiatry had taken the form of medical school lectures, theoretical textbook knowledge and rather limited exposure to actual psychiatric patients. This may have been part of the reason why I had a rather naïve and romanticized view of psychiatry. I thought that the mental anguish of psychiatric patients would foster their creativity and that they were somehow plunging from one existentialist crisis into another. I was hoping to engage in some witty repartee with the creative patients and to learn from their philosophical insights about the actual meaning of life. I imagined that interactions with psychiatric patients would be similar to those that I had seen in Woody Allen’s movies: a neurotic but intelligent artist or author would be sitting on a leather couch and sharing his dreams and anxieties with his psychiatrist.
I quietly stood in a corner of the ECT room, eavesdropping on the conversations between the psychiatrist, the patient and the other physicians in the room. I gradually began to understand that “ECT” stood for “Electroconvulsive Therapy”. The patient had severe depression and had failed to respond to multiple antidepressant medications. He would now receive ECT, commonly known as electroshock therapy, a measure reserved for only very severe cases of refractory mental illness. After the patient was sedated, the psychiatrist initiated the electrical charge that induced a small seizure in the patient. I watched the patient's arms and legs jerk and shake. Instead of participating in a Woody-Allen-style discussion with a patient, I had ended up in a scene reminiscent of “One Flew Over the Cuckoo's Nest”, a silent witness to a method that I thought was both antiquated and barbaric. The ECT procedure did not take very long, and we left the room to let the sedation wear off and give the patient some time to rest and recover. As I walked away from the room, I realized that my ridiculously glamorized image of mental illness was already beginning to fall apart on the first day of my rotation.
During the subsequent weeks, I received an eye-opening crash course in psychiatry. I became acquainted with DSM-IV, the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders, which was the sacred scripture of American psychiatry according to which mental illnesses were diagnosed and classified. I learned that ECT was reserved for the most severe cases, and that a typical patient was usually prescribed medications such as anti-psychotics, mood stabilizers or anti-depressants. I was surprised to see that psychoanalysis had gone out of fashion. Depictions of the USA in German popular culture and Hollywood movies had led me to believe that many, if not most, Americans had their own personal psychoanalysts. My psychiatry rotation at the VA took place in the mid-1990s, the boom time for psychoactive medications such as Prozac and the concomitant demise of psychoanalysis.
I found it exceedingly difficult to work with the DSM-IV and to appropriately diagnose patients. The two biggest obstacles I encountered were a) determining cause-effect relationships in mental illness and b) distinguishing between regular human emotions and true mental illness. The DSM-IV criteria for diagnosing a “Major Depressive Episode” included depressive symptoms such as sadness or guilt which were severe enough to “cause clinically significant distress or impairment in social, occupational, or other important areas of functioning”. I had seen a number of patients who were very sad and had lost their job, but I could not determine whether the sadness had impaired their “occupational functioning” or whether they had first lost their job and this had in turn caused profound sadness. Any determination of causality was based on the self-report of patients, and their memories of event sequences were highly subjective.
The distinction between “regular” human emotions and mental illness was another challenge for me, and the criteria in the DSM-IV manual seemed so broad that what I would have considered “sadness” was now being labeled as Major Depression. A number of patients that I saw had severe mental illnesses such as depression, a condition so disabling that they could hardly eat, sleep or work. The patient who had undergone ECT on my first day belonged to that category. However, the majority of patients exhibited only some impairment in their sleep or eating patterns and experienced a degree of sadness or anxiety that I had seen in myself or my friends. I had considered transient episodes of anxiety or unhappiness part of the spectrum of human emotional experience. The problem I saw with the patients in my psychiatry rotation was that they were not only being labeled with a diagnosis such as “Major Depression”, but were then prescribed antidepressant medications without any clear plan to ever take them off the medications. By coincidence, that year I met the forensic psychiatrist Ansar Haroun, who was also on the faculty at UCSD and was able to help me with my concerns. Due to his extensive work in the court system and his rigorous analysis of mental states for legal proceedings, Haroun was an expert on causality in psychiatry as well as on the definition of what constitutes a truly pathological mental state.
Regarding the issue of causality, Haroun explained to me that the complexity of the mind and of mental states makes it extremely difficult to clearly define cause-effect relationships in psychiatry. In infectious diseases, for example, specific bacteria can be identified by laboratory tests as causes of a fever. The fever normally does not precede the bacterial infection, nor does it cause the bacterial infection. The diagnosis of mental illnesses, on the other hand, rests on subjective assessments of patients and is further complicated by the fact that there are no clearly defined biological causes or even objective markers for most mental illnesses. Psychiatric diagnoses are therefore often based on patterns of symptoms and a presumed causality. If a patient exhibits symptoms of a depressed mood and has also lost his or her job during that same time period, psychiatrists have to determine whether the depression was the cause of losing the job or whether the job loss caused the depressive symptoms. In my limited experience with psychiatry and the many discussions I have had with practicing psychiatrists, it appears that the leeway given to psychiatrists to assess cause-effect relationships may result in an over-diagnosis of mental illnesses or an over-estimation of their impact.
I also learned from Haroun that the question of how to distinguish the spectrum of “regular” human emotions from actual mental illness had resulted in a very active debate in the field of psychiatry. Haroun directed me towards the writings of Thomas Szasz, who was a brilliant psychiatrist but also a critic of psychiatry, repeatedly pointing out the limited scientific evidence for diagnoses of mental illness. Szasz’s book “The Myth of Mental Illness” was first published in 1961 and challenged the foundations of modern psychiatry. One of his core criticisms of psychiatry was that his colleagues had begun to over-diagnose mental illnesses by blurring the boundaries between everyday emotions and true diseases. Every dis-ease (discomfort) was being turned into a disease that required a therapy. The reasons for this overreach by psychiatry were manifold, ranging from society and the state trying to regulate what was acceptable or normal behavior to psychiatrists and pharmaceutical companies that would benefit financially from the over-diagnosis of mental illness. An excellent overview of his essays can be found in his book “The Medicalization of Everyday Life”. Even though Szasz passed away last year, psychiatrists and researchers are now increasingly voicing their concerns about the direction that modern psychiatry has taken. Allan Horwitz and Jerome Wakefield, for example, have recently published “The Loss of Sadness: How Psychiatry Transformed Normal Sorrow into Depressive Disorder” and “All We Have to Fear: Psychiatry's Transformation of Natural Anxieties into Mental Disorders”. Unlike Szasz, who went as far as denying the existence of mental illness, Horwitz and Wakefield have taken a more nuanced approach. They accept the existence of true mental illnesses, admit that these illnesses can be disabling and acknowledge that patients who are afflicted by mental illnesses do require psychiatric treatment. However, Horwitz and Wakefield criticize the massive over-diagnosis of mental illness and point out the need to distinguish true mental illnesses from normal sadness and anxiety.
Before I started my psychiatry rotation in San Diego, I had been convinced that mental illness fostered creativity. I had never really studied the question in much detail, but there were constant references in popular culture, movies, books and TV shows to the creative minds of patients with mental illness. The supposed link between mental illness and creativity was so engrained in my mind that the word “psychotic” automatically evoked images of van Gogh’s paintings and other geniuses whose creative minds were fueled by the bizarreness of their thoughts. Once I began seeing psychiatric patients who truly suffered from severe disabling mental illnesses, it became very difficult for me to maintain this romanticized view of mental illness. People who truly suffered from severe depression had difficulties even getting out of bed, getting dressed and meeting their basic needs. It was difficult to envision someone suffering from such a disabling condition being able to write large volumes of poetry or to analyze the data from ground-breaking experiments. The brilliant book “Creativity and Madness: New Findings and Old Stereotypes” by Albert Rothenberg helped me understand that the supposed link between creativity and mental illness was primarily based on myths, anecdotes and a selection bias in which the creative accomplishments of patients with mental illness were glorified and attributed to the illness itself. Geniuses who suffered from schizophrenia or depression were not creative because of their mental illness but in spite of their mental illness.
I began to realize that the over-diagnosis of mental illness and the departure from causality that had become characteristic of contemporary psychiatry also helped foster the myth that mental illness enhances creativity. Many beautiful pieces of literature or art can be inspired by emotional states such as the sadness of unrequited love or the death of a loved one. Creativity is often a response to a state of discomfort or dis-ease, an attempt to seek out comfort. However, if definitions of mental illness are broadened to the extent that nearly every such dis-ease is considered a disease, one can easily fall into the trap of believing that mental illness indeed begets creativity. With respect to establishing causality, Rothenberg found that, contrary to the prevailing myth, mental illness was actually a disabling condition that prevented creative minds from completing their artistic or scientific tasks. A few years ago, I came across “Poets on Prozac: Mental Illness, Treatment, and the Creative Process”, a collection of essays written by poets who suffer from mental illness. The personal accounts of most of these poets suggest that their mental illnesses did not help them write their poetry, but actually acted as major hindrances. It was only when their illness was adequately treated and they were in a state of remission that they were able to write poems. A recent comprehensive analysis of studies that attempt to link creativity and mental illness can be found in the excellent textbook “Explaining Creativity: The Science of Human Innovation” by Keith Sawyer, who concludes that there is no scientific evidence for the claim that mental illness promotes creativity. He also points to a possible origin of this myth:
The mental illness myth is based in cultural conceptions of creativity that date from the Romantic era, as a pure expression of inner inspiration, an isolated genius, unconstrained by reason and convention.

I assumed that the myth had finally been laid to rest, but, to my surprise, I came across the headline Creativity 'closely entwined with mental illness' on the BBC website in October 2012. The BBC story was referring to the large-scale Swedish study “Mental illness, suicide and creativity: 40-Year prospective total population study” by Simon Kyaga and his colleagues at the Karolinska Institute, published online in the Journal of Psychiatric Research. The BBC news report stated “Creativity is often part of a mental illness, with writers particularly susceptible, according to a study of more than a million people” and continued:
Lead researcher Dr Simon Kyaga said the findings suggested disorders should be viewed in a new light and that certain traits might be beneficial or desirable. For example, the restrictive and intense interests of someone with autism and the manic drive of a person with bipolar disorder might provide the necessary focus and determination for genius and creativity. Similarly, the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece.

These statements went against nearly all the recent scientific literature on the supposed link between creativity and mental illness and once again rehashed the tired, romanticized myth of the mentally ill genius. I was puzzled by these claims and decided to read the original paper. There was the additional benefit of learning more about the mental health of Swedes, because my wife is Swedish-American. It never hurts to know more about the mental health or the creative potential of one’s spouse.
Kyaga’s study did not measure creativity itself, but merely assessed correlations between self-reported “creative professions” and diagnoses of mental illness in the Swedish population. Creative professions included scientific professions (primarily scientists and university faculty members) as well as artistic professions such as visual artists, authors, dancers and musicians. The deeply flawed assumption of the study was that if an individual has a “creative profession”, he or she has a higher likelihood of being a creative person. Accountants were used as a “control”, implying that being an accountant does not involve much creativity. This may hold true for Sweden, but the creativity of accountants in the USA has been demonstrated by the recent plethora of financial scandals. The size of the Kyaga study was quite impressive, involving over one million patients as well as data on the patients' relatives. The fact that Sweden has a total population of about 9.5 million and that more than one million of its adult citizens are registered in a national database as having at least one mental illness is both remarkable and worrisome.
The main outcome was the likelihood that patients with certain mental illnesses such as depression, schizophrenia or anxiety disorders were engaged in a “creative profession”. The results of the study directly contradicted the BBC hyperbole:
We found no positive association between psychopathology and overall creative professions except for bipolar disorder. Rather, individuals holding creative professions had a significantly reduced likelihood of being diagnosed with schizophrenia, schizoaffective disorder, unipolar depression, anxiety disorders, alcohol abuse, drug abuse, autism, ADHD, or of committing suicide.

Not only did the authors fail to find a positive correlation between creative professions and mental illnesses (with the exception of bipolar disorder), they actually found the opposite of what they had suspected: patients with mental illnesses were less likely to engage in a creative profession.
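For readers wondering what a "reduced likelihood" means in a registry study like this one, such analyses typically report odds ratios. Here is a minimal sketch with entirely made-up counts (my own illustration, not the paper's data or code), showing how an odds ratio below 1.0 expresses a reduced likelihood of holding a creative profession among patients:

```python
# Illustrative odds-ratio calculation from a 2x2 table of
# diagnosis vs. profession. All counts below are invented.

def odds_ratio(patients_creative, patients_other, controls_creative, controls_other):
    """Odds of a creative profession among patients, divided by the odds among controls."""
    odds_patients = patients_creative / patients_other
    odds_controls = controls_creative / controls_other
    return odds_patients / odds_controls

# Hypothetical counts: (creative profession, other profession)
patients = (300, 9700)        # people with a given diagnosis
controls = (4000, 96000)      # people without the diagnosis

or_value = odds_ratio(patients[0], patients[1], controls[0], controls[1])
print(f"odds ratio = {or_value:.2f}")  # ~0.74 here: a reduced likelihood, as the study found
```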
Their findings do not come as a surprise to anyone who has been following the scientific literature on this topic. After all, the disabling features of mental illness make it very difficult to maintain a creative profession. Kyaga and colleagues also presented a contrived subgroup analysis to test whether any group within the “creative professions” showed a positive correlation with mental illness. It appears contrived because they broke down only the artistic professions and did not perform a similar analysis for the scientific professions. Among all these subgroup analyses, the researchers found a positive correlation between the self-reported profession ‘author’ and a number of mental illnesses. However, they also found that other artistic professions did not show such a positive correlation.
How the results of this study gave rise to the blatant misinterpretation reported by the BBC that “the disordered thoughts associated with schizophrenia might spark the all-important originality element of a masterpiece” is a mystery in itself. It shows the power of the myth of the mad genius and how myths and convictions can tempt us to misinterpret data in a way that maintains the mythic narrative. The myth may also be an important component in the attempt to medicalize everyday emotions. The notion that mental illness fosters creativity could make the diagnosis more palatable. You may be mentally ill, but don’t worry, because it might inspire you to paint like van Gogh or write poems like Sylvia Plath.
A study of the prevalence of mental illness published in the Archives of General Psychiatry in 2005 estimated that roughly half of all Americans will have been diagnosed with a mental illness by the time they reach the age of 75. This estimate was based on the DSM-IV criteria for mental illness, but the newer DSM-V manual will be released in 2013 and is likely to further expand the diagnosis of mental illness. The DSM-IV criteria had made an allowance for bereavement to avoid diagnosing people who were profoundly sad after the loss of a loved one with the mental illness depression. This bereavement exemption will likely be removed from the new DSM-V criteria, so that the diagnosis of major depression can be made even during the grieving period. The small group of patients who are afflicted with disabling mental illness do not find their suffering to be glamorous, and the large number of patients experiencing normal sadness or anxiety end up being inappropriately diagnosed with mental illness under broad and lax criteria of what constitutes an illness. Are these patients comforted by romanticized myths about mental illness? The continuing over-reach of psychiatry in its attempt to medicalize emotions, supported by a pharmaceutical industry that reaps large profits from this over-reach, should be of great concern to all of society. We need to wade through the fog of pseudoscience and myths to consider the difference between dis-ease and disease and the cost of medicalizing human emotions.
Image Credit: Wikimedia Commons Public Domain ECT machine (1960s) by Nasko and Self-Portrait of van Gogh.
An earlier version of this article was first published on the 3Quarksdaily Blog.
Kyaga, S., Landén, M., Boman, M., Hultman, C., Långström, N., & Lichtenstein, P. (2013). Mental illness, suicide and creativity: 40-year prospective total population study. Journal of Psychiatric Research, 47(1), 83-90. DOI: 10.1016/j.jpsychires.2012.09.010
Tuesday, March 5, 2013
When we observe an interaction between two other human beings (Person A and Person B), we sometimes draw conclusions about the personality traits or character of these two individuals. For example, if we see that Person A is being rude to Person B, we may be less likely to trust Person A, even though we are merely "third-party" evaluators, i.e. not directly involved in the interaction. Multiple studies with humans have already documented such third-party social evaluation, which can even occur among children. A study published in 2010 showed that 3-year-old children were less likely to help adults who had previously acted in a harmful manner in front of the kids, i.e. torn up a picture drawn by another adult in a staged experiment.
Do animals that observe humans also conduct such third-party social evaluations of humans? For the recent study "Third-party social evaluation of humans by monkeys", published in Nature Communications, James Anderson and colleagues staged interactions with human actors in front of tufted capuchin monkeys (Cebus apella). The researchers found that the monkeys indeed evaluate humans after witnessing third-party interactions involving either helpful interventions or a failure to help fellow humans.
In front of each monkey, two actors performed either "helper" sessions or "non-helper" sessions. In the "helper" sessions, Actor A tried to get a toy out of a container and requested help from Actor B, who complied and helped Actor A. In the "non-helper" sessions, Actor B refused to help. After the sessions, both actors offered a piece of food to the monkey. In the helper sessions, monkeys readily accepted food from both actors. In the non-helper sessions, on the other hand, monkeys accepted food more frequently from Actor A (the requester of help) than from Actor B (the non-helper).
The researchers also added an interesting twist to the experiment by creating a situation in which both actors had their own containers. The researchers then created an "occupied non-helper" condition in which Actor B did not even acknowledge Actor A's request because Actor B was preoccupied with his or her own container. In this "occupied non-helper" situation, the monkeys accepted food from both actors equally. In an "explicit non-helper" condition, Actor B acknowledged the request for help from Actor A but explicitly rejected it. In this latter situation, the monkeys were less likely to accept food from Actor B.
This study is not the first to investigate third-party social evaluation of humans by non-human primates, but its strength lies in its meticulous design. Both actors offered the same type and amount of food to the monkeys, so the most likely explanation for the monkeys' choices was indeed the actors' interactions with each other.
The research presented in this study gives us a fascinating insight into third-party social evaluation by non-human primates. The fact that the monkeys discriminated between occupied non-helpers (i.e. people who were too busy to notice the request for help) and explicit non-helpers (i.e. people who were just plain mean: they noticed the call for help but rejected it) shows a very fine-tuned analysis of human interactions. It is a good reminder of how human interactions can leave lasting impressions on fellow beings, humans and non-humans alike.
Image credit: Adult Tufted Capuchin by Charles J Sharp, via Wikimedia Commons (Creative Commons License).
Monday, February 25, 2013
The concept of the “superiority illusion” refers to the fact that people tend to judge themselves as superior to the average person when it comes to positive characteristics such as intelligence, desirability or other personality traits. This is mathematically impossible: in a normally distributed population the mean equals the median, so at most half of the people can be above average for any given trait. The “superiority illusion” belongs to a family of positive illusions, such as the “optimism bias”, which is characterized by an unrealistically positive outlook regarding our future. It is thought that such positive illusions may help ward off depressive symptoms and promote mental health.
The neural mechanisms responsible for the “superiority illusion” are poorly understood. The recent study “Superiority illusion arises from resting-state brain networks modulated by dopamine”, published in the Proceedings of the National Academy of Sciences by Yamada and colleagues, used resting-state functional MRI (fMRI) and PET imaging of the brain in 24 male subjects without known psychiatric or neurologic disease to investigate the neural mechanisms involved in the generation of the superiority illusion. Their findings suggest that the degree of superiority illusion correlates negatively with the functional connectivity between two parts of the brain (the anterior cingulate cortex and the striatum) and that the proposed mediator is the neurotransmitter dopamine. This would mean that increasing dopamine levels in the striatum could promote a person’s superiority illusion.
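As a rough illustration of the kind of across-subject analysis behind such a finding, here is a minimal sketch using simulated data (every number below is my own assumption, not the study's data or method): for each subject, pair a connectivity measure with a behavioral superiority-illusion score and test whether the relationship is negative.

```python
# Illustrative cross-subject correlation with simulated data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects = 24  # same sample size as the study; everything else is invented

# Simulated ACC-striatum functional connectivity per subject
connectivity = rng.normal(0.5, 0.1, n_subjects)
# Simulated superiority-illusion score, built to decrease with connectivity
illusion = 10.0 - 12.0 * connectivity + rng.normal(0, 0.5, n_subjects)

r, p = pearsonr(connectivity, illusion)
print(f"r = {r:.2f}, p = {p:.4f}")  # a negative r, the pattern the paper reports
```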
One limitation of the study was that the findings were purely associative and did not prove an actual causal link between dopamine levels and the superiority illusion. Another limitation of the study was that the researchers only performed imaging at one time point and did not track whether changes in the self-perception of superiority in the subjects (over time or in response to certain interventions) also correlated with changes in the brain imaging.
Despite these limitations, the study is quite novel in that it attempts to define the neural mechanism underlying the “superiority illusion”. The fact that it points to dopamine as a mediator could have important implications. The authors of the paper believe that the “superiority illusion” promotes self-esteem and is an innate counterbalance to depressive symptoms. If further studies confirm a causal role for dopamine in promoting the “superiority illusion”, one could conceivably design novel pharmacologic therapies that target the dopaminergic system and help patients with severe depression who suffer from low self-esteem.
However, a lot more mechanistic research needs to be conducted before pharmacologic dopaminergic stimulation can be pursued as a treatment for depression. We also need to be aware of the fact that psychiatric medications are often over-prescribed. If newer medications become available which are able to raise self-esteem and foster “superiority illusions”, they might be unnecessarily prescribed to many people who do not suffer from true major depression. The last thing we need is a world in which everyone becomes even more convinced how superior and wonderful they are.
Image credit: Striatum from Anatomography maintained by Life Science Databases (LSDB) via Wikimedia Commons (Creative Commons License).
Wednesday, February 13, 2013
WASHINGTON, D.C.- Plagiarism is back in the headlines. The German Education Minister Annette Schavan recently resigned because of allegations of plagiarism in her doctoral dissertation. There was also significant outrage when it became public that the now discredited science journalist Jonah Lehrer was paid $20,000 to speak at the Knight Foundation about plagiarism and other forms of journalistic misconduct that he had engaged in.
Christopher Robin of the Winnie Foundation feels that plagiarists are unfairly maligned. His foundation conducted a survey, which proved that plagiarism scandals usually result in weeks of extensive reporting and investigations, thus providing new job opportunities for investigative journalists and academic committees. "Plagiarists create jobs for others. They should be seen as heroes and not as villains, especially during a recession when there aren't too many jobs out there."
Robin also said that plagiarism may soon become a highly attractive career for US college graduates. "Lehrer is becoming an excellent role model. He shows that you can earn good money while you are engaging in plagiarism. Even if you are caught, you still receive large honoraria to speak about your misconduct. Plagiarists have excellent job security."
Meanwhile, the Cocaine Retailer Association of Chicago (CRAC) says that at least three of its members are applying to the Knight Foundation for an opportunity to give a lecture. "They would like to speak about how wrong it is to sell drugs and some of them would be willing to do it for only half of the Lehrer honorarium."
Image Credit: Bengt Ruda's chair and a plagiarized version via Wikimedia Commons
Tuesday, February 12, 2013
Teju Cole writes in the New Yorker about the German author W.G. Sebald:
Throughout his career, W. G. Sebald wrote poems that were strikingly similar to his prose. His tone, in both genres, was always understated but possessed of a mournful grandeur. To this he added a willful blurring of literary boundaries and, in fact, almost all his writing, and not just the poetry and prose, comprised history, memoir, biography, autobiography, art criticism, scholarly arcana, and invention. This expert mixing of forms owed a great deal to his reading of the seventeenth-century melancholics Robert Burton and Thomas Browne, and Sebald’s looping sentences were an intentional homage to nineteenth-century German-language writers like Adalbert Stifter and Gottfried Keller. But so strongly has the style come to be associated with Sebald’s own work that even books that preceded his, such as those by Robert Walser and Thomas Bernhard, can seem, from our perspective as readers of English translations, simply “Sebaldian.”

Sebald’s reputation rests on four novels—“Vertigo,” “The Emigrants,” “The Rings of Saturn,” and “Austerlitz”—all of them reflections on the history of violence in general, and on the legacy of the Holocaust in particular. Our sense of this achievement has been enriched by his other works: the ones published in his lifetime (the lectures “On the Natural History of Destruction” and the long poem “After Nature”), and those that were released posthumously (including the essay collection “Campo Santo,” and the volumes of short poems “Unrecounted” and “For Years Now”). Sebald’s shade, like Roberto Bolaño’s, gives the illusion of being extraordinarily productive, and the publication now of “Across the Land and the Water,” billed as his “Selected Poems 1964-2001,” does not feel surprising. Ten years on, we are not quite prepared for him to put down his pen.
Read more here
Nasim Saber writes in Qantara:
He was a contemporary of the Indian pacifist Mahatma Gandhi and always preached an Islam of nonviolence: Abdul Ghaffar Khan, the man who was venerated by the Pashtuns as "King of Chiefs", died 20 years ago in Peshawar.

Abdul Ghaffar Khan was born in 1890 in Charsadda near Peshawar, in the British-occupied northwest sector of the Indian subcontinent. He was a member of the Mohammadzai family, a respected Pashtun dynasty, to which Zahir Shah, the last king of Afghanistan, also belonged.
Abdul Ghaffar Khan grew up to become a pioneer of nonviolence in a region plagued by wars. The Pashtuns still revere him today as "Badshah Khan" (King of Chiefs).
In 1910, when he was only 20 years old, Abdul Ghaffar Khan built a school near Utmanzai in the northwest region of what is today Pakistan. He went on to found the "Anjuman-e islah ul Afghana" (Afghan Reform Association) and to publish the magazine "Pashtoon" in order to reach the masses under British domination.
Read more here
Image Credit: Abdul Ghaffar Khan and Gandhi in 1940, Public Domain image via Wikimedia
Monday, February 11, 2013
Whether we cruise the internet, turn on the TV or simply open up our email inbox, we are bound to encounter advice regarding obesity and weight loss. The problem is that many of the circulating opinions about obesity and weight gain are only poorly supported by medical and scientific evidence. The recent paper “Myths, Presumptions, and Facts about Obesity”, published in the New England Journal of Medicine on January 31, 2013 by Krista Casazza and colleagues, examines popular notions about obesity and tests whether they are actually backed up by peer-reviewed, evidence-based studies. Their findings are quite surprising and unravel many of the “myths” surrounding obesity and weight problems. The authors refer to these notions as “myths” because they were unable to find adequate evidence to back them up and even found some evidence that refutes them. Unfortunately, the data presented by the authors do not always provide definitive evidence either, so it may be rather premature to dismiss these widely held beliefs as “myths”.
Here are the seven “myths” about obesity and weight gain that the authors discuss:
Myth number 1: Small sustained changes in energy intake or expenditure will produce large, long-term weight changes.
The authors claim that this is a myth, because it is based on the assumption that small dietary or activity changes yield benefits that continue to accumulate and result in large changes. They think that these calculations overestimate the achieved weight loss, because they do not adequately take into account that the metabolism adapts to the ongoing weight loss. A very obese person with a high caloric intake may respond strongly to a minor increase in daily exercise levels, but the degree of weight loss will decrease over time.
I have to disagree with Casazza and colleagues on this point, because I think that their analysis does not refute the idea of small sustained changes having long-term benefits. One can disagree about the magnitude of the long-term benefit, but there is still a long-term benefit.
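To make this concrete, here is a toy calculation (a minimal sketch with parameters I have assumed for illustration, not the authors' mathematical model) contrasting the naive "static" rule, under which a sustained 100 kcal daily deficit accumulates forever, with a crude adaptive model in which energy expenditure falls as weight is lost:

```python
# Toy comparison of a static vs. an adaptive energy-balance model.
# All parameter values are illustrative assumptions, not clinical constants.

KCAL_PER_KG = 7700   # rough energy content of a kg of body tissue
DEFICIT = 100        # assumed sustained daily deficit in kcal
ADAPTATION = 25      # assumed kcal/day of expenditure lost per kg of weight lost

def naive_loss_kg(days):
    """Static rule: every 7,700 kcal of cumulative deficit removes one kg, forever."""
    return DEFICIT * days / KCAL_PER_KG

def adaptive_loss_kg(days):
    """Toy dynamic model: the effective deficit shrinks as weight drops."""
    lost = 0.0
    for _ in range(days):
        effective_deficit = max(DEFICIT - ADAPTATION * lost, 0.0)
        lost += effective_deficit / KCAL_PER_KG
    return lost

for years in (1, 3, 5):
    d = years * 365
    print(f"{years} y: naive {naive_loss_kg(d):.1f} kg, adaptive {adaptive_loss_kg(d):.1f} kg")
```

Under these assumptions, the naive rule predicts roughly 24 kg of weight loss after five years, while the adaptive model plateaus near 4 kg: a much smaller benefit than the static rule suggests, which is the authors' point, but still a real long-term benefit, which is mine.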
Myth number 2: Setting realistic goals for weight loss is important, because otherwise patients will become frustrated and lose less weight.
Casazza and colleagues cite multiple studies which show that ambitious weight loss goals may be associated with better outcomes.
Myth number 3: Large, rapid weight loss is associated with poorer long-term weight-loss outcomes, as compared with slow, gradual weight loss.
The authors point to a meta-analysis (a summary analysis of multiple published studies) which showed that very-low-energy diets (rapid weight loss) and low-energy diets are similarly successful in terms of achieving weight loss.
Myth number 4: It is important to assess the stage of change or diet readiness in order to help patients who request weight-loss treatment.
The evidence does not support the need to wait until people are “ready” for weight loss. It may be best to start right away.
Myth number 5: Physical-education classes, in their current form, play an important role in reducing or preventing childhood obesity.
The authors of the paper summarize the results of multiple studies which did not show any statistically significant and consistent benefit of increasing physical education time in school on childhood obesity. They state that there is probably a level of activity that will be beneficial, but that this level may not be achieved in the limited amount of time that children have in school for physical education.
The problem with the analysis by Casazza and colleagues is that they dismiss the findings as “inconsistent”, but this inconsistency may reflect the fact that some children benefit from the intervention while others do not. One study, for example, showed a benefit in girls who were overweight, but not in boys. This “inconsistency” does not necessarily invalidate the notion; it merely means that we need to identify the group of children most likely to benefit and perhaps modify the type and duration of physical education in schools to help additional groups of children.
Myth number 6: Breast-feeding is protective against obesity.
Casazza and colleagues reviewed all the major studies in this area and found no significant evidence that breast-feeding children protects them against obesity, but they concede that breast-feeding may be associated with other benefits for the child, unrelated to obesity.
Myth number 7: A bout of sexual activity burns 100 to 300 kcal for each participant.
The authors calculate the amount of energy expended during sexual activity and estimate that the actual amount is probably closer to 20 to 30 kcal, not 100 to 300 kcal.
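A back-of-the-envelope version of this kind of estimate (my own illustration using the standard MET formula; the intensity, body weight and duration below are assumed values, not the authors' exact figures) shows how one arrives at a number in the 20 to 30 kcal range:

```python
# Energy cost in kcal ~= METs x body weight in kg x duration in hours.
mets = 3.0        # assumed intensity of sexual activity, roughly that of brisk walking
weight_kg = 70.0  # assumed body weight
minutes = 6.0     # assumed duration, in line with published estimates of the average

kcal = mets * weight_kg * (minutes / 60.0)
print(f"~{kcal:.0f} kcal")  # ~21 kcal, consistent with the 20-30 kcal figure
```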
These are the seven “myths” that the authors claim to have debunked. I also think that it is important to note the disclosures at the end of the article, which show that the authors have very strong ties to food manufacturers. Here are the financial disclosures for just one of the authors:
“Dr. Astrup reports receiving payment for board membership from the Global Dairy Platform, Kraft Foods, Knowledge Institute for Beer, McDonald’s Global Advisory Council, Arena Pharmaceuticals, Basic Research, Novo Nordisk, Pathway Genomics, Jenny Craig, and Vivus; receiving lecture fees from the Global Dairy Platform, Novo Nordisk, Danish Brewers Association, GlaxoSmithKline, Danish Dairy Association, International Dairy Foundation, European Dairy Foundation, and AstraZeneca; owning stock in Mobile Fitness”
These financial ties do not invalidate the analysis, but they should be considered when interpreting the results.
Overall, I think this is an important paper, because it shows us that we often operate under certain assumptions about obesity and weight loss without adequate evidence to back them up. This highlights the need for more unbiased research in this area. However, I am disappointed by some of the analyses, in which the authors summarily dismiss a belief as a “myth” just because there are some inconsistencies or differences in estimated benefits. Instead of using the somewhat sensationalist term “myth”, it would have been better if the authors had focused on pointing out the weaknesses in the current evidence and the need for more studies.
Image credit: Painting “Schlaraffenland” (“The Land of Cockaigne”, 1567) by Pieter Bruegel the Elder, via Wikimedia Commons.

Casazza K, Fontaine KR, Astrup A, Birch LL, Brown AW, Bohan Brown MM, Durant N, Dutton G, Foster EM, Heymsfield SB, McIver K, Mehta T, Menachemi N, Newby PK, Pate R, Rolls BJ, Sen B, Smith DL Jr, Thomas DM, & Allison DB (2013). Myths, presumptions, and facts about obesity. The New England Journal of Medicine, 368(5), 446-54. PMID: 23363498
WASHINGTON, D.C.- Pope Benedict XVI surprised everyone by announcing that he is going to retire. He will be the first pope in nearly 600 years to resign; most popes retain their office until they die.
The Council Of Muslim-Americans (COMA) responded to this announcement by pointing out a long tradition of anti-Muslim discrimination among Catholic clergy. At a press conference, the COMA spokesperson Abdullah Abdullah said that this was an opportunity for the Catholic Church to prove that it has moved beyond Islamophobia.
"We have reviewed the religious affiliations of all the previous popes and we noted that none of them have been Muslim. This is clearly a sign of anti-Muslim discrimination and Islamophobia. There has also never been an American pope and none of the popes have been women. As proud Americans and advocates for the rights of women, we believe that the Vatican should engage in papal equality and choose an American Muslim woman as the next pope."
Image Credit: Pope Innocent, Fresco at the cloister Sacro Speco via Wikimedia Commons