They were also causing a lot of crime, which the authorities tried to conceal by arguing that most addicts had convictions before they became addicted, so that depriving them of a legal supply of drugs made no difference to their criminality. All this was in line with an increasingly repressive regime in the United States, from which the British Government clearly took its cues. In line with this political and publicity campaign, treatment for addicts, where it was available, increasingly became a matter of instant abstention and appeals to moral fibre.
Some consultants were claiming 90 per cent success in withdrawing their clinic patients from opiate use. These claims were not credible. Under the published Guidelines of Good Clinical Conduct in the Treatment of Drug Misuse, patients were to be treated similarly regardless of whether their addiction had lasted for weeks or for decades. The maximum initial prescribed dose of methadone, 80 milligrams, was under 5 per cent of what the clinics had been prescribing in earlier years. The guidelines encouraged general practitioners to include addicts among their patients: as the number of addicts escalated, it was becoming harder to regard them as an excluded, abnormal minority requiring specialist services.
By the time of the Battle of the Somme, the rate of gas gangrene had declined by 75 percent compared with the previous year. The overall documented rate of hospital deaths from wounds in the war was about 4 percent. Research in pharmacology was also substantially catalyzed, although no new breakthrough drugs were developed to augment the medical armamentarium. Most importantly, the need to treat sick and wounded soldiers with remedies of like toxicity facilitated the standardization of drugs.
Yet it was not merely for medicinal purposes that intoxicants were used on the battlefield. This article is an attempt to provide an overview of the multifaceted pharmacological world of the war. It begins with a discussion of the purely therapeutic application of morphine to kill pain and of the use of anesthetics in surgery. Next, the extensive and common, though not universal, administration of alcohol rations by the belligerent states is presented in the context of the prohibitionist measures introduced by many governments on their home fronts.
The final section details the military use of cocaine, the plots of German narcotic subversion against France and Britain, and the wartime cocaine panic in the latter, which fostered the introduction of a national drug control regime. By the outbreak of war, little had changed in the art of analgesia since morphine had entered into use in the 19th century and been regularly employed during the American Civil War. The narcotic was routinely administered to subdue the pain of the wounded.
Given the enormous number of servicemen who were wounded, one of the predominant features of the First World War was the body-in-pain. Shrapnel from exploding shells ripped flesh and shattered bones, causing ghastly wounds. Injuries to the kidneys, lungs, and bladder brought agonizing suffering, as did facial mutilations.
When front-line medics administered morphine to the injured during the initial treatment and dressing of wounds, they marked the men's foreheads with crosses in indelible ink to prevent overdose at later stages of medical care.
A widely accepted recommended dose was one-quarter of a grain (sixteen milligrams). Problems, however, abounded: patients were mistakenly issued the narcotic more than once, and the badly injured were deliberately given larger amounts to ease their severe pain. Moreover, if administered not as an injection but in tablet form, morphine was absorbed at unpredictable rates. The risk arose, too, when the wounded medicated themselves with morphine while awaiting professional aid. Overdose was harmful because it obscured the clinical picture and hampered proper diagnosis.
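The quarter-grain figure can be checked against the standard apothecaries' conversion (one grain is 64.8 milligrams; the conversion constant is supplied here, not stated in the source):

```python
GRAIN_MG = 64.8  # one apothecaries' grain in milligrams (standard conversion)

def grains_to_mg(grains: float) -> float:
    """Convert an apothecaries' grain dose to milligrams."""
    return grains * GRAIN_MG

# The quarter-grain field dose of morphine quoted in the text:
print(round(grains_to_mg(0.25)))  # 16
```

Rounded to the nearest milligram, one-quarter grain is indeed the sixteen milligrams the text reports.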
Morphine continued to be administered regularly, frequently for weeks on end, during patients' subsequent treatment and recovery. Unsurprisingly, then, as in the earlier armed conflicts of the 19th century, particularly the American Civil War and the wars of German unification, many veterans developed an addiction.
Perhaps the most famous of these patients-turned-morphinists was Hermann Goering, a fighter ace and later a powerful member of the Nazi elite and commander-in-chief of the Luftwaffe. The narcotic was employed to kill not only pain but sometimes also the hopelessly maimed servicemen themselves. If a soldier had no chance of survival or had received horrendous injuries, doctors would, not infrequently, order a humanitarian dose of morphine.
Often unable to perform euthanasia themselves, they would ask nurses or fellow soldiers to administer the lethal injection. Morphine sulphate was also used, alongside atropine, omnopon, and sometimes ethyl chloride, as a preliminary drug before anesthesia in surgery. The practice of anesthesia itself had not advanced considerably since the mid-nineteenth century, but the war fueled some progress. On the one hand, the same substances as before were employed: ether, chloroform, and nitrous oxide for general anesthesia, and Novocain, adrenaline, or cocaine hydrochloride as local anesthetics.
On the other hand, they were administered with greater awareness and attention than in the past. Experimentation and debate over a universal method of anesthesia endured as military surgeons began to recognize a close connection between anesthesia and the rates of mortality and morbidity. It became apparent that some patients died not from their wounds but because of the inappropriate use of anesthesia: the wrong choice of agent, dose, or method of delivery, be it gas-and-oxygen, spinal, or rectal.
Yet it was not only the physical body under surgery that needed to be desensitized; the psyche of the fighting men called for strong detachment, too. Widely sanctioned by cultural and social practices, drinking was inherent to soldiering and served three general purposes.
The first was medical: to anesthetize, disinfect, and cure, for alcohol was believed to have potent and versatile healing properties. The second was mental-therapeutic: to numb emotions, suppress fear, stress, and bad memories, to relax, and to reward the hardships of combat. The third was enhancement: to inspire courage and keep soldiers going. In the course of the First World War, governmental rations of alcohol and self-medicated drinking served all these time-honored functions.
Moderate consumption was, in general, recognized as desirable, since it raised the fighting spirit and preserved morale. As alcohol boosted self-confidence and increased the willingness to take risks, inebriated soldiers felt invincible and could more easily go over the top into hostile fire. Finally, alcohol helped repress traumatic memories and cope with the actualities of modern war. While most of the belligerent parties provided their troops with regular rations of alcohol, on the home fronts the conflict worked in favor of temperance movements.
The motivations for introducing state restrictions on the manufacture, sale, and consumption of alcoholic beverages varied. Dominant, though, was disapproval of wasting crucial resources at a time of ultimate national emergency: uncontrolled drinking could have hampered the general mobilization of societies and jeopardized their productive energies. On ethical grounds, leisure drinking was presented as downright immoral and highly improper. Civilians were expected to sympathize with their troops, who were expected to make a supreme, if not always sober, sacrifice. And finally, because the consumption of alcohol in the strained time of war could get out of control, governments felt obliged to undertake measures to prevent social decadence and preserve public order.
Overall, however, these measures turned out to be counter-productive. The Russians resorted en masse to running hooch, while state income from the sale of alcohol dropped considerably, straining an already stretched war budget. One contemporary commentator observed that "in sternly prohibiting the sale of spirituous liquor Russia has already vanquished a greater foe than the Germans."
From the time of Peter I, Emperor of Russia, a soldierly allowance of vodka had been customary. Initially issued in the navy three times a week, the allowance developed into a daily governmental ration known as the charka. It later became commonplace in the infantry as well, but this long-standing practice was eventually terminated, for two main reasons. The first was the diminished fighting power of the Russian army, its poor combat efficiency, and jeopardized discipline in the aftermath of the humiliating defeat by Japan. The second was the eroding authority of individual alcoholic officers and of the officer corps as a whole.
Hence a Russian soldier was expected to fight the Austro-Hungarians and Germans without the formerly traditional provision of vodka and, owing to tsarist restrictions on military supplies, with fairly limited access to drink. Not surprisingly, then, when the fighting stopped on the Eastern Front, soldiers eagerly traded bread, sugar, and other items with their German counterparts for some alcohol.
They rejoiced prematurely, however, because soldiers would find no ally in the communists, for whom alcoholism epitomized the oppressive tsarist order. They aimed to uproot drunkenness, the very instrument of the debasement of the working class. Recruits of the newly established Red Army were dissuaded from drinking vodka. When the fighting moved to Ukraine, well stocked with beverages of every kind, Leon Trotsky, the head of the Red Army, grew anxious. Preventively, he issued a draconian order under which many soldiers caught drunk in units deployed on the southern front during the Ukrainian campaign were shot on the spot.
In the end, however, the ambitious Bolshevik temperance plans were doomed to failure. Decades later, in an attempt to encourage its troops to face the advancing Wehrmacht, the Stalinist authorities reintroduced daily rations of vodka: one hundred grams a day.
The temperance fervor also ran high in Great Britain, where the Central Control Board was established and quickly restricted the alcohol market, for example by limiting pub opening hours and reducing the strength of spirits. But was alcohol as damaging for the troops at the front as it was claimed to be at home? Traditionally, English sailors and infantrymen were issued provisions of wine, beer, brandy, and, from the 18th century on, mostly rum. The phrase "Dutch courage" derived from the English soldiers fighting in the Netherlands during the Anglo-Dutch wars of the 17th century, who developed the habit of fueling their fighting spirit with one or two sips of Dutch gin.
The distribution of rum was the prerogative of the divisional commander. Officially, the army justified the practice entirely and exclusively as a medical necessity: a remedy for fatigue, stress, and the hardships of arduous campaigning. Formally, then, commanders were expected to consult military doctors, but most were willing to grant alcohol to their men unconditionally. A standard allowance amounted to roughly 2 ounces. Rum, inherent to the life of the British soldier, became synonymous with combat.
Mornings in the trenches usually began early. Men were given tea, bread, and bacon and, as Paul Fussell noted, rum. To foster a fighting mood before going over the top, soldiers were given a double ration of rum, which they usually drank blended with coffee, tea, or cocoa. What rum was for the British army, wine was for the French. Paul the apostle considered wine to be a creation of God and therefore inherently good, and recommended its use for medicinal purposes, but condemned intoxication and recommended abstinence for those who could not control their drinking.
The Bible itself contains nearly two thousand references to vineyards and wine, and numerous references to drinking that both condemn its use in excess and extol its virtues in moderation [6]. As alcohol consumption remained high in colonial America, the abuse of alcohol came to be considered a sin by the church and was increasingly condemned by society [8]. The movement sought to cement its cause in morality and set forth a number of arguments designed to reconcile the absolutist beliefs of the temperance movement with the positive references to wine in the Bible.
Although prohibition was enacted and eventually repealed, the characterological and moral problems believed to be associated with the sinful vice of excessive alcohol consumption remained. One sign, perhaps, of the perseverance of such beliefs was the groundswell of the post-prohibition grass-roots self-help group Alcoholics Anonymous, founded on the belief that alcoholism represented a medical disease worthy of professional attention rather than societal enmity.
Today, the organization boasts more than 2 million members worldwide [33]. As the field of mental health has come to recognize that the process of human development is inextricably linked to and fundamentally shaped by the environment in which we are enmeshed, so, too, is the ever-unfolding process of conceptualizing substance use shaped by the habits, beliefs, and traditions of the larger society.
Top-down cultural influences can be seen to exert notable effects on substance use and perceptions of substance use, particularly in the 19th century. The culturally bound perception of morphine addiction in the Victorian age, for example, was enmeshed with the highly restrictive sexual attitudes towards women characteristic of the era, the same era in which psychoanalysis rose to prominence. Due to the drug's well-known effect of decreasing libido, for example, opium was often prescribed to women for the treatment of neuroses, hysteria, and hypochondriacal disorders, all of which were linked to sexual desires and frustrations among women [34,35].
Thus, the integration of societal standards regarding female sexuality into the mental health profession and diagnostic nomenclature is representative of the way in which the cultural zeitgeist at any given time can influence, if not directly promote, the misuse of substances. The influence of industrialization upon the attitudes and perceptions of substance users became readily apparent as America progressed into the industrial revolution.
The rapid change from an agricultural to an industrial economy during this time resulted largely from the establishment of the factory system, in which labor was carried out by individuals in a centralized location on a large scale [36]. The already negative view of excessive consumption was magnified as society came to rely heavily upon personal characteristics incompatible with intoxication, namely productivity, reliability, and punctuality [4]. This was coupled with a shift in the national zeitgeist towards values consistent with the engine of the new economy, including the accumulation of material goods and personal wealth.
Such views were only strengthened by the concomitant rise of problems typically associated with industrialization and urbanization, such as increased crime, poverty, and infant mortality rates [2,4]. The effects of industrialization on substance use were not limited solely to alcohol, wherein excessive consumption was antagonistic to the zeitgeist of the times. Harkening back to the provision of coca leaves by the Spanish Conquistadors to Peruvian slaves to increase the mining of silver, the modern-day equivalent of the coca leaf, cocaine, was supplied by American industrialists and plantation owners to black construction and plantation workers to increase productivity (see Figure 1).
Nonetheless, the association of the drug with racial minorities resulted in racialized, zealous accounts of drug use among minorities. The propagation of such attitudes of disapproval across various strata of society would play a principal role in the criminalization of substance use, including, most notably, the Temperance Movement and Prohibition.
The socio-political American Temperance Movement coincided with increasing religious and moral condemnation of alcohol use as detrimental to religious ideals and values related to family and society [4]. Due to the widespread use of narcotic medications to treat wartime injuries, societies around the world saw a rise in the number of addicted individuals following the American Civil War, the Austro-Prussian War, and the Franco-Prussian War. A series of laws enacted starting in the early 20th century criminalized the distribution of cocaine [27].
As motor vehicles became increasingly common in early 20th-century America, research into the metabolic effects of alcohol on driving impairment increased, and the newfound dangers posed by alcohol intoxication took on additional costs to society [15]. As the temperance movement drew strength in industrialized America, so too did it influence attitudes abroad, with prohibition enacted in Russia, Hungary, Norway, Finland, and the United States, among others [4]. Attitudes towards drug use and the increasing costs to a newly industrialized society resulted in widespread legislation designed to restrict possession and distribution, which in turn resulted in the criminalization of substance use and the entrenched association of addiction with crime, an association which has persisted even within the mental health field.
For over 30 years, until its most recent iteration, the DSM included references to legal problems as part of the criteria for SUDs (see Section 7). In the last several decades, substantial advances in pharmacology have led to the identification of endogenous G-protein-coupled opioid receptors and the development of synthetic opioids. Due to their potent analgesic effect, opiate drugs have been increasingly used over the past 20 years by physicians in the treatment of chronic pain.
There is growing acceptance, however, that the long-term benefits of opiates for the treatment of chronic pain are limited by analgesic tolerance, worsening of pain, and the development of an opioid use disorder among those for whom opiates were initially prescribed for chronic pain. Additionally, the diversion of prescription opioid medication is believed to have resulted in increased illicit use stemming from the subjective reduction in anxiety, mild sedation, and sense of well-being or euphoria induced by these drugs [38,39]. In one recent year, about 12 million Americans aged 12 or older reported nonmedical use of prescription painkillers during the past year, and prescription painkillers were associated with nearly a million emergency department visits, at substantial cost to health insurers. More than 18,000 deaths in a single year have been attributed to overdose from prescription opioid pain relievers, in addition to those associated with their illicit counterpart, heroin [41].
Today, there is increasing recognition of this problem on a national level in the U.S. Relatively recent advances in our understanding of the pharmacology of cannabis have led to the identification of its active ingredients, chemicals collectively termed cannabinoids, including tetrahydrocannabinol (THC), the chemical most associated with psychotropic effects [42]. The last several decades have also seen an unprecedented rise in physician-approved marijuana use for the treatment of medical conditions in a growing number of American states [42]. Despite these advances, cannabis remains widely used illicitly. The pharmacological properties of cocaine and related drugs are now well known, and their effects on behavior are primarily attributable to effects on the neurotransmitter dopamine [28,44].
Cocaine, coca leaves, and ecgonine are presently listed as Schedule II substances by the Drug Enforcement Administration [45]. Alcohol, by contrast, is now largely used as a ritualistic and recreational intoxicant. While excessive use of alcohol remains the third leading preventable cause of death in the United States and contributes to a wide range of diseases and health-related conditions, there is also growing recognition of the potential benefits of moderate drinking, including decreased risk of diabetes, ischemic stroke, and heart disease and related mortality [21,46].
After World War II, following the decline of German influence on psychiatric nosology, the center of psychiatry shifted to the United States, and the APA commissioned its constituents to create its own psychiatric nosology [11,47]. The influence of psychoanalysis and the psychosocial model on DSM-I is evident in its emphasis on psychoneurosis and functional reactions to environmental stressors [11,47]. The first DSM conceptualized substance use disorders as symptoms of a broader underlying disturbance.
Although DSM-I conceptualized the etiology of substance use disorder as a symptom of a broader underlying disturbance, it did leave some room for exceptions, at least in coding. That these exceptions were noteworthy exemptions, and not the rule, speaks to the strength of the etiological conceptualization of SUD as being secondary to, or arising from, a primary personality disorder.
Anxiety or depression associated with Borderline Personality Disorder may be intensified as the person uses a psychoactive substance in an unsuccessful attempt to treat his or her mood disturbance (Compton et al.).
Only seven years after the publication of DSM-I, major advances in the treatment of mental disorders further emphasized the need for a classification system based on the medical model [11,47,49]. The publication of DSM-II [16], however, did little to diminish the influence of psychoanalysis or the characteristic descriptions of disorders found in DSM-I.
As seen in Table 1, three types of alcoholism were recognized in DSM-II: (a) episodic excessive drinking (intoxication four times per year); (b) habitual excessive drinking (applied to alcoholic persons who become intoxicated more than 12 times a year or are recognizably under the influence of alcohol more than once a week, even though not intoxicated); and (c) alcohol addiction, defined in terms of dependency, suggested by withdrawal or evidenced by an inability to abstain for one day or by heavy drinking for three months or more [16].
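The three-way typology above is essentially a threshold rule, which can be sketched as a small classifier (a hypothetical helper, not a clinical instrument; the exact threshold readings are my interpretation of the text):

```python
def dsm2_alcoholism_type(intox_per_year: int,
                         under_influence_per_week: float,
                         dependency_signs: bool) -> str:
    """Sketch of the DSM-II alcoholism typology described in the text.

    Checks the most severe category first: dependency/withdrawal signs
    indicate addiction; frequent intoxication indicates habitual
    excessive drinking; occasional intoxication indicates episodic
    excessive drinking.
    """
    if dependency_signs:
        return "alcohol addiction"
    if intox_per_year > 12 or under_influence_per_week > 1:
        return "habitual excessive drinking"
    if intox_per_year >= 4:  # "intoxication four times per year"
        return "episodic excessive drinking"
    return "no diagnosis"

print(dsm2_alcoholism_type(5, 0.5, False))   # episodic excessive drinking
print(dsm2_alcoholism_type(15, 0, False))    # habitual excessive drinking
print(dsm2_alcoholism_type(0, 0, True))      # alcohol addiction
```

Ordering the checks from most to least severe mirrors the hierarchy implicit in the DSM-II descriptions.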
Although withdrawal was emphasized for Drug Addiction, it was also recognized that dependence could occur without withdrawal, a point of semantic confusion that would follow the DSM until its most recent publication. In keeping with the growing need for a valid and reliable diagnostic compendium for clinicians and researchers alike, the third edition of the DSM (DSM-III) [1] broke with psychoanalytic tradition and instituted consensus-based diagnoses and diagnostic criteria [47]. These criteria, including those for SUDs, were based on the Research Diagnostic Criteria, which were, in turn, influenced by the Feighner criteria [50] and earlier diagnostic attempts by Jellinek [15] to classify alcoholism.
The DSM-III is thus considered a major milestone in the field, reflecting a reemergence of the medical model and the rise of research investigators as the most prominent voices within the field [35,36]. Starting in DSM-III, the categories of Substance Abuse and Substance Dependence were adopted, and, although little explicit explanation is offered within the manual as to the basis for this distinction, it seems that the former was equated with pathological use. The distinction soon drew criticism.
Thus, for example, the actions of a legislature in a particular state could determine the number of residents who met DSM-III criteria for a mental disorder. Such criticisms would form the basis for recommendations to alter these categories in the next iteration. For example, the manual made the explicit additional requirement of a pathological use criterion for Alcohol and Cannabis Dependence diagnoses in addition to the main physiological criterion; the manual also stated that data were lacking in support of the main physiological criterion necessary for a Cannabis Dependence diagnosis, i.e., evidence of tolerance or withdrawal.
While the third edition of the DSM reflected the most profound changes in the conceptualization of psychiatric nosology since its inception, its successor, the DSM-III-R, also evidenced important changes. By grouping pathological behavioral dysfunctions with physiological processes in a polythetic diagnostic set, the conceptualization of the new Dependence category stood in contrast to the earlier view that physiological symptoms were both necessary and sufficient for a dependence diagnosis.
In light of these and other conceptual validity problems, recommended revisions to the DSM-III-R included elimination of the Abuse category and incorporation of its elements into a newly expanded Dependence category [52]. The recommendation to expand the Dependence criteria while removing the Abuse category offers some justification for the integration of the pathological use criterion into the Dependence category and for the reversal of the DSM-III stance that physiological symptoms were, in most cases, the hallmark of the disorder.
One admitted disadvantage of the re-conceptualized single-disorder model was the potential for diagnostic abandonment of individuals with lower-level problems who did not meet the criteria for the would-be expanded Dependence category [52]. Most notably, DSM-5 combines Abuse and Dependence into a single unified category and measures severity on a continuous scale from mild (2–3 symptoms endorsed) through moderate (4–5 symptoms endorsed) to severe (6 or more symptoms endorsed) out of 11 total symptoms, versus the previous 7 (see Table 1).
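The DSM-5 severity bands described above reduce to a simple symptom-count rule, which can be sketched as follows (an illustrative helper only; the function name and the "no diagnosis" label for counts below two are mine):

```python
def dsm5_sud_severity(symptom_count: int) -> str:
    """Map a DSM-5 SUD symptom count (out of 11) to a severity band.

    Thresholds follow the bands described in the text:
    mild (2-3), moderate (4-5), severe (6 or more).
    """
    if not 0 <= symptom_count <= 11:
        raise ValueError("symptom count must be between 0 and 11")
    if symptom_count < 2:
        return "no diagnosis"
    if symptom_count <= 3:
        return "mild"
    if symptom_count <= 5:
        return "moderate"
    return "severe"

print(dsm5_sud_severity(1))   # no diagnosis
print(dsm5_sud_severity(4))   # moderate
print(dsm5_sud_severity(11))  # severe
```

Unlike the earlier Abuse/Dependence split, the same count scale covers every case, which is exactly the unification the text describes.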
The shift to a unified category measured along a dimension of severity represents a notable change from the post-hoc categorical severity specifiers of the previous version and further cements the difference between the now-defunct DSM diagnosis of Dependence and the medical concept of physiological dependence, a distinction which had been increasingly emphasized over time, as reported in Hasin et al.
These empirical findings suggest that, contrary to the categorization of abuse and dependence as more-or-less distinct entities with different severity levels, the criterion items actually represent a single continuum-of-severity construct. The integration of dimensional elements of classification seen here in SUD also mirrors the call for such an approach in a number of other categorical diagnostic classifications [58,59,60].
The chapter also, for the first time, includes a behavioral addiction (Gambling Disorder). While the departure from psychoanalytic etiology and the adoption of atheoretical, consensus-based diagnostic entities in DSM-III is regarded as one of the greatest advances in the field over the last century, the fact that the definitive manual for the diagnosis of mental disorders provides no known etiology or pathophysiology and relies, instead, on defining a disorder by its symptoms may pose a challenge not only for the field in general but for the treatment of SUDs more specifically.
One way in which atheoretical classification may prove problematic is in the actual clinical usage of the diagnostic criteria themselves. While in vivo studies of clinician usage of DSM-5 for substance use disorder have yet to be carried out beyond the routine clinical practice field trials, past research has compared clinical psychiatric diagnosis with standardized, criteria-based assessment. According to a review by First et al., clinicians often do not systematically apply full diagnostic criteria in routine practice. This observation raises several points of consideration.
First, while there are substantial benefits in utilizing consensus-driven standardized diagnostic criteria, those benefits accrue only when the criteria are actually applied. Second, while more research is needed to determine precisely how clinicians arrive at diagnoses if they are not utilizing full diagnostic criteria, cognitive research into the clinical reasoning of clinical psychologists suggests that experienced practitioners still rely on their own causal theoretical conceptualizations despite the atheoretical diagnostic criteria that have been in place for well over thirty years [62].
Thus, the same lack of universally recognized etiology that was the impetus to move beyond early psychoanalytic influence toward a more valid and reliable model may run contrary to innate mechanisms for conceptualizing diagnostic entities among individual providers. Given the necessity of the DSM's diagnostic agnosticism and the substantial benefits this model provides to both clinicians and researchers when used correctly, the question arises as to what effect the current state of psychiatric nosology has on the field as a whole and on SUD specifically.
The adoption of a consensus-based symptomological approach might represent the lack of a shared etiology among professionals. Indeed, some of the major controversies in the field reflect this absence of consensus. Today, while the DSM continues to retain its etiological neutrality, the field of substance use has undoubtedly moved in the direction of explicitly emphasizing biological and disease-model conceptualizations of addictive behaviors.
While advocates of a strict disease model of the pathology underlying SUDs point to important achievements, including improved recognition of the neurobiological processes involved in addiction as well as new pharmacotherapies for treating addiction [66], this conceptualization is still not without its detractors [67,68]. Within SUD treatment, this gap is exemplified by the hesitancy among some practitioners and training programs to readily adopt and promote Evidence-Based Practices (EBPs) in favor of empirically unsupported alternative approaches.
Differences in support for and knowledge of the effectiveness of EBPs have been shown to be related to provider level of education, institutional culture, provider type, training, and academic affiliation [73,74,75,76], and, despite the effectiveness of both psychosocial and pharmacological EBPs, research has shown that their widespread adoption has remained challenging, if not controversial, in some arenas [61].
In addition to such macro-influences, individual provider attitudes and beliefs may be another link between the conceptualization of SUDs and the use of EBPs, with providers holding more responsibility-focused conceptualizations of addiction expressing more negative views of the use of naltrexone in the treatment of AUD [79]. Interestingly, the use of pharmacotherapies is particularly low for SUDs. One of the significant DSM-5 changes identified above (Section 6) was the removal of the legal problems criterion, which was reported to reflect the low prevalence of endorsement of this item in the general population, as well as poor fit with other criteria and little added information based on item response theory (IRT) and differential item functioning analyses [81,82,83].
In contrast to simple summations of the items endorsed by an individual, these models relate each criterion item to an underlying latent construct. The data gathered from these models suggest that legal problems were the least associated with the overarching construct when compared to the other items, and model fit actually improved when the legal problems criterion was omitted [82].
Thus, while the removal of this criterion was accomplished through the impartiality of advanced empirical models, as described above, the departure from the tradition of using legal problems as a diagnostic criterion reveals the ways in which even a purportedly atheoretical nosology can be influenced by specific contexts and cultural changes. This point becomes particularly salient when we consider the original contextual factors behind the criterion's inclusion. The historical example of the use of opium-based drugs on women in the not-so-distant Victorian past illustrates the powerful enmeshment of legality, medical acceptance, and cultural norms, which so saturated the culture of the time that they remained effectively invisible.
Although not substance related, perhaps the most salient example of social norms affecting diagnosis in recent history is the diagnostic evolution of homosexuality in the DSM, which was, much like early conceptualizations of SUD, considered a symptom of an underlying psychological illness. Following the advent of the LGBT rights movement and subsequent research, the APA eventually reversed its stance on the issue, and today it is recognized that the pathologizing of this sexual behavior, justified in part by the subjective level of disturbance it caused, reflected not an underlying pathology but rather socially accepted norms and stigmatization.
Consequently, homosexuality is no longer considered a disease or a representation of underlying personality disturbance; it is conceptualized from a non-pathological viewpoint and indeed labelled differently in order to avoid the long-held stigma associated with the term homosexual [ 87 ]. Thus, history provides clear examples of how even an atheoretical psychiatric nosology such as the DSM is vulnerable to pathologizing behavior on the basis of socially accepted norms, norms which come to be revealed as large-scale societal biases only as they change over time through shifts in generational perspectives.
Cannabis and its derivations, for example, hold the distinction of being classified both as Schedule I (no currently accepted medical use and lack of safety) and, in the case of its active ingredient, THC, in pill form, as Schedule II (accepted medical use and high potential for abuse) [ 45 ]. Disparities can also be seen in the legal status of alcohol which, despite its non-illicit standing, has been recognized to cause a relatively greater level of harm to individuals and society than many illicit drugs.
The removal of the legal problems criterion may be reflective of a larger cultural change: increased recognition of the somewhat arbitrary division between the legal status of substances and the levels of harm they cause. It also underscores the larger philosophical issue of relying on a fluctuating, socially constructed criterion, one with arguable racial and socio-economic disparity, in defining an ostensibly biological disorder in an atheoretical, symptom-based diagnostic manual. Another significant change in the latest iteration of the DSM identified above (Section 6) is the collapsing of the Abuse and Dependence categories into a single disorder.
Thus, while the two categories were intended to be diagnostically distinct, they were often interpreted as being related, a conceptualization which was argued decades earlier and resurrected, albeit in a different form, in the new millennium. In making the case for the changes to the DSM-5, empirical findings derived from modern statistical models of the dimensionality of these categories were used; these found that the criteria aligned themselves on a single dimension, a single underlying construct [ 83 , 84 , 93 , 94 , 95 ].
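A minimal sketch of the unidimensionality argument, using simulated data (the loadings, thresholds, and sample size are invented for illustration): when all eleven criteria are generated from one latent factor, the correlation matrix of the binary responses shows a dominant first eigenvalue, the classic signal that the criteria form a single dimension rather than two.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4000, 11  # respondents x criteria (the 11 DSM-5 SUD criteria)

theta = rng.normal(size=n)                 # one latent severity dimension
loadings = rng.uniform(1.0, 2.0, size=k)   # illustrative discriminations
thresholds = rng.uniform(-0.5, 1.0, size=k)

# Binary criterion endorsements generated from the SINGLE underlying factor
p = 1.0 / (1.0 + np.exp(-(np.outer(theta, loadings) - thresholds)))
y = (rng.random((n, k)) < p).astype(float)

# Eigenvalues of the inter-item correlation matrix, largest first
eigvals = np.linalg.eigvalsh(np.corrcoef(y, rowvar=False))[::-1]
print("first two eigenvalues:", eigvals[:2].round(2))
```

A first eigenvalue several times the size of the second is the pattern the cited factor-analytic studies reported for the combined Abuse and Dependence criteria.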
Thus, the issue of validity was again brought into the spotlight. Similar to the socially constructed legal criterion described in the previous section, research into the validity of the Abuse category revealed a disproportionate number of cases of Abuse being diagnosed on the basis of a single criterion item (hazardous use). One study, for example, reported that a substantial proportion of individuals diagnosed with current alcohol abuse met the diagnosis through the hazardous use item alone. The same study found a positive relationship between socio-economic status and DSM-IV Alcohol Abuse diagnosis, which may be explained by higher-income drinkers having greater access to vehicles, which, in turn, may lead to higher rates of hazardous drinking and, subsequently, Alcohol Abuse diagnoses [ 96 , 97 , 98 ].
Such findings recall the recommendations described earlier [ 52 ] warning of the socially constructed and therefore problematic nature of the Abuse diagnosis. Although the DSM-5 has been criticized by some for retooling the longstanding dichotomy, this change may be viewed, in a larger sense, as finally addressing the conceptual validity problems underlying this distinction.
For example, if Abuse was best conceptualized not as a standalone mental disorder but rather as one dimension of the larger construct of the dependence syndrome as described by Edwards and Gross [ 54 ], then the amalgamation of the two diagnostic entities in the DSM-5 has increased not only the empirical but also the conceptual validity of this underlying construct.
While the categorical classification of substance users in the DSM was undertaken from an etiologically agnostic standpoint, it is plausible that, because the format is consensus-driven rather than empirically derived, prevailing social norms shaped which behaviors were pathologized. As has been famously pointed out, symptoms of mental illness are directly tied to the social and ethical culture in which they take place [ 99 ]. While the advance in empirical inductive reasoning which prompted the shift to the current model is a step forward in the science of classification, it is not without its limitations; some disagreements exist about relying on mathematical models to disprove clinically entrenched concepts [ 55 ], while others have raised concerns about the validity of diagnostic thresholds (e.g., where along the severity continuum the diagnostic cut-offs are placed).
Looking forward, it remains to be seen what effect this continuum-of-severity conceptualization will have on clinical work and on the reliability and validity of diagnoses. Another significant change in the DSM-5 identified above (Section 6) is the addition of craving as a diagnostic criterion. While craving has been noted in previous versions as a feature of the disorder, the DSM-5 marks the first use of the symptom as an actual criterion item.
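For reference, the DSM-5 maps the count of criteria met onto a severity specifier (2 to 3 criteria: mild; 4 to 5: moderate; 6 or more: severe). A minimal sketch of that mapping, with the function name chosen here for illustration:

```python
def sud_severity(criteria_met: int) -> str:
    """Map a DSM-5 SUD criterion count (0-11) to its severity specifier.

    Thresholds follow DSM-5: 2-3 mild, 4-5 moderate, 6+ severe;
    fewer than 2 criteria does not meet the diagnostic threshold.
    """
    if not 0 <= criteria_met <= 11:
        raise ValueError("DSM-5 lists 11 SUD criteria")
    if criteria_met < 2:
        return "no diagnosis"
    if criteria_met <= 3:
        return "mild"
    if criteria_met <= 5:
        return "moderate"
    return "severe"

print(sud_severity(2), sud_severity(5), sud_severity(8))
```

This count-based coding is exactly the continuum-of-severity approach whose clinical consequences remain to be seen.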
According to Hasin, Fenton, Beseler, Park and Wall [ 57 ], the inclusion of craving was supported on several fronts, including its theoretical centrality in accurately describing a clinical feature of SUD, its association with cued self-administration and relapse, its well-studied role in human and animal models of substance use, its inclusion in the ICD, as well as the potential for pharmacotherapeutic intervention for craving and its neural substrates. Indeed, craving is often associated with increased likelihood of relapse to alcohol use, and therefore it is thought that managing craving may improve treatment outcomes.
As such, a number of pharmacologic interventions targeting craving reduction as a mechanism to reduce substance use have been investigated in the last several decades, including acamprosate, naltrexone, disulfiram, varenicline, lamotrigine and others [ ]. To date, the results of clinical studies on reducing craving have been promising, although somewhat inconsistent, and await future developments. Current hypotheses on the neurobiology of craving suggest that its neural substrates may persist long after cessation of use. As craving is then perhaps the only criterion which may persist following protracted abstinence, future questions may arise about how to treat and code for craving and what role craving plays in identifying remission.
Since the DSM-III-R, the field has defined addictive behaviors as relating to compulsive substance use despite adverse consequences with physiological changes often present. The inclusion of behavioral addictions as psychiatric disorders likely marks the next large paradigm shift in the field of addictions and, not surprisingly, has already garnered some debate.
Although the future of behavioral addictions may as yet lack certitude, what does seem clear from a nosological standpoint is the eventual expansion of the broader category of addictions. That routine ingestion of a psychopharmacologic substance is not needed in conceptualizing addictive pathology may point to a growing conceptualization of addiction as the sum of a host of neuroadaptations related to dysregulation of endogenous neurotransmitters, together with behavioral, genetic, and psycho-social factors, of which exogenous chemicals play a historically important but potentially diminishing part as the field progresses.
Indeed, the rationale presented in the DSM-5 (i.e., the classification of gambling disorder alongside the substance use disorders) reflects this expanding conceptualization. While concern has been expressed about over-pathologizing human behavior, decreasing individual responsibility, and allowing a deluge of un- or under-supported diagnoses to saturate, and hence weaken, the credibility of the field [ , , ], future research into the neurobiological substrates of impulse-related disorders and addictions may lay a more solid framework for the behavioral addictions.
Epidemiological and cultural factors of behavioral addictions will likely be an area of future research, as will identifying behavioral and pharmacological treatment targets, creating validated and reliable measures, and measuring treatment outcomes.
The history of psychoactive substance use is remarkably long, dating as far back, in some cases, as the recorded history of human civilization allows.