
Psychopharmacology: The Fourth Generation of Progress

The Neurobiology of Treatment-Resistant Mood Disorders

Robert M. Post and Susan R. B. Weiss

 

INTRODUCTION

In 1921, Kraepelin (30) not only differentiated manic-depressive illness from schizophrenia, but he crystallized the critical observations on the longitudinal development of the illness based on his systematic patient records and life-chart methodology. He described the pleomorphic aspects of its clinical presentation and the tremendous variability in its clinical course. At the same time, he abstracted the general principle that patients often undergo a pattern of cycle acceleration with longer intervals occurring between the first and second episodes than those occurring later in the illness. He also noted a progression from precipitated to autonomous episodes, such that psychosocial stresses (particularly loss or the threat of loss) appeared to be implicated in initial episodes but not in subsequent episodes, which occur more spontaneously, that is, without apparent external precipitating factors. It is against this backdrop of a potentially progressive and evolving illness that issues of treatment resistance should be considered.

In the ensuing decades, these initial clinical observations have been documented and redocumented in more formal clinical studies (46). In systematic studies examining the issue of cycle acceleration, the general pattern of decreasing duration of well intervals as a function of successive episodes has been supported in virtually every study (for reviews, see refs. 46 and 51). Many of these studies occurred in the prepsychopharmacological era or before systematic prophylactic treatments were utilized. Even in the current era of long-term treatments, if one examines patients with treatment-refractory illness, a pattern of cycle acceleration is often apparent (51). The impact of psychopharmacological treatment on the course of illness is obviously a major confounding variable, particularly since some agents, such as the heterocyclic antidepressants, have been implicated in illness exacerbations, that is, either in the precipitation of manias or in cycle acceleration (70). Rouillon (62) has reviewed the controlled literature and suggests that in prospectively followed bipolar patients, the switch rate is approximately 25% on placebo (or with lithium and an antidepressant) but approximately 50% in bipolar patients treated with tricyclic antidepressants alone. In our National Institute of Mental Health (NIMH) cohort of treatment-refractory patients, we have observed a high incidence of switches during treatment with tricyclic antidepressants based on retrospective life-chart methodology. Approximately 35% of switches appear to be related to the antidepressant treatment and not typical of the patient's prior course of illness (2). Additionally, Wehr and Goodwin (70) have highlighted the earlier suggestions of Kukopulos and others that antidepressants might also be associated with cycle acceleration and conversion from intermittent to continuous cycling patterns. Kukopulos et al. (32) have argued that this pharmacological impact is associated with lithium treatment resistance. In the study by Wehr and associates, the impact of tricyclics on cycle length was confirmed by a deceleration of cycling upon drug discontinuation and a reacceleration once antidepressant treatment was reinstituted. In our series, approximately 26% of patients showed cycle acceleration during treatment with antidepressants, with 35% of patients showing definite or likely patterns attributed to the antidepressant and not the course of illness (2). In each of the instances of cycle acceleration, antidepressant discontinuation confirmed the association because it slowed the cycles.

Patients who either begin their illness with a pattern of rapid cycling (four or more episodes in a given year) or who progressively evolve to this faster pattern also tend to be resistant to treatment with lithium (see ref. 47 for a review). It is of interest that many patients with initial episodes of depression appear to go on to more rapid cycling forms of the illness and that the pattern of illness associated with biphasic episodes of depression, followed by mania, and then a well interval (D-M-I) is less responsive to lithium than the pattern of illness in patients who show an initial mania, followed by a depression, and then a well interval (M-D-I) (44). Recent data of Okuma and associates (40) also indicate that carbamazepine is less effective in patients with rapid cycling and the D-M-I pattern of illness.

It had previously been assumed that patients with 48-hr cycles represented the limit of cycle acceleration in manic-depressive illness. However, more recently, faster patterns of mood oscillation have been identified and are associated with relative treatment resistance (31). In these instances, patients show not only ultrarapid cycling, where episode durations are approximately one week or less, but also ultra-ultra rapid (ultradian) cycling, where marked fluctuations in mood occur faster than once every 24 hr. In these latter instances, mood fluctuations are distinctly dysrhythmic or chaotic. George et al. (18) have modified the equations used by May and Gottschalk to model this chaotic pattern. In this model, systematic variation of a single constant can be associated with progressively more marked alterations in the rhythmicity of the illness. This or related mathematical models may be of assistance in defining and classifying different phases of illness ranging from (a) isolated, intermittent episodes, to (b) more rapid, rhythmic, and continuous patterns, to (c) ultrafast frequencies associated with chaotic mood oscillations. These course-of-illness observations are of interest for several reasons. The chaotic patterns of mood oscillation often tend to occur late in the course of illness and may represent an evolving neural substrate, one that tends to be lithium- and, in many instances, carbamazepine-refractory (31). In some instances, combination treatment appears to be required in this phase of the illness, and, as noted below, preliminary observations suggest the potential utility of the calcium-channel blocker nimodipine in this phase as well (42, 49). Such a progression could represent either the increasing dysregulation of a single biochemical variable or a much more complex series of evolving biological processes, similar to those described in the evolution of behavioral sensitization and kindling. Sensitization and kindling are relevant here in that they provide a conceptual framework for considering evolving processes that may be pertinent to treatment resistance.
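The single-parameter route from stable, through rhythmic, to chaotic dynamics described above can be sketched with a generic logistic-map-style model. This is purely an illustration of the mathematical principle (a single constant controlling rhythmicity), not a reconstruction of the specific equations of George et al.; the parameter names and values here are hypothetical.

```python
# Generic logistic-map sketch: x[n+1] = r * x[n] * (1 - x[n]).
# Varying the single constant r moves the simulated "mood" trajectory
# from a stable fixed point, through biphasic oscillation, into chaos,
# paralleling the progression from isolated episodes to rapid cycling
# to chaotic ultradian fluctuation.

def trajectory(r, x0=0.5, n_burn=500, n_keep=64):
    """Iterate the map, discard the transient, return the settled orbit."""
    x = x0
    for _ in range(n_burn):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return orbit

def n_distinct(r):
    """Distinct settled values: 1 = stable level, 2 = biphasic cycle,
    many = chaotic/aperiodic fluctuation."""
    return len(set(trajectory(r)))

if __name__ == "__main__":
    for r in (2.8, 3.2, 3.99):
        print(f"r = {r}: {n_distinct(r)} distinct settled values")
```

Running the sketch shows one settled value at r = 2.8, two alternating values at r = 3.2, and dozens of aperiodic values at r = 3.99, illustrating how smooth variation of one constant can generate the qualitatively different phases of cycling enumerated in (a) through (c) above.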

 

COCAINE-INDUCED BEHAVIORAL SENSITIZATION AND ITS CROSS-SENSITIZATION TO STRESS

Chronic cocaine administration is a pertinent model for consideration of refractory mood illness from three different perspectives. First, acute low-dose cocaine administration produces an important model for the euphoric and psychomotor components of hypomania (55). However, with repeated administration, particularly involving higher doses, chronic cocaine administration also models the dysphoric and psychotic components of mania. As such, it represents a model for mania in evolution. Understanding the neurobiological mechanisms underlying the progressive increases in behavioral responsivity to the same dose of cocaine, which is the hallmark of behavioral sensitization, may thus provide a useful paradigm for elucidating progressive alterations in manic symptomatology and their differential response or lack of response to pharmacotherapeutic interventions.

Second, cocaine itself is an interesting model of a stressor, with many of its effects on neurotransmitters and peptide hormones mimicking the stress response, including increases in corticotropin-releasing hormone (CRH), adrenocorticotropic hormone (ACTH), and cortisol. In addition, behavioral sensitization to the psychomotor stimulants cocaine and amphetamine in many instances shows cross-sensitization to stress (4, 27). That is, stimulant-induced behavioral sensitization results in increased reactivity to stress, and pretreatment of animals with acute or repeated stressors of appropriate frequencies, intensities, and modalities (such as tail pinch and mild foot shock) results in increased reactivity to psychomotor stimulant administration. Thus, consideration of the mechanisms underlying repeated cocaine administration may provide insights into those underlying long-term changes in responsivity to stressors as well as to maniclike syndromes of hyperactivity.

Third, there is a very high incidence of comorbidity of primary affective disorders, particularly bipolar illness, with substance abuse disorders, where cocaine is often the substance abused (see the review in ref. 56). There is some evidence supporting the conclusion that substance abuse impacts negatively on treatment and provides one route to treatment resistance, perhaps through such mechanisms as an increased incidence of noncompliance with therapeutic regimens, particularly lithium (48) and other agents, as highlighted in the studies of Aagaard and Vestergaard (1). Treatment resistance could also emerge because of a lack of responsivity to otherwise therapeutic regimens in patients with comorbid substance abuse. For example, it has been widely reported that patients with substance abuse are less responsive to lithium carbonate than those without this comorbidity (22). It is possible that neurobiological alterations accompanying chronic cocaine administration may interact with the primary pathophysiology of mood disorders in such a fashion that pharmacological responsivity is altered.

Our understanding of the mechanisms underlying cocaine-induced behavioral sensitization is, in itself, in the process of evolution; however, a preliminary blueprint of the critical processes involved is now available. The quality, magnitude, and duration of sensitization vary as a function of dose, frequency of repetition, intervals between drug administrations, and a variety of other factors, including the environment (51, 54, 55, 76). For example, in some paradigms, all of the behavioral sensitization appears attributable to conditioning variables; animals repeatedly pretreated and tested in the same environment show increased reactivity, whereas animals with equal exposures to cocaine but in different environments do not show sensitization (i.e., their behavior is like that of saline-pretreated controls). However, with higher doses of cocaine and more chronic administration, some components of behavioral sensitization can be demonstrated to be independent of the environmental context.

As such, dissection of the neurobiological mechanisms involved must necessarily distinguish between these two components of behavioral sensitization: conditioned and unconditioned. For example, in a 1-day cocaine sensitization paradigm with a low-dose challenge on the second day, Fontana and associates (15) found evidence of increased dopamine overflow in the nucleus accumbens, measured by in vivo dialysis, in animals pretreated with cocaine in the context of the test cage but not in those pretreated in a different environment or in controls treated with saline. However, following repeated cocaine challenge paradigms, Brown et al. (10) failed to see significant evidence of conditioned sensitization when a saline challenge was utilized. This difference may have been attributable to the difference in the pretreatment regimen, the greater pharmacological or interoceptive cueing associated with a cocaine compared to a saline challenge, and/or the presence of cocaine, a dopamine reuptake blocker, in the test dialysate. Kalivas and Duffy (25, 26) and Kalivas and Stewart (27) have elucidated a time course of unconditioned increases in dopamine overflow in terminal field areas (nucleus accumbens) or in dendritic autoreceptor fields (ventral tegmental area) following repeated cocaine administration compared with saline in an unconditioned paradigm as well. In these and other studies, behavioral sensitization to cocaine outlasted the regional alterations in dopamine overflow, suggesting that other mechanisms, over time, subserved behavioral sensitization. In this regard, White et al. (78) reported an increased responsivity of dopamine D1 receptors in the accumbens, as revealed by direct iontophoretic application studies. These effects were longer lasting, but behavioral sensitization can, in some instances, persist for months, exceeding the period of increased dopamine receptor responsivity.

Cocaine has been reported to increase the immediate early gene (IEG) transcription factors c-fos (and zif-268) through a D1 receptor mechanism (20, 36, 81). These substances, acting through leucine zipper and zinc finger motifs, respectively, are thought to alter subsequent transcription of late effector genes and regulate levels of peptides, receptors, and enzymes. For example, chronic cocaine administration is associated with increases in dynorphin, substance P, and neurotensin, as well as decreases in neuropeptide Y (NPY), somatostatin, and, in some areas of brain, thyrotropin-releasing hormone (TRH) (56). Receptor binding is increased for mu-opiate and central- and peripheral-type benzodiazepine receptors and decreased for quinuclidinyl benzilate (QNB) and corticotropin-releasing factor (CRF) receptor binding. Although none of these neuropeptide or receptor changes has been specifically linked to alterations in immediate early gene regulation, it is thought that induction of c-fos and fos-related antigens (FRAS), zif-268, and a variety of other transcription factors, and their interactions, initiates a sequence for transcription of these late effector adaptations that could help convey the altered neurochemical and behavioral responsivity to cocaine in a long-lasting fashion. Thus, prior experience with cocaine changes the brain not only in a short-term fashion at the level of neurotransmitters and receptors and their nongenomic adaptations, but also in a much more long-lasting fashion through alterations in genomic expression, which may affect not only biochemistry but neural structure in much longer-lasting temporal domains. Evolving processes in different neural systems may similarly occur with the experience of repeated stressors and episodes of mood illness; that is, stress and episode sensitization could ultimately affect treatment response.

 

KINDLING AS A MODEL FOR ILLNESS EVOLUTION AND TRANSITION TO AUTONOMY

In contrast to the behavioral sensitization described above, which directly models many aspects of mood illness (particularly euphoric and dysphoric manias), kindling is a nonhomologous model for mood illness evolution. Increased behavioral reactivity is measured with a seizure endpoint and none of the behaviors in kindling evolution are similar to those observed in patients with bipolar illness. Thus, although we might consider how various aspects of bipolar illness undergo kindlinglike transitions, it must be restated that kindling is only a conceptual bridge that might help describe the kinds of neurobiological processes and their spatial and temporal evolution in the brain that could be associated with the progression of a disorder.

Given these caveats, which violate most of the traditional principles of animal modeling of mood disorders, why discuss kindling as a potentially useful model in this context at all? In kindling, there is (a) increased behavioral responsivity to the same stimulation over time and (b) a progression to spontaneity following sufficient numbers of triggered kindled seizures. These syndrome characteristics are paralleled in vastly different time domains in some patients with mood disorders. We can then ask whether any of the neurobiological principles underlying kindling evolution at the level of gene expression and neuronal microstructure uncovered in seizure kindling provide a conceptual framework for making predictions about illness evolution and pharmacological responsivity in mood disorders. Finally, seizures are easily observable and measurable, thus making the quantification of kindling evolution relatively precise. Given the fact that there are few well-accepted or validated models for mood disorders that are easy to induce, reliably manifest, and long-lasting in terms of their memory characteristics, kindled seizure evolution and its robustness offers certain practical and logistical advantages.

In kindling, repeated and necessarily intermittent stimulation of the brain eventually results in a lowering of the threshold for afterdischarges (enhanced excitability); increases in the duration, spread, and complexity of the afterdischarges (the development phase); and, finally, the reliable appearance of full-blown seizures in response to the previously subthreshold stimulation (the completed phase of kindling) (19, 57). In arriving at this completed phase of kindling, animals go through successive behavioral seizure-stage transitions from immobility and behavioral arrest (stage 1) to head nodding (stage 2), unilateral forepaw twitching (stage 3), full-blown bilateral tonic/clonic seizures involving head, trunk, and forepaws (stage 4), and, finally, rearing and falling (stage 5). Following sufficient numbers of seizures (for example, several hundred kindled from the amygdala with once-a-day stimulation for one second at 200 to 800 μA), animals may be observed to undergo spontaneous seizures without any exogenous electrophysiological triggering.
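The threshold-lowering, stage-progression, and transition-to-spontaneity dynamics described above can be caricatured in a toy model. This is an illustrative sketch only, not a biophysical model; all parameter names and values (`decrement`, `background`, the linear threshold decline) are hypothetical choices made for the example.

```python
# Toy kindling sketch: each intermittent stimulation lowers the
# afterdischarge threshold (enhanced excitability); once stimulation
# exceeds threshold, the behavioral seizure stage ratchets up (1..5);
# and once the threshold falls below tonic endogenous drive, events
# occur with no exogenous trigger (the spontaneous phase).

def kindle(n_stims, stim_intensity=50.0, threshold=100.0,
           decrement=2.0, background=20.0):
    """Return (final_threshold, stage_history, spontaneous_from).

    `spontaneous_from` is the stimulation index at which triggered
    seizures give way to spontaneous ones (None if never reached).
    """
    stage, stages, spontaneous_from = 0, [], None
    for i in range(1, n_stims + 1):
        threshold -= decrement          # excitability grows per trial
        if stim_intensity >= threshold and stage < 5:
            stage += 1                  # seizure-stage progression
        stages.append(stage)
        if spontaneous_from is None and background >= threshold:
            spontaneous_from = i        # completed -> spontaneous phase
    return threshold, stages, spontaneous_from
```

With these illustrative parameters, the same fixed stimulus is subthreshold for the first two dozen trials, then drives the staged progression to full-blown seizures, and eventually becomes unnecessary altogether, mirroring the development, completed, and spontaneous phases discussed in the text.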

Similar to what is observed in behavioral sensitization, it is now apparent that kindling induces an intricate cascade of neurobiological events at the level of gene expression that help code for the spatiotemporal evolution of neurobiological changes that accompany and may underlie kindling. For example, Clark et al. (12) have mapped the spread of the kindled seizure or memory trace during the developmental stages of kindling utilizing radiolabeled c-fos as the marker of what cellular pathways are being activated. This technique of in situ hybridization allows one to assess what cells are turning on their messenger ribonucleic acid (mRNA) for a transcription factor such as c-fos following activation or depolarization (Fig. 1). These effects at the level of the nucleus are similar to those observed while mapping metabolic pathways with deoxyglucose, although in the latter instance, it is the terminal areas of the neuron that are thought to be involved. With this in situ hybridization technique, Clark and associates have observed that initial phases of amygdala kindling are associated with unilateral activation of either the piriform cortex or the dentate gyrus of the hippocampus. With successive brain stimulations and seizure stage evolution, the piriform cortex and dentate gyrus become activated bilaterally. With completion of full-blown kindled seizures, there is increasingly widespread cortical involvement. Thus, in situ hybridization mapping of c-fos mRNA demonstrates the progressively more widespread activation of neural pathways associated with the evolution and development of amygdala kindling. Most recently, Clark et al. (unpublished observation) have observed that a spontaneous kindled seizure induced unilateral c-fos activation in the dentate gyrus on the side opposite to that originally kindled, further suggesting that as the seizure process evolves, so does its neuroanatomy, in this case, to the contralateral side of the brain.

Alterations in IEGs appear to be only the initial phase of impact on gene expression, as c-fos and FRAS are rapidly induced, returning to baseline over a period of minutes to hours. Their transient induction, as noted above, may be associated with longer-lasting events and transcription of late effector genes (Fig. 2). For example, Rosen and associates (61) have found that following c-fos induction, the mRNA for TRH is induced in similar areas of brain (piriform cortex and dentate gyrus) first unilaterally and then bilaterally, and it remains elevated for longer durations of time (hours to days) than c-fos. Rosen et al. (60) have also found that c-fos and TRH coexist in the same cells, further suggesting the likelihood that the two events may be related. In addition, the gene for TRH has an AP-1-like binding site at which fos and FRAS may affect transcription of this neuropeptide. In different cells in the dentate hilar region of the hippocampus, kindling is associated with induction of the mRNA for CRH in cells that previously did not manifest CRH (67). Thus, kindling is associated with the activation of presumably novel neuropeptide synthesis in cells that ordinarily did not express this substance. This finding is of potential interest not only in its own right, in relationship to kindling, but as it might relate to the repeated observations of neuropeptide receptor mismatches in the central nervous system (CNS) where peptide receptors often appear to exist in areas of brain not associated with the neuropeptide itself (21). It is possible that these receptors are there to receive neuropeptides that some day may be turned on in a cellular program related to a variety of neural functions induced by the environment.

As with TRH or CRH, a variety of alterations in enzymes, receptors, and protein kinases have been reported to be altered during kindling in a subacute to long-lasting fashion (Fig. 2). These are not detailed further other than to indicate that a multitude of adaptations may occur in different spatiotemporal sequences and the IEG c-fos induction and some of its associated neuropeptide correlates may be only two examples in that vastly more complicated process. In this regard, it is important to emphasize that not only are neurobiological alterations induced in kindling but micro- and macroneuroanatomical changes occur as well. For example, Geinisman et al. (16) have observed alterations in synapse formation in pathways thought to be intimately involved in kindling evolution and Sutula et al. (69) have demonstrated neuronal sprouting in the dentate granule cells of the hippocampus. Sutula and his group have also demonstrated that some cells in the dentate and hilar areas are dying in a fashion that is in proportion to the number of kindled seizures.

Thus, an active process of synaptic and neural remodeling may occur during kindling evolution and may account for some of the permanent alterations in neural and behavioral excitability that accompany it. Because kindling is associated with the induction of nerve growth factors as well as neuropeptides that may exert trophic effects, there is a potential link between the intermediate neuropeptide inductions and the longer-lasting, if not permanent, alterations in synaptic microstructure and neuronal macrostructure and circuitry (Fig. 1). Although the process of preprogrammed cell death, or apoptosis (35), is not definitively documented in kindling, it is likely that cell death occurs by this process rather than by more classic toxic or degenerative forms of cell death, which are associated with obvious lesions, scarring, and glial proliferation. In apoptosis, it is generally thought that a cell engages active cellular machinery, requiring ongoing protein synthesis, to activate an inherent cell death program and commit suicide. It is likely that there is a continuum of processes involved in cell death, from those occurring in the absence of appropriate trophic factors to those occurring by more traditional excitotoxic processes. Similar processes of sculpting the CNS are thought to underlie not only critical stages of neural embryogenesis, but also the formation of the basic wiring diagram of the brain during development. The kindling data raise the possibility that related phases of neuroplasticity, synaptic reorganization, and neuronal tract sculpting could occur throughout adult life in relationship to processes of adaptation in response to environmental impact. Similar processes underlying memory-like events may be revealed by the kindling paradigm, which has been considered a model of neuronal learning and memory (3).

 

STRESS AND EPISODE SENSITIZATION IN THE RECURRENT MOOD DISORDERS

Utilizing the principles derived from the sensitization and kindling models discussed above, one is now in a position to formulate a template for how stressors and episodes of mood illness could impact on the long-term course of recurrent mood disorders. The postulate is that, as in sensitization and kindling, appropriate psychosocial stressors may, through their impact on IEGs and late effector genes, reach a threshold for inducing full-blown episodes of affective illness. As in the kindling model, initial stressors may be insufficient to precipitate full-blown episodes, but with sufficient genetic vulnerability, repetition of stressors, or magnitude of stimulation, they become capable of inducing the neurobiological alterations associated with full-blown episodes. This formulation is consistent with the data that stressors not only induce IEGs (37), but also have a longer-lasting impact on neuropeptides and other transmitter and receptor alterations thought to be intimately involved in mood disorders. For example, increases in CRF and TRH have been reported in some studies of the CSF of depressed patients (6, 38), whereas state-dependent decreases in somatostatin have also been observed in multiple studies of depressed patients (64). As illustrated in Fig. 3, it is possible that repeated occurrences of stressors become capable of triggering the appropriate combination of transcription factors, which then result in the longer-lasting regulation of mRNAs for CRH, TRH, and somatostatin in the appropriate direction, with levels of the respective neuropeptides remaining altered for much of the duration of the depressive episode. The long-term vulnerability to the stressor induction of episodes may be manifest in the observation that a lesser degree or incidence of stress is required to induce subsequent episodes (46).

These observations also imply that another phenomenon is occurring simultaneously: that of episode sensitization. That is, it is the recurrence of sufficient numbers of triggered affective episodes themselves (similar to that observed with amygdala-kindled seizures) that not only leaves the organism progressively more vulnerable to subsequent episodes, but eventually results in the occurrence of episodes in the absence of exogenous triggers. The nature of the long-lasting memory traces left behind by episodes of mood illness remains to be adequately described, but the conditioned and unconditioned components of stimulant-induced behavioral sensitization discussed above suggest that a vast array of mechanisms induced by episodes of affective illness may themselves be capable of impacting on gene transcription, of both the immediate early and late effector variety. This formulation would not be inconsistent with the existing data, which show that a variety of the neurotransmitter and neuropeptide alterations postulated to occur during episodes of mood illness (such as increases in dopamine and norepinephrine during mania, increases in acetylcholine during depression, and increases in CRH and associated peptide abnormalities) have all been reported to impact on IEGs. The manic and depressive behaviors associated with these and a multiplicity of other biological abnormalities may thus become sensitized with recurring episodes.

Preliminary endocrinological data are also compatible with this formulation: patients who fail to normalize their pathological escape from dexamethasone suppression during the well interval are at increased risk of relapse into another depressive episode (5). Similar observations have been made for failure to normalize the blunted TSH response to TRH (putatively linked to TRH hypersecretion). Banki and colleagues (7) have observed that patients who do not decrease their high CSF levels of CRH during a depressive episode are also at higher risk for relapse when they are restudied during a euthymic interval. These data suggest the possibility that some of the neurobiological abnormalities associated with acute affective episodes may persist in some fashion into the euthymic, or apparently clinically well, interval. Even those patients demonstrating a full recovery based on clinical characteristics and neurobiological assessment may still be at increased risk for episode recurrence compared with patients whose episodes have been prevented with adequate long-term prophylaxis. This postulate of episode sensitization requires direct clinical demonstration in appropriate clinical trials to ascertain whether patients undergoing long-term prophylaxis with episode prevention become less vulnerable to episode recurrence than patients who receive only acute and intermittent treatment of their recurrent mood disorder. Other indirect evidence does exist, however, suggesting that episodes may change pharmacological responsivity, if not the course of untreated illness.

DISCONTINUATION AND REFRACTORINESS

We have observed a series of patients who were well maintained on lithium prophylaxis, discontinued their medication, experienced a relapse, and then failed to re-respond to the reinstitution of lithium at the same or higher dosages (48, 49). In some instances, patients have been capable of re-responding after the first period of lithium discontinuation, but not after the second. The duration of time on lithium, with the patient maintained in a relatively euthymic state, does not appear to preclude the phenomenon of discontinuation-induced refractoriness, as it has been observed after patients have been well for as long as 10 to 15 years.

These data are subject to a variety of interpretations, but one possibility is that the reemergence of an additional episode itself is sufficient to alter the neurobiology of the illness in such a fashion that it is no longer responsive to initially effective treatment. It is possible that this phenomenon is not unique to lithium. The Hillside Hospital group have observed that schizophrenic patients undergoing neuroleptic discontinuation and suffering relapses become more refractory to antipsychotic treatment with neuroleptics following each relapse (33). Thus, as in neuroleptic refractoriness in the late (expression) phase of cocaine sensitization (76), it may be the emergence of new behavioral pathology and its associated biochemical alterations that impacts on the degree of pharmacological response. Consistent with this perspective are the data of Gelenberg et al. (17) and O'Connell et al. (39), who observed that patients with greater numbers of prior episodes of mood illness before instituting lithium prophylaxis were at higher risk for lithium nonresponsiveness than patients who had had only three or four prior episodes.

It is noteworthy that neuroleptics are able to block the development, but not the expression, of cocaine-induced behavioral sensitization (76). This implies that the sensitizing effect of repeated cocaine-induced behavioral alterations, or their associated biochemical changes, is capable of conveying neuroleptic nonresponsiveness. In a similar fashion, new episodes engendered following lithium or neuroleptic discontinuation could alter the neurobiological substrate in such a fashion that patients become less responsive.

The kindling paradigm offers an additional perspective on treatment resistance. During kindling evolution, not only do the neuroanatomy and biochemistry appear to evolve, as discussed above, but the pharmacology does as well. That is, each of the three major phases of kindling evolution (development, completed, and spontaneous) shows a differential pharmacology (51). Some drugs that are effective in one phase are not effective in another. For example, carbamazepine is ineffective in blocking the development of amygdala-kindled seizures, but it is highly effective against completed kindled seizures. Classical N-methyl-D-aspartate (NMDA) antagonists appear to show the opposite pattern. Diazepam is effective in the first two phases of kindling but does not appear to be useful in blocking spontaneous seizures, whereas phenytoin shows the opposite pattern (43). These data suggest the possibility that as recurrent affective syndromes evolve, they too might change their pharmacological responsivity, as outlined in Figs. 4 and 5 and discussed in the last section.

TOLERANCE EMERGENCE DURING LONG-TERM PROPHYLAXIS

Another route to treatment resistance is the development of tolerance to a previously effective prophylactic agent. We have noted a substantial incidence of loss of efficacy during long-term treatment with carbamazepine, either alone or adjunctively with lithium (45). Some loss of efficacy during long-term prophylaxis with lithium has been reported as well (34, 49). In assessing the possible routes of loss of responsivity to lithium in a group of 66 lithium-refractory patients referred to our clinical research unit because of treatment resistance, we observed that 45% had originally shown a good response to lithium but developed tolerance. Fourteen percent of the patients in this cohort developed lithium-discontinuation-induced refractoriness; that is, responsiveness was lost not during lithium treatment but following its discontinuation.

In our patients observed to show eventual loss of efficacy during carbamazepine prophylaxis, those with a more rapid progression of their illness in the 4 years prior to the institution of treatment appeared to be at greatest risk (45, 50). This may be consistent with observations in the preclinical model that animals kindled at higher currents (i.e., with greater pathological drive) develop tolerance faster than those kindled at threshold currents (Weiss et al., unpublished data). As discussed below, tolerance may be conceptualized either as a loss of drug efficacy or as an underlying pathological process remanifesting itself through an otherwise effective treatment modality.

CONTINGENT TOLERANCE

Loss of efficacy to the anticonvulsant effects of carbamazepine can also occur in some patients with seizure disorders, and tolerance to the antinociceptive effects of carbamazepine is a well-recognized problem in the long-term treatment of patients with trigeminal neuralgia. In this latter instance, carbamazepine shows an initial 80% to 90% response rate, but as many as 50% of patients will demonstrate some degree of loss of efficacy over time. A better understanding of the neurobiological aspects of tolerance development may allow for the generation and testing of new clinical approaches to both its prevention and reversal once it has become manifest.

To this end, Weiss and associates have begun to study the development of tolerance to the anticonvulsant effects of carbamazepine on amygdala-kindled seizures as a paradigm that may help unravel the different components of tolerance phenomenology, pharmacology, physiology, and biochemistry. Weiss et al. (71, 72, 74, 77) have elucidated a form of tolerance based on a unique pharmacodynamic, rather than pharmacokinetic, mechanism. In this instance, tolerance is not related to the mere presence of the drug in the organism but depends on the drug's being present during the episode being treated; that is, it is contingent tolerance.

Once the animal has been made tolerant to the anticonvulsant effects of carbamazepine by repeated administration of the drug before seizures occur, tolerance can be reversed simply by kindling the animal with no drug (or by continuing to give the drug daily, but only after the seizure has occurred); anticonvulsant efficacy is then reinstated when the drug is once again given before the kindling stimulation. Following further drug exposure, the animal will again become tolerant at approximately the same rate as initially observed. These data demonstrate that drug efficacy in this model can be manipulated at will, depending on whether the drug is given before or after the seizure has occurred.
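The logic of the contingent-tolerance paradigm (drug-before-seizure pairings build tolerance; seizures experienced drug-free, or with drug given only afterward, reverse it) can be caricatured in a few lines of Python. This is an interpretive toy model, not the authors' procedure: the tolerance counter, the threshold, and the all-or-none reset rule are invented purely for illustration.

```python
# Toy state model of contingent tolerance. Invented simplifications:
# a discrete exposure counter and a hard tolerance threshold.

class ContingentTolerance:
    def __init__(self, threshold=3):
        self.exposures = 0          # count of drug-before-seizure pairings
        self.threshold = threshold  # pairings needed to become tolerant

    @property
    def tolerant(self):
        return self.exposures >= self.threshold

    def trial(self, drug_before_seizure: bool) -> bool:
        """Run one kindling trial; return True if the seizure was blocked."""
        if drug_before_seizure:
            blocked = not self.tolerant  # drug works only if not yet tolerant
            self.exposures += 1          # each pairing pushes toward tolerance
            return blocked
        else:
            self.exposures = 0           # a seizure without prior drug resets tolerance
            return False                 # no drug on board: seizure occurs

rat = ContingentTolerance(threshold=3)
outcomes = [rat.trial(True) for _ in range(4)]  # efficacy, then tolerance: [True, True, True, False]
rat.trial(False)                                # one drug-free seizure reverses tolerance
assert rat.trial(True)                          # efficacy is reinstated
```

The model captures the key observation that efficacy can be "manipulated at will" by the ordering of drug and seizure, while deliberately ignoring dose, kinetics, and the graded nature of real tolerance.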

Animals made tolerant to the anticonvulsant effects of carbamazepine show a differential seizure susceptibility when tested in the drug-free condition, compared with animals that received equal drug treatment but did not become tolerant (i.e., those given carbamazepine after seizures had occurred). That is, tolerant animals have a lower seizure threshold (increased convulsive responsivity) than nontolerant animals. When tolerance is reversed by a period of kindling with no drug (or with drug given after kindling), the threshold returns to the baseline kindled state (72, 77). These data suggest that endogenous biochemical and physiological processes occurring during tolerance development render the animals more seizure prone in a long-lasting fashion.

Dr. Weiss and colleagues have begun to examine the possible neural substrates mediating this altered responsivity during carbamazepine tolerance. During the generation of amygdala-kindled seizures, a variety of biochemical effects in the hippocampus have been observed in our laboratory, as summarized below. When animals have become tolerant to the anticonvulsant effects of carbamazepine, some of the biochemical adaptations normally associated with kindled seizures are inhibited or fail to occur altogether. These biochemical changes are intimately associated with contingent tolerance: they are observed only when animals are treated with carbamazepine before kindled seizures occur, and not in animals treated with equal doses of carbamazepine after the seizure has occurred, a situation not associated with tolerance development. In these studies, animals were also matched for number of seizures.

Following amygdala-kindled seizures, increases in c-fos are observed in the granule cells of the dentate gyrus, as well as increases in the neuropeptide TRH, in benzodiazepine and GABA receptors, and in mineralocorticoid and glucocorticoid receptors. Kindled seizures also increase CRH and CRH-binding protein in the dentate hilar region (66). In animals that have been rendered contingently tolerant to the anticonvulsant effects of carbamazepine by repeated pretreatment, the seizures are no longer associated with the same degree of increase in c-fos, GABAA receptors, or mRNA for TRH (Weiss et al. and Clark et al., unpublished observations) (Fig. 6). Again, these biochemical alterations are selective to the contingently tolerant rats: noncontingently treated animals exposed to equal numbers of kindled seizures and drug doses (but given the drug after the seizures have occurred) do not show this failure to increase these indices.

To the extent that some of the changes following kindled seizures are endogenous compensatory adaptive processes (attempting to prevent or terminate the seizure process; i.e., endogenous anticonvulsant mechanisms), a failure to increase these substances during carbamazepine treatment could account for the loss of efficacy. As a major inhibitory neurotransmitter in brain, GABA is thought to exert its endogenous anticonvulsant effects through the GABAA receptor. Thus, failure to increase GABAA receptors during carbamazepine-induced contingent tolerance could be associated with the drug's loss of efficacy. Similarly, the neuropeptide TRH is thought to exert anticonvulsant effects when administered intrathecally (80).

Post and Weiss have observed, paradoxically, that animals given a period of 4 days or more off from seizures lose their ability to respond to carbamazepine (52). It is of interest that seizure-induced increases in TRH occur over a period of approximately 4 days. Thus, increases in TRH induction, as well as other intermediately lasting changes in neurotransmitters or receptors (such as the increases in GABAA and benzodiazepine receptors), remain candidates for the time-off-from-seizures effect. Whatever the precise mechanism of this effect turns out to be, it is clear that endogenous processes that have occurred in response to seizures enable the anticonvulsant effects of carbamazepine.

These data become all the more intriguing in relation to the mechanisms underlying the loss of efficacy of carbamazepine during contingent-tolerance development, as this process, too, is associated with a failure to increase GABAA receptors and TRH mRNA, as well as other variables. Thus, the contingently tolerant animal may, in part, resemble the animal in the time-off condition, in which increases in TRH, GABAA receptors, or some other endogenous mechanism are no longer apparent, and this loss or failure of adaptive response is associated with a loss of the anticonvulsant effects of carbamazepine on amygdala-kindled seizures. The time-off data are consistent with the formulation that some of the neurobiological changes associated with amygdala-kindled seizures represent secondary or compensatory adaptations (i.e., they are "good guys" attempting to provide an endogenous anticonvulsant mechanism), whereas others are a more primary part of the pathophysiological process of kindling (i.e., "bad guys" relating to kindling persistence or progression). Differentiating between the two may be of considerable import, as one would want to facilitate the good effects but inhibit the bad ones to maximize anticonvulsant efficacy.

This perspective raises the possibility that some of the biochemical alterations associated with the evolution of mood disorders could similarly be divided into those representing a part of the primary pathophysiological process ("bad guys") and those representing secondary compensatory attempts at endogenous psychotropic effects ("good guys") (52). For example, it is likely that sleep loss in depression represents a secondary adaptive response, as one night's sleep deprivation results in substantial amelioration of depression in 50% to 60% of severely depressed patients (63, 79). If sleep loss were part of the primary pathophysiology of the illness, further deprivation of sleep would be expected to exacerbate rather than ameliorate the depressive process. In a similar fashion, one would postulate that other neurobiological alterations in the illness may be similarly dichotomized. This raises the question of whether the postulated increases in TRH during depression, based on increases in CSF levels of TRH (7) and blunted TSH responses to intravenous TRH, are pathological or compensatory and adaptive. Obviously, such a distinction is of considerable importance in developing better therapeutics.

One prediction of the contingent tolerance model is that a period of time off medication may be associated with the renewal of therapeutic efficacy. Preliminary data in epilepsy (14), trigeminal neuralgia, and mood illness (41) based on small case series are consistent with this hypothesis but remain to be directly tested in prospective clinical trials.

It should be noted from the outset, however, that opposite predictions of the utility of a period of time off medications derive from the two phenomena described in this manuscript: discontinuation-induced refractoriness and contingent tolerance. In the case of a patient who is continuing to show adequate response, a period of time off medication may be deleterious (48). In contrast, for a patient who has already lost responsiveness to a given treatment, as in the contingent-tolerance paradigm, the current model holds out the possibility that renewed responsivity could occur (41). These formulations highlight the possibility that the history of drug responsivity (rather than the mere presence of the drug in an organism) could differentially affect treatment strategies in the treatment-resistant mood disorders. In the case of the well-maintained patient, drug discontinuation could result not only in relapse (68) but also in treatment resistance (48, 49), whereas the opposite could hypothetically occur in the patient who has already become tolerant (72). In addition, because there is some mechanistic specificity to contingent-tolerance development, at least for amygdala-kindled seizures (71), using a new drug with a different mechanism of action may not be associated with cross-tolerance. Unexpectedly, however, animals that demonstrated tolerance to the anticonvulsant effects of carbamazepine were also tolerant to the anticonvulsant effects of valproate (77). Because carbamazepine causes a failure of kindled seizures to up-regulate GABAA receptors during tolerance development, as discussed above, this relative loss of GABAA-receptor tone might provide a basis for the cross-tolerance to valproate.

It is also hoped that understanding the mechanisms underlying the development of contingent tolerance to carbamazepine's anticonvulsant effects might provide additional clues toward its prevention or reversal in pertinent clinical syndromes as well. In the contingent-tolerance paradigm described, all of the loss of efficacy of the anticonvulsant effects of carbamazepine is related to contingent administration of the drug; noncontingent drug administration (i.e., drug administered after each seizure has occurred) is not associated with any loss of efficacy compared with vehicle-treated controls. These data suggest that even in clinical situations with chronic drug administration, a contingent component of tolerance development may account for a greater degree of loss of efficacy than had previously been surmised. Moreover, if this proves to be the case, it would yield additional predictions regarding vulnerability to drug discontinuation effects. Chronic treatment with a given drug may be associated with long-lasting adaptations that change polarity during the drug discontinuation phase, leaving the patient more prone to illness emergence in this withdrawal period.

In addition to the classic drug withdrawal effects, the current analysis raises the possibility that an additional vulnerability could occur during the withdrawal period in patients who are contingently tolerant to the effects of a drug. That is, they have not only lost the primary effects of the drug, but they have also lost the illness-induced endogenous adaptations associated with tolerance development. These two liabilities may then combine to make the patient who has lost drug responsivity via contingent tolerance more vulnerable to withdrawal phenomena than a similarly treated patient who has not shown loss of efficacy or who never responded in the first place; such a patient would be vulnerable only on the basis of classic withdrawal effects. This proposition could be directly tested in patients with refractory seizure, mood, or anxiety disorders, where the prediction would be that patients who had lost efficacy through tolerance would be more prone to illness recurrence (e.g., more seizure prone in epilepsy) during the discontinuation phase than patients who had never shown response to the drug.

FURTHER PREDICTIONS OF THE SENSITIZATION AND KINDLING MODELS

As illustrated in Table 1, the sensitization and kindling paradigms, as they apply to the long-term course of mood disorders, yield specific predictions with long-term treatment implications. To the extent that the experience of episodes of mood illness impacts on gene expression and other mechanisms conveying a long-lasting vulnerability to recurrence, early institution of long-term maintenance treatment should have an impact not only on the course of illness but also, potentially, on drug response. A dual role for long-term prophylaxis is postulated. Patients undergoing prophylaxis would gain the benefit of the amelioration and prevention of affective episodes, and, if the current formulation proves correct, upon drug discontinuation (say, after a decade of treatment) they would be less vulnerable to recurrences than patients who received only intermittent acute treatment of multiple recurrent episodes as they emerged. Although such a study is technically feasible, it would be extremely difficult to perform in an ideal or systematic fashion at this time. Not only would the study be extremely expensive because of its duration, but even the initial randomization of patients to intermittent versus prophylactic treatment would raise ethical dilemmas, as would the period of drug discontinuation after a given number of years of successful prophylaxis.

Thus, less ideal designs are required to assess the possibility of a dual impact of prophylaxis in the prevention of episodes and the prevention of sensitization (i.e., increased vulnerability to subsequent recurrences). Clearly, the initial data of Gelenberg et al. (17) and O'Connell et al. (39) are consistent with the notion that a greater number of prior episodes before instituting prophylaxis is associated with a lesser degree of lithium response; however, a variety of other interpretations are possible, including the possibility that these patients were preselected for a more refractory illness to begin with. Nonetheless, one could make the conservative argument that even if earlier institution of treatment did not affect the long-term course of illness and its drug responsiveness, one is still better off for having successfully prevented the episodes with effective prophylaxis, and such a single benefit is worthwhile, even in the absence of a convincing demonstration of a dual benefit (i.e., the additional prevention of episode sensitization). Moreover, the current data regarding the impact of lithium, carbamazepine, and valproate on the illness, and the recent metaanalysis of Suppes et al. (68) showing that 80% to 90% of patients relapse following lithium discontinuation, provide a very strong empirical basis for the institution and maintenance of long-term prophylaxis, even in the absence of a demonstration of the sensitization factor. Alas, it is highly likely that both the clinician and the patient will be left to struggle with these risk-benefit formulations for a long time to come in the absence of highly convincing data about the long-term impact of prophylaxis on course of illness and treatment response.

EFFICACY OF NIMODIPINE IN ULTRAFAST MOOD OSCILLATIONS: IMPLICATIONS FOR CALCIUM-BASED MECHANISMS

In an ongoing double-blind, controlled trial, Pazzaglia and collaborators (42) found preliminary evidence of the efficacy of the L-type calcium channel antagonist nimodipine in reducing the amplitude and frequency of mood swings in patients with ultrafast mood oscillations. These included some patients with ultra-ultra-rapid (ultradian) cycling and one patient with recurrent brief depression, in whom responses to treatment were confirmed in a B-A-B-A design. Patients with more traditional and slower cycle frequencies did not appear to respond as often. Should these observations be confirmed in a larger patient sample, they would suggest that alterations in calcium flux through L-type channels could play a pathophysiological role in these ultrafast cycle frequencies.
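For readers unfamiliar with the design, a B-A-B-A trial alternates active-drug (B) and placebo (A) phases within a single patient, so that improvement must reappear each time the drug is reinstated. The arithmetic of such a within-patient comparison can be sketched as follows; every rating below is invented purely for illustration, and none of the values comes from the study cited.

```python
# Hypothetical B-A-B-A (on-off-on-off) single-case comparison of
# daily mood-instability ratings (higher = more unstable).
# All numbers are fabricated for illustration only.
from statistics import mean

phases = {
    "B1": [2.1, 1.8, 2.0],   # active drug
    "A1": [4.5, 4.9, 4.2],   # placebo
    "B2": [2.3, 1.9, 2.2],   # active drug reinstated
    "A2": [4.7, 4.4, 4.8],   # placebo reinstated
}

on = mean(phases["B1"] + phases["B2"])    # pooled on-drug mean
off = mean(phases["A1"] + phases["A2"])   # pooled off-drug mean
print(f"mean instability on drug {on:.2f} vs off drug {off:.2f}")
```

The strength of the design is that a relapse in each A phase and improvement in each B phase makes a chance or spontaneous-remission explanation much less plausible than a single before-after comparison would.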

In one patient whose significant but partial response to nimodipine was confirmed and reconfirmed in a B-A-B-A design, the addition of carbamazepine yielded further clinical improvement and attenuation of the frequency and amplitude of the ultradian oscillations. Thus, this patient, who was inadequately responsive to either carbamazepine or nimodipine alone, showed an additive effect when the two drugs were used in combination, achieving clinical remission (the absence of nurses' blind ratings of functional incapacity) for the first time in more than 4 years.

The efficacy of nimodipine in patients with ultradian mood fluctuations suggests that erratic fluxes in calcium channel regulation could underlie this type of chaotic mood dysregulation. The additive effects of carbamazepine and nimodipine, should they be observed in a larger series of patients, also raise the possibility that blockade of L-type calcium channels, in addition to the concurrent effects of carbamazepine on other neurotransmitter or ion channel systems, could provide the basis for the efficacy of these two agents in combination therapy. Of particular interest among the panoply of neurotransmitter, second-messenger, and ion-channel effects of carbamazepine (53) is the recent recognition that carbamazepine can block calcium flux through NMDA receptors, at least in cerebellar granule cells (Hough, Ragowski, and Chuang, unpublished observations). If this were the mechanism of carbamazepine-induced potentiation of nimodipine, similar clinical effects should be achievable with more traditional glutamate antagonists. It is also possible that effects of carbamazepine on entirely different mechanisms unrelated to calcium could be important to its potentiating effects with nimodipine.

Nevertheless, the efficacy of nimodipine alone, and possibly in combination with carbamazepine in patients with ultradian and chaotic mood fluctuations, does raise the possibility of important alterations in calcium dysregulation in these patients with ultrafast frequencies of mood disorder. Calcium oscillations appear to be fundamental aspects of neurobiology at various spatial and temporal domains ranging from molecular fluctuations in intracellular calcium and membrane-bound calcium involved in neurosecretion and various aspects of neuroplasticity to more global aspects of neuronal function (9, 58, 59). Clarifying the role for L-type and other types of calcium channel blockers in mood dysregulation may not only provide a new class of compounds for the therapeutic armamentarium (13, 23), but may also provide clues to some of the pathophysiological mechanisms involved in the neurobiology of treatment-refractory bipolar illness.

CONCLUSIONS

The kindling and sensitization formulations suggest the possibility that alterations in a series of transcriptional activating and suppressor factors could also occur, not by somatic mutation as in carcinogenesis, but by environmental and experiential impact (Fig. 1, Fig. 2, and Fig. 3), in addition to the more typically considered inherited genetic vulnerability. In the course of affective evolution, we have postulated that, based on alterations in experience-induced gene expression, there is sensitization both to stressors and to episodes themselves. In the kindling process, each apparently similar and behaviorally stereotyped occurrence of a seizure episode appears, nonetheless, to propel the process gradually toward autonomy (i.e., the spontaneous occurrence of seizures with their differential anatomy and pharmacology). If a similar phenomenon were found to occur in the different neural systems implicated in the mood disorders, then one would postulate not only that repeated episodes may propel the illness toward autonomy, but also that a differential pharmacology may exist as a function of stage-of-illness evolution (Fig. 4).

The preliminary data available to date are consistent with such a formulation (Fig. 5). For example, multiple studies have documented that lithium is less effective in rapid-cycling than in non-rapid-cycling patients (47). Although some patients show rapid cycling from the onset of their illness, this phase is often a late manifestation of the illness. Moreover, all of the patients we have seen with ultradian cycling illness have initially been inadequately responsive to lithium (18, 31). In rapid, ultrarapid, and ultradian cycling, the anticonvulsants carbamazepine and valproate have shown some success (44). Whether these agents are selectively more effective in this phase of the illness is unclear from the current data. A recent study by Okuma (40) suggests that carbamazepine, like lithium, is more effective in less rapidly cycling patients. Nonetheless, in the late rapid-cycling and ultrarapid-cycling phases of the illness, these newer mood-stabilizing drugs (carbamazepine and valproate), either alone or in combination with lithium, may prove to be effective. In addition, Ketter et al. (29) and Keck et al. (28) have demonstrated that, in some instances, patients will respond to carbamazepine and valproate in combination when they have been inadequately responsive to either agent alone. Moreover, some patients appear to respond to one anticonvulsant but not the other (44).

These data on differential responsivity among the anticonvulsants, and on their use in combination with or without lithium augmentation, raise the possibility that, as in cancer chemotherapy, patients in late phases of their illness, with severe manifestations of symptomatology and rapid cycling, may require complex treatment regimens that target multiple mechanisms of action. It is of interest that each of the drugs used in long-term prophylaxis (lithium, carbamazepine, and valproate) has been postulated to have multiple targets of drug action; that is, they are putatively "dirty" drugs. When these agents are used in combination, perhaps their multiple targets of action converge on the multiple neurobiological processes that are increasingly disordered in the most severely cycling patients. In this fashion, "cleaner" drugs might not necessarily be more effective treatments in late phases of the illness unless they were more precisely and multiply targeted to block the "bad guys" and assist the "good guys," as discussed above. However, it is also possible that endogenous antidepressant adaptations that are "good guys" for a depressive episode could (like tricyclic antidepressants) help precipitate the next manic episode and contribute to cycle induction. Mood stabilizers may then be unique in their ability to dampen or prevent both manic and depressive episodes by targeting both types of endogenous adaptations.

To the extent that the clinical loss of efficacy in long-term prophylaxis is mediated by processes parallel to those seen in contingent tolerance in the kindling model, there may be an associated failure of positive, illness-related endogenous adaptations to occur. In the case of such tolerance development, a sufficient period of time off medications could lead to renewed efficacy because of the reinduction of these illness-driven putative "good guys." Conversely, in lithium-discontinuation refractoriness, it is postulated that the primary pathological process of the illness is predominantly facilitated by the occurrence of a new episode, propelling the illness to a new stage (like metastasis in malignancy) that is no longer responsive to the previously effective medication. That is, in this instance of the ongoing battle between the pathophysiological illness process and endogenous adaptive compensations, the "bad guys" win. This battle may be taking place at multiple levels of the neuraxis, from ion channels in the membrane and effects on gene expression in the nucleus to more global changes in the balance within and between nuclei (e.g., amygdala versus hippocampus) or whole regions of the brain and its lateralized function.

The kindling and sensitization models suggest that this balance between pathological and compensatory mechanisms may be constantly evolving in a complex spatiotemporal pattern, providing multiple targets for conceptualizing new treatment interventions and for systematically testing some of the preliminary formulations offered on the basis of these indirect preclinical models. In this fashion, it is hoped that a more detailed understanding of the primary and adaptive neurobiological processes involved in the recurrent mood disorders will rapidly advance therapeutics. In addition, these formulations add a growing theoretical foundation to the already substantial empirical database indicating the importance of early institution and long-term maintenance of pharmacoprophylaxis in the recurrent unipolar and bipolar mood disorders. In addition to preventing the considerable morbidity and potential mortality of the illness, appropriate prophylaxis may prevent the "bad guys" from engaging the long-term biochemistry and microstructure of the brain and propelling the illness toward cycle acceleration, autonomy, and treatment resistance.

published 2000