I read with interest the article titled “Effect of maintenance hemodialysis on diastolic left ventricular function in end-stage renal disease” by Duran et al.1 in the October 2010 issue of this Journal and would like to offer some comments. The authors conducted an intriguing study to determine whether cardiac function changes after the initiation of hemodialysis via arteriovenous fistula. To better address this question, I believe several points warrant exploration: the duration of the follow-up period, the method used to detect change in the primary endpoint, and other factors that might have influenced the primary endpoint.
First, the length of the follow-up period may have played an important role in the neutral result of the study. Left ventricular (LV) diastolic dysfunction, which encompasses abnormal diastolic distensibility, delayed relaxation and impaired filling, may result from pathologic ventricular hypertrophy (which interferes with myofilament cross-bridge detachment), regional asynchrony caused by myocardial ischemia, or heightened cardiac afterload.2 LV stiffness also contributes to diastolic dysfunction and increases with LV mass.3 Considering the chronological course of LV hypertrophy (LVH), in which LV mass increases over time and leads to LV diastolic dysfunction, the length of follow-up is important. In a seminal article, Lo et al. searched for predictive factors associated with LVH in non-diabetic hemodialysis patients and prospectively identified several biochemical and physical parameters in a single cohort followed for two years. They found that systolic blood pressure, atrial natriuretic peptide, albumin level and hemoglobin level each predicted the LV mass index after the initiation of hemodialysis.4 Bajraktari et al. used tissue Doppler to assess LV filling pressure and identified reduced systolic myocardial velocity and advanced age as factors associated with increasing LV filling pressure over a median follow-up of 4.4 years.5 Losi et al. performed a cross-sectional study in a group of maintenance hemodialysis patients (hemodialysis duration typically 6–24 months) to investigate the relationship between LV diastolic dysfunction and myocardial echocardiographic backscatter; they observed that approximately 40% of the patients had evidence of diastolic dysfunction.6 In these studies, the duration of observation ranged from just over half a year to more than 4 years. Their results underscore the importance of the length of the follow-up period when identifying myocardial remodeling in end-stage renal disease patients undergoing hemodialysis. As the median follow-up in Duran's study was approximately 2 months,1 any myocardial change would be expected to be small, notwithstanding the possibility of a more subtle change in diastolic function even without an increase in LV mass.2
Second, the method used to detect LV diastolic dysfunction may have been inferior to other options. Echocardiography is inherently limited by the acoustic window and by its spatial resolution, and it is operator dependent.7 Its volume dependence further attenuates its diagnostic accuracy. Cardiac magnetic resonance imaging (MRI) is a newer, volume-independent modality that provides a detailed depiction of LV geometry and structural variation. Cardiac MRI has been validated in end-stage renal disease patients for both the detection of LVH and the quantification of LV mass, and this quantification correlates fairly well with cardiovascular outcome.8,9 Moreover, cardiac MRI can detect smaller changes in the myocardium. Myerson et al. examined a selected group of LVH patients and found that cardiac MRI accurately determined a statistically significant LV mass change (10 g) in approximately 80% of the patients in whom a corresponding change could be detected using 2-D echocardiography.9 In the field of diastolic dysfunction, there is surging interest in newer techniques, such as left atrial circulation transit time (LATT) and late gadolinium enhancement (LGE) sequences, to characterize diastolic dysfunction and myocardial fibrosis;7 good correlations with echocardiographic findings have been demonstrated. This imaging modality offers a promising means of earlier detection of diastolic dysfunction, especially in the small number of patients who exhibit differing degrees of LV pathology.
The inability to detect changes in LV, and even right ventricular, function in the study by Duran et al.1 may also have resulted, to a lesser degree, from less severe LVH in their patient population, which may reflect the lower frequency of angiotensin-converting enzyme inhibitor/angiotensin receptor blocker use. The pre-dialysis blood pressure of their patients was also relatively well controlled (average systolic/diastolic 135/83 mmHg), possibly attenuating an otherwise detectable LV change. In summary, the short follow-up period likely played an important role in the neutral result of Duran's study, and a more accurate method for the detection of LV diastolic dysfunction might have identified a significant difference even over the short term. As time passes, newer technologies will emerge that allow for earlier detection of even subtle abnormalities.