The introduction of the calcineurin inhibitors cyclosporine and tacrolimus into immunosuppressive regimens for kidney transplantation has been associated with substantial reductions in the incidence of acute rejection and a subsequent improvement in 1-year graft survival. However, these gains have not translated directly into improvements in long-term allograft survival. Immunosuppressive medications carry toxicities related directly to their immunosuppressive effects, which are similar among different agents, as well as other toxicities that are unique to each drug. Immunosuppressive minimization strategies have attempted to address both types of toxicity. Calcineurin inhibitors have been associated with chronic nephrotoxicity, and various calcineurin inhibitor-sparing strategies have been used to address this issue with the aim of improving long-term outcomes. However, there has been a paradigm shift over the past 10 to 15 years, with the appreciation that calcineurin inhibitor nephrotoxicity is not the major cause of late graft failure. Studies have now shown that chronic immune injury mediated by donor-specific antibodies may account for most late graft losses. Although some patients do benefit from calcineurin inhibitor-sparing approaches, others may experience late allograft loss from chronic and subacute immune-mediated injury. Unfortunately, the vast majority of calcineurin inhibitor-sparing studies have short-term follow-up and have not explored changes in the donor-specific antibody profile. One of the biggest challenges we face is distinguishing between patients who will benefit from these strategies and those who will not. In this review, we examine the various strategies used to limit or avoid the use of calcineurin inhibitors and address the benefits and pitfalls associated with pursuing such strategies.