Abstract
Accurate detection of antibodies against HLA and non-HLA antigens is critical for the long-term management of allograft transplantation, particularly in the context of hyperacute, acute, and chronic allograft rejection. Recent studies have identified a role for non-HLA antibodies, such as those against the angiotensin II type 1 receptor (AT1R), in transplant rejection. The enzyme-linked immunosorbent assay (ELISA) is the primary method for measuring AT1R-specific antibodies (AT1R-Ab), offering high specificity and reasonable sensitivity. Despite its widespread clinical use, some reports have suggested that pre-treating samples with latex beads can eliminate the detection signal in the CellTrend AT1R ELISA, raising concerns about false reactivity in the assay. In this study, we demonstrate that the bovine serum albumin (BSA) present in the Adsorb Out bead (AOB) buffer, even at a dilution of 10⁻⁶, plays a key role in signal elimination in the CellTrend AT1R-Ab detection kit. In addition, we evaluated the performance of the CellTrend kit and an in-house affinity-purified AT1R ELISA in detecting AT1R-Abs eluted from live cells using the adsorption crossmatch and elution (AXE) technique, which achieved a median elution efficiency of 30% on the CellTrend ELISA platform. Our findings indicate that the CellTrend ELISA kit accurately detects anti-AT1R antibodies that bind to the active form of AT1R. However, serum treatments containing BSA interfere with the antibody-antigen capture interface, leading to signal suppression.