Back on my RBM Soapbox

Time for another installment of Nadia’s RBM Soapbox

More from TransCelerate this week on the Risk-Based Monitoring (RBM) Methodology.  Check out the July update here.

As an industry, we absolutely need more conversations about quality, oversight, completeness, and appropriate monitoring expectations.  So don’t get me wrong, I love dialogue on this and many other topics. Join the conversation in the comments, shoot me an email, or join our free and open discussion group, ClinOps Toolkit, on LinkedIn.

RBM Soapbox: Challenging the Value of SDV

Every time I read about their so-called “challenging the value of SDV” initiative, I find it incomplete. I have not yet heard anyone suggest measuring rates of data entry in the days leading up to a monitoring visit, during the visit, and in the days following. They keep focusing on query rates, which are just one measure of data completeness and quality; I can open queries all day long that do nothing to improve quality. Having the monitor on-site is a carrot-and-stick approach that gets study staff to actually enter data into the EDC correctly and in a timely fashion. Only then can we talk about centralized statistical monitoring and meaningful remote analysis. The absence of any review of data-entry cycle times is a glaring omission from their analysis.

RBM Soapbox: Does source data verification (SDV) protect subject safety?

To the degree that unreported adverse outcomes and accurate, precise patient characteristics, responses, and measurements are located in the source and then reconciled with or added to the database when missing or incorrect, sure. SDV alone does not create quality; it is the gap review, the questions about missingness, and the clarifications obtained (which get entered into safety reports and the database) that make the monitor’s SDV process so valuable. We can’t remotely monitor data that simply hasn’t been entered yet.

I still maintain that the unique value monitors bring while on site is a cooperative auditing approach to monitoring: reviewing for negative trends and helping the site avoid important protocol deviations. The simple “does this match that” exercise is really not what I am paying my monitors for, and never was. The trending piece is becoming easier to do remotely, as is the reinforcement of protocol training. The auditing piece, though — confirming appropriate investigator oversight, verifying that authority has been delegated appropriately and only to trained, qualified persons, and checking that records are complete, attributable, and organized — these items are still easier to do in person.

Don’t get me wrong, I love remote monitoring

I support increased vigilance and review of data, both in aggregate blinded fashion and for individual patients. However, I still want my monitors on site within two weeks of the first subject enrollment, and I still want phone calls to reinforce protocol training when certain thresholds of time have passed between screening/enrollment activities. I want monitors there to review IP storage and record-keeping, check supply levels, cheerlead and encourage quality patient recruitment, and reinforce the goals of the trial. I am counting on my trained monitors to perform eligibility and compliance checks so that enrolled subjects actually end up in the analysis set.

We can have RBM, Just Not in Lieu of Real Quality On-site Visits

OK, climbing down off of my RBM Soapbox now for closing thoughts.  In short, RBM metrics and “risk indicators” should not be used to justify canceling monitoring visits or limiting this important oversight activity. Regular on-site monitoring visits, conducted according to plan, remain imperative to confirm data quality and ensure subject safety.

Please join the conversation below in the comments, at one of San Francisco Bay Area ClinOps Toolkit Meetup events, or in the LinkedIn ClinOps Toolkit discussion group.  Are you an RBM fan or do you have your own right-sized RBM Soapbox?

About The Author


Nadia Bracken, lead contributor to the Lead CRA blog and the ClinOps Toolkit blog, is a Clinical Program Manager in the San Francisco Bay Area.


  • This is a great discussion. Nadia, I think the question is what is done at the onsite visits. If the only things accomplished are checking transcription and copying a stack of documents, those are not value-added activities.

    We, as an industry, communicate remotely with many of our collaborators and colleagues. Once you build a relationship with the sites, you can maintain that relationship remotely–as we all do with our colleagues across the world.

    When we have changed the dialog with the sites to focus on areas where they need support, we have been very effective and had great response from the sites. When you can identify issues early and prevent duplication of errors, you provide real value to the sites.

    Supporting onsite monitoring visits is expensive for the sites. One site told me that if all their monitoring visits were held remotely, they could have 30% more revenue-generating space. If the Study Coordinators have to spend an entire day pulling documents for the monitor, that is not an effective use of site resources. With regular calls with the coordinators and PIs that focus on actual site-performance issues, everyone wins.

    Many monitors worry that you cannot do effective oversight without SDV. The beauty of the FDA’s RBM guidance is that it recommends Sponsors and CROs focus on many other quality measures of site performance. If SDV is taken off the table, more time can be spent on the other important issues (protocol compliance; IP dosing, administration, and management; and patient safety) that get shortchanged because the monitor is spending all the time on SDV. In addition, by doing more work remotely, monitors can be more accessible to the sites because they are not on the road, and they can have a better quality of life.

    Sheetz et al. just published a paper on SDV: less than 4% of data ever changed in their evaluation of 1,168 trials, and only about one third of that was related to SDV. So expanding the focus of quality review to include the other indicators will enhance the overall quality of the trials.

    Good relationships with the sites and support of the sites is critical to the success of the study and success of the products post-launch–but it doesn’t have to be onsite in many/most situations. When the monitor IS onsite, the goal will not be SDV, but having data-driven discussions with the staff to improve the study and site function.

  • Jason

    August 11, 2014

    I liked reading all your perspectives; I’m so glad RBM-related dialogue is starting industry-wide. As someone who has monitored in the field off and on for ten years after working in other industries, I find myself wondering why research sites require the “carrot-stick approach” you mentioned to deliver the basics.

    I know of no highly regulated industry besides clinical trials where the work has such a high level of QA checks and queries (re-work), and where professionals require so much help during the start-up of projects. It’s even more surprising given that protocols are executed under the supervision of the most highly trained and experienced experts in the field (Principal Investigators). I realize that multi-center trials will always require consulting from a high-level perspective, but I sincerely hope that in the future research site staff will begin to function similarly to their peers in other fields. Accountants, engineers, architects, etc. assume 100% responsibility for their work, with very infrequent visits from regulatory authorities to ensure accountability.

  • You’ve made some excellent points here. I just published a blog post about one of them – the interpretation of the phrase SDV. You touched on that when you said, “The simple ‘does this match that’ exercise is really not what I am paying my monitors for, never was.” Here’s a link:

    I think it’s quite possible to support the RBM philosophy without throwing on-site monitoring out with the bath water.
