Data Integrity

Completeness

The first theme to emerge from data exploration was data completeness. Missing values were most prominent for timestamp variables such as

  1. the datetime the Office of Sponsored Projects was notified about the status of the award and
  2. the datetime the award was actioned by the Office of Sponsored Projects.

Timestamps are an integral data point because they allow us to track the lifecycle of an award as well as compute how long an award spends in any given phase of that lifecycle. Without timestamp data, it is difficult to gain a comprehensive understanding of our processes and improve them through a data-informed approach.
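The duration calculation described above can be sketched as follows. This is a minimal illustration, assuming hypothetical column names ("notified_at", "actioned_at") and toy dates rather than the actual fields in the awards dataset:

```python
import pandas as pd

# Toy data standing in for the awards dataset; column names are illustrative.
awards = pd.DataFrame({
    "notified_at": pd.to_datetime(["2023-01-03", "2023-02-10", None]),
    "actioned_at": pd.to_datetime(["2023-01-20", "2023-03-01", "2023-04-15"]),
})

# Days an award spent between notification and action. Rows with a missing
# timestamp yield NaN, so the phase duration simply cannot be computed.
awards["days_in_phase"] = (awards["actioned_at"] - awards["notified_at"]).dt.days
```

Note how a single missing timestamp makes the phase duration unrecoverable, which is exactly why the missingness documented below is so limiting.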

Researchers have stated that bias is likely in analyses with more than 10% missingness, and that with more than 40% missing data we can only postulate but draw no firm conclusions (Dowd et al., 2019). The visualization below shows that timestamp variables were missing at critical levels both before and after the COVID-19 pandemic.
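The 10% / 40% thresholds from Dowd et al. (2019) translate into a simple per-column missingness check. The frame below is toy data with an assumed column name, not the award dataset itself:

```python
import pandas as pd

# Toy data: half of the notification timestamps are missing.
df = pd.DataFrame({
    "award_id": range(10),
    "notified_at": [pd.Timestamp("2021-01-01")] * 5 + [pd.NaT] * 5,
})

# Percent missing per column, bucketed by the thresholds cited above.
missing_pct = df.isna().mean() * 100
flags = pd.cut(missing_pct, bins=[-0.1, 10, 40, 100],
               labels=["low risk", "bias likely", "no firm conclusions"])
```

In this toy example "notified_at" lands in the "no firm conclusions" bucket, which mirrors the situation described for the FY22-FY23 timestamp variables.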

The significant missingness of timestamp data meant my analysis would stop short of statistically significant conclusions and limited the possibility of finding definitive trends. Despite this, the subsequent analysis still sheds some light on our workflows and provides a starting framework for how this research might continue going forward.

Process efficiency

The figure below visualizes the typical time to complete each task from FY17-23. Across all fiscal years, subawards have consistently taken the longest to complete, with an expected completion time of around 50 days. It's worth noting that CEHD also processes about 50 subawards each year.

The figure below shows us a breakdown of process times for each workflow across all fiscal years.
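The "typical time per task" summarized in these figures amounts to a grouped median of completion durations. A minimal sketch, using made-up workflow names and durations rather than the FY17-23 data:

```python
import pandas as pd

# Illustrative task records; the real data would have one row per completed task.
tasks = pd.DataFrame({
    "workflow": ["subaward", "subaward", "proposal", "cost transfer"],
    "days_to_complete": [48, 52, 6, 7],
})

# Median completion time per workflow (robust to the long right tails
# typical of process-duration data).
typical = tasks.groupby("workflow")["days_to_complete"].median()
```

The median is a reasonable choice here because a handful of very slow tasks would drag a mean upward without reflecting the typical experience.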

Proposals

In order to reach GSU's goal of doubling its research expenditures within the next decade, one of the first milestones we must inevitably reach is increasing our number of proposals. The logic is that as the number of proposals increases, the number of awards will (hopefully) also tend to increase. However, the path to a proposal's acceptance begins long before it reaches the sponsor. Proposals must first be reviewed, approved, and submitted to the sponsor by the Office of Sponsored Projects and Awards (OSPA), ideally days before the deadline so that OSPA can conduct a diligent review.

The number of same-day submissions in CEHD peaked in FY19, with approximately 9.68% of proposals being submitted to the sponsor the same day they were received by OSPA. Soon thereafter, OSPA implemented a policy stating that proposals should be submitted five days in advance of the proposal due date to allow sufficient time for review. But did that policy have an effect?

At first glance, it appears the policy did help decrease same-day submissions, as evidenced by the decreasing red lines in the figure below; in FY23, the percentage drops to as low as 3.66%. However, recall that nearly 100% of timestamp data were missing in FY22 and approximately 50% were missing in FY23. Those records are therefore removed from our sample (as shown by the valueless dots in the visualization below). Given that the missing durations could be anything (including same-day submissions), it is impossible to draw any conclusions on whether we have made progress on this front.
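The same-day-submission rate above is computed only over records with complete timestamps, which is precisely why the missing FY22-FY23 data undermines the comparison. A sketch with placeholder column names ("received_at", "submitted_at") and toy dates:

```python
import pandas as pd

# Toy proposal records; one row has a missing OSPA receipt timestamp.
proposals = pd.DataFrame({
    "received_at": pd.to_datetime(["2019-03-01", "2019-03-02", None, "2019-03-04"]),
    "submitted_at": pd.to_datetime(["2019-03-01", "2019-03-05", "2019-03-03", "2019-03-04"]),
})

# Drop incomplete records, then compare calendar dates for same-day submission.
complete = proposals.dropna(subset=["received_at", "submitted_at"])
same_day = (complete["submitted_at"].dt.normalize()
            == complete["received_at"].dt.normalize())
rate = 100 * same_day.mean()
```

Because the denominator shrinks to only the complete records, a falling rate can reflect either real improvement or simply more missing data, which is the ambiguity described above.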

The fact remains that submitting proposals to OSPA is a requirement, and submitting them early is also necessary. If we are to meet our 10-year strategic goals, I believe we must first invest in an infrastructure that empowers PIs to submit proposals both early and often.

Subawards

Subawards are yet another contentious area of research administration. Best practice says that subawards should be set up within ~30 days. However, in FY23 the process took as long as 90 days in some instances.

Completing a subaward 90 days after the initial award setup is well beyond the 30-day best practice. When you factor in the fact that subawardees often begin their research well before the award is set up but cannot be reimbursed until it is, we get a greater sense of the compounding effects this bottleneck has on the system. Additionally, it is reasonable to assume that a process this inefficient can discourage both faculty and research administrators from considering GSU their home.
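Flagging subawards against the 30-day benchmark is a one-line comparison once setup durations are available. A sketch using an assumed duration column and toy values:

```python
import pandas as pd

# Illustrative setup durations in days; not actual FY23 values.
subawards = pd.DataFrame({"setup_days": [12, 28, 45, 90]})

# Flag anything beyond the ~30-day best-practice window and summarize.
BEST_PRACTICE_DAYS = 30
subawards["over_best_practice"] = subawards["setup_days"] > BEST_PRACTICE_DAYS
share_late = 100 * subawards["over_best_practice"].mean()
```

A simple share-over-threshold metric like this could be tracked per fiscal year to make the bottleneck visible as the data completeness improves.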

Cost Transfers

This analysis of GSU's research administration processes ends with cost transfers. One highlight in the data is that in FY23, CEHD recorded its lowest number of cost transfers: 34 in total. This decrease might suggest that our research administrators are getting better at catching budget discrepancies before it's too late. Interestingly, however, cost transfers also appear to be taking longer to complete (M = 7 days in FY23).

Despite the smaller number of cost transfers, they also appear to be taking longer to process over time. (Recall that, given the significant missingness of the data, we must be careful not to draw definitive conclusions.) If this trend is real, it may be because some PIs are reviewing their expenses less carefully, and the cost transfers that remain may be the most complex ones to process.