Be Flexible...


Summary: The Ongoing Professional Practice Evaluation (OPPE) initiative required hospitals to review a range of data related to the quality and safety of the care provided by each clinician. As the developer, I worked with a team of clinicians to define a list of 11 metrics to measure provider proficiency.

In my experience, it is rare to be involved in a project from its earliest phase through production. The OPPE project offered me the opportunity to do just that. First, I met with the group of physicians and established which data they wanted to use to measure providers' performance for each metric. Then it was time to create the data model by digging through the database objects and establishing how each should be connected. Next, I started development, producing numerous test files for rounds of testing and validation for each of the 11 metrics. Once the metrics were approved, I developed and delivered the reporting solutions. Normally, this is where my role as developer would be significantly reduced. However, the team decided they wanted to use Epic-based dashboards to visualize the data instead of the more popular visualization tools. With no other resources available on the application teams, I decided to learn how to build the dashboards myself to keep the project from falling behind. Once they were live, I created comprehensive user documentation, established a troubleshooting and issue escalation protocol, and designed the dashboard access provisioning process. These dashboards are now widely used throughout the organization, provide valuable insight to clinicians, and support improved patient care.
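Proficiency metrics of this kind generally reduce to a numerator and denominator calculated per provider over a review period. As a rough illustration of the shape those queries took, here is a minimal T-SQL sketch of one such rate calculation; the table and column names (dbo.EncounterFact, dbo.ProviderDim, MedRecCompletedFlag) are hypothetical stand-ins, not the actual data model.

```sql
-- Minimal sketch of one OPPE-style metric: a per-provider completion rate.
-- All table and column names here are hypothetical placeholders.
DECLARE @ReviewPeriodStart DATE = '2023-01-01',
        @ReviewPeriodEnd   DATE = '2023-06-30';

SELECT
    p.ProviderID,
    p.ProviderName,
    COUNT(*) AS TotalEncounters,
    SUM(CASE WHEN e.MedRecCompletedFlag = 1 THEN 1 ELSE 0 END) AS MetricNumerator,
    CAST(SUM(CASE WHEN e.MedRecCompletedFlag = 1 THEN 1 ELSE 0 END) AS DECIMAL(9,4))
        / NULLIF(COUNT(*), 0) AS MetricRate
FROM dbo.EncounterFact AS e
JOIN dbo.ProviderDim   AS p
    ON p.ProviderKey = e.ProviderKey
WHERE e.EncounterDate BETWEEN @ReviewPeriodStart AND @ReviewPeriodEnd
GROUP BY p.ProviderID, p.ProviderName
ORDER BY MetricRate;
```

Each metric's result set fed the validation test files and, later, the dashboards, so getting the numerator and denominator definitions agreed upon with the physicians up front saved rework downstream.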

To me, one of the most pivotal aspects of development is problem solving. Being flexible allows you to take the blinders off, overcome obstacles, and find solutions outside the norm.



When the request is big, start small...


Summary: Handling all reporting obligations for the organization's transition from PeopleSoft to Workday.

Oftentimes, the scope of a project is so large that it is easy to feel overwhelmed. This was never more apparent than when my organization switched financial/HR systems from PeopleSoft to Workday. Faced with an extremely aggressive timeline and limited resources, another developer and I were tasked with handling all necessary reporting obligations for the implementation. This included, but was not limited to, impact assessments for hundreds of reporting assets, database queries for potential areas of liability, code changes, endless test files, and meetings and status updates for the larger team and our stakeholders. There were times, especially in the beginning, when it seemed the best option was to wave the white flag. Then I remembered one of the first lessons I learned while coding: start small. As a developer, it is important to take large tasks and break them down into manageable pieces. Once we employed this approach, our productivity increased and the project suddenly felt far less daunting. Although the project is still underway, we have completed the majority of the heavy lifting and are on pace for success.
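A large share of the impact assessment came down to finding every database object that still referenced legacy PeopleSoft structures. The sketch below shows one way to do that on SQL Server using the system catalog views; the PS_ filter follows PeopleSoft's table-naming convention, but treat it as an assumption rather than the exact query we ran.

```sql
-- Impact assessment sketch: list database objects whose definitions
-- reference legacy PeopleSoft tables. The 'PS_' pattern is an assumption
-- based on PeopleSoft's standard table-naming convention.
SELECT
    OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
    OBJECT_NAME(m.object_id)        AS ObjectName,
    o.type_desc                     AS ObjectType
FROM sys.sql_modules AS m
JOIN sys.objects     AS o
    ON o.object_id = m.object_id
WHERE m.definition LIKE '%PS\_%' ESCAPE '\'   -- escape matches a literal underscore
ORDER BY SchemaName, ObjectName;
```

Turning "hundreds of reporting assets" into a concrete, queryable inventory like this is exactly the kind of small first step that made the larger effort feel tractable.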



You're only as good as your data...


Summary: Backfilling the Data Warehouse

In my opinion, data integrity is the single most important aspect of data and analytics. The insights produced from bad data are just that: bad. Data integrity issues come in many shapes and sizes, ranging from human error to planned database table updates. Throughout my years as a developer, I have been asked to perform many backfills in order to keep our source of truth accurate, but one in particular sticks out. I was tasked with running multiple data extracts to update a downstream reporting data repository...for the last ten years. These extracts were very old and very large, capturing demographic, financial, and clinical information. I first went through the stored procedures used to produce the extracts, identifying areas for optimization and testing along the way to ensure the output stayed consistent. Next, I updated the SSIS packages by wrapping them in a For Loop container, enabling them to run without interruption. Finally, I created an automated job that kicked off the package early each morning and terminated it in time for the nightly ETL. Although this process took a long time to complete, it was incredibly satisfying because it ensured all data generated from the repository was accurate.
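The core pattern was simple: move the backlog in bounded chunks, and stop before the nightly ETL window opens. Here is a minimal T-SQL sketch of that idea; the table names (dbo.SourceExtract, dbo.ReportingRepository), columns, and cutoff time are hypothetical, and in practice the looping lived in the SSIS For Loop container described above.

```sql
-- Backfill in bounded batches, stopping before the nightly ETL window.
-- Table names, columns, and the cutoff time are hypothetical placeholders.
DECLARE @BatchSize  INT  = 50000,
        @Cutoff     TIME = '20:00',   -- leave headroom for the nightly ETL
        @RowsMoved  INT  = 1;

WHILE @RowsMoved > 0 AND CAST(GETDATE() AS TIME) < @Cutoff
BEGIN
    -- Copy the next batch of rows that have not been backfilled yet.
    INSERT INTO dbo.ReportingRepository (EncounterKey, PatientKey, ChargeAmount)
    SELECT TOP (@BatchSize)
           s.EncounterKey, s.PatientKey, s.ChargeAmount
    FROM dbo.SourceExtract AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.ReportingRepository AS r
                      WHERE r.EncounterKey = s.EncounterKey);

    SET @RowsMoved = @@ROWCOUNT;
END;
```

Keeping each batch small meant an interrupted run lost at most one chunk of work, and the time-based guard let the backfill coexist with the nightly ETL for as many days as the ten-year history required.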