Degreed meteorologist with over 15 years of professional software development experience architecting and building database, desktop, embedded, cloud, mobile, and web applications. I am experienced in the full development life cycle, object-oriented and functional programming, and research and development; I am a strong communicator and work well across multiple projects.
As a Research Associate for the Cooperative Institute for Severe and High-Impact Weather Research and Operations (CIWRO) and the National Severe Storms Laboratory (NSSL), I led the effort to transition the Warn-on-Forecast System (WoFS) from an on-premises, hardware-specific application to a scalable, platform-agnostic cloud application (Cb-WoFS). WoFS is a high-resolution numerical weather prediction (NWP) model consisting of 36 individual members that assimilate conventional and remote-sensing observations, radar reflectivity and velocity, and satellite data every 15 minutes and generate forecasts every half hour. Forecasts are post-processed in near-real time, and graphics are generated for forecasters to review during a severe or high-impact weather event. Transitioning WoFS to the cloud required deep familiarity with WRF-ARW, Fortran, and GSI. The complete solution included a C# .NET 5 web app integrated with Azure Batch for managing HPC resources and the WoFS workflow; containerization of WoFS (including WRF, GSI, and other tools) and of the Python post-processing apps; continuous integration and continuous delivery pipelines for automatic builds and publishing; Terraform scripts for cloud resource build-up and tear-down; and integration with the following Azure services: Batch, Storage, Queue, Cosmos DB, Container Registry, App Service / Functions, CDN, and Azure Active Directory.
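The cycling cadence described above (assimilation every 15 minutes, a new forecast every half hour) can be sketched as a simple schedule generator. This is a minimal, hypothetical illustration of the workflow's timing, not the actual Cb-WoFS code; all names are invented.

```python
from datetime import datetime, timedelta

def build_cycles(start, end, da_interval_min=15, fcst_interval_min=30):
    """Illustrative sketch (not Cb-WoFS itself): enumerate the cycling
    schedule, where every cycle assimilates observations and every
    other cycle also launches an ensemble forecast."""
    cycles = []
    t = start
    while t <= end:
        elapsed_min = int((t - start).total_seconds()) // 60
        cycles.append({
            "time": t,
            "assimilate": True,                              # DA runs every cycle
            "forecast": elapsed_min % fcst_interval_min == 0,  # forecasts every 30 min
        })
        t += timedelta(minutes=da_interval_min)
    return cycles

# A one-hour window yields five DA cycles, three of which also launch forecasts.
sched = build_cycles(datetime(2021, 5, 4, 21, 0), datetime(2021, 5, 4, 22, 0))
```

In the real system each forecast-launching cycle would fan out to the 36 ensemble members as Azure Batch tasks; here the schedule alone conveys the cadence.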
My role as a Senior Software Developer Consultant in KiZAN’s Software Development group focused on creating software for clients across many different industries. Recent projects at KiZAN have included ASP.NET MVC web applications implementing a service-oriented architecture via WCF services, using NHibernate for data access, AutoMapper for object mapping, and StructureMap for dependency injection. A recent project for a mid-size manufacturer in Indiana involved developing an optimization algorithm in F#, with an ASP.NET MVC web front end and a SQL database for saved optimizations. In another project, I developed a mobile application for a nationwide field service company that was deployed to the iOS and Android app stores. The application was built with Telerik’s NativeScript, which provided a common codebase while still allowing custom platform-specific module development.
The CLOUDMAP project is focused on the development and integration of unmanned aircraft systems with sensors for atmospheric measurements. As a Research Assistant for the University of Oklahoma’s Center for Autonomous Sensing and Sampling (CASS), I assisted in UAV flights, sensor testing and placement, and MATLAB script development. I also learned the open-source Paparazzi project and developed flight plans for CLOUDMAP’s Small Unmanned Meteorological Observer (SUMO) UAVs. In 2017, I architected a solution for sending sensor data to the cloud, which could then be processed and live-streamed to the CLOUDMAP web application and any other subscriber interested in live atmospheric data. The solution involved custom development on the open-source ArduPilot platform (C++), custom MAVLink messages, and multiple Azure cloud services, including Event Hubs, Functions, Storage (NoSQL), and Service Bus.
SevereStreaming is the backbone of SevereStudios.com, a site where verified storm chasers can stream live video to the web. As the Lead Developer of SevereStreaming, I redesigned the streaming platform and moved all components to the cloud. The solution required building a web API that lets clients and partners quickly download chasers’ live-stream information, as well as multiple virtual machines for handling server load, automatic edge-server creation, load-balancer configuration, and chasers’ video thumbnails. The stack included ASP.NET Web API / C#, PHP, PostgreSQL, Amazon S3, AWS load balancers, and Debian-based virtual machines hosted in EC2.
During the Fall 2017 semester, I worked as an intern with the Storm Prediction Center to develop a Python web application for forecast verification. The web application analyzed and compared forecast areas with actual storm reports using popular GIS Python packages. The results were then saved to a PostgreSQL database and plotted with HTML5-based graphing libraries.
As a developer for the Sales Compensation team, my role consisted of building a new solution for Heartland's extensive payroll process. The solution utilized Microsoft SQL Server 2008 and Entity Framework 5; ASP.NET MVC 4, Web API for REST, and WCF for SOAP-based communication with other Heartland services; and HTML5 with extensive use of modular, AMD-compliant JavaScript for client-side features and communication with Web API.
Working for the leader in healthcare subrogation services, I was part of a large development team building internal tools for our auditors using WCF Services and Silverlight. The software we created was used by our analysts to help visualize and mine terabytes of data using the latest technologies, such as the Entity Framework, Microsoft SQL Server 2008 and the .NET Framework 4.0.
In early 2005 I was hired as a Software Developer for one of the largest, most influential churches in the country, with a membership base of over 28,000 and a staff of 350 employees. I was given the responsibility of creating numerous ASP.NET, Windows Forms, and Windows Services projects with C# and VB.NET, implementing Microsoft SQL Server 2000–2008 for most database solutions.
Provide custom software development to various clients across industries. My solutions include web, database, and client development to serve the needs of each client. In a recent project for a mid-size company in downtown Oklahoma City, I developed a solution for the client to control their outdoor sign display from a web portal. The project involved an Arduino with network access that communicated with a small ASP.NET Core web app running on CentOS. The Arduino would execute various light displays in response to the configuration created in the web app.
Transcripts available upon request.
The goal of this project was to build a network of autonomous UAVs that could sample the atmosphere at various intervals, as well as remotely on demand. Data would then be processed in the cloud and streamed live to interested parties, such as the local National Weather Service forecast office. This project was the topic of an AMS presentation on Tuesday, January 9, 2018.
During 2017, I architected a solution for sending sensor data to the cloud, which could then be processed and streamed live to any registered subscriber (via AMQP). In addition, I developed the WxUAS portal, where any observer can watch in near-real time as our UAVs ascend and descend through the atmosphere. The WxUAS portal plots pressure versus temperature, dew point, and flight path. You can view previous sessions (such as 10/6/2018, for example), or, if you log on at the right time, perhaps catch a live flight!
The solution involved custom development on the open-source ArduPilot platform (C++), custom MAVLink messages, and multiple Azure cloud services, including Event Hubs, Functions, Storage (NoSQL), and Service Bus. The WxUAS portal was built on .NET Core and uses WebSockets to stream live atmospheric sensor data to connected clients.
The Storm Prediction Center issues daily convective outlooks for the entire continental United States, breaking down severe weather risk into three categories: tornado, hail, and wind. Probabilities (2%, 5%, 10%, 15%, 30%, or 45%) of experiencing these severe weather events within 25 miles of any given point are attached to each forecast. These forecasts are then verified after all severe weather reports have been received in order to evaluate forecast accuracy.
The current forecast verification scheme plots the outlook onto a latitude–longitude grid with 80-km spacing, with each grid cell then assigned its respective outlook probability. This map is overlaid with verified storm reports, which are then assigned to their respective grid cells. This data is analyzed to determine the accuracy of the individual probabilistic outlooks. Unfortunately, this forecast verification scheme may not reveal the complete picture.
During the Fall 2017 semester, I developed a Point-Based Convective Outlook Verification App for the SPC. This app analyzed the May/June 2017 outlook and report archive by overlaying the reports and probabilistic forecast regions, generating forecast accuracy analytics and saving them to a PostgreSQL database. The statistics were generated by overlaying a 25-mi circle on each storm report and crediting the report to every probabilistic region the circle overlapped.
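The 25-mi overlay step can be illustrated with a short distance check. This is a hypothetical sketch of the point-based scheme: the function names and the region representation are invented, and the real app used GIS Python packages rather than a hand-rolled haversine.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MI = 3958.8  # mean Earth radius in statute miles

def haversine_mi(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in statute miles."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * asin(sqrt(a))

def verified_regions(report, region_points, radius_mi=25.0):
    """Hypothetical sketch: a storm report (lat, lon) verifies every
    probabilistic region that has at least one sampled point within
    radius_mi of the report -- i.e., the report's 25-mi circle
    overlaps that region."""
    hits = set()
    for prob, pts in region_points.items():
        if any(haversine_mi(report[0], report[1], lat, lon) <= radius_mi
               for lat, lon in pts):
            hits.add(prob)
    return hits
```

Crediting a report to every overlapped region (rather than only its host grid cell) is what distinguishes the point-based approach from the 80-km gridded scheme described above.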
The application gives the user the ability to browse historical verification results with interactive maps and graphs, re-analyze with user-defined variables (such as the 25-mi radius, dates, and outlook times), and generate accuracy reports over specific timeframes. The project was built with Python and various open-source tools.
Severe Studios is a platform where verified storm chasers can live stream their chasing adventures to the web. Users can freely keep tabs on active chasers via the Live Storm Chasing page, virtually experiencing the severe weather from their desktop or mobile device.
I joined the SevereStreaming team in 2013, rebuilding the streaming platform from Windows Media Services to a cloud-based service capable of live streaming to all the major client platforms and adjusting to high demand during severe weather days. An API was developed for business-partner usage, as well as a Windows Store app complete with a chaser map and NWS-issued products.