Resume

My Resume as Word Document
My Resume as PDF

Current Position

Since 2019, I’ve been working for BYO Recreation. One of my first major accomplishments was spearheading their full migration into the cloud so we could retire the aging local server; that work turned out to be incredibly useful several months later, when the pandemic forced us to transition to working remotely. Another interesting project I built was a system that takes form submissions from all of our websites and evaluates them on both the submitted content and the metadata about the submission, so that we can filter out spam.
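The scoring details are specific to our sites, but the core idea is a weighted score built from content checks and metadata checks. A minimal sketch, where every rule, weight, and the threshold is made up for illustration:

    <?php
    // Sketch: weighted spam score combining content and metadata signals.
    // All rules, weights, and the threshold below are illustrative.
    function spamScore(array $submission): int
    {
        $score = 0;

        // Content signals: what the submission says.
        if (substr_count($submission['message'], 'http') > 2) {
            $score += 3; // Spam tends to be stuffed with links.
        }
        if (preg_match('/\b(backlinks|SEO|crypto)\b/i', $submission['message'])) {
            $score += 2;
        }

        // Metadata signals: how the submission arrived.
        if ($submission['secondsToSubmit'] < 3) {
            $score += 4; // Humans rarely complete a form this quickly.
        }
        if (empty($submission['referer'])) {
            $score += 1; // A POST with no referring page is suspicious.
        }

        return $score;
    }

    $submission = [
        'message'         => 'We sell backlinks! http://a http://b http://c',
        'secondsToSubmit' => 2,
        'referer'         => '',
    ];
    var_dump(spamScore($submission) >= 5); // bool(true): filtered as spam.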

Previous Positions

From 2016 to 2019, I worked for ERT, and I spent that time working on Drupal 7 & 8, and Symfony 3 & 4, for Drought.Gov. I’ll go ahead and note up front that all the opinions I have are my own, and do not interfere with my ability to do my job.

Beyond the various technologies I used, like GeoJSON, I did a lot more cross-application programming. On the email notification system for drought alerts, the geospatial analysis was handled in Python by another programmer, and we developed in tandem. The website worked the same way, with Drupal feeding JSON into a Symfony application that cached the data and pieced together the final webpage from multiple Drupal entities.
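As a rough illustration of that Drupal-to-Symfony handoff, a service like the following could fetch entity JSON and cache the assembled page. Everything here is a sketch: the class name, endpoints, and TTL are invented, and the cache and HTTP contracts shown arrived late in the Symfony 4 series.

    <?php
    // Sketch: pull Drupal entities as JSON and cache the assembled page.
    // Class name, endpoints, and the ten-minute TTL are all illustrative.
    use Symfony\Contracts\Cache\CacheInterface;
    use Symfony\Contracts\Cache\ItemInterface;
    use Symfony\Contracts\HttpClient\HttpClientInterface;

    class PageBuilder
    {
        private $http;
        private $cache;

        public function __construct(HttpClientInterface $http, CacheInterface $cache)
        {
            $this->http = $http;
            $this->cache = $cache;
        }

        public function build(string $slug): array
        {
            return $this->cache->get("page.$slug", function (ItemInterface $item) use ($slug) {
                $item->expiresAfter(600); // Re-fetch from Drupal every ten minutes.

                // The final page is stitched together from several Drupal entities.
                return [
                    'page'   => $this->http->request('GET', "https://cms.example/api/page/$slug")->toArray(),
                    'blocks' => $this->http->request('GET', "https://cms.example/api/blocks/$slug")->toArray(),
                ];
            });
        }
    }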

Beyond the technical complexities of the work came my move to another state, so that my wife could take a better job. Switching to 100% telework was a massive transition for me. The team had been split across two time zones since long before I joined, but you never really realize how much information you pick up through normal office chit-chat until you go without it for a few months.

Turning Back The Wheel of Time

At Kimmel and Associates, I grew a lot as a programmer: I picked up real-world experience with a different programming language (Ruby) and used several new frameworks (Laravel for PHP, and Ruby on Rails). Probably the best thing that happened there was getting past my old, irrational dislike of frameworks.

Construction Jobs was quite fun. As a team of one, I built a replacement for their old website, including a very carefully crafted migration script of my own design to preserve all the data from the old job board system. After the website launched, I spent over another year upgrading it with a never-ending list of new features. I had a lovely boss who could clearly express the features they wanted, and while priorities would shift, sometimes daily, there was never any pressure to work long into the night.

Building the website was pretty neat. We used a service to parse job seekers’ resumes into individual pieces of data, so hiring managers had the option of seeing everyone’s resume in a consistent format, which made it easier to compare candidates. It also meant that job seekers only had to review their information after uploading a resume, rather than typing it all in again.

The resume parser we used had a SOAP interface and returned quite extensive XML that, beyond the extracted information itself, included a confidence score for each piece of data. That let us flag the fields we wanted to make sure a human reviewed. Having every resume split into structured fields also made our resume search, backed by Solr, quite powerful: employers could search by job title, education, years of experience, and so on.
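The vendor’s schema was far larger than this, but the confidence-score check reduces to something like the following sketch; the XML shape, the 0–100 scale, and the cutoff are all stand-ins:

    <?php
    // Sketch: flag low-confidence fields from a parsed-resume response.
    // The XML shape, 0-100 scale, and 60-point cutoff are illustrative.
    $response = '<resume>'
        . '<field name="job_title" confidence="92">Project Manager</field>'
        . '<field name="years_experience" confidence="41">7</field>'
        . '</resume>';

    $xml = simplexml_load_string($response);

    $needsReview = [];
    foreach ($xml->field as $field) {
        if ((int) $field['confidence'] < 60) {
            $needsReview[] = (string) $field['name'];
        }
    }

    // $needsReview is ['years_experience']; highlight those inputs so the
    // job seeker confirms them before saving.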

The most frequently requested feature that I handled was automatically importing job postings from a construction company’s website.

Very rarely would the careers section have anything like a feed, so 90% of the time it was up to me to build a web scraper that assembled a list of all the jobs and worked out whether each one matched an existing record to update or was a brand-new posting. There was also the matter of sorting out which jobs were no longer posted, so that we could remove them from our website.
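Once a scraper produced the current list, the sync step was essentially a diff against what we had on file. A minimal sketch, assuming postings are keyed by their source URL (one plausible choice; the matching logic varied by site):

    <?php
    // Sketch: diff freshly scraped postings against stored records.
    // Keys are source URLs; values would really be full job records.
    $scraped = [
        'https://builder.example/jobs/1' => 'Estimator',
        'https://builder.example/jobs/3' => 'Site Foreman',   // New posting.
    ];
    $existing = [
        'https://builder.example/jobs/1' => 'Estimator',
        'https://builder.example/jobs/2' => 'Laborer',        // Since removed.
    ];

    $toCreate = array_diff_key($scraped, $existing);      // Job 3: add it.
    $toUpdate = array_intersect_key($scraped, $existing); // Job 1: refresh it.
    $toRemove = array_diff_key($existing, $scraped);      // Job 2: take it down.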

Beyond the complexities of programmatically mimicking a user visiting a website, a big piece of the work was detecting problems and making sure I was alerted, so I could correct my code to compensate for changes companies made to their websites.
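The single most useful alarm was a sanity check on the scrape itself: if a run suddenly found nothing, or far fewer jobs than last time, the site’s markup had probably changed. A sketch of that idea, with an invented threshold and notification address:

    <?php
    // Sketch: abort the sync and alert a human when a scrape looks broken.
    // The 50% drop threshold and the email address are illustrative.
    function checkScrape(string $company, int $found, int $lastRun): void
    {
        if ($found === 0 || $found < $lastRun * 0.5) {
            mail(
                'dev@example.com',
                "Scraper anomaly: $company",
                "Found $found jobs; the previous run found $lastRun."
            );
            throw new RuntimeException("Refusing to sync $company; scrape looks broken.");
        }
    }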