Hi! I'm Sean Kent, a first-year Ph.D. student at the University of Wisconsin - Madison. I completed my Bachelor of Science in Mathematics at the University of Notre Dame in 2017. My research interests are still broad, but I have an eye toward machine learning and its applications.
I'm currently studying as a Ph.D. student in the Department of Statistics, with the hope of also completing a graduate minor in Computer Science. I'm only a first-year now, so stay tuned for more updates on my research interests, interesting courses, and other information!
I completed my Bachelor of Science in Mathematics, with a minor in actuarial science, at Notre Dame in 2017, but that hardly tells the whole story. I took courses ranging from honors general education courses, including Philosophy, History, and English, to challenging electives like Machine Learning, graduate-level Econometrics, Investment Theory, and Topological Data Analysis. In my spare time, I was an active participant in my residence hall community, serving as its representative in the Student Government Senate and running the hall's pizza restaurant, Keough Kitch.
[Madison, WI] Currently facilitating and leading discussions in 2 sections of Introductory Applied Statistics for the Life Sciences
[Madison, WI] Facilitated and led discussions in 3 sections of Introductory Statistics for Engineers, totaling 143 students. Reviews pending.
[Skokie, IL] Working under Paynet's Lead Statistical Modeler and Senior Economist, my major contributions were twofold. First, I researched and explored the potential of machine learning models in the heavily regulated commercial lending space. This project involved understanding the theoretical framework, best practices, and practical implementations of several models (neural networks, random forests, gradient boosting, and elastic net), as well as testing these models in Python as a comparison to one of Paynet's current products. This brought the entire modeling team up to speed on the most popular machine learning techniques and showed we could ensemble a few different models to quickly benchmark the performance of the interpretable models currently in use. The latter portion of the summer was spent creating tools for analyzing the past performance of our AbsolutePD model, both to serve as guidance for AbsolutePD v2 and to set up a framework for evaluating future model performance. Near-complete automation was achieved by embedding SQL queries in an R script, allowing an evaluation of 4 key metrics across hundreds of categories and cross-categories. This SQL/R integration saved countless hours of manually running similar SQL scripts and could speed up many other processes in the Analytics department.
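The actual work embedded SQL in R, but the pattern of looping a parameterized query template over many categories generalizes; here is a minimal Python/sqlite3 sketch of the idea, with a hypothetical `loans` table and a single default-rate metric standing in for the real database and the 4 proprietary metrics.

```python
import sqlite3

# Hypothetical in-memory loan table standing in for the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (industry TEXT, region TEXT, defaulted INTEGER)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("retail", "midwest", 1), ("retail", "midwest", 0),
     ("construction", "south", 1), ("construction", "south", 1)],
)

def default_rate_by(conn, category):
    """Compute one metric (default rate) grouped by one category column.

    The column name comes from a fixed whitelist, so one query template
    can be reused across every category without hand-editing SQL.
    """
    assert category in {"industry", "region"}
    query = f"SELECT {category}, AVG(defaulted) FROM loans GROUP BY {category}"
    return dict(conn.execute(query).fetchall())

# Looping this over every category (and metric) automates what would
# otherwise be many manually run, near-identical SQL scripts.
rates = default_rate_by(conn, "industry")
```

The same loop extends naturally to cross-categories by grouping on pairs of columns.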
[Notre Dame, IN] Tucked away on the first floor of Keough Hall lies a small yet bustling pizza restaurant fondly called Keough Kitch. My good friend and I took over ownership in 2015. Combining a data-driven approach with a revamped menu and marketing strategy, we drove sales above any prior level for which we had data and revived the community surrounding the restaurant to its former glory. This was all possible because of the excellent team we hired and managed, Excel workbooks containing dashboards where we could see key performance indicators at a glance, and superb restaurant knowledge for our age. I primarily handled the design and upkeep of the Excel workbooks, the planning and execution of marketing promotions, and all financial aspects, but having full responsibility for the restaurant led to plenty of other duties along the way.
[Skokie, IL] In my first stint at Paynet, I came in with no professional experience and a moderate understanding of Excel. By the time I finished, I had made significant contributions to a project that predicts the probability of default for any industry and location combination, become comfortable coding in R and SQL, and developed an advanced understanding of Excel. This evolution was all possible due to a passion for learning and an excellent boss. He started me off reading an academic paper that underpinned the model he was building, and soon I was documenting and assuring the quality of his R code. Because of my familiarity with his code, and because my boss was swamped with work, I finished writing his code and did the majority of the analysis of results for our probability of default model, which was based on a hierarchical credibility framework. With the remaining time, I helped build automated Excel spreadsheets for various deliverables and helped design and run SQL queries on Paynet's database of over 23 million small business loans.