A Little About Me
My journey into the world of AI began with a childhood curiosity about how things work. I was always taking things apart, driven by a desire to understand how they operated inside and, hopefully, to put them back together. This hands-on exploration extended well beyond electronics and computers; I spent nearly a decade as a flying trapeze artist, even traveling and training with different circus troupes and spending a few nights sleeping out under the stars. While seemingly unrelated, these experiences instilled in me a love for pushing boundaries and embracing challenges, as well as a genuine comfort with failure as a necessary step toward success.
As I grew older, my fascination with technology deepened. I worked on various DIY projects, usually driven by my desire to learn through experimentation:
- DIY Marine Electronics: I designed and built a software suite and custom wiring system for a sailboat with a faulty electrical system.
- Boatbuilding from Scratch: I constructed a different (admittedly not very seaworthy) sailboat from scrap wood in my garage.
- Upcycled Coffee Roaster: I repurposed a popcorn maker into a customized coffee roaster with inline fan and temperature controls.
- Game-Playing AI: I developed reinforcement learning agents that learned to play and win games like Othello and Monopoly.
These early experiences sparked a passion for problem-solving and a desire to build things that could learn and adapt.
My academic path wasn’t linear. I explored pre-med and accounting before discovering my true calling in data science. I was fortunate to be at NYU during the rise of data science as a field, allowing me to dive into statistics, programming, and machine learning before the market became overly saturated. Through many (often failed, sometimes successful) research projects, I further honed my skills:
- Neural Networks for Gesture Recognition: I developed a recurrent neural network to identify hand gestures from nerve data, achieving results that surpassed existing benchmarks.
- Optimization for Customer Retention: I implemented advanced resampling techniques to improve the performance of a customer retention classification model.
- Reproducible Research: I replicated and expanded upon published research using newly available datasets, generating novel insights.
- Data Science Education: I designed and delivered over 30 hours of lectures on Python programming, machine learning with scikit-learn, and deep learning with PyTorch.
My professional journey began at Polen Capital, where I interned with their then-small data team. I gained valuable experience in data engineering, pipeline development, and dashboarding, contributing to one of the first projects built on their new data warehouse. After graduating from NYU (with a memorable semester dedicated solely to Discrete Mathematics and Piano!), I rejoined Polen Capital’s Investment Analytics department.
Over the next few years, I benefited immensely from the mentorship and support of my colleagues. I had the opportunity to:
- Conduct Investment Research: I performed statistical analysis and tested investment theories proposed by portfolio managers, drawing from academic literature and conducting novel research.
- Optimize Idea Generation: I developed sophisticated optimization techniques, including differential evolution and custom genetic algorithms, to enhance investment idea generation.
- Develop Risk Indexing Tools: I created a risk indexing engine using BERTopic and a custom Dash-based frontend for visualization and analysis.
- Build Data Access APIs: I developed a suite of APIs, modeled after Bloomberg's BQNT libraries, to facilitate complex data extraction from our databases.
- Create a Python Querying Package: I designed and released a Python package with a custom Lark-based query language for streamlined data access.
- Oversee Machine Learning Practices: I managed and reviewed all machine learning activities within the firm, including deployments to training and inference clusters.
With the rise of generative AI and large language models (LLMs), I was called upon to leverage my prior experience with these technologies (like RoBERTa, which we had deployed internally) to spearhead Polen Capital’s AI strategy. This has led to my current focus on:
- LLM-Powered Applications: Developing a custom Dash frontend for LLM interaction and chat, currently serving ~170 monthly active users.
- Agentic Workflows: Implementing AI agents for complex data retrieval and question answering, integrating both text and API-based queries.
- AI Infrastructure Management: Managing compute and training infrastructure for all machine learning initiatives within the firm.
- AI Education and Training: Designing and conducting learning events to educate users on prompting techniques and practical AI applications.
- AI Strategy and Architecture: Leading the development of Polen Capital’s overall AI strategy, technical direction, and experimentation roadmap.
I’m incredibly grateful for the journey that has led me to this point. If you have any challenges you believe I could assist with, or if you have a non-profit initiative where my skills could be of value, please don’t hesitate to connect with me. I’m always eager to collaborate and explore new opportunities.