About Me
I'm a Computer Science student at Appalachian State University graduating in May 2026, with a focus on machine learning, deep learning, and natural language processing. My long-term goal is to work in research-driven ML or ML engineering roles, where I can contribute to model development, experimentation, and the translation of research ideas into reliable systems.
My primary interest lies in understanding how models learn representations—particularly through attention mechanisms—and how architectural and training decisions affect performance, generalization, and interpretability. I enjoy working close to the math and implementation, moving fluidly between theory, code, and experimental results.
Current Work
My main project is LexiMind, a custom multi-task Transformer-based NLP system built in PyTorch for abstractive summarization, emotion classification, and topic clustering over literary and news datasets. The system uses an encoder–decoder architecture with attention, supports user-controlled compression levels for summarization, and performs multi-label emotion classification alongside topic grouping.
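The shared-encoder, multi-head structure described above can be sketched in PyTorch. This is an illustrative toy, not LexiMind's actual code: the vocabulary size, dimensions, head names, and pooling strategy are all assumptions, and the decoder for summarization is omitted for brevity.

```python
import torch
import torch.nn as nn

class MultiTaskTransformer(nn.Module):
    """Toy sketch: one shared Transformer encoder feeding per-task heads."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2,
                 num_emotions=6, num_topics=8):  # hypothetical sizes
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Multi-label emotion head: one independent logit per emotion
        # (trained with BCEWithLogitsLoss, thresholded at inference).
        self.emotion_head = nn.Linear(d_model, num_emotions)
        # Low-dimensional projection whose outputs could be clustered by topic.
        self.topic_proj = nn.Linear(d_model, num_topics)

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))   # (batch, seq, d_model)
        pooled = h.mean(dim=1)                 # mean-pool over token positions
        return self.emotion_head(pooled), self.topic_proj(pooled)

model = MultiTaskTransformer()
tokens = torch.randint(0, 1000, (2, 16))       # two sequences of 16 token ids
emotion_logits, topic_emb = model(tokens)
print(emotion_logits.shape, topic_emb.shape)   # (2, 6) and (2, 8)
```

Sharing the encoder while keeping task-specific heads is what makes the multi-task setup interesting to study: each head's gradients shape the same underlying representation.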
A key aspect of this project is experimentation. I study architectural tradeoffs, representation learning behavior, and training dynamics by iterating on model components and evaluating results empirically. I also focus on model interpretability, including attention weight visualization, to better understand how the system processes and prioritizes information.
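As a minimal illustration of the attention-visualization idea, `nn.MultiheadAttention` can return its attention weights directly; the layer sizes here are arbitrary and this stands in for, rather than reproduces, the project's own tooling.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
attn = nn.MultiheadAttention(embed_dim=32, num_heads=4, batch_first=True)
x = torch.randn(1, 10, 32)  # (batch, seq_len, embed_dim), self-attention input

# need_weights=True also returns the attention matrix,
# averaged over heads by default: shape (batch, tgt_len, src_len).
_, weights = attn(x, x, x, need_weights=True)
print(weights.shape)            # torch.Size([1, 10, 10])
# Each row is a softmax distribution over source tokens (sums to 1),
# which is exactly what a heatmap visualization plots.
print(weights[0].sum(dim=-1))
```

Plotting `weights[0]` as a heatmap shows which tokens each position attends to, which is the basic building block of attention interpretability.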
Alongside LexiMind, I've built additional ML and data-driven projects, including a Formula 1 race outcome predictor using classical ML methods and feature engineering, and PlayAxis, a data-intensive platform integrating multiple external APIs. These projects have strengthened my ability to reason about datasets, evaluation metrics, and real-world constraints.
Background & Learning Approach
I've completed advanced university coursework in applied machine learning, numerical methods, and systems-oriented classes, as well as the Machine Learning Specialization by Andrew Ng. I regularly study foundational and modern ML literature, including transformer-based research, to inform both my understanding and implementation choices.
I learn best by building: implementing models from scratch, validating ideas through experiments, and refining systems based on evidence rather than assumptions. I'm particularly interested in NLP, representation learning, and applied deep learning, and I enjoy explaining complex ideas clearly—whether through writing, diagrams, or code.
What I'm Looking For
I'm actively seeking ML Engineering or research-oriented Software Engineering roles where I can:
- Contribute to model development and experimentation
- Work closely with experienced engineers and researchers
- Continue building a strong foundation in ML systems and applied research
This site serves as a place to share my projects, technical writing, and ongoing work as I continue developing as a machine learning engineer.
Education
Appalachian State University • B.S. Computer Science, Minor in Mathematics, Data Science Certificate
Relevant Coursework
- Applied Machine Learning
- Data Structures & Algorithms
- Databases
- Numerical Methods
Academic Projects
- Mathematical language interpreter (Haskell)
- Y86 pipelined CPU simulator (C/C++)
- 2D game engine (Java)
Leadership
- Team Captain, University Esports Club