Alexander Karpekov

PhD Student in Computer Science. Data Scientist. Explorer.

Hi there 👋🏻 My name is Alexander. I am working on my PhD in Computer Science at Georgia Tech, advised by Sonia Chernova and Thomas Plötz.

After spending a decade in industry, working on breaking-news discovery at Dataminr and on Search and YouTube Music recommendations at Google, I returned to academia to focus on Machine Learning and AI research. I’m passionate about Explainable AI, pattern discovery in large datasets, and interactive data visualization.

Outside of school, I enjoy snowboarding (calling Silverthorne, Colorado my second home), rock climbing (an aspiring lead belayer), and rowing (GT Crew). I like studying foreign languages and learning about new cultures. One day, I hope to learn how to play piano.

I am always open to collaborations — feel free to get in touch!

Latest News

  • Nov 2024

    Transformer Explainer accepted to AAAI'25 Demo track

  • Oct 2024

    Transformer Explainer won the Best Poster Award at the IEEE VIS'24 conference [link]

  • Aug 2024

    I started my 👨🏻‍🎓 PhD journey at Georgia Tech [link]

  • Apr 2024

    Today was my last day at Google. I’m so grateful for all the wonderful people I’ve met and the friendships I’ve made over the years. I’ll miss y'all a ton! 🥲 [link]

Featured Projects and Publications

DISCOVER: An Unsupervised Approach to Cluster and Label Human Activities in Smart Homes

Currently Under Review '25

In this paper, we introduce DISCOVER, an active-learning method for identifying fine-grained human activities from unlabeled smart-home sensor data. DISCOVER combines self-supervised feature extraction and embedding clustering with a custom-built visualization tool that allows researchers to identify, label, and track human activities and their changes over time.
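
A minimal sketch of the kind of pipeline described above, assuming windowed binary sensor data and using PCA + k-means from scikit-learn as dependency-light stand-ins for the paper's self-supervised encoder and clustering stage. This is illustrative only, not the DISCOVER implementation.

```python
# Illustrative sketch only -- not the authors' DISCOVER implementation.
# Pipeline mimicked: (1) fixed-length windows of unlabeled smart-home sensor
# events, (2) unsupervised feature extraction (PCA as a stand-in for a
# learned self-supervised encoder), (3) clustering of the embeddings so a
# human can inspect, label, and track each candidate activity.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical data: 500 windows of 60 time steps x 12 binary sensors,
# flattened to one row per window.
windows = rng.integers(0, 2, size=(500, 60 * 12)).astype(float)

# Unsupervised embedding stand-in for the self-supervised encoder.
embeddings = PCA(n_components=16).fit_transform(
    StandardScaler().fit_transform(windows)
)

# Cluster the embeddings; each cluster is a candidate activity that a
# researcher would then label through the visualization tool.
labels = KMeans(n_clusters=8, n_init="auto", random_state=0).fit_predict(embeddings)
print(np.bincount(labels))  # cluster sizes to eyeball before labeling
```

In the described workflow, the clusters themselves are not the end product: the visualization tool is what turns them into labeled activities that can be tracked over time.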

Transformer Explainer: Interactive Tool to Learn about LLMs

IEEE VIS '24, AAAI '25

An interactive visualization tool that helps users understand how transformer models work through hands-on experimentation and real-time feedback.

Went viral: 150K+ visitors in its first 3 months

Is Attention Truly All We Need?

Deep Learning for Text: Final Project '23

This project investigates the use of Transformer attention weights for deriving feature importance in NLP tasks. It demonstrates that combining attention weights with gradient information improves explainability, and it provides an open-source GitHub tool for applying the method to any Transformer model.
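
The attention-times-gradient idea can be sketched roughly as below. The snippet uses a public Hugging Face sentiment model (distilbert-base-uncased-finetuned-sst-2-english) purely for illustration and one common formulation of combining attention maps with their gradients; the project's exact method may differ in detail.

```python
# Illustrative sketch only -- one common attention x gradient formulation,
# not necessarily the exact one used in the project.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative model
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, output_attentions=True)
model.eval()

inputs = tok("The movie was surprisingly good.", return_tensors="pt")
outputs = model(**inputs)
attentions = outputs.attentions  # tuple of (batch, heads, seq, seq), one per layer

# Backpropagate the predicted class score into the attention maps.
pred = outputs.logits.argmax(-1).item()
score = outputs.logits[0, pred]
grads = torch.autograd.grad(score, attentions)

# Attention x gradient, kept positive, averaged over heads and layers,
# then summed over the query axis: one importance score per input token.
relevance = torch.stack(
    [(a * g).clamp(min=0).mean(dim=1) for a, g in zip(attentions, grads)]
).mean(dim=0)[0].sum(dim=0)

for token, r in zip(tok.convert_ids_to_tokens(inputs["input_ids"][0].tolist()),
                    relevance.tolist()):
    print(f"{token:>12s}  {r:.4f}")
```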

Double-Relocation Policy Evaluation in Guangdong, China using Night Lights Data

ArcGIS: Final Project '14

This project examines Guangdong's shifting economic growth using night-lights data from satellites, focusing on development beyond the Pearl River Delta and the impact of 2008 government policies.