Capturing human behavior and language for interactive systems
- Ethan Fast.
- [Stanford, California] : [Stanford University], 2018.
- Physical description: 1 online resource.
- Call number: 3781 2018 F (in-library use).
- From smart homes that prepare coffee when we wake, to phones that know not to interrupt us during important conversations, our collective visions of human-computer interaction (HCI) imagine a future in which computers understand a broad range of human behaviors. Today our systems fall short of these visions, however, because this range of behaviors is too large for designers or programmers to capture manually. In this thesis I will present three systems that mine and operationalize an understanding of human life from large text corpora. The first system, Augur, focuses on what people do in daily life: capturing many thousands of relationships between human activities (e.g., taking a phone call, using a computer, going to a meeting) and the scene context that surrounds them. The second system, Empath, focuses on what people say: capturing hundreds of linguistic signals through a set of pre-generated lexicons, and allowing computational social scientists to create new lexicons on demand. The final system, Codex, explores how similar models can empower an understanding of emergent programming practice across millions of lines of open source code. Across these three projects, I will demonstrate how semi-supervised and unsupervised learning can enable many new applications and analyses for interactive systems.
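The abstract's description of Empath — counting linguistic signals against pre-generated category lexicons — can be illustrated with a minimal sketch. This is not Empath's actual implementation or API; the two tiny categories and the `analyze` helper below are hypothetical stand-ins for the roughly two hundred lexicons the real tool ships with.

```python
from collections import Counter
import re

# Hypothetical mini-lexicons standing in for Empath-style pre-generated
# categories; the real system ships many such categories and can also
# generate new ones on demand from a few seed terms.
LEXICONS = {
    "social": {"meeting", "conversation", "friend", "party"},
    "technology": {"computer", "phone", "software", "code"},
}

def analyze(text, normalize=False):
    """Count how many tokens in `text` fall into each lexicon category."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for category, words in LEXICONS.items():
        counts[category] = sum(1 for t in tokens if t in words)
    if normalize and tokens:
        # Report per-token rates instead of raw counts.
        return {c: n / len(tokens) for c, n in counts.items()}
    return dict(counts)

print(analyze("She took a phone call during the meeting."))
# → {'social': 1, 'technology': 1}
```

The design point the sketch captures is that lexicon-based analysis reduces a document to a small vector of interpretable category counts, which is what makes such signals convenient for computational social science.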
- Submitted to the Computer Science Department.
- Thesis (Ph.D.)--Stanford University, 2018.