PyLips
A package for simple, expressive, and customizable text-to-speech with an animated face.

PyLips is a Python-based interface for developing screen-based conversational agents. It is designed to make research in socially assistive robotics easier by providing a simple, expressive, and customizable framework for building such agents.
PyLips is easy to install, simple to use, and open-source. It comes ready to use with your system's speech synthesis tools, and uses other free and open-source software for turning these sounds into facial expressions.

To Install from PyPI
You can install PyLips using pip by running this command in your terminal:

```
python3 -m pip install pylips
```
If you are running PyLips on a Linux distribution, you may also need to install the following system packages:

```
sudo apt update && sudo apt install espeak-ng ffmpeg libespeak1
```
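To confirm the installation succeeded, you can check that the package is importable before moving on. This is a generic Python check, not part of the PyLips API:

```python
import importlib.util

def pylips_installed() -> bool:
    # True if the pylips package can be found on this interpreter's path
    return importlib.util.find_spec("pylips") is not None

print(pylips_installed())
```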
PyLips Quickstart
Here is a quick example to test your installation. This code will make the animated face say "Hello, welcome to pylips!" using your system's default voice, which can be changed later.
First, we have to start the PyLips server. This is a simple Flask server that can serve several faces at the same time. To start the server, run the following command:
```
python3 -m pylips.face.start
```
This will start the server on port 8000. Do not worry about the warning message; the package will still work. You can connect any web browser to the printed URLs, even from other computers on the local network. For now, open a browser and go to http://localhost:8000/face to see the face.
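If you are scripting a demo, it can help to wait for the face server to come up before sending speech. PyLips does not ship a helper for this, so the sketch below is our own assumption; it simply polls the face URL until it responds:

```python
import time
from urllib.request import urlopen
from urllib.error import URLError

def face_server_up(url="http://localhost:8000/face", timeout=2.0) -> bool:
    # Best-effort reachability check for the PyLips face server.
    try:
        with urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError):
        return False

def wait_for_face(url="http://localhost:8000/face", retries=10, delay=1.0) -> bool:
    # Poll until the server responds or we run out of retries.
    for _ in range(retries):
        if face_server_up(url):
            return True
        time.sleep(delay)
    return False
```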
Now open a new terminal tab and run the following code:
```python
from pylips.speech import RobotFace

face = RobotFace()

# you may need to wait here for a minute or two to let allosaurus download on the first run
face.say("Hello, welcome to pylips!")
```
If all goes well, the face should have said the message!
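For scripts that should degrade gracefully when PyLips is not installed, you can guard the import. This convenience wrapper is hypothetical and not part of the package:

```python
def greet(text="Hello, welcome to pylips!") -> str:
    # Hypothetical wrapper around the quickstart above.
    try:
        from pylips.speech import RobotFace
    except ImportError:
        # Fall back to a plain message when pylips is not installed.
        return f"(pylips not installed; would say: {text})"
    face = RobotFace()
    face.say(text)  # first run may pause while allosaurus downloads
    return text

print(greet())
```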
If you use PyLips in an academic publication, please use this citation:

```
@inproceedings{dennler2024pylips,
  title={PyLips: an Open-Source Python Package to Expand Participation in Embodied Interaction},
  author={Dennler, Nathaniel Steele and Torrence, Evan and Yoo, Uksang and Nikolaidis, Stefanos and Mataric, Maja},
  booktitle={Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology},
  pages={1--4},
  year={2024}
}
```