New MIT class uses anthropology to enhance chatbots


Young adults growing up in the attention economy — preparing for adult life, with social media and chatbots competing for their attention — can easily fall into unhealthy relationships with digital platforms. But what if chatbots weren’t mere distractions from real life? Could they be designed humanely, as moral partners whose goal is to be a social guide rather than an addictive escape?

At MIT, a friendship between two professors — one an anthropologist, the other a computer scientist — led to the creation of an undergraduate class that set out to answer these questions. Combining the two seemingly disparate disciplines, the class encourages students to design artificial intelligence chatbots in humane ways that help users improve themselves.

The class, 6.S061/21A.S02 (Humane User Experience Design, a.k.a. Humane UXD), is an upper-level computer science class cross-listed with anthropology. This unique cross-listing allows computer science majors to meet a humanities requirement while also pursuing their career goals. The two professors use methods from linguistic anthropology to teach students how to integrate the interactional and interpersonal needs of humans into programming.

Professor Arvind Satyanarayan, a computer scientist whose research develops tools for interactive data visualization and user interfaces, and Professor Graham Jones, an anthropologist whose research focuses on communication, created Humane UXD last summer with a grant from the MIT Morningside Academy for Design (MAD). The MIT MAD Design Curriculum Program provides funding for faculty to develop new classes or enhance existing classes using modern pedagogical approaches that transcend departmental boundaries.

The Design Curriculum Program is currently accepting applications for the 2026-27 academic year; the deadline is Friday, March 20.

Jones and Satyanarayan met several years ago when they co-advised a doctoral student’s research on data visualization for visually impaired people. They’ve since become close friends who can practically finish each other’s sentences.

“There’s a way in which you don’t really fully externalize what or how you think until you’re teaching,” Jones says. “So, it’s been really fun for me to see Arvind unfurl his expertise as a teacher in a way that lets me see how the pieces fit together — and discover underlying commonalities between our disciplines and our ways of thinking.”

Satyanarayan continues that thought: “One of the things I really enjoyed is the reciprocal version of what Graham said, which is that my field — human-computer interaction — inherited a lot of methods from anthropology, such as interviews and user studies and observation studies. And over the decades, those methods have gotten increasingly watered down. As a result, a lot of things have been lost.

“For instance, it was very exciting for me to see how an anthropologist teaches students to interview people. It’s completely different from how I would do it. With my way, we lose the rapport and connection you should build with your interview participant. Instead, we just extract data from them.”

For Jones’ part, teaching with a computer scientist holds another kind of allure: design. He says that human speech and interaction are organized into underlying genres with stable sets of rules that differentiate an interview at a cocktail party from a conversation at a funeral.

“ChatGPT and other large language models are trained on naturally occurring human communication, so they have all those genres inside them in a latent state, waiting to be activated,” he says.

“As a social scientist, I teach methods for analyzing human conversation, and give students very powerful tools to do that. But it often ends up being an exercise in pure research, whereas this is a design class, where students are building real-world systems.”

The curriculum appears to be on track for preparing students for jobs after graduation. One student sought permission to miss class for a week because he had a trial internship at a chatbot startup; when he returned, he said his work at the startup was similar to what he was learning in class. He got the job.

The sampling of group projects below, built with Google’s Gemini, demonstrates some of what’s possible when, as Jones says, “there’s a really deep intertwining of the technology piece with the humanities piece.” The students’ design work shows that entirely new ways of programming can be conceptualized when the humane is made a priority.

The bots reveal clearly that an interdisciplinary class can be designed in such a way that everyone benefits: Students learn more and differently; they can fulfill a non-major course requirement by taking a class that’s directly useful to their careers; and long-term faculty partnerships can be forged or strengthened.

Team Pond

One project promises to be particularly useful for graduating seniors. Pond is designed to help young college graduates adapt to the challenges of independent adult life. Team Pond configured the chatbot not to simply parrot the user, or to sycophantically praise mistaken answers. Instead, Pond provides advice to help with “adulting” (behaving as a responsible adult).

“Pond is built to be your companion from college life into post-college life, to help you in your transition from being a small fish in a small pond to being a small fish in a very big pond,” says sophomore Mary Feliz.

“College is very much a high-proximity and high-context environment, in the sense that everyone around you is going through the same thing, and it’s easy to build relationships or find opportunities, because there are structured pathways that you have access to,” explains graduate student Emaan Khan. “Post-grad life is low-context. You’re not always surrounded by your peers or your professors. It’s no-proximity also, in the sense that you don’t have opportunities at your doorstep. Pond is a tool to help empower you to access certain opportunities, or learn how to navigate.”

Pond’s developers are graduate student Claire Camacho, Harvard University graduate student Charlotte Cheah, Feliz, and Khan. They trained Pond to offer expertise in three areas: social life, professional life, and adult skills.

The students demonstrated Pond’s social communication by asking how to start conversations in a hobby group. Next, they exhibited Pond’s adulting communication by typing in a question.

“My landlord has asked me personal questions that make me uncomfortable. What should I do?”

Appearing in Pond’s textbox was this response: “Politely set boundaries. You are not obligated to answer personal questions. You can gently redirect the conversation back to property-related topics. Know your rights: familiarize yourself with tenant rights in your city or state regarding privacy and landlord conduct.”

When asked how to do that, Pond offered specific instructions: “You can try saying phrases like: ‘I prefer to keep our conversations focused on the property,’ and ‘let’s stick to topics related to the apartment, please.’ Always redirect the conversation back to the apartment or a legitimate maintenance issue. Keep your tone polite but firm. Document any conversations if needed.”

Pond also offered a role-playing scenario to help the user learn what polite-but-firm language can be in that situation.

“The ethos of the practice mode is that you are actively building a skill, so that after using Pond for a while, you feel confident that you can swim on your own,” Khan says. The chatbot uses a point system that allows users to graduate from a subject, and a treasure chest to store prizes — elements added to boost the bot’s appeal.

Team News Nest

Another of the projects, News Nest, provides a sophisticated means of helping young people engage with credible news sources in a way that makes it fun. The name is derived from the program’s 10 appealing and colorful birds, each of which focuses on a particular area of news. If you want the headlines, you ask Polly the Parrot, the main news carrier; if you’re interested in science, Gaia the Goose guides you. The flock also includes Flynn the Falcon, sports reporter; Credo the Crow, for crime and legal news; Edwin the Eagle, a business and economics news guide; Pizzazz the Peacock for pop and entertainment stories; and Pixel the Pigeon, a technology news specialist.

News Nest’s development team is made up of MIT seniors Tiana Jiang and Krystal Montgomery, and junior Natalie Tan. They intentionally built News Nest to prevent “doomscrolling,” provide media transparency (sources and political leanings are always shown), and they created a clever, healthy buffer from emotional manipulation and engagement traps by employing birds rather than human characters.

Team M^3 (Multi-Agent Murder Mystery)

A third team, M^3, decided to experiment with making AI humane by keeping it fun. MIT senior Rodis Aguilar, junior David De La Torre, and second-year Deeraj Pothapragada developed M^3, a social deduction multi-agent murder mystery that includes four chatbots as different personalities: Gemini, OpenAI’s ChatGPT, xAI’s Grok, and Anthropic’s Claude. The user is the fifth player.

Like a regular murder mystery, there are locations, weapons, and lies. The user has to guess who committed the murder. It’s very similar to a board or video game played with real players, only these are enhanced AI opponents you can’t see, who may or may not tell the truth in response to questions. Users can’t get too involved with one chatbot, because they’re playing all four. Also, as in a real-life murder mystery game, the user is sometimes guilty.
