Science and Technology Studies Spotlight: Conversation with Faculty

February 27, 2026

Maya Cruz and Liliana Gil

 

How can we better understand artificial intelligence through the lens of Science and Technology Studies (STS)? In this interview, Professors Liliana Gil and Maya Cruz share how they teach the cultural, ethical, and political dimensions of AI, and reflect on why critical perspectives on technology matter for students across all majors.

 

1. Can you briefly introduce yourselves and explain how your teaching connects to Science and Technology Studies and Artificial Intelligence? 

 

LG: Gladly! I’m Dr. Gil and I teach courses on Science and Technology Studies (STS) in the Department of Comparative Studies. STS is an interdisciplinary field that brings together humanities and social science approaches to understand science and technology in society, and there’s a lot you can do with it. In my research, for instance, I examine how context shapes ideas about innovation and technical skills, as well as the social inequities tied to those ideas. AI is a major topic in STS right now because of recent advances in machine learning and all the public attention they are receiving. But as we know, the desire to build machines that mimic human intelligence has a much longer history... An STS approach helps students explore both what’s new about AI today and the deeper questions it raises. 

 

MC: I’m Maya! I’m an Assistant Professor in Comparative Studies here at OSU, and I also teach courses in STS, with a focus on Feminist and Ethnic Studies approaches to the field. Pedagogy is really important to me, and I love working with students in STS classes: because the field is so interdisciplinary, students join our classes from all different backgrounds and majors, and many are thinking critically about science, technology, and society for the first time. In my classes, we tackle big questions about what it means to be human in our current moment of AI expansion. My hope is that STS can offer a lens for students to better understand their worlds and the kinds of changes they may want to see. 

 

2. In Comparative Studies, we offer an STS Minor and a Certificate in AI, Ethics and Society. Could you say more about these? Who is the STS minor and the AI certificate for? What kinds of students or disciplines are especially welcome? 

 

LG: In Comparative Studies, we’re fortunate to offer a range of options for students who want to think critically about science, technology, and society. This includes the STS Minor and the AI, Ethics, and Society Certificate, both of which are open to students across the university. Any major is welcome! The key difference between the two is one of emphasis. The STS Minor requires 15 credits that students can choose from a wide array of topics, from speculative fiction to the science of gender and sexuality, while the Certificate in AI, Ethics and Society requires 12 credits in courses that focus more directly on AI. Maya can say more about the certificate, since she leads it. 

 

MC: The AI, Ethics and Society Certificate is a new option for students from any major who want to learn more about the complex ethical, cultural, and social issues emerging alongside the development of AI in our current moment. Students can select from a range of courses that balance critical perspectives from the humanities with courses on some of the more technical aspects of AI. The goal is to offer an educational path, and a credential, for students to become thoughtful leaders in AI, Ethics, and Society. 

 

3. Why is it important for students to study the cultural and ethical aspects of AI and technology? How can they use this knowledge later, not just academically, but in workplace and real-life situations? 

 

LG: To me, it's always a bit vague and redundant to talk about technology. Hasn't technology been a vital component of society for centuries? From hand tools to railroads... And yet, the term still bears cultural significance. That's interesting. Technology hints at that thing – a tool, a machine, a system – that, while being external to us, extends our capacity for action and shapes the world in certain ways. I want students to reflect on that relationship. Many of our students will become professionals who use technological systems to make decisions and act upon the world. Understanding the cultural and political dimensions of technology – asking who benefits, who is excluded, what values are embedded in the systems we use, design, or support, and what accountability should look like when these systems fail – helps develop critical thinking, self-awareness, and a sense of responsibility. These are skills that matter for any well-rounded citizen and in any workplace. 

 

MC: I think it’s really important for students to study the cultural and ethical aspects of AI and technology to gain critical science and technology literacies! We live in a world where technology shapes every aspect of life. I want students to gain the skills that they need to be able to think critically, carefully, and creatively about their world, and have the knowledge and confidence to shape their world for the better. I think STS is one way to do that.  

 

4. How does STS help students think about AI, given how fast-moving the field is? 

 

LG: Even though AI is fast-moving, we know that new systems often reproduce existing social dynamics. That's a key insight of STS: no technology is neutral and, therefore, all technologies deserve careful scrutiny, especially if they are as consequential as AI promises to become. If AI models are used to facilitate decisions about things like medical coverage, hiring, housing, or policing, then questions of bias, transparency, and accountability become essential. STS helps students develop the habit of asking those questions. I also think that the more traditional approaches from the humanities and social sciences (including the really old books!) remain valuable here, as long as we use them with curiosity and flexibility, rather than treating them as one-size-fits-all frameworks. 

 

MC: My favorite work in STS emphasizes slowness as a critical practice. In a world that seems to move fast, STS can help us slow it all down so we can figure out if we’re moving in the ways we want to be moving, and if we’re not, what we might do instead. We live in a world where technological development often appears to be inevitable – for example, that AI is going to keep expanding, changing, and getting better (or worse) – but STS teaches us to question these kinds of narratives and the systems that proffer them. We often also don’t get time to pause and think critically – and STS classes are spaces where that is our focus.  

 

5. Can you share an example of a project or assignment that shows what students do in these courses? 

 

LG: It's hard to pick one! In a recent iteration of "Introduction to Cultures of Science and Technology" (COMPSTD 2340), students did final projects documenting practices of repair in their communities. One goal was to develop an appreciation for maintenance work – labor that is often essential but undervalued and frequently performed by marginalized groups. Students used that lens to think about care, sustainability, and collective well-being. It was very rewarding.  

 

MC: One of my favorite assignments I do with my students is a Media Analysis assignment, where students are asked to find a piece of media (such as the latest news headline, a viral video on TikTok, or a comment on a YouTube video) and analyze how their media object reflects the complex relationships between science, technology, and society. We are often led to believe that science and culture are separate, or even opposite – but this assignment really invites students to think about how science, technology, and culture are deeply entangled and inseparable domains of life. My students are often busy doomscrolling on TikTok or overwhelmed by the news, and this assignment lets them bring some of that into class and work through it.  

 

6. Are there upcoming courses or changes you’re excited about? 

 

LG: We have several exciting new courses under development. I'm particularly looking forward to offering the online course "Technology, Science, and Citizenship" (COMPSTD 3007), which focuses on the relationship between citizenship and technoscience, covering topics such as democracy and misinformation, border surveillance, and online activism. I'm also organizing a study abroad course in Portugal called "Lived Infrastructures: A Field School in Lisbon" (COMPSTD 4456), where students will reflect, through experiential learning, on the relationship between (post)colonialism, urban infrastructure, and social inequality. 

 

MC: I’ll be teaching COMPSTD 2340: Introduction to Cultures of Science and Technology in Fall 2026! This course is great because we get students from all sorts of majors and minors coming in to learn about science, technology, and society, and it is also a core course for the AI, Ethics and Society Certificate. I’m also really excited to teach a new course I’ve developed on the “lived environments” of AI, called “Hello World(s): The Global Environments of Artificial Intelligence,” in Spring 2027 (COMPSTD 4597.01). In this course, we will spend our time understanding how AI shapes lived environments, with an emphasis on questions about environmental justice and planetary crisis. So often, AI can seem disconnected from our material, environmental, and planetary worlds, and I’m excited to focus on conversations about everything from chatbots, algorithms, network cables, and data centers to speculative visions of AI futures. I hope to see you in class! 

 

7. What would you say to a student who’s curious about AI but unsure where they belong in these conversations? 

MC: I meet lots of students who are interested in AI but are unsure where they belong in these conversations. I think they feel differently by the end of the course! I emphasize that issues about AI cut across all domains of life, and we need different perspectives to understand these complex issues more fully. I always begin by asking students how they’ve encountered AI in their daily lives: what they’ve heard, what they’ve seen, and what they know or don’t know. I ask them what they care about, and why. In these conversations, students often find that they have more to say than they might have thought! From there, we start building a skillset of “critical AI literacies” that teaches students how to bring their own expertise, values, and concerns into the conversation, alongside expert perspectives from authors across all sorts of disciplinary backgrounds – from STEM to business to the humanities and the arts. AI is expansive, seemingly ubiquitous, and I think that’s an opportunity for everyone to find their way into the conversation in a way that matters to them.