Putting GAI to Work: Q&A with Chris Campion


New Q&A series exploring how FAS faculty, staff, and researchers are using generative AI tools in their work.

Name: Chris Campion
Role: Director of Strategic Analysis, Division of Continuing Education (DCE)

What are some ways you have used GAI to help you at work?

One example that comes to mind is the work we’ve done in DCE to better understand qualitative feedback from student evaluations.

At the Extension School and Harvard Summer School, we serve thousands of students, which adds up to tens of thousands of pieces of feedback over time. It’s a lot of information, and making sense of all that qualitative data has historically been a very manual, time-consuming process.

Using AI to summarize and analyze student feedback has shown a lot of promise. For example, we’re exploring how a custom GPT or chatbot might help our teaching and learning team analyze feedback across different student groups and quickly surface meaningful insights: common themes, areas for improvement, and what’s working well.
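For readers who want to experiment with this pattern, here is a minimal sketch of the idea, assuming access to the OpenAI Python SDK. The model name, sample comments, and prompt are illustrative placeholders, not DCE’s actual setup.

```python
# Minimal sketch: summarize themes in student feedback with an LLM.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment;
# the model name and sample comments are placeholders.
from openai import OpenAI

client = OpenAI()

comments = [
    "The weekly assignment feedback was incredibly helpful.",
    "Lectures moved too fast to take good notes.",
    "More office hours, please; the ones offered always filled up.",
]

prompt = (
    "You are analyzing student course evaluations. Summarize the common "
    "themes, what's working well, and areas for improvement in these "
    "comments:\n\n" + "\n".join(f"- {c}" for c in comments)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any capable chat model works
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```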

Another area we’re exploring is using AI to support administrative decision-making. For example, the Extension School offers many credentials (degrees, certificates, and individual courses), and staff often field complex questions like, “If I take this course, what does it count toward?” A chatbot could help by centralizing and simplifying those answers.

We’re also thinking about how AI can support degree planning. Say a student asks, “I’m pursuing this degree; what should I take in the fall and spring?” AI could help answer that.

And we’re even looking at course cancellation workflows. Deciding whether to cancel a course due to low enrollment is currently a mix of art and science. AI might help make that decision more data-informed, and it could automate how we communicate cancellations to students by suggesting alternate courses that meet similar needs.

This work is ongoing. We’re still testing, validating, and building confidence in the tools, and we’re careful not to push anything out prematurely.

What types of tasks do you think GAI is best suited for, and which ones does it not handle well?

There’s a lot to get excited about, especially on the technical side. I have a data science background, so I love how helpful AI is for coding. But what I really value is AI’s ability to act as a thought partner.

Often, I’ll go to ChatGPT Edu or the AI Sandbox with a half-baked idea in bullet points and say, “Here’s what I want to do; what’s the best approach?” That back-and-forth is incredibly helpful for shaping project designs.

Right now, we’re all still building confidence in these tools. I wouldn’t recommend uploading a spreadsheet to ChatGPT Edu or the Sandbox, asking it questions, and sending off the results without double-checking. It’s still very much a “trust, but verify” process.

AI can hallucinate, especially with structured data. That said, I wouldn’t discourage people from experimenting; you just have to validate the results. I’m also mindful of the conversations around the ethical implications of AI in the workplace. As we explore these tools, it will be important to use them thoughtfully and responsibly.

How do you get useful output from AI?

A prompt library is incredibly helpful. As you interact with AI, you’ll find that some prompts work better than others. Saving and sharing the good ones can make a big difference.
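As an illustration of what a prompt library can look like at its simplest, here is a hypothetical sketch: named templates with placeholders, kept as plain data so they are easy to version, share, and reuse. The template names and wording are invented for this example.

```python
# Hypothetical sketch of a tiny shared prompt library: named templates
# with placeholders, stored as plain data for easy sharing and versioning.
PROMPT_LIBRARY = {
    "feedback_themes": (
        "Summarize the three most common themes in the student comments "
        "below, with one representative quote for each theme:\n\n{comments}"
    ),
    "course_alternatives": (
        "The course {course} was cancelled. Suggest three alternatives "
        "from this catalog that meet similar needs:\n\n{catalog}"
    ),
}

# Fill a template before sending it to your AI tool of choice.
prompt = PROMPT_LIBRARY["feedback_themes"].format(
    comments="- Great pacing!\n- More worked examples, please."
)
print(prompt)
```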

I also use AI to help me write better prompts. I’ll say, “I need a prompt that includes X, Y, and Z, but avoids A, B, and C,” and it helps me quickly craft something more effective. Then I use that prompt to get the output I need.
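Here is a hedged sketch of that “prompting for prompts” move, again assuming the OpenAI Python SDK; the listed requirements are examples, not an actual DCE workflow.

```python
# Sketch: ask the model to draft a reusable prompt from requirements.
# Assumes the OpenAI Python SDK; the model name and the X/Y/Z-style
# requirements below are illustrative.
from openai import OpenAI

client = OpenAI()

request = (
    "Write a reusable prompt for summarizing student course feedback. "
    "It must include: common themes, areas for improvement, and what's "
    "working well. It must avoid: naming individual students, speculation, "
    "and any grading recommendations."
)

draft = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": request}],
)
print(draft.choices[0].message.content)  # review, refine, then save it
```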

Sharing prompts with colleagues is eye-opening—it shows them new possibilities, and we start building on each other’s ideas. It’s collaborative and energizing.

How can a beginner learn to use AI?

Try things in a safe and controlled way. Ask the AI questions about your role: “Here’s my job. What are some ways you could help me?” You might be surprised at what comes back.

And when someone says, “AI can’t do this,” I encourage them to have that conversation with the AI. Explain the task and your concerns. You might be surprised at what it can help with, especially with custom GPTs or digital assistants designed for specific use cases.

Making these assistants is surprisingly easy—it’s like creating a Google Doc. You define the instructions, the knowledge base, and the prompt. If you’re getting wrong answers, one of those three pieces likely needs adjustment.
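To make those three pieces concrete, here is a minimal hand-wired sketch of an assistant, assuming the OpenAI Python SDK: the instructions, a stand-in knowledge base, and the user’s prompt. The course and policy details are invented; a custom GPT builder assembles these same pieces through a form instead of code.

```python
# Sketch of the three pieces of a custom assistant: instructions,
# knowledge base, and prompt. Assumes the OpenAI Python SDK; the
# course names and policies below are invented for illustration.
from openai import OpenAI

client = OpenAI()

instructions = (
    "You are an advising assistant for a continuing-education school. "
    "Answer only from the knowledge base; if the answer isn't there, say so."
)

knowledge_base = (
    "Course X counts toward the Data Analytics certificate and the ALM "
    "degree. Certificates require four courses completed within three years."
)

user_prompt = "If I take Course X, what does it count toward?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder
    messages=[
        {
            "role": "system",
            "content": f"{instructions}\n\nKnowledge base:\n{knowledge_base}",
        },
        {"role": "user", "content": user_prompt},
    ],
)
print(response.choices[0].message.content)
# A wrong answer usually traces back to one of the three pieces above.
```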