New Q&A series exploring how FAS faculty, staff, and researchers are using generative AI tools in their work.

Name: Sabrina Azinheira
Role: Faculty Coordinator, SEAS
What are some ways you have used GAI to help you at work?
I work with three different labs handling both logistical and administrative tasks while supporting professors with various projects. One big way I’ve used generative AI is for managing citations. One of the faculty members I support is working on a textbook, so part of my job is to format all of his citations. The citations need to be in a specific format, BibTeX, which is super precise and syntax-heavy, and using a standard citation manager just wasn’t cutting it. I used to spend a whole day formatting citations for a single chapter by hand. I decided to build a custom GPT that I could train to output citations exactly the way he wants them. Now I just input the article or source, and it gives me a properly formatted BibTeX entry, including all the little details he prefers, like the name of the person who added the citation. I still check them over, but the whole process has gone from taking a full day to just an hour or two.
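For anyone who hasn’t worked with the format, a BibTeX entry is a structured block of key-value fields, which is what makes it so fiddly to produce by hand. The entry below is a generic, hypothetical illustration rather than one of the professor’s actual citations, and the note field is just one way an “added by” detail might be recorded:

    @article{doe2021example,
      author  = {Doe, Jane and Smith, John},
      title   = {An Example Article Title},
      journal = {Journal of Illustrative Examples},
      year    = {2021},
      volume  = {12},
      number  = {3},
      pages   = {45--67},
      note    = {Citation added by S. Azinheira (hypothetical example entry)}
    }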
I’ve also found AI especially useful for handling those complex, multi-variable tasks that can easily take up your whole day. For example, a professor needed to schedule interviews with eight Ph.D. candidates over three specific days and time windows, while accounting for someone’s time zone. Normally, that would’ve taken hours of back-and-forth and manual planning. Instead, I pasted the request into ChatGPT, asked it to create time slots, and it gave me a schedule with proposed assignments that worked for nearly everyone. I just sent out the time slots, and most people confirmed right away. What would’ve taken half a day took maybe 20 minutes.
Another way I’ve used AI is to help answer administrative questions quickly. For example, we’ve got a ton of finance and policy documents, and while those are the official sources of truth, it can be really hard to find specific information when someone needs an answer on the spot. So I created a custom GPT and uploaded those documents for it to use as a reference. One time, a lab member asked me, “How soon before a conference can I leave?” I was pretty sure it was 24 hours, but I couldn’t find the policy anywhere, and they were standing right in front of me. So I asked my GPT, and it immediately confirmed that 24 hours was the correct answer. That kind of quick access is incredibly helpful.
How do you get useful responses from GAI?
I wouldn’t say I have one specific prompt I always use because I think it’s more about how I approach the question. If I have a really long or complicated question, I don’t just ask the whole thing at once. Instead, I split it into smaller parts and ask those individually. That way, the model can focus on one thing at a time and give me a clearer, more useful response. I’ve found that asking bite-sized questions works best.
Also, the way you phrase the question really matters. If I keep it short and straightforward, it’s easier for the model to understand what I’m asking. If I’m not clear, or if the question is too long or vague, I usually won’t get a great answer. So overall, my strategy is to keep questions brief and focused.
What types of tasks do you think GAI is best suited for, and which ones does it not handle well?
One of the biggest things I’d say up front is that generative AI, like ChatGPT, is not a search engine. That’s a common misconception. People expect it to work like Google, where you ask a question and it gives you a factually correct answer. But that’s not really how it works. It’s much more conversational, and it’s basically doing a lot of processing behind the scenes to figure out what the most likely or reasonable response is. It doesn’t actually know anything; it just sounds smart because it’s really good at imitating how humans speak.
So in that sense, it’s great for certain things and not so great for others. For example, it’s really good at brainstorming, pattern recognition, and helping you get started on something. It’s like asking a very informed friend for help, but that friend can still be wrong, so you have to double-check anything important.
It’s not so good at tasks where accuracy is critical unless you’re closely reviewing everything it gives you. When I use it to generate citations for a professor’s textbook, it saves me a ton of time, but I still check every single detail like titles, URLs, and formatting because if anything’s off, it reflects poorly on our work. It definitely speeds up the process, but I wouldn’t trust it to get everything right on its own.
Has using generative AI improved the way you approach your work in any lasting way?
Using generative AI has saved me time. It hasn’t changed everything I do, because there are still plenty of parts of my job where AI just isn’t helpful. But for the things it can help with, it’s made a noticeable difference, and that gives me a chance to step back and look at the bigger picture of where else I can save time. What other GPTs can I build to make my own life and my coworkers’ lives easier? It’s nice to have a little bit of space to look at everything and say, okay, how can I optimize my work and automate more of my day to free up time for other things? And I do keep finding new ways to use it.
I know not everyone feels the same way. Some of my coworkers are skeptical. They’ll say things like, “I can just do this better myself,” and I know there’s a lot of worry in the broader world that AI might replace jobs. But I always tell them that if you don’t find it useful right now, that’s totally fine. It’s just a tool, one that I find very helpful, but I’m not trying to convince everyone they need to use it.