Artificial Intelligence at GUC: What We Need to Know

Apr 28, 2025

ChatGPT, Bard, CoPilot, Alexa, Siri – these are just a few examples of how artificial intelligence (AI) is quickly becoming a part of our daily lives, both personally and professionally. While these tools are great at making our lives more efficient, it's essential for us to stay informed and aware of best uses and practices. 

As AI is being introduced by companies that provide software, hardware, devices, and services already in use at GUC, our IT Department continually evaluates how it is being used and what it means for protecting GUC data from unauthorized or malicious use.  

“It's crucial for us all to be mindful of how we use AI to ensure it is accurate and appropriate for the intended purpose,” said Director of IT Andy Anderson. “This helps maintain the integrity, reliability, and safety of our systems and communications.” 

Several GUC-purchased products, including Microsoft, GIS, BOSS, CCS, and iPhones, are currently incorporating, or will soon incorporate, AI into their offerings. While IT is communicating with employees who work directly with these products, we should all be aware of AI add-ons for these or other products and understand that third parties may be gaining access to GUC data and systems.  

If you are using AI for work-related purposes or thinking about it, please remember to contact the IT Department for assistance and guidance. Public Information is working with IT to share AI-related information and helpful tips to employees in a variety of ways, beginning here in Spotlight with a focus on the new AI Policy. 

Dissecting the AI Policy
In 2024, a new section was added to the Acceptable Use Policy to address AI, and all employees were asked to sign the updated Policy. The new AI section has seven parts. This month we will focus on the Confidentiality and Labeling sections, 9.d and 9.f. AILM, referenced below, stands for "artificial intelligence language model."  

9.d. Confidentiality: Confidential, sensitive, and GUC proprietary information must not be entered into an AILM tool, as information may enter the public domain. Users must follow all applicable data privacy laws and company policies when using AILM. 

As with all communication that employees have with others outside of GUC (email, text, phone, etc.), we must treat AI similarly with the understanding that we are communicating with a tool that could potentially share the information with a third party. For example, if you would not disclose information in an email, like a customer’s social security number, then it should not be input into an AI tool.  

Some AI tools, such as Microsoft CoPilot, which is being rolled out by IT, are configured to ensure that data entered by GUC employees doesn't get shared outside of GUC. Before using any AI tool or AI add-on to software you use, make sure you understand how that system stores and protects GUC data. 

9.f. Label: Content produced via AILM must be labeled or footnoted as containing AILM information. 

By including a disclaimer label with anything generated by, or with the assistance of, an AI tool, you ensure that the recipient is aware of AI's involvement in the content creation process, which helps maintain trust, credibility, and compliance with regulations.  

A good example for labeling is this article! For the first time in the history of Spotlight, we used Microsoft CoPilot to assist with organizing notes and generating ideas for this story. Therefore, we included the “AI edited” note at the end.  

Another common use of AI in our industry is note-taking for meetings. Suggested disclaimer labels for this instance are “Made with AI” or “AI edited.”  

Again, when in doubt, check with the IT Help Desk. 

A GUC AI Project in the Works
As mentioned earlier, our IT Department continually evaluates AI tools and how they could assist our GUC teams without compromising sensitive data. Todd Cannon, IT Applications Manager, shared with us a sneak peek into an AI pilot project they have with the Gas Department: analyzing images of gas meters.  

Our Gas Department uses a vendor to inspect and take photos of meters and assign each meter a rating. These ratings allow staff to prioritize meter maintenance and replacement.  

IT’s pilot project involves providing old inspection photos to an AI tool, which then rates each meter. IT compares the vendor’s rating with the AI-generated rating to see how closely they match.  
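The comparison step described above could be sketched roughly as follows. This is a minimal, hypothetical Python illustration of checking agreement between two sets of ratings; the rating scale, batch data, and function names are illustrative assumptions, not GUC's actual system.

```python
# Hypothetical sketch: comparing vendor meter ratings with AI-generated ratings.
# The 1-5 rating scale and the sample batch below are illustrative assumptions.

def agreement_stats(vendor_ratings, ai_ratings, tolerance=0):
    """Return (exact_matches, within_tolerance) counts for paired ratings."""
    exact = sum(1 for v, a in zip(vendor_ratings, ai_ratings) if v == a)
    close = sum(1 for v, a in zip(vendor_ratings, ai_ratings) if abs(v - a) <= tolerance)
    return exact, close

# One hypothetical batch of 25 meters, rated 1 (good) to 5 (replace soon)
vendor = [3, 1, 4, 2, 5, 1, 3, 2, 4, 3, 1, 2, 5, 4, 3, 2, 1, 3, 4, 2, 5, 1, 3, 2, 4]
ai     = [3, 1, 4, 3, 5, 1, 2, 2, 4, 3, 1, 2, 4, 4, 3, 2, 1, 3, 4, 2, 5, 2, 3, 2, 4]

exact, close = agreement_stats(vendor, ai, tolerance=1)
print(f"Exact matches: {exact}/25, within 1 point: {close}/25")
```

A summary like this, run batch by batch, is one simple way feedback from engineering staff could be tracked as the model improves.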

"Right now, we are working in batches of 25 meters and gathering feedback directly from Gas Engineering staff, which helps teach the AI system to become better and more accurate,” said Todd. “Our goal is to eventually have the Gas Department use AI as a quality control tool to help them in the prioritization of meter maintenance.”  

The Gas pilot project, along with other AI initiatives, involves collaboration across all IT teams, including Nick Klapp, Joe Metz, Taylor Nethercutt, Isaiah Bradshaw, John Bartolotti, Adam Wainright, Aaron Johnson, Eban Kea, Jason White, Amber Garris, and Tyler Reynolds.  

*This article was edited with assistance from AI. 
