Hello, very nice initiative! To answer, let's take a step back :) (I tried this myself a year ago for R&D purposes.)
Every AI inside Microsoft ships with a lot of built-in guardrails and forbidden subjects (harmful content, etc.), so most of those subjects cannot be answered by the AI, by design.
For the other part, like personal identifiers: the tool that prevents storing such data is Microsoft Purview. It lives inside the tenant, and its job is to protect data.
With that in mind: managing all of these cases yourself will be very difficult, will slow the agent down a lot, and will never be 100% reliable. It's not something to handle on the agent side; it belongs to data storage and the built-in guardrails.
That said, if you can't use those tools or still want an agent to do this, you have three options:
-> Stop using Copilot Studio and move to a Microsoft Foundry agent: there you have access to the full guardrail configuration for everything you want. It was built for this.
-> Stay on Copilot Studio and use topics to detect every case you want (or a child agent for long-prompt detection). Don't expect 100% coverage: at the moment the tech wasn't built for this, because it's part of the Power Platform and already has a lot of guardrails.
-> Last option, but it will cost a lot of money: I did what you want with AI Builder (a named prompt inside the tool). When you create a topic, you can change the topic trigger so it fires every time a message is about to be sent; you then take the question, send it to a custom AI Builder prompt containing all your rules, and it will apply your rules plus Microsoft's rules. Why it's not a real-life solution: the cost. It costs real money on every prompt. If money is not a problem, it's a solution, but it will slow the agent down a lot.
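To make the third option concrete, here is a minimal sketch of the per-message pre-filter pattern it describes. This is only an illustration: in Copilot Studio the rules would live inside the AI Builder prompt and the topic trigger, not in Python, and the pattern names and rules below are my own hypothetical examples, not anything the platform provides.

```python
import re

# Hypothetical rule set standing in for the "custom AI builder with all
# your rules" step: each named rule is a pattern the message is checked
# against BEFORE it is forwarded to the agent.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def check_message(message: str) -> list[str]:
    """Return the names of the rules the message violates (empty = clean)."""
    return [name for name, pattern in PII_PATTERNS.items()
            if pattern.search(message)]

def pre_filter(message: str) -> str:
    """Mimic the per-message topic trigger: block or pass the message."""
    hits = check_message(message)
    if hits:
        # In the real setup, the topic would redirect or rewrite here.
        return "Blocked: message matched rules " + ", ".join(hits)
    return message  # safe to forward to the agent
```

Note that this runs once per message, which is exactly why the AI Builder version costs money and adds latency on every prompt.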
Important information to take into account:
-> Even if you detect everything you want to detect, it's too late in the process: the message is already in the conversation log.
-> Microsoft AI does not train on data from your tenant. Conversation logs and inference data are not sent to OpenAI for training.
-> Dangerous subjects are blocked by Microsoft's Responsible AI policy by design.
-> It's a Copilot Studio agent: the agent stores only the data you explicitly store. So if a student sends a personal ID, it will be in the conversation log and nowhere else (unless you added something that stores it, but that's another story; Purview is there for that).
I hope my insights from previous experience with this help you :) If so, please mark this post as verified (green it with the "verify answer" button): it's very important for the community and for search engines :)