# user-safety

Here are 5 public repositories matching this topic...


Independent researcher bridging AI ethics theory and implementation. Building preventive systems for education and welfare, with a background in call center hospitality.

  • Updated Sep 16, 2025
ToxicGuard_AI

ToxiGuard AI is a browser extension that detects and censors toxic language in real time using TensorFlow.js. It offers fine-grained controls, visual feedback, auto-censoring, and adjustable sensitivity, and it respects user privacy.

  • Updated Nov 1, 2025
  • HTML
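
The description mentions in-browser toxicity detection with TensorFlow.js and an adjustable sensitivity setting. A minimal sketch of how such detection could work, assuming the off-the-shelf `@tensorflow-models/toxicity` model and a hypothetical `censorText` helper (not necessarily what ToxiGuard AI actually does):

```typescript
// Minimal sketch, not ToxiGuard AI's actual implementation: classify text with
// the pre-trained TensorFlow.js toxicity model and mask text it flags.
// SENSITIVITY and censorText are assumptions for illustration only.
import * as toxicity from '@tensorflow-models/toxicity';

const SENSITIVITY = 0.85; // lower value => more aggressive flagging

async function censorText(text: string): Promise<string> {
  // In a real extension the model would be loaded once and reused;
  // shown inline here for brevity. Empty array = use all toxicity labels.
  const model = await toxicity.load(SENSITIVITY, []);
  const predictions = await model.classify([text]);

  // Each prediction covers one label (toxicity, insult, threat, ...);
  // results[0].match is true when that label's score crosses the threshold.
  const isToxic = predictions.some(p => p.results[0].match === true);
  return isToxic ? '*'.repeat(text.length) : text;
}

// Example usage, e.g. from a content script scanning DOM text nodes:
censorText('you are an idiot').then(console.log);
```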
