This page documents how Next in Arts receives reports, makes review decisions, and tracks SLA outcomes for moderation actions.
We review reports covering harassment, hate speech, illegal content, spam, impersonation, and rights violations across submissions, profiles, responses, and collaboration surfaces.
Reports enter a moderation queue, are triaged by severity, and can be resolved, dismissed, or escalated to legal/safety response. High-risk incidents are prioritized.
P0/P1 incidents receive immediate or same-day handling. Standard policy violations are reviewed within target timeframes defined in trust operations runbooks.
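The triage flow above can be sketched as a severity-to-SLA mapping. This is a minimal illustration, not the actual implementation: the severity labels beyond P0/P1, the `72`-hour standard target, and all function names are hypothetical assumptions, since the page only commits to "immediate or same-day" handling for high-risk incidents.

```python
from enum import Enum

class Severity(Enum):
    P0 = "critical"   # immediate handling
    P1 = "high"       # same-day handling
    P2 = "standard"   # standard policy-violation queue (hypothetical tier)

# Hypothetical SLA targets in hours; actual runbook values are not published.
SLA_HOURS = {Severity.P0: 1, Severity.P1: 24, Severity.P2: 72}

def sla_target_hours(severity: Severity) -> int:
    """Return the review-time target (in hours) for a report of this severity."""
    return SLA_HOURS[severity]

def is_overdue(severity: Severity, hours_open: float) -> bool:
    """True if a report has been open longer than its SLA target."""
    return hours_open > sla_target_hours(severity)
```

For example, a P1 report open for 30 hours would be flagged as overdue, while a standard-tier report at the same age would still be within its window.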
Internal moderation metrics are tracked continuously, with weekly trust governance reviews. Public summaries cover a rolling 30-day window.
If you believe a moderation action was incorrect, contact trust@nextinarts.com with your case details to request a review.