Zero Tolerance Policy
nYtevibe has an absolute zero-tolerance policy for child sexual abuse material (CSAM), child sexual exploitation, and any content or behavior that endangers minors. We are committed to protecting children and cooperating fully with law enforcement to ensure their safety.
1. Our Commitment
E&EL Global Inc., the company behind nYtevibe, is firmly committed to preventing the use of our platform for any purpose related to the sexual abuse or exploitation of children. This policy applies to all users, all content, and all features of the nYtevibe platform worldwide.
Scope: This policy covers all forms of child sexual abuse and exploitation, including but not limited to:
- Creation, distribution, or possession of child sexual abuse material (CSAM)
- Grooming or solicitation of minors
- Trafficking of minors
- Any sexual content involving individuals under 18 years of age
- Any conduct that facilitates or promotes the sexual exploitation of children
2. Platform Design and Age Restrictions
nYtevibe is a nightlife venue discovery and community platform designed exclusively for adults:
- Minimum Age: Users must be at least 13 years old to create an account, consistent with the Children's Online Privacy Protection Act (COPPA)
- Nightlife Features: All venue-related features (vibe check-ins, Scout Clips, social map, bookings) are designed for users aged 21 and older
- Content Context: All user-generated content (Scout Clips, vibe reports) is tied to nightlife venues and is subject to content moderation
- No Private Messaging: nYtevibe does not offer private direct messaging between users, reducing the risk of unsupervised contact with minors
- GPS Verification: All content submissions are geofence-verified at adult venue locations, which sharply limits the platform's usefulness as a channel for contacting or targeting minors
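To illustrate what geofence verification of a submission can look like, here is a minimal sketch: it computes the great-circle (haversine) distance between a submission's GPS fix and the venue's registered coordinates, and accepts the submission only if it falls within the venue's radius. This is an assumption-laden illustration, not nYtevibe's actual implementation; the function names, data shapes, and 100 m default radius are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def within_geofence(submission, venue, radius_m=100.0):
    """True if the submission's GPS fix lies inside the venue's geofence."""
    d = haversine_m(submission["lat"], submission["lon"],
                    venue["lat"], venue["lon"])
    return d <= radius_m

# Example: a fix ~45 m from the venue passes; one several km away is rejected.
venue = {"lat": 40.7580, "lon": -73.9855}
near = {"lat": 40.7584, "lon": -73.9855}
far = {"lat": 40.8000, "lon": -73.9855}
```

In practice a production check would also validate GPS accuracy, timestamp freshness, and mock-location signals, since raw coordinates alone are spoofable.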
3. Prohibited Content and Conduct
The following content and conduct are strictly prohibited on nYtevibe and will result in immediate account termination and reporting to authorities:
- Any sexual content involving minors (individuals under 18 years of age)
- Any content that depicts, promotes, normalizes, or glorifies child sexual abuse
- Sharing, distributing, or soliciting child sexual abuse material (CSAM) in any form
- Attempting to contact, groom, or solicit minors for sexual purposes
- Using the platform to facilitate trafficking or exploitation of minors
- Creating accounts impersonating minors for the purpose of exploitation
- Any content that sexualizes minors, including AI-generated, illustrated, or fictional content
- Sharing links to external sites containing CSAM or exploitation material
Immediate Action: Any account found to be in violation of these prohibitions will be immediately and permanently terminated. All associated data will be preserved for law enforcement purposes. We will file reports with the National Center for Missing & Exploited Children (NCMEC) and cooperate fully with law enforcement investigations.
4. Detection and Prevention Measures
4.1 Content Moderation
- All user-generated media (Scout Clips, photos) are subject to automated and manual review
- Content is verified as being captured at licensed adult venue locations via geofencing
- Media uploads require in-app capture (gallery uploads are rejected), reducing the risk of pre-existing harmful content being uploaded
- Community reporting tools allow any user to flag suspicious content or behavior
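The moderation flow above can be sketched as a simple routing function: hash matches against a known-CSAM list are blocked and escalated first, geofence failures are rejected, borderline automated-scanner scores go to human review, and only clean uploads publish. This is an illustrative sketch under assumptions; the `Upload` fields, `route` function, and 0.5 review threshold are hypothetical and do not describe nYtevibe's actual systems.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    media_id: str
    geofence_verified: bool   # captured in-app at a licensed venue location
    hash_match: bool          # matched a known-CSAM hash list (e.g. vendor API)
    classifier_score: float   # 0.0-1.0 risk score from an automated scanner

def route(upload: Upload, review_threshold: float = 0.5) -> str:
    """Return the moderation outcome for one upload.

    Order matters: hash matches are blocked and escalated before any
    other check, geofence failures are rejected outright, and borderline
    classifier scores go to manual review rather than auto-publishing.
    """
    if upload.hash_match:
        return "block_and_escalate"  # preserve evidence, escalate for reporting
    if not upload.geofence_verified:
        return "reject"              # not captured at a verified venue
    if upload.classifier_score >= review_threshold:
        return "manual_review"
    return "publish"
```

Ordering the hash check first ensures a confirmed match is never published or silently rejected without escalation, regardless of any other attribute of the upload.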
4.2 Account Verification
- Users must provide a valid email address and phone number for account creation
- Social authentication (Google, Apple) provides additional identity verification
- Device fingerprinting tracks device-level activity for suspicious pattern detection
4.3 Behavioral Monitoring
- Our credibility and trust scoring system monitors user behavior patterns
- Users whose credibility scores fall below a defined threshold are subject to increased scrutiny
- Unusual activity patterns are flagged for manual review
- IP addresses and device information are logged for all account actions
5. Reporting Mechanisms
We provide multiple channels for reporting CSAE concerns:
5.1 In-App Reporting
- Every piece of user-generated content has a "Report" option
- Reports filed under the child safety category are escalated immediately to our Trust & Safety team
- Reporters' identities are kept confidential
5.2 Direct Contact
5.3 External Reporting
We encourage anyone who suspects child exploitation to also contact:
- NCMEC CyberTipline: www.missingkids.org/gethelpnow/cybertipline or call 1-800-THE-LOST (1-800-843-5678)
- FBI: tips.fbi.gov
- Internet Crimes Against Children (ICAC): Contact your local ICAC Task Force
- Local Law Enforcement: Call 911 for emergencies involving immediate danger to a child
6. Response Procedures
When CSAE-related content or behavior is identified or reported:
- Immediate Removal: Suspected CSAM is removed from public view immediately upon detection or report
- Account Suspension: The associated account is immediately suspended pending investigation
- Evidence Preservation: All relevant data (content, metadata, IP addresses, device information, account history) is preserved for law enforcement
- NCMEC Report: A CyberTipline report is filed with the National Center for Missing & Exploited Children as required by U.S. federal law (18 U.S.C. § 2258A); we aim to file within 24 hours of detection
- Law Enforcement Cooperation: We cooperate fully with law enforcement investigations, providing all requested information and data
- Permanent Ban: Confirmed violators are permanently banned from the platform with no possibility of reinstatement
7. Data Retention for Safety
While nYtevibe generally minimizes data retention (vibe reports expire after 60 minutes), data related to CSAE investigations is handled differently:
- Content flagged as potential CSAM is preserved in a secure, access-restricted environment
- Account data associated with CSAE violations is retained as required by law and until law enforcement confirms it is no longer needed
- Only authorized Trust & Safety personnel and law enforcement have access to preserved evidence
- Data preserved for CSAE investigations is stored with encryption at rest and in transit
8. Employee and Contractor Training
- All employees and contractors with access to user content receive training on identifying and reporting CSAE
- Trust & Safety team members receive specialized CSAE training
- Regular refresher training is conducted to stay current with evolving threats and legal requirements
- Background checks are conducted for all employees with access to user data
9. Legal Compliance
This policy is designed to comply with applicable laws and regulations, including:
- 18 U.S.C. § 2258A: Federal reporting requirements for electronic service providers
- PROTECT Act (2003): Prosecutorial Remedies and Other Tools to end the Exploitation of Children Today
- COPPA: Children's Online Privacy Protection Act
- CSAM Laws: Federal and state laws prohibiting child sexual abuse material
- EARN IT Act (proposed): Voluntary alignment with its recommended best practices for preventing online child exploitation
- EU Directive 2011/93/EU: Combating sexual abuse and sexual exploitation of children (for EU users)
10. Transparency and Accountability
- We will publish an annual transparency report detailing the number of CSAE reports received, actions taken, and reports filed with NCMEC
- This policy is reviewed and updated at least annually
- We participate in industry coalitions and working groups focused on child safety online
- We welcome feedback on our child safety practices from experts, NGOs, and the public
11. Policy Updates
This CSAE policy may be updated to reflect changes in law, technology, or best practices. Material changes will be communicated through the platform and posted on this page. The "Last Updated" date at the top of this page indicates the most recent revision.