Protecting cyber health with AI
Social media sites and gaming communities face cyberbullying, violence, toxic content, and other online dangers, along with questions about the hosting company's responsibility for them.
Today, 90% of game developers rely on some form of content moderation, typically staffed by human moderators. Two Hat offers content moderation powered by Artificial Intelligence (AI) that blocks incitement to violence and the sharing of pornographic or CSAM images. Two Hat needed a marketing and policy directive that would showcase Two Hat Community Sift as a solution for social and gaming sites.
Getting the attention of the right people
Yes& leveraged our global media contacts and relationships with Washington, DC policymakers to build awareness of an AI solution that could replace human content moderators.
Two Hat's senior managers, all former Disney Interactive executives based in British Columbia, want to make the Internet and social communities safer for families. We assisted with a global Internet safety conference convened by Australia's eSafety Commission. Finally, we are working with legislators and policy staff on the Children's Online Privacy Protection Act (COPPA 2), now before Congress, to create incentives for companies to use automated content moderation. Two Hat's CEO has met on the Hill with Senator Ed Markey, who leads on COPPA, and with two Members of the House of Lords, Baronesses Cox and Kidron, who lead on Online Harms.
Funding and global attention
Our team has secured global media visibility for these industry leaders in the Financial Times, the New York Times, the Wall Street Journal, BBC TV and BBC Radio, and the Canadian Broadcasting Corporation (CBC). We also provided contacts and advocacy work at the UK Parliament on the Online Harms legislation, which establishes a Duty of Care toward young people on social and gaming platforms.
Two Hat has just closed an additional $7.5 million funding round from major Silicon Valley VCs to provide expansion capital.