Voice AI, data privacy, and kids: What’s now and what’s next
January 27, 2023
One of the few areas of growing bipartisan agreement among US politicians is regulatory action to protect children’s online privacy. And while a recent push for new children’s privacy bills wasn’t successful in Congress, advocates are continuing to work on getting important new legislation passed (including an update to COPPA) in the short to medium term.
Companies, particularly those with digital products used by children, have an obligation to design their solutions in a way that safeguards users’ information. They also have an obligation to stay up to date with, and employ, the latest privacy practices across their digital products and associated online services.
So, this Data Privacy Week, we’re taking a look at best practices for data and privacy protection — including, of course, children’s voice data privacy — and what’s coming next from governmental bodies.
In this privacy checkup, we explore the latest regulatory requirements, protection laws, and privacy practices.
What’s now: Data privacy practices
Staying true to privacy-by-design principles
With companies collecting an unprecedented amount of user data, privacy-by-design is an essential measure to protect personal information from unauthorized access and misuse. It’s also an important way for companies to build trust with end users and the wider public.
Privacy by design comprises six core principles:
- Taking a proactive approach to privacy that aims to prevent risks or infractions (e.g., a child accessing malicious content, or data being shared with third-party services without consent).
- Ensuring privacy is the default setting within any system or product (see the code sketch after this list).
- Embedding privacy into the design of products — and doing so in a way that still enables full functionality (avoiding false trade-offs) and great experiences across products.
- Ensuring data is kept secure and private throughout its lifecycle, and securely destroyed when no longer needed.
- Ensuring visibility and transparency for users and providers.
- Making privacy controls user-friendly.
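To make the second principle, privacy as the default, concrete: the short Python sketch below shows what privacy-protective defaults can look like in practice. Every setting that could expose a user’s data starts at its most protective value, so a child is protected even if nobody ever opens a settings screen. All of the names and fields here are hypothetical, not any real product’s API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: defaults can't be mutated in place
class PrivacySettings:
    """Hypothetical per-account privacy settings with protective defaults."""
    share_with_third_parties: bool = False      # off unless explicitly enabled
    retain_audio_for_improvement: bool = False  # strictly opt-in, never opt-out
    personalized_ads: bool = False              # disabled by default for everyone
    data_retention_days: int = 0                # 0 = delete right after processing

def settings_for_new_account() -> PrivacySettings:
    # New accounts get the strictest defaults; relaxing any of them should
    # require an explicit, verifiable opt-in (e.g., parental consent under COPPA).
    return PrivacySettings()

if __name__ == "__main__":
    s = settings_for_new_account()
    assert not s.share_with_third_parties and s.data_retention_days == 0
    print(s)
```

Defining the defaults in one frozen, auditable place is a simple way to guarantee they can’t drift or be silently changed at runtime.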
Staying accountable
Failure to uphold accountability and compliance requirements comes with significant business risk. For example, Meta was recently fined roughly $400 million for violating GDPR, Epic Games paid over $500 million to the FTC in a recent settlement, and France’s data protection authority (the CNIL) fined Apple about $8 million for failing to get user consent for targeted, personalized advertising.
Privacy laws like GDPR, COPPA, and FERPA offer important privacy safeguards, but today, companies need to think well beyond “letter of the law” compliance. They need to implement processes and structures that bring those measures to life, building them into the DNA of their products and into the actual, daily end-user experience.
What’s next for data privacy
States taking the lead
In the US, at least, states have been more active than the federal government in creating new privacy regulations. Connecticut and Utah will join California, Colorado, and Virginia with new privacy laws going into effect this year.
More regulation on the way?
Yes, there is. An article this month in the MIT Technology Review predicts new EU regulations for generative AI, and it’s likely that US regulators, including the FTC, will follow suit.
Taking a closer look via impact assessments and algorithm audits
The California Age-Appropriate Design Code, which was passed in 2022 and goes into effect in 2024, requires covered entities to complete a data protection impact assessment for any new product, service, or feature, including plans to mitigate or eliminate risk to young users.
And as the use of algorithms expands, the Stanford Institute for Human-Centered Artificial Intelligence (HAI) offers best practices on conducting audits to understand the impact of algorithms on both individual users and society more broadly.
Privacy, voice tech, and kids
Privacy-minded voice AI companies like SoapBox protect children’s voice data by following data best practices (see the sketch after this list), including:
- Keeping data secure through anonymization and encryption.
- Preventing other (non-voice) data related to the identity of a child from being collected or received by the system.
- Only using voice data to improve accuracy (never for marketing, and never shared with third parties for any other purpose).
- Giving clients the choice on the jurisdiction for data processing and whether data can be retained for product improvement purposes.
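As a rough illustration of the first two practices above (and not SoapBox’s actual pipeline), here’s a minimal Python sketch: the speaker identifier is pseudonymized with a salted one-way hash, and the audio is encrypted at rest. It assumes the third-party cryptography package; the salt handling, key management, and function names are all hypothetical, and a real system would pull keys from a managed key store.

```python
import hashlib
import os

from cryptography.fernet import Fernet  # pip install cryptography

# Salt kept out of source control; the fallback value is a placeholder only.
SALT = os.environ.get("SPEAKER_SALT", "replace-with-a-secret-salt").encode()

def anonymize_speaker_id(raw_id: str) -> str:
    """Replace a child's identifier with a salted one-way hash (a pseudonym)."""
    return hashlib.sha256(SALT + raw_id.encode()).hexdigest()

def encrypt_audio(audio_bytes: bytes, key: bytes) -> bytes:
    """Encrypt raw audio so it is unreadable without the key."""
    return Fernet(key).encrypt(audio_bytes)

if __name__ == "__main__":
    key = Fernet.generate_key()  # in production: fetched from a key-management service
    pseudonym = anonymize_speaker_id("child-account-123")
    ciphertext = encrypt_audio(b"\x00\x01fake-audio-frames", key)
    print(pseudonym[:16], len(ciphertext))
```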
About SoapBox’s approach to voice data privacy
SoapBox is a privacy-first company. We take a privacy-by-design approach to building our voice technology and to the collection and processing of children’s voice data.
Since 2013, SoapBox has worked with PRIVO, an FTC-approved COPPA Safe Harbor provider in the US, to ensure we maintain our role as a leader in privacy-first processes and approaches.
We believe that all children have a right to voice data privacy, and we will continue to advocate for the expansion of legislation covering data protections for children in the US and globally.
Resources on voice data privacy
Looking to dive deeper into voice AI, data privacy, and kids? Here are some of our popular privacy resources and publications:
- Voice Prints and Children’s Rights – Submission to the UN Office of the High Commissioner for Human Rights
- Let’s Talk Voice Tech, Data Privacy, and Kids – CEO Dr. Martyn Farrows’ guest post in Voicebot.ai
- Data Privacy Matters for Kids. Here’s Why – Interview with children’s data privacy expert Dr. Veronica Barassi