Is your voice safe? Cyber security experts say your ‘voice print’ needs just as much protection as other personal information

Melinda Rizzo//September 27, 2019


Is voice recording data security the next identity frontier?

Voice technology is now used pervasively for identity recognition, personal and business orders, paying for goods, accessing financial accounts, and collaborating with colleagues near and far. That pervasiveness creates the same security minefield as any other personal or business information.

“If you are allowing your voice print [recording to be made] essentially your voice is being digitized…it’s like a fingerprint,” said Craig Stonaha, president of Laughing Rock Technology in Sinking Spring.

Stonaha said voice security involves an element of trust. "We've been telling people for decades to verify who a person is before giving sensitive information. Voice recognition is easy, it's convenient, it's unique, and you always have it with you," he said.

Rebecca Wang, assistant professor of marketing at Lehigh University, said the industry is new and needs standard operating procedures.

“There should be guidelines [for use and participation] at least, and I don’t think we’re there yet,” Wang said.

And as good as voice and artificial intelligence technology is becoming, Wang said, it is not human: it can't take in nuance, and it can't render decisions outside its programming.

She suggested government should take a bigger role in addressing cyber security with voice technology because hacking breaches, especially for smaller firms, are a greater risk to consumers.

Stephanie Spangler, an associate attorney with Norris McLaughlin P.A. in New York City (the firm also has an office in Allentown), said no Pennsylvania laws specifically regulate voice technology usage, and no federal laws specifically address voice data either.

“Only time will tell how [voice technology law] evolves,” she said.
“[For now] how data is collected and used through voice recognition technology is included under privacy laws.”

Spangler said a California law covers voice recognition technology as it relates to new smart television installations, but so far the law is specific only to smart TVs.

“At the end of the day, law is reactionary,” she said. “It’s going to fall behind slightly. Technology is growing at such an expedited pace, it’s not [realistic] for the law to anticipate it.”

Several cyber security regulations extend to voice recognition, including the Payment Card Industry Data Security Standard (PCI DSS) as well as the Health Insurance Portability and Accountability Act (HIPAA), Stonaha said.

“The regulations are taken very seriously and the fines are steep,” Stonaha said. “These are resources in the tool kit.”

Cloud-based systems and VoIP (Voice over Internet Protocol) phone systems that record calls automatically digitize voice conversations, he said.

Pennsylvania is a two-party consent state for phone call recording, Spangler said. That means both parties must agree for the conversation to be recorded.

Experts interviewed agree keeping your vocal signature safe is as important as any kind of personal information protection. They advise:

  • Know who is recording your voice.
  • Be in control of home or office devices. Decide when the device will be turned on and collecting data, and when it is turned off. Regularly review stored voice data and delete recordings from the device’s history.
  • Because data breaches are a huge concern for consumers, only purchase goods or services using voice technology from reputable retailers or vendors.
  • Ask if your voice data will be disclosed to a third party. “Certainly, you don’t want anyone to have access to that,” Spangler said.
  • Ask how the data will be used. Is it transcribed and shared with a third party for a purpose you haven’t agreed to?
  • Look for relevant terms of use and privacy policies, and read the fine print.
  • Understand that beyond “bad actors” or nefarious parties gaining access to voice data, law enforcement may also obtain it in the course of criminal investigations.

“If you look at relevant terms of use and privacy policies, which are incorporating voice recognition technology, there is explicit language that advises data can and may be disclosed to law enforcement,” Spangler said.

When using Alexa or Google Assistant, people may not be thinking about those things, she said.

Understand how the voice data will be used, before you agree to voice activated services. “By agreeing to use the devices we agree to have the biometric data collected,” Spangler said.

She added that using voice recognition services or devices is completely voluntary.

“You don’t have to use these devices, but at some point voice recognition devices will be integrated into everyday activities,” Spangler warned.

It’s all the more important for consumers to be aware of these issues, she said, because the laws people count on to protect them may not yet be on the books.