In a letter to a U.S. senator earlier this month, Amazon confirmed that when Alexa is spoken to, it's all ears. Specifically, the company said its smart speaker doesn't always delete transcripts of conversations, even when users manually delete the recordings.

To be sure, Alexa gives users notice of its collection of recordings and the opportunity to change those settings, meaning its practice clears most privacy law hurdles in the U.S., lawyers said. But if Alexa collects medical information or background conversations, the transcripts may fall within the scope of various state laws.

Alexa's deletion habits made news earlier this month when U.S. Sen. Chris Coons, D-Delaware, released Amazon.com Inc.'s response to the senator's letter requesting details about the smart speaker's data privacy and security practices. The letter stemmed from a CNET article claiming Amazon doesn't delete transcripts after users manually delete recordings.

In its response letter, Amazon wrote that it retains customers' voice recordings and transcripts until the customer deletes them, with a few exceptions. The company confirmed that a "skill developer," akin to an app creator whose voice-driven app runs on Alexa, may also retain records of the interaction. Amazon also said the "underlying data" from a spoken request that sets a recurring action isn't deleted, though it did not specify the nature of that data.

While some may be concerned that information deleted on Alexa isn't actually erased, the practice does not violate most privacy laws in the U.S.

"With regard to privacy, an analysis typically begins with a determination of whether a user's expectation of privacy is reasonable," said Duane Morris partner Sandra Jeskie. "In connection with the Alexa device, a user's expectation of privacy is defined by the privacy policy set by Amazon for the device." 

Jeskie noted that Amazon's practice of not deleting data about repeated actions is also consistent with other privacy policies, because such an engagement "requires a longer-term retention."

But Sara Jodka, a member at Dickinson Wright, highlighted that because Alexa doesn't anonymize user identity or other information in its transcripts, Amazon could face potential legal issues if it retains information that includes protected health information (PHI) or the side conversations of people who did not agree to its terms of service.

"That opens you to categories of information and state law protections about how that information needs to be captured and stored," she said.

While it may seem harmless that Alexa can record a background conversation while a question is being directed at it, not all states allow recording with only one party's consent.

If Amazon is recording and storing background conversations in two-party consent states without prior authorization, the company could face "potential liability," Jodka said. 

To be sure, Amazon said it doesn't anonymize data so that customers can transparently review their records and so the company can improve the machine learning powering Alexa's actions.

Jodka noted that Amazon does have a business and operational need to retain certain information to improve its technology. 

"The more human conversations it collects its technology continues to learn and become almost human in nature," Jodka said. "I think the information they [Amazon] collect for Alexa is going to be used to make those other platforms more functional to know the consumer DNA of the people who use Alexa and Amazon in general."