On their surface, Apple's privacy policies give consumers the impression that the company cares more about protecting user data than exploiting it, as its rivals do. But new iPhone features like facial recognition and AI have raised more questions than answers about how Apple treats users' private information, according to a prominent privacy watchdog.

The Electronic Frontier Foundation (EFF) — a San Francisco-based international digital rights non-profit — says new features included in Apple’s iPhone 8 and iPhone X unveiled in September, like facial recognition and machine learning, mean “Apple is collecting more data than it ever has before.”

“Two of these features—on-device facial recognition and differential privacy—deserve a closer look from a privacy perspective,” EFF said Wednesday. “While we applaud these steps, it’s hard to know how effective they are without more information from Apple about their implementation and methods.”

Most facial recognition technology processes the information needed to identify individuals on cloud-based servers, exposing that sensitive data to a greater threat of theft. Apple bucked that industry trend in the latest version of its Photos app, which performs facial recognition locally, in the background of the smartphone, tablet, or computer's operating system.

As a result of this unconventional method, "Apple loses speed, power, and instant access to mountains of user data for its facial recognition machine learning model," according to EFF. But in exchange, "users gain something much more important: privacy and control over their information."

That helps protect highly identifiable information like photos from cyber theft, "especially in terms of law enforcement access to their data," the group says. And while the approach is "not a privacy guarantee," Apple proved the cloud doesn't have to be the default for heavy data processing.

Apple now builds its datasets with "differential privacy," meaning those sets are anonymized so the company avoids maintaining a database full of private information susceptible to leaks. The process helps Apple learn broad and detailed user trends, such as product choices, news preferences, app usage, and which words or emojis to predict in messages and searches, without revealing identifiable data.
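The core idea can be illustrated with randomized response, a classic local-differential-privacy mechanism. This is a hypothetical Python sketch, not Apple's actual implementation: each device randomly lies about its own answer, so no single report is trustworthy on its own, yet the population-level trend can still be recovered from many reports.

```python
import random

def randomized_response(true_answer, p_truth=0.75):
    """Report the true answer with probability p_truth;
    otherwise report the outcome of a fair coin flip.
    No single report reveals the user's real answer."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_population_rate(reports, p_truth=0.75):
    """Invert the known noise model:
    observed = p_truth * true_rate + (1 - p_truth) * 0.5"""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

Even though any individual report may be a lie, averaging many reports and inverting the known noise distribution recovers an accurate aggregate estimate, which is the trade-off differential privacy formalizes.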

The iPhone maker isn't the first organization to employ differential privacy; Microsoft, Google and even the U.S. Census Bureau use the technique. Apple has even gone the extra step of asking users to opt in to sharing their data, but it hasn't said much about how it simultaneously harnesses and anonymizes bulk data.

“It has publicly mentioned statistics and computer science methods like hashing (transforming data into a unique string of random characters), subsampling (using only a portion of all the data), and noise injection (systematically adding random data to obscure individuals’ information),” EFF said.
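The three methods EFF names can be sketched roughly as follows. This is illustrative Python only; the function names, parameters, and noise scale are assumptions for demonstration, not details Apple has disclosed.

```python
import hashlib
import math
import random

def hash_value(raw):
    """Hashing: replace the raw string with a fixed-length digest,
    so the original value is never stored directly."""
    return hashlib.sha256(raw.encode()).hexdigest()

def subsample(records, fraction=0.1):
    """Subsampling: keep only a random fraction of all records."""
    return [r for r in records if random.random() < fraction]

def noisy_count(true_count, scale=10.0):
    """Noise injection: add zero-mean Laplace noise to a true count,
    obscuring any one individual's contribution to the total."""
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

As EFF notes, without knowing where in the pipeline such steps are applied, and with what parameters, outsiders cannot judge how much privacy the combination actually delivers.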

But until Apple is ready to divulge more, “we are left guessing as to exactly how and at what point in data collection and analysis such methods are applied,” the group says.

“Differential privacy is still a new, fairly experimental pursuit, and Apple is putting it to the test against millions of users’ private data,” EFF says. “And without any transparency into the methods employed, the public and the research community have no way to verify the implementation—which, just like any other initial release, is very likely to have flaws.”

Though differential privacy is meant to provide mathematical safeguards, "the details of such a large roll-out can blow away those guarantees," according to the privacy experts. "[W]ith Apple both building and utilizing its datasets without any oversight, we have to rely on it to self-police."

While EFF believes Apple should be commended for its new steps toward privacy, the group thinks Apple should be more forthcoming about sharing its specific methods with "other technologists, researchers, and companies."

That may be unlikely in the cutthroat market for devices, and even less likely in Silicon Valley's race for breakthroughs. Since NSA whistleblower Edward Snowden's 2013 disclosures of mass electronic surveillance programs, and amid an ongoing FBI push to access end-to-end encrypted devices like iPhones, Apple has billed itself as the privacy-focused alternative to Google's Android, which has a reputation for monetizing bulk user data.

Apple also announced updates to its Safari web browser earlier this year, including a system dubbed "intelligent tracking prevention," which uses machine learning and other techniques to identify and disable the third-party advertising cookies that track users across the websites they visit.

Follow Giuseppe on Twitter