Now that Apple, Google, Facebook, and Amazon have been called on the carpet by the House Judiciary Committee in its investigation into competition in the digital marketplace, consumers can expect some immediate changes.

One company hoping to get ahead of the committee -- making adjustments that might otherwise be forced upon it -- is Apple, in its handling of the iPhone’s Screen Time feature.

The backstory

Last year, Apple introduced “Screen Time,” a feature built into the iPhone’s operating system (iOS) that gives parents control over how their children use their devices -- a move quickly mimicked by Facebook and Instagram. While the company’s intent was benign, its follow-through wasn’t. The mistake Apple apparently made is that it began removing third-party parental-control apps from its App Store -- apps that were similar to Screen Time but not functionally the same.

Was that move kosher? Some say yes.

“It did so under what, to me, is a perfectly fair piece of reasoning: these apps were all using (iOS network and data management code) explicitly for the purpose of monitoring a person's activity on their iOS device,” wrote Android Police’s David Ruddock, who called the situation “a boiling pot of a controversy.”

“Some of the activities tracked are genuinely innocent, such as authentic screen time apps merely meant to enforce parents' rules about phone usage for their children. Others, though, do exactly what you'd expect of an app indistinguishable from one meant to maliciously and silently spy. Full-time location tracking, geo-fencing, explicit image and text content alerts, web browsing history (and filtering), and more.”

One of the examples Ruddock points to is the “Bark” app -- which likes to think of itself as a tool that uses “advanced algorithms to look for a variety of potential issues, such as cyberbullying, sexting, drug-related content, and signs of depression.”

“Looking at all the praise the app has earned from the media and parents, you might be inclined to think this is OK - after all, Bark doesn't let parents read full chat logs, just the ones that raise red flags,” Ruddock said.

“But the fact that an app like Bark can even exist while Apple contends that privacy is priority one on iOS is simply ridiculous. An app that can scrape images, private texts, emails, web history, call logs, and location data, send it up to an untrusted third party's cloud for analysis, then return it to parents to peruse without their child's knowledge is consistent with that ethos? In reality, Bark is merely Silicon Valley spin and polish on what is a gross model: Spying on your kids - now easier and faster with AI! (Bark suggests telling your kids you're spying on them, for what that's worth.)”

Apple says, on second thought…

Nonetheless, Apple has decided to try and unscramble this egg, in hopes of having one less bullet to dodge from the Judiciary Committee.

News of that decision, however, was much quieter than the rest of the news Apple was crowing about at its recent Worldwide Developers Conference (WWDC). In fact, notice of the changes was left for the world to stumble upon inside Apple’s updated “App Store Review Guidelines.”

In the updated guidelines, Apple gives the apps in question a thumbs-up as long as the developer does not “sell, use or disclose to third parties any data for any purpose” and commits to this in its privacy policy.

“These apps were using an enterprise technology that provided them access to kids’ highly sensitive personal data,” an Apple spokeswoman told the New York Times. “We do not think it is O.K. for any apps to help data companies track or optimize advertising of kids.”
