Social and Dating

Tech News

TikTok asks for a delay in hopes of a reprieve from the Trump Administration

Trump's stance on the TikTok question has evolved over time


TikTok is asking for a delay in the U.S. law that would force its China-based owner to sell the popular app or see it banned. The delay would give the Supreme Court a chance to review the law.

A pause will pose “no imminent threat to national security” or “material harm on the government,” according to the motion.

The government asked that the court quickly deny TikTok’s request.

The law was enacted by Congress because of fears that TikTok's Chinese ownership constituted a ...


Latest Articles

  1. Roku and Instacart now offering on-screen ordering and one-hour delivery of products
  2. AI isn't always as smart as we think it is
  3. Hurricane Helene may cause problems for the semiconductor industry
  4. CNN online? That will be $3.99
  5. DirecTV acquires Dish and Sling TV for $1


    Recent Articles


    Florida passes bill that bans children under 14 from social media

    There are now three states that have taken legal action regarding kids’ social media use

    Florida is cracking down on kids’ social media use. 

    Governor Ron DeSantis officially signed bill HB 3 into law, which tightens social media restrictions for teens. The bill bars all children under the age of 14 from using social media, and all existing accounts will be deleted.

    Additionally, parents of 14 to 15-year-olds are being given more control over their kids’ social media presence. Parents are required to give permission for their 14- or 15-year-olds to be on social media sites, and if at any point they revoke that permission, the platform must delete the account. 

    “Social media harms children in a variety of ways,” DeSantis said in a statement. “HB 3 gives parents a greater ability to protect their children.” 

    “The internet has become a dark alley for our children where predators target them and dangerous social media leads to higher rates of depression, self-harm, and even suicide,” said House Speaker Paul Renner. “Thanks to Governor DeSantis’ signature, Florida leads the way in protecting children online as states across the country fight to address these dangers.” 

    The specifics of the bill

    In addition to the age requirements and the expanded parental access and control over kids’ social media use, the bill also requires users to verify their age before accessing explicit websites.

    To ensure users’ personal information is kept private, these sites will employ what is known as anonymous age verification. This means that the site itself won’t store users’ responses; instead, a third party will verify the information and discard it.

    Companies that violate the terms of the bill could be charged up to $50,000 per violation. On top of that, any minor whose account isn’t deleted before the bill goes into effect on January 1, 2025, has the right to sue the platform for up to $10,000 in damages.  

    Social media platforms strike back

    Though HB 3 has been signed into law, social media companies are expected to fight back. 

    NetChoice, a trade association of social media platforms, counts Amazon, Google, TikTok, X, Meta, and others among its members. The group sued the state of Utah in December 2023 after it passed a similar bill, arguing that the law violates the First Amendment.

    Before the bill officially passed in Florida, NetChoice sent a veto request to Governor DeSantis, arguing that the bill violates constitutional rights. 

    “We’re disappointed to see Governor DeSantis sign onto this route,” Carl Szabo, NetChoice’s vice president and general counsel said in a statement. “There are better ways to keep Floridians, their families, and their data safe and secure online without violating their freedoms.” 

    Similarly, policymakers in Montana and Arkansas sought to pass laws restricting teens’ social media use, and federal judges blocked both. The Florida law, cited as one of the most restrictive, could face a similar test.


    Thinking about signing up for Threads? Here are some things to consider.

    Do you know why Meta needs your credit score and where you go to church? Neither do we.

    The race is on!

    In just a few days’ time, Meta’s new “Threads” app has done to Twitter what no other social media company has done – signed up 100 million text-post-loving users, close to a fourth of Twitter’s audience base.

    But, despite the spectacle of the punk-out between Meta’s Zuckerberg and Twitter’s Musk and the temptation to join the crowd, experts are saying that there are too many people jumping into Threads without thinking about what they’re giving Meta in the way of personal data and tracking.

    Threads' key differentiator is its integration with Instagram, which provides a ready user base and a seamless transition. But Ani Chaudhuri, CEO at Dasera, says that could also be its Achilles heel when we look at it from a data privacy and security perspective.

    “Merging data across platforms creates a rich, integrated dataset that may be more attractive to potential threat actors. If not managed correctly, this could result in unintended data sharing and privacy breaches,” he told ConsumerAffairs, reminding our readers that this is the same company that brought us the infamous Cambridge Analytica scandal.

    Specific issues people need to be aware of

    If you haven’t leaped into Threads yet, or you just signed up, there are things you should consider before you go too far. 

    You can’t delete Threads without deleting Instagram. “Most people are eager to jump on board new social media platforms, especially if they believe all of their peers are doing it, too. There are some immediate pitfalls people need to be aware of, one of which is that if you sign up for Threads by linking your Instagram account, you cannot delete Threads later without having to delete your Instagram,” Sharad Varshney, CEO of OvalEdge, a data governance consultancy, told ConsumerAffairs. 

    “The two are ultimately married, sealing your data within its system indefinitely unless Meta changes this policy. You can only deactivate a Threads profile if you want to hang onto your Instagram, so you’ll be stuck with a dormant Threads account. So consider manually setting up an independent Threads profile with an email if you want to give yourself that ‘Threads deletion escape hatch.’”

    More advertising junk to deal with. Don’t forget – Threads is owned by Meta and you know what that means. “This comes with the usual caveats that your data will be sold in Meta's advertising platform,” Art Shaikh, founder and CEO of DigitalWill.com, said.

    “Granted, the fact that it is built on the Instagram platform means that much of the data users have shared with IG is already in Meta's database, but new interaction and engagement data will also be added. As the saying goes, if the product is free, then your data is the cost. Users should take the usual precautions.”

    You’re giving away your entire life. Well, darn near. Chris Hauk, Consumer Privacy Champion at Pixel Privacy, says he, for one, can’t trust Meta when it comes to user information and how it is used.

    “Meta exists for one reason, to collect as much information as it can about its users and then sell that information to anyone able to pay the asking price,” Hauk said.

    How much information? According to Threads’ disclosure on the Google Play store, 14 pieces, including:

    • Approximate and precise location

    • Name, email address, user IDs, home address, phone number, political and religious beliefs, sexual orientation

    • Financial info including user payment info, purchase history, credit score, and other financial info

    • Health info and fitness info

    • Emails, text messages

    • Photos and videos, voice or sound recordings, music files, and other audio files

    • Files and docs

    • Calendar events

    • Contacts

    Additional concerns

    "Potential users should take time to review this list and ensure they are comfortable with sharing their information before signing up for an account, which as a prerequisite will be tied to their Instagram account," David Abramowitz, chief technologist at Trend Micro, told ConsumerAffairs.

    Parents beware! Here's something pretty telling: Did you know that when Meta chief Mark Zuckerberg posted a Fourth of July picture of his family on Instagram, he blurred the faces of his children? If he's worried about the privacy of his children, then, shouldn't other parents be?

    Erfan Shadabi, cybersecurity expert with data security specialists comforte AG says that before a child or young adult joins Threads, parents should familiarize themselves with the app's privacy policy and not only consider that laundry list of personal information Threads collects, but how it is stored and how it is used. 

    “Assess whether the app aligns with your family's privacy expectations and values. And take note of any potential risks associated with the app's content, including user-generated content, public posts, or interactions with strangers,” he told ConsumerAffairs.

    “No matter which app is in question, parents should also actively explore the app's settings and privacy controls. Teach them how to set their profile to private, limit who can view their posts or content, and how to manage friend or follower requests.”

    Shadabi suggests that parents educate their children about the importance of strong passwords, avoiding suspicious links or downloads, and being cautious about sharing personal information or engaging with strangers online. 

    “Teach them to identify potential risks and to report any suspicious or malicious activities on the platform. Above all, maintain an open line of communication and offer ongoing guidance and support," Shadabi said.


    The TikTok ban wagon has started to roll. What this means for parents.

    Need some support? We know where you can find 300,000 parents who'll stand by your side.

    Montana has officially become the first state to completely ban TikTok for everyone in the state. It may be the first, but it’s not likely to be the last.

    As of April 2023, the app has been banned for use by federal employees and banned for use by state employees in 34 states.

    But students and Gen Z’ers are looking at a far greater squeeze. Major colleges and universities like the University of Florida, University of Wisconsin, University of Texas, University of Georgia and others have cut off access to TikTok for hundreds of thousands of students. And a recent survey found that most people over the age of 23 want the app out of everyone’s life, basically challenging the younger generation's obsession with TikTok and its value in their lives.

    The tech giants’ smoke-and-mirrors act

    Any parent who’s tried to wean their child off the high they get from any dopamine-driven app knows that if or when the walls of TikTok keep tumbling down, they’re going to be dealing with a ton of bummed-out kids.

    Titania Jordan is the chief parenting officer at Bark Technologies, a company that offers content monitoring for parents so their children can still get their internet fix without getting addicted to an app or website. Jordan told ConsumerAffairs that while Montana’s effort is a good start, it’s not going to remedy the situation.

    “What the platform really needs to do, along with other platforms such as Snapchat, is open their APIs (Application Programming Interface) to allow third-party monitoring systems to get in there and do the job of helping to protect kids from digital dangers the right way,” she said – those dangers being cyberbullying, online predation, suicidal ideation, self-harm, violent content, and disordered eating. 

    But she thinks the larger problem is going to be all the smoke and mirrors from these tech giants who “say” they’re putting in their own safety systems. 

    “They’re not," she said. "Tech companies need checks and balances, much like any organization with too much power, to ensure they are truly putting in the right protective measures to help kids stay safer on their platforms, not just protect and further their own financial interests.”

    How can parents safely monitor their child’s phone and social media use?

    With summer around the corner, Jordan encourages parents to set ground rules, set screen time limits, model positive behavior, learn how to use the parental controls on the apps and devices their kids use, and – a seemingly forgotten parental skill – spend some time with their kids outside, among other things like boundaries and contracts.

    “Parental controls can block inappropriate content, help protect kids from communicating with strangers – or worse – predators, teach our kids limits and boundaries, help establish schedules for homework, chores, or bedtime, and mostly, help set them up for success to be responsible digital citizens,” Jordan said.

    Put yourself in their shoes

    Another suggestion – from Elana Pearl Ben-Joseph, MD – is for parents to take the initiative to see for themselves just what their kids are being fed on the apps and sites they’re using.

    “The best way to monitor media that kids use is to experience the media yourself,” she said. “Test apps and play games before your kids use them. View and play apps and games together. And watch what they watch so you can talk about what they see on their screens. You know your kids best, so you're the best judge of what they can handle.”

    Jordan offered another resource that can help parents navigate their way through this: the Parenting in a Tech World Facebook Group, where more than 300,000 parents, caregivers, educators, and mental health professionals can post questions specific to their family situation. "But, also find a group of support to remind them that we’re all in this together," she said.


    TikTok is the latest social media platform to be accused of spreading misinformation

    It can sometimes be difficult to identify accurate information from inaccurate information on social media

    A new report from NewsGuard, a company that tracks misinformation on the internet, cites examples of how TikTok users are likely to bump into misinformation on the platform. While users of all ages go to the app for recipes, dance routines, and generally funny videos, searching for more serious topics may not always lead to the most accurate information.

    Searching for videos on current events topics, including climate change, COVID-19 vaccines, and Russia’s invasion of Ukraine, among several others, is likely to be met with misinformation. The NewsGuard study found that about 20% of the videos TikTok suggests after these key searches contain inaccurate information.

    The report also points to the terms that auto-populate on TikTok when users are searching for information on COVID-19 vaccines. The search term “COVID vaccine” yielded the searches “COVID vaccine injury” and “COVID vaccine exposed,” both of which may lead to videos with misinformation. On the other hand, searching for “COVID vaccine” on Google led to prompts for booster shots and health care facilities. 

    This is particularly concerning when thinking about the primary audience on TikTok – young people. It can be difficult for consumers of any age to discern what’s accurate and what’s not, but having access to legitimate information – especially where important topics are concerned – is crucial. 

    In a statement, representatives from TikTok said that the company plans to remove any misinformation from the platform. The platform’s community guidelines outline that it does not tolerate misinformation of any kind, and any videos containing inaccurate information will be removed. 

    A bigger social media problem

    Since the start of the COVID-19 pandemic, social media has been a breeding ground for misinformation. In the last few years, Facebook, YouTube, and Twitter have all had issues related to spreading misinformation about COVID-19 and the vaccines. 

    By mid-May 2020, nearly 30% of YouTube videos about the pandemic contained misinformation. The biggest culprits were entertainment news outlets, which accounted for 30% of these videos, and they had garnered over 62 million views by that point.

    After similar instances at Twitter, the company started monitoring all tweets related to the virus. Twitter started flagging tweets with misinformation about COVID-19 and the vaccines, and by early March 2021, the platform had removed over 8,400 tweets and flagged over 11.5 million accounts. 

    More recently, misinformation has been spread on nearly every social platform about abortion reversal pills. Following the Supreme Court’s decision on Roe v. Wade earlier this summer, posts on social media about abortion reversal treatment were gaining traction.

    However, the treatment has yet to be proven safe or effective, and leading health care organizations have spoken out about the dangers of taking such pills. These types of posts make it difficult for consumers to know what’s true and what’s not, which further clouds these important, and potentially life-threatening choices. 

    Battling misinformation

    Spotting misinformation on social media can be difficult. While typos or grammatical errors can be obvious signs that certain posts aren’t to be trusted, not every inaccurate post is so clearly inaccurate. Experts encourage social media users to report and flag any posts that they think contain misinformation. 

    Consumers need to be cautious and discerning when scrolling through their feeds and pay close attention to the source of social media posts. Look for experts in a particular field, original creators, and current posts – reposts of items that are several years old aren’t likely to be accurate.

    Doing more research is never a bad thing. Because misinformation can be hard to spot, taking the time to search for answers from credible sources outside of social media is the best way to ensure you’re only accessing the facts. 


    When Musk takes full ownership of Twitter, its users could see a variety of changes

    Goodbye ads, hello subscription? Maybe.

    Now that Elon Musk has another new toy to play with courtesy of his buyout of Twitter, the world will be watching every move he makes. By ponying up $44 billion to buy Twitter, Musk went all-in on his quest to improve what he calls “the digital town square where matters vital to the future of humanity are debated." 

    How will people who use Twitter see his mission play out? Among the things the SpaceX, Starlink, and Tesla CEO has said is on his wish list is shaking up Twitter’s content rules in the name of free speech. Musk thinks of himself as a "free speech absolutist" – going so far as to take a not-so-cheap shot at the company he just bought for what he views as excessive moderation.

    “If it’s a gray area, let the tweet exist,” Musk said in tweets and conversations leading up to his takeover of Twitter.

    Revoking bans?

    Does that mean Twitter’s current stance of banning harassing and abusive tweets will end on the first day that Musk is in charge? 

    “Experts who study social networks fret about Musk's push to loosen the rules of engagement on Twitter,” Bobby Allyn reported on NPR’s Morning Edition. “They say that could give license to harassers, trolls and others who abuse the platform to target people.”

    Allyn said the same experts fret that relaxing Twitter's rules will give power to those who want to exploit the platform to spread misinformation about political events, government officials, and matters related to public health and safety.

    Editing tweets and cutting out ads

    Another change – one Twitter users have requested for years – is a rudimentary edit button. There’s no guarantee that change will happen, but Musk has gone on record saying he supports letting people change what their tweets say. If Musk gives users that power, they can edit content on the fly just as they can on other platforms like Facebook and Instagram.

    Musk polled Twitter users about what changes they would like to see in advance of his takeover bid. An edit button got the thumbs-up, receiving more than 3.2 million of the 4 million votes cast.

    Lastly, users who hate advertising may no longer have to deal with it on Twitter. With Musk taking the company private, it won’t be under the same pressure to perform for shareholders as it is now.

    However, Musk has indicated he might move Twitter to a subscription model instead of making it ad-free. That's a move that's been tested out before. Last year, the company introduced Twitter Blue -- a premium service that cost $2.99 a month for additional features like different color schemes and advanced editing options.


    Facebook faces new challenges as user growth slows

    The platform is facing more competition from sites like TikTok

    Meta, the parent company of Facebook, set a Wall Street record last week, and not the good kind. 

    After reporting earnings at midweek, shares plunged and kept going down, losing $232 billion in one day alone – the biggest loss in Wall Street history.

    The company reported weaker-than-expected revenue for the fourth quarter, but that’s not what led to the wave of selling. CEO Mark Zuckerberg was blunt in his assessment of the company’s immediate future, citing inflation, supply chain issues affecting advertisers, and users shifting to alternatives that “monetize at lower rates.”

    “People have a lot of choices for how they want to spend their time and apps like TikTok are growing very quickly,” Zuckerberg said on the conference call. “And this is why our focus on Reels is so important over the long-term. As is our work to make sure that our apps are the best services out there for young adults, which I spoke about on our last call.”

    Facebook purchased Instagram when young adults switched from Facebook, which was increasingly being used by their parents’ generation. Even though Facebook has more than 2 billion users worldwide, the latest earnings report showed a slowdown in user growth.

    ConsumerAffairs reviewers weigh in

    An analysis of verified reviews of Facebook at ConsumerAffairs shows that the platform still has its fans, earning a respectable 3.4-star rating in a 5-star system. But recent reviews suggest a rising level of user frustration.

    Laurel, of Fredericksburg, Va., is among several Facebook users who are bewildered by the company’s policies.

    “I get banned from reacting, commenting, sharing, you name it, without warning because I violated some vague community standard,” Laurel wrote in a ConsumerAffairs review. “Facebook says I am ‘spamming’ people by reacting or commenting. Huh?”

    Lisa, of Fort Lauderdale, Fla., says the problem goes deeper than being temporarily banned. She says Facebook has shut down many users’ accounts since December for unspecified reasons. 

    “They have no support team to inquire, the phone numbers listed online do not work or are fraud, and the forms requested to fill out are never replied to,” Lisa told us. “I sent in 30+ requests for help and NO one has replied to my requests.”

    Challenges for small businesses

    Justin, a ConsumerAffairs reviewer from San Diego, said he owns a business and has to work with Facebook on advertising. Lately, he says it hasn’t been easy.

    “Their AI robots will reject your ad for no reason other than it made a mistake,” Justin contends in a ConsumerAffairs review. “Even worse, if their stupid robots make a big mistake, they'll shut down your entire ad account. Even worse, when you chat with their support, they'll never tell you why it was rejected or taken down. You can get everything back up and running again if you constantly send it in for a review and after many reviews, an actual human will do it and reinstate it.”

    According to Reuters, analysts are beginning to wonder if Facebook’s problems are contagious and will eventually spread to Instagram, which is favored by a younger demographic. The news service cites Insider Intelligence, a forecasting firm, which recently estimated that Instagram’s growth in users could eventually be at risk, slowing to 5.8% this year and to 3.1% by 2025.


    Facebook plans new features to promote young users’ safety

    One feature would prompt Instagram users to ‘take a break’ from the platform

    In the wake of damning testimony from a whistleblower who says the company ignores potential harm to young users, Facebook has announced a set of features that it says will promote the health and safety of teens and young adults.

    Nick Clegg, Facebook's vice president for global affairs, appeared on two network news programs on Sunday to outline changes. Among the new features is an automated prompt suggesting that Instagram users “take a break” if they spend too much time on the platform.

    The company also said it plans to introduce a feature for parents of teens so that they can monitor how their children are spending time online.

    “We are constantly iterating in order to improve our products,” Clegg said on CNN’s “State of the Union." “We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use."

    Congress steps in following whistleblower account

    Clegg said the company has already invested heavily in features that have the objective of keeping young users safe. He said a team of about 40,000 Facebook employees is at work on these features.

    Last month, former Facebook data scientist Frances Haugen leaked documents to the Wall Street Journal that suggested Facebook’s own internal research showed that there are a number of issues that could negatively affect users.

    The Journal reported excerpts from documents showing that teenage girls feel bad about themselves after viewing others’ seemingly perfect lives on Instagram. Haugen also sent the documents to Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.), who lead a Senate subcommittee on consumer protection. Both lawmakers said they were spurred to action by the leaked documents.

    “It is clear that Facebook is incapable of holding itself accountable,” Blumenthal and Blackburn said in a joint statement. “The Wall Street Journal’s reporting reveals Facebook’s leadership to be focused on a growth-at-all-costs mindset that valued profits over the health and lives of children and teens.”

    Facebook loses ground among young people

    While Facebook is trying to put out regulatory fires in Washington, the London Guardian reports that the social media giant may be losing ground with some of its young users. The Guardian cites some of the leaked documents that show erosion among younger demographics.

    An internal Facebook document warns management that Facebook’s daily teenage and young adult users have “been in decline since 2012-13.” 

    Twenty-three-year-old Oliver Coghlin is one of them, telling the Guardian that he is thinking about deleting the Facebook app from his phone because he doesn’t find the content relevant.

    “There were comments that would come up from people arguing about stuff they don’t know about,” he said.


    Facebook whistleblower revealed on ‘60 Minutes’

    The former employee leaked internal documents to the media

    The whistleblower behind charges that Facebook content is “toxic” – and that the company knows it – says she acted because she wants to make Facebook better, not damage the social media platform.

    Frances Haugen, a former Facebook computer engineer, provided thousands of internal Facebook research documents to the Wall Street Journal, which last month published a series of stories about the platform, including one that details how Facebook research shows that Instagram makes many teenage girls feel bad about their bodies.

    On CBS’ “60 Minutes” Sunday evening, Haugen went public, telling the network that the world needed to know what she knew.

    "The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money," Haugen said.

    Haugen, who previously worked at Google and Pinterest, said the research presented Facebook executives with evidence that its content is responsible for a growing amount of hate and violence around the world. But because anger often increases engagement with the platform, it can be highly profitable. 

    "I've seen a bunch of social networks, and it was substantially worse at Facebook than anything I've seen before," Haugen said. "At some point in 2021, I realized I'm going to have to do this in a systemic way, that I'm going to have to get out enough [documents] that no one can question that this is real."

    How anger makes money

    Haugen is not the first to call out the media for its role in stirring up anger, but she’s the first to present documentation. In his best-selling book “Hate, Inc.,” Matt Taibbi, former political editor at Rolling Stone, makes a case that the internet has caused even mainstream news organizations to focus on information designed to anger and energize viewers and readers.

    “We started to turn the ongoing narrative of the news into something like a religious contract, in which the idea was not just to make you mad but to keep you mad, whipped up in a state of devotional anger,” Taibbi writes.

    Facebook, meanwhile, has disputed the charges that it is toxic for society and said in a statement to the media that many of the inferences drawn from the leaked documents are “misleading.”

    Either way, the charges may get a thorough airing this week in Washington. Haugen is scheduled to appear before a congressional committee looking into her claims.


    Instagram shelves its plans for ‘Instagram Kids’ app

    The company defends its intentions and says research is on its side

    Instagram sent shockwaves through the social media community on Monday when it announced that it’s pausing any further development of its “Instagram Kids” project. 

    For months, the Facebook-owned company has resisted calls from advocacy groups and state attorneys general to shelve the idea. It tried to quell the hue and cry by announcing new safety features for younger users. In a blog post, Adam Mosseri, head of Instagram, said the company’s original intentions were good. 

    “We started this project to address an important problem seen across our industry: kids are getting phones younger and younger, misrepresenting their age, and downloading apps that are meant for those 13 or older … We’re announcing these steps today so we can get it right.”

    Instagram defends app but decides to pull the plug

    The straw that may have broken Instagram’s steadfastness came in the form of a Wall Street Journal article. In its report, the Journal asserted that Instagram’s own in-house research suggested that there was a “significant teen mental-health issue that Facebook plays down in public.”

    Instagram executive Pratiti Raychoudhury responded to the article, saying that it doesn’t accurately reflect the facts. “It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls. The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced.” 

    Raychoudhury claims that more teenage girls who struggle with issues like loneliness, anxiety, sadness, and eating issues said that Instagram made those difficult times better rather than worse. The only area where teen girls reportedly said Instagram wasn’t helpful was body image. 

    Despite its defense of the project, Instagram ultimately decided to pull the plug on the app after facing wave after wave of criticism. However, Mosseri still says the idea has merit. 

    “We firmly believe that it’s better for parents to have the option to give their children access to a version of Instagram that is designed for them — where parents can supervise and control their experience — than relying on an app’s ability to verify the age of kids who are too young to have an ID,” he wrote. 


    Congress plans probe of Instagram’s effect on teenage girls

    Lawmakers respond to published report stating Facebook knows the platform is ‘toxic’

    Two U.S. senators say they will launch an investigation into allegations that Facebook is aware that its popular Instagram platform is “toxic” for teenage girls.

    Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.), who lead a Senate subcommittee on consumer protection, were spurred to action by an investigative report published Tuesday in The Wall Street Journal.

    The report cited company documents and sources it said showed Facebook is aware that many teenage girls on the app are prone to negative body image. It suggests that the constant access to photos of fashion and fitness influencers' bodies is damaging to teens' self-esteem.

    The article cited March 2020 internal research that found that 32% of teen girls said Instagram only made them feel worse when they felt bad about their bodies. “Comparisons on Instagram can change how young women view and describe themselves,” the researchers concluded.

    Congress steps in

    “It is clear that Facebook is incapable of holding itself accountable,” Blumenthal and Blackburn said in a joint statement. “The Wall Street Journal’s reporting reveals Facebook’s leadership to be focused on a growth-at-all-costs mindset that valued profits over the health and lives of children and teens.”

    The two lawmakers said they were in touch with Facebook senior management over the summer and received “evasive” and “misleading” answers when they asked about how the platform affected its youngest users.

    “We are in touch with a Facebook whistleblower and will use every resource at our disposal to investigate what Facebook knew and when they knew it — including seeking further documents and pursuing witness testimony,” the lawmakers concluded. “The Wall Street Journal’s blockbuster reporting may only be the tip of the iceberg.”

    Facebook willing to work with Congress

    A Facebook spokeswoman told the Journal that the company welcomed “productive collaboration” with members of Congress and would seek opportunities to work with outside researchers on credible studies.

    The company also previously acknowledged internal research on the subject but said the findings are proprietary and would not be released. Congress, of course, has subpoena power.

    One question lawmakers might pursue is whether Instagram is more harmful than similar platforms. The Journal cites what it says is an internal document suggesting that Instagram is more damaging than other social media apps and sites.

    “Social comparison is worse on Instagram,” the 2020 research report states. According to the Journal, the document points out that TikTok is all about performance, while rival Snapchat is focused on jokey face filters. It said Instagram, on the other hand, focuses heavily on appearance and lifestyle.
