
Facebook is Dying, Here’s Proof

https://whiskeytangotexas.files.wordpress.com/2018/11/e46fe-1zuqxgccwui09vt0imrkoxg.jpeg?w=623&h=350

Facebook is not sorry, it’s getting desperate.

The #DeleteFacebook hashtag is alive and well. It’s become an attractive option for anyone brave enough to confront their mobile addiction. In fact, we’re deleting the app and our accounts at such a pace that Facebook is fighting back! Yes, you heard me right.

Facebook is now making users wait twice as long to delete their accounts

So you finally realized your life would be better without Facebook, Instagram, Messenger, and WhatsApp, only to notice something odd: it’s harder than ever to quit, because Facebook now makes saying goodbye a drawn-out affair. Of course, Facebook doesn’t want you (its product) to leave.

In a shocking and bizarre change to the product and customer experience, it now takes a full month to delete your Facebook account — twice as long as before. That’s not exactly free will, or the right to say no and break up with an app you may have spent far too much of your life on.

Facebook Thinks We Should Have a “Grace” Period

Facebook thinks it should override our decision by making us wait longer. Facebook really understands trust and consumers, it seems. When a user tries to delete their account, it makes them wait out a “grace period” before the account is actually deleted.

Facebook must seriously be losing a lot of users in 2018 to pull this stunt. Where’s the strategy, bro? The change comes after months of scandals and PR crises for Facebook, including Cambridge Analytica and the recent hack affecting 50 million accounts. The case for deleting Facebook’s apps has never been clearer to people under 45.

Facebook thought it could be the gateway to the world and help the world feel closer together; instead it made us the product, harvested our data, and sold it to the highest bidder, not to mention scamming brands out of millions of dollars each year. I may not even exist on Facebook anymore, but its targeting data on me still does — think about what that means for a while.

Facebook even shared user data with Chinese companies, and it handed over “deep access” to user data to 60 other tech companies. Facebook’s laundry list of privacy invasions is downright criminal, and not only should Mark Zuckerberg not be its CEO, he should be held accountable.

Listen, if a data breach affecting 50 million users could cost Facebook $1.6 billion in EU fines, how much should it pay the average user for its crimes in the long run? Facebook is already breaking the law with breaches of the European Union’s General Data Protection Regulation (GDPR).

A One-Month Wait to Delete Facebook is Worse than Censorship

So now we are prisoners online, it would seem, according to Facebook. Thinking of deleting your Facebook account? Not so fast. Sorry, my friends, it will now take an entire month, up from 14 days before. Even if you delete your account, don’t expect Facebook to put your data in the trash bin. That’s impossible.

Facebook’s trust dilemma is so severe the stock could decline further even as pundits say it’s fine. Just wait until we find out how many users leave Facebook’s flagship app by 2020. It’s going to be pretty epic. The change to the deletion time was first noticed by The Verge.

Amazon gives me speed and convenience; Facebook gives me fraud and imprisonment. Good luck competing in the future of advertising, Facebook. This is the sort of thing that will make users revive the #DeleteFacebook campaign, and it certainly makes me upset.

  • When a user decides to delete their Facebook account, it doesn’t actually get deleted straight away. Instead, there’s a “grace period,” in which the account remains inactive but accessible — just in case the user gets cold feet and decides to stay on Facebook after all.
  • Aw, yeah, I really can’t quit, just one more thumb scroll of my legacy feed where nobody I know is active anymore.

Facebook seems to think nostalgia is cool; historically, that grace period has been 14 days, or two weeks. I bet Mark is nostalgic for the good old days, but major failures in leadership, strategy, and pivoting have led Facebook down a dead-end path. When the vanity of having billions of users retreats into the past, Facebook doesn’t have a product, because the product has always been you and your data!

Silicon Valley doesn’t take the regulation of AI seriously (because it’s too expensive), and Facebook and YouTube are prime examples of this. When the talent exodus starts, as it has for Facebook and Snapchat, it’s pretty serious. There’s no saving a sinking ship that makes it harder to leave.

Don’t be afraid; we all have to move on. It’s best to terminate now if you want your data deleted. It’s time to do the unthinkable and lead a higher-quality life:

https://whiskeytangotexas.files.wordpress.com/2018/11/b52f5-1e52orrngpnv4nrjlekvrdg.jpeg?w=249&h=155

  • Delete Facebook
  • Delete Messenger
  • Delete Instagram
  • Delete WhatsApp

Facebook was once a candy treat of the digital dopamine variety for human connection. That era is long gone.

Of course, a longer grace period is also to Facebook’s advantage, as it mindf8cks us into thinking it’s still relevant. Twitter is nostalgia; Facebook is just dumb.

If your value is tied to massive numbers of users and those users are leaving you, you have nowhere to go but down. Instagram is no YouTube, and WhatsApp is no WeChat. If only they had had the sense to change their CEO, things could have been different. But all good things must come to an end, even the app of many regrets.

Source: Michael K. Spencer | Medium.com


Facebook Sued By PTSD-Stricken Moderator For Non-stop Exposure To “Rape, Torture, Bestiality, Beheadings, Suicide And Murder”

A Northern California woman hired to review flagged Facebook content has sued the social media giant after she was “exposed to highly toxic, unsafe, and injurious content during her employment as a content moderator at Facebook,” which she says gave her post-traumatic stress disorder (PTSD).

Selena Scola moderated content for Facebook as an employee of contractor Pro Unlimited, Inc. between June 2017 and March of this year, according to her complaint. 

“Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder,” the lawsuit reads. “To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, Facebook relies on people like Ms. Scola – known as ‘content moderators’ – to view those posts and remove any that violate the corporation’s terms of use.”

“You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off,” one content moderator recently told the Guardian.

According to the lawsuit, Facebook content moderators are asked to review over 10 million potentially rule-breaking posts per week, with an error rate of less than one percent and a mission to review all user-reported content within 24 hours. Making the job even more difficult is Facebook Live, a feature that allows users to broadcast video streams on their Facebook pages.

The Facebook Live feature in particular “provides a platform for users to live stream murder, beheadings, torture, and even their own suicides, including the following:” 

In late April a father killed his 11-month-old daughter and live streamed it before hanging himself. Six days later, Naika Venant, a 14-year-old who lived in a foster home, tied a scarf to a shower’s glass door frame and hung herself. She streamed the whole suicide in real time on Facebook Live. Then in early May, a Georgia teenager took pills and placed a bag over her head in a suicide attempt. She live streamed the attempt on Facebook and survived only because viewers watching the event unfold called police, allowing them to arrive before she died.

As a result of having to review said content, Scola says she “developed and suffers from significant psychological trauma and post-traumatic stress disorder (PTSD)”; however, she does not detail the specific imagery she was exposed to for fear of Facebook enforcing a non-disclosure agreement (NDA) she signed.

Scola is currently the only named plaintiff in the class-action lawsuit; however, the suit says the potential class could include “thousands” of current and former moderators in California.

As Motherboard reports, moderators have to view a constant flood of information and use their judgment on how best to censor content per Facebook’s “constantly-changing rules.”

Moderating content is a difficult job—multiple documentaries, longform investigations, and law articles have noted that moderators work long hours, are exposed to disturbing and graphic content, and have the tough task of determining whether a specific piece of content violates Facebook’s sometimes byzantine and constantly-changing rules. Facebook prides itself on accuracy, and with more than 2 billion users, Facebook’s work force of moderators are asked to review millions of possibly infringing posts every day. –Motherboard

“An outsider might not totally comprehend, we aren’t just exposed to the graphic videos—you’ll have to watch them closely, often repeatedly, for specific policy signifiers,” one moderation source told Motherboard. “Someone could be being graphically beaten in a video, and you could have to watch it a dozen times, sometimes with others present, while you decide whether the victim’s actions would count as self-defense or not, or whether the aggressor is the same person who posted the video.” 

The lawsuit also alleges that “Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop … Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled. Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

Facebook told Motherboard that they are “currently reviewing the claim.”

“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” the spokesperson said. “Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling—available at the location where the plaintiff worked—and other wellness resources like relaxation areas at many of our larger facilities.”

“This job is not for everyone, candidly, and we recognize that,” Brian Doegan, Facebook’s director of global training, community operations, told Motherboard in June. He said that new hires are gradually exposed to graphic content “so we don’t just radically expose you, but rather we do have a conversation about what it is, and what we’re going to be seeing.”

Doegan said that there are rooms in each office that are designed to help employees de-stress. –Motherboard

“What I admire is that at any point in this role, you have access to counselors, you have access to having conversations with other people,” he said. “There’s actual physical environments where you can go into, if you want to just kind of chillax, or if you want to go play a game, or if you just want to walk away, you know, be by yourself, that support system is pretty robust, and that is consistent across the board.”


Source: ZeroHedge