The truth about Facebook

solarguy
Posts: 126
Joined: Wed Jul 12, 2017 3:29 pm

The truth about Facebook

Post by solarguy » Thu Mar 01, 2018 1:43 pm

http://www.dailymail.co.uk/news/article ... quiry.html

Facebook routinely gathers data from its 1.4 billion daily active users worldwide
It also uses tracking devices that follow a user's internet activity via third parties
Even if you have never entered the Facebook domain, the company can track you
Facebook account holders are able to download a copy of the file kept on them
The privacy of users tracked via third parties is currently less transparent



Facebook may be tracking your every move online even if you have never been on the site.

Not content with monitoring the movements of its own users, the largest social network in the world is building secret files on the activities of billions of people.

Mark Zuckerberg's company says that it uses this information to target adverts and content based on your preferences, as well as for security purposes.

Facebook account holders are able to download a copy of the file kept on them, which contains detailed records of their activities while logged in.

The privacy of users tracked via third parties is currently less transparent, with no way of checking exactly what Facebook knows about you.

solarguy
Posts: 126
Joined: Wed Jul 12, 2017 3:29 pm

Re: The truth about Facebook

Post by solarguy » Thu Mar 01, 2018 1:48 pm

THE HARSH TRUTH ABOUT FACEBOOK WILL BE HEAVILY REVEALED IN 2018

http://www.collective-evolution.com/201 ... d-in-2018/

What is Facebook? What did it start out as, and what has it become?

Everything evolves — we know that. Sometimes what we set out to do in our initial intentions changes and becomes something different. In business it’s called pivoting; in real life it’s called reflecting on whether there is any meaning, value, or purpose to what we’re doing, and whether this matches our initial intentions.

When Facebook began in 2004, it was a platform for people to stay connected in a meaningful way, specifically students enrolled in post-secondary school. By 2007, Facebook opened its platform up to the public and began on-boarding users like crazy. It quickly became the largest and most influential social media network in the world, now serving over 2 billion users.

I remember those days. The people you connected with, the pages you chose to follow — you actually saw what they posted. Point blank, you saw what you cared about and wanted to see. Now though, it’s quite different. This is where Facebook faces its greatest challenges. I also feel this is where they are most heavily lying to the public.

To preface, this is not a Facebook smear piece. Facebook has been a valuable network to share information to millions and create some incredible change in the world. However, things have changed dramatically since Facebook went public and moved under the thumb of intelligence agencies.

Convenient Newsroom Information
Facebook has consistently told the world that users provide valuable feedback about what they want changed on Facebook, and then Facebook follows up. However, the changes also seem to align with greater profits and a poorer user experience. We never actually see the results of this apparent feedback Facebook claims to be getting. We simply get statements like “Maintaining a relevant and interesting News Feed is important to satisfying users.”

Now Facebook has a huge job on their hands. They must sort through the millions of posts published every day and place the meaningful ones in front of the right people. But there is a problem with this framing. Facebook doesn’t have to sort through millions of posts for each user; they simply need to sort through the couple thousand or fewer that each user is technically subscribed to. Allow me to explain.

The average Facebook user has 155 friends. I could not find 2017 stats, but as of 2013 the average user liked about 70 pages. Another stat I could not find is how often the average user posts per day, but brands, on average, post about 10 times per day. So if we assumed every brand a user liked posted 10 times in a given day and every friend posted twice, a user would have to sift through a little over 1,000 posts. Given that the average user spends about 50 minutes per day on Facebook’s platforms, a user would have about 2.5 to 3 seconds per post if they were fed all 1,000 posts.
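To make the arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The friend and page counts come from the stats cited above; the per-day posting rates are the assumptions stated in the paragraph, not measured figures:

```python
# Rough estimate of how many posts a typical user's feed must rank
# per day, using the article's figures and stated assumptions.

avg_friends = 155       # average friend count (stat cited above)
avg_pages_liked = 70    # average pages liked (2013 stat cited above)
posts_per_page = 10     # assumed brand posts per day
posts_per_friend = 2    # assumed friend posts per day

daily_posts = avg_pages_liked * posts_per_page + avg_friends * posts_per_friend
print(daily_posts)  # 1010 -- "a little over 1,000 posts"

# Time per post if a user actually saw every single one of them:
minutes_on_platform = 50
seconds_per_post = minutes_on_platform * 60 / daily_posts
print(round(seconds_per_post, 1))  # ~3.0 seconds per post
```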

Interestingly, Facebook sees value in a number that small, as video views are tallied when a user spends just three seconds watching. So if the newsfeed really only has to sift through about 1,000 posts per user, based on what a user ACTUALLY has chosen to connect with, why does Facebook claim it has such a hard time showing users what they want? Think about it: even if the newsfeed had a whopping 100% organic reach on every post from every person, brand, or page a user likes, the average user would only have to sift through about 1,000 posts per day.

Facebook took the path of pulling users’ actual interests out of the newsfeed and replacing them with posts, from who knows where, that users may or may not like. On the flip side, this has allowed them to charge brands to reach their audience: the same audience that asked Facebook to show them those posts in the first place.

Have You Noticed the Newsfeed Does Not Work as Advertised?
This is where things get interesting and where I feel Facebook is going to reach its demise in 2018 in a big way. But first let’s turn to a harsh reality.

Facebook is “ripping apart the social fabric.” Those are the words of Chamath Palihapitiya, the company’s former vice president of user growth. Why would this be said about a company whose stated mission is to “build global community”?

Because Facebook’s actions do not align with their mission; they align with intelligence agencies, political pressure, and stockholders.

Once again, “It literally changes your relationship with society, with each other … God only knows what it’s doing to our children’s brains,” says Sean Parker, Facebook’s former president.

But let’s reflect here. Is this not showing us, on a global scale, what happens when our initial intentions and passionate hearts are set aside in favour of appeasing the destructive nature of politics, financial greed, and Big Brother? This doesn’t happen only to a company like Facebook; it happens all over the place. We are willing to give up true connection, community, value, and overall societal health in the name of money, power, and control.

Facebook claims its algorithm is designed to show you relevant posts from the people and brands you want to hear from, but is that what you actually see? I see posts from people I never engage with. I see ads. I see, for lack of a better term, meaningless drivel from brands I neither ‘Like’ on Facebook nor care about. Now, there is nothing wrong with ads, but why can’t a user see what they want?

How is a newsfeed, supposedly designed for the user, filled with so many unwanted posts? Meanwhile, when a user not only likes a specific group or page but also asks to see it FIRST in their newsfeed, its posts still don’t appear. How is that possible? Simply put, the newsfeed does not work as advertised, nor is it designed to give users what they want. That fits the harsh reality that Facebook likely does not tell users the truth about why it’s making changes.

But I can’t blame them. They have to appease stockholders, and so stripping brands from the newsfeed makes sense. But it detracts from their mission and value, because now users don’t get what they want. People loved Facebook because they could stay informed and get updates from the things they care about, and brands were largely responsible for helping to build Facebook to begin with.

The average user is left with mindless content they often don’t really care about, which is why we get the kinds of quotes above from former executives. It seems, for the most part, Facebook has chosen to feed users things they don’t want in exchange for making more money, and in turn users have to go out of their way to find the content they do want. The average user should do that, but we don’t. Instead we just look at what we’re fed, and this, I feel, is why we are seeing these mental and emotional challenges arise from Facebook use.

What if we were given posts we actually wanted: posts that helped us learn, stay informed, and explore what’s happening in our world in a more meaningful way, rather than posts from friends about what they ate today, where they are going, and other supposedly ‘feel good’ updates that stop feeling good when read in excess? Might this produce a more exercised, informed, and engaged mind?

solarguy
Posts: 126
Joined: Wed Jul 12, 2017 3:29 pm

Re: The truth about Facebook

Post by solarguy » Thu Mar 01, 2018 1:50 pm

http://www.bitrebels.com/social/the-tru ... cared-yet/

It is quite clear when reading the EULA that something isn’t right. Why do we need to accept conditions that sign away the rights to the images we upload? It can’t just be so Facebook can show them on our personal profile pages, as that would have been an entirely different clause to begin with. It is easy to understand why a lot of people, when they eventually found out they had given up the rights to their content, started freaking out. But is there a reason to freak out about Facebook’s EULA, or is it just an exaggeration brought upon us by the media?

solarguy
Posts: 126
Joined: Wed Jul 12, 2017 3:29 pm

Re: The truth about Facebook

Post by solarguy » Thu Mar 01, 2018 1:52 pm

https://www.cheatsheet.com/breaking-new ... aled.html/

The primary dispute centered around whether Mark had entered into an “agreement” with the Harvard seniors, Cameron and Tyler Winklevoss and a classmate named Divya Narendra, to develop a similar web site for them — and then, instead, stalled their project while taking their idea and building his own.

The litigation never went particularly well for the Winklevosses.

In 2007, Massachusetts Judge Douglas P. Woodlock called their allegations “tissue thin.” Referring to the agreement that Mark had allegedly breached, Woodlock also wrote, “Dorm room chit-chat does not make a contract.” A year later, the end finally seemed in sight: a judge ruled against Facebook’s move to dismiss the case. Shortly thereafter, the parties agreed to settle.

But then, a twist.

After Facebook announced the settlement, but before the settlement was finalized, lawyers for the Winklevosses suggested that the hard drive from Mark Zuckerberg’s computer at Harvard might contain evidence of Mark’s fraud. Specifically, they suggested that the hard drive included some damning instant messages and emails.

The judge in the case refused to look at the hard drive and instead deferred to another judge, who went on to approve the settlement. But, naturally, the possibility that the hard drive contained additional evidence set inquiring minds wondering what those emails and IMs revealed. Specifically, it set inquiring minds wondering again whether Mark had, in fact, stolen the Winklevosses’ idea, screwed them over, and then ridden off into the sunset with Facebook.

Unfortunately, since the contents of Mark’s hard drive had not been made public, no one had the answers.

But now we have some.

Over the past two years, we have interviewed more than a dozen sources familiar with aspects of this story — including people involved in the founding year of the company. We have also reviewed what we believe to be some relevant IMs and emails from the period. Much of this information has never before been made public. None of it has been confirmed or authenticated by Mark or the company.

neighbor
Posts: 426
Joined: Sun Jul 09, 2017 6:06 pm

Facebook routinely suppresses conservative news

Post by neighbor » Thu Mar 01, 2018 3:04 pm

https://gizmodo.com/former-facebook-wor ... 1775461006

Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly influential section, even though they were organically trending among the site’s users.

Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.

In other words, Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing—but it is in stark contrast to the company’s claims that the trending module simply lists “topics that have recently become popular on Facebook.”

These new allegations emerged after Gizmodo last week revealed details about the inner workings of Facebook’s trending news team—a small group of young journalists, primarily educated at Ivy League or private East Coast universities, who curate the “trending” module on the upper-right-hand corner of the site. As we reported last week, curators have access to a ranked list of trending topics surfaced by Facebook’s algorithm, which prioritizes the stories that should be shown to Facebook users in the trending section. The curators write headlines and summaries of each topic, and include links to news sites. The section, which launched in 2014, constitutes some of the most powerful real estate on the internet and helps dictate what news Facebook’s users—167 million in the US alone—are reading at any given moment.

neighbor
Posts: 426
Joined: Sun Jul 09, 2017 6:06 pm

Facebook, Twitter manipulate public opinion

Post by neighbor » Thu Mar 01, 2018 3:06 pm

https://www.theguardian.com/technology/ ... ok-twitter

From Russia, where around 45% of highly active Twitter accounts are bots, to Taiwan, where a campaign against President Tsai Ing-wen involved thousands of heavily co-ordinated (but not fully automated) accounts sharing Chinese mainland propaganda, the studies show that social media is an international battleground for dirty politics.

The reports, part of the Oxford Internet Institute’s Computational Propaganda Research Project, cover nine nations, also including Brazil, Canada, China, Germany, Poland, Ukraine, and the United States. They found “the lies, the junk, the misinformation” of traditional propaganda are widespread online and “supported by Facebook or Twitter’s algorithms”, according to Philip Howard, Professor of Internet Studies at Oxford.

At the simpler end, the techniques used include automated accounts that like, share, and post on the social networks. Such accounts can serve to game algorithms and push content onto curated social feeds. They can drown out real, reasoned debate between humans in favour of a social network populated by argument and soundbites, and they can simply make online measures of support, such as the number of likes, look larger than they are, which is crucial in creating the illusion of popularity.

The researchers found that in the US this took the form of what Samuel Woolley, the project’s director of research, calls “manufacturing consensus” – creating the illusion of popularity so that a political candidate can have viability where they might not have had it before.

neighbor
Posts: 426
Joined: Sun Jul 09, 2017 6:06 pm

Facebook mood manipulation

Post by neighbor » Thu Mar 01, 2018 3:08 pm

https://www.theatlantic.com/technology/ ... nt/373648/

We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.
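For readers curious about the mechanics, here is a purely illustrative sketch of how a feed-skewing experiment of this kind could work in principle. Everything in it (the function names, the sentiment scores, the drop rate) is an assumption for illustration; the study itself reportedly used word-count-based sentiment analysis, and its actual code has never been published:

```python
import random

# Hypothetical sketch of an emotional-contagion experiment: each user
# is assigned to a condition, and posts carrying the suppressed
# emotion are probabilistically dropped from that user's feed.

def assign_condition(user_id):
    """Randomly place a user in one of three experimental arms."""
    return random.choice(["suppress_positive", "suppress_negative", "control"])

def filter_feed(posts, condition, drop_rate=0.5):
    """posts is a list of (text, sentiment) pairs, sentiment in [-1, 1].

    The sentiment score stands in for the word-count analysis the
    study reportedly used; drop_rate is an invented parameter.
    """
    kept = []
    for text, sentiment in posts:
        drop_positive = condition == "suppress_positive" and sentiment > 0
        drop_negative = condition == "suppress_negative" and sentiment < 0
        if (drop_positive or drop_negative) and random.random() < drop_rate:
            continue  # hide this post from the user's feed
        kept.append(text)
    return kept

# A user in the "suppress_negative" arm sees a happier-than-average feed.
feed = [("Great day!", 0.8), ("Feeling awful.", -0.7), ("Had lunch.", 0.0)]
print(filter_feed(feed, assign_condition("user_42")))
```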

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
